import requests

# Replace API_KEY with your key and URL with the public post link.
response = requests.post(
    "https://api.downloader.org/api/v1/submit/",
    headers={"Authorization": "API_KEY"},
    json={"url": "URL"},
)
response.raise_for_status()  # fail fast on an invalid key or URL
for item in response.json()["items"]:
    print(item["type"], item["url"])
Telegram Embed download FAQ
Paste any public Telegram Embed URL into the box at the top of this page and click Download. Your file is ready in a few seconds — no signup, no install.
Telegram Embed is an image-sharing platform. Posts can be a single image or a gallery; multi-image posts download in their upload order.
No — Downloader doesn't sign in to Telegram Embed. Anything Telegram Embed serves publicly can be downloaded without authentication on either side.
Telegram Embed downloads preserve the original format — JPG for photos, PNG when the source uses transparency. We don't recompress; you get what Telegram Embed actually stores.
Yes. We pass through whatever Telegram Embed serves — no re-encoding, no recompression, no resolution downgrade. What you see on Telegram Embed is exactly what you download.
Telegram Embed has no platform-specific gotchas worth flagging. The standard paste-and-download flow handles it cleanly.
No. Telegram Embed sees a normal page-load request; the poster receives no notification. Downloads are anonymous from the platform's perspective.
Yes. Open Downloader in your mobile browser, paste a Telegram Embed link, and tap Download. The file saves to your Photos / Files / Music app — no separate app required.
Processing on our side is constant — typically under a second. Actual download time after that depends on the file size and your internet connection.
Free accounts have a daily download cap (counted across all platforms, not just Telegram Embed). Pro accounts remove the cap entirely and add priority processing.
Telegram Embed attracts every kind of user — casual viewers, dedicated fans, professionals. The download flow is identical for all of them.
Downloading content you have the right to save — your own posts, content released under an open license, public-domain material — is generally lawful in most jurisdictions. For anything else, respect copyright and Telegram Embed's terms.
🚀
Bulk Download - One-click bulk downloading
📥
Multiple URL Support - Download content from several URLs at once, separated by commas
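The multi-URL flow above can be sketched in code. This is a minimal, hypothetical client built on the submit endpoint from the example at the top of the page; the `split_urls` and `submit_all` helpers are illustrative names, not part of any official SDK, and the exact response shape is assumed to match that example.

```python
import requests

# Same endpoint as the example above.
API_ENDPOINT = "https://api.downloader.org/api/v1/submit/"

def split_urls(raw: str) -> list[str]:
    """Split a comma-separated URL string, trimming whitespace and empty entries."""
    return [u.strip() for u in raw.split(",") if u.strip()]

def submit_all(raw: str, api_key: str) -> list[dict]:
    """Submit each URL separately and collect all returned items.

    Assumes the endpoint accepts one URL per request and returns
    an "items" list, as in the example at the top of the page.
    """
    items = []
    for url in split_urls(raw):
        resp = requests.post(
            API_ENDPOINT,
            headers={"Authorization": api_key},
            json={"url": url},
        )
        resp.raise_for_status()
        items.extend(resp.json()["items"])
    return items
```

A comma-separated paste like `"https://a.example/1, https://b.example/2"` would then be fanned out into one submit request per link.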