Continuously-updated public IP ranges for major cloud providers, services, and apps — one txt file per provider, ready to drop into firewalls, allowlists, or geo blocks.
35 providers · 113,771 IPv4 entries · 267,505 IPv6 entries · refreshed every 4 hours via GitHub Actions
All addresses come from public sources (vendor-published JSON/TXT or DNS resolution of vendor domains). The lists are committed back to main after each refresh, so consumers can pin to either main (rolling) or a specific commit (frozen).
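Pinning works through the standard raw.githubusercontent.com URL layout, where the ref segment is either a branch or a commit SHA. A minimal sketch (the 40-zero SHA is a placeholder, not a real commit in this repo):

```python
# Build raw URLs for rolling (branch) vs. frozen (commit) consumption.
BASE = "https://raw.githubusercontent.com/mrkhachaturov/ipranges"

def raw_url(provider: str, filename: str, ref: str = "main") -> str:
    # "ref" is either a branch name (rolling) or a full commit SHA (frozen)
    return f"{BASE}/{ref}/{provider}/{filename}"

rolling = raw_url("cloudflare", "ipv4_merged.txt")
frozen = raw_url("cloudflare", "ipv4_merged.txt", ref="0" * 40)  # placeholder SHA
```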
Each provider directory contains four files:
| File | Contents |
|---|---|
| `ipv4.txt` | IPv4 CIDRs, one per line |
| `ipv4_merged.txt` | The same list reduced to the smallest equivalent set of CIDRs |
| `ipv6.txt` | IPv6 CIDRs, one per line |
| `ipv6_merged.txt` | The same list reduced to the smallest equivalent set of CIDRs |
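The actual reduction is done by `utils/merge.py`; the stdlib `ipaddress.collapse_addresses` illustrates the idea, assuming the same collapse semantics:

```python
import ipaddress

# Two adjacent /25s plus a /26 already covered by one of them
# collapse to a single /24 - the shape of the *_merged.txt files.
raw = ["198.51.100.0/25", "198.51.100.128/25", "198.51.100.0/26"]
nets = [ipaddress.ip_network(c) for c in raw]
merged = [str(n) for n in ipaddress.collapse_addresses(nets)]
print(merged)  # ['198.51.100.0/24']
```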
Fetch them straight from raw.githubusercontent.com:
```bash
curl -fsSL https://raw.githubusercontent.com/mrkhachaturov/ipranges/main/cloudflare/ipv4_merged.txt
```

Feed a list into an `ipset` and match it from `iptables`:

```bash
ipset create cloudflare hash:net family inet
curl -fsSL https://raw.githubusercontent.com/mrkhachaturov/ipranges/main/cloudflare/ipv4_merged.txt \
  | xargs -I{} ipset add cloudflare {}
iptables -I INPUT -m set --match-set cloudflare src -j ACCEPT
```

Gate requests in nginx with a `geo` map:

```nginx
geo $is_googlebot {
    default 0;
    include /etc/nginx/googlebot-ipv4.conf;  # generated from googlebot/ipv4_merged.txt as "<cidr> 1;"
}
```

Or check an address in Python:

```python
import ipaddress
import urllib.request

url = "https://raw.githubusercontent.com/mrkhachaturov/ipranges/main/openai/ipv4_merged.txt"
with urllib.request.urlopen(url) as resp:
    # decode to str first: ipaddress does not accept raw bytes lines
    nets = [ipaddress.ip_network(line.strip())
            for line in resp.read().decode().splitlines() if line.strip()]

def is_openai(ip: str) -> bool:
    addr = ipaddress.ip_address(ip)
    return any(addr in n for n in nets)
```

| Provider | IPv4 | IPv4 (merged) | IPv6 | IPv6 (merged) | Notes |
|---|---|---|---|---|---|
| Akamai | txt | txt | txt | txt | |
| Amazon (AWS) | txt | txt | txt | txt | |
| Anthropic (Claude) | txt | txt | txt | txt | |
| Apple (Private Relay) | txt | txt | txt | txt | |
| Atlassian | txt | txt | txt | txt | |
| Bing (Bingbot) | txt | txt | — | — | |
| Cloudflare | txt | txt | txt | txt | |
| Cloudflare Tunnel (Argo) | txt | txt | txt | txt | |
| Amazon CloudFront | txt | txt | txt | txt | |
| DeepL | txt | txt | txt | txt | |
| Devolutions | txt | txt | — | — | |
| DigitalOcean | txt | txt | txt | txt | |
| Discord | txt | txt | — | — | |
| Facebook (Meta) | txt | txt | txt | txt | |
| Games | txt | txt | txt | txt | |
| GitHub | txt | txt | txt | txt | |
| Google (Cloud & GoogleBot) | txt | txt | txt | txt | |
| Google (GoogleBot) | txt | txt | txt | txt | To allow GoogleBot, block all Google IPs first, then allow these. |
| Groq | txt | txt | txt | txt | |
| Kino.pub | txt | txt | txt | txt | |
| Linode | txt | txt | txt | txt | |
| Microsoft | txt | txt | txt | txt | |
| Notion | txt | txt | txt | txt | |
| OpenAI (GPTBot) | txt | txt | txt | txt | |
| OpenTofu | txt | txt | txt | txt | |
| Oracle (Cloud) | txt | txt | — | — | |
| ProtonVPN (exit nodes) | txt | txt | — | — | |
| Roblox | txt | txt | txt | txt | |
| Spotify | txt | txt | txt | txt | |
| Sunsama | txt | txt | txt | txt | |
| Tana | txt | txt | txt | txt | |
| Telegram | txt | txt | txt | txt | |
| Twitter / X | txt | txt | txt | txt | |
| Vultr | txt | txt | txt | txt | |
| Wispr Flow | txt | txt | txt | txt | |
| All-in-one (every provider combined) | txt | txt | txt | txt | Aggregate of every provider above |
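The GoogleBot note above describes rule precedence: the narrow allow list must win before the broad block list. A sketch of that ordering, with illustrative CIDRs standing in for the real `google/ipv4_merged.txt` and `googlebot/ipv4_merged.txt` contents:

```python
import ipaddress

# Sample data only - real deployments load the two files above.
google_all = [ipaddress.ip_network("66.249.64.0/19")]
googlebot = [ipaddress.ip_network("66.249.66.0/27")]

def decide(ip: str) -> str:
    addr = ipaddress.ip_address(ip)
    if any(addr in n for n in googlebot):   # narrow allow rule checked first
        return "allow"
    if any(addr in n for n in google_all):  # broad Google block second
        return "block"
    return "default"
```

In `iptables` terms this means inserting the GoogleBot ACCEPT rules above the Google DROP rules, since the first matching rule wins.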
The provider table above is generated from `utils/providers.json` by `utils/render_readme.py`. Don't edit between the `<!-- BEGIN AUTO:* -->` markers by hand; the GitHub Action will overwrite changes on the next run.
A GitHub Action runs every 4 hours (`cron: '8 */4 * * *'`) and on `workflow_dispatch`:

- Executes every `*/downloader.sh` discovered under the repo
- Concatenates all per-provider lists into `all/ipv4.txt` and `all/ipv6.txt`
- Runs `utils/merge.py` to produce `*_merged.txt` for each list
- Runs `utils/render_readme.py` to refresh the table and counts above
- Commits any changes back to `main`
If you want fresher data than the cron, trigger the Update workflow manually from the Actions tab.
The full contract lives in CLAUDE.md. The short version:
- Create `<provider>/downloader.sh`. Use `devolutions/downloader.sh` (DNS-resolution pattern) or `google/downloader.sh` (vendor-JSON pattern) as templates. Both invocation modes must work: from the repo root and from inside the provider directory.
- Run it once locally to seed `<provider>/ipv4.txt` and `<provider>/ipv6.txt`.
- Add an entry to `utils/providers.json` (kept in alphabetical order). The README updates itself on the next workflow run.
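The DNS-resolution pattern boils down to resolving a set of vendor hostnames and collecting the unique addresses. The real templates are bash; this is the same idea sketched in Python, with `resolve_ipv4` being a hypothetical helper:

```python
import socket

def resolve_ipv4(hostname: str) -> set[str]:
    # One A-record lookup per hostname; a downloader would loop
    # over the vendor's domains and write the union to ipv4.txt.
    infos = socket.getaddrinfo(hostname, None, family=socket.AF_INET)
    return {info[4][0] for info in infos}
```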
```bash
# system deps
sudo apt install -y whois parallel gawk dnsutils jq python3-pip

# python deps
pip install -r utils/requirements.txt

# run a single provider
cd cloudflare && bash downloader.sh

# regenerate the merged file for one list
python utils/merge.py --source=cloudflare/ipv4.txt | sort -V > cloudflare/ipv4_merged.txt

# refresh README from current data
python utils/render_readme.py

# CI guard: fail if README would change
python utils/render_readme.py --check
```

MIT — © 2026 Ruben Khachaturov. The IP data itself comes from public sources and is not subject to copyright on its own.