IP Ranges

Continuously updated public IP ranges for major cloud providers, services, and apps: plain txt lists per provider, ready to drop into firewalls, allowlists, or geo blocks.


35 providers · 113,771 IPv4 entries · 267,505 IPv6 entries · refreshed every 4 hours via GitHub Actions

All addresses come from public sources (vendor-published JSON/TXT or DNS resolution of vendor domains). The lists are committed back to main after each refresh, so consumers can pin to either main (rolling) or a specific commit (frozen).
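Pinning comes down to the ref segment of the raw URL. A sketch, where `<commit-sha>` is a placeholder for whichever commit you choose to freeze on:

```shell
# Rolling: tracks the latest refresh on main
url_rolling="https://raw.githubusercontent.com/mrkhachaturov/ipranges/main/cloudflare/ipv4_merged.txt"
# Frozen: substitute a full commit SHA for "main" (<commit-sha> is a placeholder)
url_frozen="https://raw.githubusercontent.com/mrkhachaturov/ipranges/<commit-sha>/cloudflare/ipv4_merged.txt"
```

Frozen URLs never change underneath you, which is the safer choice for firewall rules you audit.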

Quick start

Each provider directory contains four files:

File              Contents
ipv4.txt          IPv4 CIDRs, one per line
ipv4_merged.txt   The same list, reduced to the smallest equivalent set of CIDRs
ipv6.txt          IPv6 CIDRs, one per line
ipv6_merged.txt   The same list, reduced

Fetch them straight from raw.githubusercontent.com:

curl -fsSL https://raw.githubusercontent.com/mrkhachaturov/ipranges/main/cloudflare/ipv4_merged.txt

Use in ipset + iptables

ipset create cloudflare hash:net family inet
curl -fsSL https://raw.githubusercontent.com/mrkhachaturov/ipranges/main/cloudflare/ipv4_merged.txt \
  | xargs -I{} ipset add cloudflare {}
iptables -I INPUT -m set --match-set cloudflare src -j ACCEPT
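Adding entries one at a time via `xargs` works but is slow for large lists. A bulk-load variant (same `cloudflare` set as above; assumes root and a reachable network) turns the list into an `ipset restore` script:

```shell
# Prefix each CIDR with "add cloudflare" and bulk-load in one ipset call.
# -! tells ipset to ignore entries that already exist in the set.
curl -fsSL https://raw.githubusercontent.com/mrkhachaturov/ipranges/main/cloudflare/ipv4_merged.txt \
  | sed 's/^/add cloudflare /' \
  | ipset -! restore
```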

Use in nginx

geo $is_googlebot {
    default 0;
    include /etc/nginx/googlebot-ipv4.conf;  # generated from googlebot/ipv4_merged.txt as "<cidr> 1;"
}
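The included file can be generated from the raw list with a one-liner (the path and format follow the example above; run as root and validate before reloading):

```shell
# Turn each CIDR into the "<cidr> 1;" form the geo module expects.
curl -fsSL https://raw.githubusercontent.com/mrkhachaturov/ipranges/main/googlebot/ipv4_merged.txt \
  | awk 'NF {print $1 " 1;"}' > /etc/nginx/googlebot-ipv4.conf
nginx -t && nginx -s reload   # only reload if the config parses
```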

Use in a Python allowlist

import ipaddress, urllib.request

url = "https://raw.githubusercontent.com/mrkhachaturov/ipranges/main/openai/ipv4_merged.txt"
# urlopen yields bytes lines, so decode before parsing
with urllib.request.urlopen(url) as resp:
    nets = [ipaddress.ip_network(line.decode().strip()) for line in resp if line.strip()]

def is_openai(ip: str) -> bool:
    addr = ipaddress.ip_address(ip)
    return any(addr in n for n in nets)

Providers

Provider IPv4 IPv4 (merged) IPv6 IPv6 (merged) Notes
Akamai txt txt txt txt
Amazon (AWS) txt txt txt txt
Anthropic (Claude) txt txt txt txt
Apple (Private Relay) txt txt txt txt
Atlassian txt txt txt txt
Bing (Bingbot) txt txt
Cloudflare txt txt txt txt
Cloudflare Tunnel (Argo) txt txt txt txt
Amazon CloudFront txt txt txt txt
DeepL txt txt txt txt
Devolutions txt txt
DigitalOcean txt txt txt txt
Discord txt txt
Facebook (Meta) txt txt txt txt
Games txt txt txt txt
GitHub txt txt txt txt
Google (Cloud & GoogleBot) txt txt txt txt
Google (GoogleBot) txt txt txt txt To allow GoogleBot, block all Google IPs first, then allow these.
Groq txt txt txt txt
Kino.pub txt txt txt txt
Linode txt txt txt txt
Microsoft txt txt txt txt
Notion txt txt txt txt
OpenAI (GPTBot) txt txt txt txt
OpenTofu txt txt txt txt
Oracle (Cloud) txt txt
ProtonVPN (exit nodes) txt txt
Roblox txt txt txt txt
Spotify txt txt txt txt
Sunsama txt txt txt txt
Tana txt txt txt txt
Telegram txt txt txt txt
Twitter / X txt txt txt txt
Vultr txt txt txt txt
Wispr Flow txt txt txt txt
All-in-one (every provider combined) txt txt txt txt Aggregate of every provider above

The provider table above is generated from utils/providers.json by utils/render_readme.py. Don't edit between the <!-- BEGIN AUTO:* --> markers by hand — the GitHub Action will overwrite changes on the next run.

How updates work

A GitHub Action runs every 4 hours (cron: '8 */4 * * *') and on workflow_dispatch:

  1. Executes every */downloader.sh discovered under the repo
  2. Concatenates all per-provider lists into all/ipv4.txt and all/ipv6.txt
  3. Runs utils/merge.py to produce *_merged.txt for each list
  4. Runs utils/render_readme.py to refresh the table and counts above
  5. Commits any changes back to main

If you want fresher data than the cron, trigger the Update workflow manually from the Actions tab.

Adding a new provider

The full contract lives in CLAUDE.md. The short version:

  1. Create <provider>/downloader.sh. Use devolutions/downloader.sh (DNS-resolution pattern) or google/downloader.sh (vendor-JSON pattern) as templates. Both invocation modes — repo root and inside the provider directory — must work.
  2. Run it once locally to seed <provider>/ipv4.txt and <provider>/ipv6.txt.
  3. Add an entry to utils/providers.json (kept in alphabetical order). The README updates itself on the next workflow run.
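A minimal sketch of the DNS-resolution pattern, assuming the contract in CLAUDE.md; `example.net` is a placeholder for real vendor domains, and `devolutions/downloader.sh` remains the authoritative template:

```shell
#!/usr/bin bash
set -euo pipefail
# Work from the script's own directory so both invocation modes
# (repo root and inside the provider directory) behave the same.
cd "$(dirname "$0")"

domains="example.net"   # placeholder: vendor domains to resolve

# "|| true" keeps one unresolvable domain from aborting the whole run
for d in $domains; do dig +short A    "$d" || true; done | sort -u -V > ipv4.txt
for d in $domains; do dig +short AAAA "$d" || true; done | sort -u    > ipv6.txt
```

Version sort (`-V`) keeps IPv4 addresses in numeric rather than lexicographic order, matching the existing lists.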

Local development

# system deps
sudo apt install -y whois parallel gawk dnsutils jq python3-pip

# python deps
pip install -r utils/requirements.txt

# run a single provider
cd cloudflare && bash downloader.sh

# regenerate the merged file for one list
python utils/merge.py --source=cloudflare/ipv4.txt | sort -V > cloudflare/ipv4_merged.txt

# refresh README from current data
python utils/render_readme.py

# CI guard — fail if README would change
python utils/render_readme.py --check
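As a sanity check on the merge step, the collapsing that `utils/merge.py` performs can be approximated with Python's stdlib (an assumption: merging here means collapsing adjacent and overlapping CIDRs into their smallest equivalent set):

```shell
# Two adjacent /25s collapse into their parent /24.
printf '10.0.0.0/25\n10.0.0.128/25\n' \
  | python3 -c 'import ipaddress, sys; nets = [ipaddress.ip_network(l.strip()) for l in sys.stdin if l.strip()]; print(*ipaddress.collapse_addresses(nets), sep="\n")'
# → 10.0.0.0/24
```

If this disagrees with a committed `*_merged.txt`, the real script is doing something extra and its output should be trusted.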

License

MIT — © 2026 Ruben Khachaturov. The IP data itself comes from public sources and is not subject to copyright on its own.

Source

https://github.com/mrkhachaturov/ipranges
