An intelligent Chrome extension that uses AI to automatically organize your browser tabs into logical groups.
Built with React 19, Vite, TypeScript, TailwindCSS 4, and Bun.
- One-click organization: Click the extension icon to instantly organize all tabs
- Multiple AI providers: Choose from OpenRouter (cloud), Ollama (local), or Chrome Built-in AI
- AI-powered grouping: Intelligently categorize tabs using large language models
- Automatic ungrouping: Clears existing groups and creates fresh, optimized groupings
- Privacy-focused: Use local Ollama or Chrome's built-in models for complete privacy
- Customizable: Choose your preferred AI model and provider
- Zero-cost option: Use Chrome's built-in Gemini Nano completely free
```sh
bun install

# For development (watch mode)
bun run dev

# For production
bun run build
```

- Open Chrome and navigate to `chrome://extensions/`
- Enable "Developer mode" in the top right
- Click "Load unpacked"
- Select the `dist` folder from this project
Pros: Completely free, private, no setup required, no API key needed
Cons: Chrome 128+ only, requires 22GB storage, desktop only (no mobile)
- Requirements:
  - Chrome 128+ on desktop (Windows 10+, macOS 13+, Linux, or ChromeOS)
  - 22 GB of free storage space for the Gemini Nano model
  - GPU with more than 4 GB VRAM, or CPU with 16 GB RAM and 4+ cores
  - Unmetered internet connection for the initial model download
- Configure the extension:
  - Right-click the extension icon and select "Options"
  - Select "Chrome Built-in AI" as the provider
  - Check the status to see if the model is available
  - Click "Save Settings"
  - On first use, Chrome will automatically download Gemini Nano (~22 GB)
Note: The model download happens automatically the first time you click the extension icon. The status indicator on the options page shows whether the model is ready, needs to be downloaded, or is unavailable on your device.
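The status check above can be sketched as a small helper. This is a hypothetical illustration, not the extension's actual code: the status strings follow Chrome's Prompt API (`LanguageModel.availability()` in recent Chrome builds — the API surface has changed across releases, so verify against your Chrome version), and `statusMessage` is an invented name.

```typescript
// Possible availability states reported by Chrome's Prompt API
// (assumption based on recent Chrome builds; check your Chrome version).
type Availability = "unavailable" | "downloadable" | "downloading" | "available";

// Hypothetical helper mapping a status to an options-page message.
function statusMessage(status: Availability): string {
  switch (status) {
    case "available":
      return "Model ready";
    case "downloadable":
      return "Model will download on first use (~22 GB)";
    case "downloading":
      return "Model download in progress";
    case "unavailable":
      return "Built-in AI not supported on this device";
  }
}

// In the extension, the status would be obtained roughly like:
//   const status = await LanguageModel.availability();
//   showStatus(statusMessage(status));
```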
Pros: Access to many powerful models, no local setup required
Cons: Requires API key, costs money, sends data to cloud
- Visit OpenRouter (https://openrouter.ai)
- Sign up and create an API key
- Right-click the extension icon and select "Options"
- Select "OpenRouter" as the provider
- Enter your API key
- Enter your preferred model (e.g., `anthropic/claude-3.5-sonnet`)
- Click "Save Settings"
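For reference, OpenRouter exposes an OpenAI-compatible chat completions endpoint (`POST https://openrouter.ai/api/v1/chat/completions`). Below is a hedged sketch of the kind of request body the extension might build from your tabs — the prompt wording, `Tab` shape, and `buildOpenRouterRequest` name are illustrative assumptions, not the extension's actual code.

```typescript
interface Tab {
  title: string;
  url: string;
}

// Hypothetical sketch: build an OpenRouter chat completions payload
// asking the model to group the given tabs.
function buildOpenRouterRequest(model: string, tabs: Tab[]) {
  const tabList = tabs
    .map((t, i) => `${i}: ${t.title} (${t.url})`)
    .join("\n");
  return {
    model, // e.g. "anthropic/claude-3.5-sonnet"
    messages: [
      {
        role: "user",
        content:
          "Group these browser tabs into named categories. " +
          "Reply with JSON mapping group names to tab indices.\n" +
          tabList,
      },
    ],
  };
}
// The actual HTTP call would also send an
// "Authorization: Bearer <your API key>" header.
```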
Pros: Free, private, no API key needed, works offline
Cons: Requires local setup, needs decent hardware
- Install Ollama:

```sh
# macOS
brew install ollama

# Or download from https://ollama.ai
```
- Start Ollama with CORS enabled:

```sh
# Add to ~/.zshrc or ~/.bash_profile:
export OLLAMA_ORIGINS="chrome-extension://*"
```
- Pull a model:

```sh
# Fast and lightweight (recommended for most users)
# Good balance of speed/quality (3.1 GB)
ollama pull gemma3:4b
```
- Configure the extension:
  - Right-click the extension icon and select "Options"
  - Select "Ollama (Local)" as the provider
  - Base URL: `http://localhost:11434` (the extension will automatically append `/api`)
  - Model name: enter the model you pulled (e.g., `gemma3:4b`)
  - Click "Save Settings"
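Since the extension appends `/api` to the base URL, the request presumably targets Ollama's `/api/chat` endpoint. Here is a hedged sketch of such a payload — `buildOllamaRequest` is an invented helper name and the field values are illustrative, but the endpoint shape (`model`, `messages`, `stream`) matches Ollama's documented chat API.

```typescript
// Hypothetical sketch: build a request against Ollama's /api/chat endpoint.
function buildOllamaRequest(baseUrl: string, model: string, prompt: string) {
  return {
    url: `${baseUrl}/api/chat`,
    body: {
      model, // e.g. "gemma3:4b"
      messages: [{ role: "user", content: prompt }],
      stream: false, // ask for one complete JSON response, not a token stream
    },
  };
}
```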
- Open multiple tabs in your browser
- Click the Tab Organizer extension icon
- Wait a few seconds while the AI analyzes your tabs
- Your tabs will be automatically organized into logical groups!
Uses Google's Gemini Nano model:
- Runs entirely on-device
- No cost, completely free
- Supports English, Spanish, and Japanese
- Optimized for desktop/laptop hardware
- Multimodal capabilities (text-to-text for this extension)
You can use any model available on OpenRouter, including:
- `anthropic/claude-3.5-sonnet` (recommended, best quality)
- `anthropic/claude-3-opus` (most capable)
- `openai/gpt-4-turbo` (very good)
- `openai/gpt-4` (reliable)
- `google/gemini-pro-1.5` (fast)
See OpenRouter Models for the full list.
Popular local models that work well:
- `llama3.2` (recommended: fast and accurate, 3 GB)
- `llama3.2:1b` (ultra-fast, low RAM, 1.3 GB)
- `qwen2.5-coder` (excellent for dev/tech tabs, 4.7 GB)
- `deepseek-coder` (great for programming content, 16 GB)
- `gemma2` (good balance, 5.4 GB)
- `mistral` (fast and capable, 4.1 GB)
To see installed models, run `ollama list`; for the full catalog, visit the Ollama Library.
- Icon Click: User clicks the extension icon
- Tab Collection: Extension fetches all open tabs in the current window
- Ungrouping: Removes all existing tab groups
- AI Analysis: Sends tab titles and URLs to your chosen AI provider
- Chrome Built-in AI: Uses Gemini Nano running locally in Chrome
- OpenRouter: Cloud API call to selected model
- Ollama: Local API call to your running Ollama instance
- Grouping: LLM analyzes and returns logical groupings (e.g., "Work", "Social", "Shopping")
- Application: Extension creates new tab groups and organizes tabs accordingly
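The last two steps can be sketched as follows. The response shape (a JSON object mapping group names to tab indices) and the `assignGroups` helper are assumptions for illustration, not the extension's actual code; the `chrome.tabs.group` / `chrome.tabGroups.update` calls in the trailing comment are the real Chrome APIs for creating and titling tab groups.

```typescript
// Hypothetical sketch: turn an LLM reply like { "Work": [0, 2], "Social": [1] }
// into per-group lists of real tab ids.
function assignGroups(
  reply: Record<string, number[]>,
  tabIds: number[],
): Map<string, number[]> {
  const groups = new Map<string, number[]>();
  for (const [name, indices] of Object.entries(reply)) {
    const ids = indices
      .filter((i) => i >= 0 && i < tabIds.length) // drop out-of-range indices
      .map((i) => tabIds[i]);
    if (ids.length > 0) groups.set(name, ids);
  }
  return groups;
}

// Applying a group then uses Chrome's tab-group APIs, roughly:
//   const groupId = await chrome.tabs.group({ tabIds: ids });
//   await chrome.tabGroups.update(groupId, { title: name });
```

Validating the model's indices before grouping matters because LLMs occasionally return indices for tabs that don't exist.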
- ... (green): Processing tabs
- ✓ (green): Successfully organized
- ! (red): Error occurred (check console for details)
For Chrome Built-in AI:
- Make sure you're using Chrome 128+ on desktop
- Check available storage space (need 22GB free for model)
- Verify your device meets hardware requirements (see Options page)
- If model needs download, ensure you have an unmetered internet connection
- Wait for model download to complete (first use only, ~22GB)
- Open Chrome DevTools (F12) to see detailed errors in Console
For OpenRouter:
- Make sure you've configured your API key in the options page
- Check that your API key is valid and has credits
- Verify your internet connection
- Open Chrome DevTools (F12) to see detailed errors in Console
For Ollama:
- Make sure Ollama is running: `ollama serve` (or check whether the process is already running)
- Verify CORS is enabled: `OLLAMA_ORIGINS="chrome-extension://*" ollama serve`
- Check that the model is installed: `ollama list`
- Try pulling the model again: `ollama pull <model-name>`
- Verify the base URL is correct (default: `http://localhost:11434`)
- Check the Chrome DevTools Console for detailed error messages
- Try a different model (some models perform better than others)
- For Ollama: Larger models generally perform better (but are slower)
- Make sure you have at least a few tabs open
- Check that tabs have proper titles and URLs
If you see CORS errors:
```sh
# Make sure Ollama is started with CORS enabled
OLLAMA_ORIGINS="chrome-extension://*" ollama serve

# Or add to your shell profile for persistence:
echo 'export OLLAMA_ORIGINS="chrome-extension://*"' >> ~/.zshrc
source ~/.zshrc
```

You may need to restart Ollama for the changes to take effect.
- Make sure you've built the project first (`bun run build`)
- Make sure you're loading the `dist` directory, not the root project directory
- Check that all files are present in the `dist` folder
- Try reloading the extension from `chrome://extensions/`
For active development:
- Run `bun run dev` to start the development build in watch mode
- Load the `dist` folder as an unpacked extension in Chrome
- Make changes to your source files in `src/`
- The extension will automatically rebuild
- Click the reload button in `chrome://extensions/` to see your changes
Chrome Built-in AI:
- All processing happens locally in Chrome using Gemini Nano
- No data is sent to external servers
- Complete privacy - your tabs never leave your device
- No API key required
- No cost and no usage tracking
OpenRouter:
- Tab data (titles and URLs) is sent to OpenRouter's cloud API for analysis
- Your API key is stored locally in Chrome's sync storage
- Data is processed in real-time and not stored by this extension
Ollama (Local):
- All processing happens locally on your machine
- No data is sent to external servers
- Complete privacy - your tabs never leave your computer
- No API key required