
huggingface-uv-cursor


macOS/MPS-optimized Hugging Face starter managed by uv; works in VS Code/Cursor. Jupyter, Accelerate, and Gradio included.

🚀 Quick Start

  1. Activate the virtual environment:

    source .venv/bin/activate
  2. Run the demo script:

    python src/demo.py
  3. Start JupyterLab:

    jupyter lab

📁 Project Structure

huggingface-uv-cursor/
├── .vscode/
│   └── settings.json          # VS Code/Cursor settings
├── notebooks/
│   └── intro.ipynb            # Jupyter notebook examples
├── src/
│   ├── demo.py                # Basic HF model demo
│   ├── training_example.py    # Accelerate training demo
│   └── gradio_demo.py         # Web interface demo
├── data/                      # Data files (gitignored if large)
├── .venv/                     # Virtual environment
├── accelerate_config.yaml     # Accelerate configuration
├── .gitignore                 # Git ignore rules
├── pyproject.toml             # Project dependencies
└── README.md                  # This file

🛠️ Setup

This project uses:

  • uv for fast Python package management
  • PyTorch with CPU wheels (MPS acceleration on Apple Silicon)
  • Transformers for pre-trained models
  • Datasets for data loading
  • Accelerate for distributed training
  • JupyterLab for interactive development
  • Gradio for web demos
  • WandB for experiment tracking

🍎 Apple Silicon (MPS) Support

The project is configured to use MPS acceleration when available:

import torch
device = "mps" if torch.backends.mps.is_available() else "cpu"
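A quick way to sanity-check the device selection is to run a small tensor op on it (a minimal sketch; the tensor shapes are arbitrary):

```python
import torch

# Select MPS on Apple Silicon, otherwise fall back to CPU
device = "mps" if torch.backends.mps.is_available() else "cpu"

# A small op on the chosen device confirms the backend actually works
x = torch.randn(2, 3, device=device)
y = (x * 2.0).sum()
print(f"device={device}, sum={y.item():.3f}")
```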

🔧 VS Code/Cursor Configuration

The .vscode/settings.json file configures:

  • Python interpreter pointing to .venv/bin/python
  • Automatic virtual environment activation
  • Jupyter notebook settings
  • Type checking mode

📚 Examples

Basic Text Generation

import torch
from transformers import pipeline

device = "mps" if torch.backends.mps.is_available() else "cpu"
pipe = pipeline("text-generation", model="sshleifer/tiny-gpt2", device=device)  # runs on MPS if available
result = pipe("Hello world:", max_new_tokens=20)

Model Loading

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

device = "mps" if torch.backends.mps.is_available() else "cpu"
tokenizer = AutoTokenizer.from_pretrained("sshleifer/tiny-gpt2")
model = AutoModelForCausalLM.from_pretrained("sshleifer/tiny-gpt2").to(device)
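Once loaded, the model can generate text directly; a self-contained sketch (the prompt and max_new_tokens value are illustrative):

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

device = "mps" if torch.backends.mps.is_available() else "cpu"
tokenizer = AutoTokenizer.from_pretrained("sshleifer/tiny-gpt2")
model = AutoModelForCausalLM.from_pretrained("sshleifer/tiny-gpt2").to(device)

# Tokenize the prompt and move the tensors to the same device as the model
inputs = tokenizer("Hello world:", return_tensors="pt").to(device)
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```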

Training with Accelerate

python src/training_example.py

Web Demo with Gradio

python src/gradio_demo.py

💡 Gradio tips

  • Pick a free port automatically: we launch with server_port=None, so Gradio chooses an available port. If you prefer a fixed port:
    GRADIO_SERVER_PORT=7861 python src/gradio_demo.py
  • Local vs public link:
    • Local only (default): opens at http://127.0.0.1:<port>.
    • Public share link:
      # in src/gradio_demo.py
      demo.launch(share=True)
      Useful for quick demos; anyone with the link can access while the app runs.
  • Port already in use: free 7860 and retry
    lsof -i :7860 | awk 'NR>1{print $2}' | xargs -r kill
    python src/gradio_demo.py

🚀 Next Steps

  1. Authenticate with Hugging Face:

    huggingface-cli login
  2. Explore models and datasets:

    • Browse models at https://huggingface.co/models
    • Find datasets at https://huggingface.co/datasets

  3. Create your own experiments:

    • Add new scripts in src/
    • Create notebooks in notebooks/

📦 Dependencies

Core dependencies are defined in pyproject.toml and require Python 3.10+.

Install with uv (recommended):

uv pip install -e .

If you prefer pip:

pip install -e .

🔒 Privacy & safety

  • No secrets are committed. Authenticate locally with huggingface-cli login; your token is stored in your local Hugging Face cache and never written to this repo.
  • Large artifacts and private data are ignored via .gitignore (data/, datasets/, model files, logs, caches).
  • Before making this public, quickly scan your git history for accidental secrets:
    git log -p | grep -iE "(hf_|token|password|api|secret)" || true

🧩 Use this as a template

Click “Use this template” on GitHub or run:

git clone <your-repo-url> my-hf-sandbox
cd my-hf-sandbox && ./setup.sh

🤝 Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Test with python src/demo.py
  5. Submit a pull request

📄 License

This project is open source and available under the MIT License.
