# Manual Model Downloads
Some models require manual download from Hugging Face due to licensing requirements or access restrictions. This guide covers the general process.
## When Manual Download is Required
You'll need to manually download models when:
- The model requires accepting a license agreement
- The model is gated (requires Hugging Face authentication)
- You want to use a custom or fine-tuned model
- Automatic download fails due to network issues
## Prerequisites

### Install Hugging Face CLI

Choose one method:

**Option 1: Using pipx (Recommended)**

```shell
pipx install huggingface_hub
```

**Option 2: Using pip**

```shell
python3 -m pip install --upgrade "huggingface_hub[cli]"
```

### Authenticate with Hugging Face

- Create a Hugging Face account at huggingface.co
- Generate an access token at huggingface.co/settings/tokens
- Log in via CLI:

  ```shell
  huggingface-cli login
  ```

  Enter your token when prompted.
## Download Process

### Step 1: Find the Model Directory
Izwi stores models in a specific location:
| Platform | Location |
|---|---|
| macOS | ~/Library/Application Support/izwi/models/ |
| Linux | ~/.local/share/izwi/models/ |
| Windows | %APPDATA%\izwi\models\ |
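The platform paths above can also be resolved in a script. A minimal sketch, assuming the default locations from the table (a custom install may use a different directory):

```shell
# Print the default Izwi models directory for the current platform.
# Paths mirror the table above; verify against your installation.
izwi_models_dir() {
  case "$(uname -s)" in
    Darwin) echo "$HOME/Library/Application Support/izwi/models" ;;
    Linux)  echo "$HOME/.local/share/izwi/models" ;;
    MINGW*|MSYS*|CYGWIN*) echo "$APPDATA/izwi/models" ;;
    *) echo "unknown platform: $(uname -s)" >&2; return 1 ;;
  esac
}

izwi_models_dir
```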
### Step 2: Download the Model
Use the Hugging Face CLI to download:
```shell
huggingface-cli download <repo-id> \
  --repo-type model \
  --local-dir "<izwi-models-path>/<model-name>"
```

Example for macOS:

```shell
huggingface-cli download google/gemma-3-1b-it \
  --repo-type model \
  --local-dir "$HOME/Library/Application Support/izwi/models/Gemma-3-1b-it"
```

Example for Linux:

```shell
huggingface-cli download google/gemma-3-1b-it \
  --repo-type model \
  --local-dir "$HOME/.local/share/izwi/models/Gemma-3-1b-it"
```

### Step 3: Verify the Download

```shell
izwi list --local
```

The model should appear in the list.
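You can also sanity-check the downloaded directory itself. A sketch, assuming a typical Hugging Face checkpoint layout (a `config.json` plus `.safetensors` weight shards; your model may ship different files):

```shell
# Verify that a downloaded model directory contains the files a typical
# Hugging Face checkpoint needs: a config.json and at least one
# .safetensors weight shard. Adjust for your model's actual file list.
check_model_dir() {
  dir="$1"
  [ -f "$dir/config.json" ] || { echo "missing config.json" >&2; return 1; }
  ls "$dir"/*.safetensors >/dev/null 2>&1 || { echo "missing .safetensors weights" >&2; return 1; }
  echo "model directory looks complete"
}
```

For example: `check_model_dir "$HOME/.local/share/izwi/models/Gemma-3-1b-it"`.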
### Step 4: Load the Model

```shell
izwi models load <model-name>
```

## Downloading Specific Files
If you only need certain files (e.g., to save space):
```shell
huggingface-cli download <repo-id> \
  --include "*.safetensors" "*.json" \
  --local-dir "<path>"
```

## Resuming Interrupted Downloads
The Hugging Face CLI automatically resumes interrupted downloads. Just run the same command again.
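Because re-running the same command is safe, a flaky connection can be handled with a simple retry loop. A sketch (the attempt limit and delay here are arbitrary choices, not CLI features):

```shell
# Re-run a command until it succeeds, up to a fixed number of attempts.
# Safe to wrap around huggingface-cli download, which resumes partial files.
retry_download() {
  attempt=1
  until "$@"; do
    if [ "$attempt" -ge 3 ]; then
      echo "giving up after $attempt attempts" >&2
      return 1
    fi
    attempt=$((attempt + 1))
    echo "download interrupted; retrying (attempt $attempt of 3)..." >&2
    sleep 2
  done
}

# Example:
# retry_download huggingface-cli download <repo-id> --local-dir "<path>"
```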
## Using a Custom Cache Directory

By default, Hugging Face caches downloads in `~/.cache/huggingface/`. To use a different location:

```shell
export HF_HOME=/path/to/cache
huggingface-cli download <repo-id> --local-dir "<path>"
```

## Common Models Requiring Manual Download
| Model | Repository | Notes |
|---|---|---|
| Gemma 3 1B | `google/gemma-3-1b-it` | Requires license acceptance |
| Llama 3 | `meta-llama/Llama-3-*` | Requires license acceptance |
See the model-specific guides for details.
## Troubleshooting

### "Access denied" or "401 Unauthorized"

- Ensure you're logged in:

  ```shell
  huggingface-cli whoami
  ```

- Check that you've accepted the model's license on Hugging Face
- Verify your token has read permissions
### Download is very slow

Try using `hf_transfer` for faster downloads:

```shell
pip install hf_transfer
export HF_HUB_ENABLE_HF_TRANSFER=1
huggingface-cli download <repo-id> --local-dir "<path>"
```

### Model not detected by Izwi
- Verify the model is in the correct directory
- Check the folder name matches what Izwi expects
- Restart the Izwi server:

  ```shell
  izwi serve
  ```
### Disk space issues

Check available space before downloading:

```shell
df -h
```

Large models can be 10+ GB. Ensure you have enough free space.
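The check can be scripted so a download aborts early instead of filling the disk. A minimal sketch using POSIX `df -P -k` output (the GB figure comes from the model card, not from Izwi):

```shell
# Return success if the filesystem holding $1 has at least $2 GB free.
# Uses POSIX df -P -k so the output columns are predictable.
has_free_space() {
  dir="$1"
  need_gb="$2"
  avail_kb=$(df -P -k "$dir" | awk 'NR==2 {print $4}')
  [ "$avail_kb" -ge $((need_gb * 1024 * 1024)) ]
}

if has_free_space "$HOME" 10; then
  echo "enough room for a ~10 GB model"
else
  echo "free up disk space before downloading" >&2
fi
```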