As the use of large language models (LLMs) grows, managing them efficiently becomes essential, especially when working across multiple systems. With Ollama, the challenge often lies in ensuring that your model files are consistently available without having to repeatedly download them on each system. This guide explains how to save your Ollama model files and transfer them across systems by setting a custom storage location with the `OLLAMA_MODELS` environment variable, copying the files manually, and verifying compatibility between systems.
Ollama saves its downloaded models on disk by caching them in a designated directory. The location of these files can differ based on your operating system:
- **macOS:** `~/.ollama/models`, or within `~/Library/Application Support/Ollama/`
- **Linux:** `~/.ollama/models`, or similar hidden directories
- **Windows:** `C:\Users\%username%\.ollama\models`, or within the AppData directory

These directories contain several important files, including both the blobs (which hold the bulk of the model data) and the manifests (which provide the necessary metadata). A clear understanding of these components is key to ensuring that the models operate correctly when moved to another system.
Blobs are binary files that store the actual data and weights of the model. Since these files can be quite large, it is crucial to maintain their integrity during the transfer process. A typical blob file may span several gigabytes depending on the model complexity.
Manifests are metadata files that accompany the blob files. They contain important information such as file hashes, versions, and configuration details. The manifest must not be altered during the transfer, as it ensures that Ollama recognizes and correctly utilizes the model on any system.
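To make the layout concrete, the sketch below recreates the typical structure in a throwaway directory; the model name, tag, and digest are placeholders, not files taken from a real installation:

```shell
# Recreate the typical model-store layout in a temp directory:
#   models/
#   +-- blobs/       sha256-named weight files, often several GB each
#   +-- manifests/   per-model metadata keyed by registry/model/tag
store="$(mktemp -d)/models"
mkdir -p "$store/blobs" "$store/manifests/registry.ollama.ai/library/llama3/latest"
touch "$store/blobs/sha256-0123abcd"   # stands in for a multi-gigabyte blob
ls "$store"                            # lists blobs and manifests
```

Keeping both subdirectories together is what matters: a blob without its manifest (or vice versa) will not be recognized.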
One of the best strategies to facilitate model reuse across multiple systems is to define a custom storage path for your Ollama models using the `OLLAMA_MODELS` environment variable. This approach not only standardizes the file location but also simplifies the process of transferring models.
To set the custom storage location on Windows, open the system's Environment Variables dialog, create a new user variable named `OLLAMA_MODELS`, and assign it a path, for example, `D:\OllamaModels`.
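For reference, the same variable can also be created from a Windows Command Prompt instead of the GUI; the path below is the same example value, and `setx` only affects newly opened sessions:

```
:: Persist OLLAMA_MODELS for the current user (Windows Command Prompt)
setx OLLAMA_MODELS "D:\OllamaModels"
:: Open a new terminal and restart Ollama so the change takes effect
```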
Setting the environment variable on Linux can be achieved through shell configuration or by updating the Ollama service:
```shell
# For a temporary setting in the current terminal session
export OLLAMA_MODELS=/path/to/new/location

# For a persistent setting, add the line above to your ~/.bashrc or ~/.profile
```
Additionally, if Ollama is running as a service, update its configuration with:
```shell
# Open the systemd service override file
sudo systemctl edit ollama.service

# Add the following line under the [Service] section:
Environment="OLLAMA_MODELS=/path/to/new/location"

# Reload and restart the service:
sudo systemctl daemon-reload
sudo systemctl restart ollama.service
```
On macOS, you can set the environment variable using the terminal command:
```shell
launchctl setenv OLLAMA_MODELS /path/to/new/location
```
After setting the variable, restart Ollama to apply the new storage location. With this setup, any model you download moving forward will be saved in the specified directory.
Once you have established your custom model storage location, transferring the model files becomes a straightforward process. The following step-by-step guide outlines the necessary actions:
First, determine the current storage directory of your Ollama models. With the custom environment variable in place, this will be the folder you specified (e.g., `/path/to/new/location` on Linux or macOS, or `D:\OllamaModels` on Windows).
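On Linux or macOS, a small shell fallback resolves the active directory whether or not the variable is set, assuming the default `~/.ollama/models` location described earlier:

```shell
# Use OLLAMA_MODELS if it is set, otherwise fall back to the default path
model_dir="${OLLAMA_MODELS:-$HOME/.ollama/models}"
echo "Models are stored in: $model_dir"
```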
To prevent file corruption or issues during the copying process, make sure that Ollama is completely shut down. This ensures that no model files are actively being written to or modified during the transfer.
Using your file explorer or command line, copy the entire model directory, including both the blobs and the manifest files. For example, on Linux you might use:
```shell
cp -r /path/to/new/location /path/to/backup/location
```
If you are transferring via physical media or over a network, zip the folder first to preserve the entire structure, then copy the archive to the target system.
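On Linux or macOS, `tar` serves the same purpose as zipping while also preserving permissions. The sketch below runs against a throwaway stand-in directory rather than a real model store:

```shell
# Create a stand-in model store, then package it for transfer
src="$(mktemp -d)"
mkdir -p "$src/models/blobs" "$src/models/manifests"
tar -czf "$src/ollama-models.tar.gz" -C "$src" models

# On the target system, extract into the configured location, e.g.:
#   tar -xzf ollama-models.tar.gz -C /path/to/new/location --strip-components=1
```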
Choose your transfer method: a USB drive, a network share, or cloud storage all work, as long as the directory structure is preserved.
Before starting Ollama on the new system, ensure that the directory structure remains identical to the source system. If you are using the environment variable method, also set `OLLAMA_MODELS` on the new system to point to the transferred directory.
After placing the model files in the right location on the new system, start Ollama and run:
```shell
ollama list
```
This command should display the models you transferred. For further confirmation, test running your model:
```shell
ollama run <model_name>
```
If the model starts successfully and performs inference as expected, the transfer was successful.
Transferring models between systems like Windows and Linux may require some additional care, for example around path conventions and file permissions.
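One common snag when a store has passed through a FAT- or exFAT-formatted drive is lost ownership and permission bits; a recursive fix-up along these lines restores read access. The demo below uses a throwaway directory; in practice you would point it at your actual model path:

```shell
# Stand-in for a model store copied from Windows-formatted media
dest="$(mktemp -d)/models"
mkdir -p "$dest/blobs" "$dest/manifests"

# Restore owner read/write plus directory traversal, recursively
chmod -R u+rwX "$dest"
```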
Although copying the model files provides a robust solution, it is important to consider the following best practices for long-term management:
Ensure that `OLLAMA_MODELS` is consistently set across all systems to simplify troubleshooting.

| Step | Action | Details |
|---|---|---|
| Step 1 | Locate Model Directory | Identify the custom or default model storage path, typically `~/.ollama/models` or a custom location via `OLLAMA_MODELS`. |
| Step 2 | Shut Down Ollama | Ensure that Ollama is not running to avoid file corruption. |
| Step 3 | Copy Files | Copy both the `blobs` and `manifests` folders, preserving the file structure. |
| Step 4 | Transfer Files | Use USB, network, or cloud storage to move the files to the target system. |
| Step 5 | Set Up New System | Install Ollama and set up the same custom storage path with `OLLAMA_MODELS` on the new system. |
| Step 6 | Verify Functionality | Run `ollama list` and `ollama run <model_name>` to test the model. |
It is essential that both the source and target systems are running compatible versions of Ollama. If one system is using an older version, there might be discrepancies in how models are managed or updated. Before transferring model files, verify that both systems are updated to a version that supports the same cache and model file formats.
If you encounter issues where models are not detected on the target system:
- Verify that `OLLAMA_MODELS` is correctly set and pointing to the directory where the models reside.
- Restart Ollama and run the `ollama list` command to trigger a re-scan of the model files.
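These checks can be scripted; the sketch below prints a diagnosis using the fallback logic described earlier (`~/.ollama/models` when the variable is unset) without failing hard:

```shell
# Diagnose a transfer: is the variable set, and does the layout look right?
model_dir="${OLLAMA_MODELS:-$HOME/.ollama/models}"
echo "OLLAMA_MODELS resolves to: $model_dir"
if [ -d "$model_dir/blobs" ] && [ -d "$model_dir/manifests" ]; then
    echo "store layout looks OK"
else
    echo "blobs/ or manifests/ missing - re-check the copied files"
fi
# After fixing anything above, restart Ollama and re-run: ollama list
```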
Transferring Ollama model files without repeated downloads is not only achievable but also highly efficient with the right approach. Fundamentally, setting a custom storage location using the `OLLAMA_MODELS` environment variable simplifies managing these files across different systems. By ensuring that your models are stored in a predictable and controlled directory, you mitigate the challenges that come with repeated downloads and potential file corruption.
The process involves several key steps: identifying the proper storage location, shutting down Ollama so that no files are in use, and then carefully copying and transferring the model files to the new system. Taking careful note of the directory structure and adjusting file permissions ensures that the transferred models function as expected.
Moreover, by employing shared storage solutions such as network drives or cloud-based storage, you can further streamline the accessibility of models across multiple systems, reducing system-specific configuration and the need for redundant downloads. As best practices, ensuring compatibility between systems and regularly reviewing for updates in Ollama’s file management protocols helps maintain a robust setup over time.
Adhering to these guidelines guarantees that you will be able to set up and verify your Ollama models on any system quickly and efficiently, thereby allowing you to focus on leveraging the power of large language models rather than managing downloads and file transfers.