Unlock AI Power: Run DeepSeek Locally on OpenMediaVault with Docker Compose
Deploy DeepSeek on your OMV server with Docker Compose, Ollama, and Open WebUI, keeping your data entirely local.
Introduction
OpenMediaVault (OMV) is a versatile, open-source NAS platform perfect for self-hosted workflows. By combining it with Docker, you can deploy powerful tools like DeepSeek—a state-of-the-art AI model optimized for coding, mathematics, and logical tasks—directly on your hardware. This guide walks you through running DeepSeek on OMV using Docker Compose, Ollama (for local AI management), and Open WebUI (a user-friendly interface), bypassing cloud dependencies and keeping your data private.
Why Host DeepSeek on Your OMV Server?
- Full Data Control:
 Avoid sending sensitive code, financial data, or proprietary information to third-party servers.
- Offline Accessibility:
 Use DeepSeek even without an internet connection—ideal for labs or remote workspaces.
- Hardware Flexibility:
 Scale resources based on your needs. Run smaller models on modest hardware or leverage GPU acceleration for complex tasks.
Step-by-Step Installation Guide
Prerequisites
- OpenMediaVault (v6.x+) with OMV-Extras, Docker, and the Compose plugin installed.
- A shared folder for Docker data (e.g., docker).
- At least 8GB of RAM (16GB+ recommended for larger models).
Step 1: Prepare Docker Directories
- In the OMV web interface, go to Storage > Shared Folders.
- Create subfolders for Ollama and Open WebUI under your existing Docker shared folder:
  - ollama (to store AI models)
  - open-webui (to store UI settings and chat history)
Step 2: Create the Docker Compose File
- Navigate to Services > Compose > Files in the OMV interface.
- Click Create, name the file deepseek.yml, and paste the following configuration:
version: "3.8"
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      - /sharedfolders/docker/ollama:/root/.ollama  # Use your shared folder path
    restart: unless-stopped
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "8080:8080"
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    volumes:
      - /sharedfolders/docker/open-webui:/app/backend/data
    depends_on:
      - ollama
    restart: unless-stopped
Notes:
- Replace /sharedfolders/docker with your actual Docker shared folder path (visible under Storage > Shared Folders).
- The ollama service hosts the AI models, while open-webui provides the chat interface.
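If your OMV box has an NVIDIA GPU, you can let Ollama use it for acceleration. This is an optional sketch that assumes the NVIDIA Container Toolkit is already installed on the host (not covered by this guide); CPU-only setups should skip it. Extend the ollama service in deepseek.yml like so:

```yaml
# Optional: GPU access for the ollama service.
# Assumes the NVIDIA Container Toolkit is installed on the OMV host.
services:
  ollama:
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all          # or a number, e.g. 1
              capabilities: [gpu]
```

Without a GPU, Ollama falls back to CPU inference automatically, so the base configuration above works either way.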
Step 3: Deploy the Stack
- In the OMV Compose plugin interface, select the deepseek.yml file.
- Click Up to start the containers.
- Verify the status under Services > Compose > Containers; both ollama and open-webui should show as “Running.”
Step 4: Access Open WebUI
- Open http://<your-omv-ip>:8080 in a browser.
- Create an admin account (disable public sign-ups later via Settings > Authentication).
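If you prefer to lock down registration from the compose file rather than the web UI, Open WebUI reads an ENABLE_SIGNUP environment variable (verify against the Open WebUI documentation for your image version):

```yaml
  open-webui:
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
      - ENABLE_SIGNUP=false  # existing accounts keep working; new registration is blocked
```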
Step 5: Download the DeepSeek Model
- Option 1: Via OMV’s CLI (SSH or Services > SSH > Terminal):
docker exec -it ollama ollama pull deepseek-coder:33b
- Option 2: Via Open WebUI:
  - Go to the Models tab.
  - Search for deepseek-coder:33b and click Download.
Step 6: Start Using DeepSeek
Select the deepseek-coder:33b model from the dropdown and ask questions like:
- “Write a script to back up my OMV shared folders to an external drive.”
- “Optimize this Python code for memory efficiency.”
Conclusion & Pro Tips
You’ve now transformed your OpenMediaVault server into a private AI workstation. To optimize your setup:
- Integrate with OMV Workflows:
 Use Ollama’s API (port 11434) to connect DeepSeek to automation scripts or cron jobs.
- Monitor Resource Usage:
 Check OMV’s System > Performance Statistics to ensure RAM/CPU stays within limits.
- Update Containers:
 Periodically refresh your Docker images via the Compose plugin: click Pull for the deepseek.yml stack, then Down and Up to restart.
- Experiment with Models:
 Try smaller models like deepseek-coder:6.7b for faster responses or quantized versions for lower RAM usage.
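As a sketch of the API integration tip above: Ollama exposes a JSON endpoint, /api/generate, on port 11434. The snippet below builds such a request using only Python's standard library; the model name, prompt, and host are placeholders you should adapt to your setup.

```python
import json
import urllib.request

# Assumption: Ollama is reachable at the default host/port from the compose file.
OLLAMA_URL = "http://localhost:11434"

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return a single JSON object instead of a stream
    }).encode("utf-8")
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

req = build_generate_request("deepseek-coder:6.7b",
                             "Write a bash one-liner to list the 10 largest files.")
# To actually send it (requires the ollama container to be running):
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```

A cron job could wrap this in a small script, for example to summarize nightly log files and write the answer to a shared folder.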
Final Checklist:
- [ ] Verify shared folder permissions match Docker’s requirements.
- [ ] Set up OMV’s firewall (Network > Firewall) to restrict external access to port 8080 if needed.
- [ ] Explore Open WebUI’s prompt templates for coding or math-specific tasks.
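To verify the firewall item on the checklist, a quick TCP probe tells you whether port 8080 is reachable from a given vantage point. This sketch uses only the Python standard library; the IP address in the example is a placeholder for your OMV server.

```python
import socket

def port_reachable(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (placeholder IP): run this from inside your LAN to confirm Open WebUI
# is up, and from outside it to confirm the firewall blocks the port.
# print(port_reachable("192.168.1.50", 8080))
```

A True result from outside your network means port 8080 is exposed and the firewall rule is not taking effect.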
With DeepSeek running locally on OMV, you’re equipped to tackle complex challenges while maintaining full ownership of your data. Happy coding!