Categories: LLM, Ollama

Automating Ollama Updates in Docker

Managing local LLMs can be a pain, especially when it comes to keeping everything updated. That's why I created a shell script that handles it all: pulling the latest images, managing containers, and updating models.

Stopping containers, pulling new images, and restarting everything by hand gets old fast. It's a time sink, and it disrupts the flow. So I decided to automate the whole process.

The Solution: A Shell Script

My script takes care of:

  • Auto-Updates: Pulls the latest Ollama and Open WebUI images.
  • Smart Container Management: Stops the old containers and restarts them with the new images.
  • Model Updates: Automatically updates all models after the containers are up.
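The three steps above can be sketched roughly as follows. This is a minimal illustration, not the actual script: the container names (`ollama`, `open-webui`), volumes, and port mappings are assumptions based on the common Docker setups for Ollama and Open WebUI, so adjust them to match your environment.

```shell
#!/usr/bin/env sh
# Sketch of the update flow: pull images, recreate containers, refresh models.
# Container names, ports, and volumes below are assumed defaults.
set -eu

# Extract model names from `ollama list` output (first column, skipping the header).
model_names() {
  awk 'NR > 1 { print $1 }'
}

update_all() {
  # 1. Auto-updates: pull the latest images.
  docker pull ollama/ollama:latest
  docker pull ghcr.io/open-webui/open-webui:main

  # 2. Container management: stop, remove, and recreate with the new images.
  docker stop ollama open-webui || true
  docker rm ollama open-webui || true
  docker run -d --name ollama \
    -v ollama:/root/.ollama -p 11434:11434 ollama/ollama:latest
  docker run -d --name open-webui \
    -p 3000:8080 -v open-webui:/app/backend/data \
    --add-host=host.docker.internal:host-gateway \
    ghcr.io/open-webui/open-webui:main

  # 3. Model updates: re-pull every installed model inside the container.
  docker exec ollama ollama list | model_names | while read -r model; do
    docker exec ollama ollama pull "$model"
  done
}
```

Calling `update_all` runs the whole cycle; keeping the steps in one function makes it easy to wire into a cron job or systemd timer later.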

How to Use It

Clone the repo, tweak the container settings if needed, and run the script when you want to update. That’s it—no more manual updates.
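If you want to go fully hands-off, you could schedule the script with cron. This is just a sketch; the script path and schedule are placeholders, not part of the repo:

```shell
# Hypothetical crontab entry: run the update script every Sunday at 03:00,
# appending output to a log file. Adjust the path to where you cloned the repo.
0 3 * * 0 /path/to/ollama-update.sh >> /var/log/ollama-update.log 2>&1
```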

Check It Out

Want to make your life easier? Grab the script from GitHub here. Let me know what you think!