
Ollama OpenAI

Keep Alive

Use this to keep the specified LLM loaded in RAM indefinitely (keep_alive: -1):

$ curl http://0.0.0.0:11434/api/chat -d '{"model": "phi3:mini", "keep_alive": -1}'
{"model":"phi3:mini","created_at":"2024-06-23T16:37:32.29229475Z","message":{"role":"assistant","content":""},"done_reason":"load","done":true}

The "done_reason":"load" in the response confirms the model was loaded without generating anything (the request carried no messages). To reverse this and unload the model immediately, set keep_alive to 0:

$ curl http://0.0.0.0:11434/api/chat -d '{"model": "phi3:mini", "keep_alive": 0}'
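Besides -1 and 0, keep_alive also accepts a duration such as "10m" or "1h", keeping the model resident for that long after the last request. A minimal Python sketch of building the same request body the curl commands send (the helper name is my own; only the endpoint and fields come from the examples above):

```python
import json


def keep_alive_payload(model: str, keep_alive) -> str:
    """Build the JSON body for Ollama's /api/chat keep_alive control.

    keep_alive: -1 keeps the model in RAM indefinitely, 0 unloads it
    immediately, and a duration string like "10m" keeps it loaded for
    that long after the last request.
    """
    return json.dumps({"model": model, "keep_alive": keep_alive})


# Equivalent of the curl examples above; POST this body to
# http://0.0.0.0:11434/api/chat on a running Ollama server.
print(keep_alive_payload("phi3:mini", -1))
print(keep_alive_payload("phi3:mini", "10m"))
```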
