Ollama on the Framework Desktop thread

What models are you running? Share your configurations! Discuss!

I managed to get it to recognize the Framework Desktop GPU and load models properly with @Pattrick_Hueper’s fork here:

Running ollama in docker on our Framework Desktop using the GPU

I’m having fun experimenting!
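
In case it helps anyone replicating this, here's roughly the `docker run` I'd start from. It's a minimal sketch based on Ollama's standard AMD/ROCm container instructions, not the exact command from the fork: the image tag below is the upstream `ollama/ollama:rocm`, and you'd swap in whatever image you build from the fork if the stock one doesn't pick up the Framework Desktop GPU.

```bash
# Minimal Ollama-in-Docker run with an AMD GPU via ROCm.
# /dev/kfd is the ROCm compute interface; /dev/dri exposes the GPU render nodes.
# Swap the image tag for one built from the fork if the upstream ROCm image
# doesn't recognize the Framework Desktop GPU.
docker run -d \
  --device /dev/kfd \
  --device /dev/dri \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama:rocm
```

Once the container is up, `docker exec -it ollama ollama run <model>` pulls a model and drops you into a chat with it.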


Not really Ollama related, but the last few days I've had lots of fun using ComfyUI and generating images with different models and settings :slight_smile:


I've been mostly running Gemma 27B. Really happy with it.
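
If you want to double-check that it's actually offloading to the GPU, this is roughly what I run. The tag is a guess at which Gemma build you pulled (Ollama lists them under names like `gemma2:27b` and `gemma3:27b`), so adjust to match.

```bash
# Pull and chat with a 27B Gemma model (substitute the tag you actually use).
ollama run gemma3:27b

# In another terminal, check how the loaded model is split between GPU and CPU.
ollama ps
```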