Channel: Raspberry Pi Forums

Advanced users • Re: Performance increase with LLMs running in Ollama by overclocking?

Heat, CPU, and DRAM limited.

There is a distributed Ollama that can run on multiple Pis.
Would it be faster to use 2 x Pi 5 16GB, 4 x Pi 5 8GB, 8 x Pi 5 4GB, or 16 x Pi 5 2GB?

The bigger the model, the slower things are.
Smaller models are faster but hallucinate more.
However, AI has been advancing so quickly that today's small models are as good as bigger models from a few months ago.
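The "bigger model = slower" point has a simple back-of-envelope explanation: token generation on a CPU is usually memory-bandwidth bound, since every weight has to be read from DRAM for each generated token. A rough sketch, where the bandwidth figure is an assumed LPDDR4X peak (real-world Pi 5 numbers will be lower) and the model sizes are illustrative quantized-file sizes:

```python
# Rough decode-speed estimate: tokens/sec <= memory bandwidth / model size,
# because each token requires streaming all weights from DRAM once.
# PI5_BANDWIDTH_GBS is an assumed peak figure, not a measured value.

PI5_BANDWIDTH_GBS = 17.0  # assumed LPDDR4X peak; sustained is lower

def est_tokens_per_sec(model_size_gb: float,
                       bandwidth_gbs: float = PI5_BANDWIDTH_GBS) -> float:
    """Upper bound on tokens/sec if every weight is read once per token."""
    return bandwidth_gbs / model_size_gb

for size_gb in (1.0, 4.0, 8.0):  # example quantized model sizes
    print(f"{size_gb:.1f} GB model: <= {est_tokens_per_sec(size_gb):.1f} tok/s")
```

This also shows why overclocking the CPU alone gives limited gains once you are DRAM limited, and why splitting a big model across several Pis only helps if the interconnect doesn't become the new bottleneck.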


Running overclocked will probably be cooling limited, as LLMs max out the CPUs.
All 4 cores at 100% while overclocked will cause heat issues.
Liquid cooling?
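For reference, Pi 5 overclock settings go in /boot/firmware/config.txt. The values below are only an example; every board silicon-lotteries differently, and sustained all-core LLM load will throttle without serious cooling:

```
# /boot/firmware/config.txt -- example Pi 5 overclock
# (illustrative values; stability varies per board)
arm_freq=2800             # CPU clock in MHz (stock is 2400)
gpu_freq=900              # GPU clock in MHz
over_voltage_delta=50000  # extra core voltage in microvolts
```

Watch `vcgencmd measure_temp` and `vcgencmd get_throttled` under load; if the SoC hits the thermal limit it will clock back down and the overclock gains you nothing.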

Statistics: Posted by Gavinmc42 — Tue Feb 25, 2025 2:48 am


