The new family of AI models can run on a smartphone, a Raspberry Pi, or a data centre, and is free to use commercially.
This first article in a series explains the core AI concepts behind running large language model (LLM) and retrieval-augmented generation (RAG) workloads on a Raspberry Pi, including why local AI is useful and what trade-offs to expect.