Running vLLM on a MacBook Air M1
I have been looking for a way to run an LLM at home that I can use not only for AI chat but also to integrate with Home Assistant's voice assistant, keeping everything in house. One of the biggest challenges with this is having enough hardware.