I have a Copilot in VIM
DISCLAIMER: This post was generated using suggestions and completions from GitHub Copilot.
Jack of all trades, master of few :)
I have been looking for a way to run an LLM at home that I can use not only for AI chat but also to integrate with Home Assistant voice, keeping everything in-house. One of the biggest challenges with this is having enough hardware.
We have been on the new farm for close to 2 years now, and we have been leveraging my Mavic Mini drone for a bunch of real-time activities like checking gutters, locating livestock and getting an understanding of how water flows during the Queensland storm season.
There are two schools of thought on how to track container artifacts that have made their way through your quality gates and into production, where they can be consumed by your end users.
When we moved to the new house, one of the things we did was step up our home automation from Google Home/Action Blocks to Home Assistant.
When we made the move further out of town, the options for internet connectivity were limited, so once we had a property address I signed up for the Starlink preorder.
As container runtime and image standards have evolved, driven by projects like the Open Container Initiative (OCI), there has been an increase in the number of use case specific tools being developed, both FOSS and commercial.
The internet can be a scary place, especially when you have kids.
Recently Red Hat released an article about Podman’s machine function and how it can be leveraged on macOS.
Thanks to the introduction of the Kubernetes CRI, additional runtimes now exist for running containers within the Kubernetes orchestration ecosystem.
Well, after a break I thought it was about time that I started playing with things around the farm.