Unleashing Kubernetes Intelligence - running k8sgpt utilizing your own fine-tuned LLM

Video: Watch Mario Fahlandt's talk at Cloud Native AI Day EU 2024

K8sGPT is a fast-rising star in the CNCF landscape. Most of us currently use it with paid AI services, which puts its capabilities out of reach for some.

Follow me down the rabbit hole: fine-tuning your own LLM with Kubeflow and taking LocalAI for a spin inside the cluster to serve that very model.
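
The talk does not prescribe a specific pipeline, but a fine-tuning run on Kubeflow Pipelines could look roughly like the sketch below. Everything in it is illustrative: the component body is a placeholder, and the model, dataset, and output names ("mistral-7b", the s3:// URIs) are hypothetical rather than values from the talk.

```python
# A minimal Kubeflow Pipelines (kfp v2) sketch of a fine-tuning pipeline.
from kfp import compiler, dsl


@dsl.component(base_image="python:3.11")
def fine_tune(base_model: str, dataset_uri: str, output_uri: str):
    # Placeholder for the real fine-tuning step, e.g. a Hugging Face Trainer
    # run over the project documentation corpus, writing the model to output_uri.
    print(f"fine-tuning {base_model} on {dataset_uri} -> {output_uri}")


@dsl.pipeline(name="llm-fine-tuning")
def llm_fine_tuning(
    base_model: str = "mistral-7b",          # hypothetical base model
    dataset_uri: str = "s3://docs/corpus",   # hypothetical documentation dataset
    output_uri: str = "s3://models/fine-tuned",
):
    fine_tune(base_model=base_model, dataset_uri=dataset_uri, output_uri=output_uri)


if __name__ == "__main__":
    # Compile to a pipeline spec that can be submitted to the in-cluster
    # Kubeflow Pipelines instance.
    compiler.Compiler().compile(llm_fine_tuning, "llm_fine_tuning.yaml")
```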

Now we have a local inference endpoint for k8sgpt, so we can scan our Kubernetes clusters and diagnose and triage issues in plain English. K8sgpt pulls out the most relevant information and enriches it with AI.
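
Wiring k8sgpt to that endpoint could look like the sketch below. It assumes the k8sgpt CLI is installed and that LocalAI is reachable at the given in-cluster URL (a hypothetical service name); the flags follow k8sgpt's localai backend support but may differ between versions.

```python
# A sketch of pointing k8sgpt at the in-cluster LocalAI endpoint.
import subprocess

LOCALAI_URL = "http://local-ai.local-ai.svc.cluster.local:8080/v1"  # assumption
MODEL_NAME = "k8s-docs-model"  # hypothetical name of the fine-tuned model

# Register LocalAI as the AI backend for k8sgpt.
subprocess.run(
    ["k8sgpt", "auth", "add", "--backend", "localai",
     "--baseurl", LOCALAI_URL, "--model", MODEL_NAME],
    check=True,
)

# Scan the cluster and let the local model explain the findings in plain English.
subprocess.run(
    ["k8sgpt", "analyze", "--explain", "--backend", "localai"],
    check=True,
)
```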

As an added benefit, we enrich the LLM during fine-tuning with our own project or open source documentation, so LocalAI acts as our very own ChatGPT that we can ask for help in our environment.
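
Because LocalAI exposes an OpenAI-compatible API, the standard openai client can talk to it directly; a minimal sketch follows, where the service URL and model name are assumptions rather than values from the talk.

```python
# A minimal sketch of using the in-cluster LocalAI service as a private ChatGPT.
from openai import OpenAI

client = OpenAI(
    base_url="http://local-ai.local-ai.svc.cluster.local:8080/v1",  # assumption
    api_key="not-needed",  # LocalAI does not require a real API key by default
)

response = client.chat.completions.create(
    model="k8s-docs-model",  # hypothetical fine-tuned model name
    messages=[
        {"role": "user", "content": "How do I configure a PodDisruptionBudget for my operator?"}
    ],
)
print(response.choices[0].message.content)
```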

The most amazing part: we can do all of this inside Kubernetes, on our own infrastructure, in our own data center or private cloud.

Speaker: Mario Fahlandt, Customer Delivery Architect at Kubermatic
