Google’s kubectl-ai is an open-source project that integrates Large Language Models (LLMs) such as Gemini, Bedrock, and local LLM providers like Ollama with Kubernetes operations, enabling users to interact with Kubernetes clusters using natural language.
kubectl ai --llm-provider ollama --model gemma3:4b --enable-tool-use-shim
You can also configure kubectl-ai using a YAML configuration file at ~/.config/kubectl-ai/config.yaml to persist those flags so you don’t have to retype them each time.
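As a rough sketch, a config.yaml persisting the flags from the command above might look like this (assuming the keys mirror the flag names; check your kubectl-ai version's documentation for the exact key spelling):

```yaml
# ~/.config/kubectl-ai/config.yaml (assumed key names mirroring the CLI flags)
llm-provider: ollama
model: gemma3:4b
enable-tool-use-shim: true
```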
kubectl-ai can call any binary you describe in ~/.config/kubectl-ai/tools.yaml.
cat <<EOF > ~/.config/kubectl-ai/tools.yaml
- name: helm
  description: "Helm is the Kubernetes package manager."
  command: "helm"
  is_custom: true
EOF
To enable the custom tools, you must point kubectl-ai to the directory containing the tool configuration YAML files using the --custom-tools-config flag.
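For example, an invocation wiring everything together might look like the following (a sketch assuming the flag accepts the path shown; adjust it to wherever your tool configuration lives):

```shell
# Point kubectl-ai at the custom tools configuration and a local Ollama model
kubectl-ai --llm-provider ollama --model gemma3:4b \
  --custom-tools-config ~/.config/kubectl-ai/tools.yaml \
  "list the helm releases in the default namespace"
```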