ChatGPT for
DevOps
Complex actions. Simple conversations.


How to set up
Kubiya


Action stores are serverless Python functions that power the integrations with various platforms. Choose the platforms you wish to integrate with and Kubiya will upload the relevant plugins. Need to connect homegrown platforms? Kubiya can integrate with any API- or SDK-based system.
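As a rough illustration of the idea, an action store can be thought of as a registry of small Python functions that a dispatcher routes requests to. The function names, parameters, and registry structure below are illustrative assumptions for this sketch, not Kubiya's actual plugin interface.

```python
# Hypothetical sketch of an action store: a registry of named Python
# functions that requests are dispatched to. Names and structure are
# illustrative assumptions, not Kubiya's actual plugin API.
ACTIONS = {}

def action(name):
    """Register a function under a given action name."""
    def wrap(fn):
        ACTIONS[name] = fn
        return fn
    return wrap

@action("get_deployment_logs")
def get_deployment_logs(namespace, deployment):
    # A real action would call the platform's API (e.g. Kubernetes);
    # here we return a placeholder result for illustration.
    return f"logs for {deployment} in {namespace}"

def execute(name, **params):
    """Dispatch a request to the matching registered action."""
    return ACTIONS[name](**params)
```

In this model, adding support for a homegrown platform would just mean registering another function that wraps that platform's API or SDK.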

Security is top of mind when it comes to engineering platforms. Therefore, Kubiya offers a hybrid model that connects to your internal platforms over a secure tunnel initiated from within your network.
No need to share credentials or secrets. All actions are executed from within your network.



How is AI used in Kubiya
Kubiya uses four different Large Language Models (LLMs) for various aspects of the system. Multiple LLMs might be used together to achieve optimal results, whether in understanding the context of a request or in composing a natural language response.
All LLMs are private to each customer and private data from one customer is not used to train the experience of another customer.
Conversational AI
We use an LLM to understand the context of a user's input, compose a vector, and compare it against knowledge and workflow embeddings to serve the best matches.
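The matching step can be pictured as a nearest-neighbor search over embedding vectors. The toy vectors and workflow names below are made up for illustration; a real system would use LLM-generated embeddings with many more dimensions.

```python
# Toy sketch of embedding-based matching: represent the user request and
# each workflow as vectors, then pick the workflow whose vector has the
# highest cosine similarity to the request. The vectors and workflow
# names here are invented for illustration.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Pretend embeddings for two workflows.
WORKFLOWS = {
    "get_deployment_logs": [0.9, 0.1, 0.0],
    "list_namespaces":     [0.1, 0.8, 0.2],
}

def best_match(request_vec):
    """Return the workflow whose embedding is closest to the request."""
    return max(WORKFLOWS, key=lambda w: cosine(request_vec, WORKFLOWS[w]))
```

A request embedding close to the "get deployment logs" region of the space would therefore be matched to that workflow rather than to namespace listing.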
Geof (R&D)
Get deployment logs
Kubiya.Ai
You can get deployment logs from Kubernetes by running
Get deployment logs from Kubernetes workflow
Amanda (R&D)
Get deployment logs for namespace Observability and deployment Argo
Kubiya.Ai
Wanted to quickly verify what you want me to do.
Do you want me to get deployment logs for Kubernetes,
with the following

Amanda (R&D)
Yes
Context
No two users are alike. We use LLMs to understand whether the conversation includes full or partial context (e.g., namespaces, clusters) and what other information is needed to execute the request.
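The partial-context check can be sketched as comparing the parameters found in a message against what a workflow requires, then asking for whatever is missing. The workflow name and required-parameter table below are illustrative assumptions.

```python
# Toy sketch of partial-context detection: each workflow declares the
# parameters it needs, and we report which ones the conversation has not
# yet supplied. The workflow and parameter names are invented.
REQUIRED_PARAMS = {
    "get_deployment_logs": ["namespace", "deployment"],
}

def missing_context(workflow, found_params):
    """Return the required parameters absent from the extracted params."""
    return [p for p in REQUIRED_PARAMS[workflow] if p not in found_params]
```

In the transcript above, Amanda's first message supplies both the namespace and the deployment, so nothing is missing and Kubiya only asks for confirmation; a message with just a namespace would trigger a follow-up question for the deployment.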
Natural language responses
From summarizing knowledge responses (e.g., “How do I...”) to summarizing conversation history and enriching knowledge, multiple LLM models are used to serve results in natural language.
Joe (R&D)
@kubiya Summarize the conversation for me
Kubiya.Ai
The conversation was mainly about asking questions related to Kubernetes namespaces, pods, and deployments, as well as IAM users and MFA enablement. Joe was the one asking all the questions to a user named kubiya.
Joe asked kubiya about the available Kubernetes namespaces and running pods in certain namespaces. They also asked about the number of replicas available for deployment in a specific namespace and which IAM users did not have MFA enabled.
Lastly, Joe asked kubiya to summarize the conversation so far.
To explain the conversation in a way a 5-year-old would understand, Joe was asking questions about a special computer system called Kubernetes and how many things were running on it, how many copies of things there were, and who was allowed to use it. Then they asked the computer person to tell them a short story about everything they talked about.

Generative AI
We use generative AI for prompt engineering and for the ability to create a workflow schema based on a natural language workflow description.
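To make the idea concrete, a description like "get deployment logs for a namespace" might be turned into a structured schema along these lines. The field names and step format are invented for this sketch and are not Kubiya's actual schema format.

```python
# Illustrative shape of a workflow schema that generative AI might
# produce from a natural language description. All field names here
# are assumptions made for this sketch.
schema = {
    "name": "get_deployment_logs",
    "description": "Fetch logs for a deployment in a namespace",
    "inputs": [
        {"name": "namespace", "type": "string", "required": True},
        {"name": "deployment", "type": "string", "required": True},
    ],
    "steps": [
        {
            # A step references a generic action and maps inputs to it.
            "action": "kubernetes.get_logs",
            "params": {"namespace": "{{namespace}}",
                       "deployment": "{{deployment}}"},
        },
    ],
}
```

The useful property of such a schema is that it is built from generic actions and user-supplied parameters, so generating it does not require any proprietary customer data.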
Reinforcement learning with human feedback
Kubiya learns from each user and organization interaction and optimizes its responses based on those actions. For example, if a specific user always chooses the same namespace or deployment, Kubiya learns that and over time will propose it as a default.
If one workflow is chosen over another for a given term, Kubiya learns and changes the order of its responses.
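The namespace-default example can be sketched as a simple per-user frequency count: record which value the user picks each time and propose the most common one. This heuristic is an illustration of the behavior described above, not Kubiya's actual learning algorithm.

```python
# Toy sketch of learning a per-user default from past choices. A simple
# frequency counter stands in for the real feedback-driven learning.
from collections import Counter

class PreferenceTracker:
    def __init__(self):
        self.choices = Counter()

    def record(self, namespace):
        """Record the namespace the user chose for a request."""
        self.choices[namespace] += 1

    def suggest_default(self):
        """Propose the most frequently chosen namespace, if any."""
        if not self.choices:
            return None
        return self.choices.most_common(1)[0][0]
```

After a user has picked the same namespace a few times, the tracker would surface it as the pre-filled default instead of asking again.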
What’s under the hood
Kubiya uses a number of large language models to understand conversations and context, match the right results, and respond like a “human”. All LLMs are private to each organization, with the exception of the generative AI used to create workflow schemas, which uses generic Kubiya actions and does not include any proprietary information. We match the vector for each conversation with embeddings. We use Reinforcement Learning with Human Feedback to provide a more accurate experience to organizations and users, but do not share the feedback across organizations.
To access internal platforms we use a lightweight operator. Any action that gets triggered goes through the operator, so the organization has full control over which systems the operator can interact with and what actions it can perform. The operator interacts with Kubiya using a two-way secure tunnel, inlets.
Kubiya retains data about workflows, knowledge, and executions for audit purposes. Data is encrypted at rest and in transit.
Each organization’s data is stored as a separate entity, and the embeddings can be stored in a dedicated vector database that you or Kubiya can manage.