Operational Handler
Community | Key features | Use cases | Getting started | Workflow | Examples | Contributing | Alquimia AI
Community
We are building enterprise-ready AI agents with a focus on transparency, consistency, and flexibility. Contributions from the community help ensure reliability and innovation.
Get Involved
GitHub Discussions – Join here
Slack – Community chat
Your contributions help improve AI automation for everyone.
Introduction
Alquimia Operational Handler is an advanced, event-driven platform designed to manage multi-agent LLM (Large Language Model) solutions in containerized environments. Built on Knative, it provides seamless orchestration of LLMs, intelligent memory management, context-aware prompting, and complex tool execution.
Designed for Openshift and Kubernetes, the platform offers lightweight deployment, high scalability, and native integrations with modern AI ecosystems, including Openshift AI and LangChain. It supports a diverse range of LLM providers, vector stores, and retrieval-augmented generation (RAG) strategies, making it an ideal solution for enterprises and developers building AI-powered applications.
Key Features
🧬 Event-Driven & Serverless
Built on Knative for automatic scaling and serverless execution.
Fully asynchronous to ensure optimal performance and responsiveness.
☁️ Seamless Cloud-Native Integration
Works natively on Openshift and Kubernetes.
Supports Openshift AI for direct access to deployed models.
LangChain-compatible, enabling powerful agent-driven workflows.
🦜 Flexible Multi-LLM Support
Works with major LLM providers, including:
OpenAI
Mistral
DeepSeek
Llama
📖 Advanced RAG & Vector Store Integration
Supports retrieval-augmented generation (RAG) for enhanced AI reasoning.
Compatible with vector stores like:
Qdrant
Chroma
ElasticSearch
📩 Omnichannel AI Integration
Use custom connectors or community Kamelets (Camel K) for seamless omnichannel support.
Automate AI-powered workflows across multiple communication channels.
🥷 Versatile Tool Execution
Supports server-side, client-side, and hybrid tool execution.
Context-aware execution strategies to optimize performance.
🚀 Lightweight & Production-Ready
Minimal boilerplate, enabling rapid development and deployment.
Enterprise-ready with scalability, reliability, and observability.
Why Choose Alquimia Operational Handler?
✅ Scalability – Effortlessly scale AI workflows with Knative.
✅ Flexibility – Works with multiple LLM providers, vector stores, and orchestration frameworks.
✅ Performance – Asynchronous, event-driven execution optimizes efficiency.
✅ Integration – Native compatibility with LangChain, Openshift AI, and containerized environments.
✅ Serverless Superpowers – Automatically scale workloads, reducing operational costs.
Use Cases
Multi-Agent AI Orchestration – Manage and coordinate complex LLM-driven workflows.
Enterprise-Scale Document Retrieval – Implement RAG for intelligent search and knowledge retrieval.
Omnichannel AI Automation – Deploy AI-powered solutions across multiple communication channels.
Hybrid Tool Execution – Dynamically execute AI tools across client, server, or hybrid environments.
Getting Started
Prerequisites
A running Openshift or Kubernetes cluster.
Openshift Serverless (Knative) runtime installed.
Openshift Service Mesh (Istio) for networking.
AMQ Streams (Strimzi) for event-driven messaging.
A Redis instance for memory and cache management.
A CouchDB instance to manage agent configurations.
Optional: Vector store (e.g., Qdrant, Chroma, or ElasticSearch) for RAG capabilities.
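Before installing, you can run a quick sanity check to confirm the serverless and messaging prerequisites are present and that the backing services respond. This is a minimal sketch; the hostnames, namespaces, and resource names are placeholders and will differ in your cluster.
# Knative and Strimzi operators expose these custom resources when installed
oc get knativeservings.operator.knative.dev -n knative-serving
oc get kafkas.kafka.strimzi.io --all-namespaces
# Redis should answer PONG; CouchDB exposes a health endpoint at /_up
redis-cli -h <redis-host> ping
curl -s http://<couchdb-host>:5984/_up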
Installation
Install on Openshift
Ensure you have:
The OpenShift CLI (oc) installed.
Knative support for Kafka via Strimzi (AMQ Streams).
Then, deploy the platform:
oc apply -f serving/base.yaml
oc apply -f eventing/base.yaml
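To confirm the deployment, list the Knative resources the manifests create. The exact names depend on the contents of serving/base.yaml and eventing/base.yaml, so treat this as an illustrative check rather than expected output.
# Knative Services created by the serving manifests
oc get ksvc
# Brokers and triggers created by the eventing manifests
oc get brokers,triggers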
Now you are ready to deploy your first agent.
Workflow
The proposed architecture is intended as a common framework for agents; you can adapt it to your needs. Recommended setup:
sequenceDiagram
participant CC as Client
participant AH as Hermes (Entrypoint)
participant IB as Inbound Broker
participant NB as Normalized Broker
participant CB as Classified Broker
participant AL as Alquimia Leviathan (Execution steps)
participant OB as Outbound Broker
participant SC as Slack Connector
CC ->> AH: Client sends query via Slack
AH ->> IB: Trigger agent inference
IB ->> AL: Trigger normalization Sequence
AL ->> AL: Get memory for current session
AL ->> NB: Pass normalized event
NB ->> AL: Trigger agent custom sequence
AL ->> AL: Executes classification models or LLMs with different roles
AL ->> CB: Pass classified event
CB ->> AL: Trigger Empathy Sequence
AL ->> AL: Waits for other events to complete (tool execution for example)
AL ->> AL: Select best expert profile according to context
AL ->> AL: Invokes final LLM with selected profile
AL ->> OB: Pass outbound event
OB ->> AL: Memory persistence
OB ->> SC: Send answer via Slack
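Because the brokers above are Knative Eventing brokers, each hop in this flow is a CloudEvent. The sketch below posts a hand-crafted event to an inbound broker from inside the cluster; the broker name, namespace, event type, and payload fields are illustrative assumptions, and the actual event schema is defined by your agent's eventing manifests.
# Run from a pod inside the cluster (the broker ingress is a cluster-internal service)
curl -v "http://broker-ingress.knative-eventing.svc.cluster.local/<namespace>/<inbound-broker>" \
  -X POST \
  -H "Ce-Id: $(uuidgen)" \
  -H "Ce-Specversion: 1.0" \
  -H "Ce-Type: dev.example.user.query" \
  -H "Ce-Source: slack-connector" \
  -H "Content-Type: application/json" \
  -d '{"session_id": "demo-session", "text": "Hello, agent"}'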
Examples
See our list of complete working examples here.
Integrations
You can find our custom channel integrations here, or deploy community Kamelets.
Server tools
Find all available server tools here and see how easy it is to create your own bundle.
Local development
For more information on how to set up your local development environment, see the docs here.
CLI Usage
Alquimia Operational Handler provides a CLI for managing embeddings, updating assistant configurations, and invoking AI-powered functions in your cluster.
Available CLI Operations
Install the required libraries (using a virtual environment is recommended):
pip install -r requirements.txt
Then list the available operations by running:
python main.py --help
For more detail see:
Invoke AI Functions – Functions invocation
Manage Embeddings – Embeddings operations
Configure Assistants – Configuration
Contributing
We are building an open, collaborative community. Contributions are always welcome!
If you'd like to add features, improve documentation, or suggest enhancements:
Fork the repository.
Create a new branch (git checkout -b feature-xyz).
Submit a pull request with your proposed changes.
License
Alquimia Operational Handler is open-source and available under the MIT License.