Blog
The technology behind the chat — Talk to your systems and get advice
Published on
August 14, 2024

Productivity is the most important measure in manufacturing, and generative AI (GenAI) has opened up new ways to improve it. Using this technology, operators and managers can now interact with their systems through a chat interface in oee.ai. In this article, we explore the technology that makes this remarkable capability possible.
Open source large language model as a core
The core of the oee.ai chat feature is a large language model (LLM). A large language model is an advanced AI system trained on huge amounts of text data to understand and generate human-like language. It predicts and creates text based on patterns learned during training, enabling tasks such as answering questions and holding conversations. These models are built using deep learning techniques, in particular neural networks with billions of parameters.
LLMs are available in various configurations. oee.ai has opted for an open-source model hosted on its own servers to ensure maximum data protection for our customers. No data leaves our infrastructure, and everything is subject to European data protection laws.
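To make this concrete, here is a minimal sketch of how a self-hosted open-source LLM is typically addressed. oee.ai does not disclose its serving stack; many self-hosted servers (for example vLLM or Ollama) expose an OpenAI-compatible chat endpoint, and the URL and model name below are illustrative assumptions, not oee.ai's actual configuration.

```python
import json

# Hypothetical internal endpoint of a self-hosted, OpenAI-compatible
# LLM server. The URL and model name are illustrative assumptions.
LLM_ENDPOINT = "https://llm.internal.example/v1/chat/completions"

def build_chat_request(question: str, model: str = "open-source-llm") -> str:
    """Build the JSON body for a chat completion request."""
    payload = {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "You are an assistant for manufacturing analytics."},
            {"role": "user", "content": question},
        ],
        "temperature": 0.2,  # a low temperature favors factual answers
    }
    return json.dumps(payload)

# This body would be POSTed to LLM_ENDPOINT inside the company network,
# so no data ever leaves the infrastructure.
body = build_chat_request("How was OEE on the late shift yesterday?")
```

Because the endpoint lives on internal servers, the same request shape works without any data leaving European jurisdiction.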
Agents get the job done
Agents are autonomous programs that make decisions and take action to achieve specific goals. They can act independently or interact with other agents and systems, often using AI to adapt to changing conditions. Agents are often used in simulations, robotics, and AI applications such as virtual assistants.
In oee.ai, agents interact with three other infrastructure components: the machine data as the core of oee.ai, a vector database for querying specific documents using retrieval-augmented generation (RAG), and a relational database that stores the history of previous conversations.
Agents are the coordinators of the tools needed to answer the user's question.
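This coordination can be sketched in a few lines. The tool names and canned answers below are hypothetical placeholders, and the naive keyword check merely stands in for the LLM's actual tool-selection step.

```python
# Hypothetical tools standing in for the real oee.ai components:
# the machine-data API and the RAG document store.
def query_machine_data(question: str) -> str:
    return "OEE was 78% on the late shift."  # placeholder answer

def search_documents(question: str) -> str:
    return "To improve OEE, reduce changeover times."  # placeholder answer

TOOLS = {
    "machine_data": query_machine_data,
    "documents": search_documents,
}

def route(question: str) -> str:
    """Pick a tool and run it.

    In a real agent the LLM itself decides which tool to call;
    this keyword check only stands in for that decision.
    """
    q = question.lower()
    tool = "machine_data" if "oee" in q and "improve" not in q else "documents"
    return TOOLS[tool](question)
```

A data question such as "How was OEE yesterday?" is routed to the machine-data tool, while an advice question like "What can I do to improve OEE?" goes to the document store.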

OEE data from the time series database
To operate the oee.ai chat system, the agent collects current or historical data for the connected devices by querying the relevant APIs. This data includes productivity status, shift models, and loss cause catalogs, to name just a few, all stored in oee.ai, some of it in a time series database. When a user asks a question such as “How was OEE on the late shift yesterday?”, the agent retrieves the necessary data via the API, and the LLM formulates a clear, user-friendly answer for the chat interface.
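The answer to such a question ultimately rests on the standard OEE formula: availability × performance × quality. A small sketch, with illustrative shift figures rather than real customer data:

```python
def oee(availability: float, performance: float, quality: float) -> float:
    """OEE is the product of its three factors, each between 0 and 1."""
    return availability * performance * quality

# Illustrative figures for one shift (not real customer data):
value = oee(availability=0.90, performance=0.95, quality=0.98)

# The LLM's job is then to turn the raw number into a readable reply:
answer = f"OEE on the late shift was {value:.1%}."
```

The agent supplies the three factors from the time series database; the LLM only phrases the result.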
Agentic Retrieval Augmented Generation (RAG) for the business administration context
In addition to historical and real-time machine data, users can also request business context and targeted advice from the oee.ai chatbot. While the large language model (LLM) provides general insights from its training, precise, context-specific answers, such as to the question “What can I do to improve OEE?”, draw on a vector database. This database contains carefully selected, proprietary resources, such as books and websites, to optimize the quality of the answers. It also gives the chatbot access to information that is not contained in the LLM's internet training data, such as configuration guides. This enables the chatbot to provide precise answers to specific questions such as “Where do I configure the shift model in oee.ai?”.
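The retrieval step behind RAG can be sketched minimally. A real system uses a vector database and a learned embedding model; here, bag-of-words vectors and cosine similarity stand in, and the two knowledge snippets are made up for illustration.

```python
import math
from collections import Counter

# Two made-up snippets standing in for the curated proprietary resources.
DOCS = [
    "Configure the shift model under the settings area of oee.ai.",
    "Reducing changeover times is a proven lever to improve OEE.",
]

def embed(text: str) -> Counter:
    """Toy embedding: a bag-of-words count vector."""
    return Counter(text.lower().replace("?", " ").split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def retrieve(question: str) -> str:
    """Return the stored document most similar to the question."""
    q = embed(question)
    return max(DOCS, key=lambda d: cosine(q, embed(d)))
```

The retrieved text is then handed to the LLM as context, which is what grounds the chatbot's answer in the curated material rather than in its training data alone.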
One fascinating aspect of this system is its ability to interact with data in just about any language. Whether it is production data or complex documents such as books, the chatbot can easily process inquiries in one language even if the source material is in another. It is like writing a book in German and then querying it in Spanish or French. This remarkable ability illustrates how advanced modern AI has become.
Conversational memory using a relational database
Although LLMs have extensive knowledge from the internet, they still need conversational memory to interact effectively with users. This memory is managed by a relational database, which stores both current and past conversations and allows the chatbot to maintain context throughout the dialogue. For example, if a user mentions the name of a machine at the start of the chat, they can ask follow-up questions later in the conversation without having to name the machine again. This reflects the natural flow of human conversation, in which context is remembered.
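A minimal sketch of such memory, using an in-memory SQLite database; the table and column names are illustrative assumptions, not oee.ai's actual schema.

```python
import sqlite3

# Illustrative schema for conversational memory (not oee.ai's schema).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE messages (
        conversation_id TEXT,
        role            TEXT,
        content         TEXT
    )
""")

def remember(conversation_id: str, role: str, content: str) -> None:
    conn.execute(
        "INSERT INTO messages (conversation_id, role, content) VALUES (?, ?, ?)",
        (conversation_id, role, content),
    )

def history(conversation_id: str) -> list:
    """Fetch the dialogue so far, to be prepended to the next LLM prompt."""
    rows = conn.execute(
        "SELECT role, content FROM messages WHERE conversation_id = ? ORDER BY rowid",
        (conversation_id,),
    )
    return list(rows)

remember("c1", "user", "Let's talk about packaging line 3.")
remember("c1", "assistant", "Sure, what would you like to know about line 3?")
# A later question like "How was its OEE yesterday?" is answerable
# because history("c1") travels with every new prompt to the LLM.
```

Sending the stored history along with each new question is what lets the model resolve references such as "the machine" or "it" across turns.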
How to access the chatbot from oee.ai
In the 21st century, apps have become one of the most common ways people interact with technology. To meet this expectation, the oee.ai chatbot is accessible via dedicated iOS and Android apps, available in the respective app stores. In addition, oee.ai offers an integrated chat interface within the web app, allowing easy access directly from the usual work environment.
Continue to push the limits of what is possible
When employees interact directly with machine data, access to productivity information becomes significantly easier. This direct access gives them a greater sense of control and responsibility for their work, which can increase job satisfaction and engagement. But this is just the start of what oee.ai technology has to offer. In the future, you can expect even more advanced features, such as systems that support self-repair. Stay tuned.
If you would like to learn how AI can be integrated into the workflow of your employees in the factory, feel free to contact us at info@oee.ai.
Author: Linus Steinbeck