FAQs
Chatbot use
Before the era of generative AI, most chatbots were rule-based, i.e. they worked according to predefined rules and decision trees. Natural conversations required NLP combined with extensive, manually maintained vocabularies, so the effort involved in broader chatbot applications was correspondingly high. With generative AI, the effort for designing conversation rules and building the necessary vocabulary can be reduced by up to 90%. The performance of AI chatbots has made a quantum leap compared to their predecessors. Hybrid chatbots combine the rule-based approach with artificial intelligence, further increasing chatbot accuracy.
Chatbots are traditionally used to automate customer service. Website chatbots are available around the clock and can answer simple, repetitive customer queries independently. In a further expansion stage, they can automate additional workflows, such as assigning internal service tickets. While an interaction with a service employee costs between 6 and 12 dollars, a chatbot interaction costs only 0.50 to 1 dollar.* The new possibilities of artificial intelligence are also expanding the practical fields of application for chatbots. Chatbot solutions for internal knowledge management or the optimization of IT help desks are becoming increasingly important.
*99 Firms: Chatbot Statistics
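As a back-of-the-envelope illustration of the potential savings based on the cost ranges cited above (the interaction volume and automation share below are assumptions for illustration, not figures from the source):

```python
# Rough savings estimate from the cited cost ranges.
# Monthly volume and automation share are illustrative assumptions.
HUMAN_COST_PER_INTERACTION = 9.00   # midpoint of the cited $6-12 range
BOT_COST_PER_INTERACTION = 0.75     # midpoint of the cited $0.50-1 range

monthly_interactions = 5_000        # assumed service volume
automation_share = 0.6              # assumed share handled by the chatbot

automated = monthly_interactions * automation_share
monthly_savings = automated * (HUMAN_COST_PER_INTERACTION - BOT_COST_PER_INTERACTION)

print(f"Estimated monthly savings: ${monthly_savings:,.0f}")
# -> Estimated monthly savings: $24,750
```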
The biggest risk when using chatbots is customer dissatisfaction. If incorrect answers are generated, or the customer service journey does not allow a handover from the chatbot to a human agent in difficult cases, service quality decreases. It is therefore important that companies create positive experiences with their chatbot deployment and are able to identify and immediately rectify shortcomings in automated service communication. It goes without saying that compliance with data protection regulations and the legally compliant use of artificial intelligence must be guaranteed.
The success of chatbot deployments can be measured using a variety of key performance indicators (KPIs). The most important KPIs include (a small calculation sketch follows the list):
- Accuracy rate: Percentage of answers that are correct and satisfactory.
- Error rate: Percentage of answers that are incorrect or unhelpful.
- Engagement rate: Percentage of users who interact with the chatbot compared to the total number of visitors.
- Total resolution rate: Proportion of user queries that are fully resolved by the chatbot without the need to be forwarded to a human employee.
- Escalation rate: Percentage of conversations that need to be forwarded to human employees.
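A minimal sketch of how these KPIs could be computed from conversation logs. The log format and field names are assumptions for illustration, not an actual aiStudio schema:

```python
# Compute the KPIs listed above from a set of logged conversations.
# Field names (answer_count, resolved, escalated, ...) are illustrative.
def chatbot_kpis(conversations: list[dict], total_visitors: int) -> dict:
    total = len(conversations)
    answers = sum(c["answer_count"] for c in conversations)
    correct = sum(c["correct_answers"] for c in conversations)
    incorrect = sum(c["incorrect_answers"] for c in conversations)
    resolved = sum(1 for c in conversations if c["resolved"])
    escalated = sum(1 for c in conversations if c["escalated"])
    return {
        "accuracy_rate": correct / answers,
        "error_rate": incorrect / answers,
        "engagement_rate": total / total_visitors,
        "total_resolution_rate": resolved / total,
        "escalation_rate": escalated / total,
    }
```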
We are one of the few platforms to offer hybrid chatbots, i.e. we further increase the potential of generative AI by integrating individual vocabularies and ontologies. In contrast to ChatGPT agencies that customize an LLM for individual applications, we provide customers with a sustainable platform solution that transparently maps chatbot interactions and supports the independent further development and management of chatbots through an easy-to-use interface for prompt engineering and various LLM connections.
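As a rough illustration of the hybrid principle (purely schematic; the names and rules below are assumptions, not the Kauz.ai API), deterministic rules can be tried first, with the LLM as a generative fallback:

```python
# Schematic hybrid routing: rule-based layer first, generative fallback second.
# Names, rules and signatures are illustrative only.
RULES = {
    "opening hours": "We are available Mon-Fri, 9:00-17:00.",
    "cancel contract": "Please use the cancellation form on our website.",
}

def answer(user_message: str, llm_generate) -> str:
    text = user_message.lower()
    # 1. Rule-based layer: exact, controlled answers for known intents.
    for trigger, reply in RULES.items():
        if trigger in text:
            return reply
    # 2. Generative layer: the LLM handles everything not covered by rules,
    #    ideally grounded in the curated vocabulary/ontology.
    return llm_generate(user_message)
```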
We offer chatbot solutions in various levels of complexity. Website chatbots for small and medium-sized enterprises can go live within two weeks.
We currently offer a chatbot-as-a-service for energy suppliers. Historically, we have many years of experience in chatbot development for insurance companies and financial service providers, and an as-a-service offering is being planned.
Yes, chatbots for product advice can also be easily configured with the aiStudio. Multimedia product information can be added to the corresponding texts in the Multimedia Center.
Technology approach
Kauz.ai stands for maximum transparency and AI control: With aiStudio, you can track chatbot interactions and intervene at any time. The aiStudio can be used by business users. AI expert tools enable advanced users to fine-tune their chatbots.
With our focus on data management, content management and AI control features, we ensure that chatbot content remains convincing and secure throughout the chatbot lifecycle.
We are LLM-agnostic and select the most suitable model for each application, working with both proprietary and open-source LLMs. In addition to quality criteria, cost considerations play a decisive role.
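A minimal sketch of what an LLM-agnostic setup can look like in code (the provider names and selection criteria are assumptions for illustration):

```python
# Provider-agnostic model selection: calling code depends only on a common
# interface, so proprietary and open-source LLMs are interchangeable.
from typing import Protocol

class LLMClient(Protocol):
    def generate(self, prompt: str) -> str: ...

def pick_model(clients: dict[str, LLMClient],
               needs_on_premise: bool, quality_tier: str) -> LLMClient:
    # Illustrative selection logic: hosting constraints, quality and cost
    # decide which backend serves a given application.
    if needs_on_premise:
        return clients["open_source_self_hosted"]
    if quality_tier == "high":
        return clients["proprietary_flagship"]
    return clients["cost_efficient_default"]
```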
Of course, we do not use any customer data to train the LLMs we work with, and we ensure that the LLM providers cannot do so either, for example by hosting chatbot instances in European data centers. For maximum security, you can also run our aiStudio on-premise.
We offer API interfaces for implementing various system integrations. In aiStudio, developers can test configurations and chunking strategies for specialized chatbot development. Our partnerships in the LLM sector give developers a state-of-the-art development environment for individualized LLMs. We work with LangChain for chatbot workflows.
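For example, chunking strategies can be compared in a few lines with LangChain's text splitters. The file name and chunk sizes below are arbitrary example values, not recommended settings:

```python
# Compare two chunking configurations for retrieval using LangChain's
# RecursiveCharacterTextSplitter. Chunk sizes are example values only.
from langchain_text_splitters import RecursiveCharacterTextSplitter

document = open("product_faq.txt", encoding="utf-8").read()  # any source text

for chunk_size, overlap in [(500, 50), (1000, 100)]:
    splitter = RecursiveCharacterTextSplitter(chunk_size=chunk_size,
                                              chunk_overlap=overlap)
    chunks = splitter.split_text(document)
    print(f"chunk_size={chunk_size}: {len(chunks)} chunks, "
          f"avg {sum(map(len, chunks)) / len(chunks):.0f} characters")
```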
Our chatbot platform aiStudio is of course GDPR-compliant. As a matter of principle, personal data is not collected or stored in chatbot conversations.
We consider transparency of chatbot behavior to be key on several levels. For chatbot users, the traceability of information creates trust and is supported in aiStudio by source references. The Conversation Viewer feature gives organizations insight into the number and quality of chatbot dialogues. With the Content Management module, topic catalogs can be created and questions and answers can be formulated specifically, so that AI chatbots can be controlled in a targeted manner.
Yes, source references are supported in aiStudio.
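To illustrate how source references might accompany a generated answer (the structure and field names below are assumptions for illustration, not the actual aiStudio response format):

```python
# Illustrative structure of a chatbot answer with source references;
# not the actual aiStudio schema. URLs are placeholders.
answer_with_sources = {
    "answer": "You can cancel your contract online with four weeks' notice.",
    "sources": [
        {"title": "Terms of Service, section 7",
         "url": "https://example.com/terms#cancellation"},
        {"title": "Cancellation FAQ",
         "url": "https://example.com/faq/cancellation"},
    ],
}
```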
Commercial
We offer different chatbot models. Fully functional website chatbots are available for as little as 499 euros.
Our chatbot platform can be used across all industries. We also offer industry-specific chatbot-as-a-service solutions, for example for the energy sector and for finance and insurance. Partner companies can adapt our platform or complete solutions for their customers. Kauz.ai offers various partner models.