With Dataiku Answers, teams can select their large language model (LLM) of choice, feed it their organisation’s proprietary data through RAG techniques, and build tailored AI chatbots.
Dataiku, a platform for Everyday AI, has announced Dataiku Answers, a new way for data teams to build Generative AI-powered chat applications using retrieval-augmented generation (RAG) at enterprise scale.
Sophie Dionnet, Global VP of Product and Business Solutions at Dataiku, said, “Every organisation can and should be using Generative AI to streamline operations and work smarter, and data leaders need to be able to build these applications with the right level of transparency and reliability to mitigate risk at the right speed. With Dataiku Answers, we take the conversational experience of ChatGPT and the accuracy afforded by RAG to equip data leaders with the enterprise-grade security, control, and visibility required for smart and responsible innovation.”
“Data leaders have been asking how they can deploy RAG-powered chatbots more easily throughout their organisations. Dataiku Answers already addresses this need for more than 20 global retailers, manufacturers, and other organisations. In a matter of a few weeks, we have seen them meet a variety of needs from opening a company-wide LLM chat to their corporate documents down to creating domain-specific chatbots for investor relations, procurement, sales, and other teams. Employees can ask questions just as they would of ChatGPT and feel confident they are getting reliable responses. Meanwhile, data teams get complete visibility and control over usage, costs, and quality. Everyone wins,” Dionnet added.
With Dataiku Answers, teams can select their large language model (LLM) of choice, feed it their organisation’s proprietary data through RAG techniques, and build tailored AI chatbots for departments across the business.
Because Dataiku Answers sits on top of the Dataiku LLM Mesh framework, data teams can connect to preferred LLM vendors such as Anthropic, AWS Bedrock, Azure, Databricks, Google Vertex, OpenAI, and more, as well as vector stores like Pinecone, to build their AI chatbots. Alternatively, they can use self-hosted LLMs. From there, they can easily build RAG pipelines to give the chatbot access to proprietary content so that its answers are accurate and tailored to the organisation. The Dataiku LLM Mesh has dedicated components for AI service routing, personally identifiable information (PII) screening, LLM response moderation, and performance and cost tracking, allowing for trusted GenAI deployment.
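To make the retrieve-then-generate pattern behind such pipelines concrete, here is a minimal, generic sketch in Python. It is not Dataiku code: the bag-of-words “embedding”, the in-memory document list, and the printed prompt are toy stand-ins for the embedding model, vector store (such as Pinecone), and LLM vendor that a production pipeline would call through the LLM Mesh.

```python
# Generic sketch of a RAG flow: index documents, retrieve the passages most
# relevant to a question, and assemble a grounded prompt for an LLM.
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": a bag-of-words vector. Real pipelines use a neural
    # embedding model and store the vectors in a vector database.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(question: str, docs: list[str], k: int = 2) -> list[str]:
    # Return the k documents most similar to the question.
    q = embed(question)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(question: str, context: list[str]) -> str:
    # The retrieved passages ground the model's answer in proprietary content.
    joined = "\n".join(f"- {c}" for c in context)
    return f"Answer using only the context below.\nContext:\n{joined}\nQuestion: {question}"

docs = [
    "Travel expenses above 500 EUR require manager approval.",
    "The procurement team reviews all vendor contracts quarterly.",
    "Office hours are 9 to 5 on weekdays.",
]
question = "Who approves large travel expenses?"
prompt = build_prompt(question, retrieve(question, docs))
print(prompt)  # In production, this prompt would be sent to the chosen LLM.
```

The design point the sketch illustrates is the same one the article describes: the chatbot’s answers stay accurate and organisation-specific because the LLM only responds from retrieved proprietary content, while the surrounding platform layer handles routing, PII screening, moderation, and cost tracking.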