AI21 Labs, the AI startup, has introduced Contextual Answers, a plug-and-play AI engine designed for enterprise data. The offering is a dedicated API that can be integrated into an organization's digital assets, enabling it to apply large language model (LLM) technology to its own data.
The aim of Contextual Answers is to give users a conversational experience that lets them access the information they need without navigating different teams or software systems. The technology is offered as a ready-to-use solution, eliminating the need to invest significant time and resources. Tel Delbari, who leads the API team at AI21 Labs, emphasized that the solution is simple and optimized, delivering industry-leading results without requiring AI, NLP, or data science experts.
Enterprises have been seeking ways to incorporate LLMs into their data stack following the success of ChatGPT. The usual approach of fine-tuning existing models for specific enterprise scenarios demands extensive engineering effort and may not be feasible for every company. AI21 Labs' new Contextual Answers API provides a streamlined alternative that can bring this kind of generative AI use case to life out of the box.
The implementation process is straightforward. Enterprises upload their documents to AI21 Labs' Studio using the web GUI, the API, or the SDK. Once the files are loaded, users can send questions and receive answers through the API. Delbari highlighted the API's ease of use, saying that any developer can work with it regardless of their expertise in NLP or AI.
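In practice, the workflow reduces to two HTTP calls: upload a document, then ask a question against the library. The snippet below is a minimal sketch, assuming AI21 Studio's REST endpoints for file upload and library-based answering; the exact paths, field names, and response schema should be checked against AI21's API documentation before use.

```python
import requests

API_KEY = "YOUR_AI21_API_KEY"  # issued from the AI21 Studio console
BASE_URL = "https://api.ai21.com/studio/v1"
HEADERS = {"Authorization": f"Bearer {API_KEY}"}

# 1. Upload a document to the organization's AI21 Studio library.
#    (endpoint path and form field are assumptions based on AI21 Studio docs)
with open("employee-handbook.pdf", "rb") as f:
    upload = requests.post(
        f"{BASE_URL}/library/files",
        headers=HEADERS,
        files={"file": f},
    )
upload.raise_for_status()

# 2. Ask a free-form question; the engine answers from the uploaded documents.
response = requests.post(
    f"{BASE_URL}/library/answer",
    headers=HEADERS,
    json={"question": "How many vacation days do new employees get?"},
)
response.raise_for_status()
print(response.json().get("answer"))
```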
Once the AI engine is up and running, business customers and internal employees can ask free-form questions related to internal support, policy checks, or information searches within large documents or manuals. The engine then delivers concise answers drawn from the uploaded knowledge base, which can contain both structured and unstructured information. Importantly, the model is optimized to adapt to internal jargon, acronyms, and project names, so it stays accurate and faithful to organizational data and internal language.
AI21 Labs has considered access control and data security in the design of the AI engine. The API supports access control and role-based content separation, limiting the model's usage to specific files, folders, or metadata. Data confidentiality is handled through AI21 Studio, which provides a secure, SOC 2-certified environment trusted by industries including banks and pharmaceutical companies. The engine is also available through Amazon SageMaker JumpStart and Amazon Bedrock, enabling enterprises to deploy the core capability of the product within their own virtual private clouds (VPCs).
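If a team needs to restrict which documents the engine can draw on, the same answering call can carry scoping filters. The sketch below is illustrative only: the `path` and `labels` parameters are assumptions standing in for whatever file, folder, or metadata filters the API actually exposes, so the real parameter names should be taken from AI21's API reference.

```python
import requests

API_KEY = "YOUR_AI21_API_KEY"
BASE_URL = "https://api.ai21.com/studio/v1"
HEADERS = {"Authorization": f"Bearer {API_KEY}"}

# Restrict answering to documents under a specific folder and label,
# e.g. so an HR assistant can only answer from HR-approved material.
# "path" and "labels" are hypothetical filter names used for illustration.
response = requests.post(
    f"{BASE_URL}/library/answer",
    headers=HEADERS,
    json={
        "question": "What is the parental leave policy?",
        "path": "/hr/policies",
        "labels": ["hr-approved"],
    },
)
response.raise_for_status()
print(response.json().get("answer"))
```

Scoping the query at request time, rather than maintaining separate document stores per team, is one way role-based separation of this kind is commonly wired into an internal chatbot or search tool.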
AI21 Labs’ future plans involve integrating the Contextual Answers feature into its writing platform, Wordtune. This integration will enable users to retrieve specific information quickly from uploaded documents.
Databricks and Snowflake, prominent players in the data ecosystem, are also exploring similar projects. Databricks recently announced LakehouseIQ, which utilizes large language models to provide context-specific answers on lakehouse data. Snowflake has launched Document AI, a purpose-built multimodal large language model that extracts insights from unstructured documents.
With the introduction of Contextual Answers, AI21 Labs is revolutionizing the way enterprises access and utilize their data. This plug-and-play solution eliminates the barriers and complexities associated with implementing LLMs, providing a seamless and efficient experience for businesses of all sizes.