Highlights of the Mindbreeze InSpire 24.7 Release
Are you interested in the highlights of the Mindbreeze InSpire 24.7 release? Learn more in the following blog post!
New “AI Answers” component available
Mindbreeze customers can now use drag and drop to add the “AI Answer” component to their Insight Apps in the Insight App Designer. AI Answer provides users with precise, rights-checked answers in natural language. The answers come either from direct hits or from a summary of the content of several documents, and further answers from the source documents are available as additional information.
AI Answer uses retrieval-augmented generation (RAG) and large language models (LLMs) to generate appropriate answers. When configuring the component, admins can specify which LLM to use. Users can also stop and restart the generation of an answer at any time.
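For readers unfamiliar with the pattern, the RAG flow behind a component like AI Answer can be sketched in a few lines: retrieve the most relevant (rights-checked) passages for a query, then ground the LLM's answer in them. This is a minimal, self-contained illustration using a toy bag-of-words retriever; none of the function names below are Mindbreeze APIs.

```python
# Minimal sketch of retrieval-augmented generation (RAG).
# All names here are illustrative assumptions, not Mindbreeze APIs.
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; real systems use dense vector models.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    # Rank documents by similarity to the query; a real deployment would
    # also enforce the user's access rights here ("rights-checked").
    q = embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, context: list[str]) -> str:
    # The retrieved passages ground the LLM's answer in indexed content.
    joined = "\n".join(f"- {c}" for c in context)
    return f"Answer using only this context:\n{joined}\n\nQuestion: {query}"

docs = [
    "The 24.7 release adds the AI Answers component.",
    "Llama 3.1 supports up to 128,000 tokens of context.",
    "Unrelated note about office opening hours.",
]
prompt = build_prompt("What does Llama 3.1 support?",
                      retrieve("Llama 3.1 tokens", docs))
# `prompt` would then be sent to whichever LLM the admin configured.
```

In a production system the final step sends the assembled prompt to the configured LLM; here it is left as a plain string so the sketch runs without any external service.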
Meta Llama 3.1 added to Mindbreeze InSpire SaaS environments
With the Mindbreeze InSpire 24.7 release, Mindbreeze SaaS customers now have access to the Large Language Model (LLM) Meta Llama 3.1 Instruct 8B.
Meta Llama 3.1 offers a context window of up to 128,000 tokens, enabling longer prompts. By using Llama 3.1 in combination with Mindbreeze AI features such as Mindbreeze InSpire AI chat, customers receive valid, rights-checked information even more intuitively and efficiently.
Mindbreeze InSpire available in Oracle Cloud
Mindbreeze InSpire is now available in the Oracle Cloud as a Mindbreeze InSpire image.
In addition to on-premises, SaaS, Amazon Web Services, Microsoft Azure, and Google Cloud, customers can now use all features, such as the Mindbreeze InSpire AI chat and the 360-degree views, in the Oracle Cloud as well. This is available in the 1M and 10M contract levels.
Detailed information about our innovations and features can be found in our release notes.
Contact our experts for further information.