New York-based Hebbia has announced a $130 million Series B round, with investors including Andreessen Horowitz, Index Ventures, Peter Thiel, and Google's venture arm.
Hebbia is building something deceptively simple: a localized productivity interface for LLMs that makes it easier to extract value from data, regardless of its type or size. The company already works with large firms in the financial services industry, including hedge funds and investment banks, and plans to bring the technology to more enterprises going forward.
Product Access: https://top.aibase.com/tool/hebbia
While LLM-based chatbots can be grounded in internal documents or fed them through prompts, many have found that these assistants cannot answer complex questions about business functions. In some cases the problem is the context window, which cannot accommodate the size of the provided documents; in others, the complexity of the query exceeds what the model can solve accurately. Such errors can erode a team's confidence in the language model.
Hebbia addresses this gap with an LLM-powered agentic copilot called Matrix. The product sits inside a company's business environment and lets knowledge workers ask complex questions about internal documents - from PDFs, spreadsheets, and Word files to audio transcripts - with an effectively unlimited context window.
Once the user provides a query and the related documents or files, Matrix breaks the query down into smaller operations that the LLM can execute. This lets it analyze all the information contained in the documents at once and extract the required content in a structured form. Hebbia says the platform enables the model to reason over any number of documents (from millions to billions) and data modalities, while providing relevant references that help users track each operation and understand how the platform ultimately arrived at the answer.
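Hebbia has not published Matrix's internals, but the decompose-then-extract pattern described above can be sketched in a few lines. Everything here is hypothetical: the function and field names are illustrative, and `call_llm` is a stub standing in for a real model call.

```python
# Hypothetical sketch of the pattern described above: split a complex query
# into per-document sub-questions, so no single prompt must fit every
# document into one context window, and keep a reference for each answer.
# All names are illustrative; this is not Hebbia's actual implementation.
from dataclasses import dataclass


@dataclass
class Cell:
    document: str   # which source document the value came from
    question: str   # the sub-question asked of that document
    answer: str     # the extracted value
    reference: str  # pointer back to the source passage, for auditability


def call_llm(prompt: str, text: str) -> tuple[str, str]:
    """Stub for a real LLM call; returns (answer, cited snippet)."""
    return (f"answer for: {prompt}", text[:40])


def run_matrix(sub_questions: list[str],
               documents: dict[str, str]) -> list[Cell]:
    """Run each sub-question against each document independently,
    assembling the results into a structured grid of cells."""
    cells = []
    for doc_name, doc_text in documents.items():
        for sub_q in sub_questions:
            answer, ref = call_llm(sub_q, doc_text)
            cells.append(Cell(doc_name, sub_q, answer, ref))
    return cells


docs = {"10-K.pdf": "Revenue grew 12% year over year.",
        "call_transcript.txt": "Management guided to flat margins."}
grid = run_matrix(["What was revenue growth?", "What is the margin outlook?"],
                  docs)
for cell in grid:
    print(cell.document, "|", cell.question, "|", cell.reference)
```

Because each cell records both its sub-question and its source reference, a user can audit every step rather than trusting a single opaque answer, which matches the traceability claim in the paragraph above.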
With the latest round of financing, the company hopes to build on this momentum and attract more large enterprises to its platform to simplify how their staff retrieve knowledge.
Hebbia is not the only company in this field. Other businesses are also exploring AI-powered enterprise knowledge retrieval, including Glean. That Palo Alto, California-based startup reached unicorn status in 2022 and has built ChatGPT-like assistants specifically for workplace productivity. There are also players such as Vectara striving to deliver a universal AI experience grounded in enterprise data.
Key Points:
👉 Hebbia has secured $130 million in Series B financing to build a localized productivity interface for LLMs, making it easier to extract value from data.
👉 Hebbia's agentic copilot, Matrix, can analyze all the information contained in a set of documents and extract the required content in a structured form.
👉 Hebbia has collaborated with institutions such as Charlesbank, Centerview Partners, and the United States Air Force, and counts more than 1,000 use cases.