
Rimes launches Data Lakehouse Copilot for natural language insights

Leverages best-in-class AI models for data query generation

AI Lab sandbox fully supports ‘bring your own data’ scenarios

Developed in collaboration with major clients

Rimes is pleased to announce the launch of its Data Lakehouse Copilot. The Large Language Model (LLM)-powered AI Copilot quickly surfaces unique insights from data using natural language. The initial focus for the AI Copilot is to surface insights from ETF data, open-source ESG data, and demo data from data partners.

Theo Bell, Head of AI Product Development at Rimes, commented, “The Lakehouse Copilot leverages best-in-class AI models for data query generation and is securely deployed in the client’s Data Lakehouse. The power of AI is immense; for example, adding metadata to standard data models better supports end-user workflows through natural language queries. Our Lakehouse Copilot further supports ‘bring your own data’ scenarios, which enables clients to apply AI to their enterprise data linked with data provided by Rimes. By leveraging the Data Lakehouse, we can help clients accelerate their AI adoption without needing specialist skills in-house and be assured that there is no data leakage.”

The Rimes Data Lakehouse Copilot significantly accelerates workflows by giving investment managers faster, streamlined access to the specific data sets they need. The solution eliminates the need to write code: its natural-language answers let users quickly extract datasets and apply them in their application of choice.
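The query-generation step described above, turning a natural-language question into a structured data query, can be sketched in miniature. The Python fragment below is a deliberately simplified, hypothetical stand-in (no real LLM call is made; the `etf_holdings` schema and the pattern table are invented for illustration) that shows the general shape of such a pipeline:

```python
import re

# Hypothetical table schema (invented for illustration only).
SCHEMA = "etf_holdings(ticker, sector, weight, as_of_date)"

# In a production copilot, an LLM generates the query from the question
# and the schema; here two hard-coded patterns stand in for that step.
PATTERNS = [
    (re.compile(r"top (\d+) holdings", re.I),
     lambda m: f"SELECT ticker, weight FROM etf_holdings "
               f"ORDER BY weight DESC LIMIT {m.group(1)}"),
    (re.compile(r"holdings in the (\w+) sector", re.I),
     lambda m: f"SELECT ticker, weight FROM etf_holdings "
               f"WHERE sector = '{m.group(1)}'"),
]

def question_to_sql(question: str) -> str:
    """Map a natural-language question to a SQL query, or raise."""
    for pattern, build in PATTERNS:
        match = pattern.search(question)
        if match:
            return build(match)
    raise ValueError(f"No query template matches: {question!r}")

print(question_to_sql("What are the top 5 holdings?"))
```

The end user only ever sees the natural-language question and the resulting dataset; the generated query is an internal detail, which is what allows the "no code required" workflow described above.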

Justin Brickwood, Chief Product Officer at Rimes, concluded, “Developing these new capabilities in partnership with clients enables us to apply our unique expertise and deep domain knowledge to solve genuine business problems. In our experience, this is the most effective way to deliver game-changing solutions. We are at the forefront of a positive journey to change and truly appreciate the support we are receiving from our users. Watch this space.”
