Scaling AI without scaling costs
Previous generations of Natural Language Processing (NLP) use cases faced formidable scalability issues in the financial sector: they relied on rigid supervised models, struggled with the complexity of modeling scenarios in their training datasets, and required large volumes of meticulously cleaned data, amplifying the challenges faced by businesses.
OnFinance wanted a solution that did not add to its existing data-warehousing costs, and Azure AI allowed it to do just that.
A breakthrough in NLP
For its NeoGPT product, OnFinance took inspiration from computer vision, where one-shot accuracy enables use cases such as facial ID with relative ease. For NLP to work the same way, it needed an underlying model with strong zero-shot capabilities, which Azure OpenAI provided.
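The zero-shot pattern is straightforward to reproduce against an Azure OpenAI deployment: the model classifies or extracts from financial text using only an instruction in the prompt, with no task-specific training set. The sketch below is illustrative rather than OnFinance's actual code; the endpoint, deployment name, and prompt are assumptions.

```python
# Minimal zero-shot sketch against Azure OpenAI (illustrative; not OnFinance's code).
# The endpoint, API version, and deployment name below are placeholders.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

news_snippet = (
    "The central bank held rates steady but signalled two cuts later this year, "
    "citing cooling inflation."
)

# No labelled examples are supplied: the instruction alone defines the task (zero-shot).
response = client.chat.completions.create(
    model="gpt-4",  # name of your Azure OpenAI deployment, not the base model
    messages=[
        {"role": "system", "content": "You are a financial analysis assistant."},
        {"role": "user", "content": (
            "Classify the sentiment of this market update as bullish, bearish, "
            f"or neutral, and give a one-line rationale:\n\n{news_snippet}"
        )},
    ],
    temperature=0,
)

print(response.choices[0].message.content)
```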
Integrating Azure AI Studio allowed OnFinance to run fine-tuning jobs 10x faster with geo-redundancy, and to apply reinforcement learning from human feedback (RLHF) efficiently to ensure 85% guardrail compliance. In addition, OnFinance was able to use powerful closed-source models within Azure OpenAI, such as GPT-4 and GPT-4V, at more than 100,000 tokens per minute to process unstructured data at scale and deliver a truly productised experience to its end customers.
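Fine-tuning jobs of the kind described above can also be submitted programmatically rather than through the Azure AI Studio UI. The sketch below assumes an Azure OpenAI resource with fine-tuning enabled; the training file and base model name are placeholders, and the RLHF and guardrail steps mentioned above are not shown.

```python
# Sketch of submitting a fine-tuning job to Azure OpenAI via the Python SDK
# (an alternative to the Azure AI Studio UI; file and model names are placeholders).
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

# Upload chat-formatted training data (JSONL of {"messages": [...]} records).
training_file = client.files.create(
    file=open("finance_finetune.jsonl", "rb"),
    purpose="fine-tune",
)

# Kick off the fine-tuning job against a base model that supports fine-tuning.
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-35-turbo-0613",  # placeholder base model name
)

# Poll for status; the tuned model is then deployed like any other Azure OpenAI deployment.
print(client.fine_tuning.jobs.retrieve(job.id).status)
```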
By adopting the zero-shot capabilities of LLMs, OnFinance set out to disrupt a market built around supervised models, while also reducing the overall cost of implementing AI use cases, including the cost of data.
"We want to deploy NeoGPT to power the 10 million monthly customer interactions within BFSI in the next 1-2 years, and Azure OpenAI will help us scale that."

Anuj Srivastava
Co-founder and CEO
OnFinance.ai
10x
faster for certain tasks
OnFinance’s product NeoGPT excels at financial tasks, bringing generative AI to internal teams securely.
More details at https://onfinance.ai