Tech leaders weigh in on how to future-proof businesses with scalable AI systems

At TechSparks 2024, tech experts deliberate on prioritising data enhancement, integrating robust cloud solutions, and securing networks as fundamental steps for effective AI deployment and technological progress.

Friday, October 11, 2024, 4 min Read

Nearly 75% of business leaders agreed that their competitive advantage depends on making the best use of artificial intelligence (AI) and tools like generative AI (GenAI) and machine learning, according to a recent Experian report conducted by Forrester Consulting.

“AI in enterprise automation includes everything – right from business processes, data in motion, data at rest, and the application layer, to decision intelligence. So, quite literally, we are breathing and living the AI age,” said Sanjay Koppikar, Co-founder and Chief Product Officer, EvoluteIQ.

Koppikar was speaking at a panel discussion at TechSparks 2024 on ‘Building scalable and transformative AI systems’. He was joined by Yogesh Agarwal, Founder and CEO of Locomo.io; Ankur Pal, Chief Data Scientist of Aplazo; and Brajesh De, Managing Director of Blue Altair.

Software development company Blue Altair has been using AI across several functions – to increase the productivity of development processes such as coding and writing unit test cases, and for training, among other things. Citing an example, De said, “In the pharma space, we have been building some interesting solutions using AI to identify if an adverse reaction to a drug occurred in a pregnant woman.”

Aplazo, an omnichannel payment platform that offers flexible payment solutions and commerce enablement tools to help merchants accelerate sales and grow their brands, has also started investing in GenAI. At Locomo, large language models (LLMs) are helping the company build a project management tool that lets users streamline a continuous, iterative development process, from evolving requirements to frequent releases.

Key AI infrastructure considerations

Koppikar said the sheer scale of operations and the widespread commercial availability of graphics processing units (GPUs) are vital for companies building robust AI infrastructure.

Advising startups, he said, “As both large and small language models are going to be commoditised, do not build a solution on only one stack. Look for the best of both breeds available on the cloud, as well as something behind the firewall or on a private cloud. It is not a one-size-fits-all affair; it has to be very contextualised.”
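
Read in engineering terms, that advice amounts to keeping the application decoupled from any single model provider, so that a cloud-hosted LLM and a model behind the firewall remain interchangeable. The Python sketch below is purely illustrative – the class and function names are assumptions made for this article, not anything described on the panel:

```python
from abc import ABC, abstractmethod


class ModelBackend(ABC):
    """Common interface so application code never depends on a single vendor's stack."""

    @abstractmethod
    def complete(self, prompt: str) -> str:
        ...


class CloudLLM(ModelBackend):
    """Placeholder for a commercial, cloud-hosted large language model."""

    def complete(self, prompt: str) -> str:
        # In a real system this would call the hosted provider's API.
        return f"[cloud model response to: {prompt}]"


class PrivateSLM(ModelBackend):
    """Placeholder for a small language model served behind the firewall or on a private cloud."""

    def complete(self, prompt: str) -> str:
        # In a real system this would call a self-hosted inference endpoint.
        return f"[private model response to: {prompt}]"


def answer(question: str, backend: ModelBackend) -> str:
    """Business logic stays backend-agnostic; the model choice is contextual, not hard-coded."""
    return backend.complete(question)


# Sensitive workloads can be routed to the private model, everything else to the cloud one.
print(answer("Summarise this week's incidents.", CloudLLM()))
print(answer("Summarise this customer contract.", PrivateSLM()))
```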

De believes it’s important for companies embracing AI to make their solutions safe, trustworthy, and explainable.

“We have a framework for analysing the solution a company is building and understanding the risks associated with it. If it is not safe, it is not going to be scalable,” he noted, underscoring the importance of building applications for population scale and bringing in the right technologies with security and regulations in place.

“If data is the new oil, who owns the rigs? In the case of GenAI providers, your data would be the oil in their rigs. Use them carefully,” Koppikar warned, emphasising the necessity of mindful engagement with GenAI tools.

AI can also significantly help in creating better software – an area where Locomo is applying LLMs in its project management tool.

“For AI to respond meaningfully, it needs to have the right context. The right prompt is what gives you a good outcome. What becomes problematic is when AI is used randomly,” Agarwal said. 
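
In practice, that context is something the application has to assemble before the model ever sees the question. The minimal sketch below is hypothetical – the fields and template are assumptions, not Locomo’s implementation – but it shows the general idea of grounding a prompt in project context:

```python
def build_prompt(task: str, requirements: list[str], recent_changes: list[str]) -> str:
    """Bundle relevant project context into the prompt instead of asking the model 'randomly'."""
    context_lines = (
        ["Current requirements:"]
        + [f"- {r}" for r in requirements]
        + ["Recent changes:"]
        + [f"- {c}" for c in recent_changes]
    )
    context = "\n".join(context_lines)
    return f"{context}\n\nTask: {task}\nAnswer using only the context above."


prompt = build_prompt(
    task="Draft release notes for the next iteration.",
    requirements=["Users can export reports as CSV"],
    recent_changes=["Added CSV export endpoint", "Fixed pagination bug"],
)
print(prompt)  # The assembled prompt would then be sent to whichever model backend is in use.
```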

Best practices for building robust AI systems 

De listed some of the best practices: data governance and data engineering, security regulations, explainable AI, and democratisation of AI.

Listing the parameters for every AI-first company, Pal of Aplazo said, “Upskilling your own workforce to consume intelligence, AI automation across operations, and design thinking are three key points to be noted for business innovation and efficiency.”

According to Agarwal, companies can leverage AI at three levels. “One is building a model from scratch, like building an LLM or a ChatGPT, which would require an investment of over $100 million. The second is when an existing model is trained with data to fit your use case. The third is identifying the right use case meaningfully.”
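
Of those three levels, the second – adapting an existing model with your own data – is the one most teams can act on directly, and it often starts with nothing more exotic than a file of example input/output pairs. The snippet below is a generic illustration of that idea, not a description of any panellist’s pipeline:

```python
import json

# Hypothetical examples pairing a merchant-support question with the desired answer.
training_examples = [
    {"prompt": "How do I enable instalment payments?",
     "completion": "Go to Settings > Payments and turn on instalments."},
    {"prompt": "Why was my payout delayed?",
     "completion": "Payouts can take up to two business days after settlement."},
]

# Many fine-tuning services accept data as a JSON Lines file of this general shape;
# an existing model is then trained on these pairs to fit the specific use case.
with open("training_data.jsonl", "w", encoding="utf-8") as f:
    for example in training_examples:
        f.write(json.dumps(example) + "\n")
```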

AI trends shaping the future

Koppikar advised companies to think beyond LLM, cautioning them to avoid structuring their entire AI offering only with LLMs. “Large action models are what’s next, because language is already handled. LLMs are here to stay, but they are going to be commoditised,” he said.

Going forward, De reiterated the need for finding one’s niche and building industry-specific models, be it for healthcare, insurance or education.

Pal, optimistic about GenAI, believes there will be significant adoption worldwide in the next five years. “And this is true democratisation of intelligence,” he stated.

Agarwal said it's important that companies find the right use case. “As AI becomes more intelligent, I think use cases are going to unfold in ways unimaginable.”
