‘We are just at the very beginning of what AI is capable of’: AMD CEO Lisa Su
Reflecting on her three-decade career in the semiconductor industry, Su described AI as the “most impactful and high-potential technology” she has encountered.
Artificial intelligence holds the power to enhance productivity and drive innovation, says Lisa Su, CEO of Advanced Micro Devices (AMD).
“AI is this technology that can make all of us more productive, all of our companies more productive, make all of our discoveries more capable. It's an opportunity for us to take computing to the next level,” Su stated.
“I think we are just at the very beginning of what AI is capable of. It allows us to solve some of the most important problems in the world, and help us find the next discoveries, whether you’re talking about medicine, climate, or science. AI is the next logical step,” she added.
Su, who was recognised in TIME’s ‘Most Influential People in AI 2024’, was speaking at a closed event at the Indian Institute of Science (IISc), Bengaluru on Thursday.
She disclosed that roughly 8,000 of the chipmaker's 27,000 global employees — nearly 30% of its total workforce — are engineers based in India.
Su highlighted how AI has evolved from an expert-only field into a technology accessible to all, thanks to the advent of generative AI and large language models such as ChatGPT over the last two years.
“We have taken what was once expert technology, and we’ve moved AI to something where everybody can touch and feel it… because when you're able to use natural language to unlock computing capability, that all of a sudden changes who can use it,” she explained.
A direct competitor to NVIDIA, AMD is a semiconductor giant known for its high-performance computer processors and graphics technologies.
Addressing AMD’s strategy, Su further underscored the importance of versatility in computing solutions.
“There’s no one-size-fits-all when it comes to the future of compute. You’re going to need to use the right compute for the right application. For example, a lot of the conversation is around the largest GPUs and accelerators for the cloud, and running training and inferencing on the largest language models. But we do expect that there are going to be models of all sizes,” she said.
AMD is focusing on an end-to-end AI strategy that spans cloud, edge, and client devices, she added. “We believe everyone should have their own AI PC that allows you to run your models locally and operate on your data.”
Su also spoke about how the chipmaker is focusing heavily on collaboration through open-source initiatives. “Our strategy is that the world needs an open-source software environment. It shouldn’t matter whether it’s AMD or NVIDIA at the hardware layer; you want to build on top of that with software and abstraction layers underneath. We’re investing significantly in all of the tools, compilers, and abstraction layers that will allow us to build an open-source ecosystem,” she noted.
Edited by Jyoti Narayan