Over time, Samsung plans to release successive versions of the AGI Computing Lab’s chip designs in “an iterative model that will provide stronger performance and support for increasingly larger models at a fraction of the power and cost,” he said.
“Through the creation of the AGI Computing Lab, I am confident that we will be better positioned to solve the complex system-level challenges inherent in AGI, while also contributing affordable and sustainable methods for the future generation of advanced AI/ML models,” Kyung wrote.
Untapped market potential
Samsung’s move appears in part to be an effort to find new revenue streams in an as-yet untapped market, as its core memory business has become a commodity, noted Gaurav Gupta, VP analyst for emerging trends and technologies at Gartner.
“They are looking for another opportunity to grow,” he said. “This is where chips for inference come in.”
Indeed, most companies that build components for computer processing and memory are trying to keep pace with the rapid evolution of AI, each pursuing its own strategy for providing cost-effective computing resources.
Currently, the generative AI chip market for training models is dominated largely by Nvidia, with AMD holding some share in the space, Gupta said. But those chips are GPUs, which can be scarce and costly and thus aren’t a long-term solution for running AI models.