Empowering pilots
Also in the Flexential survey, 43% of companies are seeing bandwidth shortages, and 34% are having problems scaling data center space and power to meet AI workload requirements. Other reported problems include unreliable connections and excessive latency. Only 18% of companies report no issues with their AI applications or workloads over the past 12 months. So it makes sense that 2023 was a year of AI pilots and proofs of concept, says Bharath Thota, partner in the digital and analytics practice at business consultancy Kearney. And this year, companies have tried to scale those pilots up.
“That’s where the challenge comes in,” he says. “This is not new to AI. But it’s amplified because the amount of data you need to access is significantly larger.” Not only does gen AI consume dramatically more data, it also produces more data, something companies often don’t expect.
In addition, when companies create a model, it’s defined by its training data and weights, so tracking different versions of an AI model might require keeping copies of every individual training data set. It depends on the specific use case, says Thota. “Nobody has figured out what the best way is,” he says. “Everybody is learning as they’re iterating.” And all the infrastructure problems, from storage and connectivity to compute and latency, will only intensify next year.
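One lightweight alternative to duplicating every training set is to fingerprint rather than copy: record a content hash of the weights file and of each training file for every model version, and retain full copies only when the source data itself may change. The Python sketch below illustrates that idea; the file layout and the JSON-lines registry format are assumptions for illustration, not a method Thota or Flexential describes.

```python
import hashlib
import json
from pathlib import Path


def fingerprint_file(path: Path, chunk_size: int = 1 << 20) -> str:
    """Return a SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def register_model_version(weights_path: Path,
                           dataset_paths: list[Path],
                           registry_path: Path) -> dict:
    """Record a model version as the hashes of its weights and training files.

    Storing fingerprints instead of copies keeps the registry small; full
    copies of the data are only needed if the originals might be modified.
    """
    entry = {
        "weights": fingerprint_file(weights_path),
        "datasets": {str(p): fingerprint_file(p) for p in dataset_paths},
    }
    # Append the entry to a simple JSON-lines registry (hypothetical format).
    with registry_path.open("a") as f:
        f.write(json.dumps(entry) + "\n")
    return entry
```

In practice, teams often get this from an experiment-tracking or data-versioning tool rather than hand-rolled scripts; the point is simply that a version can be pinned by content hashes without storing a separate copy of the data for every model iteration.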