Mainelli acknowledges that most of the practical use cases on AI PCs will be many of the same things people use AI in the cloud for today — content creation, content editing, text summarization, language translation, automation of repetitive tasks, prototyping, personalization, predictive insights, and virtual assistants — but they will run locally on the device, making them faster, cheaper, more private, and more secure.
Allocating some AI workloads to PCs offers CIOs other benefits, he says, noting that Microsoft will continue to make its Copilot+ applications available in the cloud.
“The vision around AI PCs is that, over time, more of the models — starting with small language models, and then quantized large language models — … more of those workloads will happen locally, faster, with lower latency, and you won’t need to be connected to the internet and it should be less expensive,” the IDC analyst adds. “You’ll pay a bit more for an AI PC, but [the AI workload is] not on the cloud, and then arguably there’s more profit and it’s more secure.”