Who got the best test score answering questions correctly: Google Assistant, Alexa, or Siri?
Interestingly, the percentage of questions answered correctly by Google Assistant, Alexa, and Siri was pretty much the same regardless of whether the questions were categorized as complex or simple. This suggests that incorrect answers were not caused by the digital helpers failing to understand the questions; they simply did not know the correct answers. Simple questions were answered correctly by Google Assistant 76.57% of the time, by Alexa 56.29% of the time, and by Siri 47.29% of the time. Google Assistant answered 70.18% of complex questions correctly, compared with 55.05% for Alexa and 41.32% for Siri.
Bespoken’s Chief Evangelist Emerson Sklar commented on the results of the test and stated, “We have two major takeaways from this initial research. First, while Google Assistant outperformed Alexa and Siri in every category, all three have significant room for improvement. These results underscore the need for developers to thoroughly test, train, and optimize every app they build for these voice platforms.”
So how can Siri move out of the basement and breathe the lofty penthouse air where Google Assistant resides? For one thing, too many of Siri's responses force users to tap a link to get an answer, even when the question is about an Apple device.
But the larger issue seems to be that Google Assistant simply knows more about nearly every topic than Siri does.