Siri took a surprising second place, with Alexa in third and Cortana coming fourth
An exhaustive 800-question test of voice-activated personal assistants has found that the Google Assistant is the smartest, and the most likely both to understand your question and to provide the right answer.
Surprisingly, given its well-known shortcomings, Apple's Siri came second, while Alexa came third and Microsoft's Cortana was fourth. Samsung's Bixby assistant was not included in the test.
The test, conducted by Loup Ventures, a research company, used the smartphone apps of each assistant instead of a smart speaker. This could explain why Siri performed well here, as it is less capable on the Apple HomePod than on the iPhone.
The 800 questions were split across five categories: Local, Commerce, Navigation, Information and Command.
The assistants were then graded on two metrics: first, whether the assistant understood what was being asked of it, and second, whether it delivered a correct response.
The results found that the Google Assistant understood 100 percent of the 800 questions, and provided a correct response to 85.5 percent of them, up from 74.8 percent when the same test was conducted in April 2017.
Siri understood 99 percent of the questions and provided a correct answer to 78.5 percent of them, up from 66.1 percent in 2017.
Alexa understood 98 percent of the questions and provided a correct answer to 61.4 percent of them. Loup Ventures has not previously tested the Alexa smartphone app, so has no historical data to compare these figures to.
Finally, Cortana also understood 98 percent of the questions, but could only provide a correct answer to 52.4 percent, an improvement over its score of 48.8 percent in 2017.
Providing some insight into the results, Loup Ventures' Gene Munster and Will Thompson explain: "Note that nearly every misunderstood question involved a proper noun, often the name of a local town or restaurant. Both the voice recognition and natural language processing of digital assistants across the board has improved to the point where, within reason, they will understand everything you say to them."
The Google Assistant performed the best in four of the five categories, only being beaten by Siri in the Command category, which included requests to control aspects of the user's smartphone, like music playback, and interact with smart home devices like plugs and lights.
Surprisingly, Amazon's Alexa came a distant third in the Commerce category, providing a correct answer to just 44 percent of questions and requests.
Loup Ventures says: "We expected Alexa to win the Commerce category. However, many of our commerce questions involve researching products, not conducting purchases with your voice."
For example, when asking the assistants "Where can I buy a new set of golf clubs?", Google Assistant helpfully provided the location of three nearby golf stores (as tested by GearBrain as well as Loup Ventures). Alexa, however, picked the 'Amazon Choice' of a specific set of clubs, then asked if we wanted to buy them. "This is unhelpful," the study observes, "because the question [about location] remains unanswered and a purchase of this type will likely require more research."
Unsurprisingly, Google won the Information category, providing a correct answer to 93 percent of questions, compared to 79, 70 and 63 percent for Alexa, Siri and Cortana respectively. This is largely thanks to the 'featured snippets' boxes which appear at the top of Google searches, and from which the Google Assistant can pull answers to your questions.
Where Alexa and Siri will offer up a list of links from search results, the Google Assistant can dig a level deeper and actually read out the answer for you. "This offers a huge advantage in simple information queries," Loup Ventures said.