Earlier this month, Google launched Search Generative Experience (SGE), which adds generative AI capabilities to Google Search results.
A lucky few members of the public in the US who were on the Search Labs waitlist have now been given access to SGE, and some have begun to share their thoughts on the feature publicly.
So, here’s a rundown of their opinions about how SGE works and how the results compare to typical Google SERP results.
Google SGE: What the experts think
Google says that SGE allows you to ‘find what you’re looking for in faster, easier ways’. The company adds that you can ‘get AI-powered overviews with helpful info’. You can also ‘ask follow ups’ to your queries.
But is this really the case? Let's see what the experts have to say.
Dr. Marie Haynes
Although Dr. Haynes gave SGE a positive review on her Twitter profile, she believes that the company needs to make a number of tweaks before the feature goes live. For example, she noted that it currently 'quotes websites without attribution'. In some instances, she explained, SGE is essentially plagiarising a website.
In her newsletter, she took a deeper dive into the subject and looked specifically at how SGE displays results. She noticed that it 'generally shows an AI-generated answer with three websites shown to the right'. She also found that the websites currently ranking in the top organic positions are not necessarily the ones shown in SGE responses.
In other areas, she found that SGE displays five results for local searches and pulls a lot of information from reviews. For Your Money or Your Life (YMYL) queries, she found that SGE answers only cautiously. However, she praised SGE's converse mode, which she thought thoroughly understood her intent and showed very specific results and videos demonstrating real-life experience.
Barry Schwartz
Meanwhile, over on his Twitter account, search guru Barry Schwartz live-tweeted his experience with SGE.
After agreeing to the ‘crazy long disclaimer’, he found several things he immediately liked about the feature, including the fact that you can expand answers and generate follow-up answers that are ‘pretty good’.
From his research, he found that SGE wouldn't tell him about the weather and that it only gave basic answers to questions about sports, finance or politics. He has since noted that SGE is missing one thing that SEOs look at: the number of results Google returns for a given query.
Cyrus Shepard
Cyrus Shepard, owner of Zyppy, was another who took an early deep dive into SGE. Like Dr. Haynes, he was primarily concerned about attribution and plagiarism.
On top of this, he also used his Twitter account to point out some AI-generated responses that were less than helpful, including a 15-point instruction guide on boiling an egg that included a total boiling time of 60 minutes!
He also shared a helpful thread showing seven search categories that currently don't trigger AI results consistently. His estimates suggest that these 'searches likely comprise 50-70%+ of all searches—and are likely the safest from AI'. These categories include navigational queries, recipes and sensitive content queries.
On that basis, his 'semi-educated guess' is that 'we'll only see 20-30% of Google search traffic impacted/made at risk from AI-generated search.'
Aleyda Solis
While many users have been positive about SGE, SEO consultant, speaker and author Aleyda Solis found that it:
- Provided repetitive information
- Showed information misaligned with search intent
- Failed to link to referred sources
- Delivered a poor UX
Solis said that after testing SGE for a couple of hours, she found that its snapshots often included repetitive content, such as double map packs. She also found that, with complex topics, SGE's snapshots were simplistic and didn't align with the intent of the query. She provided a detailed review of her experience in this blog post.
Why does this matter?
Google has announced that the Search Generative Experience will run as a Search Labs experiment until December 2023. Given that we're at such an early phase of testing, we should be cautious about judging results too harshly. After all, it's likely that Google will release several versions of SGE before the trial ends in December. So, while some of the well-publicised concerns surrounding plagiarism and attribution remain worrying, Google has plenty of time to iron out these issues.
However, while this new version of experimental search is still in its infancy, we recommend that all marketers and SEOs give it a try when possible (it's currently only available in the US). After all, playing with SGE can help you understand how AI is changing Google's Search product and shows you the direction Google is looking to take Search right now.