A few weeks ago my spouse and I made a bet. I said there was no way ChatGPT could credibly mimic my writing style for a smartwatch review. I had already asked the bot to do that months ago, and the results were ridiculous. My spouse bet that they could ask ChatGPT the exact same thing and get a much better result. My problem, they said, was that I didn’t know what questions to ask to get the answer I wanted.
To my chagrin, they were right. ChatGPT wrote much better reviews in my style when my spouse did the asking.
That memory flashed through my mind while I was blogging at Google I/O. This year’s keynote was essentially a two-hour thesis on AI, how it will affect Search, and all the ways it could boldly and responsibly improve our lives. Much of it was neat. But I felt a chill run down my spine when Google openly acknowledged that it’s hard to ask AI the right questions.
During its demo of Duet AI, a series of tools that will live inside Gmail, Docs, and more, Google showed off a feature called Sidekick that can proactively give you prompts that change depending on the Workspace document you’re working on. In other words, the AI prompts you on how to prompt it by telling you what it can do.
That came up again later in the keynote when Google demoed its new AI search results, called the Search Generative Experience (SGE). SGE takes any question you type in the search bar and generates a mini report, or “snapshot,” at the top of the page. At the bottom of that snapshot are follow-up questions.
As a person whose job it is to ask questions, I found both demos unsettling. The queries and prompts Google used in the demos are nothing like the questions I type into my search bar. My search queries often read like a small child talking. (They’re also often followed by “Reddit” so I get answers from real people rather than an SEO content mill.) Things like “name of movie actor Bald Dennis BlackBerry.” When I search for something I wrote about Peloton’s earnings in 2022, it comes out as “site:theverge.com Peloton McCarthy ship metaphors.” I rarely search for things like “What should I do in Paris for a weekend?” It doesn’t even occur to me to ask Google things like that.
I’ll admit that when I look at any kind of generative AI, I don’t know what I’m supposed to do. I can watch a zillion demos, and still the blank window stumps me. It’s like I’m back in second grade and my grumpy teacher has just called on me with a question I don’t know the answer to. When I do ask something, the results are ridiculously bad, things that would take me longer to make presentable than if I’d just done them myself.
On the other hand, my spouse has taken to AI like a fish to water. After our bet, I watched them play with ChatGPT for a full hour. What struck me most was how different our instructions and inquiries were. Mine were short and open-ended. My spouse left very little room for interpretation by the AI. “You have to hold its hand,” they said. “You have to feed it exactly what it needs.” Their commands and queries are hyper-specific, long, and often include reference links or data sets. But even they have to rephrase prompts and queries over and over to get exactly what they want.
This is just ChatGPT. What Google is pitching goes a step further. Duet AI is meant to pull contextual data from your emails and documents and intuit what you need (which is funny, since I don’t even know what I need half the time). SGE is designed to answer your questions, even those that don’t have a “correct” answer, and then anticipate what you might ask next. For this more intuitive AI to work, developers need to teach the AI what questions to ask users so that users, in turn, can ask it the right questions. Which means developers have to know what questions users want answered before users ask them. It gives me a headache just thinking about it.
I don’t want to get too philosophical, but you could say that all of life is about figuring out the right questions to ask. For me, the most uncomfortable thing about the AI age is that I don’t think any of us actually knows what we want from AI. Google thinks it’s what it showed on stage at I/O. OpenAI thinks it’s chatbots. Microsoft thinks it’s a really horny chatbot. But whenever I talk to the average person about AI these days, the question everyone wants answered is a simple one: How will AI change and impact my life?
The problem is that no one, not even the bots, has a good answer for that yet. And I don’t think we’ll get a satisfactory answer until everyone takes the time to rewire their brains to speak more fluently with AI.