I asked BingGPT about the quantity of ammunition used in live fire exercises during peacetime.
It told me it was too complicated to answer.
ChatGPT gave a reasonable-sounding number after warning me that it was too complicated.
Am I better off for having been given that number?
Is it reasoning & insight or merely a hallucination?
How can I ever know?
In a practical sense, on ChatGPT, I don’t think I ever can. Unlike BingGPT and Perplexity.ai, ChatGPT doesn’t cite its sources.
I had previously described ChatGPT as a “source of conventional wisdom.” I now realize that that is, in fact, too generous. It is, in many areas, dominated by whoever screamed loudest on the Internet at some point in the past.
That said, it is very often right, even if not reliable.
Perhaps it is best used when looking for something other than correct answers. In particular, inspiration, empathy, and better expression.
Inspiration: It’s been great at brainstorming and at being a coach.
Empathy: In a JAMA article, Ayers et al. report ChatGPT being an order of magnitude more empathic than doctors, though that might not be a high bar. Many people find that chatting with ChatGPT fills a need they can't fill with other humans, and there are a host of more specialized language engines for this.
Better Expression: ChatGPT and many others are very good both at taking our writing and making it better and at taking our reading and making it more accessible to us through summarization.
And there is a fourth thing that I don’t have a simple term for. For the moment, let’s call it “out-googling Google.”
When search engines became available, there were suddenly lots of answers I would look for and find that previously would have been too much effort. I am having the same type of experience with ChatGPT and BingGPT: completing tasks that, with a mere search engine, I just wouldn’t have been willing to start. And while this assuredly counts as inspiration, it is also something more. Something truly enabling.