OpenAI recently made ChatGPT 3.5 freely available to anyone on the web here. No doubt it's useful for various applications, but as noted last year, it remains comically confused about virtual worlds and avatars -- even confusing my name with my avatar's name.
But it's not just me! Ask ChatGPT, "Who is the SL artist AM Radio?", and you get:
AM Radio, whose real name is Alejandro Miguel Justino Crawford, is a Second Life (SL) artist known for his immersive and evocative virtual art installations within the virtual world of Second Life. His work often explores themes of nostalgia, Americana, and the passage of time...
I don't even know who the fuck Alejandro Crawford is, but I bet Jeff Berg would be a bit miffed, as would his son.
Ask "Who created the Second Life brand Blueberry?", and you get:
The Second Life brand "Blueberry" is created by a virtual fashion designer known by the username "Blueberryxx" or simply "Blueberry" within the Second Life platform. Blueberry is renowned for creating high-quality virtual clothing and accessories for avatars in Second Life, catering to a diverse range of styles and preferences within the virtual fashion community.
Somehow ChatGPT again gets the description right but totally misses the name -- even though Mishi McDuff has been featured in Vogue and many other major media outlets.
And not just Second Life! Ask ChatGPT, "Who is Syrmor related to VRChat?", and you get:
Syrmor, also known as Roflgator, is a popular content creator within the VRChat community. He gained fame by creating videos that showcase the experiences and stories of various users within the virtual world of VRChat. Syrmor is known for his empathetic approach to storytelling, often focusing on the personal struggles, triumphs, and unique perspectives of individuals he encounters in VRChat.
Roflgator is another VRChat YouTuber with a very different vibe, while the real-life Syrmor has been featured on HBO.
An AI expert observed that ChatGPT "aspires to be the most mediocre web content you can imagine", but when it comes to virtual worlds, it's actively and hilariously injecting misinformation about a technology used by upwards of 1 billion people.
> when it comes to virtual worlds, it's actively and hilariously injecting misinformation
This is probably true of almost any domain of specialized knowledge. The main reason people don't share more of these failures is probably that they are funny only to domain experts.
Posted by: Martin K. | Tuesday, April 02, 2024 at 11:04 PM
Indeed, butter knives are useful for various applications, but "hilariously" they don't function well as screwdrivers.
After over a year, some people remain comically confused and still try to use ChatGPT as a screwdriver. ChatGPT (using the old GPT-3.5 model) without web search is obviously not an alternative to Google Search, nor is it designed to be one.
To make this post more useful: if you are seeking factual knowledge, use tools specifically designed for web search instead. Perplexity and Copilot have been freely available for a long time, so there is no reason to use ChatGPT for that purpose. Moreover, they provide sources for the information they present, so you can check them.
Perplexity:
Copilot More Precise:
"Who created the Second Life brand Blueberry?"
"Who is Syrmor related to VRChat?"
Are they perfect? No. But they do a much better job. People should understand that artificial neural networks, like the natural ones that inspired them, don't produce precise results the way a calculator would. So always check the results. Or use language models for language-related purposes.
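To make that concrete, here is a toy sketch of why a language model can confidently name the wrong person: generation samples from a probability distribution over plausible continuations, so a merely plausible-sounding answer can come out even when the correct one is the single most likely option. All the probabilities below are invented for illustration; no real model works from a four-name table like this.

```python
import random

# Toy next-token distribution a model might assign to candidate
# answers after a prompt like "The SL artist AM Radio is ...".
# The numbers are made up purely for illustration.
candidates = {
    "Jeff Berg": 0.40,           # the correct answer
    "Alejandro Crawford": 0.25,  # plausible-sounding but wrong
    "Roflgator": 0.20,           # wrong person, same community
    "Blueberryxx": 0.15,         # invented-but-plausible username
}

def sample_answer(rng: random.Random) -> str:
    """Draw one answer from the distribution, as sampling-based
    text generation does, rather than returning the argmax."""
    names = list(candidates)
    weights = [candidates[n] for n in names]
    return rng.choices(names, weights=weights, k=1)[0]

rng = random.Random()
answers = [sample_answer(rng) for _ in range(10)]
# Even though "Jeff Berg" is the most likely single option (40%),
# the wrong answers together hold 60% of the probability mass,
# so most sampled answers will typically be wrong.
wrong = sum(a != "Jeff Berg" for a in answers)
print(f"{wrong} of {len(answers)} sampled answers are wrong")
```

A calculator always returns the same correct value for the same input; a sampler like this returns *some* high-probability answer, which is exactly why the output always needs checking against sources.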
By the way, misuse aside, is "ChatGPT 3.5" another example of how people keep confabulating and "hallucinating", or are they "actively and hilariously injecting misinformation"? It is ChatGPT, which utilizes the GPT-3.5 model, distinct from ChatGPT Plus, which employs the GPT-4 model.
Posted by: n | Wednesday, April 03, 2024 at 01:47 AM
I have asked ChatGPT to explain the endings of books and novels many times. It often changes the names of the characters (why?), changes what they did (why?), and hallucinates entirely different scenarios. It does this to certain books more than others. So I go back to looking for human reviews and fan discussion on forums.
Posted by: Nat | Wednesday, April 03, 2024 at 02:47 AM
For a laugh:
https://arstechnica.com/information-technology/2024/04/the-fine-art-of-human-prompt-engineering-how-to-talk-to-a-person-like-chatgpt/
Posted by: n | Wednesday, April 03, 2024 at 10:14 AM
Good examples, N! Though I will say "[AM Radio] mysteriously disappeared from Second Life in 2011, leaving behind only one of his creations" is taken directly from the opening of a Polygon article I wrote... which actually explains the mystery!
Posted by: Wagner James Au | Wednesday, April 03, 2024 at 12:45 PM
Thank you! Yes, Copilot used Bing to search the web, cited the search results, and linked the sources (your article on Polygon). Indeed, it also mentioned the mystery and it's true that you explained it. To be fair, exploring the mystery was beyond the scope of the question, which was 'Who is the SL artist AM Radio?' and that question was answered. You can ask follow-up questions, though. But another reason could be that Copilot couldn't fully access the article.
So I asked Perplexity "Why did AM Radio Jeff Berg disappear from Second Life in 2011?":
https://www.perplexity.ai/search/Why-did-AM-TlQvLKRzTtuaebkEO44EcA#0
Perplexity, too, links your article, so the user can then read the whole story.
Not too bad, considering that this is the freely available version.
Posted by: n | Thursday, April 04, 2024 at 10:03 AM