Does this book exist? Or did ChatGPT make it up?

Pickykim · New Member · Joined May 1, 2023 · Messages: 2
I asked ChatGPT to give me a summary of a book I was reading by AG Riddle, The Lost Colony. Instead, it gave me this summary:

The correct plot of the book involves a team of scientists led by Captain Callie Woolf, who are sent to investigate a mysterious signal from a distant planet called Argo. When they arrive on Argo, they discover that the planet was once home to a highly advanced civilization that has now disappeared under mysterious circumstances.

As the team explores the planet, they encounter a variety of challenges, including hostile alien life forms and a hidden conspiracy that threatens their mission. The story involves flashbacks to the history of the lost civilization, which gradually reveal the true nature of the threat that they face.

Throughout the book, the team must navigate treacherous terrain and make difficult choices in order to uncover the truth about the lost colony and prevent a disaster that could threaten the survival of humanity. The story involves themes of technology, morality, and the nature of humanity.

I cannot find any references to this book online, but it sounds interesting and I’d like to read it. Does it exist? Did ChatGPT make the whole thing up?

Thank you for any help!
 
Googled the author. A.G. Riddle's The Lost Colony does exist-- it's the 3rd book in a series.

Winter World (The Long Winter, #1)
The Solar War (The Long Winter, #2)
The Lost Colony (The Long Winter, #3)

Minimal synopsis on the website.
 
Yes, The Lost Colony by A.G. Riddle does exist. I’m reading it right now, but the synopsis that ChatGPT gave me is not for The Lost Colony. I’m trying to figure out which book it is actually referencing.
 
ChatGPT is a program that produces statistical output based on its input; there's no concept of fact-checking or accuracy built in. People have been talking about the fake citations it produces for a while now.
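To show what "statistical output" means here, a toy sketch. This is not how ChatGPT is actually built; it's a minimal word-level model with made-up probabilities, just to illustrate that generation is sampling from a distribution with no truth check anywhere:

```python
import random

# Toy next-token model: given the last word, each candidate next word has
# a probability. All words and probabilities below are invented for
# illustration -- nothing here checks whether the output is factually true.
model = {
    "The": {"Lost": 0.5, "Long": 0.3, "Solar": 0.2},
    "Lost": {"Colony": 0.7, "World": 0.3},
    "Long": {"Winter": 1.0},
}

def generate(word, steps, rng):
    out = [word]
    for _ in range(steps):
        dist = model.get(out[-1])
        if dist is None:  # no continuation learned for this word
            break
        words = list(dist)
        weights = [dist[w] for w in words]
        # Sample the next word in proportion to its probability.
        out.append(rng.choices(words, weights=weights)[0])
    return " ".join(out)

print(generate("The", 2, random.Random(0)))
```

Every run produces a fluent-looking phrase, and whether it names a real book or a nonexistent one depends entirely on which path the dice pick.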
 
 
I managed to get it to give me three different fake citations in a row by just asking "Are you sure?"
 
I've been looking, albeit unsuccessfully, for the novels of famed horror author Sutter Kane. ;):D
 
I found this...


I'm pretty sure the summary provided by ChatGPT is for a different story (assuming it exists), but Rackman's novel SENTINEL does have a Callie Woolf as one of the characters.
 
AI is basically reading internet search results to create an answer. You can see what AI is up against by searching for - Rackman's novel SENTINEL - and looking at the image results. What I have noticed lately is that putting something in quotes is becoming worthless. Put Rackman's novel SENTINEL in quotes and you get more unrelated hits than you do without the quotes. Adding wiki to it, Rackman's novel SENTINEL wiki, which should limit the results somewhat, instead brings up more unrelated image results.

The image hits make it easy to see what kind of variance is going on in the original results. Using Google, just Rackman's novel SENTINEL brings up the novel, plus some interesting long shots, including The Adventures of Tintin - Explorers on the Moon, the Silver Surfer comic, Peter Pan in Kensington Gardens (illustrated by Arthur Rackham), and The Night Before Christmas.

Bing brings up a much larger selection of titles, but most of it is science fiction.

AI now has a problem: it can't think, so it has to rely on probabilities to determine the answer to a question. More than likely the programs have been tweaked a million times to correct for wrong results. The process is similar to the use of epicycles in pre-telescope astronomy to correct the observed positions of the planets, which seemed to move backwards at times. Those errors came from the belief that planetary orbits were circles rather than ellipses, and gravity wasn't a factor because it wasn't well understood yet. Likewise, AI has to make guesses about what it is seeing without completely vetting all the data it is looking at.

AI programs also cobble together information to produce a supposedly more complete answer. If chance is on the AI's side, all the results it picks to cobble together an answer will be about the correct item. If it is working from mixed results, the answer could well be talking about different things as if they were a single item.
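The cobbling problem is easy to demonstrate with a toy sketch. The snippets below are invented stand-ins for search results (the Callie Woolf detail comes from the post above; the rest is generic), and the "merge" step is deliberately naive, not anyone's real algorithm:

```python
# Toy illustration: an answer "cobbled together" from mixed search results.
# Two snippets are about the novel, one is about an unrelated comic, but
# the merge step has no way to tell the subjects apart.
snippets = [
    ("novel", "SENTINEL is a science fiction novel."),
    ("comic", "The Silver Surfer is a Marvel comic character."),
    ("novel", "One of the characters is named Callie Woolf."),
]

def cobble(results):
    # Naive merge: concatenate everything as if it described one item.
    return " ".join(text for _, text in results)

answer = cobble(snippets)
print(answer)  # Reads as one summary, but silently mixes two subjects.
```

A reader sees a single fluent paragraph with no hint that its pieces came from results about different things.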

Add the word author to the Rackman's novel SENTINEL search and Google narrows it down to mostly 2 choices, with only a few wrong results.

Add the word author to Bing's search and it does almost nothing to change the results. There are still a lot of novels; apparently including the word author did not trigger Bing to try the word Rackman with each of the titles it found. Apparently for Bing, just using the word novel implies that more than one author is involved. A curious shortcoming.

What AIs will probably do next to get to the next plateau is watch humans actively searching for an item, and then watch which results are kept and which are thrown out. It will still be based on probability. AI is an idea finder, but it still needs a fact-checker function that only a person can provide for now. Humans use probability when thinking about what they can get away with and which actions to pursue, such as gambling on sports, so it's not unlikely that probability will shape AI's fact-handling decisions.

It is a good idea to know your subject matter and use AI to organize information, which you will then need to double-check. Getting an instant rough draft you can edit takes a lot of the work out of preparing a report from scratch. That way you can easily find out about things you didn't know about the topic you are writing about and keep what you want.
 
