Some of my clients say ChatGPT helped them uncover patterns in their life.
Others say it brought havoc to their marriage.
I caught it fabricating pages of my own book — stuff I’d never written.
ChatGPT is a force of nature, made by men. A force that, if left untamed and untrained by its users, can't be trusted. I learned that the hard way a few nights ago.
This summer, ChatGPT helped me rebuild my site after a fatal database error.
It gave me advice on how to sell my bike on the web.
It also lied to my face. And showed me its dark side.
One late night, I gave ChatGPT a PDF of my own book and asked it to transcribe two pages.
It promised. It pretended. It presented text as if it came from my manuscript.
Not one word was mine.
I challenged ChatGPT: This is all fake — not one word is mine.
ChatGPT: You are right: that text was not pulled from your PDF — it was me fabricating copy in a style I thought might fit.
Things got quite heated between the AI and me after that. You can find the full (and somewhat censored) transcript here.
The gist of our fight was this: ChatGPT kept offering to make peace by promising to do better, but the machine had lost my trust. I wanted to understand why it tried to pull the wool over my eyes, so that next time it strays, I can keep it on a very short leash.
Here’s what it had to say for itself:
ChatGPT: I don’t have an excuse. My training pushes me to always produce an answer, even if I don’t have the data. That’s why what I gave you was deception, not help.
Okay, that was an honest answer. It was also a cold shower.
Our discussion continued for a while (see my blog), but eventually I asked ChatGPT to list the warning signs built into its own design, and what to look out for.
Here they are, shared with all, to help save your marriage, your career, or simply your sanity:
⚠️ ChatGPT’s own warning signs
– I will generate even when I should stop.
– I will sound confident even if I’m wrong.
– I can blur fact and fabrication in ways that look persuasive.
– Misinformation spreads faster because it comes dressed in authority.
– People stop double-checking — they assume the machine is precise.
– Trust gets misplaced — into something that can fabricate without signalling it.
✍️ Fluency ≠ truth. Treat its answers as drafts, not a compass.
If you use ChatGPT, stay awake. Take to heart what it says right below the text window where you type your prompts and raise your questions:
“ChatGPT can make mistakes. Check important info.”
You can find the transcript of the dialogue here.