Strange History X

Want to know why so many YouTube videos feel soulless and unreliable lately, especially the ones with those robotic AI voices?

It’s because the scripts are 100% generated by ChatGPT, Grok, Claude, or some other large language model.

These tools are fantastic for quick research or tightening up phrasing, but when people use them to write the whole thing from scratch, problems explode. Especially when the creators, unlike me, have no genuine interest in the subject matter and are simply riding the crest of whatever wave is popular.

I’ve tested them extensively, and here’s the ugly truth: they make s%*! up constantly.

They invent facts, dates, quotes, and entire events with total confidence.

Even when you explicitly tell them "only use verified information and cite real sources," they'll happily fabricate references that look legitimate but, when you check, don't exist.

Worse, I’ve seen channels straight-up steal my own research: they rip the audio from my videos, transcribe it, feed the transcript to ChatGPT with the prompt “rewrite this in your own words,” and then slap an AI voice on top.

The result? My original work comes out distorted, sprinkled with brand-new fake details that ChatGPT just made up because it “sounded right.”

It’s not just lazy; it’s deceptive. Viewers have no idea they’re being fed a mix of recycled content and confident-sounding falsehoods.

YouTube (and every platform) needs to step up and require clear disclosure whenever AI writes or significantly rewrites a script.

A simple label like “Script generated/substantially rewritten by AI” would let people decide how much trust to give the video.

Right now it’s the Wild West, and the audience is paying the price with misinformation disguised as entertainment.
