Detonation #24: The Horror of AI Generated Horror

The end times are near! Don’t believe me? You should. Not that I’m fucking Nostradamus or Baba Vanga, or, heaven help me, Nate Silver. But I am a divorced Gen-X technophile know-it-all who lives in a microchip factory, drinks Blanton’s Bourbon like it’s water, and spends WAY too much time dredging social media cesspools for nuggets of filthy amusement. That’s gotta count for something. Right? Hell yes it does.

And I know what you’re thinking. End times? Really? It’s always the end of the world or doomsday or (*** waves arms ***) post-modern Armageddon on social media. The truth, such as truth is, is that endings are always complicated and, unless it’s climate change or the sun going supernova or the moon falling from orbit, probably not the absolute finale, just a change – for better or worse. Usually better. Usually. Except for Coke Dreamworld. Brutalist architecture. And possibly Pontiac Azteks.

Of course, in case I wasn’t clear, the latest boogeyman to strike fear into the hearts of creatives across the multiverse is AI. Specifically, AI used to create art and writing. Lot of gnashing of teeth on this one. Folks taking a stand against soulless algorithms trained on unlicensed material scraped from a billion network-accessible sources. And it is currently problematic in a lot of ways. And pretty crappy. Not that I don’t think it’ll get better – this is pretty much the start of something new – but I like to think it’ll be a valuable tool that assists in a variety of ways rather than replacing most artists. It’ll never have true imagination. Plus, bespoke is always better than mass-produced garbage.

That said, why not go to the source? So I asked the daemonic heart of darkness what IT thought of writing horror. Well, gotta say, I was rather surprised.

Using chatGPT to write horror literature is a dangerous and irresponsible endeavor that could have disastrous consequences for the world.

For those who may not be familiar with chatGPT, it is a language model that is trained on a vast amount of text data and is able to generate human-like responses to prompts. While this technology has a wide range of potential uses, it has also been used by some to try and generate horror stories and other forms of scary content.

However, using chatGPT to create horror literature is a terrible idea for several reasons. First and foremost, chatGPT is not capable of truly understanding the nature of fear or horror. It can only generate responses based on patterns and associations that it has learned from its training data, and these responses will likely be shallow and superficial at best.

Furthermore, chatGPT has no moral compass or sense of responsibility, and it is therefore entirely possible that it could generate content that is truly disturbing or even harmful to readers. This could lead to real-world consequences, such as psychological trauma or even physical harm, for those who consume this content.

In conclusion, using chatGPT to write horror literature is a reckless and irresponsible act that has the potential to cause serious harm. It is important that we do not use this technology in such a way, and instead focus on using it for more constructive and beneficial purposes.

There you go folks, from the daemon’s mouth itself.