Embrace the Weird: AI Hallucinations

One of my favorite parts of technology, any technology and especially early technology, is testing for the fringe. I like to understand what's good, what's bad, what's broken, and how I can tap into it better given any of those scenarios.

AI hallucinations—when artificial intelligence confidently gives "wrong" but plausible answers—are often seen as glitches. But in creative work, these "mistakes" can actually fuel fresh ideas. Sure, we need accuracy in some areas, but when it comes to creativity, letting the AI get weird can open up new possibilities by combining things in ways we'd never think of.

Why Hallucinations Matter in Creative Work
AI systems today can generate some pretty strange stuff. Why? Because they try to predict the most likely response based on patterns they’ve learned, even when they don’t have all the info. In business, we usually try to eliminate these inaccuracies, but in creativity, those weird outputs can be a spark for innovation.

Creativity thrives on blending the familiar with the unexpected, and AI hallucinations offer that in spades—mashing together ideas that humans might not usually connect. Instead of treating them like bugs, we should see them as inspiration to try new things.

Encouraging Hallucinations in ChatGPT
With ChatGPT, you can actually encourage these offbeat responses by playing around with vague or conflicting prompts. For example, if I’m brainstorming a new product, I might ask for something like “a futuristic chair that doubles as a garden.” Now, ChatGPT might give me something bizarre—like a chair that grows plants out of the seat while you sit. Sure, that’s not practical, but it could get me thinking about how to incorporate natural elements into furniture design in ways I hadn’t considered before.

The trick is to take that strange idea and run with it. Maybe I don’t use the exact concept of a plant-growing chair, but the oddball output leads me to something more realistic, like designing a chair with a nature-inspired aesthetic.
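The vague, idea-mashing prompts described above can even be generated mechanically. Here's a minimal sketch of that approach—the word pools and the `weird_prompt` helper are my own illustrative choices, not part of any AI tool—that pairs a familiar object with an unexpected twist, the same recipe behind "a futuristic chair that doubles as a garden":

```python
import random

# Illustrative word pools (hypothetical examples chosen for this sketch)
OBJECTS = ["chair", "lamp", "backpack", "teapot", "bicycle"]
TWISTS = [
    "doubles as a garden",
    "floats on clouds",
    "melts into the sky",
    "grows while you use it",
]

def weird_prompt(rng=random):
    """Mash a familiar object together with an unexpected twist,
    mimicking the vague/conflicting prompts described above."""
    obj = rng.choice(OBJECTS)
    twist = rng.choice(TWISTS)
    return f"a futuristic {obj} that {twist}"

if __name__ == "__main__":
    for _ in range(3):
        print(weird_prompt())
```

Feed a few of these into ChatGPT or Midjourney and most will be duds—but the point, as above, is that one odd pairing can nudge you toward a design direction you wouldn't have reached deliberately.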

Midjourney and Visual Hallucinations
Over in Midjourney, hallucinations happen visually. You might prompt something like "a retro car in a futuristic city" and end up with cars floating on clouds or buildings that melt into the sky. These visuals might not be what you intended, but they can offer up ideas you hadn't thought of—like using fluid shapes or dreamlike lighting in your designs.

I’ve often started with a basic prompt and gotten back something totally weird. Instead of tossing it aside, I focus on what’s working in the hallucination. Maybe the way the car hovers leads to a whole new idea for a futuristic transportation concept.

Balancing Accuracy and Creativity
Of course, not every hallucination is useful. Sometimes they’re just flat-out wrong, and for things like legal advice or medical applications, that’s not okay. But in creative fields, letting AI make those mistakes can actually lead to breakthroughs. The key is knowing when to reel it in and when to embrace the weirdness.

By treating AI as a collaborative tool—one that gives you strange, sometimes nonsensical ideas—you can use its creativity to push your own thinking. The weird stuff AI throws at you might not always be usable as-is, but it can kickstart your imagination in ways you wouldn’t have gotten to on your own.

The Freedom to Get Weird
At the end of the day, AI hallucinations are just another way to inject some randomness and unpredictability into your creative process. Sometimes, all it takes is that one odd output to spark a totally new direction. So next time your AI gives you something off-the-wall, don’t dismiss it right away. Dive into the weirdness—you might just find your next big idea in the madness.

Article Image Disclaimer
All images in this article were created by taking the full text of each paragraph and dropping it into Midjourney unedited. I did two re-runs per input and picked the most interesting image to use. The only predefined influence on these images is that they were all generated with the "personalize" option (--p) turned on.