“AI hallucination is all that genAI does,” said Symbol Zero CEO Rafael Brown. “All that it does is throw things together, like throwing pasta and sauce at a wall and waiting to see what sticks. This is done based on what the viewer likes and doesn’t like. There’s no real rhyme or reason. There isn’t true structure, context, simulation, or process. There is no skill, insight, emotion, judgment, inspiration, synthesis, iteration, revision, or creation. It’s like a word jumble or a word salad generator. It’s not even as good as Scrabble or Boggle. It’s better to think of it as AI Mad Libs — trust your business, your future, and your creation to AI Mad Libs.”
There’s also the possibility that genAI might well implode as it starts feeding on itself and all reality-based data vanishes. That’s how Pascal Hetzscholdt, senior director of content protection at publisher Wiley, sees it.
“Models like ChatGPT4 must constantly be retrained on new data to stay relevant and useful,” he said. “This means generative AI is already starting to eat itself alive by being trained on its own output or other AI output.