Is It Absurd To Critique Generative AI?
I often find Cory Doctorow’s commentary interesting, and I typically come away from each piece with a few nuggets that are salient and useful. My recent read of his March 12, 2026 Pluralistic article was no exception, but one statement struck me as patently inaccurate. So much so that I left a comment (I almost never comment on anything) and am writing more about it here.
In the Pluralistic post, Doctorow takes on AI hype and the dawn of AI psychosis as a topic, and makes many good points along the way. He then says this:
This is an extremely normal technological situation: for a new technology to be promoted and productized by shitty people who have grandiose goals that would be apocalyptic should they ever come to pass — and for some people to find uses of that technology that are nevertheless beneficial to them and their communities. The belief that AI is an exceptionally bad technology (as opposed to an exceptionally bad economic bubble) drives AI critics into their own absurd culs-de-sac (sic). There are many, many skilled and reliable practitioners of technical and creative trades who’ve found extremely reasonable, normal ways in which AI has automated some part of their job. They aren’t hyperventilating about how AI has changed everything forever and the world is about to end. They’re not mistaking AI for god, or a therapist. They’re just treating AI like a normal technology, like a plugin.
(I added the bold formatting myself for emphasis.)
I don’t think that AI critics have driven themselves into an “absurd cul-de-sac” by claiming that AI is both an exceptionally bad technology and an exceptionally bad economic bubble. It can be both. And this doesn’t mean that the people using AI in their day to day—the people Doctorow goes on to describe in the following paragraphs—are bad.
But the tech, the tech—yes, there are reasons to say that AI is exceptionally bad, reasons that don’t apply to other forms of commercial consumer-and-enterprise technology that have come before.
Why is AI “more bad” than some other tech?
The transition from typewriter to computer was a big shift. Using a computer, even just for word processing, requires additional knowledge, power, and computation beyond dusting off your Underwood or plugging in an IBM Selectric. You have to know how to find and open the word processing app of your choice, a step that involves additional tools (like a mouse) and time. I don’t mean to claim for one minute that this wasn’t a jarring shift in some ways.
But even though a computer might draw more energy than an electric typewriter (and obviously more than a manual one), there are two things that make the shift from typewriter to personal computer less damaging than the shift from personal computer to generative AI: power and noise.
AI and Power Consumption
My computer sits on my desk. If I plug it in and turn it on, it will begin to draw power. If I turn it off and unplug it, it stops drawing power.
Anker, a maker of charging bricks and cords, says that a 13” MacBook Air draws between 8 and 10 watts under a typical workload. If we split that down the middle and go with nine watts, the computer will use about 72 watt-hours over the course of an eight-hour workday.
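If you want to see that math on the page, here's a quick back-of-the-envelope sketch; the nine-watt average and the eight-hour day are my assumptions drawn from the Anker figure:

```python
# Rough energy use for one workday of laptop work.
# Assumptions: ~9 W average draw (midpoint of Anker's 8-10 W range)
# over an 8-hour workday.
avg_draw_watts = 9
workday_hours = 8
energy_watt_hours = avg_draw_watts * workday_hours
print(energy_watt_hours)  # 72 Wh, i.e. 0.072 kWh per day
```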
The data centers behind AI tools, however, are drawing on the power grid around the clock; the tools are popular enough that someone, somewhere, is always querying them.
Sam Altman, CEO of OpenAI, has stated that one ChatGPT query uses 0.34 watt-hours of electricity. Yes, that’s a small fraction of what an hour of laptop use draws, but think about how many ChatGPT messages get sent each day.
The Institute of Electrical and Electronics Engineers (IEEE) calculated that the average ChatGPT user’s daily queries use the energy equivalent of turning on a 10-watt LED lightbulb for one hour. The IEEE also estimated that over the course of a year, all generative AI queries draw an amount of energy equivalent to what two nuclear reactors produce in that same span.
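Putting Altman's per-query number next to the IEEE's per-user figure gives a rough sense of how many daily queries that estimate implies. This is my own sketch; the 10 watt-hours per day is just my reading of the "10-watt bulb for one hour" comparison:

```python
# How many daily queries does the IEEE's estimate imply per user?
# Assumption: ~10 Wh/day per user (a 10 W bulb running for one hour).
wh_per_query = 0.34          # Altman's per-query figure
wh_per_user_per_day = 10.0   # IEEE's LED-bulb comparison
queries_per_day = wh_per_user_per_day / wh_per_query
print(round(queries_per_day))  # roughly 29 queries per user per day
```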
And, of course, you have to use a computer to interact with generative AI tools, so AI use stacks on top of the power each person already draws at their machine. It seems small at first, but when you consider the total volume of computer and AI use over time, along with widespread pushes for everyone to use more generative AI, there are some real problems.
The IEEE kept crunching the numbers and, cross-referencing a Schneider Electric report, determined that we do not currently generate enough power to keep up with AI-related energy demands. Building an additional 44 nuclear reactors could meet that need by 2030, but…that’s not exactly something you whip up in four years.
Forget power, what about noise?
Even if you say the power component alone isn’t an issue, we can’t overlook noise pollution.
If I turn on my computer and fire up every single app, I might hear the fan start to whir. My neighbors, however, will not hear my computer whir, even if I have all the windows open. (And they do indeed live close enough that I can talk to them through my office window when they’re outside.)
What my neighbors and I can hear, though, is the sound of the data center about half a mile (0.8 km) from our homes. It’s not all the time, but when the generators and cooling systems kick on, that sucker is LOUD. The Environmental and Energy Study Institute (EESI) reports that data centers emit high- and low-frequency sounds that can raise noise levels up to 96 decibels. And that’s just the servers; when the generators kick on, the sound tops 105 decibels, akin to a jet flying overhead.
The EESI also says that sounds above 65 decibels can increase physical stress, while sounds above 85 decibels begin to hurt people’s hearing.
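Keep in mind that decibels are logarithmic, so those jumps are bigger than they look on paper: every 10 decibels is roughly a tenfold increase in sound power. This quick sketch (my math, not the EESI's) shows the ratios:

```python
# Decibels are logarithmic: +10 dB means ~10x the sound power.
def power_ratio(db_low: float, db_high: float) -> float:
    """How many times more sound power the louder level carries."""
    return 10 ** ((db_high - db_low) / 10)

print(power_ratio(96, 105))  # ~7.9x: generators vs. servers alone
print(power_ratio(65, 96))   # ~1260x: server noise vs. the stress threshold
```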
Oh yeah—your data
And finally, we get to a third issue that I haven’t touched on yet: data harvesting. A brand new computer is a fairly blank slate: it doesn’t come pre-loaded with all of the books, art, and films that people made before you. But generative AI does.
If you wanted to keep a copy of every book used to train Meta’s large language model (LLM) in your own home, you’d need 5,125 Kindle Paperwhite e-readers. And that’s just for the books—this doesn’t include any other media they’ve included in their AI training data set.
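To put that comparison in storage terms, here's my own back-of-the-envelope math; the 16 GB per base-model Paperwhite is my assumption, not part of the original figure:

```python
# Implied size of the book portion of the training set,
# assuming 16 GB of storage per Kindle Paperwhite (base model).
kindles = 5_125
gb_per_kindle = 16
total_gb = kindles * gb_per_kindle
print(total_gb)            # 82,000 GB
print(total_gb / 1_000)    # ~82 TB of books
```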
Every time you use a generative AI tool, you’re transmitting data back to a corporation. Yes, if you have a paid account, you can usually turn off the “use my data to train your model” option, but that doesn’t mean your chats are private or end-to-end encrypted. Heck, Claude Code on the web copies your entire repository of code to an Anthropic-owned machine.
This doesn’t make you bad. It’s normal to be interested in a new technology, and many people have to use it at work because their employers now require it. As Doctorow points out in his article, there are legitimate reasons for using generative AI tools, and I can’t say that there won’t be any benefits in some areas over time. But generative AI is a fundamentally different consumer technology from others that have hit the market over the past few years, and to claim that anyone criticizing the tech is in an absurd cul-de-sac doesn’t quite hold water with me.
But we don’t have to see eye-to-eye. A difference of opinion in that area won’t keep me up at night. The noise from the data center can do that all on its own.