Writing for busy readers, and why not to use AI


I recently read The Productivity Shift (TPS) – a report from Grammarly with the basic premise that businesses should use AI to help employees communicate better, which will increase productivity. You can skim the report if you want, but it’s mostly a lengthy marketing pitch for Grammarly – a company that sells AI-powered software that edits your writing. Maybe Grammarly sells other stuff, too, but I’ve never used any Grammarly product, so I can’t speak to the breadth of what they offer.

Within the past year or so, I also read Writing for Busy Readers (WfBR), a book by Todd Rogers and Jessica Lasky-Fink about how to write more effectively. The premise of the book is similar to the premise of the Grammarly report – a lot of workplace writing is ineffective, and if we can write better, we will be more productive. It’s a full-length book, so there’s more nuance than this, but this feels like a fair enough one-sentence synopsis.

I liked WfBR a lot. I thought TPS was trash.


In my mind, the fundamental difference between the two is that TPS argues that you should use AI to improve your communication, whereas WfBR argues that you should implement these six principles (and related strategies) to improve your communication. In other words, TPS wants you to outsource your writing to AI, whereas WfBR wants you to own your writing.

I'm neither pro- nor anti-AI. But I am vehemently pro-thinking. If TPS were about an AI tool that could read my notes to automatically schedule follow-up meetings, I’d be all for it. That tool would save me time by automating a mundane, mindless task that requires little-to-no real cognition. But it’s not about that. It’s about an AI tool that is supposed to automate my writing.

There’s a fairly famous quote from William Zinsser – the author of On Writing Well – that applies here: “Writing is thinking on paper. Anyone who thinks clearly should be able to write clearly – about any subject at all.” If we run with this idea, then going all-in on AI means we’re not just outsourcing our writing; we’re outsourcing our thinking as well. And if we’re struggling to write clearly, it probably means we’re struggling to think clearly. The solution shouldn’t be to offload that thinking to Grammarly or Claude or Gemini or ChatGPT as soon as it becomes difficult. The solution should be to wrestle with your thoughts until you can make them more intelligible.

Take a look at the six principles that Rogers and Lasky-Fink outline in WfBR if you haven’t yet (and really, read the book if you can). Sure, they’re principles for writing, but many are principles for thinking, too. “Less Is More” isn’t just about using fewer words or including fewer ideas; it’s about thinking through your topic thoroughly enough to use precisely the right words and include only the most important ideas. “Tell Readers Why They Should Care” means you, the writer, need to have thought through the “so what” of whatever you're writing about. “Make Reading Easy” means you need to be confident enough in your ideas to state them plainly rather than hiding them behind doublespeak.

Maybe AI tools can take mediocre, unclear writing and make it more concise. Maybe they can make it easier to read. Maybe they can even tell readers why they should care. But is letting AI co-opt your thinking really a long-term strategy, especially in a white-collar knowledge-economy job?


What does this have to do with data? Well, the vast, vast majority of “data people” in education aren’t exclusively “data people.” Many are teachers or school administrators who are also tasked with making sense of data. Some may be business analysts, financial analysts, or program administrators. Some are researchers. Some straddle multiple roles. I’m part researcher, part data scientist, for example. Regardless, nobody only works with data. These roles are just as much about communication as they are about data analysis. 

And both data analysis and communication require clear thinking. If I can’t clearly communicate the results of an analysis I’ve conducted, then there’s a good chance that:

  1. I don’t understand the data;
  2. I don’t understand the methodology;
  3. I don’t understand the bigger picture; or
  4. All of the above.

So if I analyze some data and find myself struggling to write about it, this should be a clue that, somewhere along the line, I’m missing something. It shouldn’t be an invitation to dump a tangle of half-coherent thoughts into Grammarly and pray that the AI-gods can bail me out. 

If you’re enjoying reading these weekly posts, please consider subscribing to the newsletter by entering your email in the box below. It’s free, and you’ll get new posts to your email every Friday morning.
