
Melissa Jacobs, the director of library services for NYC Public Schools, has recently noticed a somewhat unsettling trend in education.
More and more of the professional writing materials she sees from her educator peers seem to be written by AI.
“I think newsletters are more AI-driven. I think evaluations and summaries and narratives and things that used to be the individual creating are now being driven by AI,” she told me recently.
I’ve noticed the same thing in my communications with students and, in some cases, from peers. I recently wrote about how, at the start of 2026, I started suspecting students were writing me AI-generated emails. Jacobs began noticing this uptick in AI use by educators over the past six months.
We discussed this growing and little-discussed problem in education, and what can be done about it.
Using AI Doesn’t Always Make Things Easier
One area Jacobs has seen a big uptick in AI-style writing is on grant applications she oversees.
“The answers to the questions for a large majority of them appear to be very AI-driven, rather than, ‘This is my own thought, my own belief, my own answer to this question,’” Jacobs says.
Educators seeking a grant shouldn’t outsource the grant-writing process to AI for many reasons, not least because it does the opposite of making an application stand out.
Additionally, the application process is designed to help grant recipients succeed once they receive a grant. Part of that, Jacobs explains, is to help applicants plan out the program for which they will use the grant.
“If they’re not doing the work in the beginning, then they’re not going to be successful running the program if they do receive the funding,” Jacobs says. “They’re missing out on a big component of it. We’re asking questions not as busywork, but as a thought process.”
AI Shouldn’t Replace Thinking
In the AI era, it has become a cliché that AI shouldn’t be used to outsource thinking, but Jacobs believes some educators may need a reminder of that.
“I’m not saying it’s not a good tool to use, but I am encouraging original thinking,” Jacobs says. She adds that some educators are using AI in exactly the way we try to discourage students from using it. “They’re not using it as someone that they’re editing with, but they’re like, ‘Oh, okay, it’s done. I’m just going to cut and paste this here.’”
She adds that this approach shows in the quality of the work educators submit. “Their voice is not consistent across all of their work, if you’re looking at multiple pieces, and I think that they’re not putting the effort into it.”
As we constantly tell students, it may be okay to use AI in certain instances to help you figure out how to write, but you shouldn’t use it to write for you.
We Need AI Work Culture Norms in Education and Beyond
Jacobs says that right now, AI use by professionals can create awkward situations. She notes that it’s hard to give feedback to someone who didn’t actually create the work you’re critiquing.
These are the kinds of discussions and challenges educators have been navigating with their students since the debut of ChatGPT, but they are now becoming more common among educators themselves. Jacobs wants to see this issue get more attention, and she wants more workplace norms to develop around AI use.
In the meantime, she believes educational leaders can look to the playbooks they developed around student AI use.
“There has been a lot of focused attention on teaching students how to use AI effectively, but there has not been a lot of focus on how to use it effectively with colleagues,” Jacobs says. She adds that it might be time to change that.

