Many college instructors are encountering a familiar classroom dynamic: students arrive able to discuss the “main idea” of a reading, but struggle to point to specific passages, explain how an argument develops, or engage closely with the text itself. Increasingly, that surface-level familiarity comes not from reading, but from AI-generated summaries.
Generative AI tools such as ChatGPT and NotebookLM can quickly produce fluent overviews of complex texts. While these tools can be useful for review or clarification, they also make it easy for students to bypass the cognitive work that reading is meant to support. Rather than attempting to monitor or restrict AI use, we experimented with a simpler approach: redesigning the reading environment to make engagement with texts more visible, tangible, and accountable.
Why Reading Needs Redesigning
Reading has long been understood as an active, interpretive practice that underpins discussion, analysis, and writing (Haas & Flower, 1988; Bean & Melzer, 2021). Yet instructors across disciplines report growing difficulty sustaining students’ attention to assigned texts (Bergman, 2024). In AI-saturated classrooms, this challenge is intensified: students can access summaries, explanations, and argument breakdowns without encountering the language, structure, or evidence of the original text.
The problem is not simply that students are using AI, but that reading itself has become increasingly invisible. If students can participate in class using secondhand representations of texts, there is little incentive to engage deeply with them. We wanted to make reading visible again: something students do, rather than something they delegate.
The Context for the Intervention
The intervention was implemented in a first-year writing program at an American branch campus in the Middle East, where all students complete two writing courses focused on critical reading, analysis, and argumentation. Class sizes are small (sixteen students), allowing instructors to monitor reading practices. Courses are organized around shared themes (identity, inequality, or taste and distinction) and include a range of genres, from scholarly articles to essays and empirical studies.
Like many instructors, we noticed that students’ preparation increasingly relied on AI-generated summaries. Students could often identify a text’s topic, but had difficulty engaging with its reasoning or language. We therefore focused on redesigning the conditions under which reading happened, rather than focusing on students’ tool use.
Although the intervention was implemented in a writing course, the instructional problem it addresses—students’ limited engagement with assigned readings—is common across college courses. We therefore approached the design with transferability in mind.
The Intervention: A Printed Reading Booklet
At the beginning of the semester, each student received a printed booklet containing all assigned readings for the course. Instead of accessing texts through PDFs or links, students worked with a physical packet they kept throughout the term.
Students were asked to annotate the readings by hand—highlighting key ideas, underlining claims, and writing questions or comments in the margins. This material shift reframed reading as a tangible activity rather than a screen-based task that can be easily interrupted or replaced. Research on handwriting suggests that writing by hand encourages selective attention and deeper processing (Mueller & Oppenheimer, 2014), a pattern we observed in practice.
Over time, the booklet became a cumulative record of students’ engagement with course texts, which students frequently referenced during class discussions and written assignments.
Making Reading Accountable
To reinforce active reading, we paired the booklet with a simple accountability mechanism. Before each class, students uploaded photos of their annotated pages to the university’s learning management system.
These submissions were not graded. They functioned as evidence of preparation, similar to arriving ready to participate in discussion. Because class sizes were small, reviewing submissions was manageable, and students quickly understood that reading was expected and visible.
This small step had a noticeable impact. Students came to class prepared to reference specific passages, ask text-based questions, and support their claims with evidence. Importantly, handwritten annotations made it difficult to rely exclusively on AI-generated summaries, which typically do not leave traces of interpretive decision-making.
What Changed in Practice
Across the semester, several patterns emerged.
Students reported reading more carefully and with greater focus. Many noted that working with a printed text encouraged them to slow down and pay attention to language and structure in ways they did not when reading on a screen.
Class discussions became more grounded in the text itself. Students referred to page numbers, quoted language directly, and built on one another’s observations rather than relying on general summaries.
In written assignments that required engagement with course readings, students demonstrated clearer use of textual evidence and more precise integration of sources. Reading was no longer just a preparatory task, but a visible component of students’ intellectual work.
Practical Considerations: Format and Permissions
Because the intervention involved compiling course readings into a single printed booklet, we coordinated with library staff to ensure that all materials were already covered under existing licenses or course reserves. In practice, the booklet did not introduce new copyright issues; it simply consolidated readings that had already been approved for instructional use.
Instructors considering a similar approach should consult their institutional guidelines, but in many contexts, assembling assigned readings into a single format is a manageable and compliant process.
Why This Works in an AI-Rich Classroom
This intervention does not prohibit AI tools or frame them as inherently problematic. Instead, it creates conditions in which reading cannot be fully outsourced. By making students’ interaction with texts visible and tangible, the booklet restores reading as a site of interpretation, judgment, and meaning-making—forms of engagement that AI-generated summaries cannot replace (Bender et al., 2021).
The goal is not a return to print for its own sake, but intentional pedagogical design. When instructors change the material conditions of learning, they also change what students attend to, and what they can easily avoid.
A Small Change with Broad Applications
Although this intervention was developed in a first-year writing course, it is easily adaptable to any course where reading matters. It requires no new technology, minimal ongoing labor, and little explanation to students.
In a moment when generative AI is reshaping how students engage with academic work, instructors do not always need complex technological solutions. Sometimes, a low-tech redesign is enough to put responsibility for meaning-making back where it belongs: in students’ hands.
María Pía Gómez-Laich, PhD, is an Associate Teaching Professor of English at Carnegie Mellon University in Qatar, where she teaches and researches academic writing and genre-based pedagogy in higher education. She is the co-author of Analysis and Argument in First‑Year Writing and Beyond: A Functional Perspective. Her work has appeared in Linguistics and Education, English for Specific Purposes, TESOL Quarterly, and the Modern Language Journal.
Silvia Pessoa is a Teaching Professor of English at Carnegie Mellon University in Qatar where she teaches and researches academic and disciplinary writing. She is the co-author of Analysis and Argument in First‑Year Writing and Beyond: A Functional Perspective. Her work has appeared in TESOL Quarterly, Journal of Second Language Writing and the Journal of English for Academic Purposes. She is on the editorial board for English for Specific Purposes.
References
Bean, John C., and Dan Melzer. Engaging Ideas: The Professor’s Guide to Integrating Writing, Critical Thinking, and Active Learning in the Classroom. San Francisco: Jossey-Bass, 2021.
Bender, Emily M., Timnit Gebru, Angelina McMillan-Major, and Shmargaret Shmitchell. “On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?” In Proceedings of the 2021 ACM Conference on Fairness, Accountability, and Transparency (FAccT ’21), 610–623.
Bergman, Lisa. “Students’ Reading in Higher Education: Challenges and Ways Forward.” Journal of Adolescent & Adult Literacy 67, no. 4 (2024): 414–423.
Haas, Christina, and Linda Flower. “Rhetorical Reading Strategies and the Construction of Meaning.” College Composition and Communication 39, no. 2 (1988): 167–183.
Mueller, Pam A., and Daniel M. Oppenheimer. “The Pen Is Mightier Than the Keyboard: Advantages of Longhand Over Laptop Note Taking.” Psychological Science 25, no. 6 (2014): 1159–1168.