When I first designed an assignment called Seeing with the Machines, I wanted students to move beyond using artificial intelligence as a convenience and begin studying it as a social actor. I teach at a large public university in Appalachian Ohio. Many of my students are first-generation or Pell-eligible. They work, share devices, and juggle caregiving. Technology runs through every student’s story, essential to learning and daily life, yet few pause to ask how these systems know what they know or who they are built to serve. The guiding question that now shapes much of my pedagogy is simple but generative: What does it mean to teach sociology in a moment when AI is both an object of study and a tool shaping pedagogy?
Michael Burawoy reminds us that our students are our “first and captive public.” When I treat the classroom as a public space, I invite students to practice sociology not as an academic exercise, but as a civic act. That idea has taken on new meaning in one of my newest courses, The Sociology of Artificial Intelligence. The class grew out of a broader rethinking of what it means to teach sociology at a time when our publics are increasingly mediated by machines. Rather than treating AI as a technical subject, I approach it as a sociological phenomenon, a set of systems that shape visibility, belonging, and power in everyday life.
Our conversations often begin with an unexpected realization: our “public” now includes the machine itself. Algorithms track, predict, and respond to us, sometimes more quickly than our institutions do. As these systems become everyday interlocutors, they also become our newest public, operating as invisible institutions of governance. The classroom has become a field site of algorithmic power. I realized that to teach sociology responsibly, I had to help students see how their education was mediated by technologies that also classified, ranked, and predicted them.
The assignment began as a short ethnographic exercise. Students selected a machine they interacted with daily, such as ChatGPT, Spotify, Google Maps, or even the self-checkout scanner at the local grocery store, and observed how it saw them. They wrote fieldnotes on moments when the system appeared to misrecognize or over-recognize their behavior, when an algorithm insisted on the wrong music genre, or when a predictive text app completed their sentences in ways that revealed cultural bias. One student laughed as she described Spotify’s algorithm as “that clingy friend who thinks it knows me better than I do,” before realizing that its pattern of suggestions mirrored the narrow labels attached to her digital profile.
I pair this with short readings and quick drills. We map the people and tools around a familiar app, name the labels it uses, and ask who benefits, who is misread, and who is missing. We aren’t chasing tidy answers. We aim for better questions that students can carry into their everyday lives. I have learned to slow down, leave room for silence, and invite stories. Students’ reflections land harder when they connect the abstract with the local. One student wrote about the uncanny comfort of an AI mental-health chatbot that seemed to “listen” better than people did, yet offered canned empathy detached from context. Such reflections were analytic yet deeply personal, blurring the boundary between data subject and sociologist.
When generative AI entered the classroom, I felt the same ambivalence many instructors feel. The same technologies that could enhance teaching also threatened to flatten it. Rather than banning the tool, I positioned it as a text to be critiqued. In one activity, a chatbot summarized Durkheim on social facts. We read with pencils, circling what felt too smooth or too certain. The goal was not to catch errors but to model inquiry. Students saw how these systems echo dominant stories with confidence. A prompt about “successful families” returned a narrow script, and we learned that critical thinking comes from our practice, not from the tool.
Equity sits at the center of this work. Some students rely on outdated laptops, unstable internet, or shared family devices. When we discuss algorithmic bias, they recognize themselves not as abstract examples but as those whose data are often missing from training sets. I try to design with that in mind. Using free tools, I invite collaborative note-taking and shared inquiry, so no student is left behind. I keep a parallel reading list with working-class, rural, and non-Western voices. Students see themselves in the materials and feel licensed to question how knowledge infrastructures are built.
Some of my most memorable teaching moments come from students’ refusals: the decision not to use an AI tool, not to accept an algorithmic classification, or not to write in the machine’s preferred style. Their refusals remind me that critical pedagogy is not about producing compliant digital citizens but about cultivating discernment. One student refused to let ChatGPT summarize her fieldnotes, explaining that doing so felt like “outsourcing reflection.” Another chose to hand-draw her network map rather than use software, arguing that the tactile act of drawing revealed relationships she would have missed otherwise. These gestures, small but significant, embody the ethics of “slow sociology” in a fast computational world.

Teaching in this way has changed how I assess work. I now value evidence of process as much as the final product. Annotated screenshots, reflective journals, and short audio notes all make the thinking visible. What matters is how students think with and against machines, not just what answers they produce.
Writing this, I see that my public is not only my students but also the educators facing the same dilemmas. Each term, we rehearse sociology’s perennial question of how structure shapes life, now refracted through digital interfaces. If public sociology aims beyond the academy, teaching AI makes that aim urgent. To teach while the machine watches back calls for humility, craft, and vigilance, reminding me that pedagogy is a sociotechnical practice that, when reflexive, can cultivate critical publics who imagine more just futures.

Tamanna M. Shah is the Eric A. Wagner Professor of Sociology at Ohio University and a Curriculum Writing Fellow at Harvard University. Her research focuses on comparative political sociology, gender, peacebuilding, and armed conflict. She serves as the Book Reviews Editor for Sociological Research Online and is a member of the Editorial Board of ASA TRAILS. She is the author of Children and Youth as “Sites of Resistance” in Armed Conflict, Volumes I and II, published as part of the ASA Section on Children and Youth series.

