An unsettling result of using ChatGPT too much

I am sort of startled at how often I use ChatGPT as a teacher. If anyone is bored, I wrote out some details but I will put them at the bottom because it may be more specific than anyone cares about.

The upshot of all of them, though, is that in every case they work because I have a precise vision of what I am looking for, and I have that precise vision because I did it the “hard way” for over 20 years. If I’d had ChatGPT as a younger teacher, I do wonder if I would have relied on it to build my vision in a way that would have made my own development stagnate. I didn’t know enough to argue with it.

I especially worry about teachers who use it to give feedback. Reading essays and giving feedback is appallingly awful. But it’s also how you learn to teach writing, because over the course of reading and commenting on thousands of essays, you see patterns, and you learn how people who aren’t you think about writing. You honestly don’t even know what you are trying to get students to do until you’ve read hundreds of essays that don’t do it. So how does that work if you skip that step? How do you ever learn what you even want? And if you don’t learn what you want, how do you learn to teach it?

I also worry that ChatGPT is only good at giving feedback on a subset of the things that could be given feedback on. So when teachers let it give feedback and then tweak it–something I’ve heard about frequently–some of the important stuff isn’t there to tweak. But since so much IS there, it’s not obvious. There’s plenty of feedback, after all. So I worry a student could have big flaws that never get addressed. This is even more of a problem if multiple teachers in a row use the same approach. Typically, it’s a good thing to have one teacher who is hell on concision, another who really focuses on effective organization, and then someone who never lets you get away with imprecise word choice, or whatever. Everything gets “picked up on” by someone. But if AI is really the one determining what matters, and it doesn’t change from class to class, that’s not as effective. And kids will learn to write what the AI likes.

The TL;DR is that people try to use ChatGPT for the intellectually challenging stuff, when they should be using it for the time-consuming but stupid stuff.

Ok, how I use ChatGPT in the classroom:

Vocabulary lists and quizzes. I can feed it a list of words and ask for brief definitions and sentences that contain context clues, and it just spits them out. I can ask for a story that uses all the words. I read them over, make minimal tweaks, and they are great. I can feed it a text and ask it to flag challenging vocab and generate a list of definitions at whatever grade level I would like. I can feed it a list of words and ask for a morphological analysis. All of this is stuff I could do with a dictionary, but now in 10% of the time and with no worries about copyright.

Writing “textbook” copy. I teach AP Government, and I hate all of the textbooks. They go into too much detail about irrelevant things, or rely on prior knowledge the students don’t have, or fail to make key distinctions (or make irrelevant distinctions). I know exactly what I want students to know about, say, congressional oversight, but it takes me a long time to write it out. Instead, I (and many teachers) rely on bullets on slides and hope the kids listen as we explain. With AI, I can quickly generate an explanation that is exactly suited to the amount of time I have for the topic and the prior knowledge of my students, and that contains all the relevant information. It does always involve a certain amount of editing. That means I can assign 350 words of reading before class instead of 3,500. This makes it much more likely the reading will happen, and that we will have good discussions when they come back. I can also post those 350 words on Google Classroom without worrying about copyright infringement.

Generating model answers. In all my classes, we do a lot of 3-5 sentence answers to prompts (explain this, give an example of that), and I like to give students model responses to read after they finish their own. The nice thing about using AI for this is that it doesn’t sound like me; it sounds like a textbook, like a model. I don’t want them to try to sound like me, or to think the goal is to sound like me. I want to give them a more neutral model. The other advantage is that sometimes ChatGPT thinks of additional approaches I didn’t. So the model answer they see after they submit their own is more comprehensive. They review the model, I ask if they have questions about it, and we get much better discussions. But again, I often have to edit or correct the models.

Coming up with language. At times, it’s really good at helping me find a term for something or wrap my head around an idea that I understand, but am having trouble breaking down for kids. This is when it gets really “chatty” and I don’t directly use anything it produces. It just gives me new approaches for my own thinking.

TPS forms. I have a good principal and don’t have to turn in dumb lesson plans or intervention plans or anything, but if I did, I would 100% use ChatGPT for that.