How Sci-Fi Taught Me to Embrace AI in My Classroom


This story was published by a Voices of Change fellow.

Growing up as a sci-fi geek, I saw humanity’s future among the stars as one bolstered by artificial intelligence. In “Star Trek,” the ship’s omnipresent computer was a font of knowledge and advice that could even make a cup of Earl Grey. In today’s society, however, AI is often portrayed as a villain by the media and the public, particularly when it comes to AI in the classroom.

In my history class, one of my favorite projects is having students learn about a topic deeply, create a lesson and teach what they learned back to the class. It’s a fun way to start the lessons on how to research well and how to communicate learning to others. This year, when rolling out the project, I gave a short lesson on how to find good sources. I gave the usual spiel about Wikipedia, how to use online libraries and databases, and even how, sometimes, YouTube can provide good learning.

What truly shocked some students was when I said, “Use ChatGPT or other AI services to find sources.” One student said, very loudly, “What? We can use AI on this?” Now, obviously, my intent was not for them to use AI to do the project for them, but simply to point them toward good sources of information. Thinking back on this interaction gave me pause: Why did my student act so surprised when I said AI is useful? She simply couldn’t believe I had even said those letters, wondering out loud, “Is this a trick?”

I believe a lot of her reaction lies in how we, the adults, educators, parents and the media, have presented AI. Students, just like us, have been seeing all the articles and news stories about AI in the classroom. Every story sounds a bit the same: AI is the new “big bad” of education, and students are using it to cheat! There is little wiggle room. AI is the Death Star, and its aim is fixed on our students’ ability to think for themselves.

If the popular sentiment around AI and education is to be believed, there are few to no redeeming qualities to this emerging technology. While these sentiments may hold true for some, I also believe we are responsible for the way we frame the benefits and utility of AI. If we only present to students that AI is a tool for cheating, then students will only ever see AI as a tool for cheating. So, how can we reframe AI in the classroom for our students?

Still, and somewhat contentiously, I have hope. But if AI is to become something more than a tool for cheating, it is up to us as educators to educate ourselves and our students on its other uses.

“I do not fear computers. I fear the lack of them” — Isaac Asimov

The fear that AI will be used only as a tool for cheating and “dumbing down” students in writing and humanities classes, like social studies, reminded me of growing up in the age of calculators and, later, the internet. When I was a student in the early 1990s, calculators were demonized as a way to skip the hard part of math, a shortcut that would keep students from ever learning the basics. We have since seen how the calculator age changed mathematics education, with shifts toward new math principles like understanding how a calculation works, why it works the way it does, and how to apply it, rather than simply “finding the answer.”

I would argue that this new way of thinking in math has been a net positive for our students. They now have a much deeper understanding of the theories underlying all sorts of mathematical problems and are empowered to use their critical thinking to figure out how best to solve them. Now, ask an older person to do a simple math problem. They may be able to find the answer, but they often have no idea how or why it works.

The fact that students don’t actually have to calculate every number by hand (or in their heads) doesn’t stop them from being fantastic mathematicians and strong critical thinkers. Could a similar phenomenon happen with AI?

Earlier, I mentioned the ship’s computer in “Star Trek” as an example of the hope for AI. In that example, the computer is a vastly intelligent machine that can make any calculation in seconds, provide background information on topics and species, and offer statistics and probabilities for the ship’s crew. However, the ship is still manned by teams of professionals, because the AI is treated as a tool for information, not for decision-making. It is up to the captain (“Engage!”) and the crew to take the computer’s information, double-check it and then make what they think is the best choice. We can use this as a model for how we talk about AI with our students.

I have told my students that AI is a new technology that could be a super powerful tool for them, but that it is ultimately a really smart child. AI is easily influenced by bad information, doesn’t always consider whether what it says is right or wrong, and ultimately, is still just learning. “Would you let a small child write your report for you?” I once asked a student.

What AI can do is point our students toward information. If we show them how to dig into AI’s answers, look at where it got its information, verify it and draw their own conclusions, then they can use their own powerful computers (their brains) to decide what information is good and what is bad.

“Progress doesn’t come from early risers — progress is made by lazy men looking for easier ways to do things” — Robert Heinlein

Even if you agree that AI is a useful information system, that doesn’t solve the problem of simple laziness. Some students will admit that they know the material but are too busy or too lazy to go the extra mile and actually do the writing or build the project to show it. It is simply easier and more efficient to plug what they know into an AI program and let it put everything together in whatever form the teacher asked for.

We can help guide students away from this behavior by being clear about our intentions for the assignment. Say I want my students to write a short persuasive essay, role-playing as someone trying to convince people to buy war bonds during World War II. Beyond simply completing the work, there are specific standards and concepts I am looking for students to demonstrate. AI could be used to find primary sources to quote, summarize the effectiveness of the war bond programs and even suggest an outline for how an effective persuasive essay can be structured.

If a student uses AI for all those things and then does the writing themselves, in their own voice and words, have they met the criteria for solid and original work? How much work can we sacrifice to efficiency before it becomes cheating or unoriginal work? That is up to each teacher, and maybe even varies by assignment. There are plenty of places where I think it is perfectly reasonable to give up some authority over how much “work” the student did in favor of how much understanding of the concept they can show.

As a teacher, I will freely admit to using a multitude of quality AI programs to help speed up tedious tasks. Things like text leveling, checks for understanding on videos and even writing emails can all be made much easier and more efficient with AI. If I can use those tools and still be an effective, quality teacher, we can teach our students to use similar tools to be efficient, quality learners.

“The future is not set. There is no fate but what we make for ourselves” — John Connor

I’m not so naive as to think that some students won’t be tempted by “the dark side” of AI, no matter how we present it. But I also believe the future of AI in education is not yet decided. It will be decided by how we, as educators, embrace or demonize it in our classrooms.

My argument is that setting guidelines and talking honestly with our students about both the pitfalls and the amazing benefits AI offers us as researchers and learners will define it for the coming generations.

Can AI be the next calculator? Something that, yes, changes the way we teach and learn, but not necessarily for the worse? If we want it to be, yes.

How AI is used, and more importantly, how it is perceived by our students, can be influenced by educators. But we first have to learn how AI can be used as a force for good. If we continue to let the dominant voice be that AI is the Terminator of education and critical thinking, then that will be the fate we have made for ourselves.


