The range of AI tools now available to teachers is growing exponentially. Even for someone who has been actively following, experimenting with and leading AI professional learning and development workshops for teachers, the rate of growth is mind-boggling and hard to keep up with.
There are a few red flags at the moment, and some key considerations need to be at the forefront. Just as the SAMR model helps us be more purposeful and innovative with digital tools, all that is new and shiny is not necessarily useful or necessary. There are also ethical and pedagogical considerations to keep in mind. And it might just be a good time to stand back and take some time to absorb, test and refine, so that we can use AI with more pedagogical purpose.
Google’s short course on AI language models encourages some good habits. It reminds us to verify the accuracy of any AI-generated content and to honour full disclosure of AI-generated content. All outputs should be personalised, verified and edited to meet the purpose of the task. It is also important to know the school’s policies on AI use and to consider the implications for the end user if the origins of any content are revealed as being AI-generated. Obviously (I hope) there are privacy considerations, as well as security concerns around what is used as an input and to what end. The ethical considerations of ‘adding to the confetti’ of knowledge that forms the pool AI-generated content draws from need to be at the front of our minds. Interestingly, in order to submit articles for a leading educational website, I learned that all authors must disclose if AI was used in any part of the writing process, from ideation through to drafting and editing. Perhaps this will become the new industry standard?
An additional, less talked about consideration is the environmental impact. The large machines that we are creating such demand for do not operate without land, water, energy or human costs. And how convenient is it for us to gloss over that potential cost? So if we are using AI, the best question to ask is: why?
Use AI thoughtfully
We need to consider if we are using AI for good. Do we have the necessary knowledge to judge whether the output is correct, useful or revelatory? Can we judge whether or not a particular output might have negative unintended consequences? Are there biases that are being accidentally reinforced? Are we leaning into an average that will quickly become the status quo? Some recent work with AI generated summaries for Aotearoa NZ Histories curriculum content highlighted some alarming biases.
As future-focused teachers and leaders, we are tasked with designing learning experiences that are engaging, universally designed and creative. We still want our learners to be critical thinkers, problem-solvers and collaborators, don’t we? So if we use AI to design worksheets, ‘cloze’ activities or lesson plans without additional pedagogical considerations, how can we ensure that we are utilising AI to make our practice better? Easier, maybe. Faster, sure. But better? Making better learning experiences requires more critical and creative thinking in our prompt engineering, as well as an awareness of what good pedagogy looks like when it is written into a prompt.
PARTS Prompt Engineering
When writing a prompt for generative AI we can use Google's PARTS acronym: Persona, Aim, Recipients, Theme and Structure.
P: As a future-focused secondary teacher of Social Studies I want to
A: design an activity that encourages critical problem solving about the ethics and resource use of AI as a research tool for
R: year 10 Social Studies Students - some with dyslexia, some with autism
T: so that they can practice critical literacy skills with experiential learning in groups of three
S: with a Bingo choice board of simple single-sentence creative prompts, with points allocated to indicate levels of difficulty
(Just a quick example).
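For anyone who likes to tinker, the PARTS structure above can be sketched as a tiny template script. This is a minimal illustration only; the function name and keyword fields are my own, not part of any Google tool, and the output is just a plain prompt string you could paste into any generative AI chat.

```python
def build_parts_prompt(persona, aim, recipients, theme, structure):
    """Join the five PARTS components into a single prompt string.

    Each argument is one sentence fragment; the components are written
    so that they read as a single continuous instruction when joined.
    """
    return " ".join([persona, aim, recipients, theme, structure])


# The example prompt from above, broken into its PARTS components:
prompt = build_parts_prompt(
    persona="As a future-focused secondary teacher of Social Studies I want to",
    aim=("design an activity that encourages critical problem solving about "
         "the ethics and resource use of AI as a research tool for"),
    recipients="year 10 Social Studies students - some with dyslexia, some with autism -",
    theme=("so that they can practise critical literacy skills with "
           "experiential learning in groups of three,"),
    structure=("with a Bingo choice board of simple single-sentence creative "
               "prompts, with points allocated to indicate levels of difficulty."),
)
print(prompt)
```

Keeping the five components as separate named fields makes it easy to swap one out (a different persona, a different structure) while leaving the rest of the prompt intact.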
When we actively apply a pedagogical lens (constructivism/experiential learning/UDL/differentiation/learner agency/gamification) to the prompt engineering, it can be so much more useful.
As teachers feeling lost in the rapid expansion of AI teaching assistants, it is so important not to fall for the ‘easy worksheet’ outputs that are so swiftly generated. There are so many other considerations that are being overlooked.
Just as we needed the SAMR model to evaluate the usefulness of digital tools, using AI in schools does not automatically equate to educational value. We need to carefully consider how AI applications can enhance learning, promote critical thinking and foster creativity among our students, while also keeping ethical considerations, transparency and security in mind.
We teachers and leaders of the 21st century have to extend our knowledge of responsible use of AI to ask questions about ethics, pedagogy, usefulness and need.
Just because we can, doesn’t mean that we should. If we are employing AI to create worksheets (as many teachers are currently excited to do) we are taking a step backwards towards rote learning and regurgitative lessons that will not support our students to be future-ready.
Why not use AI to teach about AI?
Learn about prompt engineering and how to integrate purposeful pedagogy into prompts.
Automate administrative tasks to free up more hands-on time in the classroom.
Teach AI Literacy as a part of Digital Fluency.
Deliberately design human-centred and student-centred tasks that can’t be completed by AI.
Use scenario planning to future-proof teaching and learning.
Allow time to play and explore, to test the limitations and possibilities of new tools.
Integrating AI in education demands us to be conscientious creators who are not afraid of critical evaluation and creative application. We need to leverage AI thoughtfully and purposefully. Using it as it is or how it is packaged for us (worksheet generators!) is not the way forward. If we are critical thinkers and discerning users who demand better and insist on applying purposeful pedagogy, we can empower our students to thrive in this rapidly evolving digital landscape. We can retain integrity and efficacy as educators and we can make learning better.
And because you are wondering, here are the AI tools on my current top-five playlist (these were in my most recent AI for educators workshop):
Perplexity.ai - a great research tool to use with students. It provides citations, footnotes and links, as well as follow-up questions to promote deeper research and inquiry practices.
Brisk - a fantastic Chrome plugin. I like the ‘inspect writing’ tool, which replays the cut-and-paste and real-time editing history of student work to scan for plagiarism, or plays back my own writing process so students can see the lifecycle of how I might write a poem or short story.
Diffit - a useful differentiation tool that makes it straightforward to add scaffolds for your learners.
Magic Padlet - a lesson/unit creation tool that is surprisingly quick at generating an overview you can edit. It is found within Padlet, and the free version allows three live padlets.
Almanack - this is a bit like MagicSchool, but it has a few more gamified options, including a three-level Jeopardy question designer that appeals to my love of gamification. I fed it a glossary and it created a levelled Jeopardy game for me within seconds. It still needed editing, but it was a fun way to explore ‘levels’ of jargon associated with a unit. My next step is to get students to design their own three-level Jeopardy game using class terminology - as an output tool it was ‘ok’, but as a prompt for students to do better it was a winner.
Thank you for reading! If you want to book me for a session to explore prompt engineering, pedagogy, ethics and play with AI, I’m only an email or phone call away!
And as a last wondering - beyond ChatGPT, which AI tools are on your (current and likely to change at any time) tick list?
And for more reading, the OECD offers useful food for thought.