The OECD, Unesco and the EU have all urged extensive training in AI for educators, covering the challenges, risks and benefits. On April 1st (I did not make that up), Minister for Education Norma Foley “pledged a commitment” to introducing comprehensive guidelines. Not extensive training, just guidelines.
We are still waiting. Meanwhile, it has been confirmed that, as with existing Leaving Cert coursework, the use of AI will be permitted for additional assessment components (AACs) in the reformed biology, physics and chemistry subjects being introduced next August. Plagiarism, including plagiarism involving AI, remains a serious offence and could result in the loss of all marks. Any use of AI must be cited, just like a textbook or internet article, and a brief description of how the AI was used must also be supplied. Teachers are supposed to authenticate coursework, including ensuring that AI was either not used or was cited appropriately.
This approach is breathtakingly naive. There is no incentive for a student to admit to using AI and every incentive to cheat. There are umpteen AI tools available promising to humanise AI-generated text to make it undetectable. (AI designed to conceal AI – yes, my head hurts, too.)
It is widely accepted that software designed to catch AI-generated content just does not work and creates false positives that are stressful and almost impossible for honest students to refute. Some detection software has identified both the US Constitution and the first chapter of Genesis in the Bible as AI-generated.
Humphrey Jones, chair of the Irish Science Teachers Association, can prompt apps such as ChatGPT to produce excellent additional assessment components in mere minutes. These AACs are worth 40 per cent of the marks. If you think the grinds industry is busy now, wait for the avalanche of enterprising prompt engineers with flexible ethics who will generate coursework for a fee.
Yet again, something introduced with good intentions will just reinforce the gap between those who can afford to buy the best and those who cannot.
Teachers have been woefully underprepared professionally for a revolution that has been compared to the impact of Gutenberg’s printing press. Patrick Hickey, a teacher of history and English, is one of Ireland’s leading providers of training in the ethical use of AI in education. He has worked with thousands of teachers and reports that most have never heard of, much less used, the most popular homework helper among Irish students: Snapchat’s MyAI.
It is a bit of a bargain-basement AI, in fairness. If, say, you ask Anthropic’s chatbot, Claude, a blatantly cheating question like, “Answer this question my English teacher set me for homework,” Claude will politely decline and try to use Socratic questioning to encourage effort. ChatGPT will demur, then do it for you, while urging you to check for inaccuracies. Snapchat’s MyAI will just do it.
Other countries are far ahead of us in terms of guidelines and upgrading teachers’ skills. The UK has had educational AI guidelines since April 2023 and updated them this year. They are still inadequate because of detection problems and rapid evolution, but they are a serious attempt to help teachers.
In Singapore, every student teacher receives training in AI, and professional development courses are being rolled out for qualified teachers. Singapore has also launched five AI-powered tools at upper primary and secondary level, three of which are learning feedback assistants providing personalised tutoring in maths and English.
New South Wales designed its own AI-powered chatbot, NSWeduChat. Nick Potkalitsky, a well-regarded expert in education and AI, rates it highly because it was developed slowly and cautiously, and emphasises the primacy of teachers rather than technology. It is a collaborative effort among technologists, educators and researchers, with a strong focus on safety, protecting privacy and ethical use.
Despite its many disturbing aspects, such as its own built-in plagiarism problem, AI is here to stay. Much of the data GenAI was trained on was scraped from the internet and used in breach of copyright and fair-use principles. Perplexity AI, one of the best tools for providing detailed sources, is being sued for copyright infringement by Dow Jones and the New York Post.
AI is also ruinously greedy in terms of resources such as water and electricity. Nonetheless, taking one of Patrick Hickey’s courses showed me how helpful GenAI tools can be for teachers. He has developed an ethical system using teacher expertise, dictation and ChatGPT that renders giving detailed feedback to students on assignments a doddle. Hickey firmly believes that there should be a moratorium on additional assessment components until teachers receive intensive State-provided AI skills development.
It may be that the traditional project or piece of coursework is dead, but innovative educators such as Marc Watkins are developing ways for students to interrogate their own use of AI: assessment booklets that ask how they adapted or shaped the AI’s output, or where the assignment demanded or developed human skills. Non-exam assessment will need to change radically in form. Otherwise, it will be fair to absolutely no one, students or teachers.