Students across Australia have started the new school year using pencils, pens and keyboards to learn how to write. In workplaces, machines are also learning to write, so effectively that within a few years they may write better than humans. Sometimes they already do, as apps like Grammarly demonstrate. Certainly, much everyday writing humans now do may soon be done by machines with artificial intelligence (AI).
The predictive text commonly used by phone and email software is a form of AI writing that countless humans use every day. According to industry research organisation Gartner, AI and related technology will automate the production of 30% of all content found on the internet by 2022. Some prose, poetry, reports, newsletters, opinion articles, reviews, slogans and scripts are already being written by artificial intelligence. Literacy increasingly includes interacting with, and critically evaluating, AI.
This means our children should no longer be taught just formulaic writing. Instead, writing education should encompass skills that go beyond the capacities of artificial intelligence.
Back to basics, or further away from them?
After 2019 PISA (Programme for International Student Assessment) results showed Australian students sliding backwards in numeracy and literacy, then education minister Dan Tehan called for schools to go back to basics. But computers already have the basics mastered. Three major reports (from the NSW Teachers' Federation, the NSW Education Standards Authority, and the NSW, Queensland, Victorian and ACT governments) have criticised school writing for becoming formulaic to serve NAPLAN (the National Assessment Program – Literacy and Numeracy).
In some schools, students write essays in which sentences fulfil specified functions, in a specified order, within a specified number and arrangement of paragraphs. Such essays can then be marked by computers to demonstrate progress. This template writing is exactly the kind of standardised practice robot writers can do.
Are you scared yet, human?
In 2019, the New Yorker magazine ran an experiment to see whether AI research company OpenAI's natural language generator, GPT-2, could write an entire article in the magazine's distinctive style. The attempt had limited success, with the generator making many errors. But by 2020, GPT-3, a new version of the model trained on even more data, had written an article for The Guardian newspaper under the headline "A robot wrote this entire article. Are you scared yet, human?" This latest, much-improved generator has implications for the future of journalism, as the Elon Musk-funded OpenAI invests ever more in research and development.
Robots have voice but no soul
Back at school, teachers experience pressure to teach writing for student success in narrowly defined writing tests. Instead, the prospect of human obsolescence, or "technological unemployment", should drive urgent curriculum development based on what humans are learning AI cannot do, especially in relation to creativity and compassion.
AI writing is said to have voice but no soul. Human writers, as the New Yorker’s John Seabrook says, give “colour, personality and emotion to writing by bending the rules”. Students, therefore, need to learn the rules and be encouraged to break them. Creativity and co-creativity (with machines) should be fostered. Machines are trained on a finite amount of data, to predict and replicate, not to innovate in meaningful and deliberate ways.
How to write with purpose
AI cannot yet plan and does not have a purpose of its own. Students need to hone skills in purposeful writing that achieves their communication goals. Unfortunately, the NAPLAN regime has hampered the teaching of writing as a process of planning and editing, because it favours time-limited, exam-style writing for no real audience. Students need to practise writing they are invested in, that they care about, and that they hope will effect change in the world as well as in genuine, known readers. This is what machines cannot do.
AI is not yet as complex as the human brain. Humans detect humour and satire. They know words can have multiple and subtle meanings. Humans are capable of perception and insight; they can make advanced evaluative judgements about good and bad writing. There are calls for humans to become experts in sophisticated forms of writing, and in editing writing created by robots, as vital skills for the future.
Robots have no morality
Nor does AI have a moral compass. It does not care. OpenAI's managers originally refused to release GPT-2, ostensibly because they were concerned the generator would be used to create fake material, such as product reviews or election-related commentary. AI writing bots have no conscience, and may need to be shut down by humans, as happened with Tay, Microsoft's Twitter chatbot that began producing racist posts. Critical, compassionate and nuanced assessment of what AI produces, management and monitoring of content, and decision-making and empathy with readers are all part of the "writing" roles of a democratic future.
Skills for the future
As early as 2011, the Institute for the Future identified social intelligence (“the ability to connect to others in a deep and direct way”), novel and adaptive thinking, cross-cultural competency, transdisciplinarity, virtual collaboration and a design mindset as essential skills for the future workforce. In 2017, a report by The Foundation for Young Australians found complex problem-solving skills, judgement, creativity and social intelligence would be vital for students’ futures.
This is in stark contrast to parroting irrelevant grammar terms such as “subordinate clauses” and “nominalisations”, being able to spell “quixotic” and “acaulescent” (words my daughter learnt by rote in primary school recently) or writing to a formula. Teaching and assessment of writing need to catch up to the real world.
By Lucinda McKnight, Senior Lecturer in Pedagogy and Curriculum, Deakin University
This article is republished from The Conversation under a Creative Commons license. Read the original article.