### Summary of News Report

| | |
|---|---|
| **Title** | It’s true that my fellow students are embracing AI – but this is what the critics aren’t seeing |
| **Source** | The Guardian (Opinion Piece) |
| **Author** | Elsie McDowell (Student and 2023 Hugo Young award winner) |
| **Publication Date** | June 29, 2025 |

---

### Overview

In this opinion piece, student Elsie McDowell argues that the widespread adoption of artificial intelligence (AI) tools like ChatGPT among university students is not primarily driven by laziness or a desire to cheat. Instead, she posits that it is a rational response to a deeply flawed post-Covid education system characterized by profound uncertainty, inconsistent assessment methods, and mounting financial pressures. The author contends that to understand the rise of AI, critics must look at the systemic failures that have left students feeling unsupported and "on the back foot."

### Key Findings and Arguments

The author builds her case by connecting the educational disruptions of the pandemic to the current academic and economic landscape for students.

**1. The Post-Covid Educational Context:**

* The author's generation experienced unprecedented disruption to their secondary education, with the cancellation of national exams (GCSEs and A-levels) in 2020 and 2021.
* This was followed by a "punitive crackdown on grade inflation" when in-person exams returned in 2023, leaving many students with lower-than-expected grades.
* Consequently, a large cohort of students entered higher education without the typical experience of sitting formal, high-stakes, handwritten exams.

**2. Inconsistent and Unstable University Assessments:**

* In response to the pandemic, universities shifted to online, open-book assessments. This trend has largely continued.
* **Key Statistic:** Five years after the pandemic began, **70% of universities** still utilize some form of online assessment.
* This has led to a highly variable and inconsistent system. The author notes that her own exams switched from half-online in her first year to all-handwritten in her second, with confirmation of the format arriving late in the academic year.
* This inconsistency creates an environment of uncertainty, making AI tools more appealing to students navigating a constantly changing system.

**3. The Role of AI as a Tool:**

* While acknowledging concerns about cheating, McDowell states that students often view AI as a "broadly acceptable tool in the learning process" for tasks like research assistance and structuring essays.
* The release of ChatGPT in 2022 occurred in a "university system in transition," making it a convenient solution for students dealing with academic uncertainty.

### Contributing Socio-Economic Factors

The author argues that the problem extends beyond the classroom and is exacerbated by significant financial pressures.

* **Student Employment:** A record number of students are working to support themselves.
  * **Key Statistic:** **68% of students have part-time jobs**, the highest rate in a decade.
  * This leaves students with "less time than ever to actually be students," making time-saving tools like AI more attractive.
* **Student Debt:** The financial burden on students is increasing.
  * **Key Detail:** The author's cohort is the first to face a **40-year student loan repayment period**, a significant increase from the previous 30-year term.

### Conclusion and Recommendations

The author concludes that the rise in AI use is a symptom of systemic issues within higher education, not a moral failing of the student body. The combination of academic instability and financial precarity has created a "perfect storm" for AI adoption.

**Recommendations:**

* **Consistency:** Universities must decide on a stable and consistent examination format and adhere to it.
* **Clarity on AI Use:** If universities continue with coursework or open-book exams, they must provide clear and explicit guidelines on what constitutes "proportionate" and acceptable usage of AI.

### Notable Risks and Concerns

While defending students' use of AI, the author also personally acknowledges valid concerns, including:

* The potential for abuse and overuse of large language models (LLMs) in education.
* The significant environmental cost (water and energy consumption) of powering AI datacentres.

---
It’s true that my fellow students are embracing AI – but this is what the critics aren’t seeing | Elsie McDowell
Reading about the role of artificial intelligence in higher education, one could conclude that the landscape looks bleak. Students are cheating en masse in our assessments or open-book, online exams using AI tools, all the while making ourselves stupider. The next generation of graduates, apparently, are going to complete their degrees without ever having so much as approached a critical thought.
Given that my course is examined entirely through closed-book exams, and I worry about the vast amounts of water and energy needed to power AI datacentres, I generally avoid using ChatGPT. But in my experience, students see it as a broadly acceptable tool in the learning process. Although debates about AI tend to focus on “cheating”, it is increasingly being used to assist with research, or to help structure essays.
There are valid concerns about the abuse and overuse of large language models (LLMs) in education. But if you want to understand why so many students are turning to AI, you need to understand what brought us to this point – and the educational context against which this is playing out.

In March 2020, I was about to turn 15.
When the news broke that schools would be closing as part of the Covid lockdown, I remember cheers erupting in the corridors. As I celebrated what we all thought was just two weeks off school, I could not have envisioned the disruption that would mar the next three years of my education.

That year, GCSEs and A-levels were cancelled and replaced with teacher-assessed grades, which notoriously privileged those at already well-performing private schools.
After further school closures, and a prolonged period of dithering, the then-education secretary, Gavin Williamson, cancelled them again in 2021. My A-level cohort in 2023 was the first to return to “normal” examinations – in England, at least – which resulted in a punitive crackdown on grade inflation that left many with far lower grades than expected.
At the same time, universities across the country were also grappling with how to assess students who were no longer physically on campus. The solution: open-book, online assessments for papers that were not already examined by coursework. When the students of the lockdown years graduated, the university system did not immediately return to its pre-Covid arrangements.
Five years on, 70% of universities still use some form of online assessment.

This is not because, as some will have you believe, university has become too easy. These changes are a response to the fact that the large majority of current home students did not have the typical experience of national exams.
Given the extensive periods of time we spent away from school during our GCSE and A-level years, there were inevitably parts of the curriculum that we were never able to cover. But beyond missed content, the government’s repeated backtracking and U-turning on the format of our exams from 2020 onwards bred uncertainty that continued to shape how we were assessed – even as we progressed on to higher education.
In my first year of university, half of my exams were online. This year, they all returned to handwritten, closed-book assessments. In both cases, I did not get confirmation about the format of my exams until well into the academic year. And, in one instance, third-year students sitting the exact same paper as me were examined online and in a longer timeframe, to recognise that they had not sat a handwritten exam at any point during their degree.
And so when ChatGPT was released in 2022, it landed in a university system in transition, characterised by yet more uncertainty. University exams had already become inconsistent and widely variable, between universities and within faculties themselves – only serving to increase the allure of AI for students who felt on the back foot, and make it harder to detect and monitor its use.
Even if it were not for our botched exams, being a student is more expensive than ever: 68% of students have part-time jobs, the highest rate in a decade. The student loan system, too, leaves those from the poorest backgrounds with the largest amounts of debt. I am already part of the first year to have to pay back our loans over 40, rather than 30, years.
And that is before tuition fees rise again.

Students have less time than ever to actually be students. AI is a time-saving tool; if students don’t have the time or resources to fully engage with their studies, it is because something has gone badly wrong with the university system itself.

The use of AI is mushrooming because it’s convenient and fast, yes, but also because of the uncertainty that prevails around post-Covid exams, as well as the increasing financial precarity of students.
Universities need to pick an exam format and stick to it. If this involves coursework or open-book exams, there needs to be clarity about what “proportionate” usage of AI looks like. For better or for worse, AI is here to stay. Not because students are lazy, but because what it means to be a student is changing just as rapidly as technology.
Elsie McDowell is a student. She was the 2023 winner of the Hugo Young award, 16-18 age category.




