Student perceptions of AI 2025
A report on how students view artificial intelligence, how they are using it, and their concerns and hopes, based on conversations with students held between June 2024 and May 2025.
Introduction
Over the last two years we have spoken to groups of students to build a broad understanding of how they view artificial intelligence (AI), how they are using it, and what their concerns and hopes are. We published two reports summarising our findings, in 2023 and 2024. This report updates those findings, based on conversations with students held between June 2024 and May 2025.
This year we took a slightly different approach to our insight gathering, combining our own student discussion groups with surveys carried out by a number of universities and colleges. We held discussion groups with a diverse cohort of 173 students from further and higher education – both undergraduate and postgraduate – where we were able to discuss issues in detail. Additionally, we analysed data from seven surveys, which collectively gathered 1,274 student responses across the same range of educational levels.
As AI becomes more embedded in daily life, students and learners are having to navigate both the opportunities and the challenges it brings. Many students are using AI tools to support their studies. However, some have found that heavy reliance on AI for academic tasks can lead to a perceived (or actual) gradual decline in the quality of their work. Alongside these practical experiences, there is growing anxiety among students about the rapid pace of AI development and their ability to keep up.
A major concern is around misinformation, with many students and learners admitting they do not fully understand how deepfakes are created or how to spot them, adding to their fears about trust and accuracy in a digital world. They are also worried about data privacy, particularly about the possibility that AI could be used to predict and influence their behaviours over time.
However, the most significant concern for most students and learners is the impact AI will have on their future employability. They are deeply aware that automation and changing job markets will demand new skills, yet for some, especially students with disabilities or limited resources, the pressure to continually upskill feels overwhelming.
This report explores these concerns in detail, highlighting both the opportunities students and learners see in AI and the support they believe is needed from their institutions.
Key messages
Students are embracing AI, but they want clear, fair, and practical support to use it responsibly. Our 2025 student perceptions research highlights the following key messages:
- AI is part of everyday life: students and learners are using tools like ChatGPT, Microsoft Copilot, Google Gemini, and Snapchat My AI daily for study, life organisation, job preparation, and personal support
- Academic use is evolving: AI is helping students with writing, research, notetaking, revision, and presentation skills, coaching them through their studies
- Clear guidance is urgently needed: students are calling for consistent, course-specific policies on AI use, combined with practical, ethical, and course-specific education, support and guidance
- Employability is the biggest concern: students worry that AI will impact entry-level roles and devalue the skills they are developing. Many feel unprepared for the rapid changes in the job market
- Equity matters: access to AI tools is uneven. Students want institutions to ensure fair access and support, recognising differences in background, ability, and resources
- Ethical concerns are growing: students worry about misinformation, deepfakes, bias in AI outputs, and the risks to intellectual property and personal data security
- Human connection still matters: students see AI as a tool, not a replacement. They value personalised teacher feedback, discussion, and collaboration alongside AI, and want education that blends human and AI strengths
- Skills loss is a real fear: overreliance on AI could reduce critical thinking, creativity, and communication skills and impact future workplace success
- Students want to be partners, not passengers: they don’t just want to be taught about AI, they want to help shape how AI is integrated into education and how it prepares them for the future
Emerging issues
Many of the key issues are similar to those found in the previous two years. A small number of new issues are starting to emerge or grow:
- Some students noted that over-reliance on AI was starting to have a negative effect on the quality of their work, causing them to re-evaluate its use
- Many students are starting to report a feeling of anxiety about the speed of AI developments
- Students are starting to be concerned that their data could be used to predict their behaviours on a larger scale in future
- As deepfakes become ever more realistic, student concern about their impact on the world is growing
- Some students are starting to report that they are using AI applications for relationship and mental health advice and support
An overview of student use of AI
Students and learners are using AI in increasingly sophisticated ways. They are using these applications to support and coach them through their study and their daily lives.
Academic use
- Writing: AI applications including Grammarly, ChatGPT, Copilot and Editor are regularly being used to improve the clarity and professionalism of written communications, such as emails
- Non-English-speaking students use tools such as DeepL and Google Translate for translating texts and helping with understanding
- Students/learners continue to use AI applications such as ChatGPT, Microsoft Copilot, Claude, Google Gemini and Snapchat My AI for help understanding what is required and for suggested approaches. They also use them for writing essays, creating outlines, and overcoming mental blocks, and for structuring arguments, generating statements, proofreading and providing content ideas
- Some students use ChatGPT’s voice input feature to speak their thoughts and ideas aloud, then have it structure them into their desired format
- Many students mentioned using AI to interpret the marking rubric to inform their planning. They are also using AI to get feedback on their work prior to submitting. Many repeat the cycle a few times to improve their submission and ensure they have included all required elements
- Accessibility: many disabled students, including those with ADHD, dyslexia, and other forms of neurodivergence, find tools like ChatGPT, Microsoft Copilot, Goblin.Tools, Grammarly, and Quillbot helpful for explaining concepts and confirming understanding. Microsoft Copilot was also mentioned as helping neurodivergent students build flexibility and fluency by allowing them to practise and rehearse answers to potential questions. This helped them express their thoughts more clearly and approach Q&As with greater confidence. Google Translate was also reported as helpful for dyslexia: its text-to-speech allows students to see and hear words together, improving understanding
- Collaborative working: some students mentioned using AI applications during group work. In some cases, using AI as a team member was part of the assignment, and this was seen as positive. It was mainly used as a critical friend, giving the team feedback and identifying gaps and additional activities, which they could then assess for usefulness
- Note-taking: students routinely use AI applications to enhance notetaking during lectures or from lecture recordings or texts. ChatGPT, Microsoft Copilot and Notebook LM are being used to identify the key points and create concise and organised notes. Many students said this helped them gain a better understanding of the topic
- Research assistance: Scholarcy and Elicit are used to summarise research papers and academic articles, making it easier for students to digest large volumes of information by providing more manageable summaries and often cutting out that overwhelming stage of early research. Zotero is also being used to extract data from research papers efficiently
- PhD students are using AI applications to act as a digital colleague to brainstorm ideas with. They also use them to draft, edit, and refine large bodies of work such as doctoral theses, making the difficult task of managing extensive research projects more manageable
- Revision: students continue to use Copilot and ChatGPT to create flashcards, practice papers based on past papers, and practice quizzes that both ask questions and provide feedback to support revision. They also report using AI to fill knowledge gaps and improve their critical thinking skills
- Speaking: many students said Presenter Coach helped them rehearse presentations, giving them feedback on filler words, tone and clarity. They combined this with ChatGPT or Copilot to make their content coherent and engaging. Using these tools boosted their confidence
Subject specific
- Content creation: students in creative areas, such as digital arts, marketing, and design, are using AI applications to generate design ideas, edit images, and create visual content. Applications like Midjourney, Adobe and Photoleap support students to experiment with different design concepts quickly and aid their creativity
- Coding: computing students continue to use GitHub Copilot and Google Colab to debug code, generate programming scripts, and optimise algorithms. They are using these applications to complete coding assignments based on their input, significantly speeding up the process
- Data analysis: students are using AI to perform initial data analysis to understand patterns, detect outliers, and prepare data for deeper examination. They are using SPSS or R for statistical analysis and modelling, to summarise data trends, generate visual representations of data and build predictive models
- Games design: students are using Character AI to create character bios for their games design. They input basic details about their character such as name, background, personality traits and appearance and the AI will then generate detailed bios including motivations and backstory, making their characters more rounded and believable. They can also check and refine their characters, ensuring they are consistent with their role in the game
- Maths: ChatGPT and Microsoft Copilot continue to be used to improve understanding of how to approach and solve exercises, helping students to understand the more complex aspects
- Daily planning: students are using AI applications such as Motion Schedule and Notion to manage their time effectively, prioritise tasks, and organise study and personal activities. This includes suggesting the best times for studying and breaks, and continually adapting to their needs
- General queries: many students remarked that it is often easier and quicker to ask AI than to interact with teaching staff
Levels of use
Some students are not using AI at all. This was due to a few factors, including a fear that all AI use is classed as cheating, a lack of clarity in the guidance, a perception that it is not yet good enough to rely on, a lack of awareness of AI in general, and a lack of the knowledge and skills to use it effectively for study. They expressed anxiety about doing the wrong thing and felt that, without clearer reassurance or basic training, it was safer to avoid AI entirely.
Relying on AI for academic tasks has shown diminishing returns for some students. Those who had initially turned to AI tools as their primary resource for completing assignments reported a gradual decline in the quality of their work and subsequently lower grades. As a result, these students are now re-evaluating how they integrate AI and establishing new study habits.
Employment
Students are using AI tools such as ChatGPT, Microsoft Copilot and Gemini to help them with job application letters and restructuring their CVs. They are also using these applications to help them with interview practice, such as getting AI to generate suggested role specific questions they can practice and getting AI feedback on their answers.
Some students worry that those with fewer skills might get ahead simply by using AI effectively in their applications. They fear it will become harder to stand out and secure interviews, especially where recruiters cannot easily distinguish between authentic and AI-generated content.
Personal support
Some students report that they are using AI applications for relationship and mental health advice and support. In this report we focus on sharing what students are telling us rather than commenting on this use; however, it stands out as a particular area that warrants further investigation.
Overall
Students feel that AI can make them more efficient in their studies and that colleges and universities should move away from restriction and focus on how to adapt and embrace AI. Students/learners believe they should be involved in shaping AI within their organisations and are willing to take part in discussions.
Student concerns
Students and learners have concerns about the use of AI too and want their institutions to support them. In addition to the specific concerns below, many students reported feelings of anxiety about the speed of AI developments and how they could keep up to date.
Academic integrity
Students/learners are increasingly concerned about how AI could affect academic integrity. They worry that AI tools might provide an unfair advantage, especially to those who can afford premium versions, potentially facilitating more sophisticated cheating. This could make it harder to distinguish between student-generated work and AI-generated content. They are particularly cautious about how any use of AI might be viewed, worrying that it could affect perceptions of originality and intellectual effort.
Although Jisc's latest survey found that 86% of universities and 49% of colleges have student guidance in place, students and learners still report that they are unclear about what use of AI is and is not allowed.
They need to understand what counts as appropriate use versus cheating, for their specific context, and where the line lies between support and overuse. Students desire clear and concise policies and guidance that ensure the fair and ethical use of AI, along with sufficient support and training to use these tools effectively and responsibly.
Misinformation and deepfakes
Students are increasingly concerned about the challenges posed by AI generated misinformation and deepfakes. Many students admit to not understanding how deepfakes are created or how to identify them, heightening fears about their potential misuse in areas like current affairs. They worry that AI could be used to distort facts to fit specific political agendas, with snippets of media being manipulated to misrepresent the truth. They are also concerned about the potential for AI to spread false information.
Some students voiced concerns about losing a clear sense of reality in digital spaces, especially on social media where AI-generated content is often undetectable. They recognised that developing stronger information literacy skills is essential to help them distinguish between real and manipulated content, particularly as deepfakes become more sophisticated.
Equity
Most learners/students are using free versions of AI applications; however, a growing proportion are paying for access to premium versions. This is exacerbating concerns that the costs associated with advanced AI tools lead to digital inequity. Students with financial constraints can only access basic versions, whilst those who can afford premium services gain a competitive edge. This creates a divide, affecting academic performance and future opportunities.
Students were also concerned that uneven and inconsistent AI use across different classes or modules is unfairly disadvantaging some students. They would like a clear and consistent approach to AI.
Privacy and data usage
Students and learners expressed unease about how their personal data is used and stored by AI applications, and were generally sceptical of the transparency and security practices of AI tool providers.
Intellectual property rights (IPR) and data security
Creative students and researchers were especially worried about two things: first, that their work might be stored by large language models and used in ways they did not intend; and second, that these applications might use their contributions to respond to other users. Other students were concerned that their data could be used to predict their behaviours on a larger scale in future.
Bias
PhD students raised concerns about the potential for AI applications to embed biases into their responses, which could skew research and reinforce existing prejudices and stereotypes.
Loss of innovation
Some students expressed concern that AI trained on historical content could reinforce conventional thinking and limit creativity. They worry that increased use of AI, by students and staff, could discourage more original, disruptive, or unconventional approaches to learning and problem-solving, particularly in fields that value innovation and critical thinking. Balancing AI as a support tool with the need to nurture disruptive and creative thinking is crucial.
Skills loss
Some students expressed concern about losing important skills through their reliance on AI applications in their studies and daily lives. There is a fear that constant use of AI for tasks could lead to a homogenisation of thought, reduced individuality and a loss of creativity. While many appreciated AI for helping them work faster and more efficiently, they were also concerned it would reduce their ability to think critically and generate original ideas.
There was particular concern about the impact on their communication skills. They worry that if they passively accept AI suggestions without reviewing how words, phrasing and tone influence meaning, they risk being unable to engage effectively in social and professional interactions.
They worry this reliance could also hinder key interpersonal skills, like active listening, persuasive speaking, and responding to others’ needs, leaving them unprepared for future workplace challenges.
Student needs
Students and learners want structured, equitable, and practical support to help them use AI effectively and responsibly.
They want clear and consistent guidance and policies at both institutional and course level. Many students expressed frustration that different departments or tutors interpret AI use differently. They want workshops or integrated sessions, tailored for their courses, that show them how to use AI tools effectively.
They understand that using AI well is a skill, and they want to learn how to write effective prompts, verify outputs, and avoid misinformation.
Across groups, students and learners highlighted a desire for more project-based, practical learning that reflects real-life scenarios. They called for training that connects directly to everyday tasks and future employment needs.
Neurodivergent, disabled, and international students highlighted the need for support that recognises their particular challenges, whether through inclusive AI tools or alternative approaches.
Many are unsure about the data they’re giving to AI platforms. They want clear advice about what’s safe to share, how their data might be used, and how to protect their intellectual property.
Employability
Students and learners are concerned about the impact of AI on their future employment opportunities and are thinking seriously about what it means for their careers. Many are hopeful about the opportunities AI can bring, but they are also worried about the kinds of jobs they will be stepping into after education. This is probably the biggest concern for most students and learners.
They fear that as AI tools become more powerful, entry-level jobs will disappear. They are also extremely concerned that AI’s ability to perform complex analytical tasks will undermine the value of the skills they are developing.
Students also feel the pressure to keep up. Many believe it’s now up to them to learn new AI skills if they want to stay competitive, but they’re not always sure how. Some said their courses haven’t adapted quickly enough, and that college and university departments aren’t keeping pace with how fast AI is changing the job market.
For some, especially those with disabilities or limited resources, the idea of constantly having to “upskill” feels overwhelming.
Despite these concerns, students also see a lot of potential in AI. They hope it can help them work more efficiently, save time on routine tasks, and focus more on the creative, complex and more interesting parts of their jobs.
What students really want is support. They don't just want access to AI tools; they want to understand how to use them properly, ethically and confidently in real-world jobs.
Above all, students want to be prepared, not replaced. They know that human skills like creativity, emotional intelligence, and problem-solving will still matter. They want an education that helps them develop those strengths alongside AI.
Conclusion
AI has already become an intrinsic part of everyday life for many students and learners. They use ChatGPT, Snapchat My AI, Microsoft Copilot, and Google Gemini daily as their first port of call for accessing news, searching, prioritising daily tasks, planning activities, enhancing their social media, suggesting vegan versions of recipes, interpreting messages from friends and teachers, checking weather forecasts, and supporting their study.
Students/learners stated that AI applications are tools, just like Word and Excel. They believe the focus should be on teaching them how to use them responsibly and effectively for their study. For them AI isn’t a future concern, it is here now and here to stay.
Their message to institutions is clear. Students and learners don’t want AI to be treated as something to be feared or avoided. Instead, they want practical support to help them use it well. That means clear policies, consistent guidance, and training in how to use AI ethically, effectively, and responsibly. They want to build the skills that will help them stay current in a fast-changing world, skills like critical thinking, prompt writing, fact-checking, and understanding where and when AI is appropriate.
They are also asking for fairness. Students are aware of inequalities in access to AI tools and want institutions to level the playing field, ensuring no one is left behind due to cost, ability, or lack of exposure. They want the freedom to explore AI’s potential in creative, diverse, and inclusive ways, ethically and responsibly.
A strong theme was the value of humans combined with AI. Students emphasised that while AI can speed up planning and execution, it cannot replace discussion, personalised feedback, or the nuance of tutor engagement. They see the future not as AI versus human skills, but as the two working in balance.
Finally, and perhaps most importantly, students want to be part of the conversation. They don't just want to be told how AI will be used; they want to help shape those decisions. Many said they would welcome the chance to work with colleges and universities to contribute to policy or co-design modules that include AI.
Recommendations
Clear guidance: develop and communicate clear policies on AI use, defining appropriate use and academic misconduct.
Data privacy and intellectual property: provide students and learners with clear guidance on what data is safe to share and how to protect their intellectual property.
Embed AI literacy into the curriculum: teach students and learners how to use AI tools practically, critically, ethically, and effectively through course-specific training.
Employability support: offer careers guidance, upskilling opportunities, and practical training to prepare students and learners to succeed in an evolving AI-enabled job market.
Equity: provide institutional access to core AI tools where possible and avoid widening the digital divide through reliance on paid applications.
Skills development: support students to develop critical thinking, creativity and communication skills alongside their use of AI tools.
Support human–AI collaboration: ensure that the use of AI complements rather than replaces human interaction.
With thanks to
Student discussion forums
Blackburn College of Further Education
Manchester Metropolitan University
South Devon College of Further Education
University of Bedfordshire
University of Lancaster
Member surveys
AI learner survey by Gloucestershire College (not published)
Student and Staff Perceptions of Generative AI in Chemistry Departments by Dr Stephen Potts, Department of Chemistry, University College London and Dr Gan Shermer and Dr Robin Groleau, Department of Chemistry, University of Bath (not yet published)
Glossary of AI tools
Adobe: Creative design software enhanced with AI tools for image editing, concept creation, and digital arts
Character AI: AI chatbot for creating detailed character bios and narratives for games design and storytelling
ChatGPT: AI chatbot used for writing support, brainstorming, summarising, study support, job applications, and general advice
Claude: advanced chatbot used for generating writing ideas, structuring essays, and providing wellbeing advice
Copilot: AI assistant integrated into Microsoft apps, used for writing, revising, note-taking, planning, and job support
Elicit: research assistant that helps summarise academic papers and find key points to support research work
Gemini: conversational AI tool used for study help, job preparation, wellbeing support, and answering everyday questions
GitHub Copilot: AI coding tool that helps students write, debug, and optimise code quickly and efficiently
Google Colab: cloud platform for collaborative Python programming and data science projects
Google Translate: language translation tool that supports reading, writing, and understanding in multiple languages
Grammarly: grammar and tone checker that helps improve the clarity, correctness, and professionalism of writing
Hemingway Editor: writing editor focused on improving readability, grammar, and style for essays and communications
Midjourney: AI image generator used to create visual concepts and artwork for creative and digital projects
Motion Schedule (motion app): AI-based app for managing daily tasks, prioritising activities, and optimising study and break times
My AI (Snapchat): built-in AI chatbot on Snapchat for casual queries, basic advice, and quick study support
Notebook LM: AI-supported note-taking app that summarises lectures, readings, and key learning points
Notion: productivity tool with built-in AI for organising tasks, projects, coursework, and personal schedules
Photoleap: AI photo editing app used for creating, editing, and experimenting with visual designs
Presenter Coach (MS PowerPoint): practice tool that gives real-time feedback on presentation style, pacing, clarity, and filler words
R: statistical programming language used for data analysis, modelling, and producing visualisations
Scholarcy: summarisation tool that turns long academic papers into concise summaries for easier study
SPSS: statistical analysis software for managing data, detecting trends, and building predictive models
Zotero: reference management tool that helps with organising research sources and automating citations
About the author

I co-lead our artificial intelligence activity. Our focus is on supporting our members to responsibly adopt AI. We provide a wide range of thought leadership, practical advice, guidance, and training alongside piloting relevant AI products.