National Research Week 2026 · Roundtable Discussion · University of Mauritius
From Passive Consumers to Critical Thinkers:
Navigating AI in Higher Education
These reflections on AI and student learning are drawn from a roundtable discussion held during National Research Week 2026 at the University of Mauritius. As AI becomes more deeply integrated into education, it poses new challenges for critical thinking and fundamentally shifts the skills students must develop. The following insights explore the urgent need to redefine pedagogy so that universities cultivate active, critical thinkers rather than passive learners who rely on AI as a substitute for foundational learning.
The Crisis of Passive Learning: Challenging the AI Status Quo
The integration of AI is reshaping student learning and critical thinking, presenting new educational challenges while shifting the types of skills students need to develop.
Negative Impacts on Cognitive Skills
Surveys of professors and educational reports highlight a troubling decline in students' literacy and numeracy skills as a result of using generative AI tools:
- Reduced cognitive effort: Students are demonstrating lower brain activity, as well as diminished deep and critical thinking.
- Skill degradation: Educators report lower levels of creativity, memory retention, problem-solving, and writing ability, alongside shortened attention spans.
- Outsourcing thought: Students are increasingly dependent on generative tools, essentially "outsourcing" their thinking to AI, raising concerns about creating a "generation of fools". One speaker notes a distinct mindset shift where students now expect platforms like ChatGPT to simply do their assignments for them.
⚠️ The Shift Toward "Passive Learning"
With the rise of "agentic AI" that can independently complete tasks, students are at risk of becoming "passive learners". In this environment, students use AI to generate answers but fail to question the system, blindly accepting the output without critically evaluating whether the information is actually right or wrong.
The Need for New Evaluation Skills
Despite these negative trends, AI is changing how critical thinking must be applied rather than eliminating the need for it. Because students are allowed to use AI platforms, their learning must focus on:
- Validating outputs: Students must possess foundational knowledge to independently validate whether an AI tool is giving them correct output or hallucinating.
- Recognizing quality: The key skill for graduates is no longer just producing work from scratch, but having the ability to "recognize what good work is" — critically analyzing AI outputs for quality, bias, and ethical issues.
- Understanding over copying: Educators emphasize that the ultimate goal is to ensure students understand the material and can construct their own base of knowledge, rather than just relying on a "copy and paste" approach to AI answers.
Preventing Passive Learning: Active Strategies for Educators
Demand foundational knowledge
Students must first develop a strong base of knowledge. Without this, they lack context to know whether AI output is accurate or flawed.
Teach output validation
Train students to scrutinize AI-generated content for accuracy, bias, and ethical implications — not accept it at face value.
✂️ Discourage copy-and-paste habits
Set clear expectations that simply copying AI outputs is unacceptable. The aim is genuine understanding.
❓ Encourage active questioning
Push students to constantly question the system. AI is beneficial only if it helps students actively construct their own knowledge base.
Agentic AI and the Passive Learner: The Looming Challenge
The shift toward "agentic AI" in education moves beyond basic generative AI to systems that can autonomously complete tasks. Instead of using AI as a supportive tool, students can simply log in and let the agent do their assignments for them. This directly fuels the rise of "passive learners": students stop actively reading, questioning, or trying to deeply understand the material. Ultimately, agentic AI reflects a broader mindset shift in which students increasingly expect AI platforms to do the work for them.
Redefining Employability & Graduate Agency
There is an ongoing debate about whether the primary purpose of higher education is scholarly knowledge production or workforce preparation. However, balancing employability with critical thinking requires redefining what it means to be employable and fundamentally shifting university pedagogy.
Redefining Employability Beyond Micro-Skills
Viewing employability narrowly as a "bundle of micro skills" does a disservice to graduates. True employability should be understood as "agency" — the capacity to engage with work and interact with others in a productive, relational way. The competencies most highly valued by modern employers are problem-solving, creative thinking, and critical thinking.
Adapting Curriculum and Assessment
Universities must critically examine their curricula. Degree programs must be pedagogically sound, intentionally integrate work-related skills, and reliably assess critical thinking attributes. Students should be encouraged to use AI to augment their knowledge for complex, high-level cognitive tasks, rather than using it as a substitute for learning fundamental concepts.
Preserving Graduate Agency
A major concern is that graduates will be "seduced by the rationality and efficiency" of AI, passively consuming its outputs and handing over intellectual agency to external systems. To prevent this, universities must train students to be critical of AI-generated knowledge — identifying systemic biases, navigating ethical challenges, and generating robust knowledge relevant to local contexts.
Co-Evolving with Industry
Achieving balance requires viewing higher education and industry as "co-evolving systems". Universities should nurture ongoing dialogues with employers through curriculum design, teaching, and work-integrated learning. This continuous interaction avoids merely producing "sheep" for the workforce and instead graduates open-minded individuals capable of critical, independent thought.
Should students use AI to augment complex tasks?
Yes — roundtable participants advocated using AI to augment knowledge for higher-order cognitive tasks, but strongly warned against using it as a substitute for foundational learning.
Drawing on Bloom's taxonomy, a participant suggests that students should avoid using AI for lower-level cognitive tasks. It is essential that students learn fundamental concepts rather than using AI as a shortcut. However, when engaging in high-level functions — critical thinking, problem-solving, creative thinking — students should actively use AI to enhance and augment their capabilities.
Familiarity with AI for task optimization is becoming a core employability skill. The skills that command the highest salaries are precisely these higher-order cognitive abilities that AI can help augment.
Yet a significant risk remains: young graduates can be easily "seduced by rationality and efficiency", passively accepting AI-generated knowledge without scrutiny. Unlike experienced researchers who can identify biases or hallucinations, students risk handing over their intellectual agency to the machine if they do not maintain a critical lens. AI should be used for augmentation, but students must also be trained to constantly question and evaluate its outputs.
Conclusion
Ultimately, the path forward for higher education lies in balancing technological integration with the preservation of intellectual agency. Rather than allowing AI to serve as a substitute for learning, universities must pivot to a pedagogy that prioritizes foundational knowledge and higher-order cognitive skills like problem-solving and critical analysis. By treating AI as a tool for augmentation rather than a replacement for effort, educators can train students to validate outputs, scrutinize biases, and maintain an active, questioning mindset. Through these intentional strategies, institutions can successfully bridge the gap between academic rigor and workforce readiness, ensuring that graduates remain independent, critical thinkers in an increasingly automated world.