Unis are using artificial intelligence to keep students honest in exams – SchoolNews

Universities are increasingly using computer programs to supervise students sitting exams. Is this the future of testing?

Driven by the pandemic, educational institutions around the world are rapidly adopting proctoring software such as Examplify, ExamSoft, ProctorU and others.

Proctoring technologies allow candidates to be monitored off campus. Instead of sitting an exam in a traditional exam hall, students can sit it at home. Some programs simply let a human invigilator watch students remotely.

More sophisticated, automated proctoring software takes control of the student's computer to block and monitor suspicious activity. These programs often use artificial intelligence (AI) to scrutinize exam conduct.

In a recent research paper, we explored the ethics of automated proctoring. The software's potential is attractive, but it comes with considerable risks.

Some educational institutions claim proctoring tools are needed to prevent cheating. Other institutions, and many students, are concerned about hidden dangers.

Indeed, students have launched protests, petitions and lawsuits. They accuse online proctoring of being discriminatory and intrusive, even Big Brother-like. Some proctoring companies have responded to the criticism by taking legal action against their critics.

One student's complaint that proctoring AI had falsely flagged her as cheating attracted millions of views on TikTok.

What does the software do?

Automated proctoring programs give examiners a range of anti-cheating tools. The programs can capture system information, block web access, and analyze keystrokes. They can also commandeer the computer's camera and microphone to record the test taker and their surroundings.

Some programs use AI to "flag" suspicious behavior. Facial recognition algorithms check that the student is still seated and that no one else is in the room. The programs also detect whispering, atypical typing, unusual movements, and other behaviors that may suggest cheating.
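Vendors do not publish their detection logic, so as a purely illustrative sketch, a rule-based flagger of the kind described above might combine per-frame face counts with a crude keystroke-timing anomaly test. All function names, thresholds, and data below are assumptions for illustration, not any vendor's actual method:

```python
from statistics import mean, stdev

def flag_frames(face_counts, expected=1):
    """Flag video frames where the detected face count differs from the
    expected single test-taker (0 = left seat, 2+ = extra person)."""
    return [i for i, n in enumerate(face_counts) if n != expected]

def flag_typing(intervals_ms, z_threshold=2.5):
    """Flag keystroke intervals more than z_threshold sample standard
    deviations from the session mean (a crude anomaly test)."""
    mu, sigma = mean(intervals_ms), stdev(intervals_ms)
    if sigma == 0:
        return []
    return [i for i, t in enumerate(intervals_ms)
            if abs(t - mu) / sigma > z_threshold]

# Hypothetical session data: per-frame face counts and keystroke gaps (ms).
faces = [1, 1, 0, 1, 2, 1]  # frame 2: empty seat, frame 4: two faces
gaps = [120, 130, 125, 118, 5000, 122, 127, 119, 124, 121]  # one long pause

print(flag_frames(faces))  # frames needing human review -> [2, 4]
print(flag_typing(gaps))   # keystroke anomalies needing review -> [4]
```

Note how blunt such heuristics are: the long pause flagged here could just as easily be a student pausing to think, which is why the "flags" at most justify a human review rather than prove misconduct.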

When the program "flags" an incident, examiners can review the stored video and audio, and question the student further.

Why use supervision software?

Automated proctoring software aims to reduce cheating in remote exams, a need made urgent by the pandemic. Fair exams protect the value of qualifications and show that academic integrity matters. These are important parts of accreditation requirements in disciplines such as medicine and law.

Cheating is unfair to honest students. Left unchecked, it also increases the incentive for those students to cheat.

Companies selling proctoring software claim their tools prevent cheating and improve exam fairness for everyone, but our work calls that into question.

So what’s the problem?


After evaluating the software, we found that simple technical tricks could circumvent many of the anti-cheating protections. This suggests the tools may offer limited benefits.

Requiring students to install software that gives examiners such powerful control over their computers is a security risk. In some cases, the software may even secretly remain after the student uninstalls it.


Some students may lack access to the right devices and the high-speed internet connection the software requires. This leads to technical problems that cause stress and disadvantage. In one incident, 41% of students experienced technical problems.


Online proctoring raises privacy issues. Video capture means examiners can peer into students' homes and scrutinize their faces without being noticed. Such intimate surveillance, recorded for potential repeated viewing, distinguishes it from traditional in-person invigilation.

Fairness and bias

Proctoring software raises serious concerns about fairness. The facial recognition algorithms in the software we evaluated are not always accurate.

In a separate paper by one of us, we found the algorithms used by major US-based vendors could not identify darker-skinned faces as accurately as lighter-skinned faces. The resulting hidden discrimination can compound social disadvantage. Others have reported similar concerns about proctoring software and facial recognition technology generally.
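One way to quantify the disparity described above is to compare false-flag rates across demographic groups: the share of honest test-takers each group sees wrongly flagged. The sketch below uses made-up numbers, not the paper's actual data:

```python
def false_flag_rate(flags, honest):
    """Share of honest test-takers the system wrongly flagged."""
    wrong = sum(1 for flagged, is_honest in zip(flags, honest)
                if flagged and is_honest)
    total = sum(honest)
    return wrong / total if total else 0.0

# Hypothetical outcomes for two groups of ten honest students each:
# True = the proctoring system flagged that student.
group_a_flags = [False, False, True, False, False,
                 False, False, False, False, False]
group_b_flags = [True, False, True, False, True,
                 False, False, True, False, False]
honest = [True] * 10  # everyone in this sample behaved honestly

rate_a = false_flag_rate(group_a_flags, honest)
rate_b = false_flag_rate(group_b_flags, honest)
print(f"Group A false-flag rate: {rate_a:.0%}")  # 10%
print(f"Group B false-flag rate: {rate_b:.0%}")  # 40%
```

A gap like this means students in one group face extra scrutiny and stress purely because of how the algorithm performs on their faces, even when overall accuracy looks acceptable.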

Proctoring software relies on facial recognition technology, in which racial bias is well documented. Shutterstock

Proctoring algorithms can also falsely flag a candidate's atypical eye and head movements. This can lead to unjustified suspicion of students who are not neurotypical or who have idiosyncratic exam-sitting styles. Even without automated proctoring, exams are already stressful events that affect our behavior.

Investigating unfounded allegations

Institutions can often choose whether to use or reject the automated features. Proctoring companies may argue that AI-generated "flags" are not evidence of academic misconduct, merely grounds for investigation at the institution's discretion.

However, merely being investigated and questioned on the basis of a machine's false suspicion can itself be unfair and traumatic.

Surveillance culture

Finally, automated exam monitoring may set a broader precedent. General concerns about surveillance and automated decision-making are growing. We should be careful about introducing potentially harmful technology, especially when it is imposed without our genuine consent.

Where to from here?

It is important to find fair ways to run exams remotely. Replacing exams with other forms of assessment is not always possible.

Nevertheless, institutions using automated proctoring software must be accountable. This means being transparent with students about how the technology works and what can happen to their data.

Institutions could also offer meaningful alternatives, such as an in-person exam option. Providing alternatives is fundamental to informed consent.

While proctoring tools may appear to offer a panacea, institutions need to carefully weigh the risks inherent in the technology.

Authors: Simon Coghlan, Senior Research Fellow in Digital Ethics, Centre for AI and Digital Ethics, School of Computing and Information Systems, University of Melbourne; Jeannie Marie Paterson, Professor of Law, University of Melbourne; Shaanan Cohney, Lecturer in Cybersecurity, University of Melbourne; and Tim Miller, Associate Professor of Computer Science (Artificial Intelligence), University of Melbourne

This article is republished from The Conversation under a Creative Commons licence. Read the original article.
