Score Divergences between Personalized and Equal Exams in Online Assessment

Proceedings of the 8th International Conference on Advanced Research in Education, Teaching and Learning

Year: 2024

DOI:

Mario J. C. Ayala and Fredy Andres Olarte Dussan

ABSTRACT:

Online assessments have become an easy way to evaluate students thanks to the development of technological tools that support professors and students. Nevertheless, the ease of finding answers on the Internet and of quickly sharing them among unsupervised students poses a challenge for maintaining academic integrity. This challenge has promoted the development of countermeasures such as large question banks, time restrictions, random question ordering, and random values in questions. We developed a large question bank for the Signals and Systems I course at the National University of Colombia to create personalized exams with random values in online assessments using NUMBAS. We compared exam scores between personalized and equal exams over three semesters, with a sample of 163 exams, applying an ANOVA test to evaluate differences between scores and a Mann-Whitney test to check whether the scores on the equal exams were higher than those on the personalized ones. The results showed better scores and less variability in the equal exams, suggesting possible illicit answer sharing among classmates. The ANOVA test confirmed the difference among the scores, and the Mann-Whitney test confirmed that the scores on the equal exams were higher than those on the personalized ones, with over 98% confidence. These results indicate that online assessments with fewer control techniques allow students to obtain better scores by adopting cheating practices, demonstrating the effectiveness of personalized exams. This study revealed that randomly varying questions through personalized exams helps obtain more accurate scores in online assessments.
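
The score comparison described above can be reproduced with standard statistical libraries. The following is a minimal sketch in Python using SciPy, assuming the scores of the two exam conditions have already been collected into two arrays; the variable names and the example data are hypothetical illustrations, not the study's dataset.

    import numpy as np
    from scipy import stats

    # Hypothetical score samples for the two exam conditions;
    # the real study compared 163 exams over three semesters.
    equal_exam_scores = np.array([4.2, 4.5, 4.0, 4.8, 4.3, 4.6, 4.1])
    personalized_scores = np.array([3.1, 3.8, 2.9, 4.0, 3.3, 2.7, 3.5])

    # One-way ANOVA: tests whether the group mean scores differ.
    f_stat, anova_p = stats.f_oneway(equal_exam_scores, personalized_scores)
    print(f"ANOVA: F = {f_stat:.3f}, p = {anova_p:.4f}")

    # One-sided Mann-Whitney U test: tests whether scores on the equal
    # exams tend to be higher than scores on the personalized exams.
    u_stat, mw_p = stats.mannwhitneyu(
        equal_exam_scores, personalized_scores, alternative="greater"
    )
    print(f"Mann-Whitney U: U = {u_stat:.1f}, p = {mw_p:.4f}")

    # A p-value below 0.02 here corresponds to the confidence level
    # above 98% reported in the abstract.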

Keywords: Academic integrity, personalized exams, online assessments, cheating practices