Eleven strategies for optimizing the management of remote high-stakes assessments
DOI: https://doi.org/10.36834/cmej.73734
Abstract
The COVID-19 pandemic made it necessary for professional regulatory bodies, certification institutions, and many university programs to rely on information and communication technologies (ICT) for their assessments. The slippery terrain explored in this article is the use of ICT to conduct high-stakes assessments remotely, with respect to their development, administration, and monitoring, while ensuring the validity of score interpretation.
License
(c) All rights reserved Christina St-Onge 2022
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International license.
Submission of an original manuscript to the journal constitutes an indication that the work is original, has never been published, and is not under consideration for publication in another journal. If accepted, it will be published online and may not be published elsewhere in the same form, for commercial purposes, in any language, without the publisher's consent.
The purpose of publishing scientific research is the dissemination of knowledge; under a non-profit model, it provides no financial benefit to either the publisher or the author.
Authors who publish in the Canadian Medical Education Journal agree to publish their articles under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 Canada license. This license allows anyone to download and share the article for non-commercial purposes, provided the authors are credited. For more details on the rights authors grant to users of their work, please consult the license summary and the full license.