CBTF Best Practices

The following steps will help to ensure your students don't experience technical issues and know what to expect when they come to the CBTF for their first exam. Some steps may not apply if using an LMS other than PrairieLearn, but many of the principles will still be relevant.

We encourage all faculty members to set up a time to tour our labs to get a better understanding of the student experience. Contact cbtf@illinois.edu to schedule a tour.

Technical Preparations

Step: Add a date range to your course instance
Purpose: Without a date range in your course instance, students will not be able to access their PrairieLearn assessments.

Step: Link the exam in PrairieTest
  • Copy and paste directly from PrairieTest
  • Don't add any additional access rules (e.g., no exam access dates or lists of UIDs)
Purpose: Linking the exam in PrairieTest allows students to access their assessment when they come to the CBTF.

Assessment Preparation

Step: Seek Support
Purpose: Ask questions and learn how to better customize your assessment from PrairieLearn developers.

Student Preparation

Step: Tell students to register for the exam in PrairieTest
  • Registration for each exam opens on the Thursday two weeks before the exam
  • Students can change and delete reservations as much as they want (unless they have already missed their reservation)
Purpose: Our labs are used heavily. Registering early provides more opportunities to find an exam time that works with students' schedules.

Step: Assign students at least one assessment in PrairieLearn
Purpose: Serves as a check on the course instance configuration and exposes students to the platform they will be using to take their assessment.

Step: Provide practice with every question type or exam configuration that students will see on the day of the exam
  • downloading files
  • formula sheets
  • manual grading
  • workspaces
Purpose: Exposing students to the various question formats before taking their assessment in the CBTF allows them to focus on the content and not on learning to navigate the LMS. Students should not see any new question formats in the CBTF that they have not already practiced before their exam.

Step: Inform students about the CBTF
Purpose: Educating students about the CBTF before their exam lets them know what to expect when they arrive, reducing anxiety and allowing them to focus on their exam.

 

"To be effective, an implementation of second-chance testing should be more than just “another roll of the dice” to see if a student can get a better score on a different assessment. Instead, it must encourage most students to engage in the study habits and test-taking behaviors that benefit them the most" (Herman, Varghese, Zilles, 2019).
 
Benefits of Second Chance Exams
  • approximates mastery learning teaching principles
  • can encourage more engagement with course content
  • students generally score higher on the second exam
Summary
  • A full grade replacement should not be considered.  
  • Use a "Weighted Average with Max" policy as a good compromise between benefits and drawbacks.
  • Include a requirement (additional practice assignments, attendance at office hours, etc.) to make students eligible to retake an exam
Second Chance Exam Models

Policy: Full replacement
Description: The second exam grade completely replaces the first exam.
Benefits: Allows a second opportunity to take the exam.
Drawbacks: Many students either don't prepare for the first exam or don't take the first exam.

Policy: Weighted average
Description: The first and second exams are averaged together (e.g., Exam 1 = 40% and Exam 2 = 60%).
Benefits: Students are encouraged to study for both exams.
Drawbacks: Depending on the weights, students who already earned high grades may try testing again.

Policy: Weighted average with max
Description: Similar to the weighted average, but the weights are attached to the worst and best scores (e.g., 10% of the worst exam grade + 90% of the best exam grade); see the sketch after this table.
Benefits: Students who earned a C or below are likely to take the exam again, which can contribute to more studying and learning during a course.
Drawbacks: Students who earned higher grades are less likely to take the second exam.

Policy: Weighted average with insurance
Description: Same as the weighted average, except the final score can't be less than the first exam score.
Benefits: Less stress associated with taking the second exam.
Drawbacks: Students may be less encouraged to study and just take another chance.

Policy: Weighted average with extra homework
Description: Same as the weighted average, except students have to complete an additional assignment to be eligible for the second exam.
Benefits: Helps to assure more engagement with the content between the first and second exam.
Drawbacks: Similar drawbacks as described above, depending on weights and insurance.
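
To make the weighting concrete, here is a minimal Python sketch of the policies above. The 40/60 and 10/90 weights come from the table; the function names and example scores are illustrative only.

    def full_replacement(first, second):
        """The second exam grade completely replaces the first."""
        return second

    def weighted_average(first, second, w_first=0.4, w_second=0.6):
        """Fixed weights, e.g. Exam 1 = 40% and Exam 2 = 60%."""
        return w_first * first + w_second * second

    def weighted_average_with_max(first, second, w_worst=0.1, w_best=0.9):
        """Weights attach to the worst and best scores, e.g. 10% worst + 90% best."""
        return w_worst * min(first, second) + w_best * max(first, second)

    def weighted_average_with_insurance(first, second, w_first=0.4, w_second=0.6):
        """Weighted average, but the final score can't drop below the first exam."""
        return max(first, weighted_average(first, second, w_first, w_second))

    first, second = 70, 85
    print("Full replacement:             ", full_replacement(first, second))
    print("Weighted average:             ", weighted_average(first, second))
    print("Weighted average with max:    ", weighted_average_with_max(first, second))
    print("Weighted average w/ insurance:", weighted_average_with_insurance(first, second))

With these example scores, the "max" policy still counts the weaker attempt while rewarding the improvement, which is the compromise recommended in the Summary above.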
 
Related Research

Herman, G. L., Cai, Z., Bretl, T., Zilles, C., & West, M. (2020). Comparison of Grade Replacement and Weighted Averages for Second-Chance Exams. Proceedings of the 2020 ACM Conference on International Computing Education Research, 56–66. https://doi.org/10.1145/3372782.3406260

Herman, G., Varghese, K., & Zilles, C. (2019). Second-chance Testing Course Policies and Student Behavior. 2019 IEEE Frontiers in Education Conference (FIE), 1–7. https://doi.org/10.1109/FIE43999.2019.9028490

Schmitz, C., Herman, G., & Bretl, T. (2020). The Effects of Second-Chance Testing on Learning Outcomes in a First-Year STEM Course in Engineering. 2020 ASEE Virtual Annual Conference Content Access Proceedings, 35313. https://doi.org/10.18260/1-2--35313

Emeka, C., Smith, D., Zilles, C., West, M., Herman, G., & Bretl, T. (2023). Determining the Best Policies for Second-Chance Tests for STEM Students. 2023 ASEE Annual Conference Content Access Proceedings, 39387. https://doi.org/10.18260/1-2--43019

Emeka, C., Zilles, C., West, M., Herman, G., & Bretl, T. (2023). Second-Chance Testing as a Means of Reducing Students’ Test Anxiety and Improving Outcomes. 2023 ASEE Annual Conference Content Access Proceedings, 39385. https://doi.org/10.18260/1-2--44207

“Faculty and students are both overwhelmingly positive about shorter, more frequent exams [22]. Students prefer them because each exam is less stressful, because it is a smaller fraction of their overall grade.  Faculty like them because they prevent student procrastination” (Zilles et al., 2019).

Benefits of More Frequent Exams 

  • More frequent testing can improve student learning and retention of content
  • Can lower student stress and encourage better study habits
  • Reduces students’ opportunities to procrastinate
  • Encourages more active study habits
Principles for Success
  • Feedback and incentives to revisit exam content are necessary
  • Concepts should be assessed across multiple exams
  • The marginal impact of frequent testing decreases as the number of exams increases
Examples of Impact on Student Learning

(Figure: Lower percentage of failing grades following the implementation of more frequent exams; Morphew et al., 2020; Nip et al., 2018.)

Related Research

Adkins, J., & Linville, D. (2017). Testing frequency in an introductory computer programming course. Information Systems Education Journal, 15(3), 22–28.

Morphew, J. W., Silva, M., Herman, G., & West, M. (2020). Frequent mastery testing with second‐chance exams leads to enhanced student learning in undergraduate engineering. Applied Cognitive Psychology, 34(1), 168–181. https://doi.org/10.1002/acp.3605

Nip, T., Gunter, E. L., Herman, G. L., Morphew, J. W., & West, M. (2018). Using a Computer-based Testing Facility to Improve Student Learning in a Programming Languages and Compilers Course. Proceedings of the 49th ACM Technical Symposium on Computer Science Education, 568–573. https://doi.org/10.1145/3159450.3159500

Zilles, C., West, M., Herman, G., & Bretl, T. (2019). Every University Should Have a Computer-Based Testing Facility. Proceedings of the 11th International Conference on Computer Supported Education, 414–420. https://doi.org/10.5220/0007753304140420

 

“When faculty are invited to use asynchronous computerized exams in their courses, their almost universal first concern is the potential for collaborative cheating resulting from the asynchronous nature of the exam…. However, when we plot student’s exam scores versus the day on which they took the exam, we see that on average the scores are actually decreasing over the exam period” (Chen et al., 2017).

Principles for Reducing Collaborative Cheating
  • Question pools with at least 4 questions are sufficient to mitigate collaborative cheating (see the sketch after this list)
  • No single question format (multiple choice, checkbox, numeric, etc.) significantly increases or decreases the advantage gained from cheating
  • Exam scores decrease later in the exam window when question pools are sufficiently large and questions are parameterized
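
As a rough, back-of-the-envelope illustration of the first point (not the analysis from Chen et al., 2018), the following Python sketch simulates two students who each draw one variant per question from a pool and reports how much of their exams overlap. The pool sizes, question count, and trial count are arbitrary illustrative values.

    import random

    def expected_overlap(pool_size, num_questions=10, trials=10_000):
        """Estimate the fraction of questions two students receive in common
        when each question is drawn uniformly from a pool of pool_size variants."""
        shared = 0
        for _ in range(trials):
            exam_a = [random.randrange(pool_size) for _ in range(num_questions)]
            exam_b = [random.randrange(pool_size) for _ in range(num_questions)]
            shared += sum(a == b for a, b in zip(exam_a, exam_b))
        return shared / (trials * num_questions)

    for pool_size in (1, 2, 4, 8):
        print(f"pool size {pool_size}: ~{expected_overlap(pool_size):.0%} of questions shared")

The expected overlap is simply 1/k for a pool of k variants, so even a pool of four means a colluding pair shares only about a quarter of their questions on average.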
Student Testing / Cheating Behavior
  • Students, when given options, overwhelmingly choose exam times later on the last day of a testing window 
  • Initially, unproctored exams don’t show elevated levels of cheating, but cheating behaviors increase with continued use of unproctored exams during a semester
Data from Collaborative Cheating Research
(Figure: Decreasing exam scores over the testing window; Chen et al., 2017.)
(Figure: Decreasing advantage with increased question pool size; Chen et al., 2018.)
Related Research

Chen, B., Azad, S., Fowler, M., West, M., & Zilles, C. (2020). Learning to Cheat: Quantifying Changes in Score Advantage of Unproctored Assessments Over Time. Proceedings of the Seventh ACM Conference on Learning @ Scale, 197–206. https://doi.org/10.1145/3386527.3405925

Chen, B., West, M., & Zilles, C. (2017). Do Performance Trends Suggest Wide-spread Collaborative Cheating on Asynchronous Exams? Proceedings of the Fourth (2017) ACM Conference on Learning @ Scale, 111–120. https://doi.org/10.1145/3051457.3051465

Chen, B., West, M., & Zilles, C. (2018). How much randomization is needed to deter collaborative cheating on asynchronous exams? Proceedings of the Fifth Annual ACM Conference on Learning at Scale, 1–10. https://doi.org/10.1145/3231644.3231664

Chen, B., West, M., & Zilles, C. (2019). Analyzing the decline of student scores over time in self‐scheduled asynchronous exams. Journal of Engineering Education, 108(4), 574–594. https://doi.org/10.1002/jee.20292

Silva, M., West, M., & Zilles, C. (2020). Measuring the Score Advantage on Asynchronous Exams in an Undergraduate CS Course. Proceedings of the 51st ACM Technical Symposium on Computer Science Education, 873–879. https://doi.org/10.1145/3328778.3366859

The CBTF only utilizes dedicated computer labs for administering assessments. Dedicating spaces for computer-based assessment allows labs to be developed that will maximize utilization, contribute to exam security, and also better accommodate students. The following principles should guide the development of new testing labs.


Sight lines

Labs should be designed to prioritize sight lines for proctors. Straight rows let proctors monitor students more efficiently, helping them identify both students who need support and potential academic integrity violations.

Traffic Flow

To maximize lab efficiency, the lab should be designed to allow students to check in, store materials (backpacks, jackets, etc.), and move to their assigned seats. Open storage racks inside the lab provide the quickest and simplest method for students to store what they bring to the lab. 

Lab Size

Larger labs provide more testing capacity and are more cost-effective. All testing sessions should be staffed by two proctors to provide the necessary support and consistency in lab staffing. Two proctors can effectively facilitate check-in and monitor a testing lab of approximately 100 students.

  • 33-seat lab: 2,772 testing hours/week; $2,520 proctoring cost/week; $0.90 per testing hour
  • 100-seat lab: 8,400 testing hours/week; $2,520 proctoring cost/week; $0.30 per testing hour

Calculations are based on the following assumptions (a worked sketch follows the list):

  • 12 testing sessions/day and operating 7 days/week
  • $15 / hour proctor wage
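
A minimal Python sketch of the arithmetic behind the figures above; it additionally assumes one-hour sessions and two proctors per session, which is consistent with the Lab Size paragraph and the $2,520/week figure.

    def weekly_figures(seats, sessions_per_day=12, days_per_week=7,
                       proctors=2, wage_per_hour=15, session_hours=1):
        """Return (testing hours/week, proctoring cost/week, cost per testing hour)."""
        testing_hours = seats * sessions_per_day * days_per_week * session_hours
        proctor_cost = proctors * wage_per_hour * session_hours * sessions_per_day * days_per_week
        return testing_hours, proctor_cost, proctor_cost / testing_hours

    for seats in (33, 100):
        hours, cost, per_hour = weekly_figures(seats)
        print(f"{seats}-seat lab: {hours} testing hours/week, "
              f"${cost}/week in proctor wages, ${per_hour:.2f}/testing hour")

The 33-seat figure works out to about $0.91 per testing hour, reported as roughly $0.90 above.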

Plan for Accessibility and Testing Accommodations

Principles of universal design should be applied when developing a lab layout. We recommend consulting with accessibility specialists to proactively create an environment that will be accessible and comfortable for all students. We plan for approximately 12 square feet/student when designing spaces.

Students who require documented testing accommodations should be accommodated to allow them to take their exams in the same setting as their classmates. New computer-based testing labs should be designed with testing accommodations in mind. The most common testing accommodations are distraction-reduced environments and extended time. However, planning for accommodations such as text-to-speech, height-adjustable seating, access to food and beverages, and additional bathroom breaks should also be considered.

(Distraction-reduced testing accommodation in the CBTF.)

General Security Principles

Students should not be able to bring in outside data or take information from the exam out of the facility. Everything the student needs should be delivered via the exam.

Anything they make during the exam should be saved with the exam or discarded.

Physical Lab Security

The following steps should be implemented to create a secure testing environment.

  • The lab should be used exclusively for testing and should not be open unless monitored by a proctor.
  • Security cameras are encouraged to help document academic integrity violation allegations.
  • Proctors should use ID card scanners to verify the identity of students.
  • Recycling containers should be placed near the exits so students can dispose of their scratch paper before leaving.

Computer Configuration

The following steps should be taken when configuring the computers in the testing facility.

  • Students should have individual logins.
  • Networked file shares (e.g., an inherited shared home directory) should either be specific to the testing center (not shared with systems used outside of testing) or be local to the computer only, so that student storage does not persist across systems.
  • Students should not have privileged access to the systems (normal users). Students should not be able to access the systems remotely.
  • USB or other removable media should be disabled. Printing should be disabled.
  • As much as possible, students should not be able to write to shared areas of the disk. Any user data written to the disk should be purged on logout.
  • Computers should only have web browsers installed. All exam content should be web-based and delivered through approved Learning Management Systems.

Network Configuration

The following security measures should be taken when configuring the networks used in the testing lab.

  • Egress filtering on the network should prevent connecting to anything other than an allowed list of hosts such as but not limited to: (1) exam servers, (2) authentication servers for the exam servers, (3) licensing servers for the exam servers, (4) networking servers like DNS or DHCP servers, (5) systems needed to install or maintain the lab computers, and (6) systems needed to support assistive technologies on a subset or all of the computers
  • Ideally, most of these allowed servers would not permit student logins (i.e., the student can use the DNS server for domain lookups but cannot ssh in to save or retrieve information) while still letting the computer function.
  • The student should have login access to the exam server(s).
  • The network ranges issued to testing centers should have their own CIDR block and not be mixed with other systems that don’t share the same security policy. A standalone subnet/VLAN is recommended (a minimal sketch of the allowlist idea follows this list).
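
As a concrete but purely illustrative sketch of the allowlist idea, the Python snippet below uses the standard-library ipaddress module to decide whether an outbound connection should be permitted. The server addresses and the lab subnet are made-up documentation-range placeholders, and in practice this policy is enforced in the network firewall rather than in application code.

    import ipaddress

    # Placeholder (documentation-range) addresses standing in for the real servers.
    ALLOWED_DESTINATIONS = [
        ipaddress.ip_network("192.0.2.10/32"),    # exam server (hypothetical)
        ipaddress.ip_network("192.0.2.20/32"),    # authentication server (hypothetical)
        ipaddress.ip_network("192.0.2.30/32"),    # licensing server (hypothetical)
        ipaddress.ip_network("198.51.100.0/28"),  # DNS/DHCP and management hosts (hypothetical)
    ]

    # Hypothetical standalone subnet/VLAN assigned to the testing lab.
    LAB_SUBNET = ipaddress.ip_network("203.0.113.0/26")

    def egress_allowed(src: str, dst: str) -> bool:
        """Permit traffic only from lab machines to explicitly allowlisted destinations."""
        src_ip = ipaddress.ip_address(src)
        dst_ip = ipaddress.ip_address(dst)
        return src_ip in LAB_SUBNET and any(dst_ip in net for net in ALLOWED_DESTINATIONS)

    print(egress_allowed("203.0.113.5", "192.0.2.10"))   # True: lab machine -> exam server
    print(egress_allowed("203.0.113.5", "192.0.2.99"))   # False: destination not on the allowlist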

Exam Server Configuration

The following security measures should be taken when configuring the exam servers used by the testing lab.

  • Exam servers should interface with the testing scheduling software to confirm who can access what and what time limits are being enforced.
  • The exam server should only show the allowed exam, not notes, discussion boards, or any other resources that (1) give students more learning resources than are expected during the exam or (2) give students the ability to store data (such as a message draft or an internal-to-LMS email) that could carry information about the current exam outside of the lab.

Additional documentation is available at: https://us.prairietest.com/pt/docs/center/security-center

Related Research

Sosnowski, J., Baker, J., Arnold, O., Silva, M., Mussulman, D., Zilles, C., & West, M. (2024). Reflections on 10 years of operating a computer-based testing facility: Lessons learned, best practices. Paper presented at the 2024 ASEE Annual Conference & Exposition, Portland, Oregon. https://peer.asee.org/47930

Downey, K., Miller, K., Silva, M., & Zilles, C. (2024). One Solution to Addressing Assessment Logistical Problems: An Experience Setting Up and Operating an In-person Testing Center. Paper presented at the 55th ACM Technical Symposium on Computer Science Education, Portland, Oregon. https://doi.org/10.1145/3626252.3630902

Zilles, C., West, M., Herman, G., & Bretl, T. (2019). Every University Should Have a Computer-Based Testing Facility. Proceedings of the 11th International Conference on Computer Supported Education, 414–420. https://doi.org/10.5220/0007753304140420

DeMara, R., Tian, T., Salehi, S., Khoshavi, N., & Pyle, S. (2019). Scalable Delivery and Remediation of Engineering Assessments using Computer-Based Assessment. IEEE Integrated STEM Education Conference (ISEC), Princeton, NJ, pp. 204-210. https://doi.org/10.1109/ISECon.2019.8882114

The CBTF Idea Exchange is an opportunity for CBTF users, both faculty and students, to share practices and philosophies that have contributed to improving teaching and learning in relation to CBTF assessments. Each exchange is made up of 5-7 mini presentations.


Fall 2024 CBTF Idea Exchange

  • Quiz Zero and Using Passwords (0:00 - 8:15)
  • Exam Frequency and Mastery Learning (8:15 - 16:25)
  • Exam Construction and Mastery Learning (16:26 - 24:35)
  • Lab and Design Learning with Autograding (24:46 - 33:45)
  • Question Randomization (33:45 - 38:30)
  • Q&A (38:30 - 49:55)