COVID-19 and the abrupt shift to remote learning: Impact on grades and perceived learning for undergraduate biology students

Institutions across the world transitioned abruptly to remote learning in 2020 due to the COVID-19 pandemic. This rapid transition has generally been predicted to negatively affect students, particularly those marginalized because of their race, socioeconomic class, or gender identity. In this study, we examined the impact of the transition in the Spring 2020 semester on the grades of students enrolled in the in-person biology program at a large university in the Southwestern United States, compared with the grades earned by students in the fully online biology program at the same institution. We also surveyed in-person instructors to understand changes in assessment practices resulting from the transition to remote learning during the pandemic. Finally, we surveyed students in the in-person program to learn about their perceptions of the impacts of this transition. We found that both online and in-person students received a similar small increase in grades in Spring 2020 compared to Spring 2018 and 2019. We also found no evidence of disproportionately negative impacts on grades received by students marginalized due to their race, socioeconomic class, or gender in either modality. Focusing on in-person courses, we documented that instructors made changes to their courses when they transitioned to remote learning, which may have offset some of the potential negative impacts on course grades. However, despite receiving higher grades, in-person students reported negative impacts on their learning, interactions with peers and instructors, feeling part of the campus community, and career preparation. Women reported more negative impacts on their learning and career preparation than men did. This work provides insights into students’ perceptions of how they were disadvantaged by the transition to remote instruction and illuminates actions that instructors can take to create more inclusive education moving forward.

well documented in the sciences [55]. Therefore, it is important to examine the impact of the transition to remote learning on STEM students with social identities historically underrepresented in the sciences, for which ASU's biology program provides a suitable context.

Positionality of the authors

We acknowledge that our own identities influence the research questions that we ask and how we may interpret the data. Our author team includes individuals who identify as men, women,

To study the impact on student course grades that resulted from the shift to remote learning during the COVID-19-impacted Spring 2020 semester, we obtained course grades from the university registrar for Spring 2020 and compared them to two spring semesters prior to the pandemic: Spring 2019 and Spring 2018. The population of interest is undergraduate biology majors enrolled in either the in-person biology degree program or the fully online biology degree program. We therefore obtained course grades for 42 STEM courses commonly taken by students in these biology majors, including courses in general biology, biochemistry, chemistry, physics, mathematics, and statistics. See Table S1 for the full list of courses.

Our grades analysis included a total of 25,100 student-course enrollments, with 8,323 from the Spring 2020 pandemic semester and the remainder from Spring 2018 or 2019. Of these, 19,181 enrollments were in in-person courses and the remaining 5,919 were in online degree program courses. Course grades were analyzed on a 0-4.33 scale (A+ = 4.33, A = 4.00, A- = 3.66, ..., E = 0). Grades other than A-E were excluded from the analysis; these accounted for 2,404 student-course enrollments, or 9.6% of the total dataset. In Spring 2018 and 2019, these excluded grades were almost exclusively W ("withdraw") grades.
In response to the unique circumstances of the pandemic, some instructors assigned the "Y" grade, which indicates "satisfactory" work at a level of C or higher. In Spring 2020, about a third of the non-letter grades were Y grades. The combined proportion of non-letter grades held steady in Spring 2020 compared to 2018 and 2019 in online courses and increased slightly in Spring 2020 for in-person courses. The withdrawal percentage declined and the Y percentage rose both online and in person. We cannot say definitively how many of the students who received a Y grade

To determine the direction and significance of the effect of the shift to remote learning on student grades, we performed a linear mixed-effects regression on the numerical course grades. The fixed effects in the model included a dummy variable for the Spring 2020 ("COVID-19") semester, whether the student was enrolled in the in-person or online degree program, an interaction between these two variables, and the GPAO term. We included random effect terms for course section and student. These terms provided a modest improvement to the model, with a combined intraclass correlation coefficient of 0.256.
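The combined intraclass correlation coefficient can be computed from the fitted variance components of such a model; a minimal sketch follows (the individual variance components below are hypothetical, chosen only so that their combined ICC matches the 0.256 reported here):

```python
def combined_icc(var_components: dict, var_residual: float) -> float:
    """Combined intraclass correlation coefficient: the share of total
    variance attributable to the random-effect groupings (here, course
    section and student)."""
    var_random = sum(var_components.values())
    return var_random / (var_random + var_residual)

# Hypothetical variance components from a mixed-effects fit; only the
# combined ICC value is taken from the text.
icc = combined_icc({"course_section": 0.080, "student": 0.176},
                   var_residual=0.744)
print(round(icc, 3))  # 0.256
```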

To determine the direction and significance of the effect of the shift to remote learning on grades received by students with identities historically underrepresented in STEM, we added interaction terms between the dummy variable for the Spring 2020 ("COVID-19") semester and each of the demographic terms to the model described above. We again controlled for GPAO and included random effect terms for course section and student in this model (see Tables S2 and S3). Contrary to our prediction, the model shows positive, but mostly non-significant, interaction effects for all groups compared to their historically overrepresented counterparts. The two statistically significant interactions showed women to have a Spring 2020 effect 0.05 greater than men and Pell-eligible students to have an effect 0.08 greater than non-Pell-eligible students.

Building on the open-ended responses from the first instructor survey, we created a second survey that asked in more detail about instructional changes in response to the pandemic. To assess cognitive validity, we conducted two think-aloud interviews with biology faculty members who taught in person during Spring 2020 and had to transition to remote learning [59]. These think-aloud interviews indicated that the instructors understood the questions. We then distributed this revised survey to all biology instructors who taught in-person courses in Spring 2020 (n = 132). In the event that an instructor taught multiple courses, the survey asked them to respond based on their largest course. The survey first asked instructors to identify any changes they made in their course. This question used a multiple-selection format with a) 24 options provided, b) an option to indicate that no changes were made, and c) an option to describe other changes not listed.
The survey also asked instructors to report the extent to which they tried to reduce cheating in their course, the extent to which they made their course more flexible, and the extent to which they made their course easier. Each of these questions was answered on a six-point Likert scale from strong agreement to strong disagreement with no neutral option, and instructors were asked to explain each answer (a copy of the survey questions analyzed is provided in the Supplemental Materials). While instructors also experienced many of the same personal challenges resulting from the pandemic that students did, our focus was on the student experience, and we therefore asked instructors only about instructional changes.

A total of 43 of the 132 biology instructors contacted completed the second survey (33% response rate) based on their experiences teaching an in-person biology course that shifted to fully remote instruction in the Spring 2020 semester. Of these, 18 had taught an in-person course with at least 100 students that transitioned during Spring 2020. Our analysis focuses on these large courses because their instructors face greater practical constraints when shifting instruction to remote learning and because the larger sizes mean that a greater number of students in total are affected by these decisions.
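Collapsing the six-point scale into the agreement percentages reported in this section can be sketched as follows (our own recoding; we assume responses are coded 1 = strongly disagree through 6 = strongly agree, with 4-6 counted as agreement):

```python
def percent_agree(responses) -> float:
    """Share of six-point Likert responses on the agreement side
    (codes 4-6); the scale has no neutral midpoint, so every
    response falls on one side or the other."""
    return 100 * sum(r >= 4 for r in responses) / len(responses)

# Hypothetical responses from ten instructors to one survey item.
print(percent_agree([6, 5, 4, 2, 3, 5, 6, 1, 4, 5]))  # 70.0
```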

To understand the extent to which changes in assessment practices made by instructors might explain differences in student grades in Spring 2020 compared to previous semesters, we examined data from 10 instructors who responded to our survey and had taught the same course in Spring 2020 and in either Spring 2019 or 2018. We performed course-level linear regressions on the relative grade difference using the following predictors: total number of changes made, use of lockdown browsers for exams, whether the instructor made efforts to reduce cheating, and whether the instructor worked to make the course easier (Table 3).

Focusing on the large courses, about 60% of instructors agreed that they took steps to reduce cheating. Nearly all large-course instructors (94%) agreed that they made changes to be more flexible to help students who were experiencing challenges, and most (78%) agreed that they made it easier for students to do well (Table S4).

Response Option (Frequency, N = 43)
Gave individual students extensions on deadlines for out-of-class assignments that I wouldn't have normally provided: 37%
Extended the deadline or allotted more time than I usually provide to complete out-of-class assignments: 33%
Increased the amount of time students were allotted to complete a quiz or exam: 33%
Gave students more opportunities to miss class and not lose participation/attendance points but still gave participation/attendance points for class: 26%
Reduced or eliminated penalties for out-of-class assignments that were submitted late: 26%
Changed assessments such as exams or quizzes from closed-book to open-book: 26%
In addition to delivering my content online, I made a significant change to my course that is not reflected above: 30%

Students that selected "yes" to either of these questions were grouped together as BLNP for our analyses. We grouped students in this manner because all of these groups are historically underrepresented in the sciences and our sample sizes for the student survey were not large enough to allow us to disaggregate race/ethnicity data.

Student survey distribution
In Fall 2020, we used a convenience sampling approach to recruit eight biology instructors who agreed to distribute our survey to students in their classes. The survey was sent to a total of 1,540 students in these eight courses, and students were offered a small amount of extra credit for completing the survey. A total of 798 students completed the survey, a response rate of 51.8%. However, only 601 of these students were enrolled in the in-person biology degree program in Spring 2020. Of these students, 70 reported that they did not take any biology courses in Spring 2020, so they were not included in any course-specific analyses. After removing these students and 21 students with missing data, we were left with responses from 510 students who had taken in-person biology courses that transitioned to remote learning in Spring 2020, which we used for our analyses.

We calculated the total percentage of students that reported negative impacts on their learning,

About 56% of students reported that they think the transition to remote learning negatively impacted their grade, even though our grade analysis did not indicate that this was likely. In fact, about 63% of students said that the amount of time they spent interacting with other students in class and outside of class greatly decreased in Spring 2020, which was the strongest response option (Table S11). However, student responses were fairly split on the amount of time spent studying for a course, with about 45% of students reporting an increase in the amount of time they spent studying and 41% reporting a decrease (Fig 1).

We collected student and instructor data on instructional practices for eight courses. Among these, only four of the eight instructors agreed that they took steps to reduce cheating in their course; in the courses taught by these four instructors, the percentage of students who agreed that their instructor took some steps to reduce cheating ranged from 90 to 100%. However, even for the courses where instructors disagreed that they took steps to reduce cheating, 83 to 86% of students agreed that their instructor took some steps to reduce cheating (Figure S1). By contrast, all eight instructors agreed that they tried to make their course more flexible. However, there was more variation in student responses to whether their instructor tried to make the course more flexible, with the percentage of students who agreed ranging from 61 to 91% across the eight courses. All but one instructor agreed that they tried to make the course easier, but student agreement with this question was again mixed, ranging from 47 to 81% across the courses. Overall, these data show that students tended to slightly overestimate instructor efforts to reduce cheating and slightly underestimate instructor efforts to make the course easier and more flexible.

We did not find significant demographic differences in the student Likert responses to most of the survey items. In Fig 1, we describe the few demographic differences we found through our ordinal mixed models (see full ordinal regression results in the Supplemental Materials).
Although most students reported that the time spent with instructors decreased or greatly decreased during the pandemic, the proportion of BLNP students who chose these options was lower than that of non-BLNP students. Pell-eligible students were more likely than non-Pell-eligible students to report that time spent with other students in class greatly decreased.
Lastly, women were significantly more likely than men to report negative impacts on their learning in a course and on their career preparation.

A study comparing the effects of active and passive (i.e., lecture-based) instruction on student learning found that students who received active instruction scored higher on the learning assessment but perceived that they learned less than their peers who received passive instruction [66].
Thus, even though it has been shown that students, on average, learn more from active learning [67,68], students' perceptions of learning might not match their actual learning. A meta-analysis showed that student perceptions of their learning are more strongly related to affective outcomes, such as motivation and satisfaction, and have a much weaker relationship to learning outcomes, such as scores [69]. However, one reason for this may be that grades are often not an accurate measure of student learning [70]. Given this background and our finding that instructors were more flexible with grading after the transition to remote learning in Spring 2020, we think it is likely that the increase in grades does not reflect an increase in student understanding of the course material. Instead, students earned higher grades while self-reporting that they learned less, which we find concerning for the extent to which completing these college courses is preparing them for their future careers.

The slight increase in average student grades in Spring 2020 compared to previous semesters is consistent with other studies that have examined student grades in Spring 2020 at other institutions [45-47]. Interestingly, this increase was observed both in courses that experienced the emergency transition to remote learning and in courses in the online degree program that did not experience a change in modality. Although we did not survey the online