What is the PLIC?
The Physics Lab Inventory of Critical thinking (PLIC) is a closed-response survey designed to assess how students critically evaluate experimental methods, data, and models. We define critical thinking as the ways in which one makes decisions about what to do and what to trust. In a scientific context, this decision-making is grounded in evidence, including data, models, analyses, and prior literature.
Want to try out the PLIC? Please click here to access the "expert" version of the survey!
Interested in using the PLIC? Please click here to access the course information survey!
Need to edit your start/end dates for the survey? Visit this form.
Design and Validation of the PLIC
The original conception of the PLIC came from analysis of student decision-making and reasoning in lab notes [1]: how well students interpreted data, whether they attempted to improve their experiments, and whether they identified and explained disagreements between data and models. To enable broad assessment (rather than painstaking analysis of lab notes), we created a context-rich, closed-response assessment. The PLIC context centers on two groups completing a mass-on-a-spring experiment to test the relationship between the period of oscillation and the mass on the spring. The PLIC asks students to: a) interpret and evaluate the sample data, b) evaluate the methods, and c) suggest what the group should do next. From a free-response version, we collected common student responses and aggregated them into a closed-response version. The questions include a mix of single-option Likert-scale questions (e.g., How well did the group's method evaluate the model?) and choose-many explanation questions (e.g., Which of the following best support your reasoning?). Students may select no more than three options on each choose-many question.
Responses were collected from over 50 expert physicists at multiple institutions. These responses were used to refine the assessment and to generate the scoring scheme. A student's score, therefore, reflects how closely their responses match those of experts. Details of the scoring scheme will be included on this site soon.
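To make the idea of expert-referenced scoring concrete, here is a minimal Python sketch. It assumes a scheme in which each choose-many option is weighted by the fraction of experts who selected it; the option labels, weights, and normalization here are hypothetical illustrations, not the published PLIC rubric.

```python
# Minimal sketch of expert-consensus scoring (hypothetical weights, not the
# published PLIC rubric). Each choose-many option is weighted by the fraction
# of experts who selected it; a student's question score is the mean weight
# of their selected options, so responses popular with experts score higher.

# Hypothetical expert selection frequencies for one choose-many question.
expert_weights = {
    "A": 0.80,  # chosen by 80% of experts
    "B": 0.55,
    "C": 0.10,
    "D": 0.05,
}

def question_score(selected, weights=expert_weights):
    """Score one choose-many response (students may pick up to 3 options)."""
    if not selected:
        return 0.0
    return sum(weights[option] for option in selected) / len(selected)

print(question_score(["A", "B"]))  # 0.675 -- close to expert consensus
print(question_score(["C", "D"]))  # 0.075 -- far from expert consensus
```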
Administering the PLIC
We have set up an automated system for signing up for and administering the PLIC, based on work [2] by the Lewandowski Group at the University of Colorado Boulder. Any instructor interested in using the PLIC should follow the steps below.
1. To get started, instructors should fill out the Course Information Survey (CIS) available here. The survey includes questions about the course, the instructor's contact information, and when the instructor would like the pre- and post-surveys to close.
2. The instructor is sent an email containing a link to the pre-instruction survey, which they should share with students whenever and however they would like.
3. Four days before the pre-survey is set to close, the instructor is sent an email reminder letting them know how many students have completed the pre-survey. If no one has filled it out at that point, the survey deadline is extended by three days.
4. Two days before the post-survey is set to open, the instructor is sent an email informing them that the post-survey link is on its way.
5. Two weeks before the post-survey is set to close, the instructor is sent an email containing a link to the post-instruction survey, which the instructor should again share with students.
6. Four days before the post-survey is set to close, the instructor is sent an email reminder letting them know how many students have completed the post-survey. If no one has filled it out at that point, the survey deadline is extended by three days (this reminder-and-extension logic is sketched in code after this list).
7. After the post-survey closes, the instructor is sent a report email including the names of students who completed the survey and a summary of their class’s performance.
Bonus: Instructors can change the date that they would like a survey (either pre or post) to close by completing a separate course date change form available here.
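For concreteness, the reminder-and-extension rule used in steps 3 and 6 can be expressed in a few lines of Python. This is a hypothetical sketch of the timing rules described above, not the actual implementation of the automated system; the Survey class and function names are invented for illustration.

```python
from datetime import date, timedelta

# Hypothetical sketch of the reminder/extension rule applied to both the pre-
# and post-surveys: four days before the close date, email the instructor a
# completion count, and push the deadline back three days if no students have
# responded yet. Names (Survey, send_reminder_email) are invented.

class Survey:
    def __init__(self, close_date, num_completed=0):
        self.close_date = close_date
        self.num_completed = num_completed

def send_reminder_email(survey):
    print(f"Reminder: {survey.num_completed} students have completed "
          f"the survey, which closes on {survey.close_date}.")

def check_survey(survey, today):
    # Steps 3 and 6: the reminder goes out four days before the survey closes.
    if today == survey.close_date - timedelta(days=4):
        send_reminder_email(survey)
        # If no one has responded yet, extend the deadline by three days.
        if survey.num_completed == 0:
            survey.close_date += timedelta(days=3)

survey = Survey(close_date=date(2024, 9, 20))
check_survey(survey, today=date(2024, 9, 16))
print(survey.close_date)  # 2024-09-23: extended because no one had responded
```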
Publications
[1] Holmes, N. G., Wieman, C. E., & Bonn, D. A. (2015). Teaching critical thinking. PNAS, 112(36), 11199–11204. https://doi.org/10.1073/pnas.1505329112
[2] Wilcox, B. R., Zwickl, B. M., Hobbs, R. D., Aiken, J. M., Welch, N. M., & Lewandowski, H. J. (2016). Alternative model for administration and analysis of research-based assessments. Physical Review Physics Education Research, 12(1), 010139. https://doi.org/10.1103/PhysRevPhysEducRes.12.010139
Walsh, C., Quinn, K. N., Wieman, C., & Holmes, N. G. (2019). Quantifying critical thinking: Development and validation of the physics lab inventory of critical thinking. Physical Review Physics Education Research, 15(1), 010135. https://doi.org/10.1103/PhysRevPhysEducRes.15.010135
Quinn, K. N., Wieman, C. E., & Holmes, N. G. (2018). Interview Validation of the Physics Lab Inventory of Critical thinking (PLIC). In 2017 Physics Education Research Conference Proceedings (pp. 324–327). American Association of Physics Teachers. https://doi.org/10.1119/perc.2017.pr.076
Holmes, N. G., & Wieman, C. E. (2016). Preliminary development and validation of a diagnostic of critical thinking for introductory physics labs. In 2016 Physics Education Research Conference Proceedings (pp. 156–159). American Association of Physics Teachers. https://doi.org/10.1119/perc.2016.pr.034
Holmes, N. G., & Wieman, C. E. (2015). Assessing modeling in the lab: Uncertainty and measurement. In M. Eblen-Zayas, E. Behringer, & J. Kozminski (Eds.), 2015 Conference on Laboratory Instruction Beyond the First Year of College (pp. 44–47). College Park, MD. https://doi.org/10.1119/bfy.2015.pr.011
Holmes, N. G., Olsen, J., Thomas, J. L., & Wieman, C. E. (2017). Value added or misattributed? A multi-institution study on the educational benefit of labs for reinforcing physics content. Physical Review Physics Education Research, 13(1), 010129. https://doi.org/10.1103/PhysRevPhysEducRes.13.010129