PBL – Quantitative Article Critique


Nguyen et al.'s 2016 paper, Measuring Student Response to Instructional Practices (StRIP) in Traditional and Active Classrooms, measured and compared student response to problem-based learning in an active classroom with student response to teacher-centered instruction in a traditional classroom.  The authors note a meta-analysis of 225 studies that found active learning strategies increased learning and decreased failure rates across all disciplines (Freeman et al., 2014).  However, many engineering instructors were hesitant to adopt active learning for fear of student resistance, and Nguyen et al. wanted to test whether this concern was warranted and why instructors' perceptions seemed to differ from the research findings.  Specifically, they wanted to conduct a study that documented student resistance to the instructional strategy.

The study participants, predominantly male, came from three undergraduate engineering courses at a large public institution in the Southwest.  One hundred fifty-one students enrolled across the three courses, and the students had no knowledge of their section's instructional approach at the time of enrollment.  Sixty-seven students were enrolled in a mechanical engineering course, which served as the traditional classroom.  A second mechanical engineering course of fifty-three students used problem-based learning and served as the first active learning class.  The third course, consisting of thirty-one electrical engineering students, used collaborative problem-based learning and served as the second active learning class.  The researchers chose to focus on multiple classes from a single institution in order to minimize variation resulting from differences in setting.  The experimental design allowed the researchers to examine two aspects of response: the impact of classroom strategies on student response, and the ability of the survey instrument, detailed below, to reflect the differences seen in each classroom.

The undergraduate engineering students who participated in the study were given a researcher-designed survey: the Student Response to Instructional Practices (StRIP) Survey.  The survey questions differentiated between the two active learning courses to ensure accuracy of responses and to provide data for comparing the group and individual approaches to problem-based learning.  The StRIP Survey allowed for the collection of empirical data focused specifically on student response to problem-based learning.  Questions relating to traditional instructional practices were also included to allow for further analysis of their effect on student response.  The StRIP Survey went through six development phases, including item generation, validity testing, and piloting of the protocol.  Despite having multiple sections, students completed the survey in approximately fifteen minutes.  The survey focused on student response to in-class activities, student perception of the instructor, student satisfaction with the course, and students' prior experience with problem-based learning.

Since much of the survey consisted of Likert-type items, the researchers used non-parametric tests to analyze the responses.  To determine whether there were statistically significant differences in student responses across the three courses, the researchers conducted a Kruskal-Wallis test.  Post-hoc Nemenyi testing with a Tukey distribution was then used to compare each individual survey item and identify statistical differences among the three courses.  The results suggest that students responded positively to problem-based learning and did not exhibit signs of resistance.  However, it is also worth noting that students responded positively to all three instructional environments and that the survey, while able to differentiate between the individual and group problem-based learning classrooms, was not able to strongly differentiate the traditional course from the other two.
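The omnibus test described above can be sketched in a few lines.  The Likert-scale ratings (1–5) below are invented for three hypothetical course sections, purely to illustrate how the procedure works, and are not the study's data:

```python
# A minimal sketch of a Kruskal-Wallis test on Likert-type responses.
# The three samples are invented ratings for illustration only.
from scipy.stats import kruskal

traditional = [3, 4, 3, 2, 4, 3, 3, 2, 4, 3]
pbl_individual = [4, 5, 4, 4, 3, 5, 4, 4, 5, 4]
pbl_collaborative = [4, 4, 5, 3, 4, 5, 4, 5, 4, 4]

# Omnibus non-parametric test: do the three groups share a distribution?
h_stat, p_value = kruskal(traditional, pbl_individual, pbl_collaborative)
print(f"H = {h_stat:.2f}, p = {p_value:.4f}")
```

A significant result from the omnibus test would then be followed by a post-hoc pairwise comparison (such as the Nemenyi test the authors used) to locate which pairs of courses differ.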

Critical Analysis

The research was limited by its small sample size and its narrow focus on engineering courses.  For example, engineering is a discipline that often benefits from hands-on and collaborative work, which could have created inherent motivation for students to engage in active learning.

The post-hoc tests did not show the traditional course as having significantly more instructor involvement than the other courses, suggesting that instructor involvement remained a core component across all three classes.  However, follow-up discussion with the traditional course's instructor suggested that he was more engaging than a standard lecturer, which may have skewed the results.  Further research would be needed to compare a traditional course with an active learning classroom while accounting for educator experience.

Further, the present StRIP Survey does not distinguish among different modalities of student response to learning.  The researchers have considered delineating four motivation-related characteristics: participation, value, emotion, and evaluation.  Doing so would allow a more holistic understanding of how instructional practices and strategies affect student engagement.

This study would have benefitted from regression modeling, which would have allowed the researchers to examine factors influencing student responses to the survey questions.  For example, students' preparation for learning might easily have affected their readiness to engage in problem-based activities.  In addition, the problem-based classrooms had fewer students than the traditional classroom, with the collaborative problem-based class less than half its size.  While it would not be surprising for a smaller class to show stronger student engagement, Nguyen et al. did not find major differences in engagement across the three classrooms.  Even if problem-based learning were found to offer significant benefits over traditional instruction, the size of those benefits would need to be weighed against any increase in cost per student.
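The regression approach suggested above can be sketched with ordinary least squares.  Every value below (preparation scores, engagement ratings, class sizes as a covariate, and the effect sizes baked into the synthetic data) is an assumption made for illustration, not something drawn from the study:

```python
# A minimal sketch of regression modeling for survey responses:
# ordinary least squares predicting an engagement rating from
# hypothetical covariates (prior preparation, class size).
# All data here are synthetic, for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 151  # total study enrollment
preparation = rng.uniform(1, 5, n)        # self-reported prior preparation
class_size = rng.choice([67, 53, 31], n)  # the three section sizes
# Synthetic response: engagement rises with preparation, plus noise.
engagement = 2.0 + 0.5 * preparation + rng.normal(0, 0.3, n)

# Design matrix with an intercept column, then least-squares fit.
X = np.column_stack([np.ones(n), preparation, class_size])
coef, *_ = np.linalg.lstsq(X, engagement, rcond=None)
print(f"intercept={coef[0]:.2f}, prep={coef[1]:.2f}, size={coef[2]:.4f}")
```

Because the synthetic engagement depends on preparation but not on class size, the fitted preparation coefficient lands near 0.5 while the class-size coefficient stays near zero, showing how such a model could separate the two influences the critique raises.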

Despite the need for further study, this research suggests that student resistance to problem-based learning is not a significant obstacle, even in engineering, a field in which some instructors have expressed concern that it would be.  Paired with research showing that active learning decreases student failure rates, educators should seriously consider incorporating active learning techniques into their courses and need not worry that student resistance will present an obstacle.


Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410-8415.

Nguyen, K., Borrego, M., Finelli, C., Shekhar, P., Demonbrun, R., Henderson, C., & Waters, C. (2016). Measuring Student Response to Instructional Practices (StRIP) in Traditional and Active Classrooms. 2016 ASEE Annual Conference & Exposition Proceedings. doi:10.18260/p.25696


