Cal Poly Pomona A/B Tests Active Learning vs. Lecture Course Formats

When a sizeable percentage of students are failing a critical, upper-level course, what can the university do?

Many solutions could reduce academic rigor, ignore compliance issues, or eliminate core content. But that doesn’t have to be the case.

At Cal State Polytechnic University, Pomona, a primarily undergraduate institution with a heavy emphasis on teaching and hands-on education, educators took a different tack.

Dr. Paul Nissenson, Associate Professor at Cal State Polytechnic University, Pomona, joined the Enrollment Growth University podcast to talk about testing a move away from a traditional lecture format in an upper-level engineering course that had become a roadblock for many students.

A/B Testing an Upper-Level Engineering Course Design

Engineering students in their third year at Cal Poly Pomona have to take fluid mechanics. Historically, this course has proven to be a huge bottleneck for the department. Typically, around 450 students enroll each year, and about a third leave with a D or an F.

“This is a common problem,” Paul told us, “not just in our department, but in mechanical engineering departments across the country.”

Traditionally, this was a lecture course. The instructor stood at the whiteboard going over new concepts, derivations, and example problems when there was time. Homework consisted of reading assignments and end-of-chapter problems, and there’d be some combination of quizzes, midterms, and a final exam.

“It would take at least a few days to get them back to students,” Paul said, “sometimes even up to a week. And by then they’ve already kind of moved on to the next topic. It also really wasn’t enough time for any kind of meaningful student teacher interaction in the classroom.”

With so much content to cover, students were not engaging with the textbook, either. In fact, some were not even buying the textbook, and the faculty had no way of tracking how learners were engaging with the readings. Most troubling, students had access to any solution manual they wanted, as they do in most engineering and math courses.

“So, when we assigned problems from the textbook,” Paul said, “many of them were just going right to the solution manual. They can find a pirated version online and just blindly copy.”

The majority of students passed the course, but clearly, change was needed. Paul and his team set about finding the right solution.

Testing Different Active Learning Activities

“This definitely was a team effort,” Paul said.

Besides Paul, the school had four other mechanical engineering instructors who helped create video content for the course, and the team partnered with the psychology and sociology departments to do the assessment.

“Over a couple of years, we experimented with a bunch of different tools and pedagogical strategies and getting data at every single step,” Paul told us, “and we started moving from this traditional lecture style of teaching to more of a flipped classroom model.”

The class met twice a week for 75 minutes per session, and students would watch around 30 minutes of video content before the first session in order to be prepared when they came to class.

“During the first meeting, we would give them a short concept quiz to make sure they actually watched the videos,” Paul said. “That was really important we found out. Otherwise, maybe half weren’t going to even watch the videos.”

Paul spent the rest of the class time just answering questions, going over as many example problems as possible, and doing demonstrations in the classroom — all things he didn’t have time for in a traditional setting.

In the second meeting, students would take a longer, calculation-based quiz, more typical of an engineering or STEM course exam. Then, they would spend about half the time engaged in an activity they called the team battle.

“In the team battle,” Paul explained, “I grouped the students kind of at random into teams of four. And these students would then be given various problems to complete.”

The teams would compete against each other to complete the problems as fast as possible. The winners received some combination of a small amount of extra credit and candy.
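The grouping step is simple to automate. Here is a minimal sketch of randomly splitting a class roster into teams of four, as in the team battles; the `make_teams` helper and the roster names are illustrative, not anything described in the interview:

```python
import random

def make_teams(roster, team_size=4, rng=None):
    """Randomly group a roster into teams of `team_size`.
    Any remainder forms one smaller final team."""
    rng = rng or random.Random()
    shuffled = roster[:]   # copy so the original roster is untouched
    rng.shuffle(shuffled)
    return [shuffled[i:i + team_size]
            for i in range(0, len(shuffled), team_size)]

# Example: 10 students -> two teams of four plus one team of two
students = [f"student_{n}" for n in range(10)]
teams = make_teams(students, team_size=4, rng=random.Random(42))
```

A fixed seed (as above) makes the grouping reproducible for a given class meeting; omitting it gives a fresh shuffle each time.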

Results of the Different Course Designs

Did the third of students getting Ds and Fs shrink?

“We ended up going from about a third of the students getting a D or F to around 10%,” Paul said. “But it wasn’t just about the Ds and Fs. We also had surveys and focus groups which were organized by the psychology and sociology partners.”
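A drop from roughly a third to roughly 10% is large enough that it would register as statistically significant under almost any plausible enrollment split. As a back-of-the-envelope check, here is a two-proportion z-test sketch; the per-condition counts below are hypothetical stand-ins (the article reports about 450 students per year and 10 sections of data, not exact counts per format):

```python
import math

# Hypothetical cohort sizes for illustration only -- not data from the study
n_lecture, n_flipped = 225, 225
df_lecture = round(n_lecture / 3)      # ~1/3 earned a D or F in lecture sections
df_flipped = round(n_flipped * 0.10)   # ~10% did in the flipped sections

# Two-proportion z-test on the D/F rates
p1, p2 = df_lecture / n_lecture, df_flipped / n_flipped
p_pool = (df_lecture + df_flipped) / (n_lecture + n_flipped)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_lecture + 1 / n_flipped))
z = (p1 - p2) / se

# One-sided p-value from the normal CDF
p_value = 0.5 * (1 - math.erf(z / math.sqrt(2)))
```

Under these assumed counts the z statistic lands well past the conventional 1.96 cutoff, so the improvement would be very unlikely to be noise; with the study's real section-level counts the arithmetic would differ but the conclusion almost certainly would not.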

The students in the flipped sections were having a much better, more positive experience than the students in the traditional lecture sections, especially when it came to the team battles. The students really enjoyed that in-class activity.

Maintaining Academic Rigor

“We spent two years on this project and we got 10 sections worth of data,” Paul said. He and his team also experimented with different tools and data. Ultimately, he gave similarly difficult final exams and discovered that students in the flipped classrooms tended to perform better on the assessment.

“If we use the final exam as kind of a measure of how much students get from the entire course, they actually seem to be learning more in the flip sections,” Paul explained.

The students are getting quizzed more frequently, too. Instead of getting something like four quizzes over a 10-week quarter, they’re now getting a quiz every single time they come to class, so they have to stay on top of things.

Next-Steps Advice for Institutions Looking to Test Course Design Formats

If you want to improve your pass rate, start by defining your needs. Then figure out if your potential strategy can address those needs.

“I had those four items that I wanted to address,” Paul told us, “the feedback, more frequent feedback; increasing student teacher interaction in the classroom; having students read the textbook more; and trying to reduce that copying from the solution manual.”

The flipped classroom strategy he employed did exactly what he needed in each area. He could even track how often students read the textbook using the McGraw Hill Connect platform. This platform even had a built-in tool to randomize problems so each student got a unique problem set, eliminating the temptation to copy solutions from the manual.

“I’d also say that if you’re ever going to employ any new technology, make sure that you test drive it yourself first,” Paul said. “I have seen too many times where instructors just … they have a problem but they just throw a technology at it. They don’t understand all the limitations of that technology, and it can be a frustrating experience for the students.”

One more recommendation: make sure that your department chair knows what’s going on and has your back in case something goes wrong.

Paul has made available videos developed for the course redesign along with two papers describing that redesign: Year 1 – Nissenson et al. (2017) and Year 2 – Wachs et al. (2018).

This post is based on a podcast interview with Dr. Paul Nissenson from Cal State Polytechnic University, Pomona. To hear this episode, and many more like it, you can subscribe to Enrollment Growth University.

If you don’t use iTunes, you can listen to every episode here.