Keywords:

  • mathematics instruction;
  • model development sequences;
  • summer bridge programs

Abstract

Background

Since students' success in their first-semester college mathematics course is a key factor in their success in engineering, many summer bridge programs for underrepresented students focus on their preparation in mathematics. However, research on the design and efficacy of such programs is limited. We examine the design and effectiveness of a modeling-based approach to mathematics for entering freshman engineering students.

Purpose

The study addresses two questions: Does a modeling-based mathematics course in a bridge program positively affect students' performance in their first-semester college mathematics course? To what extent does a sequence of modeling tasks support the development of students' concepts of average rates of change?

Design/Method

This quasi-experimental study compared two cohorts of bridge program students over six years to examine the effectiveness of a modeling-based mathematics course on first-semester mathematics course grades. Pre- and post-tests measured changes in students' concepts of average rates of change.

Results

The modeling-based mathematics course closed the previous letter grade gap between bridge program participants and non-participants in the first mathematics course. We also found significant course grade gains for students who took the modeling-based mathematics course compared with a previous cohort who took a traditional summer mathematics course.

Conclusions

These results suggest that the modeling-based mathematics course, with its focus on the development of engineering students' abilities to model changing phenomena, was effective in improving students' concepts of average rate of change and their first-semester mathematics course grade.


Introduction

Research over the past two decades on the experiences of undergraduate women and underrepresented minorities in engineering programs has pointed to two important factors for retaining students: intellectual engagement with the discipline and social support leading to connections with peers, faculty, and engineering professionals (Brainard & Carlin, 1998; May & Chubin, 2003; Micomonaco & Stricklen, 2010; Vogt, 2008). These findings have led institutions to design summer bridge programs for women and underrepresented minorities as they transition from high school to college. These bridge programs are intended to support those students who might otherwise be underprepared for the rigors of first-year engineering courses and who would benefit from acclimation to the social and academic environment of the university. Many summer bridge programs have focused on improving students' first mathematics course placement (e.g., Reisel, Jablonski, Hosseini, & Munson, 2012), given the importance of first-year mathematics courses for success in engineering (Ohland, Yuhasz, & Sill, 2004). Yet research on the design and efficacy of summer programs is limited (Citty & Lindner, 2012; Papadopoulos & Reisel, 2008; Strayhorn, 2011).

In this article, we describe our research findings on the design and effectiveness of a modeling-based mathematics course that was offered as part of a summer bridge program for high school students transitioning to college-level engineering. Unlike the goal of other summer mathematics course offerings, our primary goal in designing the modeling-based mathematics course was not to remediate the weaknesses of students' mathematical preparation in order to improve their mathematics placement in their first semester (e.g., Alkhasawneh & Hobson, 2010; Kowalchuk, Green, Ricks, & Nicklow, 2010; Yue, 2011). Rather, we sought to prepare students for success in their first mathematics course, whether that placement was pre-calculus or calculus. Therefore, we designed a modeling-based mathematics course to engage students in learning about average rates of change, a critical mathematical concept for the study of calculus, through a sequence of modeling tasks in a collaborative group setting. Our study investigated the potential effectiveness of this approach for engineering students by examining the following research questions: Does a modeling-based mathematics course in a summer bridge program positively affect students' performance in their first-semester mathematics course? To what extent does the sequence of modeling tasks support the development of students' concepts of average rates of change?

Background and Motivation

Summer Bridge Programs

Research on the effectiveness of summer bridge programs has focused on the impact of the program on student retention in college engineering programs by examining factors such as general academic skills, peer interactions, students' sense of self-efficacy, and social skills (Callahan, 2008; Murphy, Gaughan, Hume, & Moore, 2010; Strayhorn, 2011). While these interacting factors have been shown to contribute to retention, students' success in their first-year mathematics courses has also been shown to be a critical factor in their success in engineering (Budny, LeBold, & Bjedov, 1998; Leuwerke, Robbins, Sawyer, & Hovland, 2004; Ohland, Yuhasz, & Sill, 2004; Tolley, Blat, McDaniel, Blackmon, & Royster, 2012; Veenstra, Dey, & Herrin, 2008). Two common measures of efficacy for bridge programs are improvements in students' first mathematics course placement and students' subsequent success in their first calculus course. Yet as Papadopoulos and Reisel (2008) have pointed out in their meta-analysis of bridge programs, little research examines the mathematics performance data of students who participate in such programs, and the data about students' subsequent success in their Calculus I course are inconclusive.

In a recent study, Reisel, Jablonski, Hosseini, and Munson (2012) found that an on-campus program was more effective than an online program in improving students' mathematics placement results. A study by Gleason et al. (2010) found that while their on-campus bridge program did improve students' mathematics placement, the students' subsequent performance in their first mathematics course was worse than that of the students who did not participate, except in calculus. A subsequent study by Reisel, Jablonski, Rineck, Munson, and Hosseini (2012), while hindered by a small sample size, found that bridge program students tended to do worse in some first-semester courses than did their peers. These findings suggest that we should be cautious when focusing on improving students' mathematics placements over the short time frame of a summer program. Such programs may improve students' placement examination scores without addressing the foundational understandings and abilities needed for success in first-semester mathematics courses. Prior research has also pointed to the difficulties in establishing control groups for studying the effectiveness of bridge programs (Papadopoulos & Reisel, 2008).

Motivation

We examined data from three years of an engineering summer bridge program where students typically took a six-week version of a traditional college algebra or pre-calculus course, leading to a first-semester mathematics course placement in either pre-calculus or Calculus I. As in the previously cited research, we found that the bridge program participants, after taking these courses, performed about a letter grade lower in their first mathematics course than did students who had not participated in the program. Hence, we redesigned the bridge program mathematics course (as described below) to close this letter grade gap, while the other components of the program remained substantially the same. The redesigned mathematics course was intended to improve students' success in their first mathematics course, not their first-semester mathematics course placement. The motivation for this study was to examine the effectiveness of the redesigned bridge program mathematics course in improving students' success in their first mathematics course. The data from the earlier three years provided a control group for comparing students' performance. This study addresses the need for research on the potential benefits of summer bridge programs by providing evidence on their effectiveness and by explicating the specific design of the mathematics course that may have contributed to that effectiveness.

Theoretical Background

Modeling-Based Approaches to Instructional Design

Modeling approaches to the teaching and learning of science, mathematics, and engineering encompass a wide range of theoretical and pragmatic perspectives (Kaiser & Sriraman, 2006). Modeling approaches grounded in the “contextual modelling” perspective draw on the design of activities that motivate students to develop the mathematics needed to make sense of meaningful situations (Kaiser & Sriraman, 2006). Much of this work draws on model eliciting activities (MEAs) developed by Lesh and colleagues (Lesh & Doerr, 2003) and recently applied to engineering education (Bursic, Shuman, & Besterfield-Sacre, 2011; Hamilton, Besterfield-Sacre, Olds, & Siewiorek, 2010; Verleger & Diefes-Dux, 2008; Zawojewski, Diefes-Dux, & Bowman, 2008). Model eliciting activities confront the student with the need to develop a model that can be used to describe, explain, or predict the behavior of a realistic situation. Such MEAs encourage teams of students to engage in an iterative process where they express, test, and refine their ways of thinking about realistic situations. Solutions to MEAs go beyond what is required of ordinary textbook problems in that the solutions generally involve creating a process or procedure that can be shared with others and re-used in similar situations. MEAs are designed to elicit a generalizable model that reveals the underlying mathematical structure of the problem situation so that the model can then be applied in a range of contexts.

To date, much of the research on MEAs in engineering education has focused on the effectiveness of a single modeling task in improving student outcomes in mechanics, dynamics, and thermal science, and in meeting ABET professional standards (Bursic, Shuman, & Besterfield-Sacre, 2011; Diefes-Dux, Hjalmarson, Zawojewski, & Bowman, 2006; Kean, Miller, Self, Moore, Olds, & Hamilton, 2008; Ridgely & Self, 2011; Self & Widmann, 2010). However, a single MEA in isolation is seldom sufficient for a student to develop a generalized model that can be used and re-used in a range of contexts (Doerr & English, 2003; Lesh, Cramer, Doerr, Post, & Zawojewski, 2003; Lesh, Doerr, Carmona, & Hjalmarson, 2003). To achieve this goal, students need to engage in a sequence of model development activities (Hjalmarson, Diefes-Dux, Bowman, & Zawojewski, 2006; Hjalmarson, Diefes-Dux, & Moore, 2008). Model development sequences begin with an MEA and are followed by model exploration activities and model application activities. Model exploration activities focus on the underlying structure of the model and on the strengths of various representations and ways of using them productively. Model application activities engage students in applying their models to new contexts; this often results in students adapting their model, extending representations, deepening their understanding, and refining their language for describing and explaining phenomena. Each component of a model development sequence engages students in multiple cycles of descriptions, interpretations, conjectures, and explanations that are refined while the students interact with one another.

In this study, we designed a model development sequence (described in detail in the next two sections) that focused on the development of student models of the average rate of change. An understanding of this model (or conceptual system) is foundational for student learning in pre-calculus and calculus (Carlson, Jacobs, Coe, Larsen, & Hsu, 2002; Oehrtman, Carlson, & Thompson, 2008) and for student abilities to analyze changing physical phenomena in introductory engineering courses in dynamics, fluid mechanics, heat and energy transfer, and electric circuits. Furthermore, developing student abilities to create and use mathematical models is an important goal of engineering education (Zawojewski, Diefes-Dux, & Bowman, 2008).
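
As a point of reference, the concept targeted throughout the model development sequence is the average rate of change of a function f over an interval, that is, the slope of the secant line through two points on the graph of f:

    \[
    \text{average rate of change of } f \text{ over } [a, b]
      \;=\; \frac{\Delta f}{\Delta x}
      \;=\; \frac{f(b) - f(a)}{b - a}, \qquad a \neq b.
    \]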

Design of a Modeling-Based Mathematics Course

The overall aim of the course developed for the bridge program was to prepare students for subsequent success in pre-calculus and calculus. Therefore, we designed the course around four interrelated goals. The first goal was to develop student abilities to interpret and quantify change. The second goal was to develop student problem-solving skills. We wanted to improve student abilities to interpret problem situations, to persist in problem solving, and to do so more independently than in high school. Our third goal was to develop students' communication skills and their abilities to work in collaborative groups. We wanted students to gain experiences and improve in their abilities to read and write about mathematical problems and their solutions, while collaborating with their peers. The fourth goal was to develop and enhance students' algebra skills, necessary for success in this and in their next mathematics course.

To accomplish these goals, we designed the course around a sequence of modeling activities that would engage students in solving problems, working in small groups, and communicating their thinking; we drew on what Smith, Sheppard, Johnson, and Johnson (2005) refer to as the pedagogies of engagement. The course was organized to promote deep student understanding of a central mathematical idea, the average rate of change, through the analysis and interpretation of the behavior of linear and nonlinear phenomena. We designed and implemented a sequence of model development tasks to engage students in creating and interpreting models of physical phenomena that change.

Model Development Sequence

Model development sequences are structurally related tasks; a sequence begins with an MEA and is followed by model exploration and model application activities. These tasks are not step-by-step procedures (as too often found in laboratory projects) but rather are open-ended tasks that encourage students to express their own ideas about a realistic situation and then explore and apply those ideas in other contexts. Tasks are accompanied by instructor-led discussions, student presentations, and summaries to focus attention on the structural similarities among the tasks and on the use of representations across the tasks. The modeling tasks included working with motion detectors to analyze linear and quadratic motion and the associated rates of change, working with computer simulations to interpret velocity and position graphs, using light sensors to model the intensity of light with respect to the distance from the light source and to analyze the rate at which the intensity changes at varying distances from the light source, building a simple resistor-capacitor circuit to charge a capacitor, and creating a mathematical model to analyze the change in voltage across the capacitor as it discharges. The sequence began with an MEA that examined constant and nonconstant velocity for motion along a straight path. Using a motion detector and their own bodily motion, students created graphs of comparative situations of faster and slower constant speed, change of speed, and change of direction. The students also investigated position graphs (such as a vertical line or a curve with a cusp) where the motion was not physically possible.

After the model eliciting activity, the students engaged in several model exploration activities, which were designed to help students think about the underlying structure of the system (or model) and the representations of that structure. Research in physics education (Hestenes, 1992, 2010) has similarly emphasized the need for instruction to focus on the structure of a system (that is, the set of relationships among the objects in a system) and its representations. Representational fluency is recognized as important in supporting the development of students' understanding of scientific content and higher-order engineering principles (Moore, Miller, Lesh, Stohlmann, & Kim, 2013; Streveler, Litzinger, Miller, & Steif, 2008).

The model exploration activities used a computer simulation environment called SimCalc Mathworlds (Kaput & Roschelle, 1996). This environment reversed the representational space of the model eliciting activity where bodily motion produced a position graph. In this environment, velocity graphs produced the motion of an animated, frog-like character, as shown in Figure 1. The students explored this representational space and its structure by creating velocity graphs based on written descriptions of motion; these velocity graphs generated the motion of the character in the simulation environment.

Figure 1. Creating and using a velocity graph to determine a position graph.

From this cybernetic motion, the students created position graphs, and thus developed their understanding of how the position graph could be constructed by calculating the area between the velocity graph and the x-axis. The students completed multiple tasks in which they compared the relative motion of two characters, found an equivalent constant velocity for a character moving at a changing velocity, and examined how changes in initial position affected the velocity graph. In exploring the linked relationship between the velocity and position graphs, students reasoned both about the position of characters solely from information about their velocity and about their velocity solely from information about their position. The model exploration activity thus provided an opportunity for students to develop their abilities to interpret and create written descriptions of motion along a straight path with its associated average rates of change and to develop their understandings of the structure and representations of this motion.
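
To make the underlying computation concrete, the sketch below (in Python) shows how a position graph can be recovered from a piecewise-constant velocity graph by accumulating the signed area between the velocity graph and the time axis, and how an equivalent constant velocity can then be found; the velocity segments and initial position are hypothetical illustrations, not taken from the actual SimCalc tasks.

    # Sketch: recovering position from a piecewise-constant velocity graph by
    # accumulating signed area, in the spirit of the model exploration tasks.
    # The segments and initial position below are hypothetical examples.
    segments = [  # (duration in seconds, velocity in meters per second)
        (3, 2.0),   # 3 s moving forward at 2 m/s
        (2, 0.0),   # 2 s at rest
        (4, -1.0),  # 4 s moving backward at 1 m/s
    ]
    initial_position = 5.0  # starting position in meters (not at the origin)

    time, position = 0.0, initial_position
    print(f"t = {time:4.1f} s, x = {position:5.1f} m")
    for duration, velocity in segments:
        position += velocity * duration  # signed area under the velocity graph
        time += duration
        print(f"t = {time:4.1f} s, x = {position:5.1f} m")

    # The "equivalent constant velocity" over the whole interval is the average
    # rate of change of position: net change in position divided by total time.
    average_velocity = (position - initial_position) / time
    print(f"equivalent constant velocity = {average_velocity:.2f} m/s")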

After the model eliciting activity and the model exploration activity, the students engaged in two model application activities, where students applied their models to new problem situations. These activities were designed to lead students to a generalized understanding of average rate of change and to develop their abilities to communicate useful descriptions and explanations about changing phenomena. In the first model application activity, students worked in small groups to investigate the relationship between the intensity of light and the distance from a light source. Students collected light intensity data using a point source of light, a light intensity probe, and their graphing calculators. Using these data, students analyzed the average rates of change of the intensity at varying distances from the light source and described the change in the average rates of change as the distance from the light source increased. Since one of our goals was to develop students' abilities to communicate their reasoning, students worked in pairs and wrote reports where they represented their work with equations, tables, or graphs, summarized their findings, and explained their results.
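
A minimal sketch of the kind of analysis this first model application activity called for is given below; the intensity readings are synthetic inverse-square values in arbitrary units, not the students' probe data.

    # Sketch: average rates of change of light intensity with respect to distance.
    # Synthetic inverse-square data stand in for the probe measurements.
    distances = [0.2, 0.4, 0.6, 0.8, 1.0]              # meters from the point source
    intensities = [100.0 / d ** 2 for d in distances]  # hypothetical readings (arbitrary units)

    for (d1, i1), (d2, i2) in zip(zip(distances, intensities),
                                  zip(distances[1:], intensities[1:])):
        avg_rate = (i2 - i1) / (d2 - d1)  # average rate of change on [d1, d2]
        print(f"[{d1:.1f} m, {d2:.1f} m]: {avg_rate:9.1f} units per meter")

    # The average rates are negative (intensity decreases with distance), and their
    # magnitudes shrink as the distance grows, which is the pattern the student
    # reports were asked to describe and explain.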

The second model application activity was an investigation of the rate at which a fully charged capacitor in a simple resistor-capacitor circuit discharged with respect to time. Working in collaborative groups, the students built the circuits, charged the capacitor, and used their graphing calculator and a voltage probe to measure the voltage drop across the capacitor as it discharged. Using a set of resistors and capacitors, students developed a model to answer three questions: How does increasing the resistance affect the rate at which a capacitor discharges? How do the rates at which the capacitor is discharging compare at the beginning, middle, and end of the total time interval, that is, how do the average rates of change of the function change as time increases? How does increasing the capacitance affect the rate at which a capacitor discharges?

Students communicated their results in class discussions and in a written report. These two activities focused the students' attention simultaneously on the quantity that was measured and on how that quantity was changing with respect to some other quantity (i.e., distance or time). A coordinated understanding of these two measurements is at the crux of representing and reasoning about changing phenomena (Oehrtman et al., 2008).
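
Although the students built their models empirically from the measured voltage data, the standard idealized model for a capacitor discharging through a resistor has the exponential form below; it is shown here only to indicate how the time constant RC connects the circuit parameters to the average rates of change the students were asked to compare.

    \[
    V(t) = V_0 \, e^{-t/(RC)}, \qquad
    \left.\frac{\Delta V}{\Delta t}\right|_{[t_1,\,t_2]}
      = \frac{V(t_2) - V(t_1)}{t_2 - t_1}
      = \frac{V_0\left(e^{-t_2/(RC)} - e^{-t_1/(RC)}\right)}{t_2 - t_1}.
    \]

Increasing R or C increases the time constant RC, so the voltage decays more slowly and the magnitudes of the average rates of change over a given time interval are smaller.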

Since a primary goal of the course was to prepare students for subsequent success in their first-year mathematics courses, we explicitly focused on three topics found by instructors to be common sources of student difficulties in our pre-calculus and calculus courses: rational expressions and complex fractions, exponential expressions and equations, and logarithmic expressions and equations. These topics were directly related to the mathematical content in the model development sequence, which provided students with the opportunity to use their algebra skills in a meaningful context. We provided additional skills practice through the use of an online homework system.

Implementation

The summer bridge program was a six-week residential program at a mid-sized university that provided the entering students with an opportunity to become familiar with the academic, social, and cultural life at the university. Over the last six years, the enrollment in the program has been 28% women and 61% underrepresented minorities. We implemented the modeling-based mathematics course during the last three years of the program (2010 to 2012). There were 85 students, with 30% (26) women and 61% (52) underrepresented minorities. There was a wide range of student backgrounds and experience in mathematics. Over half of the students (49) had studied calculus in high school, and over half of these 49 students (28) had taken an Advanced Placement calculus course.

Participants worked in small groups to complete the model eliciting activities and model application activities. The model exploration activities were done individually at a computer; however, the participants were encouraged to discuss their work with each other. Throughout the model development sequence, students presented the results of the work produced during the modeling tasks in whole-class discussions. Class discussion following the model exploration tasks focused on the structural features of the model and on the relationships among different representational systems. Students worked in pairs to complete final reports for each of the model application tasks. All participants completed all of the tasks in the model development sequence described above.

Research Design and Methodology

This quasi-experimental study was designed to measure the effectiveness of a bridge program mathematics course that used a modeling approach to integrate mathematics with engineering applications. We addressed two research questions: Did the modeling-based mathematics course positively affect students' performance in their first-semester mathematics course? To what extent did the sequence of modeling tasks support the development of students' concepts of average rate of change?

To address the first research question, we made two related comparisons. First, we compared the first-semester mathematics course grades of the students in the modeling-based course (the 2010–2012 cohort) with the course grades of the students who had taken a traditional course (the 2007–2009 cohort). Second, we compared the first-semester mathematics grades of the participants in the modeling-based course (the 2010–2012 cohort) with the mathematics grades of those students who did not participate in the bridge program. In other words, did the modeling-based mathematics course close the letter grade gap between the participants in the bridge program and the non-participants? We found statistically significant differences in SAT mathematics scores (a widely used college admissions test in the United States) between the participants in the 2007–2009 and 2010–2012 cohorts and between the participants and non-participants in both cohorts. Since SAT mathematics scores correlate with academic performance in college mathematics courses (Barnett, Sonnert, & Sadler, 2012; Kobrin, Kim, & Sackett, 2012; Mattern, Patterson, & Kobrin, 2012; Mesa, Jaquette, & Finelli, 2009), we compared the mathematics course grades between groups using analyses of covariance with SAT mathematics score as a covariate.
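
A minimal sketch of this kind of analysis of covariance, expressed as an ordinary least squares model with a group indicator and the SAT mathematics score as predictors, is shown below; the data file and column names (grade, group, sat_math) are hypothetical placeholders, not the study's data set.

    # Sketch of an ANCOVA comparing first-semester mathematics grades between two
    # groups with SAT mathematics score as a covariate (hypothetical data file).
    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    df = pd.read_csv("math_grades.csv")  # columns: grade, group, sat_math

    model = ols("grade ~ C(group) + sat_math", data=df).fit()
    print(sm.stats.anova_lm(model, typ=2))  # F tests for group and covariate

    # Adjusted group means: predict each group's grade at the overall mean SAT score.
    grid = pd.DataFrame({"group": df["group"].unique(),
                         "sat_math": df["sat_math"].mean()})
    print(grid.assign(adjusted_mean=model.predict(grid)))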

To address the second research question, we developed an instrument (the Rate of Change Concept Inventory) to measure the improvement in students' understanding of average rate of change. This instrument (described more fully below) was administered as a pre- and post-test to all participants. A quantitative analysis of paired t-tests was followed by a qualitative analysis of those items on which there was greater than 30% improvement in order to gain insight into the relationship between these items and the model development sequence that was the basis for the course design.

Participants

Upon completion of the bridge program, most students were placed in a pre-calculus or Calculus I course taken with other engineering students. Those students who earned a grade of 4 or 5 on the Advanced Placement calculus examination were placed into a second calculus course (Calculus II). In the fall of 2008, we introduced a new pre-calculus course that combined pre-calculus with college algebra for those students who were not prepared to take the standard pre-calculus course for engineers. The number of bridge program students who placed into pre-calculus with algebra or Calculus II from both cohorts was too small for meaningful statistical analysis. Hence, our comparisons focus on the bridge program participants and non-participants who placed into pre-calculus and Calculus I. The overall demographics for these students are summarized in Table 1.

Table 1. Participants and Non-participants in the Bridge Program
                      2007–2009 cohort                2010–2012 cohort
                      Non-participants  Participants  Non-participants  Participants
Female                180               24            160               22
Male                  504               55            612               43
Asian                 83                10            138               11
African American      52                33            47                21
Hispanic              52                17            36                13
Native American       5                 0             4                 0
Multiple ethnicities  19                6             50                4
White                 384               9             454               13
Unknown               89                4             43                3

Since bridge programs are intended for students who might be underprepared for the academic rigor of first-year engineering courses, the bridge program participants in this study, not surprisingly, had mean SAT mathematics scores that were significantly lower than the scores of the non-participants for both the 2007–2009 and the 2010–2012 cohorts. The comparisons of mean SAT mathematics scores for the students in pre-calculus and Calculus I are summarized in Table 2.

Table 2. Mean SAT Mathematics Scores within Course by Bridge Program Participation
                    Non-participants        Participants
                    n     M (SD)            n    M (SD)            t        df
2007–2009 cohort
Pre-calculus        190   585.2 (53.0)      37   500.8 (53.9)      −8.84a   225
Calculus I          455   639.6 (53.0)      37   562.7 (70.5)      −8.26a   490
2010–2012 cohort
Pre-calculus        122   583.0 (48.3)      13   534.6 (33.8)      −3.51a   133
Calculus I          571   645.6 (56.9)      48   588.8 (47.8)      −6.73a   617

Note. a p < .001.

The SAT mathematics scores for the 2007–2009 cohort of engineering students who did not participate in the summer bridge program were not significantly different from the SAT mathematics scores of the 2010–2012 cohort of non-participating students.

First Mathematics Course Grades

For the three years prior to the implementation of the modeling-based course (2007–2009), regardless of whether the bridge program students were placed in pre-calculus or Calculus I, their achievement in their first mathematics course lagged about a letter grade below that of the non-participants in the bridge program, as shown in Table 3. Course grades are reported on a four-point scale, where an A = 4 and an F = 0. These grades are qualified by + or − and correspondingly incremented or decremented by 0.3334 or 0.3333; there are no A+, D+, D−, or F+ grades. The statistically significant (p < 0.001) letter grade gap between the participants and the non-participants was the motivation for the redesign of the bridge program mathematics course. A chi-square test was performed to determine whether the grade distributions significantly differed between the participants and non-participants in each course. We found significant differences for the pre-calculus course, χ2 (4, 236) = 27.63, p < 0.001, and the Calculus I course, χ2 (4, 527) = 55.91, p < 0.001.

Table 3. First Mathematics Course Grades for the 2007–2009 Cohort by Bridge Program Participation
                    Non-participants        Participants
                    n     M (SD)            n    M (SD)           t        df
Pre-calculus        198   2.64 (1.11)       38   1.75 (1.27)      −4.42a   234
Calculus I          486   2.60 (1.12)       41   1.28 (1.20)      −7.21a   525

Note. We performed analyses treating this variable as ordinal and as interval, and the results were substantially identical.
a p < .001.
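
The grade-distribution comparisons reported here can be carried out with a standard chi-square test of independence on the counts of letter grades in each group; the sketch below uses hypothetical counts, not the study's data.

    # Sketch: chi-square test comparing grade distributions (A-F) between
    # non-participants and participants, with hypothetical placeholder counts.
    from scipy.stats import chi2_contingency

    grade_counts = [
        [60, 55, 45, 22, 16],  # non-participants: A, B, C, D, F
        [ 4,  6, 10,  9,  9],  # participants:     A, B, C, D, F
    ]

    chi2, p, dof, expected = chi2_contingency(grade_counts)
    n_total = sum(map(sum, grade_counts))
    print(f"chi-square({dof}, N = {n_total}) = {chi2:.2f}, p = {p:.4f}")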

Since the mean SAT mathematics score of the participants was significantly lower than that of the non-participants (see Table 2), we conducted an analysis of covariance, using SAT mathematics score as a covariate. This analysis showed that the participants' fall course grades in pre-calculus and Calculus I were significantly lower than those of the non-participants, after accounting for the baseline SAT mathematics score. The mean course grades, adjusted for the covariate, are shown in Table 4.

Table 4. Course Grades Adjusted for SAT Mathematics Score for the 2007–2009 Cohort
                    Non-participants              Participants
                    n     Adjusted mean (SE)      n    Adjusted mean (SE)     d
Pre-calculus        190   2.60 (0.08)             37   1.97 (0.21)            −0.52
Calculus I          455   2.57 (0.05)             37   1.47 (0.19)            −0.97

Cohen's effect size values, d = −0.52 and d = −0.97, for pre-calculus and Calculus I, respectively, suggest a moderate-to-high practical significance of these results. This significance is further underscored by the importance of success in first-year mathematics courses for engineers. We report our analyses of the data for the modeling-based mathematics course (for the 2010–2012 cohort) in the Results section of the paper.
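
The reported effect sizes are, presumably, the usual Cohen's d for two independent groups: the difference in group means divided by a pooled standard deviation,

    \[
    d = \frac{\bar{x}_1 - \bar{x}_2}{s_p}, \qquad
    s_p = \sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}}.
    \]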

Rate of Change Concept Inventory

To measure student understanding of average rate of change, we developed a Rate of Change Concept Inventory over the three years of implementation of the modeling-based mathematics course. The initial version of the inventory consisted of 17 items. Fourteen of these items were drawn from the research literature on student conceptions of rate of change (see Beichner, 1994, and Carlson, Oehrtman, and Engelke, 2010, for details of the validation studies). Since procedural knowledge and conceptual knowledge are interrelated and co-develop (National Research Council, 2001; Rittle-Johnson, Siegler, & Alibali, 2001; Star & Seifert, 2006), we included three items to test the student mastery of the algebra involved in expressing and computing average rates of change. In the second year, modifications were made to some items, and three additional items were added to the inventory. In the third year, we again revised the inventory, eliminating some items that did not discriminate well, modifying options on multiple-choice questions, and adding some items from Beichner (1994) to directly address negative average rates of change. The final version of the inventory consisted of 20 items in four categories of representations: algebraic expressions (seven items), graphical interpretation (eight items), symbolic interpretation (three items), and purely contextual (two items). Nine items were in common over the three years of using the inventory: three items each in the categories of algebraic expressions, graphical interpretation, and symbolic interpretation.

To address the second research question, the overall pre- and post-test scores on the Rate of Change Concept Inventory were analyzed separately each year, using t-tests. The four subscores were analyzed using the nonparametric related-samples Wilcoxon signed rank test, since not all of these subscores were normally distributed. To understand the effect of the design of the modeling-based course on students' concept of average rate of change, we report a detailed analysis of student results on the pre- and post-tests in the third year of the bridge program, focusing on the items from the inventory where the gains were greater than 30%.
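
The pre/post comparisons described here amount to a paired t-test on the overall inventory score and a Wilcoxon signed-rank test on each subscore; a minimal sketch with hypothetical score lists (not the study's data) follows.

    # Sketch: paired t-test on overall pre/post inventory scores and a Wilcoxon
    # signed-rank test on one subscore. All score lists are hypothetical.
    from scipy.stats import ttest_rel, wilcoxon

    pre_total  = [7, 9, 6, 11, 8, 5, 10, 7, 9, 6]          # overall pre-test scores (max 20)
    post_total = [12, 14, 11, 15, 13, 10, 16, 12, 13, 11]  # overall post-test scores

    t_stat, p_value = ttest_rel(post_total, pre_total)
    print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")

    pre_algebra  = [1, 2, 0, 1, 2, 1, 0, 1, 2, 1]  # algebraic-expression subscore (max 3)
    post_algebra = [2, 3, 2, 2, 3, 2, 1, 2, 3, 2]

    w_stat, p_value = wilcoxon(post_algebra, pre_algebra)
    print(f"Wilcoxon statistic = {w_stat:.1f}, p = {p_value:.4f}")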

Results

The results of this study showed a statistically significant gain in first-semester mathematics course grades for students who participated in the modeling-based mathematics course compared with a previous cohort of students who had taken a traditional summer course in college algebra or pre-calculus. The modeling-based course was effective in closing the previous letter grade gap in the first mathematics course between participants and non-participants in the bridge program, when controlling for the effect of SAT mathematics score. In each year of the modeling-based course, we found statistically significant gains in student understanding of average rate of change, as measured by the Rate of Change Concept Inventory. By examining the inventory items on which the students showed the greatest improvement, we gain some insight into the ways in which the design and implementation of the modeling-based course may have influenced students' first-semester mathematics performance. We report these results in the following sections.

First Mathematics Course Grade

Participants across cohorts comparison

The bridge program students who participated in the modeling-based course (the 2010–2012 cohort) performed significantly better in their first-semester mathematics course than did the 2007–2009 cohort of students who had taken traditional college algebra and pre-calculus courses, as shown in Table 5.

Table 5. First-Semester Mathematics Grades for Bridge Program Participants
                    2007–2009 cohort        2010–2012 cohort
                    n    M (SD)             n    M (SD)            t          df     d
Pre-calculus        38   1.75 (1.27)        14   3.17 (0.69)       5.11a,b    42.5   1.39
Calculus I          41   1.28 (1.20)        51   2.20 (1.15)       3.74b      90     0.78

Note. We performed analyses treating this variable as ordinal and as interval, and the results were substantially identical.
a Equal variances not assumed.
b p < .001.

However, as shown in Table 6, the 2010–2012 cohort had a statistically significant higher SAT mathematics score than the 2007–2009 cohort. Hence we conducted an analysis of covariance (Table 7), with SAT mathematics score as the covariate, which showed that participation in the modeling-based course was a significant factor, but that SAT mathematics score was not a significant factor. Cohen's effect size values, d = 1.39 and d = 0.78, for pre-calculus and Calculus I, respectively, suggest a high practical significance of these results and that the modeling-based course may be even more effective for pre-calculus students than for calculus students.

Table 6. SAT Mathematics Scores for Bridge Program Participants
                    2007–2009 cohort        2010–2012 cohort
                    n    M (SD)             n    M (SD)            t         df
Pre-calculus        37   500.8 (53.9)       13   534.6 (33.8)      2.62a,b   33.9
Calculus I          37   562.7 (70.5)       48   588.8 (47.8)      2.03b     83

Note. a Equal variances not assumed.
b p < .05.
Table 7. Effect of Bridge Program Participation by Cohort with SAT Mathematics Score as a Covariate
Factor            df       MS       F         Partial eta squared
Pre-calculusa
  Cohort          1, 50    16.97    12.17b    0.21
  SAT math        1, 50    0.03     0.027     0.00
Calculus I
  Cohort          1, 85    19.00    14.71b    0.15
  SAT math        1, 85    0.26     0.20      0.00

Note. a r2 = 0.22 for the pre-calculus model; r2 = 0.18 for the Calculus I model.
b p < .001.

Participants versus non-participants comparison

We examined the effect that the modeling-based course had on the letter grade gap between participants and non-participants in the bridge program. The fall course grades for these two groups for the 2010–2012 cohort are shown in Table 8. However, as noted earlier in Table 2, the SAT mathematics scores for the participants and the non-participants were significantly different for this cohort. Hence, we performed an analysis of covariance for each course to account for the SAT mathematics scores.

Table 8. First-Semester Mathematics Grades by Bridge Program Participation for the 2010–2012 Cohort
                    Non-participants        Participants
                    n     M (SD)            n    M (SD)           t        df
Pre-calculus        131   2.90 (1.11)       14   3.17 (0.69)      0.86     143
Calculus I          641   2.55 (1.13)       51   2.20 (1.15)      −2.13a   690

Note. We performed analyses treating this variable as ordinal and as interval, and the results were substantially identical.
a p < .05.

For the pre-calculus course, the analysis showed that the effect of SAT mathematics scores was not significant. Our results showed that the modeling-based course was effective in closing the previous letter grade gap (shown in Table 3 for the 2007–2009 cohort), even though there was still a significant difference in SAT mathematics scores between the participants and the non-participants in this cohort. Fisher's exact test (used due to small sample size) showed that there was no longer a significant difference in the grade distributions between the participants and the non-participants (p = 0.923).

For the Calculus I course, the analysis of covariance (Table 9) showed that there was no significant difference in the fall Calculus I grade by participation in the modeling-based course, but rather the difference in course grades was explained by the differences in SAT mathematics scores between the two groups (p < 0.001).

Table 9. Effect of Bridge Program Participation on Calculus I Grades with SAT Mathematics Score as a Covariate
Factor            df        MS       F         Partial eta squared
Participation     1, 618    0.00     0.00      0.00
SAT math          1, 618    54.11    45.41a    0.07

Note. a p < .001.

The mean Calculus I course grades, when adjusted for the SAT mathematics score, were nearly identical. The adjusted mean course grade for the non-participants was 2.51 (n = 571, SE = 0.05) and for the participants was 2.51 (n = 48, SE = 0.16). This result means that the modeling-based course enabled the bridge program participants to perform on a par with the non-participants who had comparable SAT mathematics scores; thus the course closed the letter grade gap of the earlier cohort (2007–2009), in which participants had performed below non-participants (as shown in Table 3). A chi-square test showed that there was no longer a significant difference in the grade distributions between the participants and the non-participants, χ2 (4, 692) = 5.89, p = 0.21.

Understanding of Average Rate of Change

As noted above, we systematically changed the items on the inventory: we added three items from the first to the second year of the course implementation and modified items from the second year to the third year of the implementation. The pre- and post-test results shown in Table 10 indicate that there was a significant improvement (p < 0.001) in the student understanding of the concept of average rate of change for each year of the modeling-based course, with large effect sizes in all three years.

Table 10. Pre- and Post-test Results for the Rate of Change Concept Inventory
        Pre-test       Post-test
Year    M (SD)         M (SD)          t         df    d
2010    8.86 (3.47)    12.69 (2.81)    10.65a    33    1.21
2011    8.05 (3.10)    12.34 (3.16)    7.92a     16    1.37
2012    6.92 (3.25)    12.62 (3.30)    12.37a    34    1.74

Note. a p < .001.

As shown in Table 11, there was a significant improvement on the nine common items on the Rate of Change Concept Inventory for the 2010–2012 cohort in the three subscore areas of algebraic expressions, graphical interpretation, and symbolic interpretation, with a maximum score of three points in each area. Because these subscores were not normally distributed, we compared the pre- and post-test results using the related-samples Wilcoxon signed-rank test and found statistically significant improvements on all three subscores when aggregating the data across all three years of the modeling-based course.

Table 11. Pre- and Post-test Results for the Subscores of the Common Items on the Rate of Change Concept Inventory for 2010–2012 Cohort
             Pre-test       Post-test
Subscore     M (SD)         M (SD)         z
Algebraic    1.14 (.99)     2.27 (1.08)    6.503a
Symbolic     0.79 (.82)     1.53 (.97)     5.261a
Graphical    1.33 (.87)     1.98 (.80)     5.850a

Note. a p < .001.

In the third year of the modeling-based course, the overall scores improved from 6.92 (35%) to 12.62 (63%), as shown in Table 10. There were nine items on the concept inventory for which the improvement was greater than 30%. Four of these were algebraic expression items, four were graphical interpretation items, and one was a symbolic interpretation item. We report the analysis of student gains on these items in relationship to the model development sequence.

Algebraic expression items

The four algebraic expression items required students to use or create algebraic expressions related to the concept of average rate of change. Two of the algebraic expression items (Q2 and Q12) asked the students to give an expression for inline image when inline image and to calculate the average rate of change between two points on a parabola. These items measured student proficiency in algebraically expressing two basic ideas about the average rate of change. As shown in Table 12, there were substantial gains on both of these items. On the item (Q16) that asked students to simplify a complex fraction of the form inline image, there was a very large gain: only four (11%) students were able to correctly simplify this fraction on the pre-test, while 28 (80%) of the students were able to do so on the post-test.

Table 12. Algebraic Expression Items with High Percentage Gain
        Correct on pre-test    Correct on post-test    Gain
Item    n      %               n      %                Δn     Δ%
Q2      17     49              30     86               13     37
Q12     14     40              32     91               18     51
Q16     4      11              28     80               24     69
Q19     8      23              19     54               11     31

Item Q19 asked students to find an average speed over a round trip between two cities when given speed (45 miles per hour) and time (two hours) information for the first part of the trip and only speed (30 miles per hour) information for the return trip. The significant improvement on this item was largely accounted for by the number of students who shifted from incorrectly seeing the average speed as a simple average of the two speeds (37.5 miles per hour) to correctly reasoning about this as a weighted average that has to account for the amount of time traveled at each speed, resulting in an average speed of 36 miles per hour.
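
Under the stated conditions (45 miles per hour for two hours on the way out, 30 miles per hour back over the same distance), the weighted-average reasoning works out as follows:

    \[
    d_{\text{out}} = 45 \ \text{mph} \times 2 \ \text{h} = 90 \ \text{mi}, \qquad
    t_{\text{back}} = \frac{90 \ \text{mi}}{30 \ \text{mph}} = 3 \ \text{h}, \qquad
    \bar{v} = \frac{\text{total distance}}{\text{total time}}
            = \frac{90 + 90}{2 + 3} \ \text{mph} = 36 \ \text{mph}.
    \]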

The gains on items Q2 and Q16 (simplifying algebraic expressions for a quadratic function and for a complex fraction) likely reflect the emphasis in the course on developing students' algebraic skills. The gain on item Q12 (finding an average rate of change between two points on a parabola) is likely due to the limits of students' prior experiences with the meaning of the term average rate of change in a mathematical context. This was an open-ended question; without knowing the mathematical meaning of the term, many students simply left this question blank on the pre-test. We attribute the gain on item Q19 (interpreting two speeds over different time intervals) in part to the emphasis in the model development sequence on analyzing and interpreting velocity in the context of motion along a straight path.

Symbolic interpretation items

The symbolic interpretation items required the students to create appropriate symbolic expressions when given a problem context or to interpret the meaning of the parameters in a symbolic expression. There was a substantial gain on one of these three items, which is taken from Carlson et al. (2010). The students had to choose correct interpretations of the parameter m in a linear growth function:

A baseball card increases in value according to the function, inline image, where b gives the value of the card in dollars and t is the time (in years) since the card was purchased. Which of the following describe what inline image conveys about the situation?

There was a substantial gain (37%) on this question from the pre-test (n = 7, 20%) to the post-test (n = 20, 57%). This gain likely reflects the emphasis in the model development sequence on making meaningful interpretations of data and on giving descriptions of the average rate of change in various contexts.

Graphical interpretation items

There were substantial gains on four items that measured student proficiency at reasoning about and interpreting rates of change when given graphical information, as shown in Table 13. We take these forms of graphical reasoning to be central to the work and the learning of engineers and scientists.

Table 13. Graphical Interpretation Items with High Percentage Gain
        Correct on pre-test    Correct on post-test    Gain
Item    n      %               n      %                Δn     Δ%
Q4 A    9      26              24     68               15     42
Q4 B    14     40              27     77               13     37
Q5      12     34              28     80               16     46
Q8      10     29              25     71               15     43
Q17     6      17              19     54               13     37

Two of these items (Q4 and Q8) addressed the interrelated interpretation of velocity and position graphs. Question Q4 addressed interpreting information about velocity when students were given a position graph. Question Q8, on the other hand, involved interpreting position information when students were given a velocity graph. Students often confuse the interpretation of these graphs; they experience difficulty in inferring velocity information from a position graph and misread a velocity graph as if it were a position graph (Beichner, 1994). This coordination of representational systems (shifting between the velocity or rate graph and its associated position graph) was the main focus of the model exploration tasks described earlier.

Item Q4A asked for the value of the velocity at seven seconds for the person whose position graph is shown in Figure 2, and item Q4B asked for the person's average speed over the entire time interval. On the post-test, 24 (68%) of the students correctly reasoned about the velocity at a particular point in time when the velocity is constant and the starting point for the motion is not at the origin. There were two common errors on the pre-test. The most common error was to incorrectly assume that the speed would equal position divided by time, or 35 meters divided by seven seconds, which produced an incorrect answer of 5 meters per second. The second most common error on the pre-test was to read the position graph as if it were a velocity graph, which produced an incorrect answer of 35 meters per second. On the post-test, only five students made this first error and none made the second error. The prevalence and persistence of both errors are well established in both the mathematics education and physics education literatures; these results compare favorably with results on an equivalent item by Beichner (1994), where only 21% of students were able to reason correctly about the velocity after college-level instruction in kinematics. On the post-test, 27 (77%) of the students were able to reason correctly on item Q4B about the average of two different constant velocities over two different intervals of time. We note that this item is the graphical equivalent of algebraic item Q19, on which there was also a substantial improvement from the pre- to post-test. The student achievement on the post-test was greater on the graphical form of the item (with 77% correct) than on the algebraic form of the item (with 54% correct).

Figure 2. Interpreting velocity information from a position graph.

Item Q8 represents an important reversal of the above problem and one that is a well-known source of difficulty for physics and calculus students (Beichner, 1994; Monk, 1992). The item asks students to identify which of a set of position versus time graphs best represents, over the same time interval, the motion shown in the velocity graph in Figure 3. A successful answer to this item requires an understanding of how to reason about position when given a velocity graph.

Figure 3. Reasoning about position from a velocity graph.

On the post-test, we found that 25 (71%) students were able to correctly identify an appropriate position graph. On an analogous item reported by Beichner (1994), only 29% of students were able to correctly use areas to reason about velocity from an acceleration graph after instruction in kinematics. The substantial 43% gain on this question suggests that the model exploration tasks within the model development sequence helped the students understand how to use areas to reason graphically about nonconstant and negative velocities (or rates of change), such as shown in Figure 3.

The two other graphical items (Q5 and Q17) for which there were substantial gains addressed student proficiency at choosing a text description of motion when given a position graph (Q5) and in comparing the average rates of change by reasoning about the outputs of three functions over differing input intervals (Q17). In question Q5, the students were asked to choose a description of the motion of an object whose position is shown in Figure 4.

Figure 4. Describing an object's motion when given a position graph.

The post-test results showed that 80% of the students were able to select a correct description; results reported by Beichner (1994) on this item showed that only 37% of students were able to describe the motion correctly after kinematics instruction. We attribute the substantial gain in achievement on this item to student experiences with motion detectors in the model eliciting activities and to the emphasis on descriptions of motion in the model exploration activities in the model development sequence.

Limitations

A number of limitations should be considered when interpreting the results of this study. First, other variables that could influence students' success in their first-year mathematics course, such as self-efficacy in mathematics, peer interactions and study groups, general academic study skills, and sense of belonging, were not available to us and, hence, not considered in this study. The characteristics associated with these variables may have been distributed differently between the non-participants and the participants in the bridge program and between the cohorts. Second, the relatively small number of participants in the bridge program leads us to be cautious in interpreting the results of this study. Measurements on such small sample sizes are sensitive to the impact of outliers in the sample. More importantly, the small sample size precluded us from treating gender and underrepresented minority status as variables. Hence, we do not know if the modeling-based mathematics course had a differential impact on various groups of students.

A third limitation is the use of mathematics course grades as indicators of student success. Course grades are a discrete variable and can vary substantially among instructors, thus introducing bias with respect to any effects due to the modeling-based mathematics course. In this study, both the pre-calculus and the Calculus I courses were taught in large lectures by a single instructor, thereby reducing instructor bias in any given year but not across the years within each cohort or between cohorts. We note that improvement in the mathematics course grade, rather than overall success in freshman courses, was the primary goal of the redesigned mathematics course. Hence, we used mathematics course grade as an outcome variable, given the design of the study and the use of course grade in similar studies (Budny et al., 1998; Gleason et al., 2010; Ohland et al., 2004).

Discussion

In both cohorts, participants in the summer bridge program had mean SAT mathematics scores that were significantly lower than those of non-participants. For both cohorts, the percentage of women and underrepresented minorities in the bridge program was greater than that among the non-participants. Together, these facts suggest that the summer bridge program was successful in recruiting at-risk students or those underprepared for success in engineering. We found, for the 2007–2009 cohort, that the students who participated in the traditional mathematics courses of the bridge program performed worse than non-participants, and that this difference was not accounted for by differences in mean SAT mathematics scores. This finding suggests that the traditional course improved students' first course placement without addressing the foundational understandings needed for success in pre-calculus or Calculus I. Thus, this study provides evidence for the cautions raised by Gleason et al. (2010) and others regarding the negative consequences of focusing on improved mathematics placements for students who are underprepared for their engineering studies.

The participants in the 2010–2012 cohort, who took the modeling-based mathematics course, outperformed their peers in the 2007–2009 cohort in both pre-calculus and Calculus I (p < 0.001), with large effect sizes of d = 1.39 and d = 0.78, respectively. This substantial improvement in first mathematics course grades closed the previous gap between participating and non-participating students. Since the bridge program was substantially the same for both cohorts, with the exception of replacing the traditional college algebra and pre-calculus courses with a newly designed modeling-based mathematics course, we can view the 2007–2009 cohort as a control group for assessing the effectiveness of the modeling-based course intervention for the 2010–2012 cohort. On the basis of the improvement in first-semester grades in both pre-calculus and Calculus I courses, we conclude that the modeling-based course effectively prepared students for success in their first-semester mathematics course.

Our strategy for preparing students for success in their first-semester mathematics course was to develop their understandings of a foundational concept that brings together mathematics and engineering, namely the study of changing phenomena and the underlying mathematical concept of average rate of change. Each year of the course, we found significant gains on the post-test (p < 0.001) in student understanding of the concept of average rate of change; this gain indicates that the course's targeted learning outcomes were achieved. This achievement was especially strong in student skills in using algebra to express ideas about the average rate of change, such as using function notation and simplifying complex fractions. Students made large gains in expressing and interpreting position, speed, and velocity both graphically and algebraically. We found substantial gains, as compared with known results in the research literature, in student abilities to interpret position information when given a velocity graph and to interpret velocity information when given a position graph. This coordination of understandings is foundational to later understanding of the relationship between a function, its derivative, and its antiderivative in calculus (Carlson et al., 2002; Oehrtman et al., 2008) and in understanding their applications in engineering (e.g., Prince, Vigeant, & Nottis, 2012).

The results of this study suggest that student gains in understanding the concept of average rate of change were a strong foundation for learning in first-semester mathematics courses, as evidenced by the improvements in pre-calculus and Calculus I course grades. An implication of this research is that the design of the mathematics component of bridge programs should move away from an emphasis on remediation to improve student performance on a placement test. Instead, the design of the mathematics component should focus on helping students rethink and learn challenging and foundational concepts in the context of engineering applications so that they will experience success in their first mathematics course.

A further implication of this research relates to our theoretical perspective on model development sequences, which was the basis for the design of the mathematics course. As Hjalmarson et al. (2008) and other researchers who have drawn on a modeling perspective in engineering education (e.g., Bursic et al., 2011; Hamilton et al., 2010) have argued, students' learning is enhanced by engaging with modeling tasks that require them to explain their thinking, justify their reasoning, and listen to other students' arguments. The emphasis on interpreting phenomena and on creating and using representations (e.g., Moore et al., 2013; Streveler et al., 2008) is central to the design of a modeling-based instructional sequence. Hence, the three components of the model development sequence (model eliciting activities, model exploration activities, and model application activities) can serve as a set of instructional design principles for the mathematics courses of bridge programs.

This study contributes to understanding both the design and the effectiveness of a modeling-based mathematics course in a summer bridge program and extends the work of other researchers on this important transitional context for underprepared engineering students. Further research remains to be done on the influence of other variables (such as self-efficacy, peer interactions, general study skills, and sense of belonging) and other programmatic efforts (such as learning communities and peer-led study groups) on students' success in their first mathematics course. Further research also needs to examine the effectiveness of the modeling-based approach on retention in engineering, especially among women and underrepresented minorities. Finally, studies that investigate the effectiveness of a modeling-based approach for the design of introductory courses for engineers would seem fruitful.

Conclusions

The results of this study provide evidence that the modeling-based mathematics course in a summer bridge program had a positive impact both on engineering students' performance in their first-semester mathematics course and on their understanding of average rate of change, as measured by statistically significant gains on the Rate of Change Concept Inventory. The gains on the algebraic expression items reflect the emphasis in the course on using algebra to make sense of changing phenomena in meaningful contexts, such as motion along a straight path, decreasing light intensity, and discharging capacitors. The gains on the graphical interpretation items may be due to the model exploration tasks that focused on coordination between representational systems. This coordination included shifting between position and velocity graphs, thus distinguishing a function's graph from the graph of its rate of change, and shifting between numerical and graphical data, thus linking the value of the average rate of change to the graph of the function. Building students' foundational understandings of functions and their rates of change, and having students apply these understandings in meaningful contexts while developing their basic algebra skills, appears to have contributed to their subsequent success in their first college mathematics course.
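
As an illustration of the kind of contextualized work described above (a hypothetical example in the spirit of the course contexts, not a task drawn from the course materials), consider the voltage of a discharging capacitor, V(t) = V_0 e^{-t/(RC)}; its average rate of change over an interval [t_1, t_2] is:

```latex
% Average rate of change of the capacitor voltage over [t1, t2].
% V0, R, and C denote the initial voltage, resistance, and capacitance;
% this is an illustrative context, not an item from the course.
\frac{V(t_2) - V(t_1)}{t_2 - t_1}
  = \frac{V_0 \left( e^{-t_2/(RC)} - e^{-t_1/(RC)} \right)}{t_2 - t_1}
```

Interpreting this quotient as the slope of a secant line on the voltage–time graph, while also simplifying the corresponding algebraic expression using function notation, exercises the coordination between graphical and algebraic representations emphasized in the course.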

References

  • Alkhasawneh, R., & Hobson, R. (2010). Pre-college mathematics preparation: Does it work? Proceedings of the ASEE Annual Conference and Exposition, Louisville, KY.
  • Barnett, M. D., Sonnert, G., & Sadler, P. M. (2012). More like us: The effect of immigrant generation on college success in mathematics. International Migration Review, 46(4), 891–918.
  • Beichner, R. (1994). Testing student interpretation of kinematics graphs. American Journal of Physics, 62(8), 750–755.
  • Brainard, S. G., & Carlin, L. (1998). A six-year longitudinal study of undergraduate women in engineering and science. Journal of Engineering Education, 87(4), 369–375.
  • Budny, D., LeBold, W., & Bjedov, G. (1998). Assessment of the impact of freshman engineering courses. Journal of Engineering Education, 87(4), 405–412.
  • Bursic, K., Shuman, L., & Besterfield-Sacre, M. (2011). Improving student attainment of ABET outcomes using model-eliciting activities (MEAs). Proceedings of the ASEE Annual Conference and Exposition, Vancouver, BC, Canada.
  • Carlson, M., Jacobs, S., Coe, E., Larsen, S., & Hsu, E. (2002). Applying covariational reasoning while modeling dynamic events: A framework and a study. Journal for Research in Mathematics Education, 33(5), 352–378.
  • Carlson, M., Oehrtman, M., & Engelke, N. (2010). The precalculus concept assessment: A tool for assessing students' reasoning abilities and understandings. Cognition and Instruction, 28(2), 113–145.
  • Citty, J., & Lindner, A. (2012). Dual model summer bridge programs: A new consideration for increasing retention rates. Proceedings of the ASEE Annual Conference and Exposition, San Antonio, TX.
  • Diefes-Dux, H., Bowman, K., Zawojewski, J., & Hjalmarson, M. (2006). Quantifying aluminum crystal size. Part 1: The model eliciting activity. Journal of STEM Education: Innovations and Research, 7(1/2), 51–63.
  • Doerr, H. M., & English, L. D. (2003). A modeling perspective on students' mathematical reasoning about data. Journal for Research in Mathematics Education, 34(2), 110–136.
  • Gleason, J., Boykin, K., Johnson, P., Bowen, L., Whitaker, K., Micu, C., Raju, D., & Slappey, C. (2010). Integrated engineering math-based summer bridge program for student retention. Advances in Engineering Education, 2(2), 1–17.
  • Hamilton, E., Besterfield-Sacre, M., Olds, B., & Siewiorek, N. (2010). Model-eliciting activities in engineering: A focus on model building. Proceedings of the ASEE Annual Conference and Exposition, Louisville, KY.
  • Hestenes, D. (1992). Modeling games in the Newtonian world. American Journal of Physics, 60, 732–748.
  • Hestenes, D. (2010). Modeling theory for math and science education. In R. Lesh, P. Galbraith, C. Haines, & A. Hurford (Eds.), Modeling students' mathematical modeling competencies (pp. 13–41). New York, NY: Springer.
  • Hjalmarson, M., Diefes-Dux, H. A., Bowman, K., & Zawojewski, J. S. (2006). Quantifying aluminum crystal size. Part 2: The model development sequence. Journal of STEM Education: Innovations and Research, 7(1/2), 64–73.
  • Hjalmarson, M. A., Diefes-Dux, H. A., & Moore, T. J. (2008). Designing model development sequences for engineering. In J. S. Zawojewski, H. A. Diefes-Dux, & K. J. Bowman (Eds.), Models and modeling in engineering education: Designing experiences for all students (pp. 37–54). Rotterdam, the Netherlands: Sense.
  • Kaiser, G., & Sriraman, B. (2006). A global survey of international perspectives on modelling in mathematics education. Zentralblatt für Didaktik der Mathematik, 38(3), 302–310.
  • Kaput, J., & Roschelle, J. (1996). SimCalc: MathWorlds [Computer software].
  • Kean, A., Miller, R., Self, B., Moore, T., Olds, B., & Hamilton, E. (2008). Identifying robust student misconceptions in thermal science using model-eliciting activities. Proceedings of the ASEE Annual Conference and Exposition, Pittsburgh, PA.
  • Kobrin, J., Kim, Y., & Sackett, P. (2012). Modeling the predictive validity of SAT mathematics items using item characteristics. Educational and Psychological Measurement, 72(1), 99–119.
  • Kowalchuk, R., Green, T., Ricks, R., & Nicklow, J. (2010). Evaluation of a summer bridge program on engineering students' persistence and success. Proceedings of the ASEE Conference and Exposition, Louisville, KY.
  • Lesh, R. A., Cramer, K., Doerr, H. M., Post, T., & Zawojewski, J. (2003). Model development sequences. In R. A. Lesh & H. M. Doerr (Eds.), Beyond constructivism: Models and modeling perspectives on mathematics problem solving, learning and teaching (pp. 35–58). Mahwah, NJ: Lawrence Erlbaum.
  • Lesh, R. A., & Doerr, H. M. (Eds.). (2003). Beyond constructivism: Models and modeling perspectives on mathematics problem solving, learning and teaching. Mahwah, NJ: Lawrence Erlbaum.
  • Lesh, R. A., Doerr, H. M., Carmona, G., & Hjalmarson, M. (2003). Beyond constructivism. Mathematical Thinking and Learning, 5(2, 3), 211–234.
  • Leuwerke, W., Robbins, S., Sawyer, R., & Hovland, M. (2004). Predicting engineering major status from mathematics achievement and interest congruence. Journal of Career Assessment, 12(2), 135–149.
  • McDermott, L. C., Rosenquist, M. L., & van Zee, E. H. (1987). Student difficulties in connecting graphs and physics: Examples from kinematics. American Journal of Physics, 55(6), 503–513.
  • Mattern, K., Patterson, B., & Kobrin, J. (2012). The validity of SAT scores in predicting first-year mathematics and English grades (College Board Research Report No. 2012-1). Retrieved from http://research.collegeboard.org/sites/default/files/publications/2012/7/researchreport-2012-1-sat-predicting-1st-year-mathematics-english-grades.pdf
  • May, G. S., & Chubin, D. E. (2003). A retrospective on undergraduate engineering success for underrepresented minority students. Journal of Engineering Education, 92(1), 27–39.
  • Mesa, V., Jaquette, O., & Finelli, C. (2009). Measuring the impact of an individual course on students' success. Journal of Engineering Education, 98(4), 349–359.
  • Micomonaco, J., & Sticklen, J. (2010). Toward a better understanding of academic and social integration: A qualitative study of factors related to persistence in engineering. Proceedings of the ASEE Annual Conference and Exposition, Louisville, KY.
  • Monk, S. (1992). Students' understanding of a function given by a physical model. In E. Dubinsky & G. Harel (Eds.), The concept of function: Aspects of epistemology and pedagogy (pp. 175–194). Washington, DC: Mathematical Association of America.
  • Moore, T., Miller, R., Lesh, R., Stohlmann, M., & Kim, Y. (2013). Modeling in engineering: The role of representational fluency in students' conceptual understanding. Journal of Engineering Education, 102(1), 141–178.
  • National Research Council, Mathematics Learning Study Committee. (2001). Adding it up: Helping children learn mathematics. J. Kilpatrick, J. Swafford, & B. Findell (Eds.). Washington, DC: National Academy Press.
  • Oehrtman, M., Carlson, M., & Thompson, P. (2008). Foundational reasoning abilities that promote coherence in students' function understanding. In M. Carlson & C. Rasmussen (Eds.), Making the connection: Research and teaching in undergraduate mathematics education (pp. 27–41). Washington, DC: Mathematical Association of America.
  • Ohland, M., Yuhasz, A., & Sill, B. (2004). Identifying and removing a calculus prerequisite as a bottleneck in Clemson's general engineering curriculum. Journal of Engineering Education, 93(3), 253–257.
  • Papadopoulos, C., & Reisel, J. (2008). Do students in summer bridge programs successfully improve math placement and persist? A meta-analysis. Proceedings of the ASEE Conference and Exposition, Pittsburgh, PA.
  • Prince, M., Vigeant, M., & Nottis, K. (2012). Development of the heat and energy concept inventory: Preliminary results on the prevalence and persistence of engineering students' misconceptions. Journal of Engineering Education, 101(3), 412–438.
  • Reisel, J., Jablonski, M., Hosseini, H., & Munson, E. (2012). Assessment of factors impacting success for incoming college engineering students in a summer bridge program. International Journal of Mathematical Education in Science and Technology, 43(4), 421–433.
  • Reisel, J., Jablonski, M., Rineck, L., Munson, E., & Hosseini, H. (2012). Analysis of math course placement improvement and sustainability achieved through a summer bridge program. Proceedings of the ASEE Annual Conference and Exposition, San Antonio, TX.
  • Ridgely, J., & Self, B. (2011). Model-eliciting activities in a mechanical engineering experimental methods course. Proceedings of the ASEE Annual Conference and Exposition, Vancouver, BC, Canada.
  • Rittle-Johnson, B., Siegler, R., & Alibali, M. (2001). Developing conceptual understanding and procedural skill in mathematics: An iterative process. Journal of Educational Psychology, 93(2), 346–362.
  • Self, B., & Widmann, J. (2010). Dynamics buzzword bingo: Active/collaborative/inductive learning, model eliciting activities, and conceptual understanding. Proceedings of the ASEE Annual Conference and Exposition, Louisville, KY.
  • Smith, K. A., Sheppard, S. D., Johnson, D. W., & Johnson, R. T. (2005). Pedagogies of engagement: Classroom-based practices. Journal of Engineering Education, 94(1), 87–100.
  • Star, J. R., & Seifert, C. (2006). The development of flexibility in equation solving. Contemporary Educational Psychology, 31, 280–300.
  • Strayhorn, T. (2011). Bridging the pipeline: Increasing underrepresented students' preparation for college through a summer bridge program. American Behavioral Scientist, 55(2), 142–159.
  • Streveler, R. A., Litzinger, T. A., Miller, R. L., & Steif, P. S. (2008). Learning conceptual knowledge in the engineering sciences: Overview and future research directions. Journal of Engineering Education, 97(3), 279–294.
  • Thornton, R. K., & Sokoloff, D. R. (1998). Assessing student learning of Newton's laws: The force and motion conceptual evaluation and the evaluation of active learning laboratory and lecture curricula. American Journal of Physics, 66(4), 338–352.
  • Tolley, P., Blat, C., McDaniel, C., Blackmon, D., & Royster, D. (2012). Enhancing the mathematics skills of students enrolled in introductory engineering courses: Eliminating the gap in incoming academic preparation. Journal of STEM Education, 13(3), 74–86.
  • Veenstra, C., Dey, E., & Herrin, G. (2008). Is modeling of freshman engineering success different from modeling non-engineering success? Journal of Engineering Education, 97(4), 467–479.
  • Verleger, M., & Diefes-Dux, H. (2008). Impact of feedback and revision on student team solutions to model-eliciting activities. Proceedings of the ASEE Annual Conference and Exposition, Pittsburgh, PA.
  • Vogt, C. (2008). Faculty as a critical juncture in student retention and performance in engineering programs. Journal of Engineering Education, 97(1), 27–36.
  • Yue, J. (2011). Improving math skills through intensive mentoring and tutoring. Proceedings of the ASEE Annual Conference and Exposition, Vancouver, BC, Canada.
  • Zawojewski, J., Diefes-Dux, H., & Bowman, K. (Eds.). (2008). Models and modeling in engineering education: Designing experiences for all students. Rotterdam, the Netherlands: Sense.

Biographies

  • Helen M. Doerr is a professor of mathematics and mathematics education at Syracuse University, 215 Carnegie Hall, Syracuse, NY, 13244; hmdoerr@syr.edu.

  • Jonas Bergman Ärlebäck is a senior lecturer and researcher in mathematics and mathematics education at Linköping University, SE-581 83 Linköping, Sweden; jonas.bergman.arleback@liu.se.

  • Andria Costello Staniec is an associate professor of civil and environmental engineering at Syracuse University, 304 Steele Hall, Syracuse, NY, 13244; costello@syr.edu.