Keywords:

  • modeling;
  • engineering design;
  • adaptive expertise

Abstract

Background

Modeling is a pervasive feature of engineering that is rarely taught explicitly to engineering students. Because modeling is included only implicitly, students often form conceptions of models based on the everyday use of the term, neglecting important predictive types of models.

Purpose

We studied the effectiveness of an explicit modeling module designed to broaden student understandings of various approaches to and applications of modeling.

Design/Method

A two-phase analysis of student conceptions was undertaken. Phase I analyzed the conceptions of an experimental group before and after they were taught an explicit modeling module. Phase II added a comparison group at a second institution.

Results

A significant shift was observed for engineering students who were explicitly taught a modeling module. Student-held conceptions remained predominantly descriptive-centric (e.g., physical models) throughout the investigation, with an added focus on predictive (e.g., mathematical) modeling after completion of the module. The descriptive-centric conceptions observed before the module were consistent with those of a comparison group that did not receive the module.

Conclusions

Explicit learning experiences about models and the modeling process need to be embedded into the engineering curriculum, specifically in the teaching of engineering design. Teaching modeling will improve student use and understanding of modeling as an important and pervasive engineering tool.


Introduction

Many courses in an engineering curriculum focus on teaching students engineering fundamentals. Engineering fundamentals can include many disciplinary topics, but one underlying emphasis is developing analytic skills that are rooted in basic mathematical and scientific principles. Typical engineering courses engage students in deriving, using, and applying theories, equations, and models in a variety of problem-solving contexts. Yet traditional instruction is often exclusively focused on setting up formulas and carrying out algorithmic mathematical steps to arrive at a solution (Sheppard, Macatangay, Colby, & Sullivan, 2008). Little to no explicit instruction is focused on the assumptions that are embedded in the model and why some models are better approximations to physical phenomena than others. The potential to develop a sophisticated fluency in applying modeling to a range of complex or open-ended design problems is decreased when student experiences are limited to a subset of procedural-focused modeling skills.

Our research study focused on modeling because it is a core skill for engineering students, a pervasive feature of the engineering curriculum, and a signature feature of discovery and application-based research in the sciences (Hesse, 1963; Magnani, Nersessian, & Thagard, 1999; Morgan & Morrison, 1999). Modeling is one activity that students are expected to perform throughout their engineering education, including in fundamental engineering, foundational math and science, and project-based design courses as well as problem-based learning (Nersessian & Patton, 2009; Sun, Newstetter, & Nersessian, 2006). The nuanced and complex activity of modeling and model-based reasoning presents a challenge in engineering education that students often do not fully develop by the end of their formal education.

Our research fills a major gap in the literature by investigating the explicit teaching and use of models and modeling as part of learning engineering design through an exploration of student conceptions of modeling and model use. We therefore ask the question, What are student conceptions of modeling in the context of engineering design?

Literature Review

We frame our study using adaptive expertise and model-based reasoning. According to Schwartz, Bransford, and Sears (2005), an adaptive expert is someone who has deep subject matter knowledge and the ability to recognize when this knowledge applies in a novel setting. The concept of adaptive expertise was introduced to extend our understanding of the meaning of expertise. Hatano and Inagaki (1986) contrasted two types of expertise: routine and adaptive. They claim “routine experts are outstanding in speed, accuracy, and automaticity of performance, but lack flexibility and adaptability to new problems” (p. 266). Furthermore, Hatano and Oura (2003) explained that the majority of studies on expertise “have shown that experts, who have had many years of problem-solving experiences in a given domain, can solve familiar types of problems quickly and accurately, but often fail to go beyond procedural efficiency” (p. 28). In contrast, adaptive experts “can be characterized by their flexible, innovative, and creative competencies within the domain” (p. 28).

The concept of adaptive expertise grew out of research on how individuals transfer knowledge to new problems, specifically how related learning activities could lead to solutions. Our study analyzes adaptive expertise as it applies to how one flexibly uses knowledge of modeling as part of engineering design. Modeling know-how is a set of skills that can be classified as computational adaptive expertise (CADEX), which concentrates the assessment of adaptive expertise on knowledge fluency in design and innovation (McKenna, Linsenmeier, & Glucksberg, 2008; McKenna, in press). Models are a language used by engineers to enhance their engineering design process and their computational understanding of a problem (Dym et al., 2005). Engineers use model-based reasoning to construct representations and derive inferences (Cartier, Rudolph, & Stewart, 2001). Many engineering courses teach modeling without explicitly acknowledging the assumptions, approximations, and limitations associated with abstract representations of a physical phenomenon (Gainsburg, 2006). This approach omits important steps in the development of a robust fluency in modeling techniques.

Two additional issues complicate the teaching and learning of modeling. First, the term model can be used as a noun (e.g., a model), an adjective (e.g., model teacher), or a verb (e.g., to model). Second, as Maki and Thompson (2006) point out, the term and its related forms are used variably: everyday use typically refers to modeling as either a display version or a miniaturization of something. This use corresponds to engineering's use of models as physical representations intended for experimentation, display, and emulation purposes, but most physical models neglect the theoretical, logical, and mathematical components that represent behaviors (Starfield, Smith, & Bleloch, 1994).

Engineering educators have the task of helping students to clearly understand the intricacies and power of modeling as a whole. Most instructors utilize a modeling process to guide students in their understanding of appropriate use and application of modeling techniques in engineering. As Lesh and Doerr (2003) describe, modeling is a cyclic activity consisting of real-world description, prediction, manipulation, and verification. Modeling as a process provides students with an understanding of how to create purposeful and meaningful representations.

The teaching and understanding of modeling is often a difficult task. Perkins (1986) notes that models are intrinsically ambiguous and require additional information to be fully understood. Traditional approaches can lead students to rely on one type of model or can even unintentionally cause them to lose sight of or neglect the importance of modeling. Mathematical modeling is particularly difficult because not all students or practicing engineers believe the mathematics they have learned is applicable to sound design or work as an engineer (Cardella, 2007; Winkelman, 2009). Mathematical modeling demonstrates the importance of mathematics in engineering (Beckenbach, 1961/2013). Students need to have good emotional experiences with, see the value of, and expect to succeed with mathematics in order to realize the importance of mathematics and utilize mathematical modeling effectively (Goold & Devitt, 2012).

Much research on embedding modeling interventions into engineering instruction has focused on model-eliciting activities (MEAs) (Besterfield-Sacre et al., 2012; Diefes-Dux, Moore, Zawojewski, Imbrie, & Follman, 2004; Diefes-Dux, Follman, et al., 2004; Diefes-Dux, Zawojewski, & Hjalmarson, 2010; Moore, 2008; Moore & Diefes-Dux, 2004; Moore & Hjalmarson, 2010; Moore, Miller, Lesh, Stohlmann, & Kim, 2013; Yildirim, Shuman, & Besterfield-Sacre, 2006; Zawojewski, Bowman, & Lesh, 2008). The use of MEAs is intended to promote problem solving and student thought processes by encouraging student use of mathematical models during engineering tasks (Litzinger, Lattuca, Hadgraft, & Newstetter, 2011). Model-based reasoning embedded in engineering design experiments and problem-based learning has also been explored as a means to advance student learning of problem solving and reasoning (Newstetter, 2005). Previous research associated with our project has explored the effect of explicit mathematical modeling interventions on student design ideas and solutions through an analysis of what students think should be modeled, how students model, what considerations students make regarding models, how well students can critique a model, and how well students create mathematical models (Cole, Linsenmeier, McKenna, & Glucksberg, 2010; Cole, Linsenmeier, Molina, Glucksberg, & McKenna, 2011; McKenna & Carberry, 2012). Connections between these topics have not yet been explored.

Research Design and Methods

This study measured student conceptions of modeling as a way to assess how well they utilize and connect modeling in the engineering design process. Conceptions were assessed instead of behaviors in order to avoid bias that may occur due to available resources that limit modeling in practice. An assessment of conceptions allows students to propose any possibility without limitations.

The study had two phases. Phase I included an in-depth analysis of a cohort of students (experimental group) before and after they were given an explicit modeling intervention. The modeling intervention included activities that addressed the mathematical modeling process described by Gainsburg (2006): (1) identify the real-world phenomenon, (2) simplify or idealize the phenomenon, (3) express the idealized phenomenon mathematically, (4) perform the mathematical manipulations, (5) interpret the mathematical solution in real-world terms, and (6) test the interpretation against reality.
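Gainsburg's six steps can be illustrated with a minimal sketch. The cooling phenomenon, parameter values, and "measurement" below are hypothetical examples chosen for brevity; they are not drawn from the study or from the phototherapy device.

```python
# A minimal sketch of Gainsburg's six-step mathematical modeling
# process, applied to a hypothetical example (a heated part cooling
# in air). All numbers here are illustrative, not from the study.

import math

# Step 1: identify the real-world phenomenon -- a part cooling in air.
# Step 2: simplify/idealize -- assume uniform part temperature and
#         constant ambient conditions (lumped-capacitance idealization).
# Step 3: express the idealized phenomenon mathematically --
#         Newton's law of cooling: T(t) = T_amb + (T0 - T_amb) * exp(-k t)

def temperature(t, t0=90.0, t_amb=20.0, k=0.05):
    """Idealized part temperature (deg C) after t minutes."""
    return t_amb + (t0 - t_amb) * math.exp(-k * t)

# Step 4: perform the mathematical manipulations -- evaluate the model.
predicted = temperature(30)
print(f"Predicted temperature after 30 min: {predicted:.1f} C")

# Step 5: interpret the solution in real-world terms -- after 30 minutes
#         the part should be near, but still above, room temperature.

# Step 6: test the interpretation against reality -- compare with a
#         (hypothetical) measurement and revisit the step-2 idealizations
#         if the discrepancy is too large.
measured = 37.2  # hypothetical measurement
print(f"Model error: {abs(predicted - measured):.1f} C")
```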

The activities were embedded in the design of a phototherapy device through four iterations. The first iteration placed students in a consulting scenario. Students were asked to tell a design team what they thought should be modeled, what their modeling approach would be, and how they would expect the model to be helpful. The second iteration explicitly focused on mathematical modeling. Students were asked to sketch the system to model, list relevant parameters and variables, and propose a mathematical approach to the problem. The third iteration asked students to find the equations and list the assumptions needed to create a mathematical model of the system. Students subsequently solved for one parameter of the model. The final iteration asked students to explain a mathematical model designed by another design team. Students were provided the experimental data used to verify the model. A detailed description of the activity development and implementation strategy has been previously published (Cole, Linsenmeier, McKenna, & Glucksberg, 2010; Cole, Linsenmeier, Molina, Glucksberg, & McKenna, 2011). Post-module conceptions were recorded approximately one month after the end of the course and prior to the start of the new term.

Phase II added a second cohort of students enrolled in a hands-on, project-based course to provide a comparison group. The group is not a control because the students were from a different institution and at a different stage in their studies. As a comparison group, they were not taught an explicit modeling module but experienced a project-based design course that implicitly utilized various forms of modeling. The differences between the two samples were viewed as extraneous factors irrelevant to the learning experiences students were receiving at the two institutions. The comparison students provide a means to assess the gains made by the experimental group. Our hypothesis was that the pre-module conceptions of the experimental group would be similar to the conceptions of the comparison group, regardless of institution and academic-year differences.

Participants

Phase I of our study investigated the effect of an explicit modeling module on senior engineering students. Students were enrolled in a capstone design course at a highly selective mid-sized private university in the Midwest. The course had 76 students who completed a series of activities embedded in the course syllabus. Students were allowed to voluntarily participate in the study; 48 agreed to participate and responded to every component of the study (63% response rate). No demographic data were collected because the tasks were assignments within the course.

Phase II of our study added a comparison group of second-year engineering students at a large public university in the Southwest. The cohort consisted of 60 students enrolled in a project-based design course (97% response rate). These students were chosen because of the implicit modeling activities embedded in the project course.

Data Collection

Students in the experimental group (Phase I) were asked at the beginning and end of the course to respond to two open-ended questions regarding their conceptions of modeling in design:

Question 1 Describe different ways to model a design solution or idea.

Question 2 In what ways can models be useful/helpful in the design process?

The questions were designed to identify general conceptions of modeling and modeling use prior to and after the course's new modeling module.

The comparison group (Phase II) was asked the same questions as the experimental group at the end of their course. In Phase II we focus on comparing responses provided at the end of the course between two different groups to explore similarities and differences in conceptions of modeling. Both sets of data were collected using an online surveying tool.

Data Analysis

An open-coding approach consisting of labeling concepts, defining categories, and developing dimensions was used to identify codes that emerged from student responses to both Question 1 (Ways to Model) and Question 2 (Models in Design) (Glaser & Strauss, 1967; Miles & Huberman, 1984). A single rater first read each student's response to determine a set of codes that could summarize the overall sample's conceptions. These codes were compiled into a rubric, which was then used to code each student's response to identify what types of models they mentioned and how they use models in design. A second rater then used the rubric to test its reliability across raters, repeating a two-step process of coding 10% of the responses with the rubric and consulting the first rater's codes until agreement was reached. Changes to the rubric were made to establish 100% interrater reliability between the two raters.
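The interrater check described above amounts to comparing two raters' binary code assignments across a sample of responses. A minimal sketch, with invented ratings (the study's actual coded responses are not reproduced here):

```python
# Illustrative sketch of the binary coding and percent-agreement check:
# each rater marks a given code as present (1) or absent (0) for each
# response. The rating vectors below are invented for illustration.

rater1 = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
rater2 = [1, 0, 1, 0, 0, 1, 0, 0, 1, 1]

def percent_agreement(a, b):
    """Fraction of responses on which both raters assign the same value."""
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return matches / len(a)

# The two raters disagree on one of ten responses -> 90% agreement;
# in the study, the rubric was revised until agreement reached 100%.
print(f"Agreement: {percent_agreement(rater1, rater2):.0%}")
```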

Six codes emerged from the data for Question 1 (Table 1). Physical Models included student statements that mentioned tangible artifacts, such as prototypes, mockups, artwork (e.g., drawings, sketches), systematic diagrams, and charts or graphs (example: “Physical models may be used to prototype or mock-up a design idea or solution”). Computer Models included student references to computer-aided design (CAD) drawings that conceptualize theoretical ideas (examples: “It is also common to use computer-aided design software to create more detailed and exact sketches” and “Computer simulations can be conducted that test how the design responds to physical laws of the universe”). Mathematical Models consisted of ideas represented by mathematical equations and calculations (example: “Construct a numerical model and use calculations to predict outcomes”). Theoretical/Conceptual Models represent untested ideas based on what is known about the real world (example: “[A model] can also be represented as a conceptual entity”). Written Descriptions are models in the form of written words (example: “… can be a step-by-step written process connecting the idea to the problem and explaining the qualities that contribute to the solution”). The final code, Design Process, represented instances when a respondent assumed that the entirety of the design process was equivalent to modeling (example: “Start with the problem, and the current/known solution. Branch out to solutions to other problems that can be adapted to the current problem …”). The coding rubric was defined on the basis of the existence of particular statements by respondents and the fit of responses within a category.

Table 1. Student Responses for Question 1 (Ways to Model)

Code                     Pre-module (%)   Post-module (%)
Physical                 94               85
Computer                 58               38
Mathematical             19               98
Theoretical/Conceptual   10               15
Written                  19                2
Design Process           13                0

Twenty-three codes were identified for Question 2 (Models in Design). For organizational purposes we have grouped the codes into five summative categories: (1) Project Management (time management and cost consideration); (2) Testing & Evaluation (feasibility, test performance, understand the solution, understand the problem, and confirm requirements); (3) Description & Display (documentation, communicate design idea(s), visualization, and concrete/physical design); (4) Development (optimization, feedback from stakeholders, simplify the design, improvement, interaction, decision making, iteration, establish alternatives, and implementation); and (5) Prediction (estimate process or performance, simulate, and predict). These categories represent several design process activities and serve as a shorthand way to group the data for discussion purposes.

Results

Phase I: Pre- and Post-Module Analysis

Our analysis for Phase I includes an examination of our experimental group's responses before (pre-module) and after (post-module) the modeling module. Responses were coded by assigning a value of 1 when a code was present and 0 when it was not.

Question 1 (Ways to Model) The majority of students consistently focused on descriptive models – physical representations, computer models, and written descriptions, with major emphasis on prototypes, mockups, and artwork – for both their pre- and post-module responses (Table 1). Very few students referred to predictive models (mathematical models and theoretical models) before the modeling module. A significant shift was observed in the students' responses after the modeling module in reference to mathematical models. This finding was likely due in part to the specific instruction regarding mathematical modeling.

A paired-samples t-test was conducted to compare responses before and after the modeling module (Table 2). Significant differences between responses were found for Computer Models, Written Descriptions, and Mathematical Models. Effect sizes were medium for Computer Models and Written Descriptions (0.09 ≤ r2 < 0.25), while Mathematical Models displayed a large effect size (r2 ≥ 0.25). Every category except Mathematical Models decreased in the second assessment. These results reveal that students reported mathematical models far more often after the modeling module than they did beforehand, while computer models and written descriptions were reported significantly less often. The remaining categories were cited consistently between assessments.
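The paired-samples statistics above can be sketched directly from the 0/1 coded responses. A common form of the reported effect size is r2 = t2 / (t2 + df); we assume that is the form used here, and the code vectors below are invented for illustration, not the study's data.

```python
# Sketch of the paired-samples t statistic and the effect size
# r^2 = t^2 / (t^2 + df) for pre/post binary code vectors.
# The pre/post vectors are hypothetical, not the study's data.

import math

def paired_t(pre, post):
    """Paired-samples t statistic on the pre-minus-post differences."""
    n = len(pre)
    diffs = [a - b for a, b in zip(pre, post)]
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    return mean_d / math.sqrt(var_d / n)

def effect_size_r2(t, df):
    """r^2 = t^2 / (t^2 + df); ~0.09 medium, ~0.25 large, as in the text."""
    return t ** 2 / (t ** 2 + df)

# 1 = code present, 0 = code absent, for eight hypothetical students.
pre  = [0, 0, 1, 0, 0, 0, 1, 0]
post = [1, 1, 1, 1, 0, 1, 1, 1]

t = paired_t(pre, post)
df = len(pre) - 1
# A negative t here reflects an increase from pre to post, matching
# the sign convention of the negative t values reported in Table 2.
print(f"t({df}) = {t:.3f}, r^2 = {effect_size_r2(t, df):.3f}")
```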

Table 2. Paired-Samples t-test Analysis for Question 1 (Ways to Model)

Code                     t (47)      Effect size (r2)
Physical                 1.430       0.04
Computer                 2.217a      0.09
Mathematical             −13.364c    0.79
Theoretical/Conceptual   −0.628      0.01
Written                  3.066b      0.17
Design Process           0.703       0.01

Note: a p ≤ 0.05; b p ≤ 0.01; c p ≤ 0.001.

Question 2 (Models in Design) Using the five categories, we found that Project Management and Prediction increased after the modeling module, while Testing & Evaluation, Description & Display, and Development all decreased (Table 3). A paired-samples t-test showed that the increases for Project Management, t (47) = −4.497, p (2-tailed) ≤ 0.001, r2 = 0.30, and Prediction, t (47) = −2.424, p (2-tailed) ≤ 0.05, r2 = 0.11, were both significant with medium to large effect sizes. The remaining three categories, while decreasing, remained prominent with small effect sizes: Testing & Evaluation, t (47) = 0.684, p (2-tailed) = 0.50, r2 = 0.01; Description & Display, t (47) = 1.151, p (2-tailed) = 0.26, r2 = 0.03; and Development, t (47) = 1.939, p (2-tailed) = 0.06, r2 = 0.07. Broken down into subcategories, the pre-module responses identified Feasibility, Visualize, Understand the Solution, Improvement, and Test Performance as the most cited uses of models. After the modeling module, students cited Feasibility, Visualize, Understand the Solution, and Improvement less frequently; Test Performance increased, and Cost Consideration, Time Management, and Prediction became prominent. None of the remaining categories were cited by more than 17% of students.

Table 3. Student Responses for Question 2 (Models in Design)

Code                       Pre-module (%)   Post-module (%)
Project Management         13               52
  cost consideration       13               50
  time management           2               23
Testing & Evaluation       73               67
  feasibility              42               27
  test performance         27               38
  understand solution      29                6
  understand problem        6               13
  confirm requirements      0                2
Description & Display      48               38
  visualize                29               15
  make design concrete      8               17
  communicate design       13               10
  document                 13                4
Development                58               42
  improvement              27                8
  simplify                 15               15
  optimize                  0               10
  feedback                 13                4
  decision making          13                8
  interaction               0                2
  alternatives              0                2
  iteration                 2                4
  implementation            2                0
Prediction                 10               27
  predict                   8               23
  simulate                  0                4
  estimate                  2                2

A paired-samples t-test of the subcategories was conducted to compare responses before and after the modeling module. A significant difference was observed in the number of students that reported Cost Considerations, Improvement, Optimization, Prediction, Time Management, and Understand the Solution as uses for models after the modeling module (Table 4). Each category displayed a medium effect size, except Cost Consideration, which displayed a large effect size. Of the six significant categories, only Improvement and Understand the Solution decreased after the modeling module. The remaining categories were consistently cited between assessments.

Table 4. Paired-Samples t-test Analysis for Question 2 (Models in Design)

Code                      t (47)     Effect size (r2)
Cost Consideration        −4.289c    0.28
Improvement               2.441a     0.11
Optimize                  −2.338a    0.10
Predict                   −2.452a    0.11
Time Management           −3.142b    0.17
Understand the Solution   2.861b     0.15

Note: a p ≤ 0.05; b p ≤ 0.01; c p ≤ 0.001.

Phase II: Experimental and Comparison Student Analysis

Phase II of our analysis compared responses from our experimental group analyzed in Phase I with a comparison group of students.

Question 1 (Ways to Model) Students from the experimental group were compared with students in the comparison group to further extend the applicability of the changes seen in the experimental group. To ensure the groups were comparable, we conducted a chi-square (χ2) analysis between the experimental group's responses before the modeling module and the comparison group's single set of responses; no significant differences for any of the six categories were found.
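The χ2 comparisons here are between two independent groups on a dichotomous outcome (code cited or not cited), i.e., a 2×2 contingency table. A minimal sketch with invented counts (not the study's data):

```python
# Pearson chi-square for a 2x2 table comparing how often two
# independent groups cite a given code. The counts below are
# hypothetical, not the study's data.

def chi_square_2x2(a_yes, a_no, b_yes, b_no):
    """Chi-square statistic for a 2x2 table of cited/not-cited counts."""
    n = a_yes + a_no + b_yes + b_no
    row_a, row_b = a_yes + a_no, b_yes + b_no        # group totals
    col_yes, col_no = a_yes + b_yes, a_no + b_no     # outcome totals
    chi2 = 0.0
    for obs, row, col in [(a_yes, row_a, col_yes), (a_no, row_a, col_no),
                          (b_yes, row_b, col_yes), (b_no, row_b, col_no)]:
        expected = row * col / n
        chi2 += (obs - expected) ** 2 / expected
    return chi2

# Hypothetical: 40 of 48 students in group A cite a model type,
# versus 20 of 60 in group B.
print(f"chi2 = {chi_square_2x2(40, 8, 20, 40):.3f}")
```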

As in the Phase I analysis, the experimental group's post-module responses and the comparison group's responses consistently cited physical representations as models (Table 5). The χ2 analysis (Table 6) between the two groups identified the same significant differences for Computer, Mathematical, and Written Models, along with an additional difference in Theoretical/Conceptual Models. The experimental group cited Theoretical/Conceptual and Mathematical Models more often than did the comparison group, while the comparison group cited Computer and Written Models more often. Computer and Mathematical Models displayed large effect sizes (≥0.5), while Written and Theoretical/Conceptual Models displayed medium effect sizes (≥0.3). These results match the Phase I analysis and clearly indicate a positive effect of the modeling module on students' inclusion of predictive-type models in their responses.

Table 5. Experimental and Comparison Student Responses for Question 1 (Ways to Model)

Code                     Experimental (%)   Comparison (%)
Physical                 85                 92
Computer                 38                 65
Mathematical             98                 32
Theoretical/Conceptual   15                  3
Written                   2                 13
Design Process            0                  5
Table 6. Chi-Square Analysis for Question 1 (Ways to Model)

Code                     χ2         Effect size
Physical                 1.827      0.17
Computer                 8.092b     0.61
Mathematical             49.249c    0.98
Theoretical/Conceptual   4.418a     0.39
Written                  4.418a     0.39
Design Process           0.489      0.05

Note: a p ≤ 0.05; b p ≤ 0.01; c p ≤ 0.001.

Question 2 (Models in Design) The categories that emerged from the Phase I analysis were used again for the Phase II analysis. As with Question 1, a χ2 analysis of the experimental group's pre-module responses and the comparison group's responses confirmed that the groups were comparable; no significant differences were observed. The end-of-course comparison revealed that Prediction, Development, and Project Management were more highly cited by the experimental group, while the remaining categories were more highly cited by the comparison group (Table 7). A χ2 analysis (Table 8) identified significant differences with large effect sizes (≥0.5) for Prediction and Description & Display. The difference in the Prediction category was consistent with the Phase I analysis and reinforces our finding that the intervention significantly influenced the experimental group's awareness of predictive models. The significant difference for Description & Display suggests that the comparison group's conception of modeling uses is heavily descriptive compared with the experimental group's. The remaining categories were not significantly different, although Testing & Evaluation approached significance (p = 0.058) with a medium effect size (≥0.3).

Table 7. Experimental and Comparison Student Responses for Question 2 (Models in Design)

Code                    Experimental (%)   Comparison (%)
Project Management      27                 22
Testing & Evaluation    42                 58
Description & Display   38                 63
Development             67                 58
Prediction              52                 24
Table 8. Chi-Square Analysis for Question 2 (Models in Design)

Code                    χ2         Effect size
Prediction              12.188b    0.76
Development             0.130      0.01
Description & Display   9.124a     0.66
Testing & Evaluation    3.590      0.33
Project Management      1.175      0.11

Note: a p ≤ 0.01; b p ≤ 0.001.

Discussion

Modeling is inherent in the engineering design process; however, our initial findings suggest that students often do not have nuanced conceptions of the full power and use of models. Our Phase I analysis identified an overwhelmingly descriptive notion of modeling held by the sample of students prior to the module. Students are clearly capable of model-based reasoning when identifying the need for physical models in design, which suggests that they adapt knowledge of modeling gained through coursework and other experiences. We believe that the physical nature of modeling shown in the pre-module assessments is more than just semantics: these results stem largely from students' coursework experiences and from the influence of the everyday, common use of the term modeling. They reflect not just what is present in the curriculum but also what is absent or tacit.

We identified a misconception when some student responses revealed the belief that modeling is equivalent to the design process. We believe this unanticipated finding results from the typically implicit teaching of modeling. Modeling and design are both processes, but the steps of mathematical modeling defined by Gainsburg (2006) occur throughout the design process and are not equivalent to it. While representations of the design process are indeed models, viewing modeling as equivalent to the design process treats a model as an approach to solving a problem rather than as a process used to enhance problem solving. The design process may include modeling as a necessary component or task, depending on the nature of the problem; however, the design process includes more than just modeling.

When an explicit modeling module was introduced to the experimental group of students in this study, a shift occurred in their conceptions of modeling. Students' conflation of modeling with the design process was alleviated; no students described modeling as the design process in their post-module responses. We infer that the students realized that modeling is a tool used in design rather than the process itself. Descriptive models remained highly cited in both the pre- and post-module responses, and mathematical modeling became far more prevalent in the post-module responses. The significant increase in citing mathematical models was strongly influenced by the mathematical nature of the module; however, the results suggest that the mathematically based activities positively affected students' connection of more abstract predictive models to modeling. This result did not appear to diminish students' recognition of the need for descriptive forms of modeling. The frequent inclusion of predictive models in students' post-module responses indicates that students incorporated the new knowledge into their understanding of modeling and modeling uses in design.

We did observe a significant decrease in Computer Models (particularly CAD models) and Written Descriptions in student responses after the modeling module. We hypothesize that the novelty of the mathematical modeling activity shifted the focus of the students' responses and modified their original notion of modeling. It is also possible that students came to see mathematical models as having a greater impact on their designs than computer-aided design drawings or written descriptions.

We also observed changes in students' conceptions of modeling uses after the modeling module. A significant shift occurred for students who identified models as used for optimization and prediction. Students continued to identify models as used for visualization, testing, and feasibility assessment without a significant decrease after the modeling module. The increased attention given by students to cost consideration and time management is likely associated with the class instruction related to expenses and time saved through mathematical modeling.

Our findings suggest that the student focus on visualization and testing as model uses is a product not only of semantics but also of course experiences. When modeling is made explicit to students, they obtain a clearer understanding of model-based reasoning in engineering design. Making modeling steps explicit can help students recognize the value of the predictive nature of modeling and change their conception of modeling. Students can become more adaptive experts in their use of models when they understand the full range of what modeling can provide for design.

Our Phase II analysis added a validation step to help expand the generalizability of the Phase I results. Comparison of the experimental and comparison groups' responses supported the hypothesis that the two groups began with equivalent conceptions of ways to model a design and of how models can be used in design. The academic standing of the students had no effect on these conceptions. This finding supports our claim that implicit teaching of modeling leaves students with underdeveloped understandings of modeling throughout their formal engineering education. Comparison of the two groups' responses identified trends consistent with the effect the module had on the experimental group. Predictive-type models were cited far more frequently by the experimental group than by the comparison group, while the comparison group cited description and display as model uses at a higher rate than the experimental group. This finding is not surprising, since the experimental group completed an explicit mathematical modeling activity that emphasized abstraction and prediction, whereas the course for the comparison group expected students to use model-based reasoning in their design process but paid no particular attention to mathematical modeling. The results also show that the experimental group's focus on project management increased after the modeling module, although the difference from the comparison group was not significant. This result suggests that the module may have influenced the experimental students' attention to project management.

Conclusions, Implications, and Future Work


Modeling is an important part of engineering and the design process. Our findings suggest that students often lack nuanced conceptions of the full power and uses of models in the context of design. Their everyday experiences with the terms model and modeling lead them to conceive of models as physical in nature. Our analysis of student responses prior to an explicit modeling module indicates that students describe multiple ways to use models in a design solution and that most students view models as a means to visualize and test their solutions. This observation suggests that students adapt their real-life understanding of modeling to their engineering design problems. Yet modeling is much more than physical constructions for visualization and testing. When students were taught an explicit mathematical modeling module, their responses included abstract predictive modeling in the context of design. These findings suggest that modeling should be explicitly taught to engineering students throughout their formal engineering education.

While our work has provided useful insights, additional studies are needed to further investigate the modeling conceptions of engineering students. Adding cross-disciplinary, multiple-year, and demographic data would extend our results to inform how modeling might be taught throughout the engineering curriculum. How an explicit modeling module might be incorporated depends greatly on the current curriculum at an institution. In some instances the change may be as simple as making analysis-focused courses more explicit about how modeling is used in setting up a problem and deriving a solution. In others it may be necessary to embed modeling as a core component of design-focused courses. Changes in the way modeling is taught can improve the way students perceive modeling and increase their modeling expertise.

As we continue this research, we aim to shed further light on how to teach modeling so that the implicit, process-oriented activities of modeling receive as much attention as the tools and products of modeling, thereby helping students develop more sophisticated conceptions of an important and core engineering skill.

Acknowledgments


This work was supported by the National Science Foundation Engineering Education Program (EEC) grants 0648316, 1110453, and 1118659. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation. The authors would also like to thank Guillermo Ameer, Jennifer Cole, Matthew Glucksburg, David Kelso, and Robert Linsenmeier for their integral roles in the dissemination of the surveys and modeling intervention.


Biographies

  • Adam R. Carberry is an assistant professor in the Department of Engineering & Computing Systems in the College of Technology and Innovation, Arizona State University, 7171 East Sonoran Arroyo Mall, Peralta Hall, Suite 330, Mesa, Arizona, 85212; adam.carberry@asu.edu.

  • Ann F. McKenna is a professor and chair in the Department of Engineering & Computing Systems in the College of Technology and Innovation, Arizona State University, 7171 East Sonoran Arroyo Mall, Peralta Hall, Suite 330, Mesa, Arizona, 85212; ann.mckenna@asu.edu.