This article has been cited by:

  1. Alec M. Bodzin, Qiong Fu, Denise Bressler, Farah L. Vallera, Examining the Enactment of Web GIS on Students' Geospatial Thinking and Reasoning and Tectonics Understandings, Computers in the Schools, 2015, 32, 1, 63
  2. Richard J. Hift, Should essays and other "open-ended"-type questions retain a place in written summative assessment in clinical medicine?, BMC Medical Education, 2014, 14, 1
  3. James E. Carlson, What Differential Weighting of Subsets of Items Does and Does Not Accomplish: Geometric Explanation, ETS Research Report Series, 2014, 2014, 2
  4. Gabriel A. Reich, Imperfect models, imperfect conclusions: An exploratory study of multiple-choice tests and historical knowledge, The Journal of Social Studies Research, 2013, 37, 1, 3
  5. Lisa A. Keller, Ronald K. Hambleton, The Long-Term Sustainability of IRT Scaling Methods in Mixed-Format Tests, Journal of Educational Measurement, 2013, 50, 4
  6. Ross Guest, Towards Learning Standards in Economics in Australia, Economic Papers: A journal of applied economics and policy, 2013, 32, 1
  7. David M. Williamson, Xiaoming Xi, F. Jay Breyer, A Framework for Evaluation and Use of Automated Scoring, Educational Measurement: Issues and Practice, 2012, 31, 1
  8. Stephen Hickson, W. Robert Reed, Nicholas Sander, Estimating the Effect on Grades of Using Multiple-Choice Versus Constructive-Response Questions: Data From the Classroom, Educational Assessment, 2012, 17, 4, 200
  9. Wen-Chung Wang, Kuan-Yu Jin, Xue-Lan Qiu, Lei Wang, Item Response Models for Examinee-Selected Items, Journal of Educational Measurement, 2012, 49, 4
  10. Jörn R. Sparfeldt, Rumena Kimmel, Lena Löwenkamp, Antje Steingräber, Detlef H. Rost, Not Read, but Nevertheless Solved? Three Experiments on PIRLS Multiple Choice Reading Comprehension Test Items, Educational Assessment, 2012, 17, 4, 214
  11. Dimos Triantis, Errikos Ventouras, Higher Education Institutions and Learning Management Systems, 2012
  12. Dimos Triantis, Errikos Ventouras, Virtual Learning Environments, 2012
  13. Christopher M. Keller, John F. Kros, An Innovative Excel Application to Improve Exam Reliability in Marketing Courses, Marketing Education Review, 2011, 21, 1, 21
  14. Ou Lydia Liu, Hee-Sun Lee, Marcia C. Linn, An Investigation of Explanation Multiple-Choice Items in Science Assessment, Educational Assessment, 2011, 16, 3, 164
  15. Errikos Ventouras, Dimos Triantis, Panagiotis Tsiakas, Charalampos Stergiopoulos, Comparison of oral examination and electronic examination using paired multiple-choice questions, Computers & Education, 2011, 56, 3, 616
  16. Errikos Ventouras, Dimos Triantis, Panagiotis Tsiakas, Charalampos Stergiopoulos, Comparison of examination methods based on multiple-choice questions and constructed-response questions using personal computers, Computers & Education, 2010, 54, 2, 455
  17. A. Hwang, J. B. Arbaugh, Seeking feedback in blended learning: competitive versus cooperative student attitudes and their links to learning outcome, Journal of Computer Assisted Learning, 2009, 25, 3
  18. Gabriel A. Reich, Testing Historical Knowledge: Standards, Multiple-Choice Questions and Student Reasoning, Theory & Research in Social Education, 2009, 37, 3, 325
  19. Ross H. Nehm, Irvin Sam Schonfeld, Measuring knowledge of natural selection: A comparison of the CINS, an open-response instrument, and an oral interview, Journal of Research in Science Teaching, 2008, 45, 10
  20. Lynn Bible, Mark G. Simkin, William L. Kuechler, Using Multiple-choice Tests to Evaluate Students' Understanding of Accounting, Accounting Education, 2008, 17, sup1, S55
  21. Michael Scott, Tim Stelzer, Gary Gladding, Evaluating multiple-choice exams in large introductory physics courses, Physical Review Special Topics - Physics Education Research, 2006, 2, 2
  22. Adrian J. Bailey, What Kind of Assessment for What Kind of Geography? Advanced Placement Human Geography, The Professional Geographer, 2006, 58, 1
  23. Nancy L. Allen, Paul W. Holland, Dorothy T. Thayer, Measuring the Benefits of Examinee-Selected Questions, Journal of Educational Measurement, 2005, 42, 1
  24. Mark G. Simkin, William L. Kuechler, Multiple-Choice Tests and Student Understanding: What Is the Connection?, Decision Sciences Journal of Innovative Education, 2005, 3, 1
  25. Ching-Fung Si, Randall E. Schumacker, Ability Estimation Under Different Item Parameterization and Scoring Models, International Journal of Testing, 2004, 4, 2, 137
  26. Michael C. Rodriguez, Construct Equivalence of Multiple-Choice and Constructed-Response Items: A Random Effects Synthesis of Correlations, Journal of Educational Measurement, 2003, 40, 2
  27. Michael O'Leary, Stability of Country Rankings Across Item Formats in the Third International Mathematics and Science Study, Educational Measurement: Issues and Practice, 2002, 21, 4
  28. Anita Wester, Widar Henriksson, The interaction between item format and gender differences in mathematics performance based on TIMSS data, Studies in Educational Evaluation, 2000, 26, 1, 79
  29. J. R. Mulkey, H. F. O'Neil, The effects of test item format on self-efficacy and worry during a high-stakes computer-based certification examination, Computers in Human Behavior, 1999, 15, 3-4, 495
  30. William E. Becker, Carol Johnston, The Relationship between Multiple Choice and Essay Response Questions in Assessing Economics Understanding, Economic Record, 1999, 75, 4
  31. Michael Kane, Terence Crooks, Allan Cohen, Validating Measures of Performance, Educational Measurement: Issues and Practice, 1999, 18, 2
  32. Christine E. DeMars, Gender Differences in Mathematics and Science on a High School Proficiency Exam: The Role of Response Format, Applied Measurement in Education, 1998, 11, 3, 279
  33. Stephen G. Sireci, Howard Wainer, Henry Braun, Psychometrics, ETS Research Report Series, 1998, 1998, 1
  34. Peter Kennedy, William B. Walstad, Combining Multiple-Choice and Constructed-Response Test Scores: An Economist's View, Applied Measurement in Education, 1997, 10, 4, 359
  35. Howard Wainer, Robert Lukhele, Managing the Influence of DIF From Big Items: The 1988 Advanced Placement History Test as an Example, Applied Measurement in Education, 1997, 10, 3, 201
  36. Howard Wainer, Robert Lukhele, Managing the Influence of DIF From Big Items: The 1988 Advanced Placement History Test as an Example, ETS Research Report Series, 1996, 1996, 2
  37. Anita Wester, The Importance of the Item Format with Respect to Gender Differences in Test Performance: a study of open-format items in the DTM test, Scandinavian Journal of Educational Research, 1995, 39, 4, 335
  38. Howard Wainer, Xiang-Bo Wang, David Thissen, How Well Can We Compare Scores on Test Forms That Are Constructed by Examinees' Choice?, Journal of Educational Measurement, 1994, 31, 3
  39. Stephen G. Sireci, Howard Wainer, Henry Braun, Psychometrics, Overview, Encyclopedia of Biostatistics
  40. Stephen G. Sireci, Howard Wainer, Henry Braun, Psychometrics, Overview, Wiley StatsRef: Statistics Reference Online
  41. David D. Qian, Mingwei Pan, Response Formats, The Companion to Language Assessment