Interactive classroom Internet sites such as Blackboard are now widely used in college courses. These sites offer a number of interactive technologies; in particular, they are useful for disseminating information and for communicating quickly and easily with individual students and with the class as a whole. One feature that is generally underutilized on interactive classroom sites is the online survey option. Although this feature was developed so that instructors could survey the class on any topic, it is typically used only for course evaluations.

In this article, I will describe how an online survey can be used to teach undergraduate Psychology of Women students how to conduct social science survey research related to the topics covered in their textbook. Specifically, I use online surveys in a large (N = 150) undergraduate Psychology of Women course to (a) survey male and female students about topics we will cover in class in order to provide lecture material on how students' opinions differ from the class readings, (b) demonstrate how to analyze the results of simple quantitative items by gender, (c) demonstrate how to identify themes in the answers to open-ended questions by gender, and (d) teach students how to write up survey information in a research paper. I should point out that even instructors teaching much smaller classes (e.g., 15–20 students) can use this method, as long as there are enough students to make survey items meaningful (data could also be collected across classes when there are multiple sections of the same course).

This assignment has some additional benefits. Students in Psychology of Women courses often consider published research studies outdated, even if those studies were conducted just a few years ago. In contrast, the online surveys were conducted just weeks or months earlier. Additionally, students tend to believe that research described in their textbook is not true of their campus or of their peer group. Class surveys are thus particularly convincing because the sample consists entirely of their classmates. Finally, the assignment is a rare opportunity to see how different people interpret the same data and to explain how these different realities are part of feminist pedagogy.

COMPLETING ONLINE SURVEYS

My course syllabus for Psychology of Women indicates that students have 3 weeks at the beginning of the semester to complete six online surveys (supplemental materials are available online) on the interactive Web site Blackboard for a small amount of course credit. This time period spans the entire drop/add period so that even students on the waiting list (who cannot yet be enrolled on Blackboard) have a chance to complete the surveys once they are admitted into the course. Because Blackboard cannot analyze items by gender within a single questionnaire, and because the main purpose is to analyze gender differences, each survey has a male version and a female version with identical items. If a student does not identify as a man or as a woman, or does not wish to complete the surveys, the syllabus indicates that they should e-mail me, and I will then give them a short assignment for equal course credit.

The first survey is labeled Practice Survey–Women or Practice Survey–Men, and I use the results of this survey to demonstrate how to analyze surveys. The other five surveys are labeled Gender and Language, Gender and Relationships, Gender and the Body, Gender and Sports, and Gender and Students, each one with versions for men and women. Surveys are very short, and I focus on items that are (a) easy to understand and interpret, (b) not personally intrusive or potentially embarrassing to students, and (c) related to topics covered in the course textbook. Each survey has a few yes/no or multiple-choice items and a few open-ended items. For example, the Practice Survey focuses on items related to my lecture on transgender issues. Items are: (1) Have you ever tried on clothes of the other gender in a store? (yes/no/not sure), (2) If yes, which changing room did you use? (women/men/other), (3) Has someone ever mistaken you for someone of the other gender? (yes/no/other), (4) If so, what were the circumstances? (open ended), (5) Have you ever tried to do something reserved for members of the other gender (e.g., join a club or sports team)? (yes/no/not sure), (6) If yes, what was it? (open ended), (7) Have you ever met someone whose gender was unclear to you? (yes/no/not sure), (8) What was that like for you? (open ended), (9) Did you ever play a different gender in a play? (yes/no/not sure), (10) Describe what role you played (open ended), (11) Did you ever dress up as a different gender at Halloween? (yes/no/not sure), and (12) Describe your Halloween character (open ended). These items convey the range of ways that students have in fact transgressed gender while at the same time portraying this in a fairly innocuous manner.

DEMONSTRATING ITEM ANALYSES

As soon as the add/drop period for the course is over and the deadline for completing online surveys has passed, the online survey option on Blackboard is closed to students. I then use the results of the Practice Survey in my research methods lecture to demonstrate how to analyze, interpret, and write up research.

For the quantitative items, I remind students how to calculate percentages for each item separately for the male and the female samples (some instructors may wish to teach students how to calculate chi-squares at this point, but I do not do this in an introductory course). I show how percentages are a useful way of comparing men and women in a class in which the overwhelming majority of students are female. I also explain how the relatively small number of men affects reliability, so that even one male student changing a response from yes to no, for example, markedly shifts the percentage for men overall. Although students do not use statistical tests, I explain how gender differences may be large or small and how those differences are open to interpretation. For example, if 50% of women have dressed up as the other gender for Halloween versus 40% of men, do we see this as a meaningful gender effect, even if statistically significant?
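
The arithmetic here is simple enough to sketch in a few lines. The counts below are hypothetical, and the small chi-square example uses SciPy; neither is part of the actual course assignment, which relies only on hand-calculated percentages.

```python
# Minimal sketch (hypothetical counts): computing percentages by gender for a
# yes/no item, and showing how sensitive the male percentage is to a single
# response when the male sample is small.
from scipy.stats import chi2_contingency  # optional chi-square, as noted above

# Hypothetical counts for "Did you ever dress up as a different gender at Halloween?"
women_yes, women_no = 65, 65   # 130 women
men_yes, men_no = 8, 12        # 20 men

def pct(yes, no):
    return 100 * yes / (yes + no)

print(f"Women yes: {pct(women_yes, women_no):.1f}%")  # 50.0%
print(f"Men   yes: {pct(men_yes, men_no):.1f}%")      # 40.0%

# One man switching from no to yes shifts the male percentage by 5 points,
# illustrating why percentages based on small samples are unstable.
print(f"Men yes (one switch): {pct(men_yes + 1, men_no - 1):.1f}%")  # 45.0%

# Optional chi-square test of the 2x2 table (not used in the introductory course).
chi2, p, dof, expected = chi2_contingency([[women_yes, women_no],
                                           [men_yes, men_no]])
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
```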

To demonstrate how to interpret the qualitative items, I list sample responses to the open-ended questions by male and female students, respectively, and ask students to develop themes. I emphasize that some themes hold true only for women or only for men, whereas others go across gender. Each year, some students identify very general themes and others find multiple subthemes. For example, when I show sample responses for the item about how students felt when they met someone whose gender was unclear to them, some students will find two themes: “positive reactions” versus “negative reactions.” Other students will mention subthemes such as “awkward/embarrassing,” “angry/suspicious,” “uncertain how to address them/refer to them,” “no big deal,” and “pleased/proud.”
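
Once responses have been hand-coded with theme labels, tallying them by gender is mechanical. The sketch below uses hypothetical coded responses and theme names; in the course itself, students do this tallying by hand.

```python
# Minimal sketch (hypothetical data): tallying hand-coded themes for an
# open-ended item by gender, after each response has been assigned one or
# more theme labels by the student coder.
from collections import Counter

# Each entry is the list of themes a coder assigned to one response.
coded_women = [["awkward/embarrassing"], ["no big deal"],
               ["uncertain how to address them"], ["no big deal"]]
coded_men = [["angry/suspicious"],
             ["awkward/embarrassing", "uncertain how to address them"]]

def theme_counts(coded_responses):
    """Count how often each theme appears across a set of coded responses."""
    return Counter(theme for response in coded_responses for theme in response)

print("Women:", theme_counts(coded_women))
print("Men:  ", theme_counts(coded_men))
```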

I then ask students about their interpretations of the quantitative and qualitative results—what do these imply about gender differences in how male and female students are allowed to dress up as a member of the other gender for plays, for Halloween, etc.? Each year the results vary, but in general the Practice Survey is a good exercise to show differences in gender flexibility (usually women have more leeway in dressing up as the other gender at Halloween, playing the role of the other gender in a play, and trying on clothes of the other gender in a store). It is also a good exercise to show variability within gender (usually some women and men felt quite comfortable not knowing someone's gender whereas others felt embarrassed or awkward). Finally, it makes the transgender lecture less exotic because the majority of students have experienced some gender transgression, according to their answers on the Practice Survey.

STUDENTS SELECTING A SURVEY TO ANALYZE

The syllabus indicates that students should e-mail me their top three survey choices, so that I can assign them one survey to analyze. One advantage of students having completed all six surveys (the Practice Survey as well as the other five surveys) is that it gives them a very good sense of which surveys they would be most interested in analyzing.

With five possible surveys to choose from and 150 students, I like to divide surveys equally (i.e., 30 students analyzing each of five surveys). Each year the most popular surveys are Gender and Relationships and Gender and the Body. As soon as a survey has 30 students assigned to analyze it, I assign subsequent students their second choice, and so on. Technically there is no need to assign students to surveys in equal numbers, but there are two advantages to dividing survey topics equally. First, it yields more examples of good papers across survey topics to use in class demonstration and, second, it makes grading less tedious.
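
The assignment logic is straightforward: process e-mails in the order they arrive and give each student the highest-ranked survey that still has open seats. A minimal sketch, with hypothetical student names and choices, follows.

```python
# Minimal sketch (hypothetical names and choices): assigning each student their
# highest-ranked survey that still has open seats, processing students in the
# order their e-mails arrived, with equal caps per survey.
SURVEYS = ["Gender and Language", "Gender and Relationships", "Gender and the Body",
           "Gender and Sports", "Gender and Students"]
CAP = 30  # 150 students divided across 5 surveys

def assign(preferences):
    """preferences: list of (student, [1st, 2nd, 3rd choice]) in e-mail order."""
    seats = {survey: CAP for survey in SURVEYS}
    assignment = {}
    for student, choices in preferences:
        for choice in choices:
            if seats[choice] > 0:
                seats[choice] -= 1
                assignment[student] = choice
                break
        else:
            # All three choices are full; fall back to any survey with seats left.
            fallback = next(s for s in SURVEYS if seats[s] > 0)
            seats[fallback] -= 1
            assignment[student] = fallback
    return assignment

example = [("Student A", ["Gender and the Body", "Gender and Relationships",
                          "Gender and Sports"])]
print(assign(example))
```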

MAKING SURVEY RESULTS ACCESSIBLE TO STUDENTS

Blackboard automatically calculates percentages and means for online surveys, which is convenient for everyone except instructors who want students to learn how to compute these figures themselves. Thus, my teaching assistant converts the percentages back into raw counts. She creates a file for each survey that includes the following information: (a) the number of female students who completed each survey item, (b) the number of responses by female students to each option of each item (e.g., for the Practice Survey above, the number who answered yes, no, and not sure), and (c) every response by each female student to every open-ended item (Blackboard provides these in the form of a list, with no names or other identifying information). She then creates a file with the same information for the male students who completed each survey. Students are e-mailed these survey results 2 weeks before their survey paper is due, and the course syllabus includes detailed instructions on how to analyze quantitative and qualitative survey items and how to write up the survey paper.
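
Converting a reported percentage back into a count requires only the number of respondents. The sketch below is a minimal illustration with hypothetical figures; the data structure shown is not Blackboard's actual export format.

```python
# Minimal sketch (hypothetical figures; not Blackboard's actual export format):
# converting the percentages Blackboard reports back into raw counts so that
# students can practice computing percentages themselves.
def percent_to_count(percent, n_respondents):
    """Recover the approximate count behind a reported percentage."""
    return round(percent / 100 * n_respondents)

# Hypothetical item from the women's version of the Practice Survey:
n_women = 130
reported = {"yes": 50.0, "no": 42.3, "not sure": 7.7}  # percentages as reported

counts = {option: percent_to_count(p, n_women) for option, p in reported.items()}
print(counts)  # {'yes': 65, 'no': 55, 'not sure': 10}
```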

INCORPORATING SURVEY RESULTS IN CLASS LECTURES

I stagger deadlines so that survey papers are due about 3 weeks before the date when I lecture on that survey topic in class. The only way to do that fairly is to also stagger the date when students first receive survey results, so that students whose papers are due later do not have more time to write their paper. There is no need to do things this way, but in my case it means I do not need to grade 150 papers at once; instead, there are five times during the semester when I am grading 30 papers. This also means that if a student has an emergency and is unable to work on the survey paper, I can assign that student a survey due later in the semester (given that I go over survey results in class, it is not possible for students to hand in survey papers more than a week late without knowing the results). Finally, this means that, when I go over survey paper results in class, students analyzing that survey have only recently worked on their paper and are familiar with the results.

As soon as I have graded the survey papers, I select two or three of the best ones to incorporate into my class lecture on that topic. I always begin with a slide that thanks by name every student researcher who worked on that particular survey. Then I present the results of each quantitative item by gender and list the name of the student whose survey paper I used for this task. For the open-ended (qualitative) items, different students come up with different themes, and so I usually show a couple of good ways in which students handled that part of the survey. Then I ask the class to interpret the findings in general—what do the results imply about gender differences and similarities on that topic? Finally, during my lecture, I compare the class findings to data from the readings or other literature. Sometimes the class findings are nearly identical to the readings and the general literature; sometimes they are completely different (sparking discussion of possible explanations for the difference). Either way, it is an interesting way to show how class opinions and experiences compare with national standardized surveys or other data.

DEMONSTRATING DIFFERENT WAYS TO ANALYZE THE SAME DATA

This assignment was intended to teach students how to analyze and interpret quantitative and qualitative data, so that the research described in the course readings on the Psychology of Women would not remain an abstract exercise. In fact, this assignment has taught me a lot as a researcher. Rarely, if ever, do researchers have the luxury of seeing how 30 people go about analyzing and writing up the same research results. Of course, students vary widely in such abilities as writing style and knowledge of the relevant research literature. Far more importantly, however, a wide range of other factors affects the quality of the survey papers. The following sections identify some of the more common ones.

Ways to Present Quantitative Results

Some students develop tables, others graphs, and yet others pie charts to present the survey findings. Some students put all data into one table; others use a separate table for the results of each item. Students often use color-coded pie charts, with different colors for each item or for each gender. Students have even presented pie charts so that the size of the chart corresponds to the percentage of respondents, with larger pie charts for more frequently answered items. Students have used icons on graphs—for example, stick figures who were swimming, running, playing team sports, etc.—for the item on the Gender and Sports Survey that asks respondents about sports they currently do. This exercise has also shown me what is not effective. For example, graphs and pie charts are visually appealing, but it is not always easy to see exact percentages when data are presented that way.
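
As a concrete illustration of the trade-off mentioned above, the sketch below plots a grouped bar chart, which keeps exact percentages easier to read than a pie chart. All numbers and item labels are invented for the example.

```python
# Minimal sketch (hypothetical percentages and item labels): a grouped bar
# chart comparing women's and men's "yes" percentages across survey items.
import matplotlib.pyplot as plt
import numpy as np

items = ["Tried on other\ngender's clothes", "Dressed up at\nHalloween",
         "Played role\nin a play"]
women_pct = [62, 50, 35]  # hypothetical
men_pct = [28, 40, 22]    # hypothetical

x = np.arange(len(items))
width = 0.35
fig, ax = plt.subplots()
ax.bar(x - width / 2, women_pct, width, label="Women")
ax.bar(x + width / 2, men_pct, width, label="Men")
ax.set_ylabel("Percent answering yes")
ax.set_xticks(x)
ax.set_xticklabels(items)
ax.legend()
plt.tight_layout()
plt.show()
```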

Ways to Present Qualitative Results

Some students develop a table that lists themes for each open-ended item by gender, and then give examples. Other students list all themes and then use color-coded ink to show which themes were more commonly mentioned by women versus men. Students have highlighted sample quotations in color to demonstrate where in the quote each theme was represented.
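
A theme-by-gender table of the kind some students build can be assembled from hand-coded responses in a few lines. The coded data below are hypothetical.

```python
# Minimal sketch (hypothetical coded data): building a theme-by-gender
# frequency table for one open-ended item, similar to the tables some
# students create by hand.
import pandas as pd

coded = pd.DataFrame({
    "gender": ["woman", "woman", "woman", "man", "man"],
    "theme": ["no big deal", "awkward/embarrassing", "no big deal",
              "angry/suspicious", "awkward/embarrassing"],
})

# Rows are themes, columns are gender, cells count how often each theme was coded.
table = pd.crosstab(coded["theme"], coded["gender"])
print(table)
```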

Ways to Go Beyond the Data

Each semester I am impressed with students who manage to find additional results beyond what I had expected when I created the surveys. For example, in the Gender and Language Survey, students often point out gender differences in the actual wording of respondents' open-ended answers (e.g., female respondents use more expressive language or refer to people in ways that are polite rather than profane), and they relate this insight to the research literature. In the Gender and Sports Survey, a student recently noticed that women engaged in more individual sports and men in more team sports (the survey item just listed a variety of different sports). Other students grouped these categories into winter versus summer sports, indoor versus outdoor sports, etc.
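
Regrouping listed sports into broader categories, as these students did, amounts to applying a mapping and re-counting. The sketch below uses an illustrative sport list and category mapping, not the actual survey data.

```python
# Minimal sketch (illustrative sport list and mapping): regrouping the sports
# respondents listed into broader categories such as individual versus team.
from collections import Counter

CATEGORY = {
    "swimming": "individual", "running": "individual", "yoga": "individual",
    "soccer": "team", "basketball": "team", "volleyball": "team",
}

def regroup(responses):
    """Count responses by broader category, labeling unmapped sports 'other'."""
    return Counter(CATEGORY.get(sport, "other") for sport in responses)

women_sports = ["running", "yoga", "swimming", "volleyball"]  # hypothetical
men_sports = ["soccer", "basketball", "running"]              # hypothetical
print("Women:", regroup(women_sports))
print("Men:  ", regroup(men_sports))
```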

USING ONLINE SURVEYS TO DISCUSS FEMINIST RESEARCH

Undergraduate students have read many research articles, but most have limited or no experience transforming raw data into a research paper. Thus, students need to learn how to turn data into a story that has a logical progression, is easy for the informed reader to understand, and is also interesting.

I use examples from undergraduate Psychology of Women survey papers to demonstrate good and bad ways to write up each section of a research paper. This exercise is particularly easy in classrooms that have opaque projectors, because I can simply place that portion of a student paper on the projector and move through many papers in a short period of time. I do this to demonstrate persuasive introductions, good use of the relevant research literature, creative ways of presenting data in tables or graphs, logical ways of describing results in the text, creative use of quotations, and comprehensive conclusions. It would also be easy to scan sample sections into PDF files that can be pasted into PowerPoint slides.

I also discuss some weaknesses. For example, some students rely too heavily on long quotations; others barely use quotations at all. Sometimes the results are hard to follow; for example, students may not clearly describe the measures. Some papers contain too much background literature, or the literature is not quite relevant to the topic. Some students present the results very well but do not interpret them. Others stretch the data by making a great deal of very small effects. I ask students for their opinions on the best papers, and I point out that there is usually some variability in their evaluations. I describe how the peer-review process also results in heterogeneous ratings of a manuscript's quality (and thus why it is useful to have multiple reviewers).

CONCLUSION

In sum, although online surveys are included among interactive classroom technologies, they are rarely used to teach research methods. Using online surveys in undergraduate courses is an easy way to gather information without resorting to paper-and-pencil questionnaires, which would take longer to collate and evaluate and would require photocopying multiple copies of each survey, given that several students analyze each one. In addition, some students' handwriting is hard to read and some types of pencil are hard to photocopy. I have found that students in Psychology of Women classes often discount published research findings as not being true of their friends or their campus; class surveys, by contrast, reflect the reality of their own peer group. The online surveys can be used to teach quantitative and qualitative data analysis, interpretation, and written presentation. They are also a useful pedagogical tool for demonstrating good and bad ways to write up research results, and they demonstrate the subjectivity involved in conducting and interpreting research, a topic in feminist science. Finally, the ability to read multiple papers about the same data set has taught me a great deal about how to make research findings understandable in creative ways.

Esther Rothblum, Ph.D., is a professor of women's studies at San Diego State University and fellow of seven divisions of the American Psychological Association, including Division 2 (Teaching of Psychology) and Division 35 (Society of the Psychology of Women).

Supporting Information

The following supporting information is available for this article online: Sample surveys: (a) Transgender Survey, (b) Gender and Language Survey, (c) Gender and the Body Survey, (d) Gender and Relationships Survey, (e) Women and Men as Students Survey, and (f) Gender and Sports Survey.

Supporting information file: PWQU_1548_sm_Rothblum.doc (52 KB).

Please note: Wiley Blackwell is not responsible for the content or functionality of any supporting information supplied by the authors. Any queries (other than missing content) should be directed to the corresponding author for the article.