We conducted two experiments to explore how moderation, response rate, and message interactivity affected people’s intent to participate in a web-based online community. In our first experiment, 62 participants observed either a moderated or an unmoderated online community and answered questions about their intent to participate in the community. The participants who viewed the moderated community reported significantly higher intent to participate than participants who viewed the unmoderated community. In our second experiment, 59 participants observed a different online community in which we manipulated both the rate (in time) of posted comments and the interactivity of each comment. We derived our manipulation of interactivity from Rafaeli’s (1988) definition of interactivity as message contingency. Participants reported significantly greater intent to participate in an online community featuring interactive messages, but only when response rate was slow. These results indicate that both structural features of interfaces and content features of interactions affect people’s intent to participate in online communities.
Once the realm of the technically savvy, the Internet has become a practical communication channel for people of all abilities. While many people use the Internet to communicate with individuals through tools such as email and instant messaging, others use it to communicate with larger groups by posting their ideas to blogs, discussion groups, and other public websites. In fact, The Pew Internet and American Life Project recently reported that 44% of Internet users have contributed content to a website (Lenhart, Fallows, & Horrigan, 2004). Such websites are now generically referred to as online communities. While these communities vary significantly in terms of content or topic area, they share the goal of generating participation from those who visit.
Can certain features of an online community increase a person’s stated intent to participate in it? In the experiments that follow, we explored the effects of two structural features and one content feature on reported intent to participate in an online community. Structural features are attributes of the community that are not necessarily reflected in the content of online discussion. The structural features we explored here include the presence of moderation and the length of time that passes between successive posts. Content features, on the other hand, are attributes of the actual words and phrases that comprise the discussion. The content feature we examined was the level of interactivity of succeeding messages. We begin by reviewing how previous research has addressed these features, then describe and report the results of two experiments in which we manipulated these features and analyzed how these manipulations affected people’s intent to participate.
Online communities and participation
There is no single, accepted definition for online community. In this article, we use Porter’s (2004) definition of an online community as “an aggregation of individuals or business partners who interact around a shared interest, where the interaction is at least partially supported and/or mediated by technology and guided by some protocols or norms.” Given this definition, what motivates people to participate in virtual communities? One important factor emerging from previous research is the relevance of content to individual interests. For example, Ridings and Gefen (2004) asked participants in online communities a simple question: “Why did you join?” They found that people’s reasons for joining depended, in part, on the purpose or content of the community. Across all types of communities, information exchange was the most commonly cited reason for participation. People want access to information that interests them.
Similarly, Kaye (2005) examined the uses and motivations among users of weblogs, more commonly referred to as blogs. Her findings supported those of Ridings and Gefen (2004), in that people reported visiting blogs that focus on specific topics or endorse a particular point of view. These blogs provide a way for like-minded people to congregate and communicate. Kaye’s survey revealed that blogs are not only used for informational purposes, but also because they provide opportunity for interaction.
The importance of social interaction to individuals who participate in online communities explains why sociability may be a key element in determining the success or failure of an online community. A community that is visibly sociable should have more appeal than one that is not. How is the sociability of a community manifested on a computer monitor? We propose three features of online communities that may affect a person’s intent to participate: moderation, response rate, and message interactivity. We address each of these attributes in the following section.
Researchers have shown that with little more than text and images, online communities develop sophisticated norms that can guide thousands of members. Highly specific norms standardize attributes such as message length and response times (McLaughlin, Osborne, & Smith, 1995), message headers (Baym, 1995), and even the proper way to welcome new members (Honeycutt, 2005). The presence of these norms, however, is often insufficient to govern an online community. Studies into practices such as flaming and the dissolution of online communities have shown that deindividuation can lead to increased anti-social behavior (Lea, O’Shea, Fung, & Spears, 1992), such that correcting violations of norms can account for up to 15% of messages in some online communities (McLaughlin, et al., 1995). In some cases, harmful messages have even overwhelmed and shut down online discussions (The Los Angeles Times, 2005).
In order to minimize harmful messages, many online communities rely on a variety of formal structures for moderating messages, ranging from a professional editor to teams of volunteer moderators (Poor, 2005; Preece, 2000). Moderators describe normative functions such as keeping a conversation on topic and preventing harmful attacks as their primary roles (Berge & Collins, 2000). These may not be necessary in communities where other normative influences are sufficient (Maloney-Krichmar & Preece, 2005). As online communities grow, however, moderation systems become necessary (Lampe & Resnick, 2004). By reducing the presence of harmful or repetitive messages, moderators may create an environment that encourages participation (Nonnecke & Preece, 2001). Based on this line of reasoning, we propose the following hypothesis:
H1: A moderated online community will elicit greater intent to participate than an unmoderated online community.
Studies have shown that the speed with which new messages are posted can have varied effects on how participants perceive each other in computer-mediated communication. Researchers have identified two kinds of messages in computer-mediated communication (CMC): synchronous messages that are exchanged in real time, and asynchronous messages exchanged over time (Baym, 1995). Compared to synchronous communications, asynchronous messages have been shown to increase perceptions of a speaker’s credibility and involvement (Nowak, Watt, & Walther, 2005), but also to decrease perceptions of usefulness and persuasion (Ng & Detenber, 2005). Asynchronous messages may be more appropriate for online communities because they allow more time for reflection (Preece, 2001) and do not require all members of the community to gather at the same time (Blanchard, 2004). While some studies failed to show any differences between effects of message frequency in asynchronous communication (Baym, 1995; Ng & Detenber, 2005; Nowak, 2005), other research demonstrates that increased frequency of posting in asynchronous communications can lead to more favorable impressions of communication partners (Liu, Ginther, & Zellhart, 2001; Walther & Bunz, 2005).
Reduced Social Cues (RSC) research argues that participants in CMC use contextual cues of the environment to make up for the absence of social cues normally available in face-to-face communication. According to the RSC approach, participants use the frequency of messages as one such cue (Baym, 1995, 1998). Researchers extending the RSC approach have suggested that timing of messages can serve as a proxy for a sense of social presence (Blanchard, 2004), as an indication of attentiveness (Walther & Bunz, 2005) or respect (Bargh & McKenna, 2004), and as a clue to the sociability of a community (Maloney-Krichmar & Preece, 2005). As such, the frequency of messages may serve as a signal for how engaged participants are with the community. We predict that frequent messages will indicate a community with active participants, which will make visitors more likely to participate:
H2: An online community with a fast response rate will elicit greater intent to participate than an online community with a slow response rate.
The concept of interactivity has been widely debated among communication scholars (McMillan, 2002). Efforts to explore the effects of interactivity vary widely in approach, with scholars defining interactivity variously as a structural element of the medium (Coyle & Thorson, 2001), as a perception variable in the mind of the user (McMillan, Hwang, & Lee, 2003), and as a multidimensional construct (Ha & James, 1998). The present study approaches interactivity from an interpersonal perspective, or what McMillan (2002) classifies as human-to-human interactivity.
Unlike the presence or absence of structural features that are “interactive,” interpersonal interactivity is evident (or absent) in the “pattern of interaction” demonstrated by members of an online community (Porter, 2004). Research in this tradition defines interactivity as message contingency, or the extent to which messages relate to one another (Rafaeli, 1988). Rafaeli and Sudweeks (1997) write that fully interactive communication requires that “later messages in any sequence take into account not just messages that preceded them, but also the manner in which previous messages were reactive” (n.p.).
Rafaeli and Sudweeks (1997) speculate that increased interactivity leads to engagement with the online group and that interactive messages are more involving, but experimental investigations of this type of interactivity among website users are rare. Sundar, Kalyanaraman, and Brown (2003) drew on Rafaeli’s (1988) concept of interactivity to investigate its effects on impression formation from a political candidate’s website. Although this study defined interactivity as message contingency, the construct was operationalized through structural features of the website: rather than examining the effects of mediated interpersonal interactions, Sundar, et al. (2003) looked at how pages were hyperlinked and how deep the website structure was. They found that increased interactivity on a political website led to more positive impressions of the candidate and higher levels of agreement with the candidate’s policy positions. Although the authors approached interactivity from a somewhat different perspective, their findings suggest that interactivity (as message contingency) does positively affect perceptions of a website. This leads to our next hypothesis:
H3: An online community with more interactive messages will elicit greater intent to participate than an online community with fewer interactive messages.
Finally, we are interested in how a structural feature of an online community (response rate) interacts with a content feature (message interactivity):
RQ1: How do response rate and message interactivity interact to affect a person’s intent to participate in an online community?
We tested these hypotheses in two different experiments. In the first experiment, we manipulated the presence of moderation. In the second experiment, we manipulated the time lag between the posting of new messages (response rate) as well as the interactivity of those messages. We looked at the effects of these manipulations on people’s reported intent to participate in the community.
We recruited 62 students from an introductory journalism class at a major Midwestern U.S. university. Participants received course credit for taking part in the experiment.
We manipulated moderation as part of a posttest-only experimental design. Moderation had two levels, moderated and unmoderated. In the moderated condition, the website contained several elements that suggested the presence of moderation. First, the website contained a sidebar instructing visitors that posts and comments would be reviewed by a team of editors and rated by other volunteers:
In order to keep this site a great place for political conversation, YouthPoliticsAmerica is administrated by a team of editors who periodically review posts and comments to weed out anything really inappropriate. We can’t catch everything, however. Please alert an editor if you think a post or comment should be removed from the site.
We also have a team of volunteers who rate each and every post and comment to make it easier for you to find the really good stuff.
Second, every contribution to the website was rated on a scale from one to five, and featured a button allowing visitors to the website to alert editors of inappropriate content. Third, one post on the page was labeled as the editor’s pick. Finally, any commentary posted by experimental participants appeared on the site with a notice saying it was “not yet rated” by the editors.
In the unmoderated condition, the website’s content was identical but contained none of the additional elements suggesting the presence of a moderator. Sample screenshots of our stimuli appear in Appendix B.
Participants viewed one page of discussion from a website that we created called “YouthPoliticsAmerica.” The website is described as “a place for young people across the country to share their views about politics and build political perspective.” The website contained posts and comments concerning such issues as education, healthcare, and general political news.
The experiment took place in a campus computer laboratory. Participants entered the laboratory, sat at a computer terminal, and provided informed consent to take part in the experiment. Participants were randomly given a numbered questionnaire packet and told that they would be looking at a website and answering some questions regarding how they felt about it. The number on each packet determined which level of moderation would be seen. When participants indicated that they understood the procedure, they entered this number into the computer and began the experiment.
Participants viewed an online discussion containing a text box in which they could post their own comments if they wished. If a participant posted a comment, the computer refreshed the discussion board to reflect the additional comment. Once each participant had finished reading the online discussion, he or she filled out the corresponding questionnaire. Upon completing the questionnaire, each subject was thanked, debriefed, and dismissed. The experiment lasted approximately 40 minutes.
Dependent measure: intent to participate
We measured intent to participate using a 10-item instrument adapted from Ng and Detenber (2005). Each item presented a statement to which subjects indicated their agreement using a seven-point Likert scale anchored by the responses “strongly disagree” (1) and “strongly agree” (7). We reduced the responses from this questionnaire with principal components analysis, using .60 as a cut-off point for factor loadings. This procedure yielded one factor, which contained eight items and explained 63% of the variance in the entire instrument. Cronbach’s alpha for this eight-item factor was .94. We created an index for intent to participate by summing each participant’s responses across these eight items. The instrument we used is provided in Appendix A, with the two eliminated items noted by asterisks.
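For readers unfamiliar with this data-reduction procedure, the following sketch illustrates the general approach in Python with NumPy. The data and helper names are hypothetical, not the authors’ code; only the .60 loading cut-off and the summed-index construction mirror the description above.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def first_component_loadings(items):
    """Absolute loadings of each item on the first principal
    component of the item correlation matrix."""
    corr = np.corrcoef(np.asarray(items, dtype=float), rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)  # eigenvalues in ascending order
    return np.abs(eigvecs[:, -1] * np.sqrt(eigvals[-1]))

# Hypothetical 7-point Likert responses: 8 respondents x 4 items.
# Items 1-3 hang together; item 4 is mostly noise.
scores = np.array([
    [6, 5, 6, 4],
    [7, 6, 7, 5],
    [3, 2, 3, 3],
    [5, 5, 4, 4],
    [2, 1, 2, 5],
    [6, 6, 5, 3],
    [4, 3, 4, 5],
    [7, 7, 6, 4],
])

loadings = first_component_loadings(scores)
kept = loadings >= .60                 # the .60 cut-off used in the article
index = scores[:, kept].sum(axis=1)    # summed intent-to-participate index
alpha = cronbach_alpha(scores[:, kept])
```

With these toy data, the noisy fourth item loads below .60 and is dropped, and the summed index is built from the three retained items.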
To test the effects of moderation, we ran a one-way ANOVA on each of our dependent variables, using p = .05 as a cut-off point for determining statistical significance. Data from six participants were discarded from analyses due to technical difficulties or excessive missing responses, leaving 56 cases for analysis.
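As an illustration of the test involved (not the authors’ analysis script), the one-way ANOVA F ratio can be computed directly from group scores; the two groups below are hypothetical.

```python
import numpy as np

def one_way_anova(*groups):
    """One-way ANOVA: returns (F, df_between, df_within)."""
    groups = [np.asarray(g, dtype=float) for g in groups]
    all_scores = np.concatenate(groups)
    grand_mean = all_scores.mean()
    # Between-groups sum of squares: group means around the grand mean
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    # Within-groups sum of squares: scores around their own group mean
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_between = len(groups) - 1
    df_within = len(all_scores) - len(groups)
    f = (ss_between / df_between) / (ss_within / df_within)
    return f, df_between, df_within

# Hypothetical intent-to-participate index scores for two small groups
moderated = [38, 41, 35, 44, 30, 39]
unmoderated = [29, 33, 25, 36, 22, 31]
f, df1, df2 = one_way_anova(moderated, unmoderated)
```

The resulting F is compared against the critical value for (df_between, df_within) at the chosen alpha level (here, .05).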
To demonstrate that participants had detected our experimental manipulation, we had them express their agreement with two statements using a five-point Likert scale. The first statement read, “Website editors review posts and comments on this website.” Participants who viewed the moderated community (M = 4.25, SD = .97) expressed significantly greater agreement with this statement than participants who viewed the unmoderated community (M = 3.07, SD = 1.02). The second statement read, “Posts and comments on this website are rated.” Again, participants who viewed the moderated community (M = 4.07, SD = 1.27) expressed significantly greater agreement with this statement than participants who viewed the unmoderated community (M = 2.11, SD = .88). These results indicate that participants detected the presence or absence of moderation.
Results and discussion
Hypothesis 1 predicted that a moderated online community would elicit greater intent to participate than an unmoderated online community. The main effect of moderation on intent to participate was significant [F(1, 55) = 4.47, p < .04, η2 = .13]. Participants who viewed the moderated community reported greater intent to participate (M = 35.43, SD = 10.89) than participants who viewed the unmoderated community (M = 28.57, SD = 13.27). These data support Hypothesis 1 and demonstrate that the presence of moderation has a positive effect on people’s intent to participate in an online community.
While moderation can be costly, these results demonstrate that people take notice and do report greater intent to participate in an online community that shows evidence of moderation. This finding makes sense in light of many examples of well-liked online communities that went “feral” due to a lack of moderation. For example, shortly after launching a feature allowing readers to collaboratively edit an online editorial, The Los Angeles Times (2005) had to remove the content because it was overwhelmed with inappropriate contributions. Short of complete collapse, members of an unmoderated community could find themselves spending much of their time simply regulating irrelevant or purposefully counter-productive comments posted by others (McLaughlin, et al., 1995). Some indication that either professionals or peers are dedicated to moderating the messages on a community might encourage participation by suggesting that the community will not fall apart.
In our second experiment, we manipulated a structural feature, response rate of posts, along with a content feature, message interactivity.
We recruited 59 undergraduates from a political science class at a major Midwestern U.S. university. Participants earned extra credit for taking part in the experiment.
We used a 2 (message interactivity) × 2 (response rate) within-subjects factorial design. Message interactivity referred to whether the individual messages within a particular page fulfilled Rafaeli’s (1988) criteria of interactivity. This variable had two levels, interactive and non-interactive. In the interactive condition, later messages related to previous messages as well as the way that previous messages related to those preceding them. Messages in the interactive condition also contained first-person pronouns and direct address of posters by name, which we included as additional cues of the relationship to previous messages. Messages in the non-interactive condition were similar in content, but neither referred to previous messages nor contained first-person pronouns or direct address.
Response rate referred to the length of time between subsequent messages and had two levels, fast and slow. We manipulated response rate by altering the time stamp that accompanies comments in many online communities, identifying when a particular comment was posted to the discussion. In the fast condition, the most recent message contained a time stamp indicating that it was no more than four hours old. The message preceding it contained a time stamp indicating that it had been posted no sooner than five hours before the most recent message. The time interval in the slow condition was two days. In other words, there appeared to be a two-day lag between messages in the slow response rate condition, compared to the five-hour lag between messages in the fast response rate condition. The time stamp automatically updated each time a new participant began, so the relative time lags were consistent across all participants.
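The article does not give implementation details for the time stamps. One plausible sketch, assuming stamps are computed backward from the moment a participant’s session starts (the four-hour newest age, five-hour fast lag, and two-day slow lag follow the description above; the display format is an assumption):

```python
from datetime import datetime, timedelta

def stamp_messages(now, lag, n_messages=3, newest_age=timedelta(hours=4)):
    """Generate time stamps for a thread so that the newest message is
    `newest_age` old and successive messages are `lag` apart, relative
    to the moment the participant begins (`now`)."""
    stamps = [now - newest_age - i * lag for i in range(n_messages)]
    # Oldest message first, as displayed in a discussion thread
    return [s.strftime("%b %d, %I:%M %p") for s in reversed(stamps)]

now = datetime(2006, 3, 15, 14, 0)            # hypothetical session start
fast = stamp_messages(now, lag=timedelta(hours=5))
slow = stamp_messages(now, lag=timedelta(days=2))
```

Because every stamp is derived from `now`, the relative lags stay constant no matter when a participant begins, as the procedure requires.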
We created our own discussion board where students could post comments about their professors or classes. We borrowed the concept of this board from an existing website called RateMyProfessors.com, which is popular among college students. We created our own content to reflect our interactivity manipulation. Appendix C contains screenshots of our experimental stimuli.
Measure: intent to participate
We used the same dependent measure, intent to participate, and applied the same data reduction and scaling techniques as in the first experiment. Principal components analysis with Varimax rotation yielded one factor, which contained eight of the 10 items and explained 63% of the variance in the entire instrument. We summed each participant’s scores across these eight items and eliminated the remaining two items from further analysis. Cronbach’s alpha for this factor was .93.
We ran a 2 (message interactivity) × 2 (response rate) repeated-measures ANOVA on intent to participate.
Hypothesis 2 predicted that an online community with more recently added messages would elicit greater intent to participate than an online community with less recently added messages. The main effect of response rate on intent to participate was not significant [F(1, 58) = .06, n.s.]. Hypothesis 3 predicted that an online community with interactive messages would elicit greater intent to participate than an online community with non-interactive messages. The main effect of message interactivity on intent to participate was not significant [F(1, 58) = 2.50, n.s.]. Hypotheses 2 and 3 were thus not supported. Taken individually, neither response rate nor message interactivity affected people’s intent to participate in our online community.
We proposed a research question to explore how response rate and message interactivity combined to affect people’s intent to participate in an online community. The interaction of response rate and message interactivity on intent to participate was significant [F(1, 58) = 8.29, p < .01, partial η2 = .13] and is shown in Figure 1. Post-hoc Bonferroni comparisons showed that, in the slow response rate condition, participants reported greater intent to participate when messages were interactive (M = 35.37, SD = 1.26) than when messages were not interactive (M = 31.46, SD = 1.47). No other comparisons resulted in significant differences.
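For a 2 × 2 fully within-subjects design like this one, the interaction F is equivalent to a squared one-sample t-test on each participant’s difference of differences. A minimal sketch with hypothetical scores (not the study’s data):

```python
import numpy as np

def interaction_f(a1b1, a1b2, a2b1, a2b2):
    """Interaction F for a 2x2 fully within-subjects design.
    Each argument holds one score per participant, in the same order.
    The interaction is a one-sample t-test on each participant's
    difference of differences, and F(1, n - 1) = t**2."""
    d = ((np.asarray(a1b1, dtype=float) - np.asarray(a1b2, dtype=float))
         - (np.asarray(a2b1, dtype=float) - np.asarray(a2b2, dtype=float)))
    n = len(d)
    t = d.mean() / (d.std(ddof=1) / np.sqrt(n))
    return t ** 2, 1, n - 1

# Hypothetical intent-to-participate scores for six participants
slow_interactive = [36, 34, 38, 33, 37, 35]
slow_noninteractive = [31, 30, 33, 29, 32, 31]
fast_interactive = [33, 32, 34, 31, 33, 33]
fast_noninteractive = [33, 31, 35, 30, 34, 32]

f, df1, df2 = interaction_f(slow_interactive, slow_noninteractive,
                            fast_interactive, fast_noninteractive)
```

A large F here reflects the crossover pattern: interactivity raises scores when the response rate is slow but not when it is fast.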
We tested three different features of online communities to see how they affected people’s intent to participate in the community. Our first experiment showed that a moderated online community elicited greater intent to participate than an unmoderated community. Since online communities generally form around a particular topic, it makes sense that people would favor a mechanism for flagging or eliminating behavior that distracts from the topic. The relative anonymity afforded by online communities can encourage people to behave differently in computer-mediated communication than they would in face-to-face communication. Such behavior can include flaming (repeatedly insulting other individuals) and trolling (entering a discussion with the sole intent of antagonizing the entire community). Many online communities that have succumbed to these behaviors might have survived had there been a moderating force to keep matters from getting out of hand and potentially alienating participants in the community. These results indicate that people look favorably upon moderation when thinking about participating in an online community.
Our manipulation included different kinds of moderation commonly used in existing online communities. Moderation can take many forms: a single person, a professional team of editors, a volunteer base, or the community members themselves, either informally or through a formal rating structure. We intentionally created a strong manipulation that encompassed most known types of moderation, but this study did not address the relative effectiveness of specific systems. Sundar’s (2001) finding that peer rating systems led to the most favorable evaluations of a news story suggests that peer reviews or some other collective form of moderation might serve online communities best. The comparison of collective and individual moderation, as well as various manifestations of each, is a compelling topic for future research. Collective moderation has a certain democratic appeal but requires more elaborate programming to implement. Individual moderation is more authoritarian but may also be more efficient. Either way, the results of our first experiment suggest that online community administrators should consider implementing some sort of moderation.
In the second experiment, a community featuring interactive messages that relate back to earlier messages in a discussion also elicited greater intent to participate than a community featuring non-interactive messages, but only when more time elapsed between the posting of each message. This finding is especially interesting given that neither message interactivity nor response rate alone significantly affected intent to participate. One possible explanation is that when messages follow one another in rapid succession, people feel pressed to get through the discussion quickly so that they can post a comment before it loses relevance, or before someone else posts a similar comment first.
Drawing from dual processing theories such as the Elaboration Likelihood Model (Petty & Cacioppo, 1996), we could view the detection of response rate as peripheral processing and the detection of message interactivity as central processing. We know that people who have limited motivation to process content are more likely to base evaluations on peripheral cues. The fast response rate, as a prominent peripheral cue, may be drawing more attention than the content of the messages. This would explain why there was no difference between interactive and non-interactive posts in the fast response rate condition. As a starting point for testing some of the features that have come up in previous scholarship, we were only interested in their effects on intent to participate and did not address any cognitive outcomes such as attention or memory. Research on the cognitive processing of media (e.g., Lang, 2000) has demonstrated that attention is elicited by signal stimuli. Because the time stamps in the fast response rate condition were more recent (or closer to the moment that a participant was reading the post), one could speculate that these would be stronger signal stimuli than the time stamps in the slow response rate condition. Future research could address whether response rate affects cognitive processing, as suggested by this explanation.
The interaction of message interactivity and response rate also suggests future research into the uses and gratifications of online communities. It is possible that the interpretation of cues in an online community depends on an individual’s motives for visiting the site. The literature has shown that reasons for participation in an online community relate to both individual needs and the type of community under consideration. For someone interested in searching for information about a particular professor, for example, a community with high levels of interpersonal interaction may not be as useful as a community with less frequent but more informative messages. Indeed, “the delay between receiving a message via asynchronous textual conferencing and sending a reply can provide valuable time for reflection” (Preece, 2000, p. 160).
These findings inform our scholarly understanding of online communities in several ways. First, they provide some empirical support to Rafaeli’s (1988) assertion that message interactivity must be considered when studying both new technologies and the interactions they permit. While Rafaeli’s definition of interactivity is prominent in the existing literature, few have derived operational definitions of interactivity based on his work. The results of our second experiment indicate that interactivity as message contingency is an important attribute that does affect people’s intent to participate in an online community, even if the pattern of those effects in the context of other features is still not entirely clear.
Second, most scholarship concerning online communities has explored community from the inside by using content analyses or surveys of participants to improve understanding of why and how people interact online. The studies described here provide new insight into how visitors—those on the outside looking in—evaluate the quality of online communities. Cues that have been shown to have positive effects on those engaged in computer-mediated communication may be “read” differently by those observing a discussion. For those involved in a discussion, the speed of response provides important information about the attentiveness and interest of the discussion partners. This effect may be nearly opposite when considered from the vantage point of an individual lurking outside the conversation. Messages that appear to be posted less frequently may provide lurkers with a sense of opportunity to add something to a discussion. That opportunity might not present itself in a context of speedily exchanged, highly personal messages among a group of people who seem to have well-established relationships. While the table full of people laughing and talking may seem the better group to join, it can often be the harder one to approach. It is important to note that participants in our studies were only exposed to the discussion for a short period of time. The appeal of a community to a long-time lurker may be very different as he or she comes to “know” the group.
While this research makes both theoretical and practical contributions to existing scholarship on online communities, there are some limitations worth considering. First, our dependent variable in both studies was attitudinal rather than behavioral. The stimuli used in both experiments did offer participants the opportunity to post their own responses, but we implemented this feature primarily to make the stimuli look more authentic. Because our participants had no previous experience with the online communities they saw in these experiments, too few posted comments to make any systematic analysis worthwhile. Participants also might have avoided commenting because of the experimental setting or a desire to complete the experiment quickly. Future research should augment the present findings with behavioral measures of participation taken from members of an existing online community. In our initial study of participation, we chose the more tightly controlled laboratory setting, with an attitudinal measure that has been used in previous research (Ng & Detenber, 2005). We acknowledge the difficulty in producing significant behavioral effects with the types of manipulations used in these studies, but caution against overreliance on attitudinal measures when the conceptual interest rests on behavioral, cognitive, or social measures.
Because we used a convenience sample of undergraduate student participants, our findings may be limited in their generalizability. However, this sample, composed of the demographic currently most active in online communities, provides a valuable place to begin to understand how sociability influences intentions to participate. Furthermore, the online community used in our first experiment dealt with politics, a topic about which many college students are apathetic. The results of this experiment may actually underestimate the effect of moderation because of this topic. Topic is one of several variables that could be manipulated and explored in future research. Individual variables such as gender, Internet usage, and political affiliation could also moderate the effect of the features we explored here. We purposely used a rather homogenous (and politically apathetic) sample to avoid complicating the basic relationships of interest.
This research provides experimental evidence that both structural and content features of online communities affect individuals’ intent to participate in the community. In so doing, it also highlights the importance of message-based interactivity, as described by Rafaeli (1988). We stated at the outset that the online community has become a popular venue in which people communicate with a larger group. This sort of communication typically involves reading statements that others have posted and posting statements for others to read. Because online communities have been cited as having both prosocial and antisocial effects on society, it is important that we understand the features that may elicit participation. Such an understanding may help those who administer and maintain online communities to maximize the good and minimize the bad elements of participation.
Appendix A. Questionnaire instrument
Appendix B. Unmoderated (top) and moderated (bottom) online discussion
Appendix C. Non-interactive (top) and interactive (bottom) online discussion
About the Authors
Kevin Wise is an assistant professor of Strategic Communication and Co-Director of the PRIME Lab at the Missouri School of Journalism, University of Missouri-Columbia. His research interests include the cognitive and emotional processing of interactive media.
Address: Missouri School of Journalism, 176B Gannett Hall, Columbia, MO 65211-1200 USA
Brian Hamman is the Online Managing Editor for the Columbia Missourian and an Instructor at the Missouri School of Journalism.
Address: Missouri School of Journalism, 221 South Eighth Street, P.O. Box 917, Columbia, MO 65205 USA
Kjerstin Thorson is a doctoral student in the School of Journalism & Mass Communication at the University of Wisconsin-Madison. Her research explores the impact of new media on political evaluations and participation in politics.
Address: 5050 Vilas Hall, 821 University Avenue, Madison, Wisconsin 53706 USA