Review communities are important resources for sharing information among consumers, who write about their experiences with an increasingly large variety of products, services, and activities. Reviews enable consumers to learn about item attributes that would otherwise be difficult to ascertain before consumption (Schindler & Bickart, 2005). In addition, consumers often view communities as relatively unbiased sources of information, as compared to an item's manufacturer or a service provider (Dellarocas, 2003). The importance of such sites to businesses is also well documented, as this form of word-of-mouth communication has been shown to influence sales (Chevalier & Mayzlin, 2006; Senecal & Nantel, 2004) and to create trust in a product or seller (Ba & Pavlou, 2002; McWilliams, 2000).
One need only briefly examine popular sites to observe the volume of reviews contributed and the challenges of organizing them effectively. Online spaces in which many users interact are difficult to sustain, as users become overwhelmed by the information available (Butler, 2001). Review communities typically display contributions in a list spanning multiple pages, through which users must scroll. Since tasks involving the interpretation of unstructured text are particularly susceptible to information overload (Hiltz & Turoff, 1985), online communities must provide a means for users to filter information. Therefore, many sites employ a simple form of social navigation, in which participants are asked to provide feedback on others' contributions (Goldberg et al., 1992). This feedback is then used to determine the display order and thus how prominent a review will be.
In a large set of reviews, which ones are likely to be read? Although participants develop strategies for mitigating information overload, including techniques for filtering out messages not likely to be of interest (Jones et al., 2004), little is known about how they select reviews to read. For example, is it possible that review forums are “echo chambers” where participants seek out contributions that express views similar to their own (Garrett, 2009)? Do they study a reviewer's profile in an attempt to learn who she is, judging her credibility or her similarity in product taste? Or do they consider the “wisdom of the crowds” offered via social navigation a reliable means of guiding them to the most informative reviews (Kostakos, 2009)?
Despite the many unknowns surrounding the information-seeking behavior of participants, it is reasonable to assume that the most prominently displayed reviews are the ones most often read. In a community that displays reviews as a list of ranked items, reviews appearing close to the top of the list (i.e., on the first page) are easily seen, while those that are further from the top (i.e., on later pages) are much less likely to receive attention. In fact, when presented with a ranked list of documents (e.g., in response to a query submitted to a search engine), users seldom look beyond the first page of results (Spink et al., 2006; Joachims et al., 2007; Pan et al., 2007).
Researchers have considered what makes a good review, in terms of providing information viewed as helpful (e.g., “utility prediction” (Zhang & Varadarajan, 2006) and “evaluating helpfulness” (Kim et al., 2006)) or in predicting sales (e.g., Ghose & Ipeirotis, 2010). Most work has considered how aspects of review content (e.g., the valence of the reviewer's opinion, the amount of information expressed) correlate with feedback scores. In contrast, we study how communication tactics used in reviews correlate with their prominence in the community. Another important deviation from previous work, which has most often focused on Amazon.com, is that we study three communities: Yelp, Amazon, and the Internet Movie Database (IMDb). These communities share several features in common: they allow participants to post textual reviews, employ the list display format, and use peer feedback to determine display order. However, different commodities are reviewed at these sites, and they have various structural features that participants might use to distinguish themselves.
Review Organization and Structural Features at Amazon, Yelp, and IMDb
Given that information seekers rarely look beyond the first page of a list of items, how contributions are organized in a review community ultimately determines what is seen. Here, we provide an overview of how reviews are organized at the three communities studied. Figure 1 shows a review of a camera at Amazon.com. As shown, users are asked to help others by indicating whether or not a review is helpful. Above the review's title, its rating is expressed in the form “X of Y people found it helpful.” Users may sort reviews by helpfulness or by date, with helpfulness being the default. From the reviewer's profile, one gleans information about her level of activity and how useful her contributions are. Participants also use profiles to share information such as a self-description, interests, and photos.
For this review, we can note that 95% (i.e., 52 of 55) of the people who voted found it helpful. Its sentiment toward the product is positive, with the reviewer assigning a perfect rating of five out of five stars. In terms of structural properties, we observe that the reviewer, R. Overall, uses his real name and provides a self-description. In addition, his helpfulness rating across all of his contributions is 91%. Also noteworthy is the manner in which the reviewer tries to convince others that the Lumix is a good camera. R. Overall provides his credentials as a camera reviewer up front; he is a “former pro photog.” Then, he reports several positive aspects of the camera (e.g., it is “lightweight” and has a “commonly available battery”).
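Amazon does not disclose its full ranking method, but the visible mechanism, ordering reviews by the displayed “X of Y” helpfulness ratio, can be made concrete with a minimal sketch. In the Python below, the Review class, the helpfulness function, and the second review's vote counts are our own illustrative assumptions; only the 52-of-55 figures come from the example above.

```python
from dataclasses import dataclass

@dataclass
class Review:
    title: str
    helpful_votes: int  # the "X" in "X of Y people found it helpful"
    total_votes: int    # the "Y"

def helpfulness(review: Review) -> float:
    """Fraction of voters who found the review helpful (0.0 if unvoted)."""
    if review.total_votes == 0:
        return 0.0
    return review.helpful_votes / review.total_votes

# Most-helpful-first ordering, mimicking Amazon's default display.
reviews = [
    Review("Former pro photog reviews the Lumix", 52, 55),   # from Figure 1
    Review("Hypothetical second review", 3, 10),              # assumed data
]
reviews.sort(key=helpfulness, reverse=True)
print(f"{helpfulness(reviews[0]):.0%} found the top review helpful")  # 95%
```

A raw ratio like this over-rewards reviews with very few votes; deployed systems typically adjust for vote volume, as illustrated in the IMDb sketch later in this section.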
Yelp's review organization also involves user feedback. Figure 2 shows a restaurant review and its reviewer's profile. Members can indicate that a review is useful, funny, or cool. The default “Yelp sort” considers the number of feedback votes as well as review recency. In addition, users may sort by date, item rating, or “elite” status of reviewers. From a reviewer's profile, one finds information about her level of participation and recognition from others, as well as information concerning who she is. For example, reviewers may post photos and self-descriptions. Links to their friends' pages are shown, as well as the total number of friends.
In Figure 2, we can see from her profile that Melissa M. is an elite reviewer with 191 friends. She is also a prolific contributor: over her 92 reviews, she has received 307 useful votes, 242 funny votes, and 277 cool votes, for a total of 826 votes. Note that because Yelp users can express only positive votes, not negative ones, we cannot compute Melissa M.'s positive feedback as a percentage, as we could with Amazon's helpfulness metric. Melissa M.'s review also exhibits some interesting textual properties. At one point, she uses all capital letters (“who I'm SURE hated me”), as well as an emoticon representing a smiley face. Finally, in contrast to R. Overall's camera review, Melissa M.'s review is formatted with paragraph breaks.
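Yelp likewise does not publish how “Yelp sort” weighs feedback votes against recency. The sketch below gives one hypothetical reading: a review's total positive votes discounted by an exponential age decay. The function yelp_sort_key, the 180-day half-life, and the per-review vote counts in the usage example are all assumptions for illustration, not Yelp's actual formula.

```python
from datetime import datetime, timezone

def yelp_sort_key(useful: int, funny: int, cool: int,
                  posted: datetime, half_life_days: float = 180.0) -> float:
    """Hypothetical 'Yelp sort' key: total positive votes, discounted by age.

    Yelp's real formula is undisclosed; the exponential half-life decay is
    only one plausible way to fold recency into a vote count.
    """
    age_days = (datetime.now(timezone.utc) - posted).days
    total_votes = useful + funny + cool
    return total_votes * 0.5 ** (age_days / half_life_days)

# Melissa M.'s 826 votes are a lifetime total; a single review's votes
# (assumed counts here) might be scored like this:
score = yelp_sort_key(useful=12, funny=5, cool=8,
                      posted=datetime(2010, 6, 1, tzinfo=timezone.utc))
```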
Finally, Figure 3 shows a review of the movie The Shawshank Redemption from IMDb, as well as its reviewer's profile. IMDb participants are asked whether or not reviews are “useful.” Several filters are available, including “best reviews” (the default), chronological order, and most prolific authors. The default uses community feedback along with other undisclosed factors. As can be seen, IMDb profiles are basic compared to those at Amazon or Yelp. While a user can determine how prolific a reviewer is by viewing all of her reviews, no summary statistics are provided as to how useful they are. Finally, participants cannot post photos, but they can share a biography and contact information and can exchange messages on the IMDb message boards.
The example in Figure 3 is a moderately useful review; 76% of the people who voted found it useful. The reviewer, Si Cole, is positive about the movie, giving it 8 out of 10 stars. In contrast to R. Overall, Si Cole does not offer any information about his credentials or himself; rather, he provides a brief summary of the film and explains why he liked it. However, from his profile, we learn that Si Cole has an educational background in film and art history, and that readers can learn more from his web site or by contacting him via e-mail.
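IMDb states only that “best reviews” combines community feedback with undisclosed factors. One standard technique for ranking by an “X of Y found this useful” signal while discounting thinly voted reviews is the lower bound of the Wilson score interval, sketched below. We offer it as a plausible ingredient of such a scheme, not as IMDb's documented method, and the vote counts in the usage lines are hypothetical.

```python
import math

def wilson_lower_bound(pos: int, n: int, z: float = 1.96) -> float:
    """Lower bound of the Wilson score interval for a proportion.

    A common way to rank items by "pos of n voters found this useful"
    while penalizing small vote counts (z = 1.96 gives a 95% interval).
    This is an assumed technique, not IMDb's disclosed algorithm.
    """
    if n == 0:
        return 0.0
    phat = pos / n
    denom = 1 + z * z / n
    center = phat + z * z / (2 * n)
    margin = z * math.sqrt(phat * (1 - phat) / n + z * z / (4 * n * n))
    return (center - margin) / denom

# A 76%-useful review with many votes outranks a perfect score on 2 votes:
print(wilson_lower_bound(76, 100))  # ~0.67
print(wilson_lower_bound(2, 2))     # ~0.34
```

Under such a scheme, a review like Si Cole's, rated 76% useful over many votes, would rank above a review with a perfect ratio built on only a handful of votes, which a raw percentage would not achieve.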