The email communication system is threatened by unsolicited commercial email, commonly known as spam. Spam filters reduce the amount of spam users have to cope with, but users often have difficulty understanding how these filters actually assess messages as spam or ham. We conducted a number of user experiments using a simulated email interface that provides detailed information about spam assessments. Feedback suggests that the additional information provided by the email interface was welcome.
Unsolicited commercial email and unsolicited bulk email (UCE and UBE, respectively) are threatening email as a way of communicating (e.g., Whitworth and Whitworth 2004). Technical solutions to the problem typically analyze messages and filter out those considered spam (see, for example, the CEAS proceedings).
Spam filtering remains a challenge, though. Advanced filtering approaches can be considered fairly reliable, but there is also anecdotal evidence that overly ambitious spam filters cause problems for genuine email (Lueg et al. 2007). These problems appear to be related to the lack of objective criteria that spam filters could employ to determine the “(un)solicitedness” of emails. Neither “unsolicited” nor “unwanted” are objective, measurable properties of emails (Lueg 2005).
Email users are concerned about what is happening to their email. Based on interviews conducted in the U.S., Fallows (2003) reported that 30% of email users surveyed were concerned their email filters might filter genuine incoming email, and 23% were concerned email they send to others may be filtered.
In previous research we examined web-based spam filter interfaces, including Yahoo's and Hotmail's, and found that most provide very little information about the message assessment process they implement. Arguably, disclosing too many assessment details would undermine the effectiveness of filters, as spammers might use the information to improve their own spam strategies. On the other hand, the widely used SpamAssassin filtering software does provide detailed information and is still considered one of the best spam filtering systems available.
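As an illustration of the kind of detail SpamAssassin exposes, the filter adds an `X-Spam-Status` header to each message, listing its verdict, the overall score, and the names of the rules ("tests") that fired. The sketch below parses such a header; the header text itself is an invented example, not real SpamAssassin output.

```python
import re

# Illustrative (invented) example of a SpamAssassin-style header.
HEADER = ("X-Spam-Status: Yes, score=7.2 required=5.0 "
          "tests=HTML_MESSAGE,MISSING_SUBJECT,URI_HEX")

def parse_spam_status(header: str):
    """Extract the verdict, overall score, and triggered test names."""
    verdict = re.search(r"X-Spam-Status:\s*(Yes|No)", header).group(1)
    score = float(re.search(r"score=([-\d.]+)", header).group(1))
    tests = re.search(r"tests=([\w,]+)", header).group(1).split(",")
    return verdict, score, tests

verdict, score, tests = parse_spam_status(HEADER)
```

An interface built on such a filter can thus tell the user not only *that* a message was flagged, but *which* rules contributed to the verdict.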
We conducted a number of user experiments with a simulated email interface providing richer spam assessment information, in order to find out whether users would appreciate more detailed information and whether this information would help them better understand the assessment process.
The richer email interface (nicknamed ‘CabbageMail’) presented users with a number of realistic email scenarios built from a mix of genuine spam messages and custom-made emails serving specific purposes. These included an “introductory” scenario featuring correctly classified genuine messages as well as a number of genuine spam messages. Other scenarios included emails incorrectly classified as spam or as ham, i.e., false positives and false negatives, respectively.
A more advanced scenario explored the problem of understanding why certain incoming messages were incorrectly classified as spam. Another advanced scenario asked users to explore why the interface warned them about emails they were about to send as these were likely to be classified as spam by the recipient's spam filter.
Key spam-related aspects of the interface included:
A spam icon indicating the dominant reason for classifying the email as spam;
A spam bar reflecting the degree of spamminess, thus providing more detailed information about the spam status;
Upon request, the specific spam score the email attracted;
Subtle highlighting of emails assessed to be spam.
When a specific message was opened, the interface would provide more detailed Spam Information (see pictures) covering the various aspects of the message seen as contributing to the message's “spamminess” factor.
Subjects for evaluating the ‘CabbageMail’ interface were recruited among Computing students via email. A total of 15 participants evaluated the interface for about 20 minutes each (14 male, 1 female; age range 20-47, most in their early-to-mid 20s; most were core Computing students, but some were doing combined degrees including Information Systems).
The evaluation included 35 multiple-choice questions and 2 open feedback questions, some to be answered prior to the experiment and others right after it. Of these, 19 questions related to the user experience and 16 addressed program usage issues.
Most subjects rated their email experience (1 = not at all; 5 = very) as high (Mean 4.40; Std Dev 0.83); spam experience was also rated as high (Mean 4.27; Std Dev 1.03). However, experience with spam filtering and knowledge of the spam filtering process (1 = none; 5 = a lot) were rated at a mere 2.80 (Std Dev 1.08) and 2.79 (Std Dev 1.05), respectively.
Regarding filter control and awareness, it is interesting to note that 7 subjects (46.7%) said they were aware of the availability of filter controls they could use; 6 subjects (40%) were somewhat aware and 2 (13.3%) were unaware.
This level of knowledge shifted considerably regarding awareness of the criteria used to filter messages: only 4 subjects (26.7%) said they were aware of the filter criteria (reliability not tested); 8 subjects (53.3%) were somewhat aware and 3 (20%) were unaware.
The usefulness of the 20-minute session with regard to gaining a deeper understanding of spam filtering issues (1 = not at all; 5 = very) was rated slightly better than average (Mean 3.27; Std Dev 1.10), which is not surprising considering the complexity of the topic and the brief period of time.
The improvement in understanding of what causes false positives and false negatives was moderate (Mean 3.33; Std Dev 1.05 and Mean 3.13; Std Dev 1.19, respectively) and leaves room for improvement.
However, the usefulness of the spam explanation and the advanced spam explanation were both rated highly (Mean 4.47; Std Dev 0.64 and Mean 4.33; Std Dev 0.98, respectively). The usefulness of the outgoing mail scanner, which warns users that outgoing emails might be classified as spam by the recipient's spam filter, was also rated positively (Mean 3.93; Std Dev 1.07).
Conclusions and Future Work
Typically, the interfaces of email clients provide little information about the spam assessment process they implement. Taking user-centered design seriously, however, suggests that IT should be used to empower users, in the sense that users should at least have the option of exploring how their emails are assessed and what happens to them if they are considered spam. The feedback we gained from the experiments described in this paper suggests that making this information available would help users better understand the assessment process. Recruiting subjects among Computing students introduced a certain bias, but on the other hand, few of the students actually considered themselves ‘spam experts’.
It is an open question how users could be educated about the assessment process without requiring them to possess sound technical knowledge. We suspect the concepts underlying dynamic queries could help explain spam assessment processes. Dynamic queries are animated, user-controlled displays that show information in response to movements of sliders, buttons, maps, or other widgets (e.g., Ahlberg et al. 1992). We have already started to investigate these concepts in the spam assessment domain.
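To make the dynamic-query idea concrete in the spam domain, one could let the user drag a threshold "slider" and instantly see which messages would be classified as spam at that setting. The sketch below is a minimal, hypothetical model of that interaction; the message names and scores are invented examples, not data from our experiments.

```python
# Hypothetical per-message spam scores (invented for illustration).
MESSAGES = {"invoice.eml": 1.2, "win-big.eml": 8.7, "newsletter.eml": 4.1}

def on_slider_move(threshold: float) -> list[str]:
    """Return the messages classified as spam at the current slider value."""
    return sorted(m for m, score in MESSAGES.items() if score >= threshold)

# As the user drags the slider, the displayed spam folder updates
# continuously, letting them explore how the threshold affects the verdicts.
strict = on_slider_move(5.0)   # only high-scoring messages flagged
lenient = on_slider_move(3.0)  # borderline messages flagged as well
```

The immediate, reversible feedback is the point: the user learns the filter's behavior by manipulating it, rather than by reading technical documentation.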