Consumer Financial Literacy and the Impact of Online Banking on the Financial Behavior of Lower-Income Bank Customers



This article analyzes a demonstration program mounted by a major bank to understand whether access to information and communications technologies, combined with financial literacy training and training on how to use the Internet, can help low- and moderate-income individuals in inner-city neighborhoods become more effective financial actors. While quantitative analysis turns up few significant program effects, qualitative work suggests that implementation issues likely compromised the effectiveness of the program. There was evidence of a potential link between information and communications technologies and financial literacy. Overall, urban low- and moderate-income individuals are interested in becoming technologically and financially literate, and an intensive intervention may help them achieve these goals.

Electronic banking technologies have proliferated in recent years, and the availability of a wide range of products has led to increasing adoption among consumers. These technologies include direct deposit, computer banking, stored value cards, and debit cards. Banks and other financial institutions have worked hard to develop and deploy these technologies because of their potential to increase efficiency, cut costs, and attract new customers. Consumers are attracted to these technologies because of convenience, increasing ease of use, and, in some instances, cost savings (Anguelov et al. 2004). Electronic banking, in particular, has grown at impressive rates. Between 1995 and 2003, e-banking increased eightfold (Hogarth and Anguelov 2004). Between late 2002 and early 2005, use of online banking increased 47 percent. There is some evidence that computer banking is associated with better household financial management (Hogarth and Anguelov 2004). However, financial literacy, the digital divide, and other issues that separate disadvantaged groups from the financial mainstream make it difficult for low- and moderate-income (LMI) individuals to reap the potential benefits associated with computer banking.

This article analyzes a demonstration program (the Program) mounted by a major bank (the Bank) to understand whether access to information and communications technologies (ICT), combined with financial literacy training and training on how to use the Internet, can help LMI individuals in inner-city neighborhoods be more effective financial actors. When we began the study, our primary research question was whether technological literacy could serve as a gateway to financial literacy. Secondarily, we wanted to investigate whether the Program could serve as a model to incentivize banks to engage more fully in the provision of financial literacy training. During the research, we also realized that this case study could provide key lessons for how to look at literacy from beginning to end when planning an intervention to create improvements for a critical population. We hypothesized that coupling a comprehensive intervention addressing all components of the digital divide with a financial literacy component targeted at LMI individuals would increase financial literacy and move participants across the digital divide.

Using quantitative and qualitative data, we examine the Program in the context of changes in the financial services environment and with respect to the financial literacy issue. We studied the ways in which ICT has changed banking processes and what evidence exists about access to and use of electronic banking. In brief, our quantitative analysis turns up few significant program effects. However, our qualitative work implies that implementation issues likely compromised the effectiveness of the Program. We also find evidence of a potential link between ICT and financial literacy. We argue that urban LMI individuals are interested in becoming technologically and financially literate and that an intensive intervention may help them achieve these goals. Creative interventions are warranted for at least three reasons: (1) banks’ history of underserving LMI communities, (2) predatory lenders’ and fringe financial services providers’ disproportionate targeting of these groups, and (3) difficulties associated with motivating adults to pursue financial literacy. At the same time, banks are unlikely to spearhead and fund these interventions unless they believe there is a return for doing so. The relatively recent popularity of the double bottom line concept, in which businesses seek to improve both their fiscal performance and their social impact, makes it possible to expand the definition of “return” for a program like this one. This research has implications for the role corporate actors can play in augmenting consumers’ financial literacy and, potentially, affecting their financial behavior. The work also has implications for how best to deliver financial education. This case study illustrates the importance of defining success at the outset to ensure that key elements of program design will be incorporated. Our research therefore responds both to the weak incentives for banks to provide financial education and to the difficulty of providing such education to LMI adults.

Definition of key variables, theoretical framework, and literature review

Definition of key variables

Before introducing the literature and theoretical framework that grounds this research, it is important to define key variables. ICT is an umbrella term that includes any communication device or application; with respect to this work, we are most interested in computers and the Internet. Access to ICT refers not only to literal access—that is, having access to a computer and the Internet at home or at another site—but also to sufficient training to use the technology and to content that is relevant to LMI individuals. Financial literacy refers to a person’s ability to understand and make use of financial concepts. Anguelov et al. (2004, p. 1) maintain that electronic banking “encompasses a broad range of established and emerging technologies” and includes both “front end” technologies, such as ATM cards and online banking, and “back end” technologies, such as electronic check conversion. For the purposes of this article, we use the terms “online banking” and “electronic banking” interchangeably to refer to consumers accessing and using existing bank accounts online. Typical activities include paying bills and transferring money between accounts.

Theoretical framework

For many LMI individuals, access to mainstream financial institutions, such as banks, credit unions, and Community Development Financial Institutions, is tenuous because of poor credit histories, insufficient and inconsistent cash flows, and lack of financial literacy. Race- and gender-based discrimination likely also plays a role (Bates 2000; Immergluck 2002). Estimates of the number of unbanked Americans range up to 22 million individuals.1 Many millions more are “underbanked”—they have a bank account but still use fringe financial institutions such as payday lenders and check cashing outlets (Aizcorbe et al. 2003; Stuhldreher and Tescher 2005). Carr and Schuetz (2001) theorize that lower-income families’ nonuse of traditional financial services occurs for complex reasons including: unfamiliarity with banking and savings services, not writing enough checks to justify an account, and distrust of mainstream financial services providers. Other researchers question whether LMI individuals choose not to use mainstream financial institutions because the products and services offered fail to meet their needs (Bond and Townsend 1996; Morduch and Armendariz de Aghion 2005). Low-income families tend to have relatively high debt payment to income ratios, are relatively more likely to make late bill payments, and, as a result, pay more for credit (Aizcorbe et al. 2003 cited in Hogarth and Anguelov 2004). Other research has looked at how different groups prefer to obtain financial information; one study of low-income individuals showed a preference for learning from friends (Hogarth and Swanson 1993), which may limit these individuals’ ability to make good financial decisions.

Banks have a history of underserving low-income communities.2 Historically, banks have neither located in low-income neighborhoods nor have they catered to LMI individuals with products and services geared to these groups (Fondation, Rufano, and Walker 1999). Partly as a result of this, fringe lending has grown astronomically in recent years, and much of this activity is concentrated in LMI communities. Carr and Schuetz (2001) estimate that fees collected annually total $1.5 billion at check cashing services, $1.6–$2.2 billion at payday lenders, and $2.35 billion at rent-to-own stores. These authors theorize that the U.S. financial system is bifurcated, with mainstream financial services concentrated in vibrant communities and fringe financial services—for example, pawnshops, check cashers, payday lenders—concentrated in distressed communities. People who lack financial literacy most often reside in distressed communities and are less likely to be able to distinguish between financial products and to understand the implications of the transactions into which they are entering.

A primary reason why banks do not serve LMI communities concerns perceptions of demand and the size of the market. New methodologies for measuring market size have begun to question the reliability of typical market data and show that these markets are much larger than was previously presumed (Alderslade 2005). Overall, mainstream financial institutions’ service of low-income communities continues to lag; however, some financial institutions and other purveyors of financial products have begun to recognize the potential for capturing these relatively untapped markets with new products. Financial institutions are investigating whether the lower cost of service that technology enables makes it worth the institutions’ effort to expend the resources necessary to capture these markets. Banks are now exploring the potential of information technology (IT) banking tools to serve low-income customers and to attract the unbanked.

Personal finance is becoming increasingly complicated because of innovation and deregulation of the financial sector, and the consequences of insufficient financial literacy are growing more severe, as evidenced by the relatively high default rates associated with subprime mortgages, which are concentrated among low-income borrowers. Previous research has shown that low-income persons have the least financial literacy and that fewer public and private programs are available to help them obtain greater literacy (Braunstein and Welch 2002; Jacob, Hudson, and Bush 2000). Moreover, a greater number of low-educated persons now require financial literacy because most of them are working and managing money as a result of welfare reform and the expansion of the earned income tax credit (Anderson and Gryzlak 2002; Cancian 2001; Loprest 2001).

As noted, the reduced availability of financial literacy training for low-income persons represents a barrier to achieving financial literacy. Low incomes also limit demand for such services among this group even when they are available. Therefore, it may be necessary to subsidize financial literacy programs that raise low-income persons’ literacy to an appropriate level. Such programs may have large returns because previous research has shown that financial literacy is an important determinant of economic well-being (Bernheim 1998; Jacob, Hudson, and Bush 2000). Here, we study one such program.

Our hypothesis that the Program could create both positive financial literacy and digital divide outcomes is based, in part, on the Technology Acceptance Model. The Technology Acceptance Model posits that perceived usefulness and perceived ease of use determine an individual’s intention to use a system (Davis 1989). The Program used the free computers and Internet to get participants to the table. Participants did not necessarily perceive financial literacy to be particularly useful, nor did they believe that computers and the Internet were particularly easy to use. However, they did believe that they needed to be technologically proficient to obtain important information and to get a good job. Once they began to obtain training, we hypothesized that they would learn the connection between financial and technological literacy. The training itself would change their perceptions about ease of use of technology and about the usefulness of financial literacy.

Literature review

Three bodies of research are germane to this study: the electronic banking literature, the financial literacy literature, and the digital divide literature. For each, we are most interested in the pieces that apply especially to LMI individuals.

Electronic banking

In 1994, only 150,000 people banked from their home computers; by 1999, 3.2 million people were paying bills online (Orr and Ali 1999). As of late 2004, 53 million people, or 44 percent of Internet users and one-quarter of all adults, were using online banking (Fox 2005). Of all the major Internet activities tracked by the Pew Internet and American Life Project since its inaugural survey in March 2000, online banking has grown the fastest. In response to increased demand, banks are creating and expanding their online and e-banking presence (Furst, Lang, and Nolle 2001). Banks are strongly encouraging customers to conduct transactions online because electronic banking lowers costs for these institutions. The average transaction cost for banks over the Internet is one cent, compared to $0.27 via ATM, $0.54 by telephone, and $1.07 at a full-service branch (Cuevas 1998).

Electronic banking technologies can be classified as either “passive” or “active” (Kolodinsky, Hogarth, and Hilgert 2004). Passive technologies, such as direct deposit, do not require any behavioral changes on the part of the consumer; these innovations are therefore more easily spread to the mainstream. Active technologies, on the other hand, require new behaviors and are therefore more challenging to propagate. Electronic banking requires “perhaps the most consumer involvement, as it requires the consumer to maintain and regularly interact with additional technology (a computer and an Internet connection)” (Kolodinsky, Hogarth, and Hilgert 2004, 243). Consumers who use e-banking use it on an ongoing basis and need to acquire a certain comfort level with the technology to keep using it. LMI individuals are less likely to have this comfort level than are individuals who are better off.

The proliferation of new electronic banking technologies may, if used wisely, help to bridge the gap between the financial sector concentrated in vibrant communities and that concentrated in distressed communities. The advantages of using computer banking include the ability to see one’s account balances and transfer funds among accounts (Goldfield 1998) as well as “bill-paying that is easier and lower-cost, financial services that are available ‘24/7’, less time spent on financial management tasks, and lower risks associated with carrying cash” (Hogarth and Anguelov 2004, p. 1). Recent research also suggests that “increased use of online banking and bill paying can actually decrease the occurrence of identity theft by taking personal information outside the mailbox and eliminating a paper trail” (Stafford 2004, p. 201). These advantages clearly apply to LMI groups. Studies show that individuals’ willingness to use e-banking technologies is tied to “socioeconomic and demographic characteristics (such as income and age), [and] perceptions of specific technologies (such as perceived ease of use)” (Anguelov et al. 2004, p. 1). Wealthier households, those with college degrees, and those who live in the suburbs are the groups most likely to use online banking (Fox 2005; Kolodinsky, Hogarth, and Hilgert 2004). With apologies for perhaps stating the obvious, it is also necessary for people to have access to computers and the Internet and sufficient experience to be comfortable manipulating their money online. The issue of the digital divide, which is composed of access, training, and content (Servon 2002), must be addressed in concert with financial literacy. Low levels of financial literacy are highly correlated with low levels of technological literacy, and both are disproportionately prevalent in the LMI population (U.S. Department of Commerce, National Telecommunications and Information Administration 2004). Kolodinsky, Hogarth, and Hilgert (2004, p. 256) recommend that “if electronic access to financial management tools is the wave of the future, marketers need to find ways to make all persons comfortable with these tools.” Thus far, financial institutions have tended not to reach out to LMI populations in this way.

Financial literacy

Financial literacy programs have proliferated in the past several years, partly in response to increasing complexity in the financial services environment. Other factors leading to the growth in programs include low levels of financial literacy, low savings rates, growing bankruptcy rates and debt levels, and increased responsibility among individuals for making decisions that will affect their economic futures (Parrish and Servon 2006).

The lion’s share of financial literacy programs has been mounted by public and nonprofit actors. Banks are virtually the only for-profit actors that have entered this arena, but they do less than the other two sectors.3 Clearly, increased financial literacy benefits banks by: (1) moving people from alternative to mainstream financial institutions, (2) getting people to save and invest more, and (3) educating people about products that meet their needs and encouraging them to purchase these products. The advent of online banking provides banks with another reason to educate LMI customers—to move them online and thereby serve them less expensively.

Improving the financial literacy of adults is particularly challenging because they do not attend school, as do youth, and they tend not to have the time or interest in generic financial literacy classes (Parrish and Servon 2006). Researchers and practitioners advocate identifying “teachable moments,” such as home purchasing or filing for bankruptcy, at which financial information is particularly relevant (National Endowment for Financial Education 2004). Some research has found positive results when financial education is offered in the workplace in conjunction with the employee’s making decisions about participation in a retirement savings plan (Bayer, Bernheim, and Scholz 1996; Bernheim and Garrett 1996; Loibl and Hira 2004). However, as Hilgert and Hogarth state, “one of the greatest challenges for policymakers, consumer educators, and practitioners in providing financial education is motivating individuals to pursue it” (2003, p. 320). Policy recommendations focus on school or the workplace as primary intervention points (Koide, Murrell, and Seidman 2007), but these recommendations have limited generalizability to LMI adults, many of whom do not work or whose jobs do not entitle them to retirement savings. For the sample of individuals we studied for this article, for example, approximately half had 12 or fewer years of education, 20 percent did not work at all in the past year, and only 46 percent worked full time for the full year. The average earnings of the sample were approximately $20,000 per year, and 15 percent of the sample received public assistance. Workplace-oriented strategies may not be the most appropriate way to reach this group with financial education. At the same time, given their place along the socioeconomic spectrum, they are both most needy of financial education and most vulnerable to predatory providers of financial services.

Although there is not much research on financial education that targets LMI people, that which does exist suggests that pairing financial literacy training with an opportunity to save is beneficial. The American Dream Demonstration provided low-income participants with the opportunity to save in an Individual Development Account for a home, business start-up, or an education. Participants who saved and completed a financial literacy program had their savings matched. An evaluation of the program found that rates of saving increased for every individual hour of financial education received up to twelve hours (Schreiner, Sherraden, and Beverly 2002). An evaluation of the Financial Links for Low-Income People program in Illinois offers further evidence that low-income individuals benefit from financial education and the opportunity to open a savings account. Financial Links for Low-Income People provided financial education to low-income participants, including Temporary Assistance for Needy Families recipients, a portion of whom had the opportunity to open an Individual Development Account. Participants reported that they were budgeting better, saving more, opening bank accounts, and participating in employer-sponsored retirement plans as a result of the program (Anderson, Scott, and Zhan 2004).

Research about the impact of financial education, while mixed, points to a positive relationship between financial education and financial behaviors and other financial outcomes (Hilgert and Hogarth 2003; Lyons et al. 2006). At the same time, there is “a general lack of understanding and knowledge among financial professionals and educators about how to measure program impact” (Lyons et al. 2006, p. 208). Although research on the format, quality, and content of financial education also varies, financial education experts recommend that the education be active rather than passive. People tend to learn better when they believe that the material is relevant to their lives and when they are able to practice what they learn (Parrish and Servon 2006). The manner in which material is delivered is also important. Some research suggests that traditional approaches to financial education may do a poor job of connecting with individuals (Ciccotello and Elger 2004) and low-income individuals in particular (Shirer and Tobe n.d.). Other research highlights a new approach called “andragogy,” a learner-centric approach that is more flexible and less lecture oriented than traditional classes and that may work better for adults than traditional pedagogical approaches (National Endowment for Financial Education 2004). Shirer and Tobe (2004) found that traditional budgeting classes did not do a good job of retaining participants, and therefore piloted a model curriculum incorporating “stages of change theory,” which they found to be effective for motivating people with few financial resources to pursue a healthy financial lifestyle.

The digital divide

Low-income people and people of color tend to have less access (in terms of quantity and quality) to IT than do whites and higher-income people. Overall, access to IT is increasing at a rapid rate (National Telecommunications and Information Administration 2004). Although some groups of people, namely African Americans, Latinos, and the disabled, remain persistently and disproportionately on the wrong side of the digital divide, the gaps between those who have access to IT and those who do not are rapidly closing. Gaps between rural and nonrural households and between seniors and younger people have begun to narrow. Some divides, such as that between women and men, have disappeared altogether.

Increases in online banking correlate with the spread of faster broadband connections. Sixty-three percent of people with broadband connections at home have tried online banking, whereas only 32 percent of people with dial-up connections have tried online banking (Fox 2005). More complicated financial applications tend to work much more smoothly with high-speed Internet connections. Although the proportion of U.S. households with broadband Internet connections more than doubled between 2001 and 2003, broadband remains too expensive for many potential users. Of the 80 percent of households that did not have broadband connections in 2003, nearly 40 percent said the reason was that it was too expensive (New America Foundation, U.S. Department of Commerce, National Telecommunications and Information Administration 2004). Broadband access is much more the norm for wealthier and better-educated Americans (Horrigan 2004).

Taken together, these three bodies of literature paint a vivid picture of the context out of which the Program was generated. LMI groups stand to gain a great deal from financial and technological literacy and from electronic banking but are consistently excluded from the benefits these advances offer. The rationale for the kind of program launched by the Bank was clear.

The program

In its materials, the Bank describes the Program as “a comprehensive community economic development initiative to stimulate wealth creation through digital inclusion and greater access to online financial services in low and moderate-income (LMI) communities.” The Program is innovative because it jointly addresses the digital divide and financial literacy issues. Broadly, the goals of the Program are to increase the financial literacy of participants, enhance the ability of participants to obtain assets, and help bridge the digital divide. The stakeholders of the program—which include the Bank itself, partner community-based organizations (CBOs), and program participants—are somewhat diverse, however, which has created a set of overlapping, but not entirely consistent goals. This was a demonstration program; the Bank deployed it in five LMI communities in three Northeastern cities—Boston (three sites), Newark, NJ (one site), and New York City (one site).

Clearly, the Bank had other goals in addition to those stated above. These included: (1) getting people to move from tellers and ATMs to online banking, which is much less expensive for banks; (2) retaining customers; and (3) converting customers with savings or checking accounts into users of other investment products and loans.

Unlike many programs designed to bring technology to LMI groups, which focus primarily on access to computers and the Internet, the Program intended to address all three components of the digital divide: access, training, and content. The Bank addressed the access component by providing all Program participants with free computers, printers, and one year of Internet access. The Bank used the term “comfort” to refer to training and technical assistance both with respect to computer and Internet skills and to financial literacy. Recognizing the importance of the forum for the delivery of material, as discussed above, the Bank partnered with local CBOs to provide financial literacy training and training in both computer and Internet skills. The Bank believed that CBOs would best know their communities and that participants might be more comfortable attending classes at a CBO than at a bank. From a practical perspective, the chosen CBOs were already set up to deliver classes using computers (unlike bank branches); they were already operating as community technology centers, providing computer literacy training to their constituents. Computer classes were mandatory for beginners and optional for those who had had some experience with computers. The Bank gave potential participants written questionnaires to complete in order to gauge their level of experience with technology; responses to these questionnaires determined each participant’s technology level. The Program attempted to address the content component of the digital divide by creating Web sites for the communities the Program targeted.

The Bank contracted with a third party to create a financial literacy curriculum tailored to this population. The interactive curriculum was delivered by the CBOs in computer labs and consisted of six 2-hour sessions. The financial literacy component of the Program was designed to teach participants basic financial concepts such as how to balance their checkbooks and how to use credit, as well as how to conduct their banking electronically.

All outreach was done through Bank branches; Bank personnel at branches in the targeted communities were to tell customers about the program and recruit those who met the eligibility requirements. To be eligible to participate in the Program, applicants had to meet the following criteria:

  •   be a Bank customer for at least six months
  •   be LMI (self-reported)
  •   live in the target area served by the branch
  •   not currently have a computer at home.4

We chose to study this Program because it represented an uncommon partnership between a Bank and CBOs to effect positive change in LMI communities. It also provided a unique opportunity to study a financial literacy intervention from beginning to end, enabling us to gain important insight into key planning and implementation issues such as the importance of defining success and connecting this definition to program design. We discussed the difficulties involved in providing financial literacy education to adults in an earlier section; we believed that the incentive environment created by the Program might encourage LMI individuals to obtain financial literacy education because of the free computer and training. In addition, if the Program moved significant numbers of people from teller and ATM banking to online banking, it might incentivize banks to do more on the financial literacy front. In effect, the Program represented a potential way for the bank to “do well while doing good,” achieving a double bottom line. For these reasons, we believed that the Program warranted study.


Given that the Program was a pilot, it was important to document both processes and outcomes to understand what changes could be built into a replicable model and to understand whether and how implementation may have affected results. This research combines aspects of both formative and summative evaluation (Fox, Bartholomae, and Lee 2005) and relies on data from quantitative and qualitative methods. Quantitative data from baseline and follow-up telephone surveys create the bones of the story of participants’ backgrounds and experiences in this program, while qualitative data from interviews and focus groups help to put flesh on those bones. The “process” piece, in particular, which is critical to understand in a pilot, could only be addressed using qualitative methods. Understanding implementation issues would help us to evaluate whether outcomes were the result of the intervention and to what extent implementation issues played a role in the ultimate effects of the Program. We used both quantitative and qualitative methods to study outcomes.

The first two sites to host the Program—site 1 and site 2—were initiated before the evaluation began and were therefore ineligible for the quantitative component of this study, as it was not possible to collect baseline data. For site 3 and site 4, we employed a randomized control group methodology. Applicants at each site were randomly assigned, half to the participant group and half to the control group. Controls were told that they would receive their computers nine months after becoming eligible. We recognize that the sample population may not be representative of the LMI population overall because participants and controls had bank accounts and because the eligible population had uneven access to information about the program. Despite this issue, we maintain that the sample is of interest from a policy perspective. It represents a key segment of the LMI population that could benefit from financial literacy. Those who never bank are in a different category and may require different and more remedial interventions.

The baseline survey was administered by telephone by the Center for Survey Research and Analysis at the University of Connecticut. In addition to basic demographic data, we asked participants two kinds of questions. One set related to their financial literacy (i.e., knowledge of basic financial concepts). The other set asked for personal information about their assets, banking habits, use of technology, and savings patterns. There is considerable variation in how financial literacy programs are evaluated and a general lack of consensus about how to measure program outcomes, with many evaluations simply counting the number of participants who attend sessions (Lyons et al. 2006). Following the consensus from a 2004 U.S. Government Accountability Office forum that “evaluation should focus on behavior change rather than just changes in knowledge and skills” (Lyons et al. 2006, p. 211), the financial literacy questions we employed attempted to gauge both participants’ knowledge (e.g., “Mutual funds pay a guaranteed rate of return” [True or False]) and their behavior (e.g., “Do you save money for emergency expenses?” and “Do you have written spending goals for this year?”). We conducted a follow-up survey via telephone one year after baseline to measure whether and how both knowledge and behavior had changed.5 For the knowledge questions, we looked at whether respondents correctly answered questions they had answered incorrectly at baseline. For the behavior questions, we looked for changes such as paying off credit cards in full and saving regularly.
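The pre/post scoring of knowledge questions described above can be sketched in a few lines of code. This is an illustration only, not the study’s instrument: the question keys and the coding of responses as correct or incorrect are hypothetical.

```python
# Illustrative sketch of pre/post knowledge scoring: a "gain" is a
# question answered incorrectly at baseline but correctly at follow-up.
# Question keys and response coding are hypothetical.

def knowledge_gains(baseline, followup):
    """Return question IDs answered incorrectly at baseline (False)
    but correctly at follow-up (True)."""
    return [q for q, correct in baseline.items()
            if not correct and followup.get(q, False)]

# Hypothetical respondent: missed the mutual-fund item at baseline,
# answered it correctly one year later.
baseline = {"mutual_funds_risky": False, "minimize_cc_interest": True}
followup = {"mutual_funds_risky": True, "minimize_cc_interest": True}
print(knowledge_gains(baseline, followup))  # ['mutual_funds_risky']
```

An analogous comparison over the behavior items (e.g., paying credit cards in full at follow-up but not at baseline) yields the behavior-change measures.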

In addition to the baseline and follow-up surveys, we conducted a total of twenty-eight telephone interviews with key staff at the Bank, relevant staff at Program partner CBOs at all five sites, and other partners. These interviews focused on each actor’s perspective on and history with the Program. Given the stature of the interviewees and the kind of information we sought, we conducted the interviews as “guided conversations,” using the interviewees’ responses to direct the flow of the interview (Rubin and Rubin 2004). We analyzed the text of our interview notes to discern trends and to ensure that quotes used to illustrate points typified interviewees’ comments. At each site, we held one focus group for participants who had used the Bank’s electronic banking Web site and one for those who had not, for a total of ten focus groups across the five sites. Focus group questions complemented the telephone surveys. In addition to asking participants about their financial and technological knowledge and use, we also asked them about their experiences with the Program, suggestions for changes, and perceptions about whether and how their behaviors had changed as a result of participating in the program. At each group, we held a drawing for $100 as an incentive for people to participate. We also attended a meeting called by the Bank’s foundation, which brought together all the CBO partners. Finally, we conducted an extensive literature review in the following areas: community development banking, banking and technology, financial literacy, and the digital divide.6


Findings from survey research

Baseline survey findings

Table 1 documents the numbers of participants and controls at site 3 and site 4, along with response rates for the baseline survey. The relatively small numbers of participants compromise our ability to do analysis beyond comparing differences between participants and controls.7

Table 1. 
Interviews Attempted and Completed

                          Site 3               Site 4
                          Treat     Control    Treat     Control
Total called              182       98         40        36
Interviews completed      134       62         25        22
% of total completed      73.6      63.3       62.5      61.1

We do not believe that the low numbers reflect lack of interest in the program; more likely, they result from early marketing decisions. The Program budget did not allow for extensive marketing. In addition, given that this was the first program of its kind, Bank staff had no way to gauge the potential response. Believing that giving away free computers could be a huge draw, Bank staff understandably wanted to avoid a situation in which the Bank was flooded with applicants who would ultimately be turned away. The Bank therefore conducted outreach only through its branches. However, branch staff were not fully briefed on the program and had no real incentive to tell customers about it. Most of the participants we spoke with heard about the program from friends or family members. In addition, the window of time during which people could apply for the program was relatively small: six months. Finally, the Bank’s restriction of the program to current customers also shrank the potential participant pool. The Bank opened the applicant pool to those without a Bank account in site 5 and quickly had more applicants than it could take.8

Sample descriptive statistics

Table 2 presents sample means of baseline survey responses. The table is divided by experimental group status (treatment versus control), by neighborhood (site 3 and site 4), and by experimental status within neighborhood. We begin with a description of the complete sample. Survey participants are mostly female (80 percent), unmarried (80 percent), and African American (70 percent). Single parents make up approximately half the sample, and approximately half the sample has 12 or fewer years of education. The labor market attachment and earnings of the sample are consistent with the research objective of focusing on low-income populations. A nontrivial portion of the sample, 20 percent, did not work at all in the past year, and only 46 percent worked full time and full year. The average earnings of the sample were approximately $20,000 per year, and 15 percent of the sample received public assistance. In general, the sample population is unsophisticated when it comes to banking and finances. About one-quarter of the sample has a checking and savings account, but few (12 percent) own stocks or bonds. Slightly more than half (52 percent) of the sample has a credit card. Participants’ financial knowledge is also limited; for example, only 32 percent of the sample understood that mutual funds are risky investments. Finally, approximately 40 percent of the sample used the Internet sometimes or often, but only 17 percent banked online.

Table 2. 
Descriptive Statistics of Baseline Characteristics by Experimental Group Status and Experimental Group Status within Facility

                                                  Total(a)    Total Sample          Site 3                Site 4
Variable                                          (N = 243)   Treat     Control     Treat     Control     Treat     Control
                                                              (N = 159) (N = 84)    (N = 134) (N = 62)    (N = 25)  (N = 22)
Non-Hispanic white                                0.
Non-Hispanic black                                0.70        0.68      0.74        0.68      0.75        0.68      0.73
Number of children in household                   1.37        1.31      1.49        1.24      1.28        1.68      2.05
Single parent                                     0.52        0.51      0.52        0.52      0.45        0.48      0.73*
Less than twelve years of education               0.
Twelve years of education                         0.38        0.33      0.48**      0.33      0.50**      0.32      0.41
More than twelve years of education               0.54        0.58      0.45*       0.58      0.43*       0.06      0.05
Did not work past year                            0.
Worked full time–full year                        0.46        0.46      0.45        0.48      0.40        0.40      0.59
Annual earnings(b)                                19,966      20,037    19,838      20,666    19,378      17,874    20,811
Missing earnings information                      0.35        0.36      0.33        0.41      0.39        0.08      0.18
Received public assistance                        0.
Has stocks, bonds, mutual funds                   0.
Has credit card                                   0.52        0.50      0.56        0.51      0.58        0.44      0.50
Owns a mortgage                                   0.
Banks online                                      0.17        0.21      0.10**      0.23      0.11*       0.12      0.05
Banks by phone                                    0.15        0.18      0.09**
Uses debit card                                   0.16        0.19      0.10*
Saves money each month                            0.65        0.66      0.63        0.65      0.54        0.68      0.86
Always uses monthly budget                        0.39        0.37      0.42        0.39      0.41        0.28      0.45
Always pays bills on time                         0.59        0.60      0.58        0.61      0.55        0.52      0.68
Always plans and sets goals for financial future  0.
Pays credit card balance each month(c)            0.
Knows mutual funds have risk                      0.32        0.30      0.35        0.29      0.35        0.36      0.32
Knows how to minimize credit card interest        0.87        0.88      0.86        0.89      0.84        0.80      0.91
Uses Internet often                               0.21        0.24      0.14*       0.26      0.10**      0.16      0.24
Uses Internet sometimes                           0.22        0.28      0.10**      0.28      0.12**      0.28      0.05**

Note: Asterisks indicate that the difference in means is statistically significant at the specified level. Differences in means are within columns as labeled in the first row of the table.
(a) Sample sizes listed in column headings represent the number of interviews, not the number of valid responses. The number of valid responses differs for each item.
(b) Calculated for those with valid earnings.
(c) Calculated for those with credit cards.
* .05 < p < .10; ** p < .05.

From a research perspective, one of the most important issues is whether the random assignment of study participants resulted in treatment and control groups that are similar. To investigate this, we show sample means by experimental group status in columns 2 (treatment group) and 3 (control group). Surprisingly, there are some statistically significant differences between the groups. Members of the treatment group are more likely to be female, are less educated, are more likely to bank by phone or online, and are more likely to use the Internet than members of the control group. These significant differences are most likely due to the relatively small sample sizes of the treatment (N = 159) and control (N = 84) groups; they do not necessarily indicate a nonrandom assignment.

There are also some significant differences by neighborhood. Survey participants from site 4 tend to have larger families and are more likely to be single parents than survey participants from site 3. Participants in site 4 also have fewer financial accounts, such as checking and savings, and are less likely to bank online or by phone than are participants from site 3.

Within neighborhood, there are some differences by experimental group status. In site 3, members of the treatment group are more likely to be female, are less educated, are more likely to bank by phone or online, and are more likely to use the Internet than members of the control group. This finding is similar to that for the full sample of which participants from site 3 make up 81 percent. In site 4, members of the treatment group are less likely to be single parents and more likely to use the Internet than members of the control group. Other differences between the treatment and the control groups in site 4 are sometimes large, but because of the small sample sizes, these differences are not statistically significant. For example, 40 percent of the treatment group worked full time and full year, whereas 60 percent of the control group worked this amount.

In sum, the sample represents a relatively low-income group that is financially and technologically unsophisticated. These are exactly the types of persons who may benefit from the Program. The provision of computers, computer instruction, and financial literacy training may improve this group’s financial planning and computer skills, which in turn would lower their cost of access to banking, financial advice, and information.


The respondents (treatments and controls) to the baseline survey were recontacted approximately one year after the first interview. The goal of the follow-up survey was to obtain information with which to assess whether the Program had an impact on computer use and financial planning skills. We also collected information about participants’ satisfaction with the training classes offered. Before assessing program impacts, however, it is of interest to investigate the extent and nature of attrition from the sample. Ideally, there would be no attrition; failing that, it would be preferable that attrition not be related to experimental status—that is, to whether a respondent was in the treatment or the control group.

Table 3 presents the results of an ordinary least squares regression analysis of attrition. The dependent variable is equal to one if the respondent was not successfully contacted at the time of the follow-up interview. The regression model includes as control variables all the variables listed in Table 2, plus a set of dummy variables that measure the number of times the person was called to obtain the baseline interview. For example, one person may have responded to the survey on the first call, while another may have required several calls.

Table 3. 
Ordinary Least Squares Regression Estimates of Attrition Probability

Site 3                                          −0.06 (0.09)
Control group                                   −0.19 (0.07)**
Female                                          −0.18 (0.09)**
Hispanic                                         0.28 (0.25)
Non-Hispanic black                              −0.01 (0.23)
Other race                                      −0.13 (0.25)
Married                                          0.00 (0.12)
Number of children in household                  0.06 (0.03)**
Single parent                                   −0.08 (0.11)
Less than twelve years of education             −0.08 (0.14)
Twelve years of education                       −0.10 (0.08)
Did not work past year                           0.02 (0.11)
Worked full time–full year                       0.00 (0.08)
Annual earnings                                 −0.00 (0.00)
Missing earnings                                −0.02 (0.08)
Received public assistance                       0.07 (0.10)
Has checking account                             0.02 (0.11)
Has savings account                             −0.03 (0.10)
Has stocks, bonds, mutual funds                 −0.01 (0.12)
Owns a mortgage                                 −0.07 (0.11)
Has credit card                                  0.02 (0.07)
Banks online                                    −0.03 (0.11)
Banks by phone                                  −0.12 (0.08)
Uses debit card                                  0.03 (0.09)
Saves money each month                          −0.08 (0.07)
Plans and sets financial goals for future       −0.03 (0.08)
Always uses monthly budget                      −0.07 (0.07)
Always pays bills on time                        0.08 (0.07)
Pays credit card balance each month             −0.16 (0.13)
Missing pays credit card balance information     0.02 (0.10)
Knows mutual funds have risk                     0.04 (0.08)
Knows how to minimize credit card interest       0.01 (0.10)
Uses Internet often                             −0.09 (0.10)
Uses Internet sometimes                          0.03 (0.09)
Three to five calls in wave 1                    0.05 (0.09)
Six to eight calls in wave 1                     0.35 (0.11)**
Nine or more calls in wave 1                     0.25 (0.11)**
Intercept                                        0.70 (0.29)**
No. of observations                              215

Note: Standard errors are given in parentheses.
* .05 < p < .10; ** p < .05.

Estimates shown in Table 3 indicate that attrition is significantly related to experimental group status: persons in the treatment group—those who received a computer—were significantly less likely to participate in the follow-up survey than those in the control group, who were still waiting to receive a computer. The probability of participating in the follow-up survey was 19 percentage points lower for the treatment group than for the control group. This result is understandable: persons in the treatment group had already received a computer, so their incentive to continue in the study was greatly reduced. Few of the other observed characteristics were related to attrition. Females were less likely to leave the sample than males, and those with relatively few children were less likely to leave than those with two or three children. The greatest correlate of attrition, however, was the number of calls it took to obtain the baseline survey: the probability of participating in the follow-up was approximately 30 percentage points lower for those who required six or more calls to complete the baseline interview than for those who needed only one or two.

Sample attrition can bias estimates of program effects, particularly when the attrition is nonrandom, as in the current case in which treatment group members were more likely to leave the sample. The other estimates in Table 3, however, suggest that this nonrandom attrition will not be a severe problem: attrition was unrelated to most observed characteristics, which implies that those who did not participate in the follow-up differ little from those who did, at least on the basis of many observed characteristics.9 That those who were difficult to contact the first time were less likely to participate in the follow-up is understandable, as is the lower participation of the treatment group. What is surprising, but advantageous, is that these “leavers” were observationally very similar to “stayers.” The bias due to attrition is therefore likely to be small.
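Because the dependent variable in Table 3 is binary, the OLS specification above is a linear probability model, and its coefficients can be read directly as percentage-point changes in the probability of attrition. The sketch below illustrates the estimation with synthetic data; the variable names, sample construction, and effect sizes are illustrative, not the study’s actual data or code.

```python
# Linear probability model of attrition on synthetic data: OLS with a
# 0/1 outcome, so coefficients are percentage-point effects.
import numpy as np

rng = np.random.default_rng(0)
n = 215                                 # observations, as in Table 3

control = rng.integers(0, 2, n)         # 1 = assigned to control group
female = rng.integers(0, 2, n)
many_calls = rng.integers(0, 2, n)      # 1 = six or more baseline calls

# Hypothetical data-generating process echoing the signs in Table 3:
# controls and women attrit less; hard-to-reach respondents attrit more.
p = 0.5 - 0.19 * control - 0.18 * female + 0.30 * many_calls
attrited = (rng.random(n) < np.clip(p, 0, 1)).astype(float)

X = np.column_stack([np.ones(n), control, female, many_calls])
beta, *_ = np.linalg.lstsq(X, attrited, rcond=None)
print(dict(zip(["intercept", "control", "female", "six+ calls"],
               np.round(beta, 2))))
```

With enough data the estimated coefficients recover the percentage-point effects built into the simulated probabilities; in practice, the standard errors (in parentheses in Table 3) indicate which effects are statistically distinguishable from zero.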

Program effects

We obtained estimates of the program effect using a pre- and post-test with comparison group research design, which is sometimes referred to as a difference-in-differences (DD) approach.

The underlying assumption of this research design is that pre- and postintervention changes in outcomes would be the same for treatment and control group members if there had been no intervention. This assumption is likely to be valid given the experimental design and the evidence presented in Table 2 as to the similarity of the treatment and control groups. Nevertheless, the absence of perfect randomization and the fact that attrition was not random may result in some bias.
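The DD estimator itself is simple arithmetic: the pre-to-post change for the treatment group minus the pre-to-post change for the control group. A minimal sketch, using illustrative proportions rather than actual figures from Table 4:

```python
# Difference-in-differences: (treatment post - pre) minus
# (control post - pre). Inputs here are illustrative proportions.

def diff_in_diff(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Return the DD estimate of the program effect."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# Example: an outcome rises 13 points for the treatment group and
# 4 points for the control group, for a 9-point estimated effect.
effect = diff_in_diff(0.21, 0.34, 0.10, 0.14)
print(round(effect, 2))  # 0.09
```

Covariate-adjusted DD estimates, such as those in the final column of Table 4, come from a regression that adds baseline characteristics to this comparison.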

In Table 4, we present simple and covariate-adjusted DD estimates.10 The first three columns pertain to the treatment group; column 3, labeled “Difference,” shows an estimate of (B − A) from above. Similarly, column 6 shows an estimate of (D − C), and the last two columns show the DD estimates, one unadjusted (simple) and one adjusted for other covariates. Focusing on the results in column 3, we see statistically significant increases in a few important areas: the proportions of treatment group members who own stocks and bonds, who bank online, who use a debit card, and who use the Internet often. Other changes tend to be relatively modest and not statistically significant. At first glance, there appear to be few changes in the outcomes specifically targeted by the program. However, temporal changes unrelated to the intervention may obscure true program effects, so it is necessary to examine the pre- and postintervention changes in the control group to isolate the program effects from other potentially confounding factors.

Table 4. 
Differences and DD Estimates of Program Effects, Site 3 and Site 4 Combined (Unpaired Data Set)

                                              Treatment Group           Control Group             DD: Treatment − Control
Variable                                      Wave I  Wave II  Diff.    Wave I  Wave II  Diff.    No Covariate   Covariate
                                              (1)     (2)      (3)      (4)     (5)      (6)      Adjustment (7) Adjustment (8)
Has stocks, bonds, mutual funds               0.11    0.20     0.09*    0.12    0.10     −
Has credit card                               0.50    0.56     0.06     0.56    0.64     0.08     −0.02          0.02
Owns a mortgage                               0.**    −0.11    −0.10
Banks online                                  0.21    0.34     0.13***
Banks by phone                                0.47    0.41     −0.06    0.40    0.40     0.00     −0.06          −0.05
Uses debit card                               0.61    0.75     0.14**   0.49    0.64     0.15*    −0.01          −0.03
Saves money each month                        0.66    0.60     −0.06    0.63    0.57     −
Always uses monthly budget                    0.37    0.39     0.02     0.42    0.37     −
Always plans–sets financial goals for future  0.27    0.22     −        −0.14   −0.12
Always pays bills on time                     0.60    0.65     0.05     0.58    0.57     −
Pays credit card balance each month(a)        0.      −
Knows mutual funds have risk                  0.30    0.31     0.01     0.35    0.30     −
Knows how to minimize credit card interest    0.88    0.86     −0.02    0.86    0.88     0.02     −0.04          −0.04
Uses Internet often                           0.24    0.41     0.17**   0.14    0.32     0.18**   −0.01          0.02
Uses Internet sometimes                       0.28    0.27     −        −0.08   −0.07

Note: Column (3) = (2) − (1); column (6) = (5) − (4); columns (7) and (8) = (3) − (6), without and with covariate adjustment.
(a) Calculated for those with credit cards.
* .05 < p < .10; ** p < .05.

The pre- and postintervention differences for the control group are presented in column 6. In this case, there are statistically significant increases in the proportions of the control group who own a mortgage, who use a debit card, and who use the Internet often. In addition, the proportion of the control group that has a credit card increased by 8 percentage points.

In general, the pre- and postintervention changes in outcomes for the control group are quite similar to those for the treatment group, which implies that the Program had few effects. This point can be further illustrated by examining the DD estimates in the last two columns. The first point to note is that, with one exception, there are no statistically significant differences. Online banking increased significantly more (by 9 to 15 percentage points) for the treatment group than for the control group, but this effect is almost assured by the nature of the intervention. Thus, from a purely statistical point of view, the Program had no effects. Even a less stringent criterion, looking for large (practically important) rather than statistically significant effects, reveals few potential program impacts. Use of online banking did increase by at least 9 percentage points for the treatment group, an increase that is meaningful from the Bank’s perspective; it represents a 42 percent increase over the baseline level but still leaves the share of persons using online banking at only 30 percent (21 percent at baseline plus a 9-point program impact). For other outcomes, there were few “large” effects. In fact, the largest effect was perverse: members of the treatment group were less likely to plan and set future financial goals at follow-up than they were at baseline.

In Table 4, we do not allow program effects to differ by the intensity of the intervention. We simply investigate whether those in the treatment group differed systematically from those in the control group. But within the treatment group, the intensity of the intervention differed; some members of the treatment group attended beginner classes in financial literacy, while others attended intermediate classes and still others attended no classes. We can use this information to assess whether there was a dose-response relationship between the intervention and the outcomes. Specifically, we examine whether the program had different effects for three types of persons among the treatment group: those who took no financial literacy classes, those who took only beginner classes, and those who took intermediate level classes. Unfortunately, since participation in classes was voluntary, this stratification of the treatment group may introduce selection bias. For example, perhaps those more motivated to learn will enroll in classes, but motivation to learn is likely related to other characteristics that influence how much these persons save or plan for their financial future. Thus, the results of this analysis need to be interpreted with caution.
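The dose-response comparison amounts to stratifying the treatment group by class attendance and comparing mean outcome changes across strata. The sketch below shows the grouping logic on fabricated records; the attendance categories match the text, but the outcome values are invented for illustration.

```python
# Group treatment members by financial literacy "dose" (classes taken)
# and compare mean outcome changes across groups. Records are fabricated.
from statistics import mean

# (classes_taken, change_in_outcome) for six hypothetical participants
records = [("none", 0.0), ("none", 0.1), ("beginner", 0.0),
           ("beginner", 0.2), ("intermediate", 0.1), ("intermediate", 0.3)]

by_dose = {}
for dose, change in records:
    by_dose.setdefault(dose, []).append(change)

for dose in ("none", "beginner", "intermediate"):
    print(dose, round(mean(by_dose[dose]), 2))
```

Because attendance was voluntary, any gradient across these groups confounds the class “dose” with self-selection, which is the caution raised above.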

We do not present the results of the dose-response analysis, but summarize the findings. In short, we gained little insight from the analysis. In most cases, there were no differences in outcomes between persons in different categories. In fact, there were some perverse effects, as those with fewer classes sometimes had greater improvements in financial literacy. This result is most likely related to the selection issue noted above.

The quantitative analysis tells us that the Program did not have significant impacts on participants. However, given that we were studying a demonstration program, the qualitative component of the work is particularly important. The intent of this component was to gain a better understanding of why the Program results were what they were. Were they, for example, a function of poor program design? Of implementation? Or do these results indicate that the idea behind the Program was itself faulty?

Findings from focus groups and interviews

The findings below are separated into two sections—one related to the program and another that concerns participants. Findings from the qualitative component of the research inform the findings from the survey and shed light on potential reasons for the dearth of effects.

Program-related findings

Interviews and focus groups revealed significant issues with the implementation of the Program. These issues concerned marketing, resources, logistics, and cost. We believe that they compromised the ability of the Program to achieve its potential. A fine-grained understanding of these issues, and of their likely connection to Program outcomes, illuminates the importance of planning and of linking design features to definitions of success.

Initial marketing decisions led to much smaller numbers of participants than the Bank anticipated. All marketing of the Program was done through the branches, and this marketing was uneven. According to one Bank staffer, “the program wasn’t marketed actively enough at the branches—they weren’t doing a real sales pitch.” Said another, “We were unable to get out into the community as much as we would have liked.” Word of mouth works well but takes time. By the time word got out about the program, the six-month sign-up windows were closing.

Representatives of all the CBOs interviewed expressed a desire to have been more involved in outreach from the beginning of the program. They believe that they have a better understanding of the community they serve than does the Bank. Said one CBO representative, “We can hustle here—we have organizers.” One interviewee offered the following explanation: “There has been an overall concern about the possibility of negative PR that has resulted in the project being more low-key than I would have liked it to be …. I understand the concern, but the program was promoted very passively.” Although the Bank recognized this—to an extent—by incorporating CBOs into the program design, the implementation failed to fully leverage the CBOs’ assets.

When asked how they had heard about the Program, many participants told us that it had been poorly publicized and that they heard about it from a friend. One mentioned that she had asked the bank representative at her local branch about it, but that the rep had no knowledge of the Program. Only her persistence enabled her to participate.

It also became apparent to us that the Bank did not devote sufficient resources to the Program to achieve success. Bank staff and community actors generally agreed that the Program budget was not large enough to realize the full potential of the Program. Said one bank staff member, “We have only two people to implement this—there is so much more we believe we could be doing. [Our primary staff person] tries to make house calls if we can’t solve problems over the phone. I just know the impact of that is so great. If we could replicate that across even 50% of participants our impact would be that much greater.” A CBO executive director offered the following, “There needs to be at least one dedicated person to focus on managing this and really be up on what’s going on. There are only a couple of people [at the Bank] working on this, and they are working on other things. [The Program] needs a champion, someone to go to community meetings and keep in better touch with the stakeholders.”

Logistical problems, which arise during any demonstration program, also compromised the potential success of the program. The Bank initiated the Program before all the necessary pieces were in place. For example, site 1 participants received their computers months before the curriculum was completed, which delayed their receipt of training. At another site, a CBO executive director complained that only two participants had received their computers when training began. Therefore, these participants could not go home and practice on their computers between classes. According to this executive director, it is “essential that they have their computers when they are going through the training.” Said another, “Logistics were very poor. There were people who had already started and completed the program when they received letters from the Bank saying they were now eligible to participate. And they still had no computers.” In some classes, there were problems with the electronic banking Web site, making it impossible for the instructors to demonstrate it to participants at class time. In addition, some CBOs complained that the Bank repeatedly postponed events and changed dates. These changes included training sessions for the CBOs and dates by which computers would be delivered to participants. Clearly, these are problems that any startup program could encounter. The key is to learn from such early problems and to incorporate the lessons into ongoing work; early evidence is that the Bank did this at site 5. At the same time, it is reasonable to hypothesize that these issues affected the impact of the Program on participants.

Although the Bank provided free equipment and Internet access, the Program did not take into account ancillary expenses that participants might incur. The LMI individuals whom the Program targeted are extremely sensitive to cost. One participant said that she has not replaced her printer cartridge because cartridges are too expensive ($30–$40). Another said that she could not find the cartridges in her neighborhood and that it was hard for her to get a ride to a store that carried them. This story begins to reveal the complexity of the lives of low-income urbanites and how comprehensive a program must be to be effective. We confronted the transportation issue ourselves on one of the days we held our focus groups. The weather was very cold, with freezing rain, and snow was predicted. As we made calls that morning to remind people to attend, several said that they were unable to make it out because of the weather and their lack of transportation options.

Other participants have come to the end of their year of free Internet service and do not know if they can afford to pay for it themselves. Some expressed concern about the fees they would have to pay to continue banking online (as part of the program, they received free access to the Bank’s online banking site for only one year). Although online banking is inexpensive for banks, it is relatively expensive for consumers who may not otherwise use high-speed Internet services because of the associated fees, and sophisticated online banking software is cumbersome over a dial-up modem connection. One participant noted that other computer classes were available at the CBO but that they cost money and she could not afford them; she believed that she would have benefited from more free training.

Participant-related findings

For those participants who achieved a reasonable comfort level with online banking, we found interesting potential connections between online banking and financial literacy. For those participants who had less experience with ICT, it took a significant amount of training and support to get them comfortable.

One clear difference between participants who used the Bank’s electronic banking Web site and those who did not was that the former tended to have some prior experience with computers and/or a friend or family member who could help them navigate the Internet and answer questions when they got stuck. Inexperience and fear of being taken advantage of were two factors that came across clearly in the groups of participants who did not use electronic banking. Said one participant: “I’m afraid to do it [bank online]. You hear about all of these scams, and people getting your number. I would have to make sure I had it down pat in order to put my info in there.” Another was fearful of banking online “because you’re dealing with bills and dealing with money and I don’t want to mess it up.” With respect to inexperience, one participant said that the first several times she tried to connect to the Internet, the computer “made a funny noise, so I just shut it down because I thought I was breaking it.” Upon telling her instructor about her “problem,” she learned that the noise was her modem connecting to the Internet. Stories such as this one illustrate the basic level at which training must begin for IT neophytes. Even those who began to use electronic banking expressed some trepidation about doing business online. Said one participant, “If I want to do something and I find out it’s not a secure line, I’m out of there.”

When asked why they were interested in applying for the Program, participants spoke mostly about feeling the need to learn about technology. Typical responses included one person who said, “It’s something that you need to know to keep up.” Another relayed that she had “been told that it’s something you need to know to get jobs” and that she had been asked whether she was computer literate. Said another, “I felt like, ‘Wow, I can really learn this.’ I don’t have to feel funny now.” Another echoed this sentiment, saying, “Now I know what it’s like to have a computer in my home—it’s like not being left out.”

One general finding that held across all groups was that most participants desired more training—longer classes and more classes. Several participants spoke of having forgotten what they learned in class and wishing they could return for a refresher course. Said one participant, “If you’ve never done computers, it’s like someone hitting you over the head with a block. The information goes over my head, like there’s so much coming at you and bombarding you that you start to tune it out.” Participants were perhaps less comfortable with technology than Bank staff anticipated. Said one staff member, “Many of the customers were needy; they had a lot of questions and concerns that needed to be addressed.” Another recognized that “ten hours of free training was not quite enough, especially for those with no experience. We stretched it from ten to fifteen because we didn’t hit the numbers we wanted.”

The Bank had some problems with the contractor used to create the curriculum, and some staff were unhappy with the result. Said one staffer, “There were not enough visuals, it wasn’t portable enough, and it was not good for people who have trouble reading.”

One CBO staffer said that Program participants get interested in the computer training quickly, even if they are new users. “Once they get to the second lesson, they want to keep going—that sort of locks them in.”

Although we did not see significant behavioral changes in participants, such as saving more money, the comments participants made about the financial literacy component of the Program indicate that there may be some important synergies between financial literacy training and e-banking. Those who use the Bank’s electronic banking Web site find that it helps them pay their bills on time. Typical comments included that of one participant who said that it “keeps me organized.” Another noted that it “made it easier for me to look at my spending by seeing my statements online.” Others echoed the sentiment that the visual aspect of electronic banking was key. For example, “I feel like I have more control of my money because I can see it.” Said one participant, “Now I know where my money is, and how much I have to play with. I don’t carry cash anymore.” Yet another liked that she “could easily tell when a bill was paid.” A few participants are very happy with recent Web site enhancements, which they find easier to use.

There is some anecdotal evidence that learning the budgeting software is enabling participants to think differently about money and to begin to save. Said one participant, “I do have money. I save more now because I can see it.” One of the CBO trainers said that in the classes on financial literacy, she saw “lightbulbs going off. They were seeing how they could make adjustments and begin to save. These tools are helping them to make better choices, and nobody else is giving them these tools. They are learning that saving even $25 a month will make a difference.” Others said that being able to transfer money from one account to another online enabled them to avoid costly fees because it helped them avoid bouncing checks. Going through the Program has generally made participants more comfortable with technology, particularly with e-banking. Said one CBO staff member, “Everyone was so afraid of it, and now they realize there’s nothing to it.”

At the same time, two participants had had negative experiences that made them even more reticent about trusting the technology.11 For example, one participant relayed that she had tried to set up her account to pay bills while she was on vacation; she returned to find that several checks had bounced. Although she was able to negotiate with the Bank to get the fees reduced, she is now “scared to go back and try it again.”

One factor that most certainly affects the financial literacy outcomes of this intervention concerns the socioeconomic status of the participants. LMI individuals, by definition, have less ability to save, invest, and engage in other positive financial behaviors because their income and assets are minimal. As Lyons et al. maintain, “no matter how much financial education they receive, financially insecure participants will likely find it more difficult than financially secure participants to meet certain program goals (i.e., increasing savings, paying bills in full)” (2006, p. 232).

Reactions to the curriculum were mixed and indicate that the curriculum developed may not have worked particularly well with the target population. One participant felt that “the budgeting information was helpful but I don’t use it. The spreadsheets were more in depth and more complicated than I need.” When asked whether and how the financial literacy information helped them to keep track of their expenses, one participant said, “The computer won’t help you if you don’t have the discipline yourself.” In general, CBO staffers felt that the materials provided by the Bank were good. One CBO executive director felt that the curriculum was geared toward more advanced students—at that CBO, they incorporated their own curriculum into what the Bank provided. This executive director believed that a more basic curriculum was needed and that the Bank should do a more precise screening to separate participants with no computer experience from those with very limited experience.

Discussion and policy implications

Our quantitative analysis shows that the Program generated few effects. On the positive side, there were statistically significant increases for participants in the following areas: owning stocks and bonds, using a debit card, owning a credit card, and using the Internet. Findings from the qualitative component of the study showed substantial implementation issues. It seems reasonable to hypothesize that Program effects may well have been larger if the Program had been better implemented.

All demonstration programs need the space to try out new ideas and incorporate the learning from initial tests into later models. Although the Program set out to address access, content, and training issues, it fell somewhat short on the content and training fronts. Participants clearly wanted more training and the Web site created to address content did not meet participants’ needs. We believe that the idea behind the Program is viable, but that appropriate implementation is key.

Indeed, any new initiative needs to “begin with the end in mind.” Despite its good intentions, the Bank encountered problems because it instituted the Program without thinking through at the beginning how to define and ultimately measure success. This finding leads to our first policy implication, which is that financial literacy interventions must be thought through from beginning to end to maximize the potential for success. In the case of this Program, the Bank clearly could have better used the CBOs it partnered with to foresee and prevent issues that participants ultimately faced, such as a need for more training, or the expense of maintaining their computers and paying for the Internet after the one-year intervention came to a close. The theoretical framework of the technology acceptance model also informs implications around program design. In the case of the Program, the intervention changed participants’ perceptions of the ease of technological literacy and of the usefulness of financial literacy. Given the difficulties associated with delivering financial literacy education to adults, future financial literacy interventions could usefully employ this framework.

Another important lesson from this project is that creating these interventions does not come cheaply. Making a program of this kind work requires significant investment and maintenance. For corporate actors to undertake this investment, there must be a compelling reason such as a boost to the bottom line. Coupling ICT training with financial literacy training may help banks expand into currently underserved markets in a manner that is cost effective. In this demonstration program, technology operated as a powerful hook to get LMI individuals to the table to learn about financial literacy. And the prospect of moving customers from ATM and teller services to e-banking acted as an incentive for the Bank to provide not only the financial literacy training but also the technology. However, if doing so requires providing the level and amount of equipment and training that this research implies it might, it is possible that the cost–benefit analysis of whether or not to provide such programs would not come out positive for banks. Motivating banks to participate more in financial literacy initiatives such as this one will require policy makers and advocates to make a better case for how banks benefit from these initiatives and why “double bottom line” thinking is important.

Our work also supports that of others who theorize that financial literacy education is most powerful when it connects to a person’s life in concrete ways. Financial literacy advocates have recommended teaching financial literacy skills on the job or at key moments, but LMI individuals are less likely than the average person to be employed or to be purchasing a major asset. A third policy intervention involves the need to think creatively about interventions that resonate with people’s lives. The Program reached participants through their bank accounts, using technology as a lure. A similar intervention that incentivized the unbanked to obtain a bank account might be appropriate for an even larger population.

Finally, although this research demonstrates a potentially powerful connection between technological literacy and financial literacy, it also illustrates the barriers LMI individuals face in achieving technological literacy. Given the difficulties in designing financial literacy interventions for adults (Parrish and Servon 2006), and the finding that manipulating their own money online may be a compelling gateway into financial literacy, another policy implication arising from this work is that efforts to close the digital divide must be strengthened. Although the current administration has nearly eliminated support for digital divide initiatives, compelling research suggests that the digital divide persists and that LMI individuals are most likely to be on the wrong side of the divide. Broadband is increasingly necessary to support online banking, and people who either do not have any Internet access or who have dial-up access tend to be low income (Horrigan 2007). Leveraging the connection between technological and financial literacy will require targeted initiatives to address this issue.


Financial literacy and technological literacy are important resources that low-income people need to exit poverty. Servon (2002) uses the term “second-order resources” to describe the tools people need to exit poverty rather than simply surviving from day to day. There is some evidence from our work that technological training and e-banking support financial literacy—the ability to see and work with their own money that e-banking provides made the financial literacy training more compelling to participants. However, without appropriate financial literacy and ICT training, e-banking will remain a sphere reserved only for financially literate, well-educated, high-income customers. Public policy research has not produced any empirical data on how e-banking and Internet services have shaped LMI households’ economic fortunes. However, this work suggests that IT can expand access to greater financial freedom through increased financial literacy and less-expensive banking alternatives for LMI household users and poor inner-city communities.


Differences and DD Estimates of Program Effects, Site 3 and Site 4 Combined (Paired Data Set)

| Variable | Treatment Wave I (1) | Treatment Wave II (2) | Difference (3) = (2) − (1) | Control Wave I (4) | Control Wave II (5) | Difference (6) = (5) − (4) | DD, No Covariate Adjustment (7) = (3) − (6) | DD, Covariate Adjustment (8) |
|---|---|---|---|---|---|---|---|---|
| Has stocks, bonds, mutual funds | | | | | | | | |
| Has credit card | 0.47 | 0.56 | 0.09 | 0.61 | 0.64 | 0.03 | 0.06 | 0.10 |
| Owns a mortgage | | | | | | | −0.07 | −0.07 |
| Banks online | 0.25 | 0.34 | 0.09 | | | | | |
| Banks by phone | 0.51 | 0.41 | −0.10 | 0.45 | 0.40 | −0.05 | −0.05 | −0.04 |
| Uses debit card | 0.65 | 0.75 | 0.10 | 0.49 | 0.64 | 0.15* | −0.05 | −0.04 |
| Saves money each month | 0.68 | 0.60 | −0.08 | 0.64 | 0.57 | −0.07 | −0.01 | 0.02 |
| Always uses monthly budget | 0.36 | 0.39 | 0.03 | 0.47 | 0.37 | −0.10 | 0.13 | |
| Always plans–sets financial goals for future | 0.29 | 0.22 | −0.07 | | | 0.08 | −0.15 | −0.12 |
| Always pays bills on time | 0.59 | 0.65 | 0.06 | 0.54 | 0.57 | 0.03 | 0.03 | 0.08 |
| Pays credit card balance each month a | | | | | | | | |
| Knows mutual funds have risk | 0.33 | 0.31 | −0.02 | 0.31 | 0.30 | −0.01 | −0.01 | −0.01 |
| Knows how to minimize cc interest | 0.89 | 0.86 | −0.03 | 0.85 | 0.88 | 0.03 | −0.06 | −0.08 |
| Uses Internet often | 0.29 | 0.41 | 0.12 | 0.14 | 0.32 | 0.18** | −0.06 | 0.02 |
| Uses Internet sometimes | | | | | | | −0.06 | −0.07 |

  • a

    Calculated for those with credit cards.
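The table’s difference-in-differences columns follow mechanically from the wave means: column (7) subtracts the control group’s change across waves from the treatment group’s change. A minimal sketch of that arithmetic in Python, using the “Has credit card” row from the table above (the helper name `dd_estimate` is ours, not the paper’s):

```python
def dd_estimate(treat_w1, treat_w2, control_w1, control_w2):
    """Unadjusted difference-in-differences estimate from group means.

    Mirrors the table's columns: (3) = treatment Wave II - Wave I,
    (6) = control Wave II - Wave I, and (7) = (3) - (6).
    """
    treat_diff = treat_w2 - treat_w1          # column (3)
    control_diff = control_w2 - control_w1    # column (6)
    return treat_diff - control_diff          # column (7)

# "Has credit card": treatment 0.47 -> 0.56, control 0.61 -> 0.64
print(round(dd_estimate(0.47, 0.56, 0.61, 0.64), 2))  # 0.06
```

Column (8) reports the same contrast after regression adjustment for covariates, which this sketch does not attempt.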

  • 1

    Estimates of the unbanked have been developed by the General Accounting Office using data from the 1998 and 1999 Survey of Income and Program Participation.

  • 2

    The Community Reinvestment Act was passed in 1977 to address the widespread practice of redlining in low-income areas and has somewhat mitigated this condition. The Community Reinvestment Act requires banks to provide services to all the areas in which they operate. However, although new guidelines added in 1992 gave the Community Reinvestment Act sharper teeth, the problem remains.

  • 3

    Some leading-edge corporate actors have begun to provide financial education programs as research has begun to demonstrate a link between workers’ financial stability and productivity (Garman 1998; Kim and Garman 1998; Quinn 2000).

  • 4

    The Bank changed these requirements somewhat for site 5 because the bank was opening a new branch there. Therefore, the Bank allowed people to apply for the program and open an account at the same time; these applicants had to wait six months before receiving their computers. The Bank hoped to recruit two hundred participants for the Program in site 5 and did so very quickly.

  • 5

    It is possible that waiting longer than one year to conduct the follow-up survey would have enabled us to see greater Program effects. We decided against waiting longer for several reasons. First, it would have added significantly to the cost of the study. Second, given that LMI individuals tend to be mobile and difficult to keep track of, we would have lost many participants and controls. And finally, we would also have had to defer giving the control group computers, Internet access, and training; the Bank believed doing so would upset its customers and was unwilling to do so.

  • 6

    The Bank would not allow the use of random control group survey methodology in site 5. This fact, combined with the difference in the structure of this site, keeps us from combining data from site 5 with data from the other two sites.

  • 7

    The Bank originally hoped to enroll a total of three thousand participants in the Program. The actual number of participants enrolled was much lower.

  • 8

    They decided to do this because it was a brand-new branch with no existing customers.

  • 9

    This implies that attrition is also unrelated to unobserved characteristics.

  • 10

    In Appendix 1, we use an unpaired sample. We use all valid observations from the baseline and follow-up surveys to calculate means and differences. Appendix 1 provides a similar analysis using paired data—information from respondents who participated in both the baseline and the follow-up surveys.

  • 11

    Only two focus group participants expressed this kind of reticence. We did not ask questions about negative technological experiences on the survey.