
ORIGINAL ARTICLE

Conceptualizing cultural literacy through student learning outcomes assessment

Catherine M. Barrette

Wayne State University

Kate Paesani

Center for Advanced Research on Language Acquisition, University of Minnesota
First published: 19 May 2018

Catherine M. Barrette (PhD, University of Arizona) is Director of Assessment and Associate Professor of Spanish, Wayne State University, Detroit, MI.

Kate Paesani (PhD, Indiana University) is Director of the Center for Advanced Research on Language Acquisition and Affiliate Associate Professor, University of Minnesota‐Twin Cities, Minneapolis.

Abstract

Student learning outcomes (SLO) assessment is often perceived as a burdensome, top‐down process driven by institutional requirements. Recent research has thus called for discipline‐specific models of SLO assessment to more deeply engage postsecondary faculty in determining what constitutes student learning within their programs. In response, this article presents a recursive process of developing and applying a model of cultural literacy to SLO assessment documents from one foreign language (FL) department comprising multiple BA programs. The study has two interrelated parts: (1) operationalizing the concept of cultural literacy, and (2) analyzing assessment documents using qualitative content analysis. Findings show that cultural literacy is characterized by eight unique factors evident across FL program assessment documents to varying degrees. This measurable definition of cultural literacy can facilitate the alignment of goals and outcomes with disciplinary principles and has implications for curriculum development and instructional practices in collegiate FL programs.

1 INTRODUCTION

Given the current emphasis on accountability in higher education, program evaluation can be perceived as a top‐down process driven by institutional requirements.1 Indeed, assessment of student learning outcomes (SLOs), a common mechanism for postsecondary program evaluation in the United States, is often mandated by accrediting agencies and focused on general education goals such as critical thinking, problem solving, or oral communication (e.g., Davis, 2015; Hutchings, 2011; Swarat et al., 2017). Foreign language (FL) faculty may thus consider SLO assessment to be opaque and burdensome, and there are few models for adapting this top‐down approach to reflect the disciplinary principles that shape academic programs (e.g., Banta, 2007; Heiland & Rosenthal, 2011; Kuh, Jankowski, Ikenberry, & Kinzie, 2014). Yet when assessment is firmly anchored within a particular discipline, it can lead to faculty engagement, deeper thinking about what constitutes student learning, and principled program improvement. In fact, “without such considerations, one might say that assessment is ‘departmental’ but not necessarily ‘disciplinary’” because it lacks “significant deliberation about what it means to know the field deeply, why that matters, and how to ensure that all students in the program achieve its signature outcomes at high levels” (Hutchings, 2011, p. 37). Thus, a focus on SLO assessment in postsecondary FL programs in an age of accountability provides

a decisive opportunity to state who we are, why we exist, what our value is to learners, institutions, and society, and, quite frankly, why we should not be shut down and why we should play a serious role in any comprehensive approach to a truly liberal education. (Norris, 2006, p. 577)

To gain insight into the interaction between disciplinary principles and SLO assessment, this article describes the recursive process of developing and applying a model of cultural literacy to program assessment documents (mission statements and learning outcomes) from one FL department comprising multiple BA programs. The study had two interrelated parts: (1) operationalizing the concept of cultural literacy, and (2) analyzing mission statements and SLOs using qualitative content analysis. The purpose was to build a measurable definition of cultural literacy that could be applied to multiple collegiate FL contexts to inform SLO assessment, curriculum development, and instructional practices. Moreover, the study aimed to provide a model for using discipline‐specific program assessment to facilitate the alignment of goals and outcomes with disciplinary principles.

2 BACKGROUND

2.1 SLO assessment in FL contexts

Two components of SLO assessment that are relevant for the present study are the mission statement and learning outcomes.2 The mission statement “communicates a broad vision of the fundamental purposes and values of a program, providing an important view of what matters most to faculty” (Allen, 2004, p. 29) and includes goals, stakeholders, and offerings. Learning outcomes operationalize the mission statement and articulate expectations for student learning within a program; they “identify what students should be able to demonstrate, represent, or produce because of what and how they have learned at the institution or in a program” (Maki, 2010, p. 89). In cohesive curriculum design, these elements of SLO assessment inform one another and are informed by the larger mission and objectives of an institution. Yet often programs and departments “are largely left to their own devices to determine what outcomes need to be assessed” (Norris, 2016, p. 174), making coherent SLO assessment challenging. Given the growing need for discipline‐specific models of and inquiry into SLO assessment, making disciplinarity a central part of assessment allows “faculty to operate where they are the most comfortable and to bring the field's distinctive questions, methods, and ways of thinking to the task of improving their students’ learning” (Kuh et al., 2015, p. 103).

One of the most important disciplinary principles shaping postsecondary FL programs is cultural competence, or cultural literacy, as it provides a means of uniting diverse departmental subdisciplines such as literary and cultural studies, film, linguistics, or history (e.g., Byrnes, 2012; Kern, 2000; Paesani, Allen, & Dupuy, 2016). However, empirical research on SLO assessment of cultural literacy has been limited (e.g., Norris & Davis, 2015; Norris & Mills, 2016); this may be due in part to the difficulty of defining and operationalizing this concept. For instance, Warford (2006) claimed that as a profession we have made great progress in determining measurable learning outcomes for language proficiency but significantly less for cultural competence. He proposed a definition of cultural literacy comprising cognitive processes, sociolinguistic skills, and attitudes, and he described his department's efforts to measure this definition through SLO assessment. Likewise, Schulz (2007) argued that there is no agreed‐upon definition of culture in FL contexts. In response, she proposed five culture learning outcomes that connect knowledge and understanding to cultural products, practices, and perspectives; relate culture and language; and prioritize cultural comparisons. In a third study, Ryshina‐Pankova (2015) targeted another shortcoming in relation to cultural literacy and learning outcomes: Cultural knowledge is often separated from linguistic knowledge, with the latter usually playing a more important role in SLO assessment. In response, her program's statement of educational goals for FL literacy served as a starting point for determining SLOs that integrated cultural knowledge, language, and humanistic learning. She and her colleagues thus drew on discipline‐specific principles (e.g., literacy, genre) and context‐specific needs (e.g., programmatic benchmarks, writing assessment) to craft outcomes related to cultural content, perspectives, and global citizenship. 
Finally, Michelson (2018) investigated students’ understandings and representations of culture and culture learning in a French global simulation course with a literacies orientation. Although it did not specifically target SLOs, Michelson's analysis of final course portfolio reflections revealed that students regarded culture as dynamic, variable, and relational but that they also prioritized language over culture learning and represented culture in stereotypical ways. These results suggest that FL departments should craft SLOs that encourage meaningful analysis and comparison of cultural products, practices, and perspectives, as well as deeper connections between linguistic and cultural learning.

Taken together, this small body of research on SLO assessment of culture in collegiate FL programs underscores the need to establish discipline‐specific, operational definitions of culture and cultural knowledge that go beyond mere language acquisition and to consistently and effectively apply these definitions to assessment practices. Although the new NCSSFL‐ACTFL Intercultural Communication Proficiency Benchmarks and Performance Indicators (NCSSFL‐ACTFL, 2017) address aspects of this research gap, empirical research investigating the operationalization and application of cultural literacy objectives to undergraduate FL programs does not exist.

2.2 Theoretical orientation: Cultural literacy

Over the past two decades, the concept of cultural literacy has gained prominence as a viable goal for organizing collegiate FL curricula (e.g., Byrnes, Maxim, & Norris, 2010; Kern, 2000; Paesani et al., 2016; Swaffar & Arens, 2005). The literacy scholarship that informs this work and provides the theoretical underpinnings for the current study reflects ideological, socially oriented models positing literacy as a contextually variable, culturally embedded practice of meaning‐making (e.g., Cope & Kalantzis, 2009; Kalantzis, Cope, Chan, & Dalley‐Trim, 2016; New London Group, 1996). As such, literacy is multifaceted and “is not simply a matter of correct usage. It is also a means of communication and representation of meanings in a broader, richer, and all‐encompassing sense” (Kalantzis et al., 2016, p. 4); it comprises knowledge of language and formal conventions as well as the ability to communicate in diverse settings and through varied and overlapping modalities (written, oral, visual, gestural, etc.). Moreover, using language to communicate is a dynamic process of transformation and creation rather than a process of replication or repetition (Cope & Kalantzis, 2009).

This broad definition of literacy is reflected in Kern's (2000) application of the concept to postsecondary FL study. He posited that

literacy is the use of socially‐, historically‐, and culturally‐situated practices of creating and interpreting meaning through texts. It entails at least tacit awareness of the relationships between textual conventions and their contexts of use and, ideally, the ability to reflect critically on those relationships. Because it is purpose‐sensitive, literacy is dynamic—not static—and variable across and within discourse communities and cultures. It draws on a wide range of cognitive abilities, on knowledge of written and spoken languages, on knowledge of genres, and on cultural knowledge. (p. 16)

In addition to referring to the dynamic, transformative nature of literacy, this definition also highlights its linguistic, cognitive, and sociocultural dimensions. Indeed, literacy involves understanding language forms and conventions and how they are used to convey meaning; the ability to make inferences, think critically, and reflect on one's learning; and awareness of the socially and culturally situated nature of language and communication (Kucer, 2009). Seven principles of literacy emerge from this definition: language use, cultural knowledge, conventions, interpretation, collaboration, problem‐solving, and reflection (Kern, 2000). In addition, Kern (2015) emphasized that FL literacy entails “sensitivity to the special ways that language is used differently in writing and in speech” (p. 11), which is particularly important given the multimodal aspects of communication. Finally, texts, defined broadly as any written, oral, visual, or multimodal materials that form a unified whole and possess organizational and formal features (Halliday & Hasan, 1976; Kalantzis et al., 2016; Kern, 2000), are central to conceptualizing cultural literacy because they represent the signifying practices of a society and thus have a social purpose.

In an attempt to concretize the highly theorized concept of cultural literacy and operationalize a discipline‐specific definition that informs and is informed by SLO assessment, this study addressed the following questions:

  1. To what degree is cultural literacy evident in FL program mission statements and learning outcomes?
  2. To what extent are specific factors contributing to cultural literacy emphasized within and across FL programs?

3 PROCEDURES

3.1 Context and data sources

This study was conducted at a large, urban research university in the midwestern United States. The university introduced campus‐wide program assessment in 2013 and in 2014 hired a director of assessment3 to support the development of an institutional framework for assessment processes. That framework included, among other elements, a standardized assessment plan consisting of (1) a mission statement, (2) learning outcomes, (3) a curriculum map, (4) assessment methods, (5) assessment results, (6) an action plan, (7) a timeline for implementation, and (8) a plan for reporting to stakeholders. Since 2014, programs across the university have submitted their assessment plan on an annual cycle, making revisions to mission statements, outcomes, curriculum maps, and assessment methods in the fall, then collecting, analyzing, and reporting data in the spring. In spring 2015, an assessment plan feedback rubric was introduced to help faculty evaluate and improve their programs’ assessment plans.

On a parallel track, the FL department's faculty were engaged in ongoing discussions about curricular revision that emphasized the growing importance of cultural literacy in the discipline as well as the need to provide stronger alignment to the university's mission, a clearer sense of direction, and greater integration across language areas following the 2007 merger of four language departments into one. Following a departmental review in 2012, and to comply with university‐mandated assessment requirements, the department and individual language programs began to develop their assessment plans (see Figure 1).

FIGURE 1. Initial Assessment Plan Development Timeline

In fall 2013, each FL program drafted a mission statement and two learning outcomes. In spring 2014, an ad hoc undergraduate curriculum committee composed of faculty from multiple language programs collaboratively drafted a department mission statement (Department of Classical and Modern Languages, Literatures, and Cultures, 2017) informed by each language program's mission statement and by the scholarly research on cultural literacy. FL programs also collected assessment data for the first time that semester. The department mission statement was shared with all faculty in fall 2014, and their feedback informed a final version. In the same semester, programs developed two additional learning outcomes, revised their existing outcomes, and refined their individual mission statement. In fall 2015, faculty used the assessment plan feedback rubric to revise their mission statement, learning outcomes, and other items in their assessment plans. Programs had an additional opportunity in fall 2016 to independently revise their mission statement and learning outcomes. By this time, programs were required to have one mission statement and a minimum of four learning outcomes. The five language programs each provided a mission statement and developed a total of 23 SLOs, ranging from three (program 5) to six (programs 3 and 4); the department prepared a mission statement but no department‐wide SLOs.

3.2 Operationalization of cultural literacy and data coding

In preparation for data coding, the concept of cultural literacy was operationalized using theory‐driven content analysis (Krippendorff, 2004). Key terms were first extracted from definitions of cultural literacy in existing scholarship, and then 10 themes or factors of cultural literacy emerging from these key terms were identified. To represent shared foci, the proposed factors were grouped into three categories: knowledge, context of use, and process. Knowledge factors included language forms and their meanings, culture, and genre, and were derived from key terms such as conventions, knowledge, representations, texts, and tools. These factors represented the types of content that students learn through interaction with FL texts (e.g., Halliday & Hasan, 1976; Kern, 2000; Paesani et al., 2016). Context of use factors encompassed key terms such as discourse communities, individual, multimodal, relationships, social, spoken, variable, and written; these terms reflected the multimodal nature of communication and the diverse settings in which communication takes place (e.g., Cope & Kalantzis, 2009; Kalantzis et al., 2016). Finally, the factors of interpretation, collaboration, problem‐solving, reflection, and transformation constituted the process category of cultural literacy. These factors related to key terms such as affective, awareness, cognitive abilities, creation, and imagination and represented the ways in which students engage with textual meaning (e.g., Kern, 2000; Paesani et al., 2016).

To evaluate the adequacy of the factor definitions, the 27 items in the data set (six mission statements, 21 learning outcomes) were initially coded for the presence (1) or absence (0) of each factor. A factor was coded as “1” if a data source included one or more mentions of that factor. Each data source could provide evidence of multiple factors simultaneously. However, several ambiguities became apparent through the initial coding process, resulting in a return to published scholarship for clarification of the key terms and their interrelationships. As a result, problem‐solving was eliminated from the list of factors because of its overlap with the three knowledge factors as well as with the factors of transformation and reflection. Following this elimination, the 27 items were analyzed again to determine the presence of each of the nine remaining factors, resulting in 243 ratings (27 items by nine factors each).

Each researcher then independently coded the data again using the revised factor list and definitions, reaching full agreement on 189 of the 243 ratings, i.e., a 78% interrater agreement. Because it was possible to argue that all items included some degree of collaboration, this factor was excluded from all analyses. The final list of eight factors and their definitions is presented in Table 1.
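The rating and agreement arithmetic described above can be sketched in a few lines of Python (a minimal, hypothetical illustration; the function name and sample data are ours, not the study's):

```python
# Each of the 27 items (6 mission statements + 21 learning outcomes) is
# coded 1 (present) or 0 (absent) for each of 9 candidate factors,
# yielding 27 x 9 = 243 ratings per rater.
n_items, n_factors = 27, 9
total_ratings = n_items * n_factors  # 243


def percent_agreement(rater_a, rater_b):
    """Share of ratings on which two raters assigned the same code."""
    assert len(rater_a) == len(rater_b)
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)


# With full agreement on 189 of the 243 ratings, simple interrater
# agreement is 189/243, or about 78%.
agreement = 189 / total_ratings
print(f"{agreement:.0%}")  # 78%
```

Note that this is simple percent agreement, not a chance-corrected statistic such as Cohen's kappa; the study reports only the former.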

Table 1. Factors in cultural literacy (each factor is defined as what the program intentionally develops)

Knowledge
  Forms/meaning: knowledge of target language forms and their meanings
  Culture: knowledge of culture (beliefs, practices, perspectives, values, and products that characterize a cultural group or system)
  Genre: knowledge of genres (a kind of text or practice that structures things in particular patterns; characterized by conventions and organizational features)
Context of use
  Multimodal communication: use of overlapping, complementary modes of communication (written, oral, visual, audio, gestural, tactile, spatial)
  Diverse settings: use of language in diverse settings (geographic, relationships, identity, interlocutors)
Process
  Interpretation: interpretation (ability to go beyond surface‐level understanding of target language texts)
  Reflection: metacognition about the relationship between language, culture, and an individual's role or participation in the world
  Transformation: transformation or reformulation of knowledge (forms/meaning, culture, genre) by an individual to use that knowledge in his/her own way, inclusive of basic creative language use through highly advanced, novel production

To resolve disagreements from the second pass of independent coding, the authors discussed all discrepancies in interpretation, in particular regarding what constituted evidence of a deliberate intent to develop students’ understanding of form/meaning connections, genre, and transformation. Underlying many of the disagreements was a methodological question: To what extent should the researchers’ inferences incorporate deep knowledge of the discipline, the faculty, and the programs themselves in addition to the formal mission statements and SLOs? After extensive discussion, and in keeping with Krippendorff's (2004) argument for attending to the communicators in analyzing content, information from sources in addition to written documents was included. For instance, some programs referred to particular levels of the ACTFL Proficiency Guidelines (ACTFL, 2012) without quoting the proficiency level descriptors, necessitating consultation of the published guidelines to determine which codes to apply. Similarly, phrases in the data such as “study of literature” did not explicitly state that the program intentionally sought to develop knowledge of genres; the researchers thus drew on their knowledge of the faculty and programs to determine which codes to apply. Finally, the transformation factor was a source of several discrepancies because the literacy scholarship focuses on high‐level knowledge reformulation and textual creation but does not effectively delimit a minimal level of these skills that would facilitate restrictive coding. For instance, the case of beginning‐level students reformulating their knowledge to create meaning suggests more of a continuum of processing than is typically presented in the research. As such, any item that indicated the creative reformulation of knowledge at the learner's current proficiency level was accepted as evidence for transformation. 
Based on detailed discussions of each item, all discrepancies were resolved, resulting in the data set used for analysis.

4 RESULTS

4.1 Analysis by factor and category

Each mission statement and learning outcome was coded for the presence of the eight factors of cultural literacy. Frequencies ranged from 8 to 15 mentions for a single factor, with the knowledge category and its associated factors mentioned most frequently (see Table 2). To normalize values across the differing numbers of factors in each category, an average number of mentions per factor was calculated; this normalization indicated greater parity between the process and context of use categories than a simple total would suggest.

Table 2. Frequency of factor mentions

Knowledge factors                 Frequency
  Forms/meaning                      13
  Culture                            15
  Genre                              11
  Total                              39
  Average (total/3 factors)          13

Context of use factors            Frequency
  Multimodal communication           10
  Diverse settings                   11
  Total                              21
  Average (total/2 factors)          10.5

Process factors                   Frequency
  Interpretation                      9
  Reflection                          8
  Transformation                     13
  Total                              30
  Average (total/3 factors)          10

4.2 Analysis by program

An analysis across programs revealed differences in factor distributions across and within categories. As the subtotals in Table 3 show, the department as a whole and programs 2 and 5 in particular demonstrated a balance of factors across the three categories while programs 1, 3, and 4 emphasized some categories over others. However, even in programs that similarly emphasized a category, there was variation in distribution across some factors within that category. For example, programs 1 and 2 each had nine mentions of knowledge factors, but program 1 emphasized culture whereas program 2 emphasized form/meaning and genre. Furthermore, while the department and programs 2, 4, and 5 provided evidence of all factors, program 1 did not provide evidence of interpretation and program 3 did not provide evidence of multimodal communication or transformation.

Table 3. Factor and category representation by department and program

                             Department  Program 1  Program 2  Program 3  Program 4  Program 5
Knowledge factors
  Forms/meaning                   1          2          4          1          2          3
  Culture                         1          4          1          4          3          2
  Genre                           1          3          4          1          1          1
  Knowledge subtotal              3          9          9          6          6          6
Context of use factors
  Multimodal communication        1          1          4          0          1          3
  Diverse settings                1          1          5          1          1          2
  Context of use subtotal         2          2          9          1          2          5
Process factors
  Interpretation                  1          0          4          2          1          1
  Reflection                      1          3          1          1          1          1
  Transformation                  1          1          5          0          4          2
  Process subtotal                3          4         10          3          6          4
All categories total              8         15         28         10         14         15

4.3 Analysis by assessment document

The final analysis focused on whether the eight cultural literacy factors were present in the mission statements, in the learning outcomes, or in both data sources in the five programs. Out of the 40 possible cases (eight factors by the five programs that submitted both a mission statement and learning outcomes), there were three cases in which a factor appeared in neither data source. Of the remaining 37 cases, factors were present in both sources in 18 and were present in only the mission statement or only the SLOs in the other 19. Knowledge factors were most frequently present in both data sources, whereas context of use and process factors were more often present in SLOs than in mission statements.

5 DISCUSSION

The review of existing scholarship suggests that cultural literacy can be operationalized in terms of eight factors organized into three categories: knowledge, context of use, and process. These categories reflect the dimensions and principles in the definitions of cultural literacy presented earlier (Kern, 2000; Kucer, 2009) and are consistent with the diverse kinds of learning that result from FL study; that is, “the knowledge that will be gained, the skills or abilities acquired, and the dispositions, attitudes, or awareness developed” (Norris, 2006, p. 577). Moreover, these cultural literacy factors and categories capture the interconnectedness of language and culture learning, thus moving the profession closer to a definition that can be adopted across postsecondary FL programs. Nonetheless, as the operationalization process revealed, clearly identifying, delineating, and applying the relevant factors was challenging. For instance, problem‐solving, which was eliminated from the final list of factors, was ambiguous because it overlapped significantly with language use; both entail form‐meaning connections. A second challenge was how factors were presented in the theoretical research: Many had not been concretized in a way that facilitated operationalization, complicating the delineations among factors. This abstractness of factors furthermore suggests that implementing cultural literacy in programmatic mission statements and learning outcomes may also be challenging for FL faculty.

Concerning the degree to which cultural literacy was evidenced in the assessment documents under consideration, the results showed that program mission statements and learning outcomes provided substantial evidence that cultural literacy can serve as an overarching departmental goal. Moreover, the use of both document types in the analysis provided a more robust picture of SLO assessment, revealing that in some cases factors were included in one document type but not in the other within a single program. In addition to providing a model of discipline‐specific SLO assessment, the study's findings indicate that cultural literacy factors were evident across language program mission and outcomes statements to varying degrees. Indeed, the often overlapping and complementary ways in which these factors appeared in mission statements and learning outcomes show that assessment documentation can serve as evidence of faculty conceptualizations of cultural literacy and of the degree to which those conceptualizations align with theoretical, research‐based definitions.

Regarding the extent to which specific factors contributing to cultural literacy are emphasized within and across FL programs, both consistencies and variations were observed. Analyses revealed substantial similarities between the departmental conceptions of cultural literacy and those of several individual language programs. However, the balance across factors and categories varied, indicating a possible misalignment between the departmental understanding of cultural literacy and individual programs’ conceptualizations. The multilevel analysis exemplified in this study thus offers an approach to analyzing and facilitating the consistent implementation of a framework for cultural literacy as an overarching departmental and programmatic goal.

With respect to specific emphases among the factors, the highest‐frequency factors in this study (culture, forms/meaning, transformation) corresponded to what are often considered the more traditional content‐ and language‐learning goals of FL study. The midrange‐frequency factors (multimodal communication and diverse settings) may also indicate a traditional conceptualization of language study as four independent skills (reading, writing, speaking, listening) without conscious emphasis on their interdependent and overlapping nature, as reflected in the communicative modes (interpersonal, interpretive, presentational), or on the impact of context on language use and communication. An example from the mission statement of one of the programs analyzed for this study demonstrates the rationale for this interpretation: “[o]ur majors should reach an intermediate high level and minors an intermediate low level in all the four language skills.” The department mission statement, in contrast, specifies that “abilities, skills, and knowledge are embedded with broad contexts and exchanges that support cross‐cultural awareness” (Department of Classical and Modern Languages, Literatures, and Cultures, 2017, n.p.). Genre had a similar midrange frequency despite the generous interpretation of evidence based on the researchers’ knowledge of the FL discipline and the department faculty. This result may have been due to a mismatch between faculty members’ definitions of genre and the definition used in this study, or possibly an implicit understanding that genre is a core goal of literary study in FL programs that does not require explicit mention in learning outcomes. The two factors that had the lowest frequencies (reflection, interpretation) aligned with the department's more recently adopted cultural literacy framework.

6 IMPLICATIONS AND LIMITATIONS

This study demonstrated that a factor‐based operationalization of cultural literacy that grows out of a deep analysis of existing scholarship can in fact provide a practical tool for analyzing and comparing FL programs or departments. The study thus provides a starting point for future investigations into the concept of cultural literacy, how cultural literacy is defined and develops, and the relationship between cultural literacy and SLO assessment. Because some factors may only manifest themselves through instructional activities, observational data may provide insights into the ways in which faculty operationalize these concepts in course design, instructional approach, and class activities and assessments. For instance, Kern (2000) identified several learning processes—including problem‐solving and collaboration, two factors that were eliminated from the original list of 10 for this study—as being essential to literacy development, yet since these processes are perhaps best evidenced through instruction, the extent to which faculty address them may not be evident in learning outcomes or mission statements. Future investigations may also seek to confirm FL faculty members’ understandings of cultural literacy and its associated factors and help them adopt, articulate, and implement this overarching goal in their programs.

Establishing systematic definitions of cultural literacy also has important curricular implications: Such definitions clarify how cultural literacy develops over time and which factors should receive more or less emphasis at different curricular stages. This developmental dimension of cultural literacy (Kucer, 2009) focuses on how a learner builds understandings of the linguistic, cognitive, and sociocultural dimensions of literacy that were mentioned earlier and “underscores the idea that one becomes literate; literacy is seen as a process rather than a product” (Paesani et al., 2016, p. 12). In further refining the definition of cultural literacy and its related factors, it will also be important to investigate this developmental dimension through an examination of student data collected during the SLO assessment process over time, either for a cohort of students or cross‐sectionally throughout a program of study. Evidence of student learning in relation to cultural literacy can then inform curriculum development, assessment practices, and refinement of student learning outcomes.

Systematically defining cultural literacy can also help FL programs more effectively determine instructional approaches that best meet specific learning objectives and mission statements. With a better understanding of what students are able to do in relation to the various factors of cultural literacy at different points in an FL program, faculty can pinpoint class activities and course assessments that develop the knowledge, contexts of use, and process categories that were defined and operationalized in this study. Indeed, an important implication of this study's findings is the ability to use SLO assessment practices to align program goals with their implementation through classroom practices. FL faculty must therefore regularly revisit the alignment between mission statements and learning outcomes and consider how those outcomes are implemented and assessed in the curriculum. It is through this assessment process that faculty's stated goals are transformed into evidence‐based learning experiences for students.

Finally, the findings provide evidence of how specialized disciplinary knowledge can guide the creation and evaluation of assessment documents in an FL department encompassing a range of subdisciplines (e.g., literature, cultural studies, linguistics, history). As such, this study responds to the need for discipline‐specific models of SLO assessment that draw on established disciplinary frameworks and research‐based understandings—in this case, constructs such as cultural literacy, proficiency, and second language acquisition.

Two limitations of this study should be considered when designing future studies of discipline‐specific SLO assessment. First, it is not known to what extent the specific wording used by various programs, and the potentially differing understandings of those words among faculty and the researchers, impacted the ability to objectively identify and code each program's assessment practices. Thus, while the study describes what appears to be a fruitful approach to SLO development and analysis, the findings themselves are not generalizable to other instructional contexts. In addition, although a collaborative process was used to create the departmental mission statement that was examined in this study, it is not clear whether this document informed the drafting and revision of program‐specific mission statements or learning outcomes. In the future, it will be important to investigate how individual programs create and refine SLO assessment documents and relate these to departmental documents, including an overall mission statement. Collecting data from sources outside of assessment documents (e.g., interviews with faculty, classroom observations) would also provide a more fine‐grained understanding of this process and how cultural literacy is defined, understood, and implemented in collegiate FL programs.

7 CONCLUSION

Continued research into discipline‐specific SLO assessment in FLs and other disciplines will help faculty think deeply about what constitutes student learning in their programs, collaborate with colleagues to create and revise assessment documents, and promote engagement in the assessment process. Indeed, an essential component of increasing “faculty ownership and involvement [in assessment] is to respect the disciplinary differences and perspectives of faculty that frame how they perceive and approach assessment” (Swarat et al., 2017, p. 2). This case study analysis of discipline‐specific SLO assessment in FLs proposes one approach to investigating and meeting that goal and, by extension, to better understanding student learning in postsecondary contexts.

ACKNOWLEDGMENTS

We are grateful to Dan Soneson, Catherine Wehlburg, and two anonymous reviewers for their helpful comments on previous versions of this article and to Russell Simonsen for his careful bibliographic work.

ENDNOTES

1 Norris (2016) defined program evaluation as the process of gathering information about the features of academic programs to understand, analyze, and improve them.

2 Within SLO assessment scholarship, definitions of key terms such as mission statement, goals, objectives, and outcomes vary widely. The focus here is on the mission statement and learning outcomes as they reflect the assessment structure of the institution investigated in this study.

3 Both authors served as participant‐researchers in this study. Barrette was the institutional director of assessment, and Paesani was the chair of the departmental ad hoc undergraduate curriculum committee.