The results of international assessments such as the Trends in International Mathematics and Science Study (TIMSS) are often reported as rankings of nations. Focusing solely on national rank can result in invalid inferences about the relative quality of educational systems, which can, in turn, lead to negative consequences for teachers and students. This study seeks an alternative data analysis method that allows for improved inferences about international performance on the TIMSS. Four classroom teachers categorized a sample of TIMSS items into the cognitive domains of knowing and applying, using the definitions provided by the TIMSS 2011 Assessment Frameworks. Items from the two cognitive domains were then analyzed separately; this disaggregation allowed more valid inferences to be made about student performance. Results showed almost no significant differences between the performance of U.S. students and that of students from five other nations. Additionally, no significant differences were observed between U.S. students' performance on knowing items and on applying items, although students from some of the sampled nations performed significantly better on knowing items. These results suggest that policy makers, educators, and citizens should be cautious when interpreting the results of TIMSS rank tables.