Early in the first year of life, infants exhibit equivalent performance in distinguishing among people within their own race and within other races. However, with development and experience, their face recognition skills become tuned to the groups of people with whom they interact most. This developmental tuning is hypothesized to be the origin of adult face processing biases, including the other-race bias. In adults, the other-race bias has also been associated with impairments in facial emotion processing for other-race faces. The present investigation aimed to demonstrate perceptual narrowing for other-race faces during infancy and to determine whether the race of a face influences infants’ ability to match emotional sounds with emotional facial expressions. Behavioral (visual-paired comparison; VPC) and electrophysiological (event-related potential; ERP) measures were recorded in 5-month-old and 9-month-old infants. Behaviorally, 5-month-olds distinguished faces within their own race and within another race, whereas 9-month-olds distinguished faces only within their own race. ERPs were recorded while an emotional sound (laughing or crying) was presented prior to viewing an image of a static African American or Caucasian face expressing either a happy or a sad emotion. Consistent with the behavioral findings, ERPs revealed race-specific perceptual processing of faces and sound/face emotion congruency effects at 9 months but not at 5 months of age. In addition, from 5 to 9 months, the neural networks activated by sound/face congruency were found to shift from an anterior ERP component (Nc) related to attention to posterior ERP components (N290, P400) related to perception.