A systematic comparison of Cu toxicity thresholds was made between freshly spiked soils and soils in which elevated Cu concentrations had been present for various periods. Three uncontaminated soils were spiked and then experimentally leached or incubated outdoors for up to 18 months. In addition, five field-contaminated soils with Cu contamination 6 to 80 years old were sampled, and corresponding uncontaminated soils were spiked to identical total Cu concentrations. All soil samples were subjected to three microbial assays (nitrification potential, glucose-induced respiration, and maize residue C mineralization). Experimental leaching or outdoor incubation after spiking reduced Cu toxicity, corresponding to 1.3- and 2.3-fold increases, respectively, in the dose inhibiting the process by 50% (ED50). No significant effects of soil type, aging time (6, 12, or 18 months), or bioassay on the factor change in ED50 were found. Significant reductions of microbial activity in the field-contaminated soils were identified in only 2 of the 15 series (three assays in five soils), whereas freshly spiking the corresponding control soils significantly affected these processes in 12 series. At corresponding total soil Cu, soil solution Cu concentrations decreased significantly on leaching, and smaller decreases were found during additional aging. Soil solution Cu concentrations largely explain the changes in Cu toxicity on leaching and aging, although additional variation may be related to changes in the sensitivity of the microbial populations. It is concluded that total Cu toxicity thresholds are lower in freshly spiked soils than in soils in which Cu salts have equilibrated and leaching has removed the excess soluble salts. The large variability of soil microbial processes, however, creates substantial uncertainty about the magnitude of the factor by which aging mitigates Cu toxicity.