Changing the order of items between alternate test forms to prevent copying and to enhance test security is common practice in achievement testing. However, these changes in item order may affect item and test characteristics. Several procedures have been proposed for studying such item-position effects. The present study explores the use of descriptive and explanatory models from item response theory (IRT) to detect and model these effects in a single step. The framework also accommodates individual differences in the effect of item position on item difficulty. A simulation study investigated the impact of a position effect on parameter recovery under the Rasch model. As an illustration, the framework was applied to a listening comprehension test for French as a foreign language and to data from the PISA 2006 assessment.
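
The kind of position effect described above can be illustrated with a minimal simulation sketch. The code below assumes a Rasch model in which each item's effective difficulty increases linearly with the position at which it is administered; the slope `delta`, sample sizes, and random item orders are illustrative choices, not the study's actual design. Items presented late in a form should then be answered correctly less often than the same items presented early.

```python
import numpy as np

rng = np.random.default_rng(42)

n_persons, n_items = 2000, 20
theta = rng.normal(0.0, 1.0, n_persons)   # person abilities
b = rng.normal(0.0, 1.0, n_items)         # base item difficulties
delta = 0.05                              # assumed linear position effect (logits per position)

# Each person receives a random item order, as with shuffled alternate forms.
positions = np.array([rng.permutation(n_items) for _ in range(n_persons)])

# Effective difficulty grows linearly with the position at which an item appears.
eff_b = b[None, :] + delta * positions    # shape: (n_persons, n_items)
p = 1.0 / (1.0 + np.exp(-(theta[:, None] - eff_b)))
responses = (rng.random((n_persons, n_items)) < p).astype(int)

# Compare proportion correct for early versus late administration positions.
early = responses[positions < 5].mean()
late = responses[positions >= 15].mean()
print(f"early: {early:.3f}, late: {late:.3f}")
```

Fitting an ordinary Rasch model to such data, while ignoring `positions`, is what produces the parameter-recovery distortions the simulation study examines.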