Measurements from a chemical process, such as flow rates, violate conservation laws and other process constraints because they are contaminated by random errors and possibly by gross errors such as process disturbances, leaks, departures from steady state, and biased instrumentation. Data reconciliation aims to estimate true values of the measured variables that are consistent with the constraints, to detect gross errors, and to solve for unmeasured variables. An approach to constructing sequential principal-component tests for detecting and identifying persistent gross errors during data reconciliation, combining principal-component analysis with sequential analysis, is presented. The tests detect gross errors as early as possible with fewer measurements. They were sharper in detecting gross errors and had substantially greater power in correctly identifying them than the statistical tests currently used in data reconciliation.
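The principal-component test the abstract builds on can be sketched as follows. This is a minimal illustration of the underlying static test, not the authors' sequential procedure: for linear balance constraints Ay = 0, the residual r = Ay has covariance V = AΣAᵀ under the no-gross-error hypothesis, and each principal component of r, scaled by the square root of its eigenvalue, is approximately standard normal and can be tested individually. The function name, the single splitter-node constraint in the usage example, and the 5% two-sided critical value are assumptions for illustration only.

```python
import numpy as np

def pca_gross_error_test(A, y, Sigma, zcrit=1.96):
    """Static principal-component test on balance residuals (illustrative sketch).

    A      : (m, n) linear constraint matrix, A @ y_true = 0
    y      : (n,) vector of measurements
    Sigma  : (n, n) covariance matrix of the measurement errors
    zcrit  : two-sided N(0,1) critical value (1.96 for a 5% level, assumed)
    Returns the standardized principal components and a boolean flag per component.
    """
    r = A @ y                      # constraint residuals; zero if y is consistent
    V = A @ Sigma @ A.T            # residual covariance under the null hypothesis
    lam, W = np.linalg.eigh(V)     # eigendecomposition V = W diag(lam) W^T
    p = (W.T @ r) / np.sqrt(lam)   # standardized principal components, ~N(0,1) under null
    return p, np.abs(p) > zcrit    # flag components exceeding the critical value

# Usage: a hypothetical splitter node where stream 1 splits into streams 2 and 3.
A = np.array([[1.0, -1.0, -1.0]])
Sigma = 0.1 * np.eye(3)
_, flags_ok = pca_gross_error_test(A, np.array([10.0, 6.0, 4.0]), Sigma)   # consistent
_, flags_bad = pca_gross_error_test(A, np.array([10.0, 6.0, 1.0]), Sigma)  # biased stream 3
```

In the consistent case the residual is zero and no component is flagged; the biased measurement inflates the residual well past the critical value. The sequential tests proposed in the article extend this idea across successive measurement epochs so that persistent gross errors are flagged with fewer observations.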