The paper gives an overview of modern likelihood asymptotics with emphasis on results and applicability. Only parametric inference in well-behaved models is considered, and the theory discussed leads to highly accurate asymptotic tests for general smooth hypotheses. The tests are refinements of the usual asymptotic likelihood ratio tests, and for one-dimensional hypotheses the test statistic is the r* statistic introduced by Barndorff-Nielsen. Examples illustrate the applicability and accuracy of the methods as well as the complexity of the required computations. Modern likelihood asymptotics has developed by merging two lines of research: asymptotic ancillarity is the basis of the statistical development, while saddlepoint and Laplace-type approximations have developed in parallel as the technical foundation. The main results and techniques of these two lines are reviewed, and a generalization to multi-dimensional tests is developed. The final part of the paper presents further problems and ideas. Among these are linear models with non-normal errors, non-parametric linear models obtained by combining estimation of the residual density with the present results, and the generalization of the results to restricted maximum likelihood and similar structured models.
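
To make the one-dimensional case concrete, the following is a minimal sketch (not taken from the paper) of the r* statistic for a simple illustrative model: testing the mean of an exponential distribution. Here r is the signed root of the likelihood ratio statistic and u is a Wald-type adjustment in the canonical parametrization, so that r* = r + log(u/r)/r is approximately standard normal to high accuracy. The model choice, function names, and numbers are assumptions made for illustration only.

```python
import math

def rstar_exponential_mean(n, xbar, mu0):
    """Illustrative r* for H: mu = mu0 in an exponential model with
    mean mu (a toy model chosen for this sketch, not the paper's example).

    r  : signed root of the likelihood ratio statistic
    u  : Wald-type statistic in the canonical parametrization,
         which for this model reduces to sqrt(n) * (muhat/mu0 - 1)
    r* : r + log(u/r) / r  (assumes r != 0, i.e. muhat != mu0)
    """
    muhat = xbar  # MLE of the mean
    # log-likelihood difference l(muhat) - l(mu0)
    ldiff = ((-n * math.log(muhat) - n * xbar / muhat)
             - (-n * math.log(mu0) - n * xbar / mu0))
    r = math.copysign(math.sqrt(2.0 * ldiff), muhat - mu0)
    u = math.sqrt(n) * (muhat / mu0 - 1.0)
    return r, r + math.log(u / r) / r

def upper_tail_normal(z):
    # P(Z >= z) for a standard normal Z
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Toy data: n = 10 observations with sum 15, testing mu0 = 1
n, total, mu0 = 10, 15.0, 1.0
r, rs = rstar_exponential_mean(n, total / n, mu0)
p_r, p_rstar = upper_tail_normal(r), upper_tail_normal(rs)

# Exact upper tail for comparison: sum(x) ~ Gamma(n, scale=mu0),
# so P(S >= s) = P(Poisson(s/mu0) <= n-1)
t = total / mu0
p_exact = sum(math.exp(-t) * t**k / math.factorial(k) for k in range(n))
```

In this toy example the r*-based p-value agrees with the exact gamma tail probability to several decimal places, while the first-order p-value based on r alone already differs in the second decimal, illustrating the kind of refinement the paper's tests provide.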