Asymptotic Theory of Outlier Detection Algorithms for Linear Time Series Regression Models

Publication: Contribution to journal › Journal article › Research › Peer-reviewed

  • Søren Johansen
  • Bent Nielsen, University of Oxford - Nuffield College, United Kingdom

Outlier detection algorithms are intimately connected with robust statistics that down-weight some observations to zero. We define a number of outlier detection algorithms related to the Huber-skip and least trimmed squares estimators, including the one-step Huber-skip estimator and the forward search. We then review a recently developed asymptotic theory for these estimators. Finally, we analyse the gauge, the fraction of wrongly detected outliers, for a number of outlier detection algorithms, and establish an asymptotic normal and a Poisson theory for the gauge.
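To make the gauge concrete, here is a minimal Python sketch of a one-step Huber-skip style detector applied to a clean sample, where every flagged observation is a false detection and the empirical gauge should be close to 2(1 - Φ(c)) for cutoff c. This is an illustration only, not the authors' algorithm: the function name `huber_skip_gauge`, the cutoff value, and the variance estimator are our assumptions.

```python
import numpy as np

def huber_skip_gauge(y, X, c=2.576, n_steps=1):
    """Illustrative one-step Huber-skip style outlier detection (sketch).

    Fit OLS on the retained set, flag observations whose absolute
    standardized residual exceeds the cutoff c, and optionally refit.
    Returns the boolean outlier mask and the gauge (fraction flagged).
    """
    keep = np.ones(len(y), dtype=bool)
    for _ in range(n_steps + 1):
        # Least squares fit on currently retained observations.
        beta, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
        resid = y - X @ beta
        # Naive scale estimate from retained residuals (an assumption;
        # consistency corrections are discussed in the literature).
        sigma = np.sqrt(np.mean(resid[keep] ** 2))
        keep = np.abs(resid) <= c * sigma
    outliers = ~keep
    return outliers, outliers.mean()

# Clean data: no true outliers, so the gauge measures false detections.
rng = np.random.default_rng(0)
n = 10_000
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)
outliers, gauge = huber_skip_gauge(y, X, c=2.576)
```

With c = 2.576 the nominal false-detection rate per observation is about 1%, so the empirical gauge on a large clean sample should land near 0.01 (slightly above, since the trimmed scale estimate is biased downward).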

Original language: English
Journal: Scandinavian Journal of Statistics
Volume: 43
Issue: 2
Pages (from-to): 321-348
Number of pages: 28
ISSN: 0303-6898
DOI:
Status: Published - 1 Jun 2016

