Iterative isotonic regression
ESAIM: Probability and Statistics, Tome 19 (2015), pp. 1-23.

This article explores some theoretical aspects of a recent nonparametric method for estimating a univariate regression function of bounded variation. The method exploits the Jordan decomposition, which states that a function of bounded variation can be written as the sum of a non-decreasing function and a non-increasing function. This suggests combining the backfitting algorithm for estimating additive functions with isotonic regression for estimating monotone functions. The resulting iterative algorithm is called Iterative Isotonic Regression (I.I.R.). The main result of the paper states that the estimator is consistent provided the number of iterations k_n grows appropriately with the sample size n. The proof requires two auxiliary results that are of interest in their own right: first, we generalize the well-known consistency property of isotonic regression to the framework of a non-monotone regression function; second, we relate the backfitting algorithm to von Neumann's algorithm in convex analysis. We also analyse how the algorithm can be stopped in practice using a data-splitting procedure.
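To fix ideas, here is a minimal numerical sketch (Python/NumPy, not the authors' code) of the scheme described in the abstract: backfitting between a non-decreasing and a non-increasing component, each fitted to the partial residuals by the pool-adjacent-violators algorithm. The function names, the fixed number of sweeps k, and the toy data are illustrative assumptions; the paper's data-splitting stopping rule is not implemented here.

import numpy as np

def pava(y):
    # Pool-adjacent-violators algorithm: least-squares non-decreasing fit
    # to the sequence y (equal weights), returned in the original order.
    values, weights, sizes = [], [], []
    for v in np.asarray(y, dtype=float):
        values.append(v); weights.append(1.0); sizes.append(1)
        # pool adjacent blocks while the non-decreasing constraint is violated
        while len(values) > 1 and values[-2] > values[-1]:
            w = weights[-2] + weights[-1]
            pooled = (weights[-2] * values[-2] + weights[-1] * values[-1]) / w
            values[-2:] = [pooled]
            weights[-2:] = [w]
            sizes[-2:] = [sizes[-2] + sizes[-1]]
    return np.repeat(values, sizes)

def iterative_isotonic_regression(y, k):
    # k backfitting sweeps: fit the non-decreasing part to the residuals of the
    # non-increasing part and vice versa; their sum estimates the regression
    # function at the (sorted) design points.
    up = np.zeros_like(y, dtype=float)
    down = np.zeros_like(y, dtype=float)
    for _ in range(k):
        up = pava(y - down)        # isotonic step
        down = -pava(-(y - up))    # antitonic step: minus the isotonic fit of the negated residuals
    return up + down

# Toy usage on a noisy, non-monotone signal observed at sorted design points.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(x.size)
fhat = iterative_isotonic_regression(y, k=10)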

DOI : 10.1051/ps/2014012
Classification : 52A05, 62G08, 62G20
Keywords: Nonparametric statistics, isotonic regression, additive models, metric projection onto convex cones
Guyader, Arnaud 1 ; Hengartner, Nick 2 ; Jégou, Nicolas 3 ; Matzner-Løber, Eric 3

1 Université Rennes 2, INRIA and IRMAR, Campus de Villejean, 35043 Rennes, France
2 Los Alamos National Laboratory, Los Alamos, NM 87545, USA
3 Université Rennes 2, Campus de Villejean, 35043 Rennes, France
@article{PS_2015__19__1_0,
     author = {Guyader, Arnaud and Hengartner, Nick and J\'egou, Nicolas and Matzner-L{\o}ber, Eric},
     title = {Iterative isotonic regression},
     journal = {ESAIM: Probability and Statistics},
     pages = {1--23},
     publisher = {EDP-Sciences},
     volume = {19},
     year = {2015},
     doi = {10.1051/ps/2014012},
     mrnumber = {3374866},
     zbl = {1392.62117},
     language = {en},
     url = {http://www.numdam.org/articles/10.1051/ps/2014012/}
}
Guyader, Arnaud; Hengartner, Nick; Jégou, Nicolas; Matzner-Løber, Eric. Iterative isotonic regression. ESAIM: Probability and Statistics, Tome 19 (2015), pp. 1-23. doi : 10.1051/ps/2014012. http://www.numdam.org/articles/10.1051/ps/2014012/

D. Anevski and P. Soulier, Monotone spectral density estimation. Ann. Stat. 39 (2011) 418–438. | MR | Zbl

M. Ayer, H.D. Brunk, G.M. Ewing, W.T. Reid and E. Silverman, An empirical distribution function for sampling with incomplete information. Ann. Math. Stat. (1955) 641–647. | MR | Zbl

R.E. Barlow, D.J. Bartholomew, J.M. Bremner and H.D. Brunk, Statistical inference under order restrictions: Theory and application of isotonic regression. John Wiley & Sons (1972). | MR | Zbl

H.H. Bauschke and J.M. Borwein, On the Convergence of von Neumann’s Alternating Projection Algorithm for Two Sets. Set-Valued Anal. 1 (1993) 185–212. | MR | Zbl

H.H. Bauschke and J.M. Borwein, Dykstra’s alternating projection algorithm for two sets. J. Approx. Theory 79 (1994) 418–443. | MR | Zbl

M.J. Best and N. Chakravarti, Active set algorithms for isotonic regression; An unifying framework. Math. Program. 47 (1990) 425–439. | MR | Zbl

H.D. Brunk, Estimation of isotonic regression. Cambridge University Press (1970) 177–195. | MR

H.D. Brunk, Maximum likelihood estimates of monotone parameters. Ann. Math. Stat. (1955) 607–616. | MR | Zbl

V.V. Buldygin and Yu.V. Kozachenko, Metric Characterization of Random Variables and Random Processes. American Mathematical Society (2000). | MR | Zbl

A. Buja, T.J. Hastie and R.J. Tibshirani, Linear smoothers and additive models. Ann. Stat. 17 (1989) 453–510. | MR | Zbl

F. Deutsch, The method of alternating orthogonal projections. Approximation Theory, Spline Functions and Applications, edited by S.P. Singh (1991) 105–121. | MR | Zbl

C. Durot, On the Lp-error of monotonicity constrained estimators. Ann. Stat. 35 (2007) 1080–1104. | MR | Zbl

R.L. Dykstra, An isotonic regression algorithm. J. Stat. Plann. Inference 5 (1981) 355–363. | MR | Zbl

J.H. Friedman and W. Stuetzle, Projection pursuit regression. J. Amer. Stat. Assoc. (1981) 817–823. | MR

A. Guyader, N. Jégou, A.B. Németh and S.N. Németh, A Geometrical Approach to Iterative Isotone Regression. Appl. Math. Comput. 227 (2014) 359–369. | MR | Zbl

L. Györfi, M. Kohler, A. Krzyżak and H. Walk, A distribution-free theory of nonparametric regression. Springer-Verlag, New York (2002). | MR | Zbl

D.L. Hanson, G. Pledger and F.T. Wright, On consistency in monotonic regression. Ann. Stat. 1 (1973) 401–421. | MR | Zbl

T.J. Hastie and R.J. Tibshirani, Generalized additive models. Chapman & Hall/CRC (1990). | MR | Zbl

W. Härdle and P. Hall, On the backfitting algorithm for additive regression models. Statistica Neerlandica 47 (1993) 43–57. | MR | Zbl

N.W. Hengartner and S. Sperlich, Rate optimal estimation with the integration method in the presence of many covariates. J. Multivar. Anal. 95 (2005) 246–272. | MR | Zbl

J. Horowitz, J. Klemelä and E. Mammen, Optimal estimation in additive regression models. Bernoulli 12 (2006) 271–298. | MR | Zbl

W. Kim, O.B. Linton and N.W. Hengartner, A computationally efficient oracle estimator for additive nonparametric regression with bootstrap confidence intervals. J. Comput. Graph. Stat. 8 (1999) 278–297. | MR

C.I.C. Lee, The min-max algorithm and isotonic regression. Ann. Stat. 11 (1983) 467–477. | MR | Zbl

E. Mammen and K. Yu, Additive isotone regression. In: Asymptotics: Particles, Processes and Inverse Problems, Lect. Notes Monogr. Series 55 (2007) 179–195. | MR | Zbl

E. Mammen, O. Linton and J. Nielsen, The existence and asymptotic properties of a backfitting projection algorithm under weak conditions. Ann. Stat. 27 (1999) 1443–1490. | MR | Zbl

M. Meyer and M. Woodroofe, On the Degrees of Freedom in Shape-Restricted Regression. Ann. Stat. 28 (2000) 1083–1104. | MR | Zbl

J.D. Opsomer, Asymptotic properties of backfitting estimators. J. Multivar. Anal. 73 (2000) 166–179. | MR | Zbl

J.D. Opsomer and D. Ruppert, Fitting a bivariate additive model by local polynomial regression. Ann. Stat. 25 (1997) 186–211. | MR | Zbl

T. Robertson, F.T. Wright and R.L. Dykstra, Order Restricted Statistical Inference. Wiley, New York (1988). | MR | Zbl

S. van de Geer and M. Wegkamp, Consistency for the least squares estimator in nonparametric regression. Ann. Stat. 24 (1996) 2513–2523. | MR | Zbl

S. van de Geer, Empirical Processes in M-Estimation. Cambridge University Press (2000). | Zbl
