This article shows how to smoothly "monotonize" standard kernel estimators of the hazard rate, using bootstrap weights. Our method takes a variety of forms, depending on the choice of kernel estimator and on the distance function used to define a certain constrained optimization problem. We confine attention to a particularly simple kernel approach and explore a range of distance functions. It is straightforward to reduce "quadratic" inequality constraints to "linear" equality constraints, so our method may be implemented using little more than conventional Newton-Raphson iteration; the necessary computational techniques are thus very familiar to statisticians. We show both numerically and theoretically that monotonicity, in either direction, can generally be imposed on a kernel hazard rate estimator regardless of whether the true hazard rate is itself monotone. The case of censored data is easily accommodated. Our methods extend straightforwardly to the problem of testing for monotonicity of the hazard rate, where the distance function plays the role of a test statistic.
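The idea of monotonizing via bootstrap weights can be sketched as follows: attach a probability weight to each observation, form a weighted kernel hazard estimate, and choose the weights to minimize a distance from the uniform weights subject to the estimate being monotone on a grid. The sketch below is illustrative only, not the authors' exact recipe: the Gaussian kernel, the rule-of-thumb bandwidth, the quadratic distance, and the density-over-survivor form of the hazard estimator are all assumptions made for concreteness, and the constrained optimization is handed to a generic solver rather than the Newton-Raphson scheme described in the article.

```python
# Illustrative sketch (not the article's exact algorithm): impose a
# nondecreasing hazard on a weighted kernel estimator by tilting the
# bootstrap weights p away from uniform as little as possible.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def hazard(t, x, p, b):
    """Weighted kernel hazard estimate: f_hat(t) / (1 - F_hat(t))."""
    z = (t[:, None] - x[None, :]) / b
    f = (p[None, :] * norm.pdf(z)).sum(axis=1) / b      # weighted density
    S = 1.0 - (p[None, :] * norm.cdf(z)).sum(axis=1)    # weighted survivor
    return f / np.maximum(S, 1e-8)

rng = np.random.default_rng(0)
x = np.sort(rng.weibull(1.5, size=60))   # Weibull(1.5): increasing hazard
n = len(x)
b = 1.06 * x.std() * n ** (-0.2)         # rule-of-thumb bandwidth (assumed)
grid = np.linspace(x[5], x[-10], 20)     # interior evaluation grid

# Quadratic distance from uniform weights (one of many possible choices).
def dist(p):
    return np.sum((p - 1.0 / n) ** 2)

# Constraints: p is a probability vector, and the hazard estimate is
# nondecreasing across the grid (finite differences nonnegative).
cons = [{"type": "eq", "fun": lambda p: p.sum() - 1.0},
        {"type": "ineq", "fun": lambda p: np.diff(hazard(grid, x, p, b))}]
res = minimize(dist, np.full(n, 1.0 / n), method="SLSQP",
               constraints=cons, bounds=[(0.0, 1.0)] * n)
h_mono = hazard(grid, x, res.x, b)       # monotonized estimate on the grid
```

The quadratic distance used here corresponds to the simplest member of the range of distance functions the article explores; swapping in another divergence only changes `dist`.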
Journal of Computational and Graphical Statistics, 2001