Isotonic regression model.
Read more in the :ref:`User Guide <isotonic>`.
.. versionadded:: 0.13
Parameters
----------
y_min : float, default=None
    Lower bound on the lowest predicted value (the minimum value may
    still be higher). If not set, defaults to -inf.

y_max : float, default=None
    Upper bound on the highest predicted value (the maximum may still
    be lower). If not set, defaults to +inf.

increasing : bool or 'auto', default=True
    Determines whether the predictions should be constrained to
    increase or decrease with `X`. 'auto' will decide based on the sign
    of the Spearman correlation estimate.

out_of_bounds : str, default='nan'
    Determines how `X` values outside of the training domain are
    handled. When set to 'nan', predictions will be NaN. When set to
    'clip', predictions will be set to the value corresponding to the
    nearest train interval endpoint. When set to 'raise', a
    `ValueError` is raised.
Attributes
----------
X_min_ : float
    Minimum value of input array `X_` for left bound.

X_max_ : float
    Maximum value of input array `X_` for right bound.

f_ : function
    The stepwise interpolating function that covers the input domain
    ``X``.

increasing_ : bool
    Inferred value for ``increasing``.
Notes
-----
Ties are broken using the secondary method from Leeuw, 1977.
References
----------
Isotonic Median Regression: A Linear Programming Approach
Nilotpal Chakravarti
Mathematics of Operations Research
Vol. 14, No. 2 (May 1989), pp. 303-308

Isotone Optimization in R: Pool-Adjacent-Violators Algorithm (PAVA)
and Active Set Methods
Leeuw, Hornik, Mair
Journal of Statistical Software, 2009

Correctness of Kruskal's algorithms for monotone regression with ties
Leeuw, Psychometrika, 1977
Examples
--------
>>> from sklearn.datasets import make_regression
>>> from sklearn.isotonic import IsotonicRegression
>>> X, y = make_regression(n_samples=10, n_features=1, random_state=41)
>>> iso_reg = IsotonicRegression().fit(X.flatten(), y)
>>> iso_reg.predict([.1, .2])
array([1.8628..., 3.7256...])
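A minimal sketch (not part of the original docstring, with made-up data) of how the ``out_of_bounds`` options described above behave on inputs outside the training domain:

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

# Already-monotone toy data, so the fit reproduces the endpoints exactly.
X = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 2.0, 2.0, 4.0])

# 'clip' snaps out-of-domain inputs to the nearest train interval endpoint.
clipper = IsotonicRegression(out_of_bounds='clip').fit(X, y)
print(clipper.predict([0.0, 5.0]))  # [1. 4.]

# 'nan' (the default) returns NaN for out-of-domain inputs.
nanner = IsotonicRegression(out_of_bounds='nan').fit(X, y)
print(nanner.predict([0.0, 5.0]))  # [nan nan]
```

With 'raise' instead, the same `predict` call would raise a `ValueError`.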
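A second hedged sketch (again with made-up data) showing ``increasing='auto'`` inferring the direction of the constraint, and ``y_min``/``y_max`` bounding the fitted values:

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

X = np.arange(5, dtype=float)              # 0, 1, 2, 3, 4
y = np.array([10.0, 8.0, 9.0, 5.0, 2.0])   # broadly decreasing

# 'auto' picks the direction from the sign of the Spearman correlation;
# here the correlation is negative, so a decreasing fit is chosen.
auto_reg = IsotonicRegression(increasing='auto').fit(X, y)
print(auto_reg.increasing_)  # False

# y_min / y_max clip the fitted values into [4, 9].
bounded = IsotonicRegression(increasing='auto', y_min=4.0, y_max=9.0).fit(X, y)
print(bounded.predict(X))    # all values lie within [4, 9]
```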