Kernel Density Estimation.
Read more in the :ref:`User Guide <kernel_density>`.
Parameters
----------
bandwidth : float
    The bandwidth of the kernel.

algorithm : str
    The tree algorithm to use.  Valid options are
    ['kd_tree'|'ball_tree'|'auto'].  Default is 'auto'.

kernel : str
    The kernel to use.  Valid kernels are
    ['gaussian'|'tophat'|'epanechnikov'|'exponential'|'linear'|'cosine']
    Default is 'gaussian'.

metric : str
    The distance metric to use.  Note that not all metrics are
    valid with all algorithms.  Refer to the documentation of
    :class:`BallTree` and :class:`KDTree` for a description of
    available algorithms.  Note that the normalization of the density
    output is correct only for the Euclidean distance metric.  Default
    is 'euclidean'.

atol : float
    The desired absolute tolerance of the result.  A larger tolerance will
    generally lead to faster execution.  Default is 0.

rtol : float
    The desired relative tolerance of the result.  A larger tolerance will
    generally lead to faster execution.  Default is 1E-8.

breadth_first : bool
    If true (default), use a breadth-first approach to the problem.
    Otherwise use a depth-first approach.

leaf_size : int
    Specify the leaf size of the underlying tree.  See :class:`BallTree`
    or :class:`KDTree` for details.  Default is 40.

metric_params : dict
    Additional parameters to be passed to the tree for use with the
    metric.  For more information, see the documentation of
    :class:`BallTree` or :class:`KDTree`.
See Also
--------
sklearn.neighbors.KDTree : K-dimensional tree for fast generalized N-point
    problems.
sklearn.neighbors.BallTree : Ball tree for fast generalized N-point
    problems.
Examples
--------
Compute a gaussian kernel density estimate with a fixed bandwidth.

>>> import numpy as np
>>> from sklearn.neighbors import KernelDensity
>>> rng = np.random.RandomState(42)
>>> X = rng.random_sample((100, 3))
>>> kde = KernelDensity(kernel='gaussian', bandwidth=0.5).fit(X)
>>> log_density = kde.score_samples(X[:3])
>>> log_density
array([-1.52955942, -1.51462041, -1.60244657])
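Beyond scoring, a fitted estimator can also draw new points from the
estimated density.  The following is a minimal sketch (not part of the
doctest above); it assumes the same random data as the example, and relies
on the fact that sampling is implemented for the 'gaussian' and 'tophat'
kernels.

```python
import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.RandomState(42)
X = rng.random_sample((100, 3))

# Fit the estimator; a larger bandwidth yields a smoother density.
kde = KernelDensity(kernel='gaussian', bandwidth=0.5).fit(X)

# Draw 5 new points from the fitted density.  Sampling is only
# implemented for the 'gaussian' and 'tophat' kernels.
new_points = kde.sample(5, random_state=0)
print(new_points.shape)  # (5, 3)
```

The drawn points can be scored with ``score_samples`` like any other data,
which is a quick sanity check that they fall in a region of non-negligible
density.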