Naive Bayes classifier for categorical features
The categorical Naive Bayes classifier is suitable for classification with discrete features that are categorically distributed. The categories of each feature are drawn from a categorical distribution.
Read more in the :ref:`User Guide <categorical_naive_bayes>`.
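The smoothed category probabilities are estimated per feature and class
(a sketch of the usual Lidstone-smoothed categorical estimate implied by
the ``alpha`` parameter below; the symbols :math:`N_{tic}`, :math:`N_c`
and :math:`n_i` are introduced here only for illustration):

.. math::

    P(x_i = t \mid y = c; \alpha) = \frac{N_{tic} + \alpha}{N_c + \alpha n_i}

where :math:`N_{tic}` is the number of samples of class :math:`c` whose
feature :math:`i` takes category :math:`t`, :math:`N_c` is the number of
samples of class :math:`c`, and :math:`n_i` is the number of available
categories of feature :math:`i`.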
Parameters
----------
alpha : float, default=1.0
    Additive (Laplace/Lidstone) smoothing parameter
    (0 for no smoothing).

fit_prior : bool, default=True
    Whether to learn class prior probabilities or not.
    If false, a uniform prior will be used.

class_prior : array-like of shape (n_classes,), default=None
    Prior probabilities of the classes. If specified, the priors are not
    adjusted according to the data.
Attributes
----------
category_count_ : list of arrays of shape (n_features,)
    Holds arrays of shape (n_classes, n_categories of respective feature)
    for each feature. Each array provides the number of samples
    encountered for each class and category of the specific feature.

class_count_ : ndarray of shape (n_classes,)
    Number of samples encountered for each class during fitting. This
    value is weighted by the sample weight when provided.

class_log_prior_ : ndarray of shape (n_classes,)
    Smoothed empirical log probability for each class.

classes_ : ndarray of shape (n_classes,)
    Class labels known to the classifier.

feature_log_prob_ : list of arrays of shape (n_features,)
    Holds arrays of shape (n_classes, n_categories of respective feature)
    for each feature. Each array provides the empirical log probability
    of categories given the respective feature and class, ``P(x_i|y)``.

n_features_ : int
    Number of features of each sample.
Examples
--------
>>> import numpy as np
>>> rng = np.random.RandomState(1)
>>> X = rng.randint(5, size=(6, 100))
>>> y = np.array([1, 2, 3, 4, 5, 6])
>>> from sklearn.naive_bayes import CategoricalNB
>>> clf = CategoricalNB()
>>> clf.fit(X, y)
CategoricalNB()
>>> print(clf.predict(X[2:3]))
[3]
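The fitted estimator also exposes per-class counts and probability
estimates; the following continuation of the example above is an
illustrative sketch (the outputs assume the six single-sample classes
generated above):

>>> clf.class_count_
array([1., 1., 1., 1., 1., 1.])
>>> clf.predict_proba(X[2:3]).shape
(1, 6)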