AVERAGED ONE DEPENDENCE ESTIMATORS FOR NAIVE BAYESIAN CLASSIFICATION
Abstract
Naive Bayes (NB) is a simple, computationally efficient probabilistic approach to classification learning. It assumes that all attributes are conditionally independent of each other given the class. In real-world applications, however, complete attribute independence rarely holds, and this limitation of NB motivates One-Dependence Estimators (ODE) and Averaged One-Dependence Estimators (AODE). An ODE relaxes the independence assumption by selecting a single attribute, the super-parent, on which every other attribute is allowed to depend in addition to the class; this model is then used to predict the class label. AODE improves on a single ODE by averaging over all models in which every attribute is conditionally dependent on the class and one common super-parent attribute, which often improves classification performance significantly. In this work, a new attribute selection approach is proposed for AODE. It can search a large model space while requiring only a single extra pass through the training data, resulting in a computationally efficient two-pass learning algorithm. Its low bias and computational efficiency make it an attractive algorithm for learning from big data.
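To make the averaging scheme described above concrete, the following is a minimal Python sketch of the standard AODE prediction rule for discrete attributes. It is an illustration under simple assumptions (Laplace smoothing, a frequency threshold m for super-parents, and toy data), not the authors' implementation; the class name AODESketch and all identifiers are hypothetical.

from collections import defaultdict

class AODESketch:
    """One counting pass over the training data, then prediction by
    averaging over one super-parent model per qualifying attribute value."""

    def __init__(self, m=1):
        self.m = m  # minimum training frequency for a value to serve as super-parent

    def fit(self, X, y):
        self.n_ = len(X)
        self.classes_ = sorted(set(y))
        self.value_freq_ = defaultdict(int)  # F(x_i): frequency of attribute value x_i
        self.joint_ = defaultdict(int)       # counts for (i, x_i, y)
        self.cond_ = defaultdict(int)        # counts for (i, x_i, y, j, x_j)
        for row, c in zip(X, y):
            for i, xi in enumerate(row):
                self.value_freq_[(i, xi)] += 1
                self.joint_[(i, xi, c)] += 1
                for j, xj in enumerate(row):
                    if j != i:
                        self.cond_[(i, xi, c, j, xj)] += 1
        return self

    def predict_one(self, x):
        scores = {}
        for c in self.classes_:
            score = 0.0
            for i, xi in enumerate(x):
                if self.value_freq_[(i, xi)] < self.m:
                    continue  # value too rare (or unseen) to act as super-parent
                # P(y, x_i), Laplace-smoothed
                p = (self.joint_[(i, xi, c)] + 1.0) / (self.n_ + len(self.classes_))
                for j, xj in enumerate(x):
                    if j == i:
                        continue
                    # P(x_j | y, x_i), Laplace-smoothed
                    p *= (self.cond_[(i, xi, c, j, xj)] + 1.0) / (self.joint_[(i, xi, c)] + 2.0)
                score += p  # average over super-parents (normalization does not affect argmax)
            scores[c] = score
        return max(scores, key=scores.get)

# Hypothetical toy usage:
X = [["sunny", "hot"], ["rainy", "cool"], ["sunny", "cool"], ["rainy", "hot"]]
y = ["no", "no", "yes", "yes"]
model = AODESketch().fit(X, y)
print(model.predict_one(["sunny", "cool"]))

Because each super-parent model uses only pairwise attribute statistics, the counts above can be gathered in a single pass over the training data, which is the property the proposed two-pass attribute selection scheme builds on.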