Bayesian networks tend to be used not as classifiers but as a tool for exploring joint probability distributions.
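To make that concrete, here's a minimal hand-rolled sketch (my own toy example, no BN library; the Rain/Sprinkler/WetGrass setup and all the probabilities are made up for illustration): a three-node network whose joint distribution factorizes per the graph, which you can then query rather than treat as a classifier.

```python
# Toy Bayesian network: Rain -> WetGrass <- Sprinkler (all numbers invented).
# The point is exploring the joint distribution, not classification.
import itertools

p_rain = {True: 0.2, False: 0.8}
p_sprinkler = {True: 0.3, False: 0.7}      # independent of Rain in this graph
p_wet = {                                  # P(Wet=True | Rain, Sprinkler)
    (True, True): 0.99, (True, False): 0.9,
    (False, True): 0.8, (False, False): 0.05,
}

# The network structure says the joint factorizes as P(R) P(S) P(W | R, S).
joint = {}
for r, s, w in itertools.product([True, False], repeat=3):
    pw = p_wet[(r, s)] if w else 1 - p_wet[(r, s)]
    joint[(r, s, w)] = p_rain[r] * p_sprinkler[s] * pw

# Query the joint: P(Rain | Wet) by marginalising and normalising.
p_wet_total = sum(p for (r, s, w), p in joint.items() if w)
p_rain_given_wet = sum(p for (r, s, w), p in joint.items() if w and r) / p_wet_total
print(round(p_rain_given_wet, 3))  # → 0.457
```

Once you have the joint, any marginal or conditional query is just a sum and a divide; that "explore the distribution" workflow is what BN tooling automates at scale.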
Interestingly for your topic, HMMs and Naive Bayes are related: an HMM is essentially the sequence-sensitive version of Naive Bayes. Both are generative models: they model/estimate a joint probability over the data under very strong conditional independence assumptions. Logistic regression and Conditional Random Fields, by contrast, estimate the conditional probability of the output labels directly.
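The generative side is easy to show in a few lines (my own toy example; the data is made up): Bernoulli Naive Bayes estimates the joint p(x, y) = p(y) ∏ p(x_j | y), with the independence assumption baked into that product, and only then derives p(y | x) via Bayes' rule. A discriminative model would fit p(y | x) directly instead.

```python
# Hand-rolled Bernoulli Naive Bayes on invented binary data.
import numpy as np

X = np.array([[1, 1, 0],
              [1, 0, 0],
              [0, 1, 1],
              [0, 0, 1]])
y = np.array([0, 0, 1, 1])

classes = np.unique(y)
prior = np.array([(y == c).mean() for c in classes])    # p(y)
# p(x_j = 1 | y) per class, with Laplace smoothing
cond = np.array([(X[y == c].sum(0) + 1) / ((y == c).sum() + 2) for c in classes])

def joint(x):
    # p(x, y) for each class; the product over features IS the
    # conditional independence assumption.
    lik = (cond ** x * (1 - cond) ** (1 - x)).prod(1)
    return prior * lik

x_new = np.array([1, 1, 0])
jp = joint(x_new)
posterior = jp / jp.sum()          # Bayes' rule recovers p(y | x)
print(posterior.argmax())          # → 0
```

An HMM does the same joint-probability trick, but with the label sequence chained through transition probabilities instead of a single class prior.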
HMMs are to Naive Bayes as linear-chain CRFs are to logistic regression. CRFs are state of the art at sequence and time-series prediction, though I have not yet got my head round them. The relationship between logistic regression and Naive Bayes is not commonly known (although the comparison of logistic regression to a simple neural network is common). Knowing when logistic regression outperforms Naive Bayes is useful; a simple rule of thumb: logistic regression is less sensitive to the independence assumption, so with more data use logistic regression, with less data use Naive Bayes. I've implemented a multi-class sparse regularized logistic regression: SMLR. It's up there with linear SVMs but simpler, and it also gives a probability.
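The rule of thumb is easy to probe empirically. Below is a sketch of such an experiment (my own toy setup, not SMLR: synthetic correlated Gaussian features, so Naive Bayes' independence assumption is deliberately violated, compared with scikit-learn's GaussianNB and LogisticRegression at a few training-set sizes). On any particular synthetic dataset the gap can be small; the point is the mechanics of the comparison, not a definitive benchmark.

```python
# Compare Naive Bayes vs logistic regression as training data grows.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

def make_data(n_per_class):
    # Correlated features violate NB's independence assumption.
    cov = [[1.0, 0.8], [0.8, 1.0]]
    X0 = rng.multivariate_normal([-1, -1], cov, size=n_per_class)
    X1 = rng.multivariate_normal([1, 1], cov, size=n_per_class)
    return np.vstack([X0, X1]), np.array([0] * n_per_class + [1] * n_per_class)

X_test, y_test = make_data(2000)
for n_train in (10, 50, 1000):
    X_tr, y_tr = make_data(n_train)
    nb = GaussianNB().fit(X_tr, y_tr).score(X_test, y_test)
    lr = LogisticRegression().fit(X_tr, y_tr).score(X_test, y_test)
    print(f"n={2 * n_train:5d}  NB acc={nb:.3f}  LR acc={lr:.3f}")
```

Ng and Jordan's classic comparison makes the same point formally: the generative model reaches its (higher) asymptotic error faster, while the discriminative model has the lower asymptote.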