Sequence classification is a central task in NLP. A representative method is the HMM, a generative model; combining its sequential structure with the Maximum Entropy Model (MEM), a discriminative model, yields the MEMM. The CRF was later developed to overcome the Achilles' heel of the MEMM, the label bias problem. Currently, the CRF is the state-of-the-art technique for sequence classification in many NLP applications. I have attempted to summarize the concepts of the MEMM and the CRF here:
http://web.donga.ac.kr/yjko/usefulthings/MEMM&CRF.pdf
I have a question based on your summary note, specifically page 26.
The MEMM has a label bias problem because of its locally normalized probabilities, but the CRF does not because it uses local potentials. What is a 'local potential'? What is the meaningful difference between a probability and a potential (maybe a frequency)?
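To make the probability-vs-potential distinction concrete, here is a tiny numeric sketch (all scores and state names are hypothetical, not from the summary note). A potential is just a non-negative score with no normalization requirement; the MEMM turns such scores into probabilities by normalizing at each state (locally), while the CRF multiplies the raw potentials along the whole sequence and normalizes only once, globally, by the partition function Z:

```python
import itertools

# Hypothetical transition potentials phi[(prev_state, state)]: arbitrary
# non-negative scores, NOT probabilities (they need not sum to 1).
states = ["A", "B"]
phi = {("A", "A"): 4.0, ("A", "B"): 1.0,
       ("B", "A"): 1.0, ("B", "B"): 4.0}
start = {"A": 1.0, "B": 1.0}

def memm_prob(seq):
    """MEMM-style: normalize locally, per source state, at every step."""
    p = start[seq[0]] / sum(start.values())
    for prev, cur in zip(seq, seq[1:]):
        z_local = sum(phi[(prev, s)] for s in states)  # per-state normalizer
        p *= phi[(prev, cur)] / z_local                # a local *probability*
    return p

def crf_score(seq):
    """CRF-style: multiply raw potentials; no normalization yet."""
    score = start[seq[0]]
    for prev, cur in zip(seq, seq[1:]):
        score *= phi[(prev, cur)]                      # a local *potential*
    return score

# Enumerate all label sequences of length 3 over the toy state set.
seqs = [list(s) for s in itertools.product(states, repeat=3)]

# CRF: normalize once, globally, over ALL sequences (partition function Z).
Z = sum(crf_score(s) for s in seqs)
crf_probs = {tuple(s): crf_score(s) / Z for s in seqs}

# Both define valid distributions over whole sequences:
assert abs(sum(memm_prob(s) for s in seqs) - 1.0) < 1e-9
assert abs(sum(crf_probs.values()) - 1.0) < 1e-9
```

The label bias problem comes from the local normalizer `z_local`: in an MEMM a state with few outgoing transitions must push its probability mass onto them regardless of the observation, whereas the CRF's single global normalization lets potentials along competing paths be compared on an equal footing.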
P.S. There seems to be a typo in the edge frequency number 10 (state2, observation2).