This course introduces the key concepts underlying statistical natural language processing. Students will learn a variety of techniques for the computational modeling of natural language, including: n-gram models, smoothing, hidden Markov models, Bayesian inference, expectation maximization, the Viterbi algorithm, the inside-outside algorithm for probabilistic context-free grammars, and higher-order language models. Graduate-level requirements include assignments of greater scope than undergraduate assignments. In addition to being more in-depth, graduate assignments are typically longer, and additional readings are required.
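As a small taste of two of the topics above, the sketch below trains a bigram model with add-one (Laplace) smoothing on a toy sentence. This is an illustrative example only, not course material: the function names, the toy corpus, and the choice of add-one smoothing are assumptions made for the demonstration.

```python
from collections import defaultdict

def train_bigram_counts(tokens):
    # Collect unigram and bigram counts from a single token sequence.
    unigrams = defaultdict(int)
    bigrams = defaultdict(int)
    for w1, w2 in zip(tokens, tokens[1:]):
        unigrams[w1] += 1
        bigrams[(w1, w2)] += 1
    unigrams[tokens[-1]] += 1  # count the final token's unigram too
    return unigrams, bigrams

def bigram_prob(w1, w2, unigrams, bigrams, vocab_size):
    # Add-one (Laplace) smoothed estimate of P(w2 | w1):
    # every bigram count is incremented by 1, so unseen pairs
    # still receive nonzero probability.
    return (bigrams[(w1, w2)] + 1) / (unigrams[w1] + vocab_size)

# Toy corpus with sentence-boundary markers (hypothetical example data).
tokens = "<s> the cat sat on the mat </s>".split()
unigrams, bigrams = train_bigram_counts(tokens)
V = len(unigrams)  # vocabulary size for the smoothing denominator

p_seen = bigram_prob("the", "cat", unigrams, bigrams, V)
p_unseen = bigram_prob("cat", "on", unigrams, bigrams, V)
```

Without smoothing, the unseen bigram ("cat", "on") would get probability zero; add-one smoothing instead assigns it a small positive probability, which is the basic motivation for the smoothing techniques covered in the course.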
As we work together to battle the coronavirus, we will continue to offer safe and secure online sessions. Even though our physical office is closed, in accordance with the guidelines recommended by the CDC, we are working remotely and continuing to provide student, staff, and faculty assistance. We can be reached Monday through Friday, 9am-4pm Mountain Standard Time, at 520-621-3565 or by email; please refer to the iSchool Directory. Please allow up to 24 hours for a response. Faculty and adjuncts will respond as their schedules permit.