College of Computer and Information Science, Northeastern University

Towards Building Better Predictive Text Entry Methods for Mobile Devices

Gong Jun
Computer and Information Sciences, Northeastern University
October 19, 2006 1:30 PM to 2:15 PM

     Even as mobile devices grow more popular, text entry on them is becoming more difficult: shrinking device sizes limit the available display space, input modalities, and interaction techniques. Researchers have found predictive text entry methods to be efficient on devices, such as mobile phones, that use keypads instead of full keyboards. However, word ambiguity, limited dictionary sizes, and lengthy learning curves keep these methods from being practical in many situations. This research proposes solutions to those problems, with the aim of producing more effective predictive text entry methods for mobile devices that have only a limited number of keys or buttons.
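     To make the ambiguity problem concrete, here is a minimal sketch of dictionary-based (T9-style) predictive entry in Python. The keypad mapping is the standard 12-key layout; the toy dictionary and its word frequencies are invented for illustration and are not drawn from the talk.

    from collections import defaultdict

    # Standard 12-key phone keypad: each letter maps to one digit.
    KEYPAD = {c: d for d, letters in {
        '2': 'abc', '3': 'def', '4': 'ghi', '5': 'jkl',
        '6': 'mno', '7': 'pqrs', '8': 'tuv', '9': 'wxyz',
    }.items() for c in letters}

    def key_sequence(word):
        """The digit sequence a user presses to enter this word."""
        return ''.join(KEYPAD[c] for c in word.lower())

    class PredictiveDictionary:
        def __init__(self, word_freqs):
            # Words sharing a key sequence are the source of ambiguity:
            # 4663 matches 'good', 'home', 'gone', and 'hood'.
            self.index = defaultdict(list)
            for word, freq in word_freqs.items():
                self.index[key_sequence(word)].append((freq, word))
            for cands in self.index.values():
                cands.sort(reverse=True)  # most frequent candidate first

        def candidates(self, digits):
            return [w for _, w in self.index.get(digits, [])]

    # Hypothetical frequencies, for illustration only.
    d = PredictiveDictionary({'good': 120, 'home': 90, 'gone': 40, 'hood': 5})
    print(d.candidates('4663'))  # ['good', 'home', 'gone', 'hood']

     A real method must choose among such candidates; ranking by corpus frequency, as above, is the simplest strategy, and its failure on out-of-dictionary words is one reason limited dictionary sizes hurt in practice.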

Can your computer tell if you are sad?

Harriet Fell
Computer and Information Sciences, Northeastern University
October 19, 2006 2:15 PM to 3:00 PM

     Imagine your computer responding with sympathy when you are sad, explaining things more simply when you are frustrated, or speaking calmly to you when you are stressed.
     Researchers are working on recognizing emotion from a variety of inputs, including pulse, temperature, blood volume pressure, respiration, general somatic activity, and galvanic skin response. In many situations, however, it is not possible to collect this kind of physiological information. Human listeners are clearly sensitive to affect in speech even when the speaker is not visible, e.g., on radio broadcasts and telephone calls. We want to build that same kind of sensitivity into computers through automatic detection of emotional state from acoustic information alone.
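     As a rough illustration of what "acoustic information alone" can mean, the Python sketch below computes two common prosodic cues, short-time energy and a crude autocorrelation pitch estimate, from a mono waveform. The feature set, frame sizes, and 75-400 Hz pitch search range are generic assumptions for illustration, not the speaker's actual method.

    import numpy as np

    def frame_signal(x, frame_len, hop):
        """Slice a mono waveform into overlapping frames."""
        n = 1 + max(0, (len(x) - frame_len) // hop)
        return np.stack([x[i * hop:i * hop + frame_len] for i in range(n)])

    def prosodic_features(x, sr=16000):
        frames = frame_signal(x, frame_len=int(0.03 * sr), hop=int(0.01 * sr))
        # Short-time energy: high, variable energy often accompanies
        # high-arousal states such as anger or excitement.
        energy = (frames ** 2).mean(axis=1)
        # Crude per-frame pitch estimate: the autocorrelation peak
        # within a 75-400 Hz search range.
        pitches = []
        for f in frames:
            f = f - f.mean()
            ac = np.correlate(f, f, mode='full')[len(f) - 1:]
            lo, hi = sr // 400, sr // 75
            lag = lo + np.argmax(ac[lo:hi])
            pitches.append(sr / lag)
        pitches = np.array(pitches)
        return {
            'mean_energy': float(energy.mean()),
            'energy_var': float(energy.var()),
            'mean_pitch_hz': float(pitches.mean()),
            'pitch_range_hz': float(pitches.max() - pitches.min()),
        }

    # Sanity check on a synthetic 220 Hz tone: mean_pitch_hz lands near 220.
    t = np.linspace(0.0, 1.0, 16000, endpoint=False)
    print(prosodic_features(np.sin(2 * np.pi * 220 * t)))

     In practice a classifier maps such feature vectors to emotion labels; the sketch only illustrates the kind of acoustic evidence available when the speaker is not visible.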
     This talk will present an introduction to speech processing for the purpose of detecting emotion in speech, with a look at some current and planned work.
