Developing Multimodal Applications for New Platforms

Event details

Date: Coordinated Universal Time
Location: Washington, D.C., USA
Speakers: Deborah Dahl

Multimodal interfaces, which combine speech, graphics, and sensor input, are becoming increasingly important for interacting with the rapidly expanding variety of nontraditional platforms, including mobile and wearable devices, robots, and devices in the Internet of Things. User interfaces on these platforms will need to be far more varied than traditional ones. We demonstrate how to develop multimodal clients using standards such as WebRTC, Web Audio, and WebSockets on the Open Web Platform, including open technologies such as HTML5, JavaScript, and CSS. We also discuss integration with cloud resources for technologies such as speech recognition and natural language understanding. A short sketch of this style of client code follows below.

Attendees should have access to a browser that supports the Open Web Platform standards, for example a current version of Chrome, Firefox, or Opera. Basic knowledge of HTML5 and JavaScript would be very helpful.
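To give a flavor of the kind of client the talk describes, the sketch below captures microphone audio with getUserMedia (WebRTC media capture), taps it through the Web Audio API, and streams raw samples over a WebSocket to a cloud speech-recognition service. The endpoint URL (wss://speech.example.com/recognize) and the JSON transcript format are assumptions for illustration, not part of the talk's materials; any real service will define its own protocol.

```typescript
// Minimal sketch: browser-side capture and streaming of speech audio.
// Endpoint and reply format are hypothetical placeholders.
const SPEECH_ENDPOINT = "wss://speech.example.com/recognize"; // assumed service URL

async function startSpeechStreaming(): Promise<void> {
  // Ask the user for microphone access (WebRTC media capture).
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });

  // Open a WebSocket connection to the (assumed) recognition service.
  const socket = new WebSocket(SPEECH_ENDPOINT);
  socket.binaryType = "arraybuffer";

  // Route the microphone through a Web Audio graph so we can read raw samples.
  const audioContext = new AudioContext();
  const source = audioContext.createMediaStreamSource(stream);

  // ScriptProcessorNode is deprecated in favor of AudioWorklet, but it keeps
  // the sketch short; each callback delivers a buffer of Float32 samples.
  const processor = audioContext.createScriptProcessor(4096, 1, 1);

  processor.onaudioprocess = (event: AudioProcessingEvent) => {
    if (socket.readyState === WebSocket.OPEN) {
      // Copy the samples (the underlying buffer is reused) and send them as binary.
      const samples = event.inputBuffer.getChannelData(0);
      socket.send(samples.slice().buffer);
    }
  };

  source.connect(processor);
  processor.connect(audioContext.destination);

  // Assume the service replies with JSON such as {"transcript": "..."}.
  socket.onmessage = (message: MessageEvent<string>) => {
    const result = JSON.parse(message.data);
    console.log("Recognized:", result.transcript);
  };
}

startSpeechStreaming().catch((err) => console.error("Microphone or network error:", err));
```

In practice the audio would normally be downsampled and encoded (for example to 16 kHz PCM or Opus) before transmission, and the cloud service's own message protocol would be followed; the sketch only shows how WebRTC capture, Web Audio, and WebSockets fit together in a multimodal client.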