Physics Based Multi-Touch Movement Interface Creation for 3D Modeling and Simulation
Navy SBIR FY2012.1


Sol No.: Navy SBIR FY2012.1
Topic No.: N121-061
Topic Title: Physics Based Multi-Touch Movement Interface Creation for 3D Modeling and Simulation
Proposal No.: N121-061-1285
Firm: Jardon & Howard Technologies Incorporated
2710 Discovery Drive
Suite 100
Orlando, Florida 32826
Contact: Alberto Vasquez
Phone: (407) 435-7769
Web Site: www.jht.com
Abstract: Current multi-touch, pressure-sensitive Commercial Off The Shelf (COTS) hardware and software (e.g., Apple iPhones and Android phones) allow users to interact seamlessly with software applications through gestures, but they underutilize that potential by supporting only a limited set of Human Computer Interactions (HCI), such as panning and zooming. Multi-touch, pressure-sensitive hand movements, hereafter called gestures, combined with real-time physics can produce more realistic and intuitive software applications. JHT's goal is to research, define, and develop a reusable real-time physics, navigation, and meta-gesture Application Programming Interface (API) and accompanying standards for training, simulation, architectural design, and entertainment applications on COTS multi-touch hardware. JHT will identify an initial set of user-defined gestures and associated metrics by creating a gesture data collection apparatus, gathering data with the apparatus, and analyzing the results. The Phase I exploratory study will identify user-defined gestures by presenting participants with the effects of gestures and eliciting the hand movements they would use to invoke those effects. Exploratory development and testing of the physics and navigation gesture API will require a physics visualization application, created by developing a render engine and integrating a physics engine with it.
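As a hedged illustration of the kind of binding such an API might standardize, the C++ sketch below shows one way a pressure-sensitive drag gesture could be translated into an impulse on a physics-simulated object. All names (TouchPoint, GestureImpulse, Body, recognizeDrag) are hypothetical and are not part of JHT's proposed API; real gesture classification and physics integration would follow the Phase I study and the selected engines.

// Hypothetical sketch: mapping a pressure-sensitive drag gesture to a
// physics impulse. Names and logic are illustrative assumptions only.
#include <cmath>
#include <cstdio>
#include <vector>

// One touch sample from pressure-sensitive COTS multi-touch hardware.
struct TouchPoint {
    float x, y;       // screen coordinates (pixels)
    float pressure;   // normalized 0..1
};

// A recognized gesture expressed as a physical quantity.
struct GestureImpulse {
    float dx, dy;     // drag direction in screen space (unit vector)
    float magnitude;  // scaled by pressure so harder presses push harder
};

// Toy rigid body standing in for an object owned by a physics engine.
struct Body {
    float x = 0.f, y = 0.f;
    float vx = 0.f, vy = 0.f;
    void applyImpulse(const GestureImpulse& g) {
        vx += g.dx * g.magnitude;
        vy += g.dy * g.magnitude;
    }
    void step(float dt) { x += vx * dt; y += vy * dt; }
};

// Interpret two successive touch samples as a one-finger drag gesture.
GestureImpulse recognizeDrag(const TouchPoint& prev, const TouchPoint& curr) {
    float dx = curr.x - prev.x;
    float dy = curr.y - prev.y;
    float len = std::sqrt(dx * dx + dy * dy);
    if (len < 1e-3f) return {0.f, 0.f, 0.f};  // no meaningful movement
    return {dx / len, dy / len, len * curr.pressure};
}

int main() {
    Body crate;
    std::vector<TouchPoint> samples = {
        {100.f, 100.f, 0.3f},
        {140.f, 100.f, 0.8f},  // finger moved right while pressing harder
    };
    crate.applyImpulse(recognizeDrag(samples[0], samples[1]));
    for (int i = 0; i < 3; ++i) {
        crate.step(1.f / 60.f);  // fixed 60 Hz physics step
        std::printf("crate at (%.2f, %.2f)\n", crate.x, crate.y);
    }
    return 0;
}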
Benefits: JHT anticipates that the exploratory study will clearly define the parameters and metrics needed to develop a reusable real-time gesture API. The most substantial benefits of the effort are determining the feasibility of our approach to providing intuitive, reusable HCI for users, reducing training gaps and the cost of expensive operator and maintenance equipment training, and expanding access to these technologies for training, simulation, architectural design, and entertainment software applications. We also anticipate benefits beyond those realized by current gesture methods and expect this program to provide a path toward the next frontier of innovation in the HCI field. Simulated equipment training is becoming a necessity across industries as a scalable, cost-effective way to reach a larger audience. This market spans numerous industries that require operator and maintenance training solutions, including defense, energy, education, industrial production, communications, and transport. Developing and commercializing the proposed gesture API for games would provide new and expanded experiences for users who anticipate evolution beyond the mouse, the keyboard, and the limited set of available gestures. Games could be developed in-house using the gesture API and sold as products, or game companies could purchase licenses for the API to enhance existing and new titles.
