Animation and Analysis of Shipboard Aircraft Recovery Using Ship’s Geo-Referenced Data
Navy SBIR 2011.2 - Topic N112-111
NAVAIR - Ms. Donna Moore - [email protected]
Opens: May 26, 2011 - Closes: June 29, 2011

N112-111 TITLE: Animation and Analysis of Shipboard Aircraft Recovery Using Ship’s Geo-Referenced Data

TECHNOLOGY AREAS: Air Platform, Information Systems, Ground/Sea Vehicles

ACQUISITION PROGRAM: PMA-209, Air Combat Electronics

OBJECTIVE: Develop an algorithm that compensates for and merges separately generated aircraft and ship motion data into a single stream/file that, when animated, accurately depicts the aircraft landing on the ship until after the arresting gear engages.

DESCRIPTION: With Global Positioning System (GPS) data becoming available on various military vehicle platforms, the role of animated planning and debriefing is growing. The ability to represent the geospatial relationship of moving platforms relative to terrain and to each other provides value in areas such as training, mission planning, and post-mission debriefing. With GPS available, realistic animation of aircraft flight from recorded real-time data has become routine, and land-based operations are now depicted accurately from takeoff to landing with high fidelity. An as-yet untried extension of this concept, important because of the maritime nature of U.S. Navy operations, is to provide the same fidelity for operations in which the landing occurs aboard a moving platform. Specifically desired is the depiction of an aircraft landing aboard a moving aircraft carrier until after the arresting gear engages. Doing so requires integrating the ship’s position and motion with those of the aircraft.

The challenge is more complicated than merely merging the ship’s position with that of the aircraft at touchdown, because the ship’s reported position does not represent the point of landing but rather the location of the GPS antenna on the ship’s superstructure. Correctly compensating for this difference, which may seem insignificant while the two vehicles are far apart, is critical to the fidelity of the landing scene. Furthermore, the means of making this precise determination must be user friendly and must enable the operator to correct for noise and errors in the data. Because the level of compensation required may not be known initially, the user must also be able to apply run-time corrections. If the correct adjustments are not made, the animation will not accurately reflect the actual point at which the aircraft lands and will therefore be of little use for post-landing debriefings.

Achieving these objectives requires the development of algorithms that compensate for the differences between the instrumentation data currently provided and the actual motion and position of the ship and aircraft. Accurate representation of vehicle geometry, position, and axial motion is therefore a key element. Any resultant software must accept time-stamped parametric data (format: <time><value>) describing the position and axial attitudes of both the aircraft and the ship, compensate for positional offsets and data errors, and allow for user input of corrections. The proposed method must merge the motion paths of both vehicles to accurately depict the embarked landing of the aircraft. An analysis of the input data, the observed errors, and recommendations for improvement of the data specifications are also required.
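As a rough illustration only of the compensation and merging steps described above, the sketch below (Python) rotates a surveyed antenna-to-touchdown-point offset through the ship’s recorded attitude, applies it to the antenna position expressed in a local level frame, and resamples the ship and aircraft streams onto a common time base. The lever-arm values, field layout, local east-north-up approximation, and angle conventions are placeholder assumptions, not requirements of this topic.

import numpy as np

def body_to_local(roll, pitch, yaw):
    """Rotation from ship body axes to a local level (east-north-up) frame.
    Angles in radians, applied in yaw-pitch-roll order; sign conventions are assumed."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return rz @ ry @ rx

def touchdown_position(antenna_enu, roll, pitch, yaw, lever_arm_body, user_bias=np.zeros(3)):
    """Shift the ship's reported GPS-antenna position to the arresting-gear touchdown area.
    lever_arm_body is an assumed surveyed antenna-to-touchdown offset in ship body axes (metres);
    user_bias stands in for the operator's run-time correction."""
    return antenna_enu + body_to_local(roll, pitch, yaw) @ lever_arm_body + user_bias

def merge_tracks(ship_t, ship_xyz, acft_t, acft_xyz, dt=0.05):
    """Resample both time-stamped streams onto a common clock by linear interpolation
    so one merged record per frame can be handed to the animation tool.
    (Attitude channels would need wrap-aware interpolation, which is omitted here.)"""
    t0, t1 = max(ship_t[0], acft_t[0]), min(ship_t[-1], acft_t[-1])
    t = np.arange(t0, t1, dt)
    ship = np.column_stack([np.interp(t, ship_t, ship_xyz[:, i]) for i in range(ship_xyz.shape[1])])
    acft = np.column_stack([np.interp(t, acft_t, acft_xyz[:, i]) for i in range(acft_xyz.shape[1])])
    return t, ship, acft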
The actual ship/aircraft information will be provided by the government as de-identified data in an open format such as CSV or XML. The output data must be demonstrated with animation software that shows all facets of the complete landing scene. The vendor is responsible for providing all models, scenery, and hardware, and may determine the animation hardware and software to be used. The host computer system must be compatible with PC/Microsoft Windows systems. The output data format must be open and nonproprietary and must be exportable in <time><value> format using CSV, XML, or other open data conventions (an illustrative export sketch appears at the end of this topic). Deliverables include a demonstration of a visualization session driven by data produced by the algorithm, a demonstration of the algorithm’s ability to create an open exported data file on the host system, an analysis of input data quality with recommendations for improvement, and the source code of the algorithm.

PHASE I: Determine the feasibility of accurately representing high-fidelity ship/aircraft animation using to-be-developed data-compensation algorithms and currently available PC animation technology. Define a development plan for the project.

PHASE II: Develop, demonstrate, and validate a prototype data-compensation algorithm and ship/aircraft animation process.

PHASE III: Build and demonstrate the ship/aircraft animation data-compensation algorithm and software, analyze the input data, and deliver the source code. Transition the technology to applicable platforms.

PRIVATE SECTOR COMMERCIAL POTENTIAL/DUAL-USE APPLICATIONS: The system developed under this SBIR can be used for real-time monitoring and management of large shipping vessels in harbors throughout the world. The resultant technology can also be applied to accurately represent sea hazards in shipping lanes. Other possible uses include depicting helicopters landing on commercial ships and oil rigs.

KEYWORDS: Global Positioning System (GPS); Animation; Algorithm; Embarked Landing; Military Flight Operations Quality Assurance (MFOQA); Debriefing
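For illustration only, and referenced in the output-format requirement above, the short sketch below writes the merged stream from the earlier example to a CSV file in the <time><value> convention. The column names are placeholder assumptions, not a mandated schema.

import csv

def export_merged_csv(path, t, ship, acft):
    """Write one row per animation frame: a time stamp followed by ship and aircraft values.
    Column names are illustrative placeholders only."""
    header = ["time_s",
              "ship_east_m", "ship_north_m", "ship_up_m",
              "acft_east_m", "acft_north_m", "acft_up_m"]
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(header)
        for i, ti in enumerate(t):
            writer.writerow([round(float(ti), 3), *ship[i], *acft[i]])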