Enhanced Visualization for Situational Understanding
Navy SBIR 2019.1 - Topic N191-017
NAVSEA - Mr. Dean Putnam - [email protected]
Opens: January 8, 2019 - Closes: February 6, 2019 (8:00 PM ET)

N191-017

TITLE: Enhanced Visualization for Situational Understanding

 

TECHNOLOGY AREA(S): Battlespace, Electronics, Sensors

ACQUISITION PROGRAM: PEO IWS 1.0, FNC - Operator Planning Tool

The technology within this topic is restricted under the International Traffic in Arms Regulation (ITAR), 22 CFR Parts 120-130, which controls the export and import of defense-related material and services, including export of sensitive technical data, or the Export Administration Regulation (EAR), 15 CFR Parts 730-774, which controls dual use items. Offerors must disclose any proposed use of foreign nationals (FNs), their country(ies) of origin, the type of visa or work permit possessed, and the statement of work (SOW) tasks intended for accomplishment by the FN(s) in accordance with section 3.5 of the Announcement. Offerors are advised foreign nationals proposed to perform on this topic may be restricted due to the technical data under US Export Control Laws.

OBJECTIVE: Develop an automated three-dimensional (3-D) enemy Course of Action (COA) application that uses five-dimensional (5-D) representations, variable in both time and space, to provide real-time situational visualization and greater understanding for complex missions.

DESCRIPTION: The Surface Navy currently has no automated, collaborative tools for the analysis of potential Course of Action (COA) Tactical Decision Aids (TDAs). Surface ships now use a set of disaggregated software decision support aids that do not support collaboration, offer only two-dimensional (2-D) visualization, and must be aggregated by the operator to be useful. Current decision support tools are fixed in time, so operators must replicate calculations and combine results to assess performance over time. Visual graphical animations allow trends to be spotted and evaluated more quickly than tabular or static 2-D presentation formats. Static 2-D representations have proven effective in coordinating unit support and assigning roles, tasks, and actions within the maritime, air, and land mission domains; however, they are limited in their ability to visually represent multi-unit or multi-domain temporal coordination. Although many three-dimensional (3-D) situational visualization representations exist, consensus for and widespread adoption of 3-D situational visualization have not been achieved. This often stems from the lack of machine-readable data, inaccuracies in the modeling, and sub-optimal estimates and visualization of friendly and enemy COAs. The condition is particularly serious with respect to known, and known but unaccounted-for, threats within the Integrated Air and Missile Defense (IAMD) domain. To address it, enhanced situational visualization that achieves greater understanding is needed. The solution must provide a 3-D Graphical User Interface (GUI) that adds the ability to move forward and backward in time and space (known as 5-D) to the current COA analysis tool under development, enhancing the capabilities of the Battle Management Aid being developed for Surface Navy planning. Leveraging the foundation of an established COA generation tool with 3-D graphic visualization will deliver a COA representation with the added dimension of time variability.

One of the key challenges of big data is taking enormous amounts of information and turning it into something useful that can be consumed by the human brain. Although there are no defined limitations on hardware and software for use aboard ships, Navy resource sponsors are seeking to reduce lifecycle costs to support Fleet capability by developing hardware-agnostic software and by employing software standards that facilitate updates without significant cost. Neuroscientists at Florida Atlantic University (FAU) report that they have developed a new type of visualization, a five-dimensional (5-D) colorimetric model, to help visualize data across space and time, and some 5-D visualizations are already being used in the medical and construction industries. The 5-D colorimetric technique graphs spatiotemporal data (data that includes both space and time) in a way that had not previously been achieved. Before this model, spatiotemporal problems were analyzed either from a spatial perspective (for instance, a map of gas prices in July 2013) or from a time-based approach (the evolution of gas prices in one county over time), but not simultaneously from both perspectives.
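To make the colorimetric idea concrete, a minimal sketch follows (Python with NumPy and Matplotlib, chosen only for illustration). It plots a synthetic track in three spatial dimensions, scales marker size with time, and encodes an assumed range-dependent detection probability as color, so a single view carries both spatial and temporal information. The track geometry, the notional sensor at the origin, and the detection-probability model are illustrative assumptions, not data or algorithms from any Navy system.

# Minimal 5-D colorimetric sketch: three spatial axes, time as marker size,
# and a scalar value (an assumed detection probability) as color.
# All values below are synthetic and purely illustrative.
import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D  # noqa: F401 (registers the 3-D projection on older Matplotlib)

rng = np.random.default_rng(0)

# Synthetic track: n samples of (x, y, z) position over time t (seconds).
n = 500
t = np.linspace(0.0, 600.0, n)
x = 50.0 * np.cos(t / 60.0) + rng.normal(0.0, 1.0, n)
y = 50.0 * np.sin(t / 60.0) + rng.normal(0.0, 1.0, n)
z = 10.0 + 0.02 * t + rng.normal(0.0, 0.5, n)

# Assumed scalar field to encode as color: detection probability that
# decays with range from a notional sensor at the origin.
r = np.sqrt(x**2 + y**2 + z**2)
p_detect = np.exp(-r / 60.0)

fig = plt.figure(figsize=(7, 6))
ax = fig.add_subplot(111, projection="3d")

# Space on the axes, time as marker size, scalar value as color:
# a single static image still conveys the spatiotemporal evolution.
sc = ax.scatter(x, y, z, c=p_detect, s=5 + 20 * (t / t.max()), cmap="viridis")
fig.colorbar(sc, ax=ax, label="assumed detection probability")
ax.set_xlabel("x")
ax.set_ylabel("y")
ax.set_zlabel("altitude")
ax.set_title("Spatiotemporal track with colorimetric encoding (illustrative)")
plt.show()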

The Navy seeks an automated COA capability that improves situational understanding through the use of visualization. New approaches are needed that employ enhanced visualization methods and present dynamic, real-time, temporally accurate visualizations of friendly and threat capabilities within a region of interest. Leading-edge technologies in medicine and construction design are beginning to utilize 5-D representations intended to help visualize data across space and time. The technology is not widespread in commercial application because few industries need both a time and a space dimension to plan. Using 3-D situational visualization on optimal, multi-domain, animated COA estimates will provide automated predictive COAs and improve data analysis and situational understanding. Adding the dimension of time, the ability to slide forward and backward across space and time with 3-D representations, is the added capability of a 5-D system. This will provide the improved capability for temporal coordination of tactical performance envelopes that is needed. The new capability will utilize methods that apply structured data, generate estimates, and graphically display dynamic temporal opportunities and vulnerabilities that are relevant within the tactical context. 5-D representations that spatially and temporally adapt to indicate dynamic parameters (1st dimension), distributed sensor and weapon coverage areas (2nd-4th dimensions), and, more importantly, convey an estimate of the reaction or decision time (5th dimension) are highly desirable for understanding and addressing multi-domain tactical mission planning. Just as a tactical heads-up display (HUD) overlays critical information on a 3-D view (often of the tactical area or target) to provide increased situational awareness and reduced decision and reaction times, COA-based visualization has proven effective in conveying temporal orientation and increasing "tactical cognitive" performance. As the dynamic information state factors (e.g., analysis insights, atmospheric propagation, and detection ranges) change, the new capability will clarify their impact to the unit commander. 3-D visualization with 5-D animation will enhance operator comprehension of options and contribute to mission planning optimization.
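As a rough sketch of how a time slider could drive such a display, the following Python data model (hypothetical class and field names, not drawn from any fielded planning tool) stores time-indexed track samples with coverage radii and estimated decision times and interpolates the scene state at any requested time; this is the query a 3-D GUI would issue as the operator scrubs forward or backward.

# Illustrative data model for a time-scrubbed COA view. Each track carries
# time-indexed samples; state_at(t) returns an interpolated snapshot that a
# 3-D GUI could render as the operator slides forward or backward in time.
# Class and field names are hypothetical assumptions for this sketch.
from bisect import bisect_left
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class TrackSample:
    t: float                          # time (s)
    pos: Tuple[float, float, float]   # x, y, z position
    coverage_radius: float            # sensor or weapon envelope at this instant
    decision_time: float              # estimated reaction/decision time remaining (s)


@dataclass
class Track:
    name: str
    samples: List[TrackSample]        # assumed sorted by t

    def state_at(self, t: float) -> TrackSample:
        """Linearly interpolate the track state at time t (clamped at the ends)."""
        times = [s.t for s in self.samples]
        i = bisect_left(times, t)
        if i == 0:
            return self.samples[0]
        if i >= len(self.samples):
            return self.samples[-1]
        a, b = self.samples[i - 1], self.samples[i]
        w = (t - a.t) / (b.t - a.t)

        def lerp(u: float, v: float) -> float:
            return u + w * (v - u)

        return TrackSample(
            t=t,
            pos=tuple(lerp(u, v) for u, v in zip(a.pos, b.pos)),
            coverage_radius=lerp(a.coverage_radius, b.coverage_radius),
            decision_time=lerp(a.decision_time, b.decision_time),
        )


# A GUI time slider would simply call state_at(t) for every track as the user
# scrubs, re-rendering coverage volumes and decision-time cues at each step.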

The Phase II effort will likely require secure access, and NAVSEA will process the DD254 to support contractor personnel and facility certification for secure access. The Phase I effort will not require access to classified information. If need be, data of the same level of complexity as secured data will be provided to support Phase I work.

Work produced in Phase II may become classified. Note: The prospective contractor(s) must be U.S. Owned and Operated with no Foreign Influence as defined by DOD 5220.22-M, National Industrial Security Program Operating Manual, unless acceptable mitigating procedures can be and have been implemented and approved by the Defense Security Service (DSS). The selected contractor and/or subcontractor must be able to acquire and maintain a secret-level facility clearance and Personnel Security Clearances in order to perform on advanced phases of this contract, as set forth by DSS and NAVSEA, and to gain access to classified information pertaining to the national defense of the United States and its allies; this will be an inherent requirement. The selected company will be required to safeguard classified material IAW DoD 5220.22-M during the advanced phases of this contract.

PHASE I: Develop an initial concept design for an automated 3-D COA solution that presents 5-D animation of COA. Prove the feasibility of the concept, through modeling and simulation, to meet the capabilities described in the Description. Develop a Phase II plan. The Phase I Option, if exercised, will include the initial design specifications and capabilities description to build a prototype solution in Phase II.

PHASE II: Develop a prototype automated COA solution that presents 5-D animation of COA for ease of comprehension that meets the parameters described in the Description. Evaluate the prototype to ensure it improves situational visualization and situational understanding within an IAMD context.

It is probable that the work under this effort will be classified under Phase II (see Description section for details).

PHASE III DUAL USE APPLICATIONS: Support the Navy in transitioning the 3-D COA tool with 5-D animation. It has two likely destinations, supporting the dual paths the Navy is exploring for Battle Management Aid placement. The new COA tool will be used as a web application service in the Navy's Maritime Tactical Command and Control (MTC2) network and will need to be compliant with the software interface requirements for web applications as mentioned in the Description. Support the Navy in transitioning the technology to Navy use within the Aegis Weapon System (AWS in Advanced Capability Build (ACB) 20 or higher) as part of an Integrated AWS planner. Refine the prototype for integration into the current AWS operational planning tools. Test and refine the prototype design for the appropriate interfaces with other Navy systems and to comply with information security requirements.

The developed technology should be broadly applicable to live testing of manned and unmanned systems and to simulations in which users need COA planning and updates to the plan as time progresses. Dual-use applications are numerous; almost any analyst seeking to combine spatial and temporal data in a single display could use this technology.

REFERENCES:

1. Duffie Jr., Warren. "Virtual Victories: Marines Sharpen Skills with New Virtual-Reality Games." Office of Naval Research, 17 May 2017. http://www.navy.mil/submit/display.asp?story_id=100513

2. Stilman, B. "Discovering the Discovery of the No-Search Approach." Int. J. of Machine Learning and Cybernetics, Springer, 2012, p. 27 (printed 2014, Vol. 5, No. 2, pp. 165-191). https://link.springer.com/article/10.1007/s13042-012-0127-3

3. Stilman, B., Yakhnis, V., and Umanskiy, O. "Chapter 3.3. Strategies in Large Scale Problems." Adversarial Reasoning: Computational Approaches to Reading the Opponent's Mind, Ed. by A. Kott (DARPA) and W. McEneaney (UC-San Diego), Chapman & Hall/CRC, 2007, pp. 251-285. https://www.researchgate.net/publication/267079439_Adversarial_reasoning_Computational_approaches_to_reading_the_opponent%27s_mind

4. Stilman, B. "Linguistic Geometry: From Search to Construction." Kluwer (now Springer), 2000, 416 pp. https://www.springer.com/us/book/9780792377382

KEYWORDS: 3-D Visualization; Tactical Decision Aid; 5-D Animation; Mission Planning Optimization; Course of Action; Enhance Operator Comprehension

 

** TOPIC NOTICE **

These Navy topics are part of the overall DoD 2019.1 SBIR BAA. The DoD issued its 2019.1 SBIR BAA pre-release on November 28, 2018; the BAA opens to receive proposals on January 8, 2019, and closes February 6, 2019 at 8:00 PM ET.

Between November 28, 2018 and January 7, 2019, you may communicate directly with the Topic Authors (TPOC) to ask technical questions about the topics. During these dates, their contact information is listed above. For reasons of competitive fairness, direct communication between proposers and topic authors is not allowed starting January 8, 2019, when DoD begins accepting proposals for this BAA. However, until January 23, 2019, proposers may still submit written questions about solicitation topics through the DoD's SBIR/STTR Interactive Topic Information System (SITIS), in which the questioner and respondent remain anonymous and all questions and answers are posted electronically for general viewing until the solicitation closes. All proposers are advised to monitor SITIS during the open BAA period for questions and answers and other significant information relevant to their SBIR/STTR topics of interest.

Topics Search Engine: Visit the DoD Topic Search Tool at www.defensesbirsttr.mil/topics/ to find topics by keyword across all DoD Components participating in this BAA.

Proposal Submission: All SBIR/STTR Proposals must be submitted electronically through the DoD SBIR/STTR Electronic Submission Website, as described in the Proposal Preparation and Submission of Proposal sections of the program Announcement.

Help: If you have general questions about the DoD SBIR program, please contact the DoD SBIR Help Desk at 800-348-0787 or via email at [email protected]