Improved Electronics Maintenance through Tester Prognostics
Navy SBIR 2013.2 - Topic N132-091
NAVAIR - Ms. Donna Moore - [email protected]
Opens: May 24, 2013 - Closes: June 26, 2013

N132-091 TITLE: Improved Electronics Maintenance through Tester Prognostics

TECHNOLOGY AREAS: Information Systems

ACQUISITION PROGRAM: PMA 260

OBJECTIVE: Develop innovative tools and processes required to leverage electronics prognostics and health management (ePHM) in Navy Automatic Test System (ATS) environments to support electronics maintenance.

DESCRIPTION: Within the aviation arena, virtually all electronic Units Under Test (UUTs) are tested on Automatic Test Systems (ATS). Improving the overall maintenance of UUTs improves the operational availability and readiness of the weapon systems they support. Current approaches to prognostics have focused on physics-based and data-driven models. These approaches require extensive data about the UUT, some of which can currently be provided by the UUT itself. However, it has become evident that supplementing the UUT-provided data with Automatic Test Equipment (ATE) test results will greatly enhance the analyses. Combining UUT and ATE data to support Prognostics and Health Management (PHM) will require a system that can manage data across multiple networked ATSs and combine data and results among multiple maintenance organizations. To ensure consistency of approaches and tools, industry standards related to test results, maintenance history, diagnostic session data, and the Automatic Test Markup Language (ATML) need to be incorporated, and possibly enhanced. Therefore, participation with industry standards organizations in developing and/or applying such standards to electronics PHM will be necessary.
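
As a purely illustrative sketch (the class and field names below are assumptions, not program requirements), the combined UUT/ATE data described above could be captured in a Python record that tags each maintenance event with its organization and tester so records can be aggregated across networked ATSs and maintenance organizations:

"""Minimal sketch (hypothetical names) of a combined record merging UUT
Built-In-Test data with ATE test results for cross-organization aggregation."""
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional


@dataclass
class BitEvent:
    """A Built-In-Test indication reported by the UUT on-system."""
    code: str                  # BIT fault/advisory code
    timestamp: datetime
    flight_hours: float


@dataclass
class AteMeasurement:
    """One measured parameter from an ATE Test Program Set (TPS) log."""
    test_id: str
    value: float
    low_limit: float
    high_limit: float
    passed: bool


@dataclass
class MaintenanceRecord:
    """UUT and ATE data for one maintenance event, tagged by organization and tester."""
    uut_part_number: str
    uut_serial_number: str
    organization: str          # e.g., the maintenance activity that ran the test
    tester_id: str             # which member of the DoD family of testers
    bit_events: List[BitEvent] = field(default_factory=list)
    ate_results: List[AteMeasurement] = field(default_factory=list)

    def margin(self, test_id: str) -> Optional[float]:
        """Normalized distance of a passing measurement from its nearest limit;
        shrinking margins across successive records are a simple degradation cue."""
        for m in self.ate_results:
            if m.test_id == test_id and m.passed:
                span = m.high_limit - m.low_limit
                if span <= 0:
                    return None
                return min(m.value - m.low_limit, m.high_limit - m.value) / span
        return None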

To achieve these objectives, a set of tools is sought that applies domain ontologies and system-level approaches to health monitoring, diagnostics, and prognostics. These tools should build upon previous successes in the electronics prognostics domain by managing and combining test and diagnostics data from the UUTs themselves (e.g., Built-In Test and on-system diagnostics) with ATE test results (e.g., Test Program Set test logs), and by aggregating this information across the spectrum of UUTs and ATE in the Navy to enhance the prognostics results. The tools developed will implement a process for model maturation based on historical data, with methodologies and supporting algorithms to analyze the models. Development of the tools for real-time monitoring should also be considered. Finally, a mechanism for creating and utilizing a library of degraded-component models, along with environmental effects, to allow simulation of system degradation should be considered.
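
The model-maturation step described above can be illustrated with a minimal, hypothetical example: a per-parameter degradation trend that is re-fit whenever new historical test results arrive and that yields a remaining-useful-life estimate. The model form, parameter names, and data values below are assumptions for illustration only; a fielded tool would draw model forms from the degraded-component library and account for environmental effects.

"""Illustrative sketch (hypothetical names and data) of model maturation:
a per-parameter degradation trend re-fit as historical results accumulate."""
import numpy as np
from typing import List, Optional


class DegradationModel:
    """Linear trend of test margin versus operating hours for one UUT parameter."""

    def __init__(self, test_id: str):
        self.test_id = test_id
        self.hours: List[float] = []
        self.margins: List[float] = []
        self.slope = 0.0
        self.intercept = 1.0

    def mature(self, operating_hours: float, margin: float) -> None:
        """Fold one new historical observation into the model and re-fit."""
        self.hours.append(operating_hours)
        self.margins.append(margin)
        if len(self.hours) >= 2:
            self.slope, self.intercept = np.polyfit(self.hours, self.margins, 1)

    def rul_hours(self, current_hours: float) -> Optional[float]:
        """Hours until the fitted margin trend reaches zero (the test limit)."""
        if self.slope >= 0:            # no degradation trend evident yet
            return None
        zero_crossing = -self.intercept / self.slope
        return max(zero_crossing - current_hours, 0.0)


# Example: three successive ATE visits show the margin eroding.
model = DegradationModel("PS1_VOLTAGE")
for hrs, margin in [(100, 0.45), (400, 0.38), (900, 0.27)]:
    model.mature(hrs, margin)
print(f"Estimated RUL: {model.rul_hours(900):.0f} operating hours")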

The proposed system needs to draw upon existing IEEE SCC20 and other industry standards while identifying ways to enhance these standards to support the PHM objectives outlined here. If necessary, new standards can also be recommended. Finally, the approach needs to consider the various ATS and processes used across the Navy and DoD so that the resulting solution can be applied to a broad spectrum of electronics maintenance activities.
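
The SIMICA and ATML standards cited in the references exchange test results and related maintenance information as XML. The fragment below is a loose, hypothetical illustration only; its element and attribute names are not drawn from the IEEE 1636.1 or 1671 schemas. It sketches how a harvesting step might extract the measured values and limits a PHM aggregator would need.

"""Hypothetical, simplified XML test log (NOT schema-conformant ATML/SIMICA)
and a parser that pulls out the fields a PHM aggregator would use."""
import xml.etree.ElementTree as ET
from typing import Dict, List

SAMPLE_LOG = """
<TestLog uut="123-4567" serial="A0042" tester="TESTER-01">
  <Test id="PS1_VOLTAGE" outcome="Passed">
    <Value unit="V">27.4</Value>
    <LowLimit unit="V">26.0</LowLimit>
    <HighLimit unit="V">29.0</HighLimit>
  </Test>
</TestLog>
"""


def parse_log(xml_text: str) -> List[Dict]:
    """Return one dict per measured test, ready to combine with UUT BIT history."""
    root = ET.fromstring(xml_text)
    records = []
    for test in root.findall("Test"):
        records.append({
            "uut": root.get("uut"),
            "serial": root.get("serial"),
            "tester": root.get("tester"),
            "test_id": test.get("id"),
            "passed": test.get("outcome") == "Passed",
            "value": float(test.findtext("Value")),
            "low": float(test.findtext("LowLimit")),
            "high": float(test.findtext("HighLimit")),
        })
    return records


print(parse_log(SAMPLE_LOG))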

PHASE I: Develop and demonstrate a proof-of-concept system for one member of the DoD family of testers. The proof of concept should focus on information obtained from UUTs and ATE test results.

PHASE II: Fully develop the system from Phase I into a usable analysis tool. Evaluate and demonstrate the prototype tool using one member of the DoD family of testers. Implement a process for model maturation based on historical data. Continue to refine analysis methodologies and supporting algorithms, including development for real-time monitoring. Ensure the approaches and tools are consistent with industry standards.

PHASE III: Refine and deliver algorithms and a tool for a generic PHM capability, suitable for use on general-purpose ATS across the DoD. Transition the technology to appropriate test platforms.

PRIVATE SECTOR COMMERCIAL POTENTIAL/DUAL-USE APPLICATIONS: Industries involved in large-scale, complex-system maintenance, such as the automotive, shipping, space, and aviation industries, could benefit from this capability.

REFERENCES:
1. Sheppard, J., Wilmering, T., & Kaufman, M. (2008). IEEE Standards for Prognostics and Health Management. IEEE AUTOTESTCON 2008 Conference Record, Salt Lake City, UT, pp. 97-103. Digital Object Identifier: 10.1109/AUTEST.2008.4662592

2. IEEE Std 1232-2010, Standard for Artificial Intelligence Exchange and Service Tie to All Test Environments (AI-ESTATE), Piscataway, NJ: IEEE Standards Association Press, 2010. Digital Object Identifier: 10.1109/IEEESTD.2011.5743076

3. IEEE P1636.1, Draft IEEE Standard Software Interface for Maintenance Information Collection and Analysis (SIMICA): Test Results and Session Information, Draft 1.0, Piscataway, NJ: IEEE Standards Association Press, 2011. http://standards.ieee.org/develop/project/1636.1.html

4. IEEE Std 1636.2-2010, IEEE Trial Use Standard Software Interface for Maintenance Information Collection and Analysis (SIMICA): Maintenance Action Information, Piscataway, NJ: IEEE Standards Association Press, 2010. Digital Object Identifier: 10.1109/IEEESTD.2010.5648434

5. IEEE Std 1671-2010, IEEE Standard for Automatic Test Markup Language (ATML) for Exchanging Automatic Test Equipment and Test Information via XML, Piscataway, NJ: IEEE Standards Association Press, 2010. Digital Object Identifier: 10.1109/IEEESTD.2011.5706290

6. Wilmering, T., & Sheppard, J. (2007). Ontologies for Data Mining and Knowledge Discovery to Support Diagnostic Maturation. Proceedings of the 18th International Workshop on Principles of Diagnosis (DX-07), Nashville, TN, pp. 210-217. http://w3.isis.vanderbilt.edu/dx07/Proceedings/Proceedings.pdf

7. IEEE SCC20 (2012), IEEE Standards Coordinating Committee 20 on Test and Diagnosis for Electronic Systems. http://grouper.ieee.org/groups/scc20/

KEYWORDS: Ontology, Diagnostics, Prognostics and Health Management, Automatic Test System, Maturation, Electronics Maintenance

** TOPIC AUTHOR (TPOC) **
DoD Notice:  
Between April 24 and May 24, 2013, you may talk directly with the Topic Authors (TPOC) to ask technical questions about the topics. Their contact information is listed above. For reasons of competitive fairness, direct communication between proposers and topic authors is not allowed starting May 24, 2013, when DoD begins accepting proposals for this solicitation.
However, proposers may still submit written questions about solicitation topics through the DoD's SBIR/STTR Interactive Topic Information System (SITIS), in which the questioner and respondent remain anonymous and all questions and answers are posted electronically for general viewing until the solicitation closes. All proposers are advised to monitor SITIS (13.2 Q&A) during the solicitation period for questions, answers, and other significant information relevant to the SBIR 13.2 topic under which they are proposing.

If you have general questions about the DoD SBIR program, please contact the DoD SBIR Help Desk at (866) 724-7457 or by email.