Automated Test Program Set Analysis for Maintenance Data Metrics Generation
Navy SBIR 2015.1 - Topic N151-014
NAVAIR - Ms. Donna Moore - [email protected]
Opens: January 15, 2015 - Closes: February 25, 2015 6:00am ET

N151-014 TITLE: Automated Test Program Set Analysis for Maintenance Data Metrics Generation

TECHNOLOGY AREAS: Information Systems, Materials/Processes, Electronics

ACQUISITION PROGRAM: PMA 260

OBJECTIVE: Develop a novel method for extracting usage metrics from test program set (TPS) source code and automated test equipment (ATE) logs.

DESCRIPTION: The Consolidated Automated Support System (CASS) family of testers currently hosts more than 1,500 TPSs in support of the testing and repair of avionics and weapon system units under test, spanning numerous aircraft platforms. Several hundred additional TPSs are also slated for development. This has resulted in a large pool of TPS code and associated data, stored in the Navy's Automatic Test System (ATS) Source Data Repository.

This data is viewed as an untapped resource to aid in ATS planning and support. The ability to relate test instrument capabilities to TPS source data and ATS usage data would provide a comprehensive look at how avionics maintenance is performed. Data mining across this comprehensive data set could expose run-time inefficiencies and under- or over-utilized test equipment (or specific capability ranges within a piece of equipment), significantly benefiting the selection of new ATS components during replacements and upgrades. Broad questions could be answered about ATS component capabilities, covering not only how frequently they are used but also in what manner. Additionally, such an analysis could identify economic targets of opportunity for the deployment of new and innovative test techniques.
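
As a rough illustration of the kind of metrics aggregation described above, the Python sketch below collapses parsed ATE log records into a per-instrument, per-parameter usage envelope (observed minimum, maximum, and sample count). The record fields (tps_id, instrument, parameter, value) and the flat record layout are assumptions made for illustration only; actual CASS log formats would drive the real design.

    """Illustrative sketch only: aggregating instrument-usage metrics from
    parsed ATE log records. The record fields and thresholds are
    assumptions, not part of the topic requirements."""
    from collections import defaultdict
    from dataclasses import dataclass

    @dataclass
    class LogRecord:
        tps_id: str       # which TPS produced the record (hypothetical field)
        instrument: str   # a stimulus or measurement asset
        parameter: str    # e.g. "frequency_hz", "voltage_v"
        value: float      # the setting actually applied at run time

    def usage_envelope(records):
        """Collapse log records into a per-(instrument, parameter) envelope:
        observed min, max, and sample count."""
        stats = defaultdict(lambda: {"min": float("inf"),
                                     "max": float("-inf"),
                                     "count": 0})
        for r in records:
            s = stats[(r.instrument, r.parameter)]
            s["min"] = min(s["min"], r.value)
            s["max"] = max(s["max"], r.value)
            s["count"] += 1
        return dict(stats)

    # Hypothetical usage with two log records from the same TPS:
    records = [
        LogRecord("TPS-0001", "synthesizer", "frequency_hz", 1.0e6),
        LogRecord("TPS-0001", "synthesizer", "frequency_hz", 2.5e9),
    ]
    print(usage_envelope(records))

Comparing envelopes of this kind against an instrument's specified capability range is one way the under- and over-utilized capability ranges mentioned above could be surfaced.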

Complexities in the execution of TPSs present frequent challenges to the analysis of these data sets. TPS instrument settings can be variable rather than hard coded; these variables are often set procedurally, but at other times they come from manual input by the ATS user. The product should be capable of resolving TPS variables regardless of their dependencies. Developing such a capability poses a technical challenge that is part test simulation and part data mining/analysis. Once every TPS can be simulated and its results archived, a total envelope of all ATS instrument usage can be generated.
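
One plausible way to handle variables that are sometimes set procedurally and sometimes entered by the operator is interval tracking: a procedural assignment contributes a point value, a manual input contributes its entire permitted range, and values seen across simulated branches are folded together by widening. The sketch below is a minimal, hypothetical illustration of that idea, not a prescribed approach; all identifiers are invented.

    """Illustrative sketch only: tracking the range of a variable instrument
    setting during a simulated TPS pass. All names are hypothetical."""

    class Interval:
        """Closed interval [lo, hi] used as the abstract value of a TPS variable."""
        def __init__(self, lo, hi):
            self.lo, self.hi = lo, hi

        def union(self, other):
            # Widen to cover both intervals. Over-approximation is acceptable:
            # the goal is a total usage envelope, not an exact execution trace.
            return Interval(min(self.lo, other.lo), max(self.hi, other.hi))

        def __repr__(self):
            return f"[{self.lo}, {self.hi}]"

    def simulate_assignments(assignments):
        """Fold every assignment seen across simulated branches into one
        envelope per variable name."""
        env = {}
        for name, interval in assignments:
            env[name] = env[name].union(interval) if name in env else interval
        return env

    # Procedural assignment -> degenerate (point) interval;
    # manual operator input -> the full range the prompt allows.
    env = simulate_assignments([
        ("stim_freq", Interval(1e6, 1e6)),    # set procedurally in one branch
        ("stim_freq", Interval(1e6, 18e9)),   # operator-entered elsewhere
    ])
    print(env)  # {'stim_freq': [1000000.0, 18000000000.0]}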

PHASE I: Define and develop a concept for the aggregation and analysis of ATE and TPS data. The concept must apply to PMA-260's CASS family of testers (CASS, Reconfigurable Transportable CASS, and electronic CASS [eCASS]); it may either provide a complete data-metrics-generation concept or address a single step in the aggregation and mining of such data.

PHASE II: Further develop the concept defined in Phase I. Demonstrate the ability to simulate TPSs while recording the values of any variable instrument settings, continuing until a comprehensive set of parameters is defined for each variable. Verify these parameters against log files from actual TPS runs on CASS.
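
A minimal sketch of that verification step follows, under the assumption that simulation yields a (lo, hi) envelope per variable and that log files can be parsed into (variable, value) pairs; any logged value falling outside its simulated envelope would flag either a simulation gap or a log anomaly. All names here are hypothetical.

    """Illustrative sketch only: cross-checking simulated parameter envelopes
    against values observed in actual CASS run logs."""

    def verify(envelopes, observed):
        """envelopes: {variable: (lo, hi)} produced by simulation.
        observed:  iterable of (variable, value) pairs parsed from log files.
        Returns the list of out-of-envelope observations."""
        misses = []
        for name, value in observed:
            lo, hi = envelopes.get(name, (float("-inf"), float("inf")))
            if not lo <= value <= hi:
                misses.append((name, value, (lo, hi)))
        return misses

    # Hypothetical check: one logged value escapes its simulated envelope.
    print(verify({"stim_freq": (1e6, 18e9)},
                 [("stim_freq", 2.4e9), ("stim_freq", 26.5e9)]))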

PHASE III: Complete testing and transition technology to PMA-260 ATE and TPS development and acquisition support processes or appropriate platforms and users.

PRIVATE SECTOR COMMERCIAL POTENTIAL/DUAL-USE APPLICATIONS: The testing of complex electronic assemblies is not limited to Navy ATS. The product concept developed through this SBIR topic would be readily applicable to other Department of Defense (DoD) ATS, with potential further applications to commercial avionics test equipment and to non-avionics electronics testing in the private sector.

REFERENCES:
1. Sparr, C., & Dusch, K. (2010, September). Prioritizing parallel analog re-host candidates through ATLAS source code analysis. Paper presented at IEEE AUTOTESTCON 2010, Orlando, FL. doi:10.1109/AUTEST.2010.5613562

2. United States DoD. (2004). DoD automatic test systems handbook. Retrieved from http://www.acq.osd.mil/ats/DoD_ATS_Handbook_2004.pdf

3. United States Navy. (2002, January). Performance specification: test program sets (MIL-PRF-32070A). Retrieved from http://www.acq.osd.mil/ats/MIL-PRF-32070A.pdf

KEYWORDS: Data Mining; ATE; TPS; ATS; Metrics; Avionics Maintenance

** TOPIC AUTHOR (TPOC) **
DoD Notice:  
Between December 12, 2014 and January 14, 2015 you may talk directly with the Topic Authors (TPOC) to ask technical questions about the topics. For reasons of competitive fairness, direct communication between proposers and topic authors is not allowed starting January 15, 2015, when DoD begins accepting proposals for this solicitation.
However, proposers may still submit written questions about solicitation topics through the DoD's SBIR/STTR Interactive Topic Information System (SITIS), in which the questioner and respondent remain anonymous and all questions and answers are posted electronically for general viewing until the solicitation closes. All proposers are advised to monitor SITIS (15.1 Q&A) during the solicitation period for questions and answers, and other significant information, relevant to the SBIR 15.1 topic under which they are proposing.

If you have general questions about the DoD SBIR program, please contact the DoD SBIR Help Desk at (866) 724-7457 or via its webmail link.


Official DoD SBIR FY-2015.1 Solicitation Site:
www.acq.osd.mil/osbp/sbir/solicitations/sbir20151/index.shtml