Fooling Computer Vision Classifiers with Adversarial Examples
Navy SBIR 2018.2 - Topic N182-127
ONR - Ms. Lore-Anne Ponirakis - [email protected]
Opens: May 22, 2018 - Closes: June 20, 2018 (8:00 PM ET)

N182-127

TITLE: Fooling Computer Vision Classifiers with Adversarial Examples

 

TECHNOLOGY AREA(S): Information Systems, Sensors

ACQUISITION PROGRAM: Various NAVAIR drone programs committed to airborne computer vision (STUAS, BAMS); LOCUST INP

The technology within this topic is restricted under the International Traffic in Arms Regulation (ITAR), 22 CFR Parts 120-130, which controls the export and import of defense-related material and services, including export of sensitive technical data, or the Export Administration Regulation (EAR), 15 CFR Parts 730-774, which controls dual use items. Offerors must disclose any proposed use of foreign nationals (FNs), their country(ies) of origin, the type of visa or work permit possessed, and the statement of work (SOW) tasks intended for accomplishment by the FN(s) in accordance with section 3.5 of the Announcement. Offerors are advised foreign nationals proposed to perform on this topic may be restricted due to the technical data under US Export Control Laws.

OBJECTIVE: Identify effective techniques to change the appearance of objects and thereby cause computer vision classifiers to misclassify those objects. These techniques will modify the appearance of the physical objects themselves rather than images of those objects. A set of techniques will be identified and tested against multiple state-of-the-art classifiers using images from multiple viewpoints around the target objects.

DESCRIPTION: The goal of this SBIR topic is to better understand the mechanisms by which one can trick computer vision classifiers, in order to anticipate and counter enemy efforts at camouflaging objects and misguiding our classifiers. For example, a unit may want to confuse enemy surveillance into thinking that a Honda Civic is a tank, or vice versa. Deep neural networks are achieving near-certain recognition of familiar objects, even in images that might otherwise be unrecognizable to human eyes. Nevertheless, there are some interesting differences between the ways human vision classifies entities and the ways that classifiers do. Certain perturbations to images can cause otherwise highly effective classifiers to lose robustness on large-scale datasets. Similar changes to the objects themselves may also impact classifier performance and act as "computer vision camouflage."
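
For illustration only, the image-space perturbations described above can be generated with the fast gradient sign method of Reference 2. The following minimal sketch assumes a pretrained ImageNet ResNet-50 as a stand-in classifier and an arbitrarily chosen perturbation budget (epsilon); the model, library, and parameter values are illustrative assumptions, not requirements of this topic.

import torch
import torch.nn.functional as F
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# A pretrained ImageNet classifier stands in for "a state-of-the-art classifier".
model = models.resnet50(pretrained=True).eval()
to_tensor = T.Compose([T.Resize(256), T.CenterCrop(224), T.ToTensor()])
mean = torch.tensor([0.485, 0.456, 0.406]).view(1, 3, 1, 1)
std = torch.tensor([0.229, 0.224, 0.225]).view(1, 3, 1, 1)

def fgsm_perturb(image_path, epsilon=0.03):
    """Return a perturbed copy of an image plus the before/after top-1 class indices."""
    x = to_tensor(Image.open(image_path).convert("RGB")).unsqueeze(0)
    x.requires_grad_(True)
    logits = model((x - mean) / std)
    label = logits.argmax(dim=1)              # the classifier's current decision
    loss = F.cross_entropy(logits, label)
    loss.backward()
    # One signed-gradient step in pixel space, clipped back to valid pixel values.
    x_adv = (x + epsilon * x.grad.sign()).clamp(0.0, 1.0).detach()
    with torch.no_grad():
        adv_label = model((x_adv - mean) / std).argmax(dim=1)
    return x_adv, int(label), int(adv_label)

Physical-world variants of this idea (Reference 1) print or fabricate the perturbation so that the effect survives photography of the object itself, which is the regime this topic targets.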

An understanding of the methods available to fool computer vision classifiers would help the Navy improve the robustness of its own algorithms so that they properly classify objects and entities of interest. This SBIR topic will explore how state-of-the-art deep learning classifiers are affected by these perturbations, as well as what causes these classifiers to misclassify the target entities.

While this SBIR topic builds on extensive recent published research in this area, it differs in its focus on methods to change the appearance of objects vice changing images of those objects. Effective techniques should cause misclassification of images taken from a variety of angles (nadir to +/- 45 degrees) and distances (300 feet to space). Camouflage techniques should also be tested against multiple classifiers to determine whether they are generally applicable or only effective against certain classifiers. Ideally, target objects will have some military relevance (e.g., aircraft and vehicles), and algorithmic improvements will be proposed to counter the identified camouflage techniques.
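
As a notional sketch of the cross-classifier, multi-viewpoint testing described above, the code below scores a directory of photographs of one camouflaged object (taken from different angles and distances) against several independently trained classifiers. The directory layout, the specific torchvision models, and the fraction-of-viewpoints-fooled metric are assumptions for illustration; an actual effort would substitute its own classifiers, collection geometry, and scoring criteria.

from pathlib import Path

import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

# Several independently trained ImageNet classifiers, used to check whether a
# camouflage technique transfers or only defeats a single architecture.
classifiers = {
    "resnet50": models.resnet50(pretrained=True).eval(),
    "densenet121": models.densenet121(pretrained=True).eval(),
    "mobilenet_v2": models.mobilenet_v2(pretrained=True).eval(),
}

def evaluate_camouflage(image_dir, true_class):
    """image_dir holds photos of one camouflaged object from many angles/distances.
    Returns, per classifier, the fraction of viewpoints that are misclassified."""
    images = sorted(Path(image_dir).glob("*.jpg"))
    results = {}
    for name, net in classifiers.items():
        fooled = 0
        with torch.no_grad():
            for path in images:
                x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
                if net(x).argmax(dim=1).item() != true_class:
                    fooled += 1
        results[name] = fooled / max(len(images), 1)
    return results

Comparable misclassification rates across all of the models would suggest a generally applicable technique, while a high rate against only one model would indicate a classifier-specific weakness.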

The Phase I effort will focus on theory and method maturation. Phase II will develop physical classifier-fooling kits and algorithmic counters. Phase III will focus on transitioning the technologies to operational units and vehicles.

Work produced in Phase II may become classified. Note: The prospective contractor(s) must be U.S. owned and operated with no foreign influence as defined by DoD 5220.22-M, National Industrial Security Program Operating Manual, unless acceptable mitigating procedures can and have been implemented and approved by the Defense Security Service (DSS). The selected contractor and/or subcontractor must be able to acquire and maintain a secret level facility and Personnel Security Clearances, in order to perform on advanced phases of this project as set forth by DSS and the Office of Naval Research (ONR) in order to gain access to classified information pertaining to the national defense of the United States and its allies; this will be an inherent requirement. The selected company will be required to safeguard classified material IAW DoD 5220.22-M during the advanced phases of this contract.

PHASE I: Determine the feasibility of developing operationally relevant techniques for fooling computer vision classifiers. Conduct a detailed analysis of the literature and commercial capabilities. Assess which known image-modification techniques could be adapted to the modification of physical objects. Identify the target objects and computer vision classifiers to be used for testing the effectiveness of camouflage techniques. Produce Phase II plans with a technology roadmap, development milestones, and projected Phase II achievable performance.

PHASE II: Develop physical modifications for fooling computer vision classifiers and evaluate their effectiveness against the classifiers identified in Phase I. Attempt to identify root causes for the misclassifications. Propose and test improvements to the classifier algorithms to counter the physical modifications. Phase II deliverables will include five different physical modification kits for aircraft and vehicles for applying computer vision deception techniques, test results for the effectiveness of both physical modifications and algorithmic improvements, relevant source code for any algorithmic improvements, and a demonstration using a scenario of interest. The demonstration scenario would include analyzing a target object both before and after modification, and show that the classifier reliability shifts from one decision to another. The computer vision classifier to be used will be determined by the offeror.
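
A notional sketch of the before-and-after portion of such a demonstration appears below; it reports the top-1 class and softmax confidence for photographs of the target object prior to and following physical modification. The classifier shown (a pretrained ResNet-50) and the file names are placeholders; in practice the offeror-selected classifier and actual test imagery would be used.

import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Placeholder classifier; the offeror-selected classifier would be substituted here.
model = models.resnet50(pretrained=True).eval()
preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def top1(image_path):
    """Return (class_index, softmax_confidence) for a single photograph."""
    x = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        probs = model(x).softmax(dim=1)
    conf, idx = probs.max(dim=1)
    return int(idx), float(conf)

# Hypothetical file names for the unmodified and modified target object.
before = top1("target_unmodified.jpg")
after = top1("target_modified.jpg")
print(f"before: class {before[0]} at confidence {before[1]:.2f}; "
      f"after: class {after[0]} at confidence {after[1]:.2f}")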

It is probable that the work under this effort will be classified under Phase II (see Description section for details).

PHASE III DUAL USE APPLICATIONS: Transition the physical modification kits to operational units and vehicles. The proposer should provide a means for performance evaluation with metrics for analysis (e.g., effectiveness of physical modifications) and a method for operator assessment of product interactions (e.g., ease of use). Collaborate with private sector providers of computer vision products. The developed technology would also be relevant to the mixed reality gaming market, as it would allow environments seen through special glasses to be varied easily from game to game.

REFERENCES:

1. Kurakin, Alexey, Goodfellow, Ian J., and Bengio, Samy. "Adversarial examples in the physical world." Proceedings of the International Conference on Learning Representations (ICLR), 2017. https://arxiv.org/pdf/1607.02533.pdf

2. Goodfellow, Ian J., Shlens, Jonathon, and Szegedy, Christian. "Explaining and harnessing adversarial examples." Proceedings of the International Conference on Learning Representations (ICLR), 2015. https://arxiv.org/pdf/1412.6572.pdf

3. Carlini, Nicholas and Wagner, David. "Towards evaluating the robustness of neural networks." IEEE Symposium on Security & Privacy, 2017. https://arxiv.org/pdf/1608.04644.pdf

4. Nguyen, A., Yosinski, J., and Clune, J. "Deep Neural Networks are Easily Fooled: High Confidence Predictions for Unrecognizable Images." Computer Vision and Pattern Recognition (CVPR '15), IEEE, 2015. https://arxiv.org/pdf/1412.1897.pdf

5. Luo, Yan, Boix, Xavier, Roig, Gemma, Poggio, Tomaso, and Zhao, Qi. "Foveation-based mechanisms alleviate adversarial examples." arXiv preprint arXiv:1511.06292, 2016. https://arxiv.org/pdf/1511.06292.pdf

KEYWORDS: Computer Vision; Algorithm Warfare; Artificial Intelligence; Deception; Deep Learning; Decoys

 

** TOPIC NOTICE **

These Navy topics are part of the overall DoD 2018.2 SBIR BAA. The DoD issued its 2018.2 SBIR BAA pre-release on April 20, 2018; the BAA opens to receive proposals on May 22, 2018, and closes June 20, 2018 at 8:00 PM ET.

Between April 20, 2018 and May 21, 2018 you may talk directly with the Topic Authors (TPOC) to ask technical questions about the topics. During these dates, their contact information is listed above. For reasons of competitive fairness, direct communication between proposers and topic authors is not allowed starting May 22, 2018, when DoD begins accepting proposals for this BAA. However, until June 6, 2018, proposers may still submit written questions about solicitation topics through the DoD's SBIR/STTR Interactive Topic Information System (SITIS), in which the questioner and respondent remain anonymous and all questions and answers are posted electronically for general viewing until the solicitation closes. All proposers are advised to monitor SITIS during the open BAA period for questions and answers and other significant information relevant to their SBIR/STTR topics of interest.

Topics Search Engine: Visit the DoD Topic Search Tool at www.defensesbirsttr.mil/topics/ to find topics by keyword across all DoD Components participating in this BAA.

Proposal Submission: All SBIR/STTR Proposals must be submitted electronically through the DoD SBIR/STTR Electronic Submission Website, as described in the Proposal Preparation and Submission of Proposal sections of the program Announcement.

Help: If you have general questions about the DoD SBIR program, please contact the DoD SBIR Help Desk at 800-348-0787 or via email at [email protected]