Active Transfer Learning for Intelligent Tutoring
Navy STTR 2015.A - Topic N15A-T013
ONR - Ms. Lore-Anne Ponirakis - [email protected]
Opens: January 15, 2015 - Closes: February 25, 2015 6:00am ET

N15A-T013 TITLE: Active Transfer Learning for Intelligent Tutoring

TECHNOLOGY AREAS: Human Systems

ACQUISITION PROGRAM: STEM Grand Challenge and Personal Assistant for Life Long Learning - NETC

OBJECTIVE: Develop a novel framework for intelligent tutoring utilizing algorithms that transfer recognized performance on one task to separate but related task domains. The goal is to leverage skills across task domains to speed up skill acquisition on new tasks that require similar sub-skills from other task domains. The end result will be an algorithm that predicts performance across varying tasks (e.g., how does a strong foundation in introductory electricity influence the learning of diode circuits, microprocessors, or motors?).

DESCRIPTION: Develop efficient, reliable, and robust algorithms for intelligent tutoring that apply previous domain knowledge through crowd-sourcing analytics on separate but related tasks to predict performance and identify knowledge deficient areas on a new task.

Previous research on transfer learning in human populations (Ormrod, 2004) describes the extent to which past experiences (the transfer source) affect learning and performance in a new situation (the transfer target). Machine learning makes analogous predictions by estimating the knowledge gained while solving one problem and applying it to a different but related problem (Pan & Yang, 2010). While far transfer has been demonstrated in human populations, predicting it remains difficult with machine learning techniques.
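As an illustrative sketch only (none of this appears in the topic itself), one simple form of machine transfer learning is parameter transfer: a parameter learned on a data-rich source task serves as a prior that is blended with a few-shot estimate on the target task. The function names, the toy data, and the prior strength `lam` are all assumptions made for the example:

```python
def fit_slope(points):
    """Least-squares slope of y = w*x through the origin."""
    return sum(x * y for x, y in points) / sum(x * x for x, _ in points)

def transfer_fit(source_points, target_points, lam=2.0):
    """Blend a data-rich source estimate with a few-shot target estimate.

    lam acts as a prior strength: a higher lam trusts the source task more.
    """
    w_src = fit_slope(source_points)   # estimated on the transfer source
    w_tgt = fit_slope(target_points)   # few-shot estimate on the transfer target
    n = len(target_points)
    return (n * w_tgt + lam * w_src) / (n + lam)

# Source task: plentiful clean data with true slope 2.0.
source = [(x, 2.0 * x) for x in range(1, 10)]
# Target task: only two noisy observations of a related slope (about 2.1).
target = [(1, 2.6), (2, 4.2)]

few_shot = fit_slope(target)               # about 2.2: overshoots on noise
blended = transfer_fit(source, target)     # about 2.1: pulled toward the source
```

The blend illustrates the trade-off the topic describes: with very little target data, borrowing from a related source task gives a better estimate than the target data alone.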

To predict far transfer, an active component may be necessary. Here, crowd sourcing may help identify the amount and type of information that is relevant to transfer. Previous work on active transfer learning methods may be useful in developing an approach that identifies what amount and type of information is relevant, which crowd-sourced experts then verify. The algorithm, through interaction with a subject-matter expert (SME), would then transfer that knowledge to increase efficiency on the learning task.
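One way to make such a component active, sketched below under illustrative assumptions (a one-dimensional skill score and a simulated SME oracle, neither of which comes from the topic), is uncertainty sampling: the algorithm asks the SME to label only the item it is least sure about, then updates its decision boundary:

```python
def most_uncertain(pool, threshold):
    """Pick the unlabeled item closest to the current decision boundary."""
    return min(pool, key=lambda x: abs(x - threshold))

def update_threshold(labeled):
    """Place the boundary midway between the highest 'fail' and lowest 'pass'."""
    fails = [x for x, y in labeled if y == 0]
    passes = [x for x, y in labeled if y == 1]
    return (max(fails) + min(passes)) / 2.0

# Boundary transferred from a related source task.
threshold = 0.50
labeled = [(0.20, 0), (0.80, 1)]   # labels carried over from the source task
pool = [0.10, 0.45, 0.90]          # unlabeled items on the new task

query = most_uncertain(pool, threshold)  # 0.45 is the most informative query
labeled.append((query, 1))               # simulated SME answers "pass"
threshold = update_threshold(labeled)    # boundary shifts toward 0.325
```

The point of the sketch is the query-selection step: the SME's time is spent only on the item whose label the model cannot already infer.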

Performance on the skills composing one task should transfer to, and predict performance on, different skills across a variety of tasks. The primary benefit of this approach is that it should not require training on disparate tasks; rather, it should allow training only on those task domains needed to identify specific skill-based performance. Skill-based learning requires a foundation of knowledge and procedures with some level of assessment on completed tasks (for example, this assessment could be obtained from supervisor Personnel Qualification Standards (PQS) sign-offs). This approach promises much more efficient and effective targeted training across different task domains.

PHASE I: Identify the concept and plan for the development of an algorithm that catalogs skills, maps them in a skill-to-task network (shaped by crowd-sourced experts), and uses that network to predict future performance.
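A skill-to-task network of this kind could be prototyped as a simple bipartite mapping. The sketch below is an assumption made for illustration, not the topic's specification: invented task and skill names, per-skill proficiencies inferred from known tasks (e.g., PQS sign-offs), a weakest-skill performance predictor, and a cutoff for flagging deficient skills:

```python
# Skill-to-task network: each task requires a set of skills.
TASKS = {
    "intro_electricity": {"ohms_law", "circuit_reading"},
    "diode_circuits":    {"ohms_law", "circuit_reading", "semiconductors"},
}

# Per-skill proficiency inferred from performance on known tasks.
proficiency = {"ohms_law": 0.9, "circuit_reading": 0.8, "semiconductors": 0.3}

def predict_performance(task):
    """Predict performance on a task from its weakest required skill."""
    return min(proficiency[s] for s in TASKS[task])

def deficient_skills(task, cutoff=0.6):
    """Flag skills below the cutoff as candidates for targeted training."""
    return sorted(s for s in TASKS[task] if proficiency[s] < cutoff)

print(predict_performance("diode_circuits"))  # 0.3, limited by semiconductors
print(deficient_skills("diode_circuits"))     # ['semiconductors']
```

Even this toy version shows the intended behavior: a learner strong in introductory electricity is predicted to struggle with diode circuits only because of the one shared-domain skill they lack.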

PHASE II: Produce a prototype based on the Phase I work and validate the algorithm's performance predictions. Reliably predict problem areas for new tasks by examining skill-based performance on known tasks. Crowd-sourced data collection should exploit current mobile technology. If practical, the algorithm should integrate with existing Navy tutoring systems (ONR's STEM Grand Challenge) in order to determine its accuracy and usefulness.

PHASE III: During Phase III, this algorithm should integrate with tutoring devices and learning management systems that allow operators to hone skill-based performance, thereby increasing task-based efficiency. These mobile online tutoring devices would allow the algorithms to assess students' abilities and predict areas of requisite skill and efficiency. For example, assume maintenance personnel have a YouTube-like library that they can reference for help. The system could assess queries and completed repairs (PQS) to build a model of what each student knows and what future tasking would help extend their capabilities. Videos could then be accessed that demonstrate procedural errors or visual inspection tasks that assess the quality of the sailor's skills.

PRIVATE SECTOR COMMERCIAL POTENTIAL/DUAL-USE APPLICATIONS: The proposed technology offers many practical private sector commercial uses. Specifically, organizations with large training overhead costs might apply concepts and tools developed under this effort to reduce overall training costs. Further, the algorithm might be used as a job-selection aid, efficiently identifying not only skill-deficient areas but also those areas where an individual may not possess the requisite aptitude to perform the task. It could thus "select out" individuals who either cannot perform the task or for whom training would simply take too long.

REFERENCES:
1. Ormrod, J. E. (2004). Human learning (4th ed.). Upper Saddle River, NJ, USA: Pearson.

2. Pan, S. J., & Yang, Q. (2010). A survey on transfer learning. IEEE Transactions on Knowledge and Data Engineering, 22(10), 1345-1359.

3. Raina, R., Battle, A., Lee, H., Packer, B., & Ng, A. Y. (2007, June). Self-taught learning: Transfer learning from unlabeled data. In Proceedings of the 24th International Conference on Machine Learning (pp. 759-766). ACM.

KEYWORDS: Intelligent Tutoring; Active Transfer Learning; Far Transfer; Skills Acquisition; Crowd Sourcing; Training

** TOPIC AUTHOR (TPOC) **
DoD Notice:  
Between December 12, 2014 and January 14, 2015 you may talk directly with the Topic Authors (TPOC) to ask technical questions about the topics. For reasons of competitive fairness, direct communication between proposers and topic authors is
not allowed starting January 15, 2015, when DoD begins accepting proposals for this solicitation.
However, proposers may still submit written questions about solicitation topics through the DoD's SBIR/STTR Interactive Topic Information System (SITIS), in which the questioner and respondent remain anonymous and all questions and answers are posted electronically for general viewing until the solicitation closes. All proposers are advised to monitor SITIS (15.A Q&A) during the solicitation period for questions and answers, and other significant information, relevant to the STTR 15.A topic under which they are proposing.

If you have general questions about the DoD STTR program, please contact the DoD SBIR/STTR Help Desk at (866) 724-7457 or via the webmail link.


Official DoD STTR FY-2015.A Solicitation Site:
www.acq.osd.mil/osbp/sbir/solicitations/sttr2015A/index.shtml