Search Results - "Nothwang, William D."

  1.

    PZT-Based Piezoelectric MEMS Technology by Smith, Gabriel L., Pulskamp, Jeffrey S., Sanchez, Luz M., Potrepka, Daniel M., Proie, Robert M., Ivanov, Tony G., Rudy, Ryan Q., Nothwang, William D., Bedair, Sarah S., Meyer, Christopher D., Polcawich, Ronald G.

    Published in Journal of the American Ceramic Society (01-06-2012)
    “…This review article presents recent advancements in the design and fabrication of thin‐film (<3 μm) lead zirconate titanate (PZT) microelectromechanical system…”
    Journal Article
  2.

    An Embodied Multi-Sensor Fusion Approach to Visual Motion Estimation Using Unsupervised Deep Networks by Shamwell, E. Jared, Nothwang, William D., Perlis, Donald

    Published in Sensors (Basel, Switzerland) (04-05-2018)
    “…Aimed at improving size, weight, and power (SWaP)-constrained robotic vision-aided state estimation, we describe our unsupervised, deep…”
    Journal Article
  3.

    Unsupervised Deep Visual-Inertial Odometry with Online Error Correction for RGB-D Imagery by Shamwell, E. Jared, Lindgren, Kyle, Leung, Sarah, Nothwang, William D.

    “…While numerous deep approaches to the problem of vision-aided localization have been recently proposed, systems operating in the real world will undoubtedly…”
    Journal Article
  4.

    DTM: Deformable template matching by Lee, Hyungtae, Kwon, Heesung, Robinson, Ryan M., Nothwang, William D.

    “…A novel template matching algorithm that can incorporate the concept of deformable parts, is presented in this paper. Unlike the deformable part model (DPM)…”
    Conference Proceeding; Journal Article
  5.

    Human-autonomy sensor fusion for rapid object detection by Robinson, Ryan M., Lee, Hyungtae, McCourt, Michael J., Marathe, Amar R., Kwon, Heesung, Ton, Chau, Nothwang, William D.

    “…Human-autonomy sensor fusion is an emerging technology with a wide range of applications, including object detection/recognition, surveillance, collaborative…”
    Conference Proceeding
  6.

    BooM-Vio: Bootstrapped Monocular Visual-Inertial Odometry with Absolute Trajectory Estimation through Unsupervised Deep Learning by Lindgren, Kyle, Leung, Sarah, Nothwang, William D., Shamwell, E. Jared

    “…Machine learning has emerged as an extraordinary tool for solving many computer vision tasks by extracting and correlating meaningful features from high…”
    Conference Proceeding
  7.

    Dynamic belief fusion for object detection by Lee, Hyungtae, Kwon, Heesung, Robinson, Ryan M., Nothwang, William D., Marathe, Amar R.

    “…A novel approach for the fusion of heterogeneous object detection methods is proposed. In order to effectively integrate the outputs of multiple detectors, the…”
    Conference Proceeding
  8.

    Vision-Aided Absolute Trajectory Estimation Using an Unsupervised Deep Network with Online Error Correction by Shamwell, E. Jared, Leung, Sarah, Nothwang, William D.

    “…We present an unsupervised deep neural network approach to the fusion of RGB-D imagery with inertial measurements for absolute trajectory estimation…”
    Conference Proceeding
  9.

    Task-conversions for integrating human and machine perception in a unified task by Lee, Hyungtae, Kwon, Heesung, Robinson, Ryan M., Donavanik, Daniel, Nothwang, William D., Marathe, Amar R.

    “…The different strategies for feature extraction and synthesis employed by humans and computers are often complementary, hence combining the two into an…”
    Conference Proceeding
  10.

    A deep neural network approach to fusing vision and heteroscedastic motion estimates for low-SWaP robotic applications by Shamwell, E. Jared, Nothwang, William D., Perlis, Donald

    “…Due both to the speed and quality of their sensors and restrictive on-board computational capabilities, current state-of-the-art (SOA) size, weight, and power…”
    Conference Proceeding
  11.

    The human should be part of the control loop? by Nothwang, William D., McCourt, Michael J., Robinson, Ryan M., Burden, Samuel A., Curtis, J. Willard

    Published in 2016 Resilience Week (RWS) (01-08-2016)
    “…The capabilities of autonomy have grown to encompass new application spaces that until recently were considered exclusive to humans. In the past, automation…”
    Conference Proceeding
  12.

    DeepEfference: Learning to predict the sensory consequences of action through deep correspondence by Shamwell, E. Jared, Nothwang, William D., Perlis, Donald

    “…As the human eyeball saccades across the visual scene, humans maintain egocentric visual positional constancy despite retinal motion identical to an egocentric…”
    Conference Proceeding
  13.

    Multi-Hypothesis Visual-Inertial Flow by Shamwell, E. Jared, Nothwang, William D., Perlis, Donald

    Published 08-03-2018
    “…Estimating the correspondences between pixels in sequences of images is a critical first step for a myriad of tasks including vision-aided navigation (e.g.,…”
    Journal Article
  14.

    Vision-Aided Absolute Trajectory Estimation Using an Unsupervised Deep Network with Online Error Correction by Shamwell, E. Jared, Leung, Sarah, Nothwang, William D.

    Published 08-03-2018
    “…We present an unsupervised deep neural network approach to the fusion of RGB-D imagery with inertial measurements for absolute trajectory estimation. Our…”
    Journal Article
  15.

    Cyber-Human Approach For Learning Human Intention And Shape Robotic Behavior Based On Task Demonstration by Goecks, Vinicius G., Gremillion, Gregory M., Lehman, Hannah C., Nothwang, William D.

    “…Recent developments in artificial intelligence enabled training of autonomous robots without human supervision. Even without human supervision during training,…”
    Conference Proceeding
  16.

    DTM: Deformable Template Matching by Lee, Hyungtae, Kwon, Heesung, Robinson, Ryan M., Nothwang, William D.

    Published 12-04-2016
    “…A novel template matching algorithm that can incorporate the concept of deformable parts, is presented in this paper. Unlike the deformable part model (DPM)…”
    Journal Article
  17.

    Fast Object Localization Using a CNN Feature Map Based Multi-Scale Search by Lee, Hyungtae, Kwon, Heesung, Bency, Archith J., Nothwang, William D.

    Published 12-04-2016
    “…Object localization is an important task in computer vision but requires a large amount of computational power due mainly to an exhaustive multiscale search on…”
    Journal Article
  18.

    Passive switched system analysis of semi-autonomous systems by McCourt, Michael J., Robinson, Ryan M., Nothwang, William D., Doucette, Emily A., Curtis, J. Willard

    “…While autonomous capabilities have proliferated across a wide range of commercial and domestic applications, some tasks require intermittent aid from a human…”
    Conference Proceeding
  19.

    Relevance and redundancy as selection techniques for human-autonomy sensor fusion by Brody, Justin D., Dixon, Anna M. R., Donavanik, Daniel, Robinson, Ryan M., Nothwang, William D.

    “…Human-autonomy teaming using physiological sensors poses a novel sensor fusion problem due to the dynamic nature of the sensor models and the difficulty of…”
    Conference Proceeding
  20.

    Degree of automation in command and control decision support systems by Robinson, Ryan M., McCourt, Michael J., Marathe, Amar R., Nothwang, William D., Doucette, Emily A., Curtis, J. Willard

    “…This paper investigates the effects of integrating automation into the various stages of information processing in a military command and control scenario…”
    Conference Proceeding