Daily Note: Opportunistic User Interfaces, Everyday Objects and Augmented Reality

These notes are a summary of concepts presented in “Opportunistic Interfaces for Augmented Reality: Transforming Everyday Objects into Tangible 6DoF Interfaces Using Ad hoc UI.”

Ruofei Du, Alex Olwal, Mathieu Le Goc, Shengzhi Wu, Danhang Tang, Yinda Zhang, Jun Zhang, David Joseph Tan, Federico Tombari, David Kim. 2022. Opportunistic Interfaces for Augmented Reality: Transforming Everyday Objects into Tangible 6DoF Interfaces Using Ad hoc UI. In CHI Conference on Human Factors in Computing Systems Extended Abstracts (CHI ’22 Extended Abstracts), April 29-May 5, 2022, New Orleans, LA, USA. ACM, New York, NY, USA, 7 pages. https://doi.org/10.1145/3491101.3519911

  1. Workflow Details
    • Definition
      • Summon virtual interfaces on everyday objects via voice commands or tapping gestures
      • Transform arbitrary everyday objects into Tangible User Interfaces (TUIs) without requiring predefined or instrumented physical objects
    • Goal
      • Enrich tangible interaction with multimodal support (voice, gesture)
  2. Conceptual Framework
    • Opportunistic controls
      • Associate virtual content with physical objects semantically
      • Extend direct association, activation, and interaction beyond predefined objects and contexts
    • Future prospects
      • Recognize objects the user points at or gazes at in order to summon interfaces
      • Extract essential patterns with orthogonal re-projections
  3. Perception
    • 2D visual tracking
      • Track objects as they move in front of the camera
    • 6DoF pose estimation
      • Real-time pose estimation for geometrical texture registration
      • Simplifies programming by unifying the computer-vision and graphics coordinate systems (see the pose-registration sketch after these notes)
    • Hand tracker
      • Hand detection (bounding box, pose classification)
      • An index-fingertip regressor detects tapping gestures (see the fingertip-tap sketch after these notes)
  4. Representation
    • Visual patterns
      • Register everyday objects as input devices or interfaces
    • Virtual widgets
      • Represent common functionalities as input proxies
    • Relationship submodule
      • Maintain associations between tracked physical objects and their virtual widgets (see the registry sketch after these notes)
  5. Interaction
    • Deictic
      • Use physical objects as pointing devices
    • Iconic
      • Mimic physical user interfaces
    • Metaphoric
      • Perform motion-trajectory gestures to invoke actions
    • Proxy
      • Use objects as anchors for virtual content
    • Proxemic
      • Leverage the distance and orientation between the user and the object
    • Audio Activation
      • Use the Live Transcribe Speech Engine for voice interaction
    • Hand Manipulations (see the event-dispatch sketch after these notes)
      • Grasping: Activates interfaces or reveals enhanced details
      • Pointing: Enables hovering effects for virtual buttons
      • Tapping: Activates virtual buttons/sliders on objects
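
A few illustrative sketches follow for the technical items above. First, the pose-registration sketch for the 6DoF pose estimation notes in section 3: a minimal example of how a widget defined in an object's local frame can be re-anchored each frame using the estimated pose. The 4x4-transform convention and the function names are my assumptions for illustration, not the paper's API.

```python
import numpy as np

def make_pose(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = translation
    return pose

def register_widget(object_pose: np.ndarray, widget_points: np.ndarray) -> np.ndarray:
    """Map widget anchor points from the object's local frame (meters) into the
    camera frame so they stay attached as the object moves."""
    homogeneous = np.hstack([widget_points, np.ones((len(widget_points), 1))])
    return (object_pose @ homogeneous.T).T[:, :3]

# Example: three slider tick marks along the object's local x-axis, with the
# object held 0.5 m in front of the camera.
object_pose = make_pose(np.eye(3), np.array([0.0, 0.0, 0.5]))
ticks_local = np.array([[-0.05, 0.0, 0.0], [0.0, 0.0, 0.0], [0.05, 0.0, 0.0]])
print(register_widget(object_pose, ticks_local))
```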
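
The fingertip-tap sketch, for the hand tracker notes in section 3: index-fingertip tracking with MediaPipe Hands plus a simplified tap heuristic. Ad hoc UI's own hand detector and fingertip regressor are not public, so the 2D proximity check and the hard-coded button rectangle here are stand-ins for the real pipeline.

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

# Hypothetical virtual button, given as a normalized (x_min, y_min, x_max, y_max)
# rectangle in image coordinates; in Ad hoc UI it would come from the tracked
# object's pose instead of being hard-coded.
BUTTON_RECT = (0.4, 0.4, 0.6, 0.6)

def fingertip_in_button(tip, rect) -> bool:
    """True when the normalized fingertip landmark falls inside the rectangle."""
    x_min, y_min, x_max, y_max = rect
    return x_min <= tip.x <= x_max and y_min <= tip.y <= y_max

cap = cv2.VideoCapture(0)
was_inside = False
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            tip = results.multi_hand_landmarks[0].landmark[
                mp_hands.HandLandmark.INDEX_FINGER_TIP]
            inside = fingertip_in_button(tip, BUTTON_RECT)
            if inside and not was_inside:
                print("tap: virtual button activated")  # fire the widget callback here
            was_inside = inside
        if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
            break
cap.release()
```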
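
The registry sketch, for the relationship submodule in section 4: a small registry that keeps virtual widgets bound to tracked objects and re-anchors them with the latest 6DoF pose. Class and method names are illustrative, not from the paper.

```python
from dataclasses import dataclass, field
from typing import Dict, List
import numpy as np

@dataclass
class Widget:
    name: str                 # e.g. "volume_slider"
    local_offset: np.ndarray  # anchor position in the object's local frame (meters)

@dataclass
class ObjectWidgetRegistry:
    """Keeps track of which virtual widgets are bound to which tracked object."""
    bindings: Dict[int, List[Widget]] = field(default_factory=dict)

    def bind(self, object_id: int, widget: Widget) -> None:
        """Associate a widget with a tracked object (e.g. after a voice summon)."""
        self.bindings.setdefault(object_id, []).append(widget)

    def widget_positions(self, object_id: int, object_pose: np.ndarray):
        """Re-anchor every bound widget using the object's latest 6DoF pose."""
        for widget in self.bindings.get(object_id, []):
            local = np.append(widget.local_offset, 1.0)
            yield widget.name, (object_pose @ local)[:3]

# Usage: summon a slider onto tracked object 7, then query where to draw it.
registry = ObjectWidgetRegistry()
registry.bind(7, Widget("volume_slider", np.array([0.0, 0.03, 0.0])))
pose = np.eye(4)
pose[:3, 3] = [0.1, 0.0, 0.4]
for name, position in registry.widget_positions(7, pose):
    print(name, position)
```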
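
Finally, the event-dispatch sketch, for the interaction notes in section 5: it routes already-transcribed voice commands and grasp/point/tap hand states to a virtual widget, gated by a proxemic distance check. The paper uses the Live Transcribe Speech Engine for transcription, which is not reproduced here; all class names and thresholds are illustrative assumptions.

```python
from enum import Enum, auto
import numpy as np

class HandState(Enum):
    GRASPING = auto()  # activate the interface / reveal enhanced details
    POINTING = auto()  # hover highlight on virtual buttons
    TAPPING = auto()   # press the button or slider under the fingertip

class VirtualButton:
    """Stand-in for a widget anchored on an everyday object."""
    def __init__(self, name: str, position: np.ndarray):
        self.name, self.position = name, position
    def activate(self): print(f"{self.name}: activated")
    def hover(self):    print(f"{self.name}: hovered")
    def press(self):    print(f"{self.name}: pressed")

def within_proxemic_range(hand_pos, widget: VirtualButton, max_dist: float = 0.3) -> bool:
    """Proxemic gate: only react when the hand is within max_dist meters."""
    return float(np.linalg.norm(np.asarray(hand_pos) - widget.position)) <= max_dist

def dispatch(event, widget: VirtualButton, hand_pos=None) -> None:
    """Route a transcribed voice command (str) or a HandState to the widget."""
    if isinstance(event, str):                     # audio activation
        if "activate" in event.lower():
            widget.activate()
    elif hand_pos is not None and within_proxemic_range(hand_pos, widget):
        {HandState.GRASPING: widget.activate,      # hand manipulations
         HandState.POINTING: widget.hover,
         HandState.TAPPING: widget.press}[event]()

# Usage: a button summoned by voice, then tapped once the hand is close enough.
button = VirtualButton("play_button", np.array([0.1, 0.0, 0.4]))
dispatch("Activate the play button", button)
dispatch(HandState.TAPPING, button, hand_pos=np.array([0.12, 0.0, 0.41]))
```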