Daily Note: Merging Virtual and Physical Spaces

These notes are a summary of concepts presented in “Design Frameworks for Spatial Zone Agents in XRI Metaverse Smart Environments.”

J. Guan, J. Liu and A. Morris, “Design Frameworks for Spatial Zone Agents in XRI Metaverse Smart Environments,” 2024 IEEE International Conference on Artificial Intelligence and eXtended and Virtual Reality (AIxVR), Los Angeles, CA, USA, 2024, pp. 75-79, doi: 10.1109/AIxVR59861.2024.00017.

  1. Hybrid Objects and Hyper-Connected Environments
    • Creation of hybrid physical-virtual objects
    • Hybrid interactions between physical and virtual spaces
    • Resulting environment
      • Social, smart, engaging, immersive
      • Consistency, resilience, reduction, and correlation across applications
    • Use of AI, Extended Reality, Internet of Things
  2. Design Components and Strategies
    • User Interface and widgets
      • Menus, information panels, labels, annotations, controls, monitors, and displays
    • Spatial references and visualizations
      • Points, paths, areas, boundaries, and other visualizations
    • Embedded visual effects
      • Anthropomorphic, virtual replicas, ghost effects, texture mapping
    • Interaction models
      • Tangible, touch, controller, gesture, gaze, voice, proximity
  3. Smart Spaces and Spatial Agents
    • Transformation of smart spaces into zone-aware spatial agents
    • Interaction based on user spatial context
    • Definition of spatial computing
      • Interaction with real objects and space
      • Mixed Reality combining real and virtual worlds
  4. Agent System Design and Mixed Reality
    • Agent system design in intelligent systems
      • Foundations for intelligent and scalable agent-based systems
      • MiRAs: Mixed Reality Agents framework
      • Human-Robot Interaction (HRI) and XR
    • Zone agents design
      • Focus on spatial context and user-zone events
      • Embodiment through zone-driven actions (virtual/physical)
  5. Prototype of Zone Agent
    • Scenario and interaction in lab environment
      • User experience with a head-mounted display (HMD) and a virtual plant avatar
      • Tasks: learning, relaxing, and meeting zones
    • Agent interaction
      • Gesture recognition, task initiation, zone transitions
    • Tracking and Context Adaptation
      • Workstation and relaxation zone agents
      • Time tracking for study sessions
      • Light and projector control in meeting zone
    • Agent types
      • Simple reflex agents with potential for stronger agent paradigms (e.g., machine learning)
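The simple reflex behavior described above can be sketched as a table of condition-action rules mapping perceived zone events directly to actions, with no internal state or learning. The zone names and actions below are illustrative assumptions, not the paper's actual implementation:

```python
# Minimal sketch of a simple reflex zone agent: each (zone, event)
# percept maps directly to one action. Rules are hypothetical examples
# loosely based on the paper's study-timer and projector scenarios.

RULES = {
    ("work", "enter"): "start_study_timer",
    ("work", "exit"): "stop_study_timer",
    ("relax", "enter"): "dim_lights",
    ("meeting", "enter"): "turn_on_projector",
    ("meeting", "exit"): "turn_off_projector",
}

def reflex_zone_agent(zone: str, event: str) -> str:
    """Return the action for a zone event, or 'idle' if no rule matches."""
    return RULES.get((zone, event), "idle")
```

Replacing the rule table with a learned policy is one way such an agent could be upgraded to the stronger paradigms (e.g., machine learning) the authors mention.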
  6. Prototype Framework
    • User navigation and interaction in zones
      • Use case scenario with work, leisure, and meeting zones
      • Integration of sensors, actuators, decision-making, and communication paradigms
    • Adapting to spatial contexts
      • Agents designed to adapt to user needs
      • Physical or virtual reality changes based on user contexts
      • Refining agency levels, physical-virtual responsiveness, spatial interactions
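The sense-decide-act pattern implied by the framework (sensors detect the user's spatial context, the agent reacts to zone transitions) can be sketched as proximity-based zone detection driving enter/exit events. The circular zone geometry, coordinates, and names here are illustrative assumptions, not the paper's design:

```python
import math

# Hypothetical zone layout: each zone is a circle given by
# ((center_x, center_y), radius). Values are made up for illustration.
ZONES = {
    "work": ((0.0, 0.0), 1.5),
    "leisure": ((4.0, 0.0), 1.5),
    "meeting": ((8.0, 0.0), 2.0),
}

def current_zone(user_pos):
    """Sense step: map a tracked user position to its containing zone."""
    x, y = user_pos
    for name, ((cx, cy), radius) in ZONES.items():
        if math.hypot(x - cx, y - cy) <= radius:
            return name
    return None  # user is outside every zone

def step(user_pos, last_zone):
    """Decide step: compare zones across ticks and emit transition events."""
    zone = current_zone(user_pos)
    events = []
    if zone != last_zone:
        if last_zone is not None:
            events.append(("exit", last_zone))
        if zone is not None:
            events.append(("enter", zone))
    return zone, events
```

Calling `step` on each tracking update yields the user-zone events that a zone agent would then map to physical or virtual actions (lighting, projector, avatar behavior).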