
AR/VR Interaction Enhancement

decoupled_de-occlusion_and_pose_estimation_models (Not Started)


Source Idea

Decoupled de-occlusion and pose estimation models can improve navigation and interaction in AR/VR environments by dynamically adjusting scene elements based on user gaze and movement.


Files (10)

  • README.md
  • metadata.json
  • requirements.txt
  • src/__init__.py
  • src/data_loader.py
  • src/evaluate.py
  • src/interface.py
  • src/model.py
  • src/train.py
  • src/utils.py

README Preview

# AR/VR Interaction Enhancement

## Description

This project explores the hypothesis that decoupled de-occlusion and pose estimation models can improve navigation and interaction in AR/VR environments by dynamically adjusting scene elements based on user gaze and movement. The aim is to enhance user immersion and satisfaction compared to traditional static scenes.

## Research Hypothesis

Decoupled de-occlusion and pose estimation models can improve navigation and interaction in AR/VR environments by dynamically adjusting scene elements based on user gaze and movement.

## Implementation Approach

- Develop a system integrating eye-tracking and motion sensors with decoupled de-occlusion and pose estimation models.
- Conduct user studies to observe the impact of dynamic scene adjustments on user experience.
- Measure improvements in immersion and satisfaction.

## Setup Instructions

1. Install the required Python packages:
   ```bash
   pip install -r requirements.txt
   ```
2. Ensure you have access to AR/VR equipment with eye-tracking capabilities.

## Usage Examples

- Run the training script:
  ```bash
  python src/train.py
  ```
- Evaluate the model:
  ```bash
  python src/evaluate.py
  ```

## Expected Results

- Enhanced user immersion and satisfaction through dynamic scene adjustments.

## References

- SceneMaker: Open-set 3D Scene Generation with Decoupled De-occlusion and Pose Estimation Model. [Paper URL](http://arxiv.org/abs/2512.10957v1)
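The decoupling described in the README's Implementation Approach can be sketched as two independent components: one that de-occludes scene elements near the user's gaze, and one that integrates motion-sensor deltas into a pose estimate, with neither touching the other's state. This is a minimal illustrative sketch; `SceneElement`, `DeOcclusionModel`, and `PoseEstimator` are hypothetical placeholders, not the actual classes in this repository's `src/` modules.

```python
from dataclasses import dataclass


@dataclass
class SceneElement:
    name: str
    position: tuple  # (x, y, z) in world coordinates
    occluded: bool = False


class DeOcclusionModel:
    """Hypothetical stand-in: reveals elements within a radius of the gaze point."""

    def __init__(self, radius: float = 1.0):
        self.radius = radius

    def adjust(self, elements, gaze_point):
        for el in elements:
            dist = sum((a - b) ** 2 for a, b in zip(el.position, gaze_point)) ** 0.5
            # Only elements inside the gaze region are de-occluded.
            el.occluded = dist > self.radius
        return elements


class PoseEstimator:
    """Hypothetical stand-in: accumulates motion-sensor deltas into a pose."""

    def __init__(self):
        self.pose = (0.0, 0.0, 0.0)

    def update(self, delta):
        self.pose = tuple(p + d for p, d in zip(self.pose, delta))
        return self.pose


# The two models run independently: a pose update never mutates occlusion state,
# and de-occlusion never reads the pose, which is the "decoupled" property.
elements = [SceneElement("door", (0.5, 0.0, 0.0)), SceneElement("sign", (5.0, 0.0, 0.0))]
pose = PoseEstimator().update((0.1, 0.0, 0.0))
visible = DeOcclusionModel(radius=1.0).adjust(elements, gaze_point=(0.0, 0.0, 0.0))
```

In a full system, real eye-tracking and motion-sensor streams would drive `gaze_point` and `delta` each frame, but the independence of the two update paths is the point being illustrated here.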