Unmet Need: Current methods for aligning real-world objects with virtual objects in augmented reality (AR) environments are prone to misalignment errors, especially when the user's viewing angle changes. Such errors degrade the accuracy and usability of AR applications such as surgical guidance, architectural design, and manufacturing.
JHU Solution: The inventors have developed a novel calibration technique that lets the user align a real-world object with a virtual object from multiple viewpoints simultaneously, using reflective AR displays that simulate mirror-like views. The user adjusts the position of the real-world object until it matches the virtual object from each angle, and the system stores the resulting calibration definitions for future use. The system can also generate the virtual object from images or video of the real-world object captured from different viewpoints.
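The disclosure does not specify the underlying math, but multi-viewpoint alignment of this kind is commonly posed as a least-squares rigid registration: point correspondences observed from every viewpoint are stacked into one solve, so all views constrain the same transform simultaneously rather than sequentially. The sketch below illustrates that idea with the standard Kabsch algorithm; the function name `rigid_align` and the synthetic data are illustrative assumptions, not part of the invention.

```python
import numpy as np

def rigid_align(src, dst):
    """Least-squares rigid transform (Kabsch) mapping src points onto dst.

    src, dst: (N, 3) arrays of corresponding 3-D points. Correspondences
    gathered from several viewpoints are simply stacked, so every view
    constrains the same rotation R and translation t at once.
    """
    src_c = src - src.mean(axis=0)          # center both point clouds
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                     # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Synthetic check: recover a known rotation and translation from
# points pooled across (simulated) viewpoints.
rng = np.random.default_rng(0)
pts = rng.normal(size=(12, 3))
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.1, -0.2, 0.3])
R, t = rigid_align(pts, pts @ R_true.T + t_true)
assert np.allclose(R, R_true) and np.allclose(t, t_true)
```

Pooling correspondences from multiple viewpoints into a single solve is what makes the estimated transform stable as the viewing angle changes: no single view can bias the alignment along its depth axis.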
Value Proposition: The proposed technique improves the alignment accuracy and stability of AR applications by accounting for the user's viewing angle and the spatial relationship between the real-world and virtual objects. Because the user aligns the objects from multiple viewpoints at once rather than sequentially, calibration time and effort are reduced. The technique also enables the creation of virtual objects that match real-world objects in shape, size, and position. By addressing the limitations of existing AR alignment methods, it enhances the performance and user experience of AR applications and creates new opportunities for applications that require precise, stable alignment of real-world and virtual objects, such as surgical planning, training, and navigation; architectural and engineering design; and industrial manufacturing and inspection.