Developing Mixed Reality Applications with Platform for Situated Intelligence

2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), pp. 48-50


Both industry and academic interest in mixed reality has skyrocketed in recent years. New headset devices for both virtual and augmented reality are increasingly available and affordable, and new APIs, tools, and frameworks enable developers and researchers to more easily create mixed reality applications. While many tools aim to make it easier to create and interact with content rendered to the headset, these new devices are interesting not just from an output perspective but also from an input perspective: they contain powerful multimodal sensors that provide unique opportunities to drive forward research on egocentric perception and interaction. In this paper, we introduce Platform for Situated Intelligence, an existing open-source framework, to the mixed reality community. The framework was designed to help developers and researchers create and study real-time, interactive AI systems that process multimodal streaming data. Recent extensions to the framework include new capabilities and components designed specifically to support mixed reality sensory streams and scenarios.
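For readers unfamiliar with the framework, the following is a minimal sketch of the stream-oriented programming model that Platform for Situated Intelligence provides in .NET. It is not drawn from the paper and is not specific to the mixed reality extensions it describes; it only assumes the core Microsoft.Psi package, and the class and stream names are illustrative. Messages flow through a pipeline and are timestamped, so downstream operators can reason about originating times.

using System;
using Microsoft.Psi;

class MinimalPipeline
{
    static void Main()
    {
        // Create a pipeline that hosts components and the streams connecting them.
        using (var pipeline = Pipeline.Create())
        {
            // A generated source stream standing in for a sensor stream
            // (e.g., audio, video, or head pose in a mixed reality app).
            var source = Generators.Sequence(pipeline, 0, x => x + 1, 10, TimeSpan.FromMilliseconds(100));

            // Transform each message and print it alongside its originating time.
            source
                .Select(x => x * x)
                .Do((x, envelope) => Console.WriteLine($"{envelope.OriginatingTime:HH:mm:ss.fff} -> {x}"));

            // Run the pipeline to completion (the finite source ends the run).
            pipeline.Run();
        }
    }
}

In a mixed reality application, the generated source above would be replaced by components that surface the headset's sensory streams, which is the role of the recent extensions the abstract mentions.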