DICOM PS3.17 2024c - Explanatory Information

PPPP.5 Use Case: Augmented Reality

Figure PPPP.5-1. Application combining multiple real-time video sources

For image-guided surgery, Augmented Reality (AR) applications enrich the live images by overlaying additional information: either a 3D display of patient anatomy reconstructed from MR or CT scans, or a 3D projection of another real-time medical imaging source (typically 3D ultrasound). In the latter case, display devices (glasses, tablets…) show a real-time "combination" image merging the primary live imaging (endoscopy, overhead camera, microscopy…) with the secondary real-time live imaging (ultrasound, X-Ray…). This "combination" image could itself be exported as a new video source through the DICOM Real-Time Video protocol.
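The "combination" image described above merges a primary live frame with a projected secondary image. The following sketch illustrates the idea as a simple per-pixel alpha blend; the function names, the nested-list frame representation, and the per-pixel alpha mask are illustrative assumptions, not part of the Standard, and a real AR pipeline would additionally involve calibration, registration, and 3D rendering.

```python
def blend_pixel(primary, overlay, alpha):
    """Alpha-blend one overlay pixel onto one primary pixel.
    `primary` and `overlay` are (R, G, B) tuples; `alpha` in [0, 1]
    is the overlay opacity at this pixel (hypothetical convention)."""
    return tuple(round(alpha * o + (1.0 - alpha) * p)
                 for p, o in zip(primary, overlay))


def compose_ar_frame(primary, overlay, alpha_mask):
    """Produce a "combination" frame by blending the projected secondary
    image (e.g., a 3D ultrasound projection) onto the primary live frame.
    Frames are rows of (R, G, B) pixels; `alpha_mask` gives the overlay
    opacity per pixel, so regions without overlay keep alpha 0.0."""
    return [
        [blend_pixel(p, o, a) for p, o, a in zip(prow, orow, arow)]
        for prow, orow, arow in zip(primary, overlay, alpha_mask)
    ]
```

The resulting frame could then be fed to whatever encoder exports the new video source; that export step is outside this sketch.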

All video streams have to be transferred with ultra-low latency and very strict synchronization between frames (see Figure PPPP.5-1). Metadata associated with the video has to be updated at the frame rate (e.g., the 3D position of the ultrasound probe). The mechanisms used to generate the augmented reality views or to detect and track the 3D position of devices are out of scope; only the method for conveying the multiple synchronized video/multi-frame sources, along with parameters that may change at every frame, is specified.
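The frame-level synchronization described above amounts to pairing frames from the primary and secondary streams whose timestamps agree on a common clock, each frame carrying its own per-frame metadata. A minimal sketch of that pairing, assuming a hypothetical `Frame` record with a shared-clock timestamp and a free-form metadata dictionary (neither is a structure defined by the Standard):

```python
from dataclasses import dataclass, field


@dataclass
class Frame:
    """One video frame with per-frame metadata (illustrative structure)."""
    timestamp: float            # seconds on a common clock shared by all streams
    pixels: bytes = b""
    metadata: dict = field(default_factory=dict)  # e.g. {"probe_position": (x, y, z)}


def pair_frames(primary, secondary, tolerance=0.004):
    """Pair each primary frame with the secondary frame closest in time,
    keeping only pairs whose timestamps agree within `tolerance` seconds.
    Both lists are assumed sorted by timestamp (hypothetical helper)."""
    pairs = []
    j = 0
    for p in primary:
        # Advance through the secondary stream while the next frame is closer.
        while (j + 1 < len(secondary)
               and abs(secondary[j + 1].timestamp - p.timestamp)
                   <= abs(secondary[j].timestamp - p.timestamp)):
            j += 1
        if secondary and abs(secondary[j].timestamp - p.timestamp) <= tolerance:
            pairs.append((p, secondary[j]))
    return pairs
```

Because the probe position travels in each secondary frame's metadata, the compositing stage always reads the position that matches the frame it is rendering, rather than a value sampled at some other instant.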