This paper addresses the difficulties of collaborative interaction with volumetric displays. The authors use the Actuality Systems Volumetric Display (see first short video). The display works by spinning a vertical projection screen around the vertical Z axis while a projection engine, using relay optics, projects an image onto the spinning screen, so that the user perceives a 3D image (see image below).
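One way to think about the spinning-screen principle is that each 3D point maps to a particular rotation angle of the screen and a 2D position on it. The sketch below is purely my own illustration, assuming a simple cylindrical-coordinate mapping; the paper does not describe the actual projection pipeline, and the function name and slice model are hypothetical.

```python
import math

def point_to_slice(x, y, z, num_slices):
    """Hypothetical sketch: map a 3D point to the rotation slice at which
    the spinning screen would display it, plus its 2D position on the screen.
    num_slices is an assumed discretization of one full rotation."""
    theta = math.atan2(y, x) % (2 * math.pi)          # azimuth around the Z axis
    slice_index = int(theta / (2 * math.pi) * num_slices) % num_slices
    r = math.hypot(x, y)                              # horizontal offset on the screen plane
    return slice_index, (r, z)                        # which slice, and where on it
```

In such a scheme the display would light up each point only during the brief moment its slice sweeps past, relying on persistence of vision to form the solid 3D image.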
For collaborative control, handheld devices are used to manipulate the display, and hats with sensors track each user's position around it.
The paper goes into great detail on many of the problems with collaborative interaction in general 2D space, and then addresses new problems that arise in 3D space. The prototype they create attempts to solve many of these problems, including location awareness, navigation, 3D cursors, and an options dialog box.
The paper also talks about their unique algorithm for hidden surface removal. This allows each user, depending on his position around the volumetric display, to see only the near side of a 3D object, rather than seeing outlines of the far side through the object.
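The core idea resembles per-viewer back-face culling from computer graphics: faces whose outward normals point away from a given user are simply not drawn for that viewpoint. The sketch below is my own hedged illustration of that standard technique, not the paper's actual algorithm; the face representation and function name are assumptions.

```python
def visible_faces(faces, viewer_pos):
    """Hedged sketch of per-viewer back-face culling. Each face is a
    (centroid, normal) pair of 3-tuples; a face is kept only if its
    outward normal points toward the viewer (positive dot product with
    the vector from the face to the viewer)."""
    visible = []
    for centroid, normal in faces:
        # Vector from the face toward this particular viewer
        view = tuple(v - c for v, c in zip(viewer_pos, centroid))
        dot = sum(n * w for n, w in zip(normal, view))
        if dot > 0:          # normal faces the viewer: front face, keep it
            visible.append((centroid, normal))
    return visible
```

Run once per tracked user (using the head position from the sensored hat), this would give each person around the display only the faces on their own side of the object.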
This seems like very useful beginning research. The authors made a point of saying that their goal was not to show off cool ways of interacting with the display, which is why they used handheld remote controls and keyboards, but rather to come up with a way of collaboratively interacting with volumetric displays. So while the hardware may not be that impressive compared to some other volumetric display research, the main thing to keep in mind is that the software is designed for collaborative interaction.