Microsoft’s HoloLens Not Fit for AR-Assisted Surgery, Study Suggests
Participants were less accurate and became more tired when completing a task with the HoloLens, compared to the naked eye
With the right device, some programming, and the flick of a switch, augmented reality (AR) can change the world—or at least change what we see a few centimeters in front of our eyes. But while the industry rapidly expands and works hard to improve the AR experience, it must also overcome an important natural barrier: the way in which our eyes focus on objects.
A recent study shows that our eyes are not quite up to the task of simultaneously focusing on two separate objects—one real and one not—in close proximity to one another.
The results, published 6 May in IEEE Transactions on Biomedical Engineering, suggest that accomplishing an AR-assisted task that’s close at hand (within two meters) and requires a high level of precision may not be feasible with existing technology. This could be unwelcome news for researchers attempting to design certain AR-assisted applications.
For instance, some researchers are exploring the possibility of using AR to virtually guide surgeons who must make precise incisions, or to display a virtual axis over the surface of a bone to steer realignment surgery. But if our eyes can’t focus on both virtual and real objects simultaneously (a phenomenon called “focal rivalry”), this leaves room for error.
In the new study, Sara Condino, Vincenzo Ferrari, and their colleagues at the University of Pisa explored how focal rivalry affects people’s performance when using AR to complete precision tasks. The researchers asked 20 participants to take a “connect-the-dots” AR test, in which a sequence of numbered dots was projected through an optical see-through (OST) device mounted on participants’ heads. With this type of AR, computer-generated content is projected onto semi-transparent displays in front of the user’s eyes, and the user can still see real-world objects beyond the screen. In these experiments, the researchers used one of the most advanced OST devices available, the Microsoft HoloLens.
With the connect-the-dots task projected through the HoloLens, participants had to draw the connecting lines with a pen and ruler on real paper in front of them. “This task forced the user’s eye to contemporaneously focus on the virtual content, which is the numbered dots, and real objects—the pen, ruler, and paper,” explains Condino.
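The difficulty is straightforward to quantify. An OST headset draws its imagery at a fixed focal distance (commonly reported to be roughly 2 meters for the first-generation HoloLens), while the pen, ruler, and paper sit at arm’s length. The minimal Python sketch below computes the resulting accommodation mismatch in diopters; the specific distances are illustrative assumptions, not values taken from the study.

```python
# Illustrative only: the ~2 m display focal distance and ~0.4 m working
# distance are assumptions for this sketch, not figures from the paper.

def accommodation_diopters(distance_m: float) -> float:
    """Accommodation demand, in diopters, for an object at distance_m meters."""
    return 1.0 / distance_m

display_focal_distance_m = 2.0   # assumed fixed focal plane of the headset
task_distance_m = 0.4            # assumed distance to the pen, ruler, and paper

mismatch = abs(accommodation_diopters(task_distance_m)
               - accommodation_diopters(display_focal_distance_m))

print(f"Virtual content: {accommodation_diopters(display_focal_distance_m):.1f} D")
print(f"Real task plane: {accommodation_diopters(task_distance_m):.1f} D")
print(f"Accommodation mismatch: {mismatch:.1f} D")  # about 2.0 D in this example
```

A mismatch of around 2 diopters means the eye cannot hold both the virtual dots and the real drawing tools in sharp focus at once, which is the essence of focal rivalry.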
Participants completed the test under four different scenarios: with and without the AR headset, and with one or both eyes open. Across all scenarios, the researchers assessed participants’ timing and accuracy as they completed the task, and later asked them about their experiences.
Participants reported feeling that they completed the test about equally well with or without the AR device. But the performance data tell a different story. On average, participants made errors of 2.3 mm when using the HoloLens (with a maximum error of 5.9 mm), compared with errors averaging 0.9 mm during the naked-eye tasks.
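The study’s exact error metric isn’t detailed here, but one plausible way to score a connect-the-dots trace is to measure how far the drawn points stray from the ideal segment between consecutive dots. The sketch below does that with NumPy; the trace data are made up purely for illustration.

```python
import numpy as np

def deviation_from_segment(points: np.ndarray, a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Distance (in mm) from each traced point to the nearest point on segment a-b."""
    ab = b - a
    t = np.clip((points - a) @ ab / (ab @ ab), 0.0, 1.0)  # projection onto the segment
    closest = a + np.outer(t, ab)
    return np.linalg.norm(points - closest, axis=1)

# Made-up traced points (mm) for an ideal segment from (0, 0) to (50, 0)
a, b = np.array([0.0, 0.0]), np.array([50.0, 0.0])
trace = np.array([[5.0, 0.8], [20.0, 2.1], [35.0, 1.5], [48.0, 0.4]])

errors = deviation_from_segment(trace, a, b)
print(f"mean error: {errors.mean():.1f} mm, max error: {errors.max():.1f} mm")
```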
“Unfortunately, the users were not aware of [their] bad precision performances in AR-guided tasks,” says Marina Carbone, a researcher involved in the study. She also notes that participants reported experiencing more fatigue during the AR-guided tasks. “To further evaluate this point, we are now planning to repeat the experiments acquiring the EEG during the exercise,” Carbone says.
The team also plans to further study issues related to AR-assisted surgery by developing a new type of hybrid AR system that can switch between OST and video see-through (VST) AR. With the latter approach, a real view of the world is captured by external cameras on a head-mounted device and presented to the user after being merged with the virtual content. By comparing the two AR systems, the researchers hope to isolate and better understand the visual limitations of each platform, with the ultimate goal of improving the performance of AR systems during precision tasks such as surgery.
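Video see-through rendering amounts to compositing: each camera frame is blended with the rendered virtual layer before it reaches the user’s eyes, so both end up at the same focal distance on the screen. The sketch below shows generic alpha blending with NumPy; it illustrates the VST principle only, not the Pisa team’s hybrid system, and the camera frame and overlay are synthetic placeholders.

```python
import numpy as np

def composite_vst_frame(camera_frame: np.ndarray, overlay_rgba: np.ndarray) -> np.ndarray:
    """Alpha-blend an RGBA virtual overlay onto an RGB camera frame (values 0-255)."""
    rgb = overlay_rgba[..., :3].astype(np.float32)
    alpha = overlay_rgba[..., 3:4].astype(np.float32) / 255.0
    blended = alpha * rgb + (1.0 - alpha) * camera_frame.astype(np.float32)
    return blended.astype(np.uint8)

# Synthetic stand-ins: a gray "camera" frame and a semi-transparent green marker
camera_frame = np.full((480, 640, 3), 128, dtype=np.uint8)
overlay = np.zeros((480, 640, 4), dtype=np.uint8)
overlay[220:260, 300:340] = (0, 255, 0, 180)   # virtual marker with alpha = 180/255

merged = composite_vst_frame(camera_frame, overlay)
print(merged.shape, merged.dtype)  # (480, 640, 3) uint8, ready to display
```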
Notably, an OST device built around a light field display, which can present virtual content at the correct focal depth, could in principle overcome these limitations, but that technology is still in the early stages of development.