A new study has associated virtual reality (VR) headset use with a reduced risk of dry eye in healthy participants, an effect the authors attribute to the elevated temperature inside the headset.
While previous studies have examined tear-film stability before and after VR headset use, real-time changes in tear-film behaviour during use have remained unexplored. To address this gap, Associate Professor Yoshiro Okazaki from Waseda University, Japan, and Visiting Professor Dr Norihiko Yokoi from the Kyoto Prefectural University of Medicine, Japan, devised a novel method using a VR headset fitted with an ultra-compact camera for real-time observation of tear-film dynamics.
Published in Scientific Reports, the study had 14 healthy participants play a VR game for 30 minutes while the built-in camera monitored changes in their tear-film lipid-layer interference pattern, at baseline and every five minutes during brief pauses. The researchers found that the lipid-layer interference grade increased, indicating thickening of the lipid layer. Corneal and upper-eyelid temperatures also rose significantly after the VR session. “These findings suggest that the periocular warming inside the headset may have led to the thickening of the tear-film lipid layer,” said A/Prof Okazaki.
While not intended as a health claim, these findings provide insights into how the thermal environment inside VR headsets may influence tear-film behaviour, he said. “This is useful not just for the users but also for the headset designers who are involved in developing future VR systems.”
However, since the study enrolled only healthy participants, questions remain about whether similar results would be seen in individuals with dry eye disease or meibomian gland dysfunction; the study also lacked a non-headset control group, noted A/Prof Okazaki, adding that the team plans to address these questions in subsequent work.