THESIS RESEARCH

Review my master's thesis research and visual assets

Visualizing Glaucoma:

Accurately Characterizing and Depicting Visual Loss via Virtual Reality

[Figure 20: representative image mask]
[Figure 19]

Abstract 

Glaucoma is the leading cause of irreversible blindness worldwide, affecting more than 70 million people between the ages of 40 and 80. Tests to diagnose the disease and gauge its impact are well established; however, the actual patient experience of glaucoma-affected vision has been confined to epidemiologic descriptions of function and imprecise visualizations of what the patient sees. Many patients diagnosed with early-stage glaucoma are prescribed life-long therapies yet experience minimal visual distortion, so the eventual, long-term impact of glaucoma on their activities of daily living and quality of life eludes them, reducing the likelihood of treatment compliance. Furthermore, the limited visual depiction of the disease may keep providers and family members from offering empathetic care and support. 

 

Existing visualizations portraying the first-person experience of glaucoma suffer from methodological shortcomings. Most current representations are static, 2D images that do not correlate with patient-specific visual field (VF) impairment; they capture neither the variability of vision loss nor its effects on the patient's ability to decipher visual information. Moreover, most have not been derived from a systematic, patient-centered approach. Thus, there is a need for better methods of visualizing the disease from the patient's perspective and new ways of communicating that experience. 

 

This research protocol accomplished these goals through a two-phased process. Phase 1 characterized the visual experiences of several patients with unilateral, moderate to severe glaucoma through a series of custom eye assessments and interviews. Phase 2 depicted the resulting data through virtual reality (VR) eye-tracking technology to demonstrate the dynamic aspects of the disease. The final VR application includes (i) a real-time video feed overlaid with visual field loss patterns derived from our pool of characterized patient data, (ii) an immersive environment for visual search tasks with the option to toggle off representations of the disease state, and (iii) a patient education module with animations outlining the physiology of glaucoma, including links between disease pathology and the findings of common tests used to identify the disease and assess its progression. 

View Full Thesis Here >>


The final VR application resulting from this two-phased process comprises three modules.

MODULE 1: Patient Education

This module gives the user access to a set of animations explaining glaucoma. It also provides labeled 3D models intended to build a better spatial and anatomical understanding of some of the topics addressed in the animations. 
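
As a rough illustration of how such content might be organized (a hypothetical sketch, not the thesis implementation; all file paths, topic names, and labels below are invented placeholders), a simple manifest could link each animation topic to the labeled 3D model that supports it:

```python
# Hypothetical content manifest for the education module; paths and labels are placeholders.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class LabeledModel:
    mesh_path: str                              # exported 3D asset, e.g. .glb/.fbx
    labels: dict = field(default_factory=dict)  # anatomical label -> short description


@dataclass
class EducationTopic:
    title: str
    animation_path: str                         # pre-rendered animation clip
    model: Optional[LabeledModel] = None        # optional supporting 3D model


EDUCATION_TOPICS = [
    EducationTopic(
        title="Aqueous humor outflow",
        animation_path="animations/aqueous_outflow.mp4",
        model=LabeledModel(
            mesh_path="models/anterior_segment.glb",
            labels={
                "trabecular meshwork": "primary drainage pathway",
                "ciliary body": "produces aqueous humor",
            },
        ),
    ),
]
```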

MODULE 2: Live-Camera Simulator

The live-camera module uses the front-facing fisheye camera of the VR headset to offer the viewer a direct comparison of their environment with and without a visual defect. The UI provides access to the different patient distortions along with information about each patient's condition. The jittery movement of the distortion reflects the fact that it follows the viewer's eye movements rather than head orientation. 
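
To make the eye-tracking behavior concrete, here is a minimal sketch of the idea, not the thesis code: a patient-derived visual field (VF) opacity mask is re-anchored to the current gaze point on every camera frame, so the defect moves with the eyes rather than the head. The mask format, the blending rule, and the function name are assumptions for illustration.

```python
# Minimal sketch (not the thesis implementation) of a gaze-anchored visual field defect.
# Assumes the VF loss pattern is a grayscale opacity map (0 = clear, 255 = fully lost)
# and that gaze_xy is the current gaze point in frame pixel coordinates, as reported
# by the headset's eye tracker.
import cv2
import numpy as np


def apply_vf_defect(frame: np.ndarray, vf_mask: np.ndarray, gaze_xy: tuple) -> np.ndarray:
    """Darken the camera frame according to a VF opacity mask centered on the gaze point."""
    h, w = frame.shape[:2]
    mh, mw = vf_mask.shape[:2]
    gx, gy = gaze_xy

    # Place the mask in a full-frame opacity map, centered on the gaze point
    # and clipped to the frame boundaries.
    opacity = np.zeros((h, w), dtype=np.float32)
    x0, y0 = gx - mw // 2, gy - mh // 2
    fx0, fy0 = max(x0, 0), max(y0, 0)
    fx1, fy1 = min(x0 + mw, w), min(y0 + mh, h)
    if fx0 < fx1 and fy0 < fy1:
        mx0, my0 = fx0 - x0, fy0 - y0
        opacity[fy0:fy1, fx0:fx1] = vf_mask[my0:my0 + (fy1 - fy0),
                                            mx0:mx0 + (fx1 - fx0)] / 255.0

    # Soften the scotoma edge and fade the frame toward black where vision is lost.
    opacity = cv2.GaussianBlur(opacity, (31, 31), 0)
    return (frame.astype(np.float32) * (1.0 - opacity[..., None])).astype(np.uint8)
```

In a headset loop, each tick would grab a fisheye frame and a gaze sample and call apply_vf_defect(frame, mask, gaze); skipping the call shows the unaffected view, which is what the module's with/without comparison amounts to.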


MODULE 3: Search Task Simulator

This module places the viewer in a virtually simulated living room and challenges them, in a timed setting and with a selected visual defect applied, to search for a remote control. The static in the background, together with the audio that would accompany it, would amplify the frustration one might feel as the defect delays completion of a simple task. 
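
A minimal sketch of the bookkeeping behind such a trial, not the thesis code: the trial is timed from the moment the scene starts until the target is found or a time limit expires, and the result is tagged with the active defect so completion times can be compared across conditions. The found_target callback and the time limit are assumptions for illustration.

```python
# Hypothetical timing harness for the search task; found_target stands in for whatever
# gaze/controller event the VR scene raises when the remote is located.
import time
from dataclasses import dataclass


@dataclass
class SearchTrialResult:
    defect_name: str          # e.g. a characterized patient defect, or "none" for the control run
    elapsed_seconds: float
    timed_out: bool


def run_search_trial(defect_name: str, found_target, time_limit: float = 120.0) -> SearchTrialResult:
    """Time a single search trial; stop when the target is found or the limit expires."""
    start = time.monotonic()
    while True:
        elapsed = time.monotonic() - start
        if found_target():
            return SearchTrialResult(defect_name, elapsed, timed_out=False)
        if elapsed >= time_limit:
            return SearchTrialResult(defect_name, elapsed, timed_out=True)
        time.sleep(1 / 90)    # poll roughly once per headset frame
```

Comparing elapsed_seconds with and without a defect gives a simple measure of the delay the simulation is meant to convey.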

View 2021 MBI Thesis Presentations

 
