
Faculty Research Spotlight

Vangelis Metsis, Computer Science

Multi-disciplinary Research Team Uses Virtual Reality to Help Veterans with PTSD


“Our team measured the effects of VR exposure in real-time by collecting physiological biosignals from sensors worn by test subjects.”

A test subject going through a VR grocery shopping session

The number of veterans transitioning out of the military and reporting Post-Traumatic Stress Disorder (PTSD) has risen in recent years, a result of prolonged U.S. military operations over the past decade, multiple deployments, and harsh combat and environmental conditions. An increasing number of combat veterans report struggles with “maladaptive patterns of social functioning” due to a strong relationship between PTSD and Social Anxiety Disorder, which is characterized by distress in social interactions, social avoidance patterns, and impaired social relationships. Furthermore, anxiety in combat soldiers reintegrating into civilian life is a key factor in externalized behavior problems such as aggression and substance abuse.

Previous research has demonstrated a strong correlation between virtual reality (VR) exposure and emotional response, as well as the potential of VR as a psychotherapy tool. However, objective, quantitative metrics are lacking for assessing emotional response in general and the effect of controlled VR exposure on each individual specifically. Current standards rely on subjective measures such as personal reports and other standard psychological instruments. One of the main reasons real-life applications of VR remain limited is that its effects, whether positive or negative, have not been established and are difficult to quantify without objective metrics, including measurements obtained from wearable physiological sensors (e.g., heart rate, respiration).

Placement of sensors for physiological data collection

In response to these limitations, I led an interdisciplinary team of researchers from Texas State to develop an intervention methodology based on controlled exposure to VR imagery that simulates real-life scenarios. More specifically, the study 1) identified non-combat, realistic, and commonly occurring scenarios that cause heightened anxiety in veterans with combat-related PTSD; 2) developed immersive virtual environments that simulate these scenarios and elicit a similar emotional response; and 3) developed and evaluated quantitative metrics to assess the emotional effects of Virtual Exposure Therapy (VET) via the acquisition and processing of physiological data.

Our team measured the effects of VR exposure in real time by collecting physiological biosignals from sensors worn by test subjects. We analyzed the collected biosignals using machine learning algorithms to quantitatively assess exposure effects in terms of emotional response and to track variations over time. Student veterans were recruited as volunteers for the focus and experimental groups. Our findings were evaluated and compared against traditional psychological and PTSD scoring schemes.
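As a rough illustration of this kind of analysis (not the project's actual pipeline), the sketch below classifies fixed-length windows of physiological signals into "baseline" versus "elevated" response using simple summary features and a standard classifier. The channel names, window length, labels, and classifier choice are illustrative assumptions, and the data here is synthetic.

```python
# Minimal sketch, assuming windowed heart-rate and respiration signals with
# per-window labels; all values below are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Stand-in data: 200 windows x 2 channels (heart rate, respiration),
# each window 30 s sampled at 4 Hz -> 120 samples per channel.
n_windows, n_samples = 200, 120
heart_rate = rng.normal(75, 8, size=(n_windows, n_samples))
respiration = rng.normal(16, 3, size=(n_windows, n_samples))
labels = rng.integers(0, 2, size=n_windows)  # 0 = baseline, 1 = elevated

def window_features(signal):
    """Per-window summary statistics commonly used for biosignals."""
    return np.column_stack([
        signal.mean(axis=1),                      # mean level
        signal.std(axis=1),                       # variability
        signal.max(axis=1) - signal.min(axis=1),  # range
    ])

# Concatenate features from both channels into one feature matrix.
X = np.hstack([window_features(heart_rate), window_features(respiration)])

# Cross-validated accuracy as one quantitative metric of how well the
# physiological features separate the two response classes.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, labels, cv=5)
print(f"Mean CV accuracy: {scores.mean():.2f}")
```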

This project was funded by a Texas State University Multidisciplinary Internal Research Grant (MIRG). The research team included the following investigators:

PI: Vangelis Metsis, PhD, Assistant Professor - Computer Science, Director of the Intelligent Multimodal Computing and Sensing (IMICS) Lab
Co-PIs:
Kenneth Scott Smith, PhD, LCSW, Associate Professor - Social Work, Director of Virtual Reality and Technology Lab
Dan Tamir, PhD, Associate Professor - Computer Science
Katherine Selber, PhD, LMSW-AP, Professor - Social Work, Veteran Research Expert
Mark Trahan, PhD, LCSW, Assistant Professor - Social Work
Grayson Lawrence, Assistant Professor - Communication Design

I am also currently the PI and director of the NSF-funded Computer Science Research Experiences for Undergraduates (REU) Site on Smart and Connected Communities. This REU Site engages students in research in the emerging area of Smart & Connected Communities (S&CC), the essential building blocks of smart cities.

Snapshots of the VR environment during an experimental session. Three stages of the session are shown (left to right): beginning of the session in the parking lot of the grocery store, shopping process, checkout at register.