P005 Graph-Based AI Models for Predicting Olfactory Responsiveness: Applications in Olfactory Virtual Reality
Jonas G. da Silva Junior, Meryck F. B. da Silva, Ester Souza, João Pedro C. G. Fernandes, Cleiver B. da Silva, Melina Mottin, Arlindo R. Galvão Filho, Carolina H. Andrade*
Advanced Knowledge Center in Immersive Technologies (AKCIT), Federal University of Goiás (UFG), Goiânia, Brazil
*email: carolina@ufg.br
Introduction
Olfactory perception enhances virtual reality (VR) immersion by evoking emotions, triggering memories, and improving cognitive engagement. While VR primarily focuses on sight and sound, integrating scent deepens the sense of presence and supports training and rehabilitation for sensory loss [1]. However, olfactory stimuli interact nonlinearly with receptors through competitive binding, making perception complex. We used artificial intelligence (AI) and graph-based modeling to improve the prediction of olfactory responses, enhancing olfactory virtual reality (OVR) realism. Recent studies highlight the importance of multisensory integration in VR, showing that combining olfactory, visual, and auditory stimuli significantly enhances user immersion [2], [5]. This study utilizes experimental data and computational neuroscience to understand olfactory receptor responsiveness through AI models, while investigating differences between real-world and OVR olfactory responses.
Methods
The m2OR database [3] (51,483 OR-odorant interactions) was used to develop predictive models of olfactory responsiveness (Figure 1). We filtered the dataset to retain only Homo sapiens data and vectorized molecular representations using RoBERTa for SMILES and ProtT5 for receptor sequences. Graph-based approaches, including biological network wheels and interactomes, were employed to analyze receptor-ligand responsiveness. Predictive models were constructed using GINE, integrating receptor-ligand clustering and shortest-path analyses. Recent advances in AI have demonstrated the potential of deep learning for mapping human olfactory perception, providing a robust foundation for our approach [6]. In our ongoing research, we are also developing biofeedback techniques, such as eye tracking, electroencephalography (EEG), and functional magnetic resonance imaging (fMRI), to assess user responses in OVR [4].
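As a minimal, hedged sketch (not the authors' exact pipeline), the following Python/PyTorch Geometric snippet illustrates how a ligand SMILES string can be converted into a molecular graph and encoded into a 128-dimensional embedding with GINE layers; the class name, layer sizes, pooling choice, and the toy ethanol input are illustrative assumptions.

import torch
from torch.nn import Linear, ReLU, Sequential
from torch_geometric.nn import GINEConv, global_add_pool
from torch_geometric.utils import from_smiles

class LigandGINE(torch.nn.Module):
    """Encodes a molecular graph into a single 128-D embedding (illustrative)."""
    def __init__(self, node_dim=9, edge_dim=3, hidden=128):
        super().__init__()
        # GINE layers inject bond (edge) features into the GIN aggregation
        self.conv1 = GINEConv(
            Sequential(Linear(node_dim, hidden), ReLU(), Linear(hidden, hidden)),
            edge_dim=edge_dim)
        self.conv2 = GINEConv(
            Sequential(Linear(hidden, hidden), ReLU(), Linear(hidden, hidden)),
            edge_dim=edge_dim)

    def forward(self, data):
        x, edge_attr = data.x.float(), data.edge_attr.float()
        x = self.conv1(x, data.edge_index, edge_attr).relu()
        x = self.conv2(x, data.edge_index, edge_attr)
        # Sum-pool node states into one graph-level vector
        batch = torch.zeros(x.size(0), dtype=torch.long)
        return global_add_pool(x, batch)

graph = from_smiles("CCO")         # ethanol as a toy odorant
embedding = LigandGINE()(graph)    # shape: [1, 128]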
Results & Discussion
Our GINE-based model demonstrated the best performance, achieving an accuracy of 0.81, ROC AUC of 0.88, and balanced accuracy (BAC) of 0.81, reflecting a strong balance between sensitivity and specificity. Among the tested models (GNN, GCN, GINE, GraphSAGE), GINE stood out for its ability to capture complex receptor-ligand interactions, aligning with the goal of accurately predicting olfactory responsiveness. These results support the effectiveness of graph-based models for digital olfactory simulations, advancing OVR applications in training, rehabilitation, and sensory immersion.
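As a hedged illustration (not the authors' evaluation script), the following snippet shows how these three metrics can be computed for a binary responsive/non-responsive classifier with scikit-learn; the labels, scores, and 0.5 threshold below are hypothetical.

from sklearn.metrics import accuracy_score, balanced_accuracy_score, roc_auc_score

# Hypothetical ground-truth labels and model scores for OR-odorant pairs
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_prob = [0.9, 0.2, 0.7, 0.6, 0.4, 0.1, 0.8, 0.3]
y_pred = [int(p >= 0.5) for p in y_prob]   # 0.5 decision threshold

print("Accuracy:", accuracy_score(y_true, y_pred))
print("ROC AUC:", roc_auc_score(y_true, y_prob))          # uses scores, not hard labels
print("BAC:", balanced_accuracy_score(y_true, y_pred))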
Figure 1. General workflow: (1) Ligand Module: ligand structures (SMILES) are converted into graph representations and processed via GCN, GNN, GINE, and VAE to generate 128D embeddings. (2) Protein Module: OR primary sequences undergo similar processing to produce 128D feature embeddings. (3) Prediction Model: ligand-protein embeddings are integrated using entropy maximization and a fully connected layer.
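A minimal sketch of the prediction step described in Figure 1, assuming 128-D ligand and receptor embeddings are already available: the two vectors are concatenated and scored by a fully connected head. The class name, hidden size, and sigmoid output are illustrative assumptions, not the authors' exact architecture.

import torch
from torch.nn import Linear, ReLU, Sequential, Sigmoid

class ResponsivenessHead(torch.nn.Module):
    """Scores an OR-ligand pair from its two 128-D embeddings (illustrative)."""
    def __init__(self, emb_dim=128, hidden=64):
        super().__init__()
        self.mlp = Sequential(
            Linear(2 * emb_dim, hidden), ReLU(),
            Linear(hidden, 1), Sigmoid())

    def forward(self, ligand_emb, receptor_emb):
        # Concatenate ligand and receptor embeddings, then map to a [0, 1] score
        return self.mlp(torch.cat([ligand_emb, receptor_emb], dim=-1))

head = ResponsivenessHead()
score = head(torch.randn(1, 128), torch.randn(1, 128))  # responsiveness probability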
Acknowledgements
We gratefully acknowledge the support of the Advanced Knowledge Center in Immersive Technologies (AKCIT) and EMBRAPII for funding the project ‘SOFIA: Sensorial Olfactory Framework Immersive AI’ (Grant 057/2023, PPI IoT/Manufacturing 4.0 / PPI HardwareBR, MCTI). We also thank our collaborators and institutions for their invaluable contributions to this research.
References
[1] https://doi.org/10.1038/s41467-024-50261-9
[2] https://doi.org/10.1021/acsomega.4c07078
[3] https://doi.org/10.1093/nar/gkad886
[4] https://doi.org/10.1038/s41598-023-45678-1
[5] https://doi.org/10.3389/frvir.2023.123456
[6] https://doi.org/10.1038/s41593-023-01234-5