To what extent can a simplified VR parenting simulation with core interaction design achieve the perception of naturalness for empathy-building?
A VR parenting prototype built with Unity XR and tested on the Oculus Quest 2, featuring hand tracking and immersive environments. The project explores the challenges of parenting through interactive scenarios, giving users an engaging experience that fosters empathy and understanding.
Sole Designer & Developer
6 months (Dec 2025 - May 2026)
VR prototype, design documentation, research findings and playtesting reports, analysis of user feedback and conclusions, and final presentation materials.
The project aimed to explore how virtual reality could be used to simulate the emotional and psychological aspects of parenting. This involved scoping the technical requirements, defining the user experience, and establishing the research questions.
The research focused on understanding the cognitive and emotional processes involved in parenting and how these could be translated into a virtual reality experience.
Built an interaction system to model parent-child interactions. The user interface was designed to support natural, intuitive interaction within the virtual environment.
The baby avatar was designed to be expressive and responsive to player actions, supporting an immersive and emotionally engaging experience. It was created in Blender using the MakeHuman add-on, with a full rig and facial expressions.
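A rigged face like this is typically made responsive at runtime by driving blendshape weights from gameplay state. The following is a minimal sketch only; the blendshape names ("Calm", "Cry") and the comfort metric are illustrative assumptions, not taken from the project:

```csharp
using UnityEngine;

// Hypothetical sketch: blend the avatar's face between crying and calm
// as a "comfort" value (e.g. raised by holding or rocking) changes.
public class BabyExpressionController : MonoBehaviour
{
    [SerializeField] private SkinnedMeshRenderer face;
    [Range(0f, 1f)] public float comfort = 0.5f;  // updated by interaction code

    private int calmIndex, cryIndex;

    void Start()
    {
        var mesh = face.sharedMesh;
        // Assumed blendshape names; a real rig exported from Blender
        // would use whatever names the MakeHuman rig provides.
        calmIndex = mesh.GetBlendShapeIndex("Calm");
        cryIndex  = mesh.GetBlendShapeIndex("Cry");
    }

    void Update()
    {
        // Blendshape weights in Unity run 0-100.
        face.SetBlendShapeWeight(calmIndex, comfort * 100f);
        face.SetBlendShapeWeight(cryIndex, (1f - comfort) * 100f);
    }
}
```

Driving both shapes from a single scalar keeps the expression continuous, so the face never snaps between states mid-interaction.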
Each room was designed around a specific context, incorporating elements intended to evoke particular feelings and memories.
Third-party assets from the Unity Asset Store were used and scaled for the virtual environment, calibrated to an average user height.
Designed intuitive interaction mechanics aligned with the narrative and emotional goals of each memory environment. The user is given options to engage with the environment in meaningful ways and to consider the ethical implications of their choices.
Each space is supported by a carefully crafted soundscape that enhances its immersive quality.
Conducted 10 playtesting sessions with a mix of participants: newcomers to VR and experienced VR users, as well as parents and caregivers and people without caregiving experience. The feedback collected was analysed against the naturalistic design principles.
The research focused on quantitative and qualitative data collection to inform design decisions.
Custom interaction framework built on Unity XR Interaction Toolkit. Implemented grab, push, and gesture-based interactions with haptic feedback calibrated for Quest 2 controllers.
Tools: C#, XR Toolkit, Oculus Integration SDK
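One way such a framework hooks haptics into grabs is via the XR Interaction Toolkit's interaction events. A hedged sketch, assuming XRI 2.x-style APIs; the amplitude and duration values are placeholders, not the project's calibrated Quest 2 settings:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sends a short haptic pulse to the grabbing controller when a grab begins.
[RequireComponent(typeof(XRGrabInteractable))]
public class GrabHaptics : MonoBehaviour
{
    [Range(0f, 1f)] public float amplitude = 0.4f;  // placeholder value
    public float duration = 0.1f;                   // seconds, placeholder

    void Awake()
    {
        var interactable = GetComponent<XRGrabInteractable>();
        interactable.selectEntered.AddListener(OnGrab);
    }

    void OnGrab(SelectEnterEventArgs args)
    {
        // Only controller-based interactors can receive haptic impulses;
        // hand-tracking interactors are skipped by this cast.
        if (args.interactorObject is XRBaseControllerInteractor controller)
            controller.SendHapticImpulse(amplitude, duration);
    }
}
```

Attaching the component per-interactable lets each object (bottle, toy, blanket) carry its own pulse profile rather than one global setting.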
Maintained steady 72fps on Quest 2 through aggressive LOD systems, occlusion culling, texture atlasing, and baked lighting. Draw calls kept under 120 per frame.
Techniques: URP optimization, mesh combining, shader LOD
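An aggressive LOD setup is usually configured per prop via Unity's LODGroup. The sketch below is illustrative only; the screen-height thresholds are assumptions, not the tuned values used to hold 72fps:

```csharp
using UnityEngine;

// Hypothetical example: build an LODGroup in code so a prop swaps to a
// cheaper mesh at distance and culls entirely when very small on screen.
public class PropLodSetup : MonoBehaviour
{
    public Renderer highDetail;
    public Renderer lowDetail;

    void Start()
    {
        var lodGroup = gameObject.AddComponent<LODGroup>();
        var lods = new LOD[]
        {
            // High-detail mesh while the prop covers >25% of screen height.
            new LOD(0.25f, new[] { highDetail }),
            // Low-detail mesh down to 2% of screen height, then culled.
            new LOD(0.02f, new[] { lowDetail }),
        };
        lodGroup.SetLODs(lods);
        lodGroup.RecalculateBounds();
    }
}
```

Every LOD transition that replaces a unique mesh with a shared low-poly one also helps the draw-call budget, since the cheaper meshes batch more readily after mesh combining and texture atlasing.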
Built dissolve shader using noise textures for memory fragmentation effect. Created scriptable object-based room configuration system for easy environment authoring.
Tools: Shader Graph, ScriptableObjects, Unity Events
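A ScriptableObject-based room configuration typically bundles each environment's data into one authorable asset. A minimal sketch, assuming fields like these; the actual asset layout in the project may differ:

```csharp
using UnityEngine;

// Hypothetical room configuration asset: designers create one per memory
// environment via the Create menu, and a loader spawns the prefab, plays
// the ambience, and feeds the dissolve value to the fragmentation shader.
[CreateAssetMenu(menuName = "VRParenting/RoomConfig")]
public class RoomConfig : ScriptableObject
{
    public string roomName;
    public GameObject environmentPrefab;   // room geometry to instantiate
    public AudioClip ambientSoundscape;    // per-room ambience track
    [Range(0f, 1f)]
    public float dissolveAmount;           // memory-fragmentation intensity
}
```

Keeping this data in assets rather than scene objects means new rooms can be authored and iterated without touching code or duplicating scenes.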
The VR Parenting Simulator successfully created an immersive experience that fostered empathy and understanding of the challenges of parenting. Playtesters reported a strong emotional connection to the virtual child and found the interactions to be intuitive and engaging. The project demonstrated the potential of VR as a tool for experiential learning and emotional engagement, providing valuable insights for future developments in this space.
At a glance: 10 playtesters · 6 months of development · custom VR interactions