Welcome to the HAVIC Lab - led by Dr Nathan Caruana, Associate Professor of Cognitive Psychology at Flinders University.
🧠 Our Mission
We study how people understand and interact with others - both human and artificial. Our research explores the cognitive and neural processes that support social cognition and human-technology interaction. We aim to deliver knowledge and innovation that supports the wellbeing of all neurodiverse people.

🔍 Our Focus
Our lab investigates how bottom-up perceptual cues (like gaze and gestures) and top-down psychological factors (like beliefs and expectations) shape how we coordinate attention and share experiences with others. We are especially interested in joint attention - a foundational social skill that allows us to understand others' perspectives and learn through interaction. Because joint attention is hard to study in traditional lab settings, we develop virtual interfaces that simulate realistic, controlled social scenarios.

We also study how people perceive and interact with artificial agents - including virtual characters and robots - and how virtual reality and robotics can be deployed in applied settings to improve wellbeing and productivity. Our current work explores how design, expectations, and context shape trust, collaboration, learning and cognitive performance in both educational and manufacturing/industrial settings. Neurodiversity is a key research priority: we work with both autistic children and adults to improve social, educational and employment outcomes. The HAVIC Lab is a founding partner of the Flinders Autism Research Initiative.

🔬 Our Methods
👥 Our Stakeholders
📂 Our Projects
You can see a snapshot of our current research projects below.

🎧 Learn More
Please explore our website and reach out if you would like to join our lab, collaborate, or take part in our research. In the interview below, Dr Caruana speaks briefly with the Academy of Social Sciences in Australia about our lab's research program. You can hear more about our lab's work on our Presentations page.
Current Projects
ALIGN: Uses VR paradigms to examine how people optimally attend to, and use, multiple non-verbal gestures (eye gaze, head movement, hand pointing) to coordinate and cooperate with other humans or robots.
GAZE COMM: Investigates how the timing and sequence of eye contact influence perceptions of communicative intent in human and robotic agents, with the aim of informing the design of intuitive gaze-based communication systems.
SIGNAL: Investigates how people with and without hearing loss and cochlear implants use non-verbal gestures to facilitate communication.
VIBE: Explores the factors that shape social experience and behaviour in online virtual environments (e.g., avatar choice).
SOCIAL VR: Examines autistic young people's experiences of, and perspectives on, engaging in social interactions using virtual reality.
IN-ter-VR-U: Explores how VR can support neuro-inclusive hiring practices for autistic people.
VERT (Virtual Emotion Regulation Training): Explores the use of VR as a tool to help neurodivergent children regulate their emotions.
Aut-2-Label?: Investigates how autism labels and diagnosis disclosure influence the social evaluations and behaviour of non-autistic people towards autistic people.
Aut-2-WORK-Safe: Examines the psychosocial hazards that autistic people experience in the workplace and identifies supports to help young autistic people safely transition to new workplaces.
EAT: Explores the eating preferences, behaviours and experiences of autistic people.
CARV: Revises and validates autistic-traits measures in various languages, in collaboration with autistic people, to ensure they are accessible, safe and valid.
Invigi-BOT: Elucidates the interactive behaviours of social robots that positively and negatively impact our cognitive performance (attention/vigilance) when they observe us complete difficult or tedious tasks.
Ling-BOT: Investigates how social robot presence and interaction shape study engagement when learning a new language.
BookBOTs: Works with industry (Bookbot.com) to design and evaluate artificial agents that support reading engagement, literacy and wellbeing.
Outback Robots: Designs accessible, low-cost, 3D-printable robots and Digital Technology Curriculum resources to improve access to high-quality AI and STEM education in rural and remote primary schools.
ShAIring: Examines the factors that influence how people perceive, talk to and disclose information to AI agents.
The projects above represent our lab's four core conceptual areas of research, which are supported by four key methodology-technology domains. Some of these domains overlap in our research (e.g., where we use co-design approaches with social robotics; or implement experimental psychology approaches in VR experiments). The matrix below summarises these areas of research and approach domains - and shows where the above projects currently sit within this matrix.