
Welcome to EyeStory

A storytelling project by Deakin researchers exploring children's experiences of vision impairment and their treatment journey.

About the study

This research seeks to engage children with vision impairment, and their families, to share their experiences through digital storytelling and play in the home. Our first step is to pilot a journaling and activity game with a child (aged 5 to 8) and their family; this includes a physical toybox, which will be provided to the family.

To find out more, follow the link here for an extended description of the study, the activities involved, and the kind of family we are looking for.

You can also email us at eyestory@deakin.edu.au to find out more about the project.



About Us

We are an interdisciplinary team of researchers at Deakin from Health Sciences (Optometry), the Deakin Motion Lab (a creative research hub within the School of Communication and Creative Arts) and Computer Science.

You can email us to find out more about the project.

Contact us


We'd like to acknowledge the following people who have contributed their valuable talents to the EyeStory journey. Eric O from the Faculty of Health, Deakin University, developed the web application and worked tirelessly to integrate continually evolving requirements while maintaining the usability of the EyeStory system.
We also thank the talented Torre family for their creative work in the app. Thomas Torre voiced the puppet 'Gus' and, alongside Vivienne Torre, composed and performed the video soundtrack. Lienors Torre and Rose Woodcock developed the treehouse room interiors and backgrounds for the EyeStory application, and Rose also built and animated our sock puppet narrators and voiced the puppet 'Blinky'. Sharing in this sock puppet creativity, Sae Tosaki contributed the logo design.
For the in-app activities, we'd like to acknowledge Nenad Markuš for his face detection and pupil tracking models. Finally, we'd like to acknowledge seed funding from the Deakin MotionLab.