Her project is designed specifically for deaf and hearing-impaired people. By using 3D sensors and tracking technology, Natasha hopes to bring the gaming market to an audience she feels is currently underserved. We talked to Natasha about her plans for the project and why she felt Queen of Code was a good fit…
Tell us a bit about your company and how you got started in game development…
For as long as I can remember, I have loved playing games. As a child I remember skipping my English classes (not always, of course) to go and play arcade games in a local game shop, my favourites being Pac-Man and Frogger. I could be there for hours. From 2005 until 2013, I was the course leader for the BSc Computer Games Course at the University of Westminster, where I was involved in various game projects with charities and museums. After that, I became the Faculty’s B.U.G.S representative. You can read more about B.U.G.S by clicking here.
How would you describe your Queen of Code game?
I want people to be aware of the various roles games can play, beyond entertainment. There are excellent games developed by charities for disabled people, such as those from the SpecialEffect organisation, and yet not many people are aware of them. I want the Sign Language Game for Deaf People to help parents and children learn the sign language alphabet together, improve their communication and, of course, have loads of fun.
What made you want to develop a game to help the deaf community?
Hearing loss affects millions of families in the UK, and I personally believe that action needs to be taken to better support communication between parents and children, as well as teachers who teach at schools with deaf units.
What were your main inspirations?
The fact is that gesture recognition technology for sign language still lags behind its speech counterpart. As an active researcher in using low-cost 3D sensors for pervasive technologies, I am inspired to develop games around the gamification of sign language practice, an area that has not been well explored and is actively pursued by few.
What technology does your game use and how does this make it unique?
The current system is developed using the Kinect sensor and the OpenNI open-source framework, which allowed us to build the point-cloud surface reconstruction algorithm. For training the gestures we used probabilistic Hidden Markov Models (HMMs). What makes it unique is that most gesture-based games are single-player games with large-arm hand tracking movements, whereas this game will be for multiple players and will combine both large arm gestures and finger/hand tracking algorithms.
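The interview mentions HMM-trained gesture recognition without going into detail. As a purely illustrative sketch (the model parameters, symbol alphabet, and names `MODELS`, `forward_log_likelihood` and `classify` are my own assumptions, not Natasha's code), classifying a gesture with per-gesture discrete HMMs and the forward algorithm might look like this: each candidate gesture has its own HMM, and the observed sequence of quantised hand positions is assigned to the gesture whose model scores it highest.

```python
import math

# Toy, illustrative parameters: two pretend "gestures", each modelled by a
# 2-state discrete HMM over a quantised observation alphabet {0, 1}.
# A real system would learn these with Baum-Welch from recorded sign data.
MODELS = {
    "A": ([0.9, 0.1],                    # initial state probabilities
          [[0.7, 0.3], [0.3, 0.7]],      # state transition matrix
          [[0.9, 0.1], [0.2, 0.8]]),     # emission probabilities per state
    "B": ([0.1, 0.9],
          [[0.7, 0.3], [0.3, 0.7]],
          [[0.8, 0.2], [0.1, 0.9]]),
}


def forward_log_likelihood(obs, start, trans, emit):
    """Log P(obs | model) via the forward algorithm, with scaling."""
    n = len(start)
    alpha = [start[s] * emit[s][obs[0]] for s in range(n)]
    log_p = 0.0
    for o in obs[1:]:
        total = sum(alpha)              # rescale to avoid underflow
        log_p += math.log(total)
        alpha = [a / total for a in alpha]
        alpha = [sum(alpha[sp] * trans[sp][s] for sp in range(n)) * emit[s][o]
                 for s in range(n)]
    return log_p + math.log(sum(alpha))


def classify(obs, models):
    """Pick the gesture whose HMM assigns the sequence the highest likelihood."""
    return max(models, key=lambda g: forward_log_likelihood(obs, *models[g]))
```

With these toy models, a sequence dominated by symbol 0 is classified as gesture "A" and one dominated by symbol 1 as gesture "B"; scaling the forward variables at each step keeps long sequences from underflowing.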
What ambitions do you have for your game?
I would love to see it on the market and being used by many deaf children and parents. It could also serve as a proof of concept for charities devoted to addressing the needs of deaf people in the community (Deafness Research UK, Sign Health) and for the NHS.
How difficult is programming body movement to match answers?
It is quite difficult. Processing and representing data in real time from low-cost 3D sensors is a challenging task. That’s why only big players such as Microsoft, Intel and Leap Motion are genuinely interested in developing commercial products around gesture recognition technology.
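One reason real-time point-cloud processing is hard is the sheer volume of depth-sensor data per frame. A common first step (not necessarily the one used in Natasha's pipeline; the function name and voxel size here are illustrative assumptions) is voxel-grid downsampling: collapse all points falling in each small cube to their centroid before running heavier reconstruction or tracking algorithms.

```python
from collections import defaultdict


def voxel_downsample(points, voxel=0.05):
    """Replace all points in each cubic voxel (side `voxel` metres)
    by their centroid, shrinking the cloud while keeping its shape."""
    bins = defaultdict(list)
    for x, y, z in points:
        key = (int(x // voxel), int(y // voxel), int(z // voxel))
        bins[key].append((x, y, z))
    # One centroid per occupied voxel.
    return [tuple(sum(c) / len(pts) for c in zip(*pts))
            for pts in bins.values()]


# Example: two nearby points share a voxel, the third lands in its own.
cloud = [(0.01, 0.01, 0.01), (0.02, 0.02, 0.02), (1.0, 1.0, 1.0)]
reduced = voxel_downsample(cloud)
```

On a full Kinect frame this kind of reduction is what makes per-frame surface reconstruction feasible at interactive rates.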
What do you hope to achieve from the Queen of Code programme?
To have the support to create a vertical slice of the game as a proof of concept, which I can pitch to publishers and project funders to raise the funding needed to develop and release the full product.
Why do you think people should back your game?
There are currently no commercial products built around the gamification of sign language practice. Games can be more than fun, and this technology can serve a community that is currently under-represented.
Support Sign Language Recognition for People with Deafness by pledging on their #QueenofCode campaign: