Updated 29/01/21
Intrepid Intents (accessible VR)
In April 2020 I won 1st prize in the "Accessibility VR/AR Game Jam 2020" with a game prototype called Intrepid Intents. My project focused on taking actions we take for granted in VR and making them more accessible. In this article I'll share my ideas, the progress made so far, and my hopes for future developments.
A game for everyone
I want players to take on the role of an Intrepid explorer on a dangerous and varied journey that incorporates many of the experiences and interactions that VR excels at. I want to make that journey accessible to the broadest range of players by understanding their Intents and assisting them in ways that make them feel empowered.
Accessibility features
- Play standing or seated.
- Move via teleport or analogue controls.
- Use two controllers, one controller, or gaze-based interactions instead of controllers.
- Button presses are not required, but they can be used to speed up selections.
- All buttons invoke the same action, so use whichever you like.
- Minimise the need for large hand and head movements.
- Allow complex interactions via assisted techniques.
- Optionally use the mic to detect sound and use it as a button press. [planned]
Interaction demonstration
Satisfying interactions
Interacting with the world in VR is immersive and rewarding. When planning accessibility features we don't want to bypass or oversimplify these interactions and diminish the experience. Instead we want to help in ways that still give a sense of control and connection with the world.
Interaction points
When you are within a few meters of an object (and broadly looking towards it), an icon appears above it to identify the interaction type.
To initiate the interaction, you point at the icon (using a controller or your gaze) and hold that position until a progress bar completes around the icon. You can speed up the selection by pressing a button on the controller. Eventually, you'll also be able to speak instead of using a button press.
The pointer uses a large cursor to make it easy to hover over icons with very little precision.
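As a rough illustration of the dwell-to-select timing described above (this is a sketch, not code from the actual project; the dwell duration and button speed-up factor are assumed values):

```cpp
#include <algorithm>

// Illustrative sketch of dwell-based selection: hovering over an icon
// steadily fills a progress ring, and holding any button fills it faster.
// The timing constants below are assumptions for illustration only.
struct DwellSelector {
    float progress = 0.0f;       // 0..1, drawn as the ring around the icon
    float dwellTime = 1.5f;      // seconds of steady hover to complete
    float buttonSpeedup = 4.0f;  // holding any button fills the ring faster

    // Call once per frame. Returns true when the selection completes.
    bool Update(bool hovering, bool buttonHeld, float dt) {
        if (!hovering) {
            progress = 0.0f;     // looking away resets the ring
            return false;
        }
        float rate = (buttonHeld ? buttonSpeedup : 1.0f) / dwellTime;
        progress = std::min(1.0f, progress + rate * dt);
        return progress >= 1.0f;
    }
};
```

Because every button maps to the same "speed up" action, this kind of structure supports the no-button, one-button and any-button play styles with a single code path.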
Interaction examples
Remote grab - pick up an object from a few meters away by pointing at it. No button press is required. You don't have to hold the grip button to keep the object in hand, as that can lead to fatigue. The object follows your hand in a smoothed movement, so it won't emphasise shaking hands. It attaches to the hand that picked it up, or to your chest if you are interacting using your gaze.
Remote drop - point at the placeholder in the object's original location to put it back down. Keeping a consistent location for objects helps people keep track of them more easily and feel more familiar with the area.
Remote smash - when a suitable object (such as a hammer) is held, you can interact with a breakable object (such as a crate) by pointing at it. The hammer then moves over the crate, and you can move it on a 2D plane that intersects the crate. With minimal hand movement you can wield the hammer and smash the crate. The hammer returns when a countdown expires or when the crate is smashed.
Remote lever operation - point at a lever and you can grab it from a few meters away. Once held, you can precisely position the lever. You automatically release the lever when a countdown ends or when you move the lever to its right or left extremity. Again, this requires no button press and only small hand movements to control the lever.
Remote buttons - buttons can be activated in the same way as a remote grab. A jewel moves from your hand to the button, pushes it in and then returns to your hand. This simple interaction maintains the principle that you are remote pressing the button using a physical action, rather than it just feeling like a mouse click on a computer user interface.
These interactions should provide a satisfying sense of control with very limited physical movement.
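The smoothed follow used by the remote grab can be thought of as exponential smoothing towards the hand position. A minimal sketch, assuming a simple frame-rate-independent blend (the smoothing constant is an illustrative assumption, not a value from the game):

```cpp
#include <cmath>

// Minimal vector type for the sketch.
struct Vec3 { float x, y, z; };

// Move `object` a fraction of the remaining distance towards `hand` each
// frame. Small, rapid hand tremors are damped rather than reproduced,
// which is why the grabbed object doesn't emphasise shaking hands.
Vec3 SmoothFollow(Vec3 object, Vec3 hand, float smoothing, float dt) {
    // Frame-rate independent blend factor: higher `smoothing` tracks faster.
    float t = 1.0f - std::exp(-smoothing * dt);
    return { object.x + (hand.x - object.x) * t,
             object.y + (hand.y - object.y) * t,
             object.z + (hand.z - object.z) * t };
}
```

The exponential form means large deliberate movements are tracked closely while high-frequency jitter is attenuated, without needing a separate filter.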
Future features
I want to provide a wide range of scenarios that evoke reactions in VR, including moving along high ledges, crawling through tunnels, swimming underwater and riding on a zip wire. I believe these can all be achieved in accessible ways using this icon-based approach.
I'd like to explore slow-motion "quick-time" events where interaction icons or grabbed levers allow for dodging and evading dangers: a low-stress, more thoughtful approach to evasion.
Where gaze tracking is being used I'd like to have a setting to exaggerate the distance the icon moves relative to head movement to reduce neck strain.
We should detect when a player is stuck and provide gentle cues to help them proceed. They should also have the option to call on more direct instructions if required.
Progress should be regularly and automatically saved.
One of my other projects, Breath Tech on the Oculus Rift, used the microphone in the VR headset to project your breath into VR. I can draw on this technology to make sound or breath an additional input mechanism in the game.
I also considered accessibility in my Jigsaw 360 game. This game requires careful positioning of 3D puzzle pieces around a globe. A snapping strength could be configured to gently rotate and pull a piece into position as you get close, which helped players with less steady hands. An Oculus Go version of the game also allowed the 3D puzzles to be completed simply using a pointer. I may draw on these features to allow artefacts to be constructed in the game.
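A minimal sketch of that kind of snap assist, shown in one dimension for clarity (the radius and strength parameters are illustrative assumptions, not values from Jigsaw 360):

```cpp
#include <cmath>

// When a piece is within `snapRadius` of its slot, pull it a configurable
// fraction of the remaining distance towards the slot. A strength of 0
// disables the assist; 1 snaps the piece fully into place.
float SnapToward(float piecePos, float targetPos, float snapRadius,
                 float snapStrength) {
    float dist = std::fabs(targetPos - piecePos);
    if (dist > snapRadius || snapStrength <= 0.0f) {
        return piecePos;  // out of range, or assist turned off
    }
    return piecePos + (targetPos - piecePos) * snapStrength;
}
```

Exposing the strength as a player-facing setting lets each player choose how much help they want, rather than the game deciding for them.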
Locomotion
You can use analogue controls to turn and move or teleport instead.
To teleport, you can activate icons above teleport points in the same way as other interactions in the game, using your pointer or gaze. The teleport points are carefully placed so that you are within range of remote grabs, but not so close that you need to look down too steeply to interact. Levels and teleport points need to be designed to require minimum physical strain and head movement.
Goal of the game
The concept is to simply find the missing statues and place them in their original locations (identified by placeholders) to activate the portal to the next level. The final level will reward the player with the treasure they seek. The tech demo currently includes the tutorial levels that were created as part of the 48 hour game jam. You can watch a full walkthrough video on the Intrepid Intents download page.
The game concept should remain purposefully simple to reduce any confusion or frustration, but within that idea are endless possibilities that focus on different ways to navigate around and interact with the environment.
This would be a gentle, relaxing and thoughtful game, with a few short / exciting events thrown in.
Game Style
The game currently uses Unreal Engine and Quixel Megascans for detailed models and textures. This provides detailed environments and gave me assets I could work with quickly during the game jam. The levels should feel like old "otherworldly" ruins with a hint of an Indiana Jones vibe. I'm open to exploring other styles.
Benefits
While the scope of my demo currently focuses on physical limitations, I would like to make the accessibility features as broad as possible, including consideration for cognitive, sight and hearing issues.
The goal would be to make the game a showcase for how accessible VR games can be.
I would hope that the project would inspire more developers to consider accessibility issues in their own games and provide an example of potential mechanisms they could use.
I have talked to people who have disabilities and understand how excited they feel about the potential of VR, but they can find games difficult to control or too tiring to play for long periods. I'd like to make sure everyone has the chance to experience the moments in VR that have excited me over the years.
The first VR game I ever created, Dimensional, was all about room-scale, physical movement and unique experiences. I'm now trying to re-imagine accessible ways to deliver the same type of moments and feelings in VR.
A marketing conundrum
My game jam entry drew little interest on itch.io. I believe that one of the reasons is the focus on accessibility in the game description. Being branded in this way could give people the false impression that the game has limited gameplay mechanisms or lacks general appeal.
My plan for the game would be to allow varied interaction methods. You can walk over to the hammer, bend down and pick it up, walk over to the crate and smash it with a swing of your hammer. Or you can use the accessible mechanisms mentioned above.
I want to ensure that the people who would benefit the most from this experience will find it. But, for profitability, I also need to ensure that the broader market does not ignore the title because they think it's not intended for them.
Funding / sponsorship
Due to having such a strong focus on accessibility, this may not be a profitable venture, but it would be a worthy one. As such, I would welcome support in the form of funding or sponsorship, as well as advice and assistance with testing and feedback.
If you would like to help, please contact me.