About a month has passed since I finished the “Unity Essentials Pathway” and began the “VR Development Pathway,” and I must say that I’m eager to complete it and start making virtual reality games!
It has been an exciting journey, and I've learned a lot. It's not only my first interaction with virtual reality but also my first time working in 3D with Unity. Being able to touch and "feel" objects within virtual reality is magical. Picking up a cube and throwing it… pure joy. Grabbing a hat and hanging it on a rack… speechless. And the best part is the multitude of ideas that come to mind to create and try.
I still find it very tough to become a "generalist" (someone who does everything: development, sound, visuals, art…) because of the breadth of knowledge required, but for now, these are some aspects of the pathway, beyond development itself, that have caught my attention:
Haptic Sensations
In virtual reality, we must try to convey as much as possible to the player, not only through sight but through three senses in total (taste and smell are off the table, at least for now). The first of these is touch, and we can approximate it thanks to the vibration of the controllers. The trick is to trigger that vibration at the right moment, with the appropriate duration and strength.
For instance, if we want the user to feel when their hand (or raycast) passes over an object that can be manipulated, we can do more than change the raycast color: we can add a brief vibration, signaling that the object is interactable. When they grab it, we add a slightly stronger, longer vibration, simulating the feel of actually picking the object up.
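As a rough sketch of how this might look in code (assuming XR Interaction Toolkit 2.x, where the controller component exposes SendHapticImpulse; the amplitudes and durations are just starting values to tune):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: attach alongside an XRGrabInteractable. A subtle pulse plays on
// hover, a stronger and longer one on grab. Assumes XRI 2.x and that the
// XRBaseController sits on the same GameObject as the interactor.
public class HapticCues : MonoBehaviour
{
    [SerializeField] float hoverAmplitude = 0.1f;
    [SerializeField] float hoverDuration = 0.05f;
    [SerializeField] float grabAmplitude = 0.5f;
    [SerializeField] float grabDuration = 0.2f;

    XRBaseInteractable interactable;

    void OnEnable()
    {
        interactable = GetComponent<XRBaseInteractable>();
        interactable.hoverEntered.AddListener(OnHoverEntered);
        interactable.selectEntered.AddListener(OnSelectEntered);
    }

    void OnDisable()
    {
        interactable.hoverEntered.RemoveListener(OnHoverEntered);
        interactable.selectEntered.RemoveListener(OnSelectEntered);
    }

    void OnHoverEntered(HoverEnterEventArgs args) =>
        Pulse(args.interactorObject, hoverAmplitude, hoverDuration);

    void OnSelectEntered(SelectEnterEventArgs args) =>
        Pulse(args.interactorObject, grabAmplitude, grabDuration);

    void Pulse(IXRInteractor interactor, float amplitude, float duration)
    {
        // The interactor lives on the controller; grab its controller component.
        if (interactor.transform.TryGetComponent(out XRBaseController controller))
            controller.SendHapticImpulse(amplitude, duration);
    }
}
```

The key design point is the contrast: the hover pulse should be barely noticeable, while the grab pulse should clearly say "you have it in your hand."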
3D Sound
Until recently, I had only made 2D games with Unity, and my 3D experiments had been minimal, never delving into sound. Unity can perform incredible sound processing, and one amazing option is 3D sound. It is essential in any 3D game, and even more so in virtual reality, where immersion is everything.
Another example: imagine a room full of furniture with a radio playing on a table in a corner. Thanks to 3D sound, we hear it coming from the direction it is emitted, and if we turn our head, we hear it more in one ear than the other. Beyond that, we can play with distance and volume, making the sound more realistic: louder as we approach the radio, quieter as we move away.
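In code, this boils down to a handful of AudioSource properties. A minimal sketch (the "radioClip" field is just a hypothetical clip assigned in the Inspector):

```csharp
using UnityEngine;

// Minimal sketch: turn an AudioSource into a fully 3D sound, like the
// radio in the corner of the room.
public class RadioSpeaker : MonoBehaviour
{
    [SerializeField] AudioClip radioClip;

    void Start()
    {
        var source = gameObject.AddComponent<AudioSource>();
        source.clip = radioClip;
        source.loop = true;
        source.spatialBlend = 1f;                          // 1 = fully 3D, 0 = flat 2D
        source.rolloffMode = AudioRolloffMode.Logarithmic; // natural attenuation curve
        source.minDistance = 1f;                           // full volume within 1 m
        source.maxDistance = 15f;                          // attenuation levels off past 15 m
        source.Play();
    }
}
```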
There are also components called reverb zones to use alongside 3D sound. They let us define areas where different sounds blend or even cut off at a specific point. For example, if we're in a forest and there's a cave, we can make the forest sound cut off the moment we enter the cave, or blend with the cave's own sounds, such as the crackling of a fire.
There are countless ways to exploit all of this. To give one more example, going back to the room from earlier, we can indicate the type of space: we can make it sound like a normal room or simulate an auditorium, creating an echo effect. Unity ships with these presets, and they are very useful.
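In Unity these are AudioReverbZone components with ready-made presets. A sketch of the cave example (the distances here are arbitrary):

```csharp
using UnityEngine;

// Sketch: place this on an empty GameObject at the cave entrance. Inside
// minDistance the listener gets the full Cave reverb; between minDistance
// and maxDistance the effect gradually fades out.
public class CaveReverb : MonoBehaviour
{
    void Start()
    {
        var zone = gameObject.AddComponent<AudioReverbZone>();
        zone.reverbPreset = AudioReverbPreset.Cave; // try Auditorium for the echo effect
        zone.minDistance = 5f;   // full effect within 5 m
        zone.maxDistance = 12f;  // fades to nothing by 12 m
    }
}
```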
Visual Aspect
Haptic and auditory aspects are significant sensory parts in VR, but the visual aspect remains king.
Without a convincing environment, the sense of presence is lost, and a thousand things can ruin the visual experience. Optimization, for instance, is critical: in an unoptimized project the frame rate drops, and the resulting lag can make the user dizzy. The same goes for the game's movement type.
Lighting, post-processing, and anti-aliasing to keep edges as smooth as possible are also critical for achieving a fluid, realistic feel.
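As a tiny example of the kind of knob involved (assuming the built-in render pipeline; URP and HDRP expose MSAA on their pipeline assets instead):

```csharp
using UnityEngine;

// Tiny sketch: enable 4x MSAA, which smooths geometry edges and makes a
// noticeable difference inside a headset. Built-in render pipeline only.
public class VisualQuality : MonoBehaviour
{
    void Start()
    {
        QualitySettings.antiAliasing = 4; // valid values: 0, 2, 4, 8
    }
}
```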
It's not just about placing objects well or having impressive graphics. Many VR headsets are standalone, not connected to a PC or console, so their performance is more limited. Optimization is crucial.
Accessibility
A factor that many developers overlook. Coming from web development, I've worked on plenty of web pages, applications, and online stores that applied few or no accessibility techniques.
When I first saw the concept applied to games, and especially to virtual reality, I was surprised, but it quickly made a lot of sense. Let's look at some examples. Imagine a person who tends to get dizzy in virtual reality when using snap turns; we must give them the option to switch to another movement type, such as continuous turning.
Now imagine you've created a game where players must get physically close to objects to interact with them (as if grabbing them for real) instead of using the typical VR rays. What happens if you place an object on a high shelf and a person in a wheelchair plays your game? We must provide options for interacting with the environment, such as changing how objects are acquired.
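Both of these options can boil down to enabling one component and disabling another. A hypothetical settings script, assuming the XR Interaction Toolkit 2.x providers and interactors are already on the rig and wired into these fields in the Inspector:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Hypothetical accessibility/comfort menu backend: each public method is
// meant to be hooked up to a toggle in a settings UI.
public class ComfortOptions : MonoBehaviour
{
    [SerializeField] ActionBasedSnapTurnProvider snapTurn;
    [SerializeField] ActionBasedContinuousTurnProvider continuousTurn;
    [SerializeField] XRRayInteractor rayInteractor;
    [SerializeField] XRDirectInteractor directInteractor;

    // Snap vs. continuous turning, for players who get dizzy.
    public void SetContinuousTurn(bool continuous)
    {
        snapTurn.enabled = !continuous;
        continuousTurn.enabled = continuous;
    }

    // Ray grabbing vs. direct grabbing, for players who can't reach.
    public void SetRayGrab(bool useRay)
    {
        rayInteractor.enabled = useRay;
        directInteractor.enabled = !useRay;
    }
}
```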
Or imagine that your game relies on sound to convey the story or specific instructions. If your player is deaf, you must assist them with captions showing the dialogue, or simply a description of the sound that is playing. We should provide options for everything we can think of. What if your player has only one hand? If your game can only be fully played with two controllers, that person won't be able to enjoy it; wherever possible, it's better to make every action achievable with a single controller.
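A caption system can start out very simple. A hypothetical sketch using Unity's legacy UI Text on a world-space canvas that follows the player's view:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.UI;

// Hypothetical caption helper: whenever an important sound plays, show a
// line of text for the clip's duration.
public class CaptionedSound : MonoBehaviour
{
    [SerializeField] AudioSource source;
    [SerializeField] Text captionLabel; // a UI Text on a world-space canvas

    public void Play(AudioClip clip, string caption)
    {
        source.PlayOneShot(clip);
        StopAllCoroutines(); // replace any caption still showing
        StartCoroutine(ShowCaption(caption, clip.length));
    }

    IEnumerator ShowCaption(string text, float seconds)
    {
        captionLabel.text = text;
        yield return new WaitForSeconds(seconds);
        captionLabel.text = string.Empty;
    }
}
```

Called like `captions.Play(fireClip, "[fire crackling]")`, so the text carries the same information as the audio.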
Of course, this depends on the game, the audience, and many other factors. But as much as possible, we should think about adapting and providing options so that our game is accessible to as many people as possible.
I hope to balance this with the development of OhMyShape! and everything else, and finish the “VR Development Pathway” before September. To obtain accreditation, I must complete a project as a final assignment. My idea is to develop a short but well-made game or experience that serves not only to obtain the badge but as a piece for my portfolio. So I might soon make a Devlog about this project.
See you next time!