A simple tool that lets you place, edit and spatialize sound in VR
Posted on January 19, 2018 by Adam Amaral
We were asked by Intel to create a useful tool for VR devs that leverages the power of Intel CPUs. Unreal Engine has a powerful virtual reality editor option, but one thing it does not include is the ability to edit and place sounds while inside VR. It can be troublesome to constantly restart the editor after adjusting a sound just to test what it sounds like in VR. So we decided to create a sound editor that lets game devs and sound designers alike quickly place, edit, and test spatialized sound inside VR, without having to constantly enter and exit the editor.
What you’ll need:
-Unreal Engine 4.18.1 or greater
-Visual Studio 2017
What you’ll learn:
-Motion Controller interaction
-How to create a custom C++ class
-Saving editor changes
-Sound spatialization parameters
Below is a step by step tutorial outlining the details of how we made this from start to finish:
After you have downloaded and unzipped the project folder, you will have to do a couple of things to get started. We’re assuming you have at least version 4.18.1 of Unreal Engine installed.
First, right click on Intel_VR_Audio_Tools.uproject and select Generate Visual Studio project files. After that completes, open the project. A popup that says “Missing Intel_VR_Audio_Tools Modules” will appear. Click Yes to start the rebuild; this should take less than 20 seconds. This is needed because of how we are dynamically finding .wav files that have been added to the project, which will be explained in the Custom C++ Class section.
Setting up VR player:
We started with Unreal’s Virtual Reality template and chose the MotionControllerPawn as our pawn, which has motion controllers set up and allows movement by teleporting.
Motion Controller Interaction:
Before the motion controller can interact with 3D widgets, a Widget Interaction component needs to be added to BP_MotionController, which is located in the VirtualRealityBP folder. We also added a scene component for the sound selector widget, called soundScene.
Press and Release Pointer Key events were attached to the event called when the right trigger is pulled. This was added to the MotionControllerPawn, also located in the VirtualRealityBP folder.
Custom C++ Class:
The reason you had to rebuild the project is that, early in the making of this tutorial, the problem of knowing the names and locations of the sounds and dynamically updating a widget to match all those files sounded daunting. Luckily, Unreal Engine has some tools to help us out.
The IntelSoundComponent is a C++ class that can be added to any blueprint as an easy way to dynamically locate and load a .wav file into a USoundWave, which is how Unreal loads a sound file.
First, we had to right click in the content browser and create a new C++ class which we named IntelSoundComponent. This action created an IntelSoundComponent.cpp file and an IntelSoundComponent.h file.
Next, we added some includes which are needed to locate and manage files.
Includes added in IntelSoundComponent.cpp are Paths.h, FileManager.h, and Runtime/Engine/Classes/Sound/SoundWave.h (which, for some reason, needed the full path before SoundWave.h).
We began by creating a bool named exists, two FString variables named dir and soundDir, and a TArray of FStrings named soundFiles. Since soundFiles is a TArray, we can call soundFiles.Empty(), which empties the array; we believe that is also the fastest approach if new .wav files are added. Then we set dir to FPaths::ProjectDir(), which gives the root location of the project. Next we set soundDir to dir + “Content/Sounds”, because that is the folder we are putting our .wav files into. FPaths has another method that can check whether a directory exists, so we set our bool to that: exists = FPaths::DirectoryExists(soundDir);
On BeginPlay we start by grabbing the IFileManager singleton with IFileManager &fileManager = IFileManager::Get();. This was done to debug and test whether the files were being found with fileManager.FindFiles, which now searches for .uassets instead of the .wav files we were using before, since .uassets are more reliable when sharing projects.
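To make the idea concrete outside the engine, here is a standalone sketch of what that BeginPlay scan does, with std::filesystem standing in for FPaths and IFileManager (the names are ours, not the tutorial’s exact code):

```cpp
// Standalone sketch of the BeginPlay directory scan, using std::filesystem
// in place of Unreal's FPaths / IFileManager so it can run outside the engine.
#include <cassert>
#include <filesystem>
#include <fstream>
#include <string>
#include <vector>

namespace fs = std::filesystem;

// Mirrors: soundFiles.Empty(); exists = FPaths::DirectoryExists(soundDir);
// then fileManager.FindFiles(...) filtered to the asset extension.
std::vector<std::string> FindSoundAssets(const fs::path& soundDir)
{
    std::vector<std::string> soundFiles;           // emptied on every scan
    if (!fs::exists(soundDir)) return soundFiles;  // FPaths::DirectoryExists
    for (const auto& entry : fs::directory_iterator(soundDir))
        if (entry.path().extension() == ".uasset") // FindFiles(".uasset")
            soundFiles.push_back(entry.path().filename().string());
    return soundFiles;
}
```

Rescanning from an empty list every time means newly added files are always picked up without tracking adds and removes individually.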
Lastly in the .cpp, we create two functions that will be exposed as blueprint nodes: SoundArray, which passes the soundFiles TArray into blueprints, and setWavToSoundWave, which honestly took a while to figure out, because we had to find a way to dynamically reference a .wav file in a form Unreal understands, namely a USoundWave. For this problem we discovered LoadObject.
In IntelSoundComponent.h we created two UFUNCTIONs to make the two functions in the .cpp blueprint callable.
Blueprint function to expose sound files into blueprint.
Blueprint function passing a .wav converted into a USoundWave into blueprint.
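A sketch of how those two exposed functions might look is below. This only compiles inside an Unreal project module, the function names follow the tutorial, but the exact signatures and the asset path format are our assumptions:

```cpp
// IntelSoundComponent.h -- the two BlueprintCallable entry points (sketch).
UFUNCTION(BlueprintCallable, Category = "IntelSound")
void SoundArray(TArray<FString>& Sounds);

UFUNCTION(BlueprintCallable, Category = "IntelSound")
USoundWave* setWavToSoundWave(const FString& SoundName);

// IntelSoundComponent.cpp -- sketch of the implementations.
void UIntelSoundComponent::SoundArray(TArray<FString>& Sounds)
{
    Sounds = soundFiles; // the TArray filled by FindFiles on BeginPlay
}

USoundWave* UIntelSoundComponent::setWavToSoundWave(const FString& SoundName)
{
    // LoadObject resolves an asset reference path into a live UObject; for a
    // sound imported to Content/Sounds the path looks like /Game/Sounds/Name.Name.
    const FString Path = FString::Printf(TEXT("/Game/Sounds/%s.%s"),
                                         *SoundName, *SoundName);
    return LoadObject<USoundWave>(nullptr, *Path);
}
```

LoadObject returns nullptr if the asset is missing, so the blueprint side can branch on validity before playing.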
Setting up UI:
Three UMG widgets we need to create
Create the blueprints needed to manage those UMG widgets
We have a few widgets in this project. AudioParamsSliderWidget is the widget that pops up when you select a sound. soundButtonWidgetBP is just a button widget for the sounds in the Content/Sounds folder. soundSelectorWidgetBP is the widget that gets the sounds from the SoundArray C++ node and is populated with soundButtonWidgetBPs; we put it in the level with an actor blueprint we created called IntelSoundWidgetBP (you could spawn it dynamically, but then you would have to get a reference to the newly spawned actor every time you began play). All of this happens in the IntelSoundManagerBP, which was also placed in the level from the start.
In the image above we take the soundFiles TArray of FStrings and split each name at the period in (name of sound).wav. We send that string into an array of strings in IntelSoundWidgetBP to name the buttons being dynamically populated.
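The split step is simple enough to show standalone; this is our own illustration of the same logic, not the project’s blueprint code:

```cpp
// Standalone sketch of the "split at the period" step: turning a file name
// like "Rain.uasset" into the label "Rain" for the dynamically added buttons.
#include <cassert>
#include <string>

// Mirrors the blueprint Split node: keep everything before the first period,
// or the whole string if no period is present.
std::string SoundButtonLabel(const std::string& fileName)
{
    const std::size_t dot = fileName.find('.');
    return dot == std::string::npos ? fileName : fileName.substr(0, dot);
}
```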
In the IntelSoundWidgetBP we spawn the soundUI.
If we didn’t use the Set Widget node, the widget would spawn but not be visible in game.
Once the player selects a sound from the widget, an IntelSoundAudioActorBP actor will spawn. In this actor you will see the AudioParamsSliderWidgetBP, and if Spatialize? is checked, three attenuation settings are exposed to be changed through the widget.
Sound Attenuation is essentially the ability of a sound to lower in volume as the player moves away from it.
The three settings exposed are Attenuation Function, Attenuation Shape, and Falloff Distance.
There are plenty more settings that could be exposed with more time. Here are images of the Attenuation Setting struct in Unreal.
We believe the three we chose are the most basic and fundamentally needed settings. Showing debug lines while you change settings is something we are working on. We were looking for a way to use, in game, the attenuation debug lines Unreal shows in the editor, but we have not found an answer yet. So we might get the shape extents of the chosen attenuation shape and function and use Unreal’s built-in draw debug lines nodes.
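To illustrate how these three settings interact, here is a simplified standalone model of a spherical shape with a linear attenuation function (our own sketch of the idea, not Unreal’s exact falloff curve):

```cpp
// Simplified model of spherical attenuation with a linear falloff function:
// full volume inside the shape, fading to silence over the falloff distance.
#include <algorithm>
#include <cassert>

float LinearAttenuation(float distance, float shapeRadius, float falloffDistance)
{
    if (distance <= shapeRadius) return 1.0f;          // inside the shape: full volume
    const float t = (distance - shapeRadius) / falloffDistance;
    return std::max(0.0f, 1.0f - t);                   // linear fade to silence
}
```

Swapping the Attenuation Function swaps the fade curve (logarithmic, natural sound, etc.); swapping the Attenuation Shape changes how the inner "full volume" region is measured.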
Saving on Exit:
When you exit the game after having spawned sounds, moved them around, and played with the audio parameters, we save all the variables we believe are important using IntelSaveGameBP through IntelSoundAudioActorBP.
Now, if everything worked correctly, you should be able to edit any sounds in your folder inside VR.
Tutorial written and developed by Rob Meza
Master of Shapes in AR
Posted on December 15, 2017 by Adam Amaral
We were asked by Snapchat to be an early developer for their new Snap Lens Studio. So we decided to make this chill Zen Shapehead.
Essentially, the new Snap Lens Studio tool is like a simplified Unity, and it makes it quite easy and user friendly to make an AR “lens.”
The character was modeled, textured, and animated in our 3D program of choice, then exported as FBX; in Snap Lens Studio you relink the textures and animations and add some simple logic, even sounds. It’s easy to push to your phone for testing because they have built-in Wi-Fi pairing. The last step in the process was making the Snap icon, which was just as fun as making the AR filter itself.
And here it is out in the wild:
Go ahead and check it out for yourself with the link or Snapcode above.
Try out Zen Shapehead AR Lens
All the fun none of the fumes
Posted on April 19, 2017 by Adam Amaral
After working on the first-person-shooter-oriented Mobile Room-Scale, we wanted to show off the more creative side of the VIVE Tracker. For this, we developed a 3D-printed spray paint can that uses the GPIO pins on the VIVE Tracker to send commands to a demo game we’ve created in UE4.
In this tutorial, we will teach you how we did it. We’ll give you the print files needed, walk you through assembling the physical model once it has finished printing, and show you how to connect the incoming signals from the VIVE Tracker Spray Can into the demo game in UE4.
Part 1: Making the Controller
What you’ll need:
Step 1: Print the models
Print Files Folder
Step 2: Assemble the Can
Full Assemble Can Video
Part 2: Making the Game
What you’ll need:
For this tutorial, you will need the FBX file for the entire spray can along with the project files for the game we are building. The game is already finished, and the blueprints have all been extensively commented to let you know why we’re doing what we’re doing.
Spray Can: FBX Model
Spray Can: UE4 Project Files
Since the VIVE Tracker has just come out, some of the programs we are using haven’t quite caught up to fully incorporate its functionality as of this tutorial. As such, we have a few workarounds to hold you over in the meantime. Once everyone has updated to the OpenVR version that fully supports the trackers, these workarounds won’t be needed.
The first thing you will need to do is download and use the VIVE Tracker Role Changer (VTRC). As of this tutorial, version 0.8 was the newest. Here is the LINK. VTRC allows us to use the GPIO pins as controls when sending to programs, like UE4, that have not yet updated to the new version of OpenVR.
Here are a few things we think you should be looking out for while we wait for the update to OpenVR:
Step 1: Setting up UE4 for the Motion Controllers
Full Motion Control Setup Video
Step 2: Create the Controls for the Inputs
Full Create Controls Video
This section is built to be followed within the project, as there are many parts to its operation. Please open the project files for even more detail on this build-out.
Below is a summary of what to look for within the Project files:
Outside of VR
More fun can be had even outside of VR. For example, combined with a projector, you can have all the fun of graffiti without the fumes.
Multiplayer cross platform VR
Posted on April 19, 2017 by Adam Amaral
We debuted our first multiplayer mobile room-scale experience at CES this year, and we were quite surprised to see how many people had a lot of fun playing our game “Cover Me!!”. Quick background: “Cover Me!!” is a cross-platform multiplayer experience where a person plays in VR alongside friends who use their cellphones or tablets to blast away waves of enemies. I know what you’re thinking… “typical wave-based shooter, yada yada” BUT by using Vive Trackers and attaching them to your cellphone or tablet, you have full room-scale tracking ability just like the Vive system. Throw in a few Bluetooth guns, and the next thing you know you’re shooting laser blasters back-to-back with your buddy in VR, fully aware of each other in the game and working as a team. For us, this solved the problem of going over to a friend’s house and watching them have the time of their lives while you sit on the couch waiting your turn. Now you can play along with them!
Since Vive Trackers are available to the public, we thought we’d share how to make a mobile room-scale experience. This could be done in Unity or Unreal Engine, but in this specific tutorial we’ll be showing you Unreal Engine 4.15. Difficulty: intermediate.
Replicating tracker positions
One thing to know about the VIVE Tracker is that while it was designed and produced by Vive, the core tracking technology comes from SteamVR (Valve). The trackers use a proprietary Bluetooth connection that requires SteamVR to be running. Currently SteamVR does not run on the ARM processors that most (all) phones use. To solve this, we need to replicate the position of the trackers from the VR computer to the other mobile players. That sounds kind of rough but honestly isn’t that bad, and over a local network there is no noticeable latency.
If this is your first time building a multiplayer game, I highly recommend checking out this tutorial, Blueprint Multiplayer Shootout Game, and getting a good understanding of replication and how it works inside UE4. From there, let’s handle sending positions to the other players.
As you can see in the image above, it’s a fairly simple setup. We identify whether the player is a VR player or a Tracker player (this is stored when the player joins the game), then we check whether this event is happening on the server or on a remote client (the Switch Has Authority node). Notice the custom events being called on tick (red). They are slightly different, and this is important. Since we can only get position values from a tracker on the PC running SteamVR, we only want that PC setting our variables and then broadcasting them over the network. We do this by setting “Execute on Server”; this prevents us from accidentally setting the tracker position variable on a device that doesn’t even have SteamVR running.
We use the built-in “Get Tracked Device Position and Orientation” node with the index of the tracker. We know its id is 5 in this case because we assume base stations (0 & 1), HMD (2), left and right controllers (3 & 4), making the two trackers 5 & 6. You could add additional logic here, but for the sake of example we hard-coded it.
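For readers who prefer C++, the same flow might be sketched roughly as below. This only compiles inside an Unreal project module, and all names here are ours, not the project’s blueprints (a full version would also need GetLifetimeReplicatedProps and the _Validate/_Implementation bodies):

```cpp
// Sketch of the blueprint flow above: only the machine running SteamVR reads
// the tracker pose, then asks the server to replicate it to everyone else.

// In the pawn's header:
UPROPERTY(Replicated)
FVector TrackerPosition;           // replicated out to all clients

UFUNCTION(Server, Reliable, WithValidation)
void ServerUpdateTracker(FVector NewPosition);

// In Tick, on the PC that has SteamVR (device ids as assumed in the text:
// base stations 0-1, HMD 2, controllers 3-4, trackers 5-6):
FVector Pos;
FRotator Rot;
if (USteamVRFunctionLibrary::GetTrackedDevicePositionAndOrientation(5, Pos, Rot))
{
    ServerUpdateTracker(Pos);      // runs on the server, which replicates it
}
```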
Once you have this set up and working locally, it’s time to package your game for Android and PC. Start your PC server first; then it should be straightforward to connect mobile devices as long as you’re on the same network. One “gotcha” to look out for: by default, Unreal assumes you’re not going to be connecting over LAN. To force this, edit the “DefaultEngine.ini” file located in your project’s Config folder by adding the following anywhere in the file:
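The original snippet isn’t reproduced in this post; a setting commonly used to make UE4 treat sessions as LAN is the following (this is our assumption, so verify against your engine version before relying on it):

```ini
[OnlineSubsystem]
DefaultPlatformService=Null
```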
Hope this helped and let us know if you run into any issues.
HTC Vive Tracker + Google Daydream VR
Posted on February 17, 2017 by Adam Amaral
We are big fans of the new HTC Vive Tracker and have been lucky to have early developer access. At CES we used the trackers for our mobile room-scale experience “Cover Me!!”, which allows players to use their mobile devices alongside a VR teammate. But we see a lot more potential for the trackers… One thing that has been on our minds as of late is the Google Daydream headset, and I have to say it’s pretty solid. While it doesn’t match the tracking capabilities or graphics of a desktop HTC Vive experience, the Daydream is wireless and the pixel resolution is a good bit higher (1440×1280 px per eye). Now wouldn’t it be awesome if you could have the room-scale tracking ability of the Vive with the wireless, higher-res Daydream…? To the lab!!!
Combining the Vive Tracker with Google Daydream seemed like a perfect combo. From our previous experience with the Vive Trackers we had already solved how to stream their position data to Android devices, so the only next step was designing a way to hold one on the Daydream. We contemplated creating a head strap to mount on top of your head, but for the sake of quickness we decided to just mount directly to the Daydream with a custom 3D-printed mount (this might change in revision 2). Our trusty 3D printer came in handy: a quick model in our 3D software of choice, followed by a lucky first-try fit, and we were in business. The Vive Tracker has a universal tripod mount on the back, so using that in combination with a tripod hot shoe for DSLR cameras gave us a snug, strong fit.
Now I’m sure you’re wondering: but is it as good as the Vive? In short, NO… it’s hard to compete with tracked hand controllers, desktop graphics, and a wider field of view, so the Vive is still the winner here. But!! I will say the Daydream with added room scale is pretty awesome. There is something really cool about having no tether and a sharper resolution. The big downside for now is that there aren’t any room-scale games for Daydream (minus ours, I guess), so we just ran a demo of our game “Cover Me!!” modified to support the tracker and new headset. In the future I could really see this expanding (we aren’t finished yet!).
Excited to keep experimenting!