Sunday, February 8, 2015

Star Citizen and Google Cardboard

The Motivation

Yesterday, my roommate decided that his computer wasn't powerful enough and asked if he could install Star Citizen on my gaming machine (one of the original Steam Machines!).

... Of course I said yes.  Why would I not want the most exciting upcoming space (and more) sim on my machine, to play when my roommate is out?

About 40 GB of downloads later (during which my roommate left to do some work), I fired up Star Citizen.  I'm not going to review my experience here; suffice it to say that it's looking really good.

The one thing that bothered me was that I could not get immersed in the game the way I wanted to.  Sure, the cockpit view looks awesome, but unless I were to sit two feet from the television, the real world would continue to interfere with my immersion.

So, seeing as I had my own work to do, I decided to completely ignore that work and build Google Cardboard - DIY virtual reality for those without $300 or more to spend on some other VR kit.  I had a pair of cheap plastic lenses from a set of folding cardboard binoculars.  They weren't exactly right, but they were entirely free, meaning I could build my Cardboard for the hefty sum of $0.

I downloaded and printed the design from the Cardboard page, pulled out my set of X-Acto knives and a huge roll of masking tape, found an old shoe box, and got to work.

A half hour or so later, I had built my own Cardboard, and it fit together really well.

So now, how to get the game video to my phone in side-by-side 3D?

A quick internet search revealed multiple options.  These were the ones that I tried:

  • Trinus Gyre - a PC server and mobile client pair that streams video to the phone AND sends the phone's sensor data back to the PC as head tracking input
  • Limelight - an open-source implementation of NVIDIA's GameStream protocol, offering low-latency game streaming to a variety of devices

Trinus Gyre

Trinus Gyre seemed like the perfect candidate.  You can set up an Android device for USB tethering, meaning that you get much lower latency than over wireless.  AND it does sensor-based head tracking and outputs the data in a variety of formats (FreeTrack, TrackIR, etc).  The full Android app costs a little over $6 (raising my total for this project from $0 to a whopping $6).

After installing the server and the client, and starting both, I was greeted with a stereo view of the Trinus Gyre PC application on my phone screen.  So far, so good.  I was also able to move the mouse by tilting my phone... even better!

However, things began to go downhill when I launched Star Citizen.  As soon as the launcher window opened, the phone screen went completely black, and it stayed black even after the game launched and the hangar loaded.

Luckily, a bit of reading turned up a solution: Trinus Gyre works best if the game is run in windowed mode.  With that change, I was able to get some decent-looking fake 3D out of the software, AND I could do fake head tracking with my phone's sensors.  It also has some really nice lens adjustment features built in that let you really fine-tune the stereo image.
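Incidentally, if you'd rather not dig through the game's menus for the windowed toggle, the same user.cfg trick I describe further down can force it.  This is just a sketch, assuming Star Citizen's build exposes the standard CryEngine display cvars (pick whatever resolution fits your screen):

r_Fullscreen=0
r_Width=1280
r_Height=720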

Windowed mode or not, though, the latency was really too much.  I was getting severe delays when moving or even hitting keys, despite having rigged the phone up over USB to avoid the extra latency of wifi.

A Brief Interlude for Focus

While I was testing Trinus Gyre, I realized that my screen was WAY out of focus.  The lenses that I'm using have a VERY different focal distance than those suggested by Google.  I was able to empirically determine a good screen distance by moving my phone back until everything looked really sharp.  The final depth is about 2.5" further away from my face than the original Google Cardboard design, so I had to add some extra cardboard to the Cardboard to make it work.  The result is quite aesthetically pleasing: 

[photo of the finished Cardboard, extra depth and all]
... really, I swear.  It's a beauty.
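If you'd rather skip the trial and error, basic thin-lens optics gives a decent starting point (an approximation on my part; I never measured these lenses).  The screen wants to sit near the lens's focal length f, per the thin lens equation:

1/f = 1/d_o + 1/d_i

As the screen distance d_o approaches f, the virtual image distance d_i heads off to infinity, letting your eyes relax as if focusing on something far away.  Google's suggested lenses have a focal length of around 45 mm (going from memory on that number), so lenses with a longer focal length need the screen pushed correspondingly further back, which lines up with the extra 2.5" I had to add.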

Limelight and NVIDIA GameStream

The next option was to stream the game over wifi using the same technology that NVIDIA uses to stream games to the NVIDIA Shield (how many more times can I say NVIDIA?).

I installed the Limelight app on my phone, and it immediately detected the desktop.  All I had to do now was allow Star Citizen to be streamed.  As it turns out, setting up a game for streaming in the GeForce Experience app is really easy.

So now I can launch Star Citizen from my phone, and view it in full HD with acceptable latency.  However, the picture is not in side-by-side 3D.  Crap.  How do I get side-by-side 3D renders out of Star Citizen?

There are some non-free software options out there that can give you decent results.  However, it turns out that CryEngine 3, which is what Star Citizen is built on, has some neat options for enabling various types of 3D output.

CryEngine 3's Stereo Settings

There are a number of useful CryEngine 3 settings that turn on side-by-side rendering and control parallax depth, eye distance, and more.  I do not have a scientific explanation for the settings, although descriptions of some can be found in the CryEngine documentation (who knew?).  Here are the settings that got me side-by-side rendering which converged into a single 3D image when viewed through the Cardboard.

r_StereoMode=1
r_StereoOutput=4
r_StereoDevice=1
r_StereoStrength=-8
r_StereoFlipEyes=1
r_MotionBlur=0

r_StereoEyeDist=0.02
r_StereoScreenDist=0.25

These settings enable side-by-side rendering.  The r_StereoEyeDist and r_StereoScreenDist variables have to be tweaked to get the images to converge comfortably.

Put these settings in a file called "user.cfg" in the same folder as the StarCitizen.exe file, and CryEngine 3 automatically begins outputting split-screen 3D.  The aspect ratio is a bit weird (I have yet to figure out how to fix this), but it's pretty cool as a built-in feature.
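One tip for tuning: if the build exposes the standard CryEngine console (usually opened with the tilde key; I'm assuming it's enabled, since availability varies by build), you can change these variables live instead of editing user.cfg and relaunching every time:

r_StereoEyeDist 0.03
r_StereoScreenDist 0.2

Changes take effect immediately, which makes dialing in comfortable convergence much faster.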

Trying Limelight for Streaming

So now I can get split 3D, but Trinus Gyre's streaming is too slow.  Can Limelight speed things up?

Enter Limelight again.  As I mentioned above, the app had already paired with my desktop as a streaming server, and it can launch Star Citizen directly from my phone, so switching over took seconds.

The result is a high-quality, very low-latency stream from computer to phone.  However, there are two downsides compared to Trinus Gyre.

First:  there is no lens correction, and no ability to scale the output to correct any of the aspect ratio weirdness I mentioned earlier.

Second: no sensor streaming, so no fake head tracking.

Still, it let me play the game without feeling like the response was way behind my control inputs.  It was in this setup that I first got my eyes (and the CryEngine settings) to cooperate, and MAN was it awesome. Even with the squished aspect ratio and no head tracking, I felt so much more immersed in the game.  I was blown away simply by how three-dimensional a catwalk in the hangar looked.

Of course, I still wasn't satisfied.

Back to Trinus Gyre

I really wanted head tracking to work.  I had been held up before by the way Windows 8.1 prioritizes network interfaces (it tried the USB tethering interface first, causing Star Citizen to fail with no internet connection), but I was determined to try the game with all the trappings.

I finally found the (very well hidden) settings for manually assigning network interface metrics (a topic that could be an entire post unto itself) so that priority would always go to my wired adapter.  Then it was simply a matter of configuring the lens correction in Trinus Gyre to "undo" the squashed aspect ratio of CryEngine's side-by-side output, increasing the sensitivity on all three sensor axes, and starting the game.
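For anyone hunting for those settings: the GUI version lives under the adapter's Internet Protocol Version 4 properties (Properties > Advanced > uncheck "Automatic metric"), but an elevated PowerShell prompt can do it in one line.  A sketch, assuming the wired adapter is named "Ethernet" (run Get-NetIPInterface to see the real names on your machine):

Set-NetIPInterface -InterfaceAlias "Ethernet" -InterfaceMetric 5

A lower metric wins, so giving the wired adapter a small number keeps Windows from routing traffic over the USB tether first.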

Suddenly, I was there.  I could look around the hangar using only my head movements.  I looked up, down at my feet (which were fully rendered), all around me...  I did that for almost fifteen minutes.

For some DIY virtual reality, it was awesome.  The lag was still there, but it was worth feeling kind of drunk to take a look around this virtual room with my own motions.

The Bottom Line

What did I get out of this insanely long (well, two-day) experiment in DIY?

I got a workable, if slightly laggy, virtual reality solution that could likely be improved with a bit more time.  With my phone's ridiculous display (I'm using a Droid Turbo), the only screen-door effect I got came from the video compression used for streaming.

I'll probably construct a head strap and phone mount so that I can try real, lengthy play sessions.

What I really got from this, however, is real hope for virtual reality.  If a bunch of cardboard, a phone, and some free software could do what I experienced, real consumer devices should be enough to blow me away.


I'll be tweaking settings to improve the view and reduce the lag, and will post any better settings if I find them.

Comments:

  1. Hi Charles. Great article. I was wondering if you've gotten this to work with Star Citizen Alpha 2.0. I can launch the game menu in SBS, but I get a black screen when launching the Hangar, Arena Commander, and Universe game modes. I'm actually trying to get this to work with my Samsung Gear VR without connecting the phone directly to the headset's micro-USB port, since that would prevent me from using Trinus Gyre.

  2. What resolution did you set Star Citizen to? The Gear VR has a resolution of 1280x1440 per eye, so do I set 1280x1440 or 2560x1440?
