
Google Cardboard Toolkit

Discussion in 'Works In Progress - Archive' started by HoverX, Nov 17, 2014.

  1. HoverX

    HoverX

    Joined:
    Aug 15, 2012
    Posts:
    18


    Hey guys!

    I picked up a Google Cardboard recently and tried to build something in Unity for it. I was surprised to see that there was no definitive package for this. Durovis Dive deals with the gyroscope well enough, but Andrew Whyte's click-detecting script only scratches the surface and there are no great examples for learning. It should be dead simple to make something cool for this platform.

    Cardboard SDK for Unity packages everything together and improves on it:
    • Delegates for discrete magnet events, not just clicking
    • Methods to grab data such as how long the magnet has been held for
    • Tech demo and small game examples
    Best of all, it's completely free on GitHub. I'm working on an Asset Store entry as a donation platform, but the latest will always be free and open on GitHub.
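
    Roughly, using it from a game script looks like this. The member names below are simplified stand-ins rather than the exact API, so check the readme on GitHub for the real names:

    Code (CSharp):
    using UnityEngine;

    // Sketch of subscribing a game script to the magnet delegates.
    // Member names are illustrative stand-ins for the toolkit's real API.
    public class MagnetLogger : MonoBehaviour {
        public CardboardInput cardboard; // reference to the toolkit's input object

        void OnEnable() {
            cardboard.OnMagnetDown += HandleDown;     // magnet pulled
            cardboard.OnMagnetUp += HandleUp;         // magnet released
            cardboard.OnMagnetClicked += HandleClick; // a full down-then-up
        }

        void OnDisable() {
            cardboard.OnMagnetDown -= HandleDown;
            cardboard.OnMagnetUp -= HandleUp;
            cardboard.OnMagnetClicked -= HandleClick;
        }

        void HandleDown()  { Debug.Log("magnet down"); }
        void HandleUp()    { Debug.Log("held for " + cardboard.SecondsMagnetHeld() + "s"); }
        void HandleClick() { Debug.Log("click!"); }
    }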



    So why is it here in WIP? I have work left to do and I'd like to find the audience for it. Let me know if you work with Unity and have a Cardboard. Better still, download my Cardboard SDK and let me know what you think.

    I'll keep posting in here as I get time to check things off on the roadmap.
     
    John-G likes this.
  2. Whiteleaf

    Whiteleaf

    Joined:
    Jul 1, 2014
    Posts:
    728
    So, exactly what is this for?
     
  3. HoverX

    HoverX

    Joined:
    Aug 15, 2012
    Posts:
    18
    Creating VR games for the Cardboard. The bulk of the work is interpreting data from the accelerometer and compass into something meaningful and useful for creating games.

    For example, the included sample lets you walk around and interact with a VR world with just the magnet trigger. Cardboard SDK exposes all the input events for that and sets you up with a Dive camera to handle the head-mounted display. You could make an interesting VR point-and-click adventure around that.
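
    As a rough idea of what that can look like in a script (not the sample's exact code; wire the method below up to the toolkit's click delegate):

    Code (CSharp):
    using UnityEngine;

    // Gaze-based point-and-click sketch: on a magnet click, raycast from the
    // head-mounted camera and poke whatever the player is looking at.
    public class GazeClicker : MonoBehaviour {
        public Camera headCamera;
        public float reach = 10f;

        // Call this from the toolkit's click delegate.
        public void OnMagnetClicked() {
            Ray gaze = new Ray(headCamera.transform.position, headCamera.transform.forward);
            RaycastHit hit;
            if (Physics.Raycast(gaze, out hit, reach)) {
                // Let the target decide what "interact" means (open a door, grab an item, ...).
                hit.collider.SendMessage("Interact", SendMessageOptions.DontRequireReceiver);
            }
        }
    }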
     
    Last edited: Nov 17, 2014
    John-G likes this.
  4. jellybit

    jellybit

    Joined:
    Nov 21, 2009
    Posts:
    32
    This is wonderful. Thanks so much for sharing. I'll contribute if I can improve on it. I was seriously disappointed with the Durovis Dive Unity code when I tried it a week ago, as it would constantly jitter greatly even if I placed the phone on the ground. I'm excited about trying out another approach. It's great you're putting this out there. Again, thanks a lot.
     
    Last edited: Nov 20, 2014
  5. HeadClot88

    HeadClot88

    Joined:
    Jul 3, 2012
    Posts:
    736
    This is very awesome!

    *Tips Hat*
     
  6. HoverX

    HoverX

    Joined:
    Aug 15, 2012
    Posts:
    18
    Thanks for the kind words! It was a bit hard to know if anyone would care as Cardboard still feels like a bit of a niche. I think there's a lot of opportunity here.

    Quick update:

    As I built the examples I realized that it didn't make much sense to have multiple instances of CardboardInput when its main functions were only ever called on Start and Update. The last few commits change CardboardInput to inherit from MonoBehaviour and run from a manager object. It ends up simplifying a lot of code if you can stand to have another manager script in your scene.

    You can also trigger the magnet with Space (or whatever's bound to "Jump"). It helps with debugging, and now you can play even when you don't have a Cardboard on hand.
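
    The debug path is nothing fancy; conceptually it's just this, feeding the same events the magnet would:

    Code (CSharp):
    using UnityEngine;

    // Keyboard fallback sketch: the "Jump" button stands in for a magnet pull
    // so the same delegates fire in the editor or without a Cardboard on hand.
    public class DebugMagnet : MonoBehaviour {
        void Update() {
            if (Input.GetButtonDown("Jump")) ReportDown(); // pretend the magnet was pulled
            if (Input.GetButtonUp("Jump"))   ReportUp();   // pretend it was released
        }

        void ReportDown() { Debug.Log("debug: magnet down"); }
        void ReportUp()   { Debug.Log("debug: magnet up"); }
    }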

    Let me know how it goes. I use Dive to handle the gyroscope bits but maybe something I did helps mitigate that jitter. I'd love to replace Dive with something more open if I could.
     
  7. HoverX

    HoverX

    Joined:
    Aug 15, 2012
    Posts:
    18
    I took some time to play Google's official apps and noticed a major thing that I was missing: vibration. When you click, it vibrates the device to give you feedback that your click was registered. It seems like a valuable tool from a user experience standpoint, so I spent a while getting that working.
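
    The vibration itself is just Unity's built-in Handheld.Vibrate() fired whenever a click registers, roughly:

    Code (CSharp):
    using UnityEngine;

    // Haptic feedback sketch: buzz the phone when a click registers.
    // Handheld.Vibrate() is stock Unity; call this from the click delegate.
    public class ClickBuzz : MonoBehaviour {
        public void OnMagnetClicked() {
    #if UNITY_ANDROID || UNITY_IOS
            Handheld.Vibrate(); // fixed-length system vibration, no duration control
    #endif
        }
    }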

    When I did, it became very apparent when a click was registering, and I started noticing a lot of false detections. I took the last couple of days to build better debugging tools and hunt down these rogue clicks. No one likes being told they clicked when they just rotated the device quickly.


    It left the code a bit of a mess though so my goal now is to clean that up. After that I can tag it 2.0 under semantic versioning as it technically hasn't been backwards compatible since I moved to a MonoBehaviour scheme.

    Once I have these basics down I want to prototype a rail shooter like Time Crisis or Link's Crossbow Training. I'm not sure exactly what that will look like, but it'll help flesh out the documentation and, I hope, make for some cool promo screenshots.
     
    HeadClot88 likes this.
  8. rukanishino

    rukanishino

    Joined:
    Nov 20, 2012
    Posts:
    13
    Wow, this is cool. I just tried Cardboard and was surprised by the (kinda) smooth experience, even with the Unity demo from Dive out there. It's way easier to show and demonstrate stuff to people than setting up my Oculus DK.

    Hoping to create small prototypes later, thanks for sharing this.
     
  9. HoverX

    HoverX

    Joined:
    Aug 15, 2012
    Posts:
    18
    Yeah, I love the pick-up-and-go nature of the Cardboard. Hopefully this helps to accentuate that and enable developers like you.

    I tagged 2.0 yesterday: https://github.com/JScott/CardboardSDK-Unity/releases/tag/v2.0. You should post your prototype progress here as well! I would love to see how you use the toolkit, and I bet you'll come up with way cooler things than I can think of :)

    I'm going to start working on my own prototype now to help showcase the project. As much as I'd love to start adding fun new features like camera and shake input, it'll be good to use my own toolkit and see what works and what doesn't. I'll keep posting updates in here like I just encouraged all of you to do.


    Boring dev stuff ahead:

    Hopefully the code speaks for itself but I also wanted to dev blog on the technical architecture of the kit a bit because I think it's interesting. I was warned when I started that going beyond what Google does for magnet input in their apps -- just detecting the click -- was tricky because the phone's sensors can be pretty touchy: I can go into the next room and potentially change the magnetic field.

    The trick is to treat the magnet relatively, which the original scripts did, but I had to take it a step further. I detect and rely on the change in magnetic field from one moment to the next, which helps remove false negatives from just walking around. However, I didn't really save state between the discrete events of up, down, and click. A click was the product of the magnet coming up, but until recently it didn't check that the magnet went down first. This matters because the phone might suddenly report an "up" out of nowhere due to the imprecise nature of the device. So now CardboardInput.cs reports a collection of discrete events, and CardboardManager.cs strings them together to trigger your delegates.
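
    In heavily simplified form, the core of it looks something like this (the real CardboardInput.cs has more going on, but this is the shape of the idea):

    Code (CSharp):
    using UnityEngine;

    // Simplified sketch of relative magnet detection: watch the frame-to-frame
    // change in the magnetometer reading rather than its absolute value, and only
    // count a click when an "up" follows a remembered "down".
    public class MagnetSketch : MonoBehaviour {
        public float downThreshold = 50f;  // tune on a real device
        public float upThreshold = -50f;

        float lastMagnitude;
        bool magnetIsDown;

        void Start() {
            Input.compass.enabled = true;
            lastMagnitude = Input.compass.rawVector.magnitude;
        }

        void Update() {
            float magnitude = Input.compass.rawVector.magnitude;
            float delta = magnitude - lastMagnitude; // relative change, robust to wandering fields
            lastMagnitude = magnitude;

            if (!magnetIsDown && delta > downThreshold) {
                magnetIsDown = true;   // discrete "down" event
            } else if (magnetIsDown && delta < upThreshold) {
                magnetIsDown = false;  // discrete "up" event...
                Debug.Log("Click!");   // ...and only now does it count as a click
            }
        }
    }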

    I also can't stress enough the importance of extremely clear, readable code. The single best thing for fixing these false detections was to take a step back and restructure the code. Keep your methods short, abstract, and self-documenting without comments. It's still far from perfect, but it brought a lot of weird things to light. In understanding what I was doing, it became obvious what I should be doing. If you're inclined to take a look at my code, let me know what you think.
     
  10. rukanishino

    rukanishino

    Joined:
    Nov 20, 2012
    Posts:
    13
    I just spent some time testing my idea with the plugin! The code from GitHub needs a little set-up because some references are gone (missing script, missing reference in the prefab), so maybe you'll need to force the meta files to make them visible? Would that help? After a little tinkering it works nicely though, and the code is also easy to understand.

    Anyway, I tried to create a manga/comic reader prototype for VR. Watching movies is already awesome, and I want to create a good environment for reading. It's simple for now: move right or left and use the magnet event to bring the next/previous page to the front. I want to add a few more features before going fancy haha.



    I also tried to recreate the "rotate to portrait for back/reset" gesture from the Cardboard app for the plugin. Apparently Unity can't detect a screen orientation that's been excluded in the build settings. That means I can't use Screen.orientation to detect portrait since I didn't enable it (I'm using auto-rotate landscape only). The other solution is using the gyroscope/accelerometer to detect the orientation manually, but I haven't got to that yet.

    Keep up the good work!
     
  11. HoverX

    HoverX

    Joined:
    Aug 15, 2012
    Posts:
    18
    Ah jeez, I thought I cleaned up the prefab. Thanks for letting me know, I'll be sure to fix it up when I get a moment.

    Have you tried Input.deviceOrientation instead of Screen.orientation: http://docs.unity3d.com/ScriptReference/Input-deviceOrientation.html? My first thought was using the gyroscope but it seems like that would get complicated quickly. Be sure to make a pull request or post the code here when you get the gesture working reliably so I can integrate it for other people.
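
    Something along these lines might be all the gesture needs (untested sketch, just the orientation check):

    Code (CSharp):
    using UnityEngine;

    // Tilt-to-reset sketch: if the player rotates the phone to portrait while the
    // app runs landscape-only, treat it as a "back/reset" request.
    public class TiltReset : MonoBehaviour {
        DeviceOrientation lastOrientation;

        void Update() {
            DeviceOrientation current = Input.deviceOrientation;
            if (current != lastOrientation) {
                if (current == DeviceOrientation.Portrait) {
                    Debug.Log("Tilted to portrait: trigger the reset here");
                }
                lastOrientation = current;
            }
        }
    }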

    Also, cool idea! It's always uncomfortable for me to read comics on phone screens so this is a really smart use of the benefits of VR space.
     
  12. rukanishino

    rukanishino

    Joined:
    Nov 20, 2012
    Posts:
    13
    Ah nice, Input.deviceOrientation works. I guess Screen.orientation only updates when the screen has actually been rotated, and it can't report an orientation that's disabled (because the screen doesn't actually change). Thanks for the info!

    I also tried to create the pull request. I haven't tested it extensively, but it seems to work well enough. I rechecked with the Cardboard app and realized that we only deal with LandscapeLeft and Portrait (for reset), so I used that as a reference for detecting the orientation.
     
  13. HoverX

    HoverX

    Joined:
    Aug 15, 2012
    Posts:
    18
    There's a hole in the back for the camera as well which makes for a fairly intuitive "correct" positioning.

    I pulled it into the 'rukanishino-orientation-reset' branch and cleaned up the code a little to fall more in line with the conventions so far. I just want to mix it into the tech demo scene for the sake of documentation before I make a master release for it.

    Edit: latest master has OnOrientationTilt. Thanks again for kicking that off, rukanishino! There's a lot of value in the tilt gesture.
     
    Last edited: Dec 10, 2014
  14. HoverX

    HoverX

    Joined:
    Aug 15, 2012
    Posts:
    18
    Someone on Reddit was kind enough to point out that the Durovis Dive license prevents redistributing their SDK. That means I can't bundle Dive with my SDK for your convenience. It seems silly, but rules are rules.

    Because of this I quickly threw together v2.2:
    - Ripped out the Dive SDK and Castle example to avoid legal issues
    - Added instructions on including the Dive camera
    - Added debug controls so you can test it out without Dive

    I also threw together a compiled APK of the Tech Demo scene, which the Dive license does allow, and a .unitypackage file for easier integration. Check out the release on Github!

    At this point I'll probably keep adding features when the mood strikes but I feel it's good enough to start advertising elsewhere. Can you think of any other forums or communities that might be interested in this toolkit?
     
  15. lazygunn

    lazygunn

    Joined:
    Jul 24, 2011
    Posts:
    2,749
    The initiative here is great! Cheers for the work. I'll have a look a bit later, but I think it's just the thing I needed to sort out one aspect of something I'm working on. I am, however, a little disappointed by the use of the Dive plugin. While it's been very useful for working with the Cardboard so far, its license makes it restrictive to use and I'm looking for an alternative (after trying a few myself, including self-written ones).

    I'll have a go, and I'll have a root around in the Oculus mobile SDK for how they grab their orientation information, with a mind to moving it over to a Unity-friendly form if suitable. Ideally this should work with any suitable mobile device. I haven't looked, but I'm assuming their SDK isn't dependent on the Gear VR for sensor information; if it is, then maybe the Unity API can be used in tandem to produce equivalent results (although obviously the average mobile sensors aren't going to be as accurate as the Samsung peripheral).
     
  16. lazygunn

    lazygunn

    Joined:
    Jul 24, 2011
    Posts:
    2,749
    Oh, and while I'm not really a Reddit user, I find a lot of useful information there and there's a decent amount of discussion (so you'll probably get some notice).
     
  17. HoverX

    HoverX

    Joined:
    Aug 15, 2012
    Posts:
    18
    I would also love to have an alternative to Dive to use with this. As you can tell by my last post, I'm no fan of their bizarre license either.

    If for some reason you're not making progress with the Rift SDK, there seems to be a little bit of information around how Android's Cardboard SDK does it. HeadTransform gives you the rotation angles you need to set the camera, and someone decompiled the .jar so you can see what the class is doing here. I can't for the life of me figure out how mHeadView really gets set though.

    Edit: HeadTransform was the wrong place to look. HeadTracker is probably what you want to deconstruct. I gotta stop looking at this stuff now or I'll be up all night :)
     
    Last edited: Dec 15, 2014
  18. lazygunn

    lazygunn

    Joined:
    Jul 24, 2011
    Posts:
    2,749
  19. HoverX

    HoverX

    Joined:
    Aug 15, 2012
    Posts:
    18
    It's a nice feature list but there are a few problems with it at first glance.

    First, it requires Unity Pro. If there's a way around this then it isn't well documented.

    Second, the event system bugs me. If you look at Teleport.cs you'll see that all you get is Cardboard.SDK.CardboardTriggered. Delegates instead of booleans ensure you don't miss an event, and Google only commits to supporting a click, not whether the magnet goes up or down or how long it's been held for. That data enables certain control schemes for games (see the sketch below).

    Third, the documentation is non-existent and the code doesn't explain itself very well. Maybe it'll be fixed over time but given what we see for the Android SDK code I'm not so sure.

    What I'm saying is that if we develop an alternative to Dive that works with Unity Free then we have a legitimate competitor or companion to the Google-provided SDK.
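
    To make the control-scheme point concrete: a hold-to-charge shot falls out naturally from an "up" delegate that knows how long the magnet was held (names here are illustrative):

    Code (CSharp):
    using UnityEngine;

    // Hold-to-charge sketch: something a single "was it clicked?" boolean can't express.
    // Wire this to an "up" delegate that reports how long the magnet was held.
    public class ChargedShot : MonoBehaviour {
        public float maxChargeSeconds = 2f;

        public void OnMagnetUp(float secondsHeld) {
            float charge = Mathf.Clamp01(secondsHeld / maxChargeSeconds);
            Debug.Log("Fire with charge " + charge);
        }
    }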
     
  20. lazygunn

    lazygunn

    Joined:
    Jul 24, 2011
    Posts:
    2,749
    Well, fair enough points, but it's probably worth grabbing a Cardboard with the strip rather than the magnet and seeing how that alters an approach to input as well. I plan to have a look around the Oculus mobile SDK anyway (not so sure about the Android SDK) in case there are some useful, clever things in there; if I have any luck I'll post here.
     
  21. HoverX

    HoverX

    Joined:
    Aug 15, 2012
    Posts:
    18
    I had not heard about this strip cardboard! Link?

    Also, is the Oculus SDK generally readable? I assumed it'd be worse but Google's C# code reads like Java :\
     
  22. lazygunn

    lazygunn

    Joined:
    Jul 24, 2011
    Posts:
    2,749
    Well, my bad, it might be more of an on/off kind of thing, but here: http://www.dodocase.com/products/google-cardboard-vr-goggle-toolkit. I'm reasonably sure it's covered in the SDK.

    I've spent a bunch of today looking at the Oculus SDK code as well as the Unity implementation. It's pretty involved, well-thought-out stuff, naturally building towards the future; I'd imagine the mobile SDK is much the same. But when I was trying to avoid Dive I was more interested in just getting nice, stable orientation information, and I wasn't doing very well at it.

    If you see your efforts as an agreeable alternative to Google's Unity implementation, maybe have a look at what they're doing with the orientation tracking code and take what you think looks good. C# isn't hugely different from Java anyway, and given its roots the Google similarity isn't that surprising.
     
  23. HoverX

    HoverX

    Joined:
    Aug 15, 2012
    Posts:
    18
    Yeah, I don't blame them, I was just hoping for better. I was trying to find what they do for head tracking but half the battle is wading through a sea of undocumented classes.

    Do you have any prototypes worth showing off from your previous efforts? I'd be interested in your experiences. The gyroscope returns so much good data about its orientation to Unity that there's gotta be something quick and dirty to get basic head tracking.
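
    For reference, the quick-and-dirty version I have in mind is the usual gyro attitude remap; it drifts and jitters, but it's a starting point:

    Code (CSharp):
    using UnityEngine;

    // Quick-and-dirty gyro head tracking: remap the device attitude into Unity's
    // left-handed coordinate system. Expect drift and jitter compared to fused tracking.
    public class GyroHead : MonoBehaviour {
        void Start() {
            Input.gyro.enabled = true;
        }

        void Update() {
            Quaternion q = Input.gyro.attitude;
            transform.localRotation = Quaternion.Euler(90f, 0f, 0f) * new Quaternion(q.x, q.y, -q.z, -q.w);
        }
    }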
     
  24. lazygunn

    lazygunn

    Joined:
    Jul 24, 2011
    Posts:
    2,749
    There's not much I'd bother sharing because most of it was going up the wrong path: just nearly right, but wrong in a way that suggested it was definitely the wrong approach. There are a good few examples via Google showing different folks' approaches to it, but all tend to agree it's a bit of a tricky thing to get right; you're probably not going to come across any solution that doesn't involve some applied thinking.
     
  25. HoverX

    HoverX

    Joined:
    Aug 15, 2012
    Posts:
    18
    So hopefully this thread necromancy isn't too bad, but there's been a recent development with Unity 5 that I want to follow up on. There have been a lot of complaints about Dive, so I wanted to replace it with Google's SDK as soon as it was released. However, there was a known bug where Google's SDK would disable the gyro and accelerometer, which meant scripts like mine couldn't work in conjunction with it. Unity 5 seems to fix this for whatever reason.

    Over the next little while I'd like to remove my dependency on Dive and replace it with Google's SDK now that it works. The license is more permissive and it works better, which means an improved dev experience. As long as you're not an iOS user, that is.

    Since this product has shaped up pretty well I'll probably move it from WIP to the Asset Store forum once I make this transition and version bump it. It might even deserve a name change since it became more of a Cardboard Input Enhancement SDK after the dust settled.

    Anyway, more updates soon as I make them!
     
  26. lazygunn

    lazygunn

    Joined:
    Jul 24, 2011
    Posts:
    2,749
    Well, good luck, but it may become obsolete as Unity plans to introduce VR support as standard (a tick box on the camera component) fairly soon. I applaud the endeavour though.
     
  27. HoverX

    HoverX

    Joined:
    Aug 15, 2012
    Posts:
    18
    Oh cool, do you have a link to that being mentioned somewhere? I'd like to read more, but I don't really know what to search for, and I end up with a bunch of generic articles about the Rift on Unity.

    Either way, that shouldn't render this obsolete. Whatever innate support Unity gets for VR, I doubt that it will include additional hooks for Cardboard input. Google already provides an SDK for that and my kit goes above and beyond to expose more events than Google does and in a better way for C# developers. This toolkit has always leveraged another framework for creating the stereoscopic display, whether that be from Durovis, Google, or Unity.
     
  28. HoverX

    HoverX

    Joined:
    Aug 15, 2012
    Posts:
    18
    It turns out that there were no real issues with a direct port to Unity 5 with Google's Cardboard SDK. I've updated and simplified the Github code and submitted it to the Unity Asset Store gods. As promised, I'll make a proper post on the Asset Store forum when it gets approved.

    For those who are still curious about what this actually does, I tried to highlight that clearly in the new readme. I'm no marketer but hopefully it gets the point across better than it used to.

    Does anyone have an idea of how valuable Unity 4.6 support is? I learned a lot about packaging things up for Unity 5, so the old package is kind of sloppy. If everyone's moving to 5 then there's not much of a reason to bother fixing it though.