Discussion in 'Made With Unity' started by bliprob, Nov 21, 2010.
Very, very good! This is amazing.
Could somebody please explain a little (or give a link to good info) about how I could send an image quickly from a Windows DLL plugin to Unity Pro?
You guys do it at fast framerates and I wonder how it has to be done. Does Unity expect an array of bytes somehow to get this done quickly?
Well, there is now an OpenNI Unity wrapper which does most of the things people would want.
I think it's Windows-only for now, as NITE does not work on Macs yet, but I would hope it's just a matter of time.
As well as making things like skeletal joint rotations available to Unity, the sample shows me the detected user image, so Bart, this should answer your question, if you even need it answered now that this wonderful wrapper is out!
Great news! Any chance of a video showing its usage?
I will only be posting videos showing it in use once I've done something with it myself, because the sample they provided is really a bit ugly to my eyes. Also, I have very little space here, so I can't really show off the fact that their demo seems to feature two models being controlled by two users.
I got the OpenNI samples working, and the PrimeSense NITE examples working, but if I try running the Unity Wrapper "UnitySampleProject", it seems to initialize and then crashes Unity every time. The crash error.log says:
"Unity Editor [version: Unity 3.1.0f4_55865]
openNI.dll caused an Access Violation (0xc0000005)
in module openNI.dll at 0023:658cdf64."
I am running Win 7 64-bit. Any guesses? Is it working stably for you?
Yeah, I haven't had any crashes at all, but I was trying this on 32-bit Vista.
If you are still having problems then you should probably talk to them about it; I'm not sure if 64-bit Windows 7 is supposed to work at this stage or not.
The demo I posted in this thread is done on Windows 7 on a MacBook Pro, 64-bit.
Is Kinect actually precise enough to track finger movement? Or doesn't it work at such a resolution?
If you held your hands close enough to it, oriented in a certain way, then it's likely there would be enough data for someone to write finger-tracking code. However, I can't imagine this working at the same time as skeletal tracking. Right now the NITE skeletal tracking can't tell the orientation of wrist joints, let alone fingers.
I suspect the key to making decent games driven by this technology is to keep things fairly simple. Even pretty straightforward stuff, such as using a simple skeleton to interact with the environment, can cause user frustration unless it is well thought out - the brain can have trouble with 3D spatial awareness when we cannot actually see the objects we are trying to touch in the same physical space our real limbs are moving in. For example, here are some OSC-based Kinect tests I've been doing where I'm touching stuff with my hands; it works OK, but it can be really hard to actually touch what I mean to touch in a very controlled way:
This leads me to believe that a lot of simpler mechanisms will be used to greater success, e.g. certain gestures being used to trigger things, and the positions and orientations of joints relative to other joints being used as triggers, as opposed to completely free movement in real-world 3D space being mapped to the virtual 3D space. Things we take for granted in first-person shooters, i.e. the simultaneous precision movement of character and aiming, may not be well suited to this skeletal technology, especially as the physical player is likely to want to remain oriented towards the screen most of the time. So I'd be more inclined to put my character on rails, have a couple of very simple body movements help control their position on the rail, and use the arms for aiming, firing, etc.
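A minimal sketch of that joint-relative triggering idea (plain Python; the joint names, coordinates and threshold are made up for illustration - real values would come from the skeletal tracker):

```python
# Trigger a gesture from joint positions *relative to other joints*,
# rather than mapping free 3D movement straight into the game world.
# Joint positions are (x, y, z) tuples in meters, y pointing up.

def hand_raised(joints, hand="r_hand", head="head", margin=0.05):
    """True when the hand joint is above the head joint by at least `margin`."""
    return joints[hand][1] > joints[head][1] + margin

# A made-up pose: right hand up, left hand down.
pose = {
    "head":   (0.0, 1.60, 2.0),
    "r_hand": (0.2, 1.90, 2.0),
    "l_hand": (-0.2, 1.00, 2.0),
}

print(hand_raised(pose))                 # True  (right hand is above the head)
print(hand_raised(pose, hand="l_hand"))  # False (left hand is not raised)
```

The same relative-joint test is just as easy to express in a Unity script once joint positions are arriving.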
I could be wrong; time will tell. It's going to be an exciting 2011.
A guy managed to compile OpenNI for Mac OS X (Intel only, I think), but it's not very stable and, of course, it needs a lot of hacks to compile. Sometimes being on Windows has its advantages (sometimes, or a lot of times =P, it doesn't).
OpenNI has binaries only for 32-bit Windows. The guy that managed to have it working with a 64-bit OS probably compiled the sources for that platform.
Ah but there are two different parts to this:
OpenNI, which is the overall framework. Source code is available, and some people have worked out how to get it going on OS X.
NITE, which is the middleware that does the skeletal tracking algorithm and some other stuff. This is only available as a binary, and there is no Mac version yet, although I believe one is coming due to popular demand.
Ok, I got it working.
The problem was an XML config file in the same location as UnityInterface.dll.
I had not edited it to include the license key that PrimeSense provides on the OpenNI site.
I guess that is what was causing the crash, because now it is running great. Very cool to see the soldier avatar matching my movements!
OK, so I want to use OSCeleton to send the joint positions into Unity. How do I accomplish this? I'm not that familiar with OSC, so some help would be awesome. I downloaded OSCulator for the Mac and I have OSCeleton running on my Windows virtual machine, but OSCulator isn't picking anything up... I could wait for a Mac port of NITE, but I want to try this so badly!
For OSCulator to see data coming in, set the port it is listening on to 7110. This will show you that you are receiving data, and you can do other stuff with it using OSCulator if you want, but OSCulator itself should not be running when you want to send the OSC to Unity instead.
Also, you need to run OSCeleton from the command line with the right parameter to specify the IP address of the machine to send the data to (see the OSCeleton readme, and use the IP address assigned to your Mac).
Also bear in mind that OSCeleton won't send much data at all - just a very few OSC messages - until the user has been calibrated and proper skeleton tracking begins.
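For anyone writing their own receiver: OSCeleton's readme describes its joint messages as the address /joint with a string joint name, an int user ID and three floats. Assuming that layout, here is a rough hand-rolled decoder for a single raw OSC 1.0 packet, sketched in Python for brevity (a Unity-side receiver would do the same in a script); it is not a full OSC library:

```python
import struct

def _osc_string(data, offset):
    """Read a null-terminated OSC string, padded with nulls to 4 bytes."""
    end = data.index(b"\x00", offset)
    text = data[offset:end].decode("ascii")
    return text, end + (4 - end % 4)

def parse_joint_message(data):
    """Decode one OSCeleton joint message: address "/joint",
    type tags ",sifff" -> (joint_name, user_id, x, y, z)."""
    address, offset = _osc_string(data, 0)
    tags, offset = _osc_string(data, offset)
    if address != "/joint" or tags != ",sifff":
        raise ValueError("not a joint message")
    name, offset = _osc_string(data, offset)
    # int32 and float32 arguments are big-endian per the OSC spec
    user_id, x, y, z = struct.unpack(">ifff", data[offset:offset + 16])
    return name, user_id, x, y, z
```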
That's the easy bit. You will need scripts in Unity to enable it to receive OSC data. There have been a few posted to the forums over the years; I think I used one that was designed for the Wiimote, but I modified it quite a bit so it could read the format of data that OSCeleton sends. http://forum.unity3d.com/threads/21273-OSCuMote-Wiimote-support-for-the-free-version
OK, to save you some time I will post a sample project shortly. But it's only to be used for messing around with before NITE is available on the Mac. Here are the downsides to my scripts, which explain why I don't really want to release them or have people using them for anything other than early experiments:
I'm fairly new to Unity and some of these languages. My code is a quite horrible botch of code that others provided, which was designed for use with the Wiimote. There are likely bugs and inefficiencies. I haven't bothered removing references to the Wii, and I haven't commented any of my code changes. I think there is also some bug with the original OSC stuff I used that causes problems with compiled projects - perhaps something not being cleaned up properly when the application exits, so when it's run again the OSC stuff does not work; I'm not sure. It works OK in the Unity editor though. Also, due to some rubbish coding on my part, I think you need to make sure that joint position data is coming in via OSC as soon as you press the run button, or else things fail to work properly.
Thank you so much for all your help! I've been playing with Animata and OSCeleton at the moment, though I haven't tried your OSC Wiimote plugin for Unity yet. But if you think you can put out an example project, that'd be most welcome! Thank you. This Kinect stuff is very exciting, and I believe a user somewhere has posted a tool for converting this data for major 3D animation software, so I can see this being used for in-game cutscenes as well!
It's not my OSC Wiimote plugin - someone else did it; I just butchered it to work with OSCeleton data. An ideal OSC implementation for Unity would be quite a bit different from my butchery, but like I said, it's just something to play with for now, for people that don't have time to do OSC properly and aren't using the OpenNI wrapper yet because NITE is not available for the Mac yet.
Anyway, I've been tidying things up a little and will post the example shortly; just trying to squash an annoying bug.
Here we are: http://www.mutantquartz.com/OSCeletonUnityTest.zip
Anyone trying this, please see my earlier post where I stated the various limitations and downsides of this example. I may have fixed one or two things since I wrote that post, but there are still a few issues, and this way of doing it is quite far removed from the proper OpenNI Unity wrapper. This example is also not attempting to drive a properly rigged character or anything; joint positions are simply being used to position some primitives, and I used the camera look-at script a few times to point some of the primitive limbs in the right sort of direction. I'm not doing anything clever to account for people varying in size either, so my primitive character will likely be quite out of whack if you are a different size or shape to me. Oh yes, and it only works with one person, because I didn't write code to get the data for a second user from OSCeleton.
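That joints-to-primitives approach boils down to simple vector math. A sketch, in plain Python with made-up joint coordinates (in Unity this is the Vector3 work the look-at script does for you):

```python
import math

def limb_placement(joint_a, joint_b):
    """Place a primitive limb segment between two joints: return the midpoint
    where it sits, the unit direction it should point along, and its length.
    Joints are (x, y, z) tuples as a skeletal tracker might report them."""
    mid = tuple((a + b) / 2 for a, b in zip(joint_a, joint_b))
    delta = tuple(b - a for a, b in zip(joint_a, joint_b))
    length = math.sqrt(sum(c * c for c in delta))
    direction = tuple(c / length for c in delta)
    return mid, direction, length

# Made-up shoulder and elbow positions for an upper-arm capsule.
shoulder = (0.0, 1.4, 2.0)
elbow = (0.0, 1.1, 2.0)
mid, direction, length = limb_placement(shoulder, elbow)
```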
In short, I would not have bothered sharing this code at all were it not for the unavailability of NITE on the Mac at this moment in time. I also wouldn't have needed to release it if there were really nice, tidy and flexible OSC scripts for Unity that were easy to find. This is partly because the term OSC is a bit short for the forum's search index, but I know there is more than one solution already out there if people are lucky enough to find them. I may also have missed some obvious examples of good Unity OSC stuff by not looking in the right place. For those that aren't so lucky, or want a quick and temporary shortcut, try my example.
Thanks man! It works wonderfully! I didn't even have to modify anything; the biped moved just fine for me. This will hold me over very well until a proper port for the Mac appears. Thank you so much! I'm having a lot of fun now.
Looking forward to trying this. Thanks!
As far as OSC on Unity, yes, the search function on this forum brings back little or nothing - it's been broken for a while. Try OSC + Unity on Google.
Here's a thread I contributed to that has a starter example of OSC implementation on Unity:
Make sure to note the fix required to run this in Unity 3.x
Thanks to Elbows for his work! Now we wait and watch for NITE on OS X.
Kinect + Quartz Composer + Unity
Hi, maybe someone else is interested in experimenting with this too (Mac only)
I downloaded KinectTools for QuartzComposer from http://kineme.net/
then Syphon for QuartzComposer and Unity: http://syphon.v002.info/
With all of these together, you have a nice system for experimenting with Kinect and Unity using the Quartz Composer framework...
Hi guys, I used that wrapper, but there was a mistake in the character setup, and the legs rotated wrongly.
I've modified the script a little; if someone wants, I can share the project or the script.
You are welcome to share stuff here.
As I wanted the character to move in space, I decided to use the "Root" transform as the "spine" in the Initialize function:
spine = UnityUtils.FindTransform(go, "Root");
That is why I commented out the whole definition of the old root transform in the code.
Next, I changed the TransformBone function so that I can also use it to move the bone in space:
void TransformBone(uint userId, NiteWrapper.SkeletonJoint joint, Transform dest, bool move)
dest.position = new Vector3(trans.pos.x, trans.pos.y, -trans.pos.z);
And lastly, I used "move = true" only to move the "spine" transform:
public void UpdateAvatar(uint userId)
TransformBone(userId, NiteWrapper.SkeletonJoint.TORSO_CENTER, spine, true);
Anyway, the whole changed nite.cs is available to download
Thanks for the updated file, vtornik23.
Another improvement is to change the line <Mirror on="false"/> to <Mirror on="true"/>, so the avatar on screen actually moves right when you move right (instead of the other way around, which is counter-intuitive and not like the Kinect games).
It took a while to figure out how to get the OpenNI UnityWrapper and the sample project working, so I decided to share the steps I took with the community:
installing the kinect driver, OpenNI and NITE SDK
download the OpenNI binaries from http://www.openni.org/?q=node/2 (in this case: OpenNI 1.0 Alpha build 23 binaries for Win32)
download the NITE SDK from http://www.primesense.com/?p=515 (in this case: NITE 1.3 Beta build 17 SDK for Windows)
download SensorKinect from https://github.com/avin2/SensorKinect
install the NITE SDK and use the following free license key from openni.org: 0KOIk2JeIBYClPWVnMoRKn5cdY4=
unzip the SensorKinect file
connect the Kinect device
open the device manager (Start > Run... > devmgmt.msc) and verify that 'Xbox NUI Motor' is in the list (but with an exclamation mark)
Right-click on the device, choose 'Update Driver', and make Windows look for a driver in the avin2-SensorKinect-b7cd39d\Platform\Win32\Driver folder. This will install a device called 'Kinect Motor'.
reboot your computer when asked
After rebooting, two new devices will be found. Repeat the same process by pointing the wizard to the driver folder. One of the two new devices will fail to install (the one for the audio), the one that will succeed will add a new device called 'Kinect Camera'.
install the executable from the zip file in the folder avin2-SensorKinect-b7cd39d\Bin
run C:\Program Files\OpenNI\Samples\Bin\Release\NiUserTracker.exe to test if everything works
if it fails, run C:\Program Files\OpenNI\Tools\vcredist_x86.exe
installing the kinect unity example
download the official UnityWrapper from OpenNI from https://github.com/OpenNI/UnityWrapper
make sure Unity 3 or Unity 3 Pro is installed
Nicely done bernardfrancois. Those steps worked great.
Thanks Bernard, you made my day!
I followed your steps and it works fine and easily.
The soldier is moving in a bit of a funky way, but it's OK.
The important thing is that the skeleton info is arriving in Unity.
Hi, I tried the soldier, and it's true, there is a mistake in the rotation of the legs.
Can you share the modifications you made?
lyserdelic, I've already posted it three posts earlier.
Just download the script and have fun with it.
Thanks, vtornik, it's working fine.
>Here we are: http://www.mutantquartz.com/OSCeletonUnityTest.zip
Can you say how you would use this? Is it only possible to use while running NITE in a Windows virtual machine?
I am able to use my Kinect in Flash on my Mac, while running an as3-server from Terminal, but it doesn't have any of the skeletal stuff working just yet.
I'm not sure everybody has seen this video, but these guys managed to play World of Warcraft using Kinect and OpenNI. Worth checking out if you're interested in Kinect hacks.
Well, for skeletal tracking we need NITE, and that's only for Windows and Linux right now. You need either a Windows virtual machine or a real Windows machine on a network. You could use Linux if you knock up your own skeleton-to-OSC Linux app (which I did before OSCeleton arrived).
This is not supposed to be a proper solution for actually making and releasing Unity games; it's just something for a few people to play with in the short term. PrimeSense have indicated that a Mac version of NITE is coming soon, at which time I expect the proper Unity OpenNI wrapper to work and this will be a proper solution, leaving my temporary botch in the dust.
NITE is now available for OS X - yay! Installing all the components required to make this stuff work is not the most trivial task in the world right now, so I am still in the early stages of trying to get it all installed on OS X.
This is so much fun! OpenNI only feeds skeletal data into Unity via the DLL (looking at the source, that's all it's meant to do). Does anybody know of a way to get all of the other data into Unity, like depth info, audio and image info, and maybe even send data to control it?
Actually, the sources of that .dll are in the same archive, and you can build it yourself.
That means you can implement your own functions to get the data you need.
OK, it sounds like the Unity wrapper doesn't work on Macs yet, but someone is looking into refactoring it so that it does.
Installing the prerequisites on OS X is nowhere near as trivial as it is on Windows either. But I have got this stuff installed and working now, so all I am waiting for is the OS X-compatible Unity wrapper.
Rebuilding the DLL does indeed seem to be the way to go. The calibration pose seems quite difficult to recognize, and after a quick look at the NITE documentation, it seems it can be modified.
For now I won't be looking further into this, though. The NITE and OpenNI documentation seem to be very good, so when I need to use Kinect for a project, this is where I'll be looking.
I've tried to follow all the step-by-step instructions stated above, but got error messages when attempting to run the samples:
"The procedure entry point xnProductionNodeRelease could not be located in the dynamic link library OpenNI.dll"
"The procedure entry point xnUSBEnumerateDevices could not be located in the dynamic link library OpenNI.dll"
What should I do?
Thanks in advance,
ajie, assuming you are using Windows, where the Unity wrapper should already be capable of working, I think the first thing to establish is whether you have OpenNI, NITE and the Kinect sensor driver installed OK on your machine. Can you run any of the non-Unity samples, such as the Sinbad demo, without errors?
Which instructions did you follow exactly?
Common problems include not installing OpenNI or NITE, not using the right license key with NITE, or installing the PrimeSense driver instead of the ones that have been modified to work with the Kinect.
Also, you may get better help by asking about these sorts of problems on the OpenNI Google group rather than here.
Generally I didn't find installation on Windows to be all that painful, although it is more than we could reasonably ask the average end-user to do. It's even more of a pain on Linux and OS X. So we are not really at a stage where developers could safely ship stuff that uses OpenNI to end-users. I expect this will improve in time, and at this stage I think PrimeSense are aiming at developers rather than end-users anyway.
I am working in a Windows environment, following bernardfrancois' step-by-step instructions, from installing OpenNI, NITE, and the modded driver for the Kinect sensor. Installation was fine, but then when I tried to run some OpenNI samples, the errors occurred. So I have not got to the Unity wrapper yet. I am not really into OpenNI/NITE development though; as a Unity user I'm just wishing that someday Kinect and Unity could work perfectly side by side.
I'll try to follow the instructions in the Kinect mod README... It may have something to do with the OpenNI/NITE version.
Yeah, I suppose it could be a version issue; I only tried stuff on Windows before the latest unstable releases came out.
OK, it is a version issue after all. Using the unstable release of OpenNI/NITE and then re-installing the PrimeSense Kinect mod solved my problem.
Hi. I would like to do point cloud visualization (like in the openFrameworks addon examples: http://www.creativeapplications.net/wp-content/uploads/2010/11/kylepointclouddepth.png).
But I have no idea how I can get the points in Unity.
I have the UnityWrapper, and I searched inside the NITE.CS file, and I saw that it gets info from "UnityInterface.dll" like:
But I don't have any idea how I can use it to transform that into points in space.
I am sorry if my questions are too basic; I am a newbie ;P.
Hope somebody can help me, or post an example that I can use (like the one pricemap posted before).
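For the "transform it into points in space" part: a depth map becomes a point cloud via pinhole back-projection, using the depth camera's focal length and principal point. A sketch in plain Python (the intrinsics below are commonly quoted approximations for the Kinect depth camera, not official values; in Unity you would do the same math in a script and feed the resulting points to a mesh or particle system):

```python
# Approximate Kinect depth-camera intrinsics (assumed, not official values).
FX, FY = 594.21, 591.04   # focal lengths in pixels
CX, CY = 339.5, 242.7     # principal point for a 640x480 depth image

def depth_to_points(depth, width, height):
    """Back-project a flat list of per-pixel depths (meters, 0 = no reading)
    into a list of (x, y, z) points in camera space."""
    points = []
    for v in range(height):
        for u in range(width):
            z = depth[v * width + u]
            if z <= 0:
                continue  # no depth reading at this pixel
            x = (u - CX) * z / FX
            y = (v - CY) * z / FY
            points.append((x, y, z))
    return points
```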
My company would be happy to purchase your software if it allows us to generate a depth map of a corresponding sensed image. What equipment do I need to purchase? I am writing a technical proposal and would like to be able to place depth maps in it to illustrate ideas I have on creating pseudo-3D images out of orthogonally placed images.
Ed Bachelder, Ph.D.
Principal Research Engineer
Systems Technology Inc.
13766 Hawthorne Blvd
Hawthorne CA 90250
PC, Windows, Kinect, Unity
I noticed that it may be necessary to re-install the Kinect drivers each time Windows is rebooted.
I only rebooted once since installing these drivers for the first time, and they didn't work any more. After reinstalling them, they worked again. I heard that Windows sometimes flushes certain drivers after rebooting (for unknown reasons).
I've rebooted a number of times here, Bernard, and not needed to reinstall - I did have to unplug and plug in my Kinect after a reboot. Looking forward to the Mac OS X version; great stuff.