Kinect Hitboxes DX11
Author: gontz
Date: 15 Jan, 2013
Category: tool
Credits: patched by Andrej Boleslavský for Colorbitor
Download
Description
- Kinect Hitboxes tracker DX11 v3.2b
A tool for creating site-specific interactions using Kinect. The DX11 version utilizes the performance of GPU compute shaders. This tracker is a great replacement for all kinds of presence sensors such as PIR detectors, IR gates, and ultrasonic sensors. Installation and setup take only a few minutes.
Demo image sequence included.
Sample projects
- Little Boxes, Bego M. Santiago
- Hopscotch³, dezInterzis
- Composition for a drone, Mária Judová
- Leafstrument, dezInterzis
- ArtistTalk, Andrej Boleslavský
Credits and licence:
Patched and coded for Colorbitor by Andrej Boleslavský CC BY-SA-NC 3.0
Required software:
- vvvv 45beta32 x86
- vvvv addons 45beta32_1
- Kinect for Windows Runtime v1.7
- VVVV Packs DX11 b32 x86 by Vux
- KinectVirtualHitBoxes v2.42b
A tool for triggering events using the depth image of the Kinect. You define a virtual box; when pixels from the point cloud enter that space, it triggers a message. Messages are sent over OSC or as keyboard key presses.
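A minimal sketch of the receiving end over OSC, in Python with the python-osc package; the port 9000 and the /hitbox/* address pattern are assumptions, so check the patch for the values the tool actually sends:

```python
# Minimal OSC receiver sketch for hitbox trigger messages.
# Assumed: the tracker sends to UDP port 9000 with addresses like /hitbox/<id>;
# adjust both to match what the vvvv patch is configured to send.
from pythonosc import dispatcher, osc_server

def on_hitbox(address, *args):
    # address identifies the box, args carry the trigger payload (e.g. 0/1)
    print(f"{address} fired with {args}")

disp = dispatcher.Dispatcher()
disp.map("/hitbox/*", on_hitbox)  # wildcard: react to any box id

server = osc_server.BlockingOSCUDPServer(("0.0.0.0", 9000), disp)
print("listening for hitbox messages on port 9000 ...")
server.serve_forever()
```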
Included patch:
hopscotch³, a collaborative, musical hopscotch as seen on http://vimeo.com/57388809
Tutorial on using it: http://youtu.be/gscalND9gHE
Credits and licence:
Patched for Colorbitor by Andrej Boleslavský, coding support Martin Zrcek, included sample by Nicola Pavone. CC BY-SA-NC 3.0
Required software:
- vvvv 45beta29
- vvvv addons 45beta29
- OpenNI: 1.5.2.23
- OpenNI NITE: 1.5.2.21
- PrimeSense SensorKinect: 5.1.0.25
Comments
Comments are no longer accepted. Please create a new topic in the vvvv beta forum to discuss this contribution.
hi, missing CreatePointCloud2 (value) plugin
sorry, but the plugin is still missing. anyway, looks fun :)
should work now.
Works well here, thank you.
Works here too. I find it very hard to adjust it properly, though. I have been playing with it for a few hours already and can't calibrate it correctly. I almost line up the point cloud with the real image and move the boxes, but nothing happens after that; I can never achieve a hit. Any calibration advice?
Now, it looks interesting and I want to use it. But maybe there is some help file to find out how the program works? Is there a really working sample of the program? I would be very happy, because I am making music with Kinect in a modern classical sense. I am hoping for an answer.
Thank You eglod
OK, either it is hard to align the camera to the real-world position of the Kinect, or I am uber lame... Anyway, I was thinking last night: is it possible to use some of the camera calibration techniques to align it? I ask because I cannot do it on my own. If you are using the depth stream of the Kinect to generate a point cloud, why not use the RGB stream for the camera calibration process? Just a thought; I might be way off in my dreams, but it could be possible, thus making it easier for artists/users to adjust the system and use it for something creative.
hi synth, what do you mean by calibrating the camera? kinect virtual hit boxes is meant for site-specific installations where you add interactivity to a defined space (virtual box) based on the presence of objects (people, animals, etc.).
it's meant as a replacement for traditional sensors such as infrared or ultrasonic proximity sensors, passive infrared motion detectors, light gates, etc.
back to your query about camera calibration - what you generally need to do is rotate and move the point cloud so the significant features of the real space are aligned with the 3d coordinates - the floor is aligned to the XZ plane, a wall to the ZY plane, etc. first you set the correct rotation of the point cloud, then the position... and you are ready to set up the "boxes" and align them to the features of the physical space.
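In code terms, that alignment is just a rigid transform applied to every point. A small numpy sketch of the idea (the tilt angle and offset are made-up values; in practice you tweak them while watching the point cloud against the room):

```python
# Sketch of the "rotate first, then translate" alignment of a Kinect point cloud.
# The angle and offset are placeholders; you adjust them until the floor lies on
# the XZ plane and a wall lies on the ZY plane.
import numpy as np

def rotation_x(deg):
    a = np.radians(deg)
    return np.array([[1, 0, 0],
                     [0, np.cos(a), -np.sin(a)],
                     [0, np.sin(a),  np.cos(a)]])

def align(points, tilt_deg=-12.0, offset=(0.0, 1.2, 0.0)):
    """points: (N, 3) array of XYZ positions from the depth camera."""
    R = rotation_x(tilt_deg)                    # undo the sensor's tilt first
    return points @ R.T + np.asarray(offset)    # then shift, e.g. floor to y = 0

cloud = np.random.rand(1000, 3)                 # stand-in for real depth data
aligned = align(cloud)
```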
see the sample projects using this tracking system http://www.youtube.com/embed/UZTOmCChtGo
some other project suggestions:
- games (such as bowling; the app could evaluate how many pins were knocked down)
- behavioral experiments (rat labyrinths etc.)
- lifted toilet lid detector (could save your relationship)
- intelligent room light (light on when you are in the room, off when you are in bed, different light when you sit behind the desk)
- interactive visuals for parties ("put your hand up!")
- simple shop window interactions (detect the presence of a visitor, add a few interactive buttons)
- musical instruments (musical stairs, drawn musical interfaces, etc.)
Hello synth,
here is a tutorial on using it: http://youtu.be/gscalND9gHE
Erm, as I said... I am super, ultra, uber lame. I totally forgot about the camera controls; that's why it was hard for me to adjust the point-cloud position. By camera calibration techniques I meant something like http://www.kimchiandchips.com/blog/?p=725 by Elliot Woods and Kyle McDonald.
Thank you for the tutorial video; it really showed everything that is supposed to happen.
Thanks Synth, comments and suggestions are always welcome, as we'll keep adding features and publishing updates. Could you elaborate further on your idea about camera calibration?
I believe that in the case of aligning our point cloud, the 3PointsToPlan plugin would do the job. By clicking on three points in the point cloud, it would get aligned to the axes.
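As a rough illustration of what such a three-point alignment would compute (the math only, not the actual plugin; the function and point names are made up): take three clicked floor points, build the plane normal, and rotate the cloud so that normal becomes the up axis.

```python
# Sketch: align a point cloud so a plane defined by three clicked points
# (e.g. three points on the floor) becomes the XZ plane. Illustrative only;
# this is not the plugin mentioned above.
import numpy as np

def rotation_between(a, b):
    """Rotation matrix taking unit vector a onto unit vector b (Rodrigues)."""
    v = np.cross(a, b)
    c = np.dot(a, b)
    if np.isclose(c, -1.0):               # opposite vectors: 180° turn about X
        return np.diag([1.0, -1.0, -1.0])
    vx = np.array([[0, -v[2], v[1]],
                   [v[2], 0, -v[0]],
                   [-v[1], v[0], 0]])
    return np.eye(3) + vx + vx @ vx / (1.0 + c)

def align_to_floor(points, p0, p1, p2):
    normal = np.cross(p1 - p0, p2 - p0)
    normal /= np.linalg.norm(normal)
    if normal[1] < 0:                      # make the plane normal point "up"
        normal = -normal
    R = rotation_between(normal, np.array([0.0, 1.0, 0.0]))
    aligned = points @ R.T
    aligned[:, 1] -= (R @ p0)[1]           # drop the clicked plane to y = 0
    return aligned
```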
thanks id144 for the dx11 version!
hi, kinect pointcloud not working ?!
noir: welcome :) medodawod: interestingly enough, i have the same problem now with a GTX 590, and no shader reports an error. what kind of graphics card do you have?
hi all, I cannot manage to find the Kinect node. Can anyone send it to me?
Hi ERV, did you download the recent addonpack and dx11 pack? Which versions of vvvv, dx11, and the addonpack are you using?
thanks to all who developed it! just wanted to leave a short note that it works really well for me in vvvv_50beta35 :)
one question: is it possible to use multiple kinects with the hitboxes (maybe duplicate the kinect part and select another index... integrate it into the 3d view?!) ;D They don't have to overlap; I just would like to cover a bigger area side by side, then send the OSC signals to Resolume layers etc.
hi @yochee! thanks for the feedback :)
yes, with kinect v1 aka the 360 it is possible to use multiple devices, though you need to adjust the patch a bit, as the output of the kinect node is not spreadable.
probably a faster way to do it is to launch hitboxes on two machines; another option is to launch vvvv in two different instances with the /allowmultiple command line parameter. each instance should use the same table of hitbox coordinates, but the transformation of the kinect position and the device index should be different.
when using multiple kinects on a single machine, it's best to have them on two separate usb host controllers. i wonder how multiple kinects perform if you disable the color stream and connect them to the same usb 3.0 host controller.
kinect v1 devices may overlap; the quality of the depth map is decreased because they jam each other, but on the other hand this does not generate false positives.
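On the receiving side, merging the triggers from two instances can be as simple as listening on two ports. A sketch under assumed ports 9000/9001 and the same assumed /hitbox/* addresses as above:

```python
# Sketch: collect hitbox messages from two vvvv instances, one port each.
# Ports and address pattern are assumptions; match them to the two patches.
import threading
from pythonosc import dispatcher, osc_server

def make_handler(instance):
    def handler(address, *args):
        print(f"kinect {instance}: {address} {args}")
    return handler

servers = []
for instance, port in enumerate((9000, 9001)):
    disp = dispatcher.Dispatcher()
    disp.map("/hitbox/*", make_handler(instance))
    srv = osc_server.ThreadingOSCUDPServer(("0.0.0.0", port), disp)
    threading.Thread(target=srv.serve_forever, daemon=True).start()
    servers.append(srv)

input("listening on ports 9000 and 9001, press enter to quit\n")
```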
Hey, thanks for this great contribution. I'm having issues where the kinect "freezes" until I reset it; this happens somewhat randomly. 45beta34.2_x64
It's not USB or EnergySaving related. Any ideas?
@gegenlicht, welcome :) Yes, it sounds like a USB-related issue. For a permanent installation I did in the past, I used a separate USB controller precisely because of this issue. As a hotfix you may reset the Kinect when it's idle for n frames.
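The idle-reset hotfix is essentially a staleness counter. A sketch of the logic only (get_frame_counter and reset_kinect are hypothetical placeholders; in vvvv this would be patched rather than coded):

```python
# "Reset when idle for n frames": count how long the depth frame counter has
# been stuck and pulse a reset when the limit is reached. The two callables
# are placeholders for whatever your setup provides.
IDLE_LIMIT = 120  # e.g. ~4 seconds at 30 fps

def watchdog_tick(state, get_frame_counter, reset_kinect):
    """Call once per mainloop frame; state is a dict carrying the counters."""
    current = get_frame_counter()          # increments whenever a new depth frame arrives
    if current == state.get("last"):
        state["idle"] = state.get("idle", 0) + 1
    else:
        state["idle"] = 0
        state["last"] = current
    if state["idle"] >= IDLE_LIMIT:
        reset_kinect()                     # e.g. pulse the Kinect node's reset
        state["idle"] = 0
```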
Is it possible to add more hitboxes?
hey @vizmagician sure you can do that!