Less mathtasticness for finding hands means more mathtasticness for shiny. There's some microphones on the kinect too, but we don't know how to use them yet, so I'm just gonna ignore them for now. There's a few different sex things you could do with the kinect.
So let's ruin the end of our nice 4-day weekend (in the US, at least) by taking some new technology and donginating it a bit. If you haven't heard about it yet, you're probably not reading this blog right now, because people interested in dildo tech are usually also at least semi-interested in real tech, and this has been all the fuck over the real tech news lately. I just had to stop integrating people's code long enough to actually make the damn blog post.

So you have your normal color picture, as well as how far away things are in the picture, which means we can easily cut out what's in front of and behind a body and just see the body, and then spend our computing power figuring out that body's position and movement instead of just trying to find it in all of the background clutter.

The air vents are pretty small, and honestly, it's a camera. Motion Swinger is basically a joke website with pictures from Second Life that's not all that interesting outside of the following block: "let Kinect scan in your whole body and see how you and your willie rank in comparison to other Motion Swinger users (Internet & Xbox Live required)". Now, there's a few reasons this isn't very easily possible, but I'll get into that later.

Jessica Pixel has all the details you need for streaming Kinect-based motion capture to your Second Life avatar here, the cool hack we saw on her live YouTube stream earlier today. "The software [called Rinions] is developed by Fumi Hax (Fumikazu Iseki IRL) at the Network System Laboratory group at the Tokyo University of Information Science," she says. "There is actually a more recent version of Firestorm (4.6.9) compiled with the Rinions code available now and it's pretty easy to set up and run. I've even been able to get the animation server working, so if another person had the proper software installed, they could see my movement in their viewer in real time (versus over the stream)." Got questions about the setup for Jessica?
The evil looks like this: And now, here's a picture of me doing an impression of the dude from Indiana Jones that got the medallion burned into his hand, in a combination RGB and depth image from the kinect.
Not only that, there's an open source, cross platform driver set available, so you can do whatever your perverted little heart wants with this poor piece of future, thus continuing the human condition of "not being able to have nice things." (Full disclosure: Some of the code in those drivers is my fault, as is the code integration.) But, for those of you who haven't read up yet, here's a quick kinect overview. Instead of having to do all the weirdness with the calibration and the dots in the video, now we just say "only look for things between 2 and 2.5 feet away from the camera", and the camera goes "ok", and we expect to find a hand in there and we just have to look at the depth outline.
It does this by projecting evil rays onto whatever is in front of it and then seeing how much the evil distorts and using that to calculate distance. Being hit by the kinect camera is evil, and is a sin.
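The evil-ray math is just stereo triangulation: the kinect knows its own dot pattern, measures how far each dot shifted sideways, and converts that shift into distance. Here's a minimal sketch of that conversion — note the focal length and baseline numbers below are illustrative assumptions I made up for the example, not calibrated Kinect values:

```python
# Depth from structured light, the short version: a dot that shifts
# (disparity) d pixels between projector and IR camera sits at
#   z = focal_length * baseline / d
# Both constants here are assumed, ballpark-ish values for illustration.
focal_px = 580.0      # IR camera focal length in pixels (assumed)
baseline_m = 0.075    # projector-to-camera baseline in meters (assumed)

def depth_from_disparity(disparity_px: float) -> float:
    """Distance in meters for a dot shifted by `disparity_px` pixels."""
    return focal_px * baseline_m / disparity_px

# Bigger shift = more "distortion" of the evil = closer surface.
print(round(depth_from_disparity(29.0), 2))  # a ~29 px shift is about 1.5 m
```

The real hardware reports raw 11-bit disparity-ish units that people reverse-engineered empirical conversion formulas for, but the triangulation idea is the same.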
This is WAY easier and less mathtastic than the dots.
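The "only look between 2 and 2.5 feet" trick is literally a two-line mask once you have a depth frame. A minimal sketch, assuming the frame shows up as a 640x480 numpy array of millimeter distances — on real hardware you'd grab a raw frame (libfreenect's Python bindings have `freenect.sync_get_depth()`) and convert its raw units first, so treat the fake frame below as a stand-in:

```python
import numpy as np

# Fake depth frame: distances in millimeters, 0 meaning "no reading".
depth_mm = np.zeros((480, 640), dtype=np.uint16)
depth_mm[200:280, 300:360] = 700  # pretend a hand hovers ~0.7 m away

# 2 feet ~ 610 mm, 2.5 feet ~ 762 mm. Keep only that slab of space.
near, far = 610, 762
hand_mask = (depth_mm >= near) & (depth_mm <= far)

# Zero out everything outside the slab; what's left is the hand
# outline, ready for contour/blob finding with zero background clutter.
hand_only = np.where(hand_mask, depth_mm, 0)
```

Compare that to RGB-only hand finding, where you'd be fighting skin-color heuristics and lighting before you even got started.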
I /guess/ it could be insertable (lord knows that never stopped anyone with a wiimote), but...

The Kinect Titty Tracker brings up a lot of interesting properties of the kinect.
First off, it's using depth to find body shape in order to know where to put the images.
This would be a much more difficult problem with just an RGB camera, since you'd have to figure out where the body was in the picture with all of the background clutter behind it.
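With depth, "where's the body" is basically free: mask out a plausible person-distance band and take the blob's bounding box and centroid, which is all an image overlay needs to know. A sketch on made-up data (the depth band and the fake frame are both assumptions for illustration):

```python
import numpy as np

# Fake depth frame in millimeters; one region plays the part of a person.
depth_mm = np.zeros((480, 640), dtype=np.uint16)
depth_mm[100:400, 250:390] = 1500  # "person" standing ~1.5 m away

# Keep pixels in a plausible standing-human distance band (assumed range).
body = (depth_mm > 800) & (depth_mm < 3000)

# Bounding box and centroid of the body blob: everything a texture
# overlay needs in order to know where to draw.
rows, cols = np.nonzero(body)
top, bottom = rows.min(), rows.max()
left, right = cols.min(), cols.max()
center = (rows.mean(), cols.mean())
```

Doing the same from an RGB frame alone means background subtraction, skin models, and a lot more math before you get a single usable coordinate.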