3D-360 Object on YouTube

15 02 2011

3D at home is moving fast. YouTube announced it now accepts 3D content from cellphones with 3D cameras (a few have just been released). The new YouTube 3D viewer is great: you can choose ‘anaglyph’ glasses (red/green, red/blue, etc.), or if you have a real 3D monitor you can simply tell YouTube which one you use, and soon you will be watching 3D at home the way you saw Avatar.

In the video above you can choose, next to the resolution, the type of 3D viewer you are going to use.

So, how about viewing products: we all know roundshots of objects. The future, then, is that we want to see 3D roundshots. Sadly there is no easy browser software for this (yet), but we do have the great YouTube implementation. So what you can do is turn a 360° roundshot into a short movie: a 3D movie that rotates 360°, hence “3D-360”. “Why not just record a rotating object with a 3D video camera?” Because that would be just too easy; and eventually we want to use the 3D-360 images in a 3D-360 browser window when the time is right (~3 years?).
Let’s put this idea to the test. Project time: 1 hour.

Consider this some kind of “How to make a 3D-360 video of a product or object”.

3D-360 Test Setup

The nice thing about a roundshot is that the angle changes between shots, so we do not need two cameras. Instead we take “x” images; the frames for the left eye are then “1 to x−1” and those for the right eye “2 to x”. In other words: the first 3D frame pairs the 10° and 20° pictures, the second 3D frame pairs the 20° and 30° pictures, and so on. We re-use images, so a single camera suffices. I had decided 10° intervals would be OK for this test, but I now think 2°~5° would have been better: the 10° shift is quite big, which makes the 3D take some effort to see.
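To make the pairing concrete, here is a tiny Matlab sketch of the index bookkeeping (the image count and file names are just this test’s values):

```matlab
% Stereo pairing for a roundshot: with x images taken 10 degrees apart,
% 3D frame k pairs image k (left eye) with image k+1 (right eye).
x = 35;                         % number of roundshot pictures in this test
left  = 1:x-1;                  % indices for the left-eye sequence
right = 2:x;                    % indices for the right-eye sequence
for k = 1:x-1
    fprintf('3D frame %2d: left = img%02d.jpg, right = img%02d.jpg\n', ...
            k, left(k), right(k));
end
```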

What we will do

  1. Take 35 pictures (one every 10°, roughly a full circle)
  2. Batch-process the 35 pictures (Photoshop?) to a square crop with the background removed
  3. Make a movie (.avi) from pictures 1 to 34 for the left eye, and another movie from pictures 2 to 35 for the right eye (see the sketch after this list)
  4. Merge the two movies together in the format YouTube wants
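If you would rather script step 3 than use a separate tool, a minimal Matlab sketch could look like the following; the file names and the frame rate are assumptions, and it uses Matlab’s VideoWriter (R2010b or newer):

```matlab
% Write the left-eye and right-eye movies from the batched pictures.
% Assumes the pictures are named img01.jpg .. img35.jpg.
x = 35;  fps = 10;                          % frame rate is an arbitrary choice
vL = VideoWriter('left.avi');   vL.FrameRate = fps;  open(vL);
vR = VideoWriter('right.avi');  vR.FrameRate = fps;  open(vR);
for k = 1:x-1
    writeVideo(vL, imread(sprintf('img%02d.jpg', k)));    % pictures 1..34
    writeVideo(vR, imread(sprintf('img%02d.jpg', k+1)));  % pictures 2..35
end
close(vL);  close(vR);
```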

Batched pictures..

I used Photolapse (freeware, a single .exe) to merge the two picture sequences into video files (.avi).

Make two movies from pictures

Then we import these two video files into StereoMovie Maker (freeware, a single .exe); NASA actually uses this software for education.

Import two videos

Press Save Stereo (1) and reduce the size (YouTube wants this format).
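For reference, the side-by-side merge that StereoMovie Maker does here can also be sketched in Matlab, assuming the left.avi/right.avi from the previous step (VideoReader, R2010b or newer; which eye ends up on which half is ultimately governed by your yt3d tags):

```matlab
% Merge the two movies into one side-by-side frame, as YT3D expects.
vL  = VideoReader('left.avi');
vR  = VideoReader('right.avi');
out = VideoWriter('sbs.avi');  out.FrameRate = vL.FrameRate;  open(out);
for k = 1:min(vL.NumberOfFrames, vR.NumberOfFrames)
    writeVideo(out, [read(vL, k), read(vR, k)]);  % left eye left, right eye right
end
close(out);
```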

I ended up with these files; the merged stereo file is the one to upload.

Upload it to YouTube and add the tag “yt3d:enable=true” when uploading.

Then you should see this after uploading.

As you can see from the image above, we can choose the 3D method we want to see! Below I have put the three most common 3D options next to each other. And oh yeah, my video is 1080p, full HD of course =)

When watching, you can switch to whichever 3D method you like!

Final Note

Sadly though, 3D TVs aren’t compatible with YouTube 3D (or YT3D) YET. Not even all PC 3D solutions with LCDs and glasses are YT3D compatible yet, let alone 3D phones! But this is cutting edge: Sony announced that this summer it will incorporate the YT3D viewer in its 3D TVs. I am convinced 3D is going to get very big, but we are not there yet. YouTube has made a nice step in the right direction!

Questions and so forth in the comments.





3D Realtime Face Reconstruction with Matlab and Kinect

14 02 2011

This post has been updated and moved to a new dedicated website HERE

I managed to map the texture onto the point grid acquired from the Kinect 3D image.

This is what my program does (a rough sketch of steps 2-4 follows the list):

  1. Gets the 3D + live image
  2. Converts the 3D coordinates to real-world coordinates
  3. Makes a triangular mesh
  4. Fills the entire triangular mesh with the correct colors (from the live image)
  5. Starts again from 1
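For the curious, here is a rough Matlab sketch of steps 2-4. The depth map (in mm), the aligned RGB image and the focal length are placeholders you would fill in from your own Kinect grab:

```matlab
% Steps 2-4 as a sketch: back-project depth pixels to real-world
% coordinates, build a triangular mesh, and color it from the live image.
% 'depth' (h-by-w, mm) and 'rgb' (h-by-w-by-3) are assumed inputs.
f = 580;                                   % rough Kinect focal length [pixels]
s = 4;                                     % subsample to keep delaunay tractable
depth = depth(1:s:end, 1:s:end);
rgb   = rgb(1:s:end, 1:s:end, :);

[h, w] = size(depth);
[u, v] = meshgrid(1:w, 1:h);
Z = double(depth);
X = (u - w/2) .* Z / f;                    % real-world X [mm]
Y = (v - h/2) .* Z / f;                    % real-world Y [mm]

valid = Z(:) > 0;                          % drop pixels with no depth reading
P   = [X(:), Y(:), Z(:)];                P   = P(valid, :);
col = reshape(double(rgb), [], 3) / 255;   col = col(valid, :);

tri = delaunay(P(:,1), P(:,2));            % triangular mesh over the points
trisurf(tri, P(:,1), P(:,2), P(:,3), ...   % color each vertex from the live image
        'FaceVertexCData', col, 'FaceColor', 'interp', 'EdgeColor', 'none');
axis equal; view(3);
```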

Specs:

  • ~2 seconds per frame (0.5 FPS)
  • Accurate to ~5mm
  • Works from 50cm to at least 700cm
  • 0.3 MP Resolution
  • If not run in real time (i.e. post-processing), the framerate would be 30 FPS

See below a YouTube video of my face, which consists of around 40,000 data points:

And then follows the code (in the full post). Looks basic? It was; I spent a total of 2 hours on it.






“3D-360” Object Exposition

13 02 2011

First, inspire yourself by looking at this image:

Imagine a virtual world in your hands

Then, imagine looking at a static object in 3D on a screen. Now imagine the object in front of you, and moving it by moving your hands through the air as if you were holding it (see my previous post, Sky touch 3D objects, for this method).

I have tried to explain this graphically in the self-explanatory image below. This solution would cost around €700; if you already have two projectors, it would cost around €150; if you already have a 3D camera (as I have), ~€30; and if you already have polarization filters, ~€0. Wait, I have all those things, so I will be physically testing this setup shortly:

3D-360 Virtual object interactive viewer





3D Photographic Exposition

13 02 2011

Lately I have been thinking about how one could give an affordable 3D photographic exposition for still images. Obviously this is possible by just taking a 3D projector or a 3D-ready computer/TV screen and showing the 3D image with shutter glasses. But that is more for video, and that solution is too expensive (>€500 per exposed image) and too complex for photographic exhibitions. So last night, before I fell asleep, I solved my problem for about €50 per exposed picture. Moreover, we do not need €100 shutter glasses for this, but can make do with easily available €1~€3 polarized glasses.

I have tried to explain my idea graphically in the self-explanatory image below. You only need a transparent negative of every image, which anyone can print on transparent paper. The offset and skew you introduce because the projectors sit side by side can be solved by skewing the images physically on the projector, or by calculating the correction in software, skewing the images in Photoshop, and printing afterwards.

3D Photographic Exhibition solution
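To indicate what the software route could look like: below is a minimal Matlab sketch of the skew (keystone) correction using a projective warp, assuming you have measured where the four corners of a test rectangle actually land on the sheet (the file name and corner values here are made up for illustration):

```matlab
% Pre-distort a slide so an off-axis projector shows it rectangular.
img = imread('slide_left.png');                 % hypothetical slide for one eye
[h, w, ~] = size(img);
target   = [1 1; w 1; w h; 1 h];                % where the corners should be
measured = [30 10; w-5 40; w-40 h-20; 5 h-60];  % measured corner positions (made up)
T = maketform('projective', measured, target);  % warp that undoes the projector skew
corrected = imtransform(img, T, 'XData', [1 w], 'YData', [1 h]);
imwrite(corrected, 'slide_left_corrected.png'); % print this on the transparency
```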


Practicability

This set-up could be implemented by projecting from the back of the projection sheet, with the viewer in front. Alternatively, the projectors could be mounted on the ceiling, or more generally anywhere in front of the sheet, next to the viewer. Every viewer would wear low-cost polarized glasses.





Sky touch 3D objects

2 02 2011

This post has been updated and moved to a new dedicated website





Kinect on Windows 7 with Matlab

1 02 2011

This post has been moved to HERE

I have recently purchased a Kinect camera (although I do not even own an Xbox). Its main purpose is to get multi-touch working. Well, multi-touch: multi-sky-touch, that is. For this I would be programming in C, but for now it is easier to develop in Matlab, so let’s focus on Matlab.

Kinect Hardware connections

The Kinect comes with a normal USB connector, split off to a normal power adapter. It gets quite hot after a while.


Kinect Drivers

There are three options to get the driver working on your Windows machine (and one for your OS X beast):

  • CLNUI (untested)
  • Open-Kinect
  • OpenNI/NITE

Open-Kinect

If we are going to use Open-Kinect with Matlab, we will be using an external DLL written for Matlab once the Kinect is working, via the Image Acquisition Toolbox (IMAQ). This seemed like the right option, since the Image Acquisition Toolbox is made especially for this kind of streaming video data, for which you would otherwise need buggy loops. However, I had a lot of trouble with it, so after some problems with IMAQ I switched to the other driver. Also note that you need Visual Studio for this approach.

Open-NI and NITE

This worked great for me. It is not entirely open source, but I was not going to touch the source code much anyway. So, let’s get this working:

  1. Download the Kinect drivers (https://github.com/avin2/SensorKinect): press the DOWNLOAD button, then search for the file dpinst-x86.exe and run it.
  2. Download the unstable OpenNI binary, which worked for me (http://www.openni.org/downloadfiles/2-openni-binaries); otherwise try the stable one. Scroll down, press LATEST UNSTABLE and download.
  3. Download the unstable NITE binary (http://www.openni.org/downloadfiles/2-openni-binaries), in the same fashion as (2). The install key is 0KOIk2JeIBYClPWVnMoRKn5cdY4=
  4. Download the unstable binary from http://www.openni.org/downloadfiles/30-openni-compliant-hardware-binaries and install it.
  5. Plug in the Kinect. In the Device Manager it should list Xbox NUI Audio as unknown, which is expected, and under PrimeSensor you should see your Cam and Motor.
  6. For this step, check http://www.studentguru.gr/blogs/vangos/archive/2011/01/20/how-to-successfully-install-kinect-windows-openni-nite.aspx
  7. Run a test to make sure everything works, from C:\Program Files\OpenNI\Samples\Bin\Release and C:\Program Files\Prime Sense\NITE\Samples\Bin\Release. If the samples do not work, do not proceed; switch binaries (stable or unstable this time).
  8. Get http://sourceforge.net/projects/kinect-mex/ , press DOWNLOAD and unpack; then go to Matlab and run one of the test .m files. You can only execute the files that have _sample in their name; the rest is compiled stuff you should not touch. Also, don’t touch the .XML files. Now the hacking begins (a minimal loop is sketched below this list)!
  9. If you have any problems, drop a line in the comments.
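To give an idea of where the hacking starts, here is a minimal acquisition loop. The mxNi* entry points and the .XML config name are taken from memory of the kinect-mex samples, so verify them against the _sample scripts in your download:

```matlab
% Minimal kinect-mex acquisition loop (sketch; mirror the shipped
% _sample scripts for the exact function names and array layout).
context = mxNiCreateContext('SamplesConfig.xml');  % one of the shipped .XML files
for k = 1:100
    mxNiUpdateContext(context);        % wait for a fresh Kinect frame
    rgb   = mxNiPhoto(context);        % live RGB image
    depth = mxNiDepth(context);        % depth map in millimeters
    subplot(1,2,1); imagesc(rgb);   axis image; title('RGB');
    subplot(1,2,2); imagesc(depth); axis image; title('Depth [mm]');
    drawnow;
end
mxNiDeleteContext(context);            % release the Kinect
```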

Specifications

  • Streaming RGB and depth image: 20 FPS
  • Streaming depth image only: 30~35 FPS
  • Streaming without viewing: 35~50 FPS

Here is a sample video in which I 3D-reconstruct my desk area.


Update: The Kinect also records audio, of course. As far as I know, there are no drivers yet to get audio out of the Kinect. This holds true at least for Matlab, and I am not sure whether anyone has managed to get the audio into Windows at all.

Update 2: This Kinect method does not work with Matlab 2008; try upgrading your version.





Tweede Kamer Touchscreens

1 02 2011

At the Dutch parliament (Tweede Kamer der Staten-Generaal) they have touchscreens you can use to see information about the people who work there. The screens themselves are great: they are set up in big tables and can be used by up to four people at a time. So far I have seen two such set-ups. The idea is great, but the software was (sorry..) dreadful. I was expecting at least Windows 7 multi-touch, but there was no such thing as multi-touch at all; it was more like single touch. The interface looks great and the information is A+, really nice. But the software is very slow, although graphically it is not intensive at all. I also hit some errors; the worst occurred when you accidentally touched the screen with more than one finger, after which the software got stuck in some brilliant loop. Although cameras are not allowed, I filmed the experience.