Not sure if I rendered it right for YT, but hopefully the effect works! A little preview of how the series could look in 3D :), albeit with my cheap equipment.
I’ll confess, I can’t tell the difference, except that in the footage with Tom, the wall in the right image looks grimy. Mind you, SD and HD video look the same to me.
Are you watching it with your eyes crossed?
Very cool. FYI, I find the effect much more obvious when the bot is moving relative to its surroundings, rather than when it’s just the camera moving.
I didn’t have my tripod with me, so I was limited in the height I could use, AND I didn’t want to end up in the shot ;o). Maybe I’ll do another one with the tripod so I can get a higher elevation and make it look more show-like.
How far apart are the left and right lenses? If you’re doing this with a multi-camera smartphone (I have a couple of apps that do this), the lenses are only a centimeter or two apart at best. The parallax (I don’t know if that’s the right term here) may not be enough for the effect to be appreciable at more than a couple of feet from the subject, especially if the background is close behind it. These apps also have the issue of the left and right images being captured at different resolutions, because the app is compensating for the camera lens stacks having been built to operate at different magnifications/diopters.
[Edit — struck out as inapplicable. See later posts]
I do believe this is my camera, or close enough to it, that I bought oh so long ago. I didn’t realize HOW long ago until just yesterday, when I noticed the date setting on the camera doesn’t go up to 2021. Guess they didn’t think someone would keep it that long :).
This camera shoots it all natively in 3D, and even has a screen that shows the effect instantly without the need for glasses. I might have messed up a bit in the post-processing, first via Vegas Pro 18 and THEN to YouTube. Not even sure YT can still handle 3D properly, as it’s kind of gone out of style. But I thought you guys and the MST3K crew hoping to do their first 3D episode might get a kick out of the bots being filmed on a camera built natively for 3D :).
I did not know Sony had made a 3D camcorder! I’m striking out everything I wrote since it’s obviously not applicable to a purpose-built device.
I keep looking at your YouTube footage, and I’m not certain that the same image isn’t being shown for both the left and right eyes. It may be just me, but I’m not seeing much, if any, parallax difference between the left and right images.
But the special episode will be in the “vintage” red/blue 3D
That’s just a way (anaglyph 3D) of combining both the left and right eye images into a single frame so it can be printed, displayed, or projected without special equipment. You still need red and cyan filters to visually separate the images. Later variants of the technique employed in movie theaters use two perpendicularly polarized images (linearly and, later, circularly à la RealD 3D) projected simultaneously onto the screen, along with polarized filters (like polarized sunglasses, but slightly different) over each viewer’s eyes. The polarization techniques don’t generally work with standard monitors, however, and with those it’s common to display the left and right images sequentially with a pair of synchronized shutters (in remotely controlled glasses) that blind each eye to the opposite eye’s image while it’s onscreen.
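For anyone curious, the red/cyan combination is trivial to sketch in code. Here’s a minimal NumPy illustration (the function name and the channel assignment are my own, assuming a standard red-filter-left / cyan-filter-right pair):

```python
import numpy as np

def make_anaglyph(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Combine a left/right RGB pair into one red/cyan anaglyph frame.

    The red channel comes from the left-eye image; green and blue
    (which mix to cyan) come from the right-eye image, so red and
    cyan filters can separate the two views again.
    """
    out = right.copy()
    out[..., 0] = left[..., 0]  # take red from the left eye
    return out

# Tiny synthetic frames: left is red-ish, right is blue-ish.
left = np.full((2, 2, 3), (200, 10, 10), dtype=np.uint8)
right = np.full((2, 2, 3), (10, 10, 200), dtype=np.uint8)
frame = make_anaglyph(left, right)
```

Real anaglyph encoders also do some color correction so skin tones don’t look off through the filters, but the core idea really is just this channel swap.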
Another way is to “stripe” the left and right images and put a lenticular panel over the display so that one set of alternating stripes is steered at the left eye, and the other set is steered at the right eye. This is what was used in the Nintendo 3DS systems. It’s not a method that lends itself well to movie theaters or large monitors, though, and the viewing “sweet spot” is small.
If money’s no object, there are low-power laser-based systems in development that project one or more beams into the eye and draw the appropriate image directly upon the retina, bypassing the need for any kind of screen, filters, blinders, or shutters. Tracking the movement of both eyes to keep the images stable is a non-trivial problem, however.
I couldn’t see it cross-eyed or parallel-viewed. I subtracted the right eye from the left eye (at 24 sec., with YouTube set to 1080p) in Krita and got a pure black frame, which suggests they’re identical at that point.
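The same check can be automated if you grab a side-by-side frame from the video. A rough NumPy sketch (the function name and tolerance are my own; any screenshot of the SBS frame would do as input):

```python
import numpy as np

def halves_identical(frame: np.ndarray, tol: int = 0) -> bool:
    """Split a side-by-side stereo frame down the middle and report
    whether the left and right halves differ by at most `tol` per
    channel. True means zero parallax, i.e. effectively a 2D frame."""
    h, w = frame.shape[:2]
    half = w // 2
    left, right = frame[:, :half], frame[:, half : half * 2]
    diff = np.abs(left.astype(int) - right.astype(int))
    return int(diff.max()) <= tol

# Synthetic check: duplicating one half should read as identical.
half = np.random.default_rng(0).integers(0, 256, (4, 8, 3), dtype=np.uint8)
sbs = np.concatenate([half, half], axis=1)
```

A small nonzero `tol` is worth using on real footage, since YouTube’s compression will leave a little noise between halves even when the source views were the same.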
A quick-and-dirty demux of the playback strongly suggests that, if this was filmed with an HDR-TD10, it’s just one camera’s worth of video.
@Shredder565: You’ll need to extract both cameras’ 2D streams and then stitch them together. But you’ll also need to bake in some corrections that will let a wider audience view the illusion, at a trade-off in the overall effect (because the distance between the cameras is fixed, and the distance between human eyes, from person to person, is not).
If you resign yourself to forced vergence for the illusion, you also lose the ability to move your camera. Try keeping the motion to the items in frame instead.
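To make that trade-off concrete, here’s a rough NumPy sketch of side-by-side packing with a horizontal trim, one common way to bake in a convergence correction (the function name and `shift` parameter are my own illustration, not any particular tool’s workflow):

```python
import numpy as np

def stitch_sbs(left: np.ndarray, right: np.ndarray, shift: int = 0) -> np.ndarray:
    """Pack two equal-sized frames side by side, first cropping `shift`
    columns off opposite edges. The crop horizontally translates one
    view relative to the other, which moves the zero-parallax plane --
    the 'baked-in correction' traded against some of the depth effect
    (and a little frame width)."""
    if shift:
        left = left[:, shift:]    # drop columns from the left edge
        right = right[:, :-shift] # drop columns from the right edge
    return np.concatenate([left, right], axis=1)

# Tiny demo: two fake 6-pixel-wide frames stitched with a 2-column trim.
l = np.zeros((2, 6, 3), dtype=np.uint8)
r = np.full((2, 6, 3), 255, dtype=np.uint8)
sbs = stitch_sbs(l, r, shift=2)  # each half ends up 4 columns wide
```

In practice you’d pick the shift by eye while watching the fused image, since how much forced vergence is comfortable varies from viewer to viewer.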
I’m still learning how to do 3D videos properly. All these years, and I’ve only had occasion to use it a few times. Most of the time folks are just amazed by it. I gave a small demonstration to Jim Cummings, voice of Darkwing Duck, once :).
One day I will do a follow-up post, only with a higher angle and the characters moving. Maybe it can be included in Kinga Vision ;o)