Omnifocus pro focus

8/30/2023

Ordinary cameras, for centuries now since the very first experiments in optics, have relied on organizing lenses in sequence to recreate an image. Moving the lenses around creates zoom, changes focus, and results in more or less light transmission. The device described by Professor Keigo Iizuka at the University of Toronto breaks with that tradition. As you can see in the images, objects only centimeters from the front of the device are as sharp as objects several meters away.

Traditional cameras could make that happen using an extremely small aperture. At F/22, a common minimum aperture value, you're essentially getting a pinhole image, and the way the light is bent and re-bent results in the entire image being in one focal plane. I doubt this new device is simply a pinhole camera, though. They call it a "Divergence-ratio Axi-vision Camera," or Divcam for short. Not a lot of clues there, except perhaps for an optics expert.

I would guess, though, that a complete and flat image is created via a polarized, flat "lens" (for lack of a better term), and the light is sent in parallel back to a high-sensitivity sensor. Notice that the images you see have an extremely narrow field of view, which supports my theory: a rounded lens would produce both a larger field of view and divergent light rays within the device, and that would make the images we see impossible.

The full-frame crop here also suggests a large, low-resolution sensor and parallel rays. See how the edge of the finger shows all-or-nothing pixellation? There's absolutely no overlap between the light coming from the doll and the light coming from the finger, suggesting the camera/sensor is only accepting light that is coming straight at it. I'm not sure if I'm explaining it correctly, but it makes sense to me. There's a trick here; I just can't quite figure it out.

I should have known: that pixel occlusion pattern is totally the result of a software "magic wand." Plus, the ability to determine distance implies some form of stereoscopy. I thought it was a pretty good guess, though. We'll keep you posted on this new technology. In the meantime, a lot of consumers are just starting to discover depth of field in their video as they start shooting with cameras like the T2i. Whatever the case, they think they can apply it to the video world in general, and I hope they do.
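The claim that a very small aperture puts nearly everything in focus can be sanity-checked with the standard hyperfocal-distance formula. This is a quick sketch, not anything from the Divcam itself; the 50 mm focal length and 0.03 mm circle of confusion below are conventional full-frame assumptions chosen for illustration:

```python
import math

def hyperfocal_mm(focal_mm: float, f_number: float, coc_mm: float = 0.03) -> float:
    """Hyperfocal distance: focus here and everything from half this
    distance out to infinity is acceptably sharp."""
    return focal_mm ** 2 / (f_number * coc_mm) + focal_mm

# Same 50 mm lens, progressively smaller apertures:
for f_number in (2.8, 8, 22):
    h = hyperfocal_mm(50, f_number)
    print(f"f/{f_number}: hyperfocal ~ {h / 1000:.1f} m, "
          f"sharp from ~ {h / 2000:.1f} m to infinity")
```

Stopping down from f/2.8 to f/22 pulls the hyperfocal distance from roughly 30 m in to under 4 m, which is why a pinhole-sized aperture makes depth of field all but disappear — though, as noted above, the Divcam images are sharp mere centimeters from the lens, which even f/22 on an ordinary lens won't deliver.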
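The "extremely narrow field of view" observation can also be made concrete. For a simple rectilinear lens, the angle of view follows from sensor width and focal length; the 36 mm sensor width and the two focal lengths below are stand-in values for comparison, not measurements of the device:

```python
import math

def fov_degrees(sensor_width_mm: float, focal_mm: float) -> float:
    """Horizontal angle of view for a rectilinear lens."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

# Same 36 mm-wide sensor behind a short vs. a long lens:
print(f"24 mm lens:  {fov_degrees(36, 24):.0f} degrees")   # wide angle
print(f"200 mm lens: {fov_degrees(36, 200):.0f} degrees")  # narrow angle
```

A narrow angle of view means the rays reaching the sensor arrive nearly parallel to the optical axis, which is consistent with the parallel-ray guess above.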