



Subject: Poser lens length for human eye?


Cage ( ) posted Sat, 01 June 2013 at 2:04 PM · edited Mon, 25 November 2024 at 4:42 PM

The "quick answer" to the general question of the lens length of the human eye seems to be 50mm, or so Google tells me.  But if I dig a bit, there is debate about this, at least for 3D representation.  Does binocularity affect the matter, or other factors?  To be Poser-specific, is Poser's camera handling consistent with the more general debate?  Is it the same as in other 3D contexts, or are there quirks to consider?

Trying to "eyeball" it (is that a pun?  No?  Umm.  :unsure:), setting a dolly camera to 50mm Focal doesn't seem to accurately represent what I would expect to see if I were standing in the environment.  I have to go much lower before I feel comfortable with the peripheral range of the displayed image.  But I'd rather be accurate than guess, or I'd at least like to know the correct answer before I decide whether to favor it or not.

Does anyone have any idea?  😕

===========================sigline======================================================

Cage can be an opinionated jerk who posts without thinking.  He apologizes for this.  He's honestly not trying to be a turkeyhead.

Cage had some freebies, compatible with Poser 11 and below.  His Python scripts were saved at archive.org, along with the rest of the Morphography site, where they were hosted.


JoePublic ( ) posted Sat, 01 June 2013 at 2:42 PM

Content Advisory! This message contains nudity

[Attached image: file_494891.jpg]

 

Sorry, no definite answer, but I'm using 30mm outside (See pic), 22mm inside and about 15mm for close-ups.

50mm looks way too "flat", IMO.


ockham ( ) posted Sat, 01 June 2013 at 3:45 PM

I don't think you can adjust any camera, physical or digital, to match the human eye.  Way too much of our processing power is in the brain itself, which is constantly adjusting the lens, the iris, the sensitivity of the retina, and innumerable filters and detectors along the pathway.

Lots of experiments have shown that the brain can produce the same image no matter how the physical parts of the eye are distorted by external lenses or by disease conditions.

My python page
My ShareCG freebies


MikeMoss ( ) posted Sat, 01 June 2013 at 4:34 PM

Hi

That's an interesting question.

I've been creating 3D stereo images in Poser and I never thought about how using different focal length settings could affect the way the final image looks.

Will it have the same effect as using wide or telephoto lenses when viewed in stereo?

One more thing to try out.

Mike

If you shoot a mime, do you need a silencer?


SamTherapy ( ) posted Sat, 01 June 2013 at 4:55 PM

It's also said the focal length of the human eye is 25mm, approximately the size of the eye itself.

As everyone else has remarked, there's no easy way to reproduce what - or how - a human sees because there's a lot going on in the brain which we still don't understand.

Case in point, the human eye is actually a pretty ropey piece of optical equipment.  The mechanics of how the eye works are well known and documented.  The real fun begins after the nerve impulses leave the back of the eye, since nobody knows how they're turned into an image of such remarkable fidelity.  I use fidelity in a rather loose sense here, since we can't see a great deal of the EM spectrum, nor can we see in as much clarity as some animals, and compared to pigeons, our retina responds with glacial slowness.  Still, given all that, we can see pretty damn well.

There's some recent research which suggests we recognize things by their edges first, and our perceptions respond to bumpiness most of all.  So I guess a silhouette of NVIATWAS would be top of the list for recognition. :D

 

 

Coppula eam se non posit acceptera jocularum.

My Store

My Gallery


moogal ( ) posted Sat, 01 June 2013 at 5:44 PM

Quote - I have to go much lower before I feel comfortable with the peripheral range of the displayed image.

I think that is the problem right there.  You want to see everything in the picture that you think you would see standing there, but a picture is rarely going to be as wide as your field of view.


Cage ( ) posted Sat, 01 June 2013 at 6:30 PM

So... I get the sense that this is not such a simple question.  :lol:  Thank you for the interesting responses.  I think I may make myself a small box, kind of a set of blinders with top, bottom, and sides, to look through.  See if I can get a sense of how much I should see, with my peripheral vision clipped to the range of the render area.  Hmm.

===========================sigline======================================================

Cage can be an opinionated jerk who posts without thinking.  He apologizes for this.  He's honestly not trying to be a turkeyhead.

Cage had some freebies, compatible with Poser 11 and below.  His Python scripts were saved at archive.org, along with the rest of the Morphography site, where they were hosted.


mylemonblue ( ) posted Sat, 01 June 2013 at 7:52 PM

The old rule of thumb, with the photographers I associated with over ten years ago, was that for the 35mm format a lens of a bit over 40mm was closest to the human eye. Thing is, that's an odd size, and the manufacturers settled on 50mm. I imagine 50mm was a more convenient number with less distortion. Some would say that 75mm-100mm was about a normal lens for medium format cameras, depending on which medium format you used. Here's the rub: in Poser I have no idea if it's designed with any known film format in mind.

My brain is just a toy box filled with weird things


MikeMoss ( ) posted Sat, 01 June 2013 at 9:25 PM

Hi

The thing is that the human eye has a very wide field of view.

While I'm sitting here looking at my monitor I can see things across a field of view greater than 180 degrees.

The verticals are still vertical even in the corner of the room almost 90 degrees from where I'm looking.

When you do that with a lens you get distortion.  Lines curve, the verticals go off, and things look like a fishbowl.

Your eyes don't do that because your brain processes the image.

I'm an example: I've had eye surgery on one eye, and I will have the other eye done this summer.

I can see well out of one eye, really poorly out of the other.

But I can still see better with both because my brain combines the two images, the bad and the good, and comes up with an image better than either.

I will experiment with focal length to see what effect it has on stereo images as soon as I have time; I've never thought about making a fisheye-lens kind of 3D image.

But it will be easy to set up a scene and then make right and left images with different focal lengths.
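For anyone who wants to try the same experiment, here is a minimal PoserPython sketch, not a tested script: it assumes the standard poser module with the usual "Focal" and "DollyX" camera dials and the Scene.Render()/SaveImage() calls, the eye separation is an arbitrary value you would adjust for your scene scale, and it simply shifts the camera along world X, so it assumes the camera is facing roughly down the Z axis.

    # Rough sketch: render a left/right stereo pair at one focal length by
    # shifting the current camera sideways between renders.
    import poser

    scene = poser.Scene()
    cam = scene.CurrentCamera()

    FOCAL_MM = 35.0      # focal length to test
    HALF_EYE_SEP = 0.01  # half the eye separation, in Poser native units (guess)

    cam.Parameter("Focal").SetValue(FOCAL_MM)
    base_x = cam.Parameter("DollyX").Value()

    for label, shift in (("left", -HALF_EYE_SEP), ("right", +HALF_EYE_SEP)):
        cam.Parameter("DollyX").SetValue(base_x + shift)
        scene.Render()
        scene.SaveImage("png", "stereo_%s_%dmm.png" % (label, int(FOCAL_MM)))

    cam.Parameter("DollyX").SetValue(base_x)  # put the camera back where it was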

Mike

If you shoot a mime, do you need a silencer?


Latexluv ( ) posted Sat, 01 June 2013 at 9:25 PM

[Attached image: file_494895.jpg]

I don't know how close to real-world cameras Poser's cams are supposed to be, but since I was doing some shader testing, I thought I'd do a couple of renders with the Face Cam.

Here's 40mm. Looks very surreal!

"A lonely climber walks a tightrope to where dreams are born and never die!" - Billy Thorpe, song: Edge of Madness, album: East of Eden's Gate

Weapons of choice:

Poser Pro 2012, SR2, Paintshop Pro 8

 

 


Latexluv ( ) posted Sat, 01 June 2013 at 9:26 PM

[Attached image: file_494896.jpg]

And this is what I usually use on the Face Cam -- 125 mm.

"A lonely climber walks a tightrope to where dreams are born and never die!" - Billy Thorpe, song: Edge of Madness, album: East of Eden's Gate

Weapons of choice:

Poser Pro 2012, SR2, Paintshop Pro 8

 

 


JoePublic ( ) posted Sat, 01 June 2013 at 10:10 PM

[Attached image: file_494897.jpg]

Except for when I'm rigging, I only use the Main Camera. The face cam is much "closer" to the subject, so distortions caused by "short" lenses will get exaggerated.

Here's 30mm with the main camera.


JoePublic ( ) posted Sat, 01 June 2013 at 10:11 PM · edited Sat, 01 June 2013 at 10:12 PM

[Attached image: file_494898.jpg]

And here's 50mm. Not bad, but a lot of background info and depth is lost.


bagginsbill ( ) posted Sat, 01 June 2013 at 11:20 PM · edited Sat, 01 June 2013 at 11:20 PM

The idea of 50mm matching the human eye has more to do with the viewfinder than anything else.

If you hold a 35mm SLR camera against one eye, and look with both eyes open, and adjust the focal length, the focal length (to the nearest 10) that comes closest to matching both eyes (or matching what you'd see with just a window held in front of your eye) is 50.

50mm in a 35mm SLR is not going to give the same magnification in a medium format camera, nor in a DX format DSLR.

Furthermore, the Poser 50mm is actually off by a factor of 1.4x versus a 35mm SLR camera. It's much closer to the DSLR DX format - though by accident, not design.

So - in Poser, you would choose 35mm to match the magnification of what you'd see in an un-magnified rectangular window held up in front of your eye.
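To put rough numbers on that, the angle-of-view arithmetic for an ideal rectilinear lens looks like this (plain Python; the 36 mm frame width is the real 35mm-film figure, while the 1.4 factor is just the observation above, not anything from Poser's documentation):

    import math

    def hfov_deg(focal_mm, frame_width_mm):
        # horizontal angle of view of an ideal rectilinear lens
        return math.degrees(2.0 * math.atan(frame_width_mm / (2.0 * focal_mm)))

    FULL_FRAME_WIDTH = 36.0   # 35mm film frame width
    POSER_FACTOR = 1.4        # assumed offset, taken from the post above

    print(round(hfov_deg(50.0, FULL_FRAME_WIDTH), 1))                 # ~39.6 deg: 50mm on a 35mm SLR
    print(round(hfov_deg(35.0 * POSER_FACTOR, FULL_FRAME_WIDTH), 1))  # ~40.3 deg: "Poser 35mm"

So a Poser 35mm and an SLR 50mm cover nearly the same horizontal angle, if the 1.4 figure holds.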


Renderosity forum reply notifications are wonky. If I read a follow-up in a thread, but I don't myself reply, then notifications no longer happen AT ALL on that thread. So if I seem to be ignoring a question, that's why. (Updated September 23, 2019)


hborre ( ) posted Sat, 01 June 2013 at 11:20 PM

Yes, but look at the compression.  Background detail with the 50mm lens looks closer than with the 30mm, which, in reality, is a wide-angle lens.


bagginsbill ( ) posted Sat, 01 June 2013 at 11:25 PM


Renderosity forum reply notifications are wonky. If I read a follow-up in a thread, but I don't myself reply, then notifications no longer happen AT ALL on that thread. So if I seem to be ignoring a question, that's why. (Updated September 23, 2019)


-Timberwolf- ( ) posted Sun, 02 June 2013 at 6:33 AM

You will never have a human-eye view. The first reason is that human beings have two of them, which gives a wider view. The second, more important, reason is that the eyes are always in motion, very quickly; they kind of scan the environment. The brain stitches a complete picture out of those visual samples. Try this: close one eye and focus the other on a point on the wall. You'll see that you don't actually see much, no big view. You see with your brain, not with your eyes. ;-)


caisson ( ) posted Sun, 02 June 2013 at 7:36 AM

Eyes are like cameras in the same way that brains are like computers; that is to say, they are completely dissimilar things. It may be useful sometimes to draw rough analogies between them, but it can be misleading. For example, each eye has a blind spot, a little off to the side of the centre of your vision, about the size of a lemon held at arm's length (due to where the optic nerve leaves the back of the retina). The brain seamlessly fills in this blind spot so you never notice it - eyes and brains work together to interpret the visual environment.

Back when I was learning photography in the pre-digital age, it seemed to be generally held that the most commonly used lens (in the 35mm format) for photojournalism was 35mm as it came closest to approximating the human field of view, but for portraiture it was 85mm as it was the best approximation of our perception of perspective.

----------------------------------------

Not approved by Scarfolk Council. For more information please reread. Or visit my local shop.


bagginsbill ( ) posted Sun, 02 June 2013 at 7:42 AM

Quote - for portraiture it was 85mm as it was the best approximation of our perception of perspective.

Not quite. Perspective lines are defined entirely by where you're standing and are not influenced by focal length. Focal length does influence magnification. So the issue is how to stand at a normal distance, like you're talking to a person, but still fill the photo with the face / portrait. For this reason we use 85mm to 105mm, so that we don't have to get 2 feet away to fill the frame, which would result in a very odd perspective that we're not used to.
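A quick way to see this with arithmetic, using a toy pinhole model (nothing Poser-specific, just similar triangles):

    # Image size on the film is roughly real_size * focal / distance, so the
    # RATIO of two objects' image sizes depends only on their distances.
    def image_size_mm(real_size_mm, distance_mm, focal_mm):
        return real_size_mm * focal_mm / distance_mm

    for focal in (35.0, 85.0, 105.0):
        face = image_size_mm(250.0, 2000.0, focal)    # 25 cm head, 2 m away
        door = image_size_mm(2000.0, 10000.0, focal)  # 2 m doorway, 10 m away
        print(focal, round(face / door, 3))           # 0.625 at every focal length

Changing focal length scales the face and the doorway by the same amount; only moving the camera changes their relative sizes, which is what we read as perspective.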


Renderosity forum reply notifications are wonky. If I read a follow-up in a thread, but I don't myself reply, then notifications no longer happen AT ALL on that thread. So if I seem to be ignoring a question, that's why. (Updated September 23, 2019)


caisson ( ) posted Sun, 02 June 2013 at 8:05 AM

Thanks for clarifying - I should have explained what I meant better!

----------------------------------------

Not approved by Scarfolk Council. For more information please reread. Or visit my local shop.


aRtBee ( ) posted Sun, 02 June 2013 at 11:40 AM

When I look forward, one eye covered, the other one relaxed, I can see one arm width from left to right, at one arm length. From geometry this implies that my field of view is roughly 30° from center to side (tan of the half-angle = 0.5, strictly about 27°) for one eye, without eye-muscle usage / eyeball movements.

When I put a number of pylons in a Poser scene and start zooming, I find that a field of view of 30° matches a focal length of 45mm. When I zoom out a bit to F=35mm, the result sort of matches the field of view of TWO eyes, in relaxed mode.

This matches the standard lens for compact digital cameras. High end digital cameras have standard lenses with a larger focal length (50mm, or even 70mm on Hasselblad), while the good old analog cameras had a 50mm standard lens as well.

When comparing a camera with my eye, my problem is that I've a hard time shutting off the auto-focus and auto-lighting functions of my eye. This distorts any conclusions.

When using a (compact) camera, 20-28mm is good for landscapes, 35-50 is good for scene overviews and cars, 70-100 is good for portraits, and 150-200 is good for fashion, hair, makeup, jewelry (all deliberately flattening the model) and product closeup shots. A Hasselblad needs different (larger) settings.
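For reference, the geometry in the arm's-length test above is just focal = (frame width / 2) / tan(half-angle). A sketch, assuming a 36 mm frame width for the sake of argument (whatever effective frame width Poser really uses is what turns this into the measured 45 mm):

    import math

    def focal_for_half_angle(half_angle_deg, frame_width_mm=36.0):
        # focal length whose horizontal half-angle of view equals half_angle_deg
        # on a frame of the given width (simple rectilinear model)
        return (frame_width_mm / 2.0) / math.tan(math.radians(half_angle_deg))

    # "one arm width at one arm length" => tan(half-angle) = 0.5 => about 26.6 deg
    print(round(focal_for_half_angle(26.6), 1))   # ~35.9 mm
    print(round(focal_for_half_angle(30.0), 1))   # ~31.2 mm

That the Poser zoom test landed at 45 mm rather than 31-36 mm is at least roughly consistent with the 1.4x offset bagginsbill mentioned (45 / 1.4 is about 32), though that is only a back-of-envelope check.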

Just my 2 cents to the debate.
Have fun.

- - - - - 

Usually I'm wrong. But to be effective and efficient, I don't need to be correct or accurate.

visit www.aRtBeeWeb.nl (works) or Missing Manuals (tutorials & reviews) - both need an update though


Keith ( ) posted Sun, 02 June 2013 at 12:20 PM

There's another significant issue: if you try to create an image replicating what the eye sees, someone looking at that image may very well decide the image is wrong because they know what photographs look like and your image doesn't look like it.

It's called The Coconut Effect. It's the same reason lens flares are added to completely CGI scenes.



bobbesch ( ) posted Sun, 02 June 2013 at 12:44 PM

Quote - The idea of 50mm matching the human eye has more to do with the viewfinder than anything else.

....

50mm in a 35mm SLR is not going to give the same magnification in a medium format camera, nor in a DX format DSLR.

 

The "normal" focal length of a given film or sensor format is roughly its diagonal. For 35 mm film, it's actually 43,27 mm, for classic middle format (6 x 7) it's 88,87 mm. What remains constant (apart from the aspect ratio) is the angular field.


MikeMoss ( ) posted Sun, 02 June 2013 at 12:49 PM

Hi

I'm pretty sure that your field of view with one eye closed is a lot more than 30 degrees.

I'd say it's more than 90.

I can stare with my left eye at the center of my monitor and can see the lamp on the left side of the room that is actually slightly behind me.

Of course it doesn't look as clear as what I'm staring at, but it is visible. That's what you can't do on screen, and it's what makes you feel you are in the environment rather than just looking at it.

Gamers are trying to get this in-the-world effect by using multiple monitors that display the peripheral vision on screens set at a 45-degree angle to the center one.

I'd do that if I could afford to buy 2 more 27" 144 Hz monitors.

I'm really into 3D.

It's why you need to sit at the right distance from the screen in a 3D movie to get the full effect.

Sit so the sides of the screen are out near your peripheral vision limit so that the edges of the screen don't cut off the image too much, and you will really feel like you are in it.

If you see the edges of the screen too far in, like I do on my monitor, everything looks like it goes in, like looking out of a window, and only things near the center area look like they stick out at you.
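That "right distance" can be estimated: sit where the screen subtends the same horizontal angle as the camera that made the image. A sketch of the arithmetic (the 36 mm frame width is an assumption about what the focal value refers to; Poser's own effective frame would differ, per bagginsbill's note above):

    import math

    def viewing_distance_cm(screen_width_cm, focal_mm, frame_width_mm=36.0):
        # distance at which the screen subtends the same horizontal angle as a
        # rectilinear lens of the given focal length on the given frame width
        half_angle = math.atan(frame_width_mm / (2.0 * focal_mm))
        return (screen_width_cm / 2.0) / math.tan(half_angle)

    # a 60 cm wide monitor showing images made at various focal lengths:
    for focal in (30.0, 35.0, 50.0):
        print(focal, round(viewing_distance_cm(60.0, focal), 1))
    # 30mm -> 50.0 cm, 35mm -> 58.3 cm, 50mm -> 83.3 cm

At that distance the on-screen perspective matches the camera's, so nothing looks unnaturally stretched or flattened.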

Mike

If you shoot a mime, do you need a silencer?


WandW ( ) posted Sun, 02 June 2013 at 7:59 PM

Quote - I don't know how close to real-world cameras Poser's cams are supposed to be, but since I was doing some shader testing, I thought I'd do a couple of renders with the Face Cam.

Here's 40mm. Looks very surreal!

If I look at a real person from that close (I just tried it with my Wife)  it looks much like your render...

----------------------------------------------------------------------------------------

The Wisdom of bagginsbill:

"Oh - the manual says that? I have never read the manual - this must be why."
“I could buy better software, but then I'd have to be an artist and what's the point of that?"
"The [R'osity Forum Search] 'Default' label should actually say 'Don't Find What I'm Looking For'".
bagginsbill's Free Stuff... https://web.archive.org/web/20201010171535/https://sites.google.com/site/bagginsbill/Home


obm890 ( ) posted Mon, 03 June 2013 at 4:10 AM

Content Advisory! This message contains profanity

Quote - I'm pretty sure that your field of view with one eye closed is a lot more than 30 degrees.

I'd say it's more than 90.

I can stare with my left eye at the center of my monitor and can see the lamp on the left side of the room that is actually slightly behind me.

Of course it doesn't look as clear as what I'm staring at, but it is visible. That's what you can't do on screen, and it's what makes you feel you are in the environment rather than just looking at it.

Yes, the eye's field of view IS a lot more than 30 degrees, but clarity and focus are the issue here. Objects in your peripheral vision are really vague; I doubt I could sketch (or even accurately describe) the objects at the edges of my field of view, because there just isn't enough information getting through. You know there's a lamp there, but you can't actually see any details on it.

The "50mm/30 degrees = field of view of the human eye" convention is really a compromise: it's the limit of what we can see clearly (and without undue distortion), not the limit of what we are aware of.

In photography or 3D rendering it's tricky to include more by widening the field of view; you tend to get a lot of stretching and distortion of objects near the edge of the view. Around 28mm is (or was) typically the limit for wide-angle lenses in architectural photography; wider than that and straight lines start to curve.

In renders I don't go much wider than about 30mm in modo (you could go lower in Poser); that's still nowhere near as wide as the eye can 'see', but it's about the point at which the distortion starts to attract attention to itself, away from what should be the focus of the image. Near the edges of the image the proportions of recognizable objects like cars and furniture become screwy, things like floor and brick patterns look weird, and circular objects near the corners of the image get really elongated.

 

Quote - Furthermore, the Poser 50mm is actually off by a factor of 1.4x versus a 35mm SLR camera. It's much closer to the DSLR DX format - though by accident, not design.

So - in Poser, you would choose 35mm to match the magnification of what you'd see in an un-magnified rectangular window held up in front of your eye.

 

Ah, yes, 'Poser does things a bit differently' again. This is the sort of shit that makes poser such a pain in the arse to use. I know it's built on old foundations (which were probably weird even when P1 first appeared), but if Poser wants to shake off its 'toy' reputation and be taken seriously as a 'Pro' application, it's time all these strange little quirks and 'things not behaving quite as expected' were cleaned up.



moogal ( ) posted Mon, 03 June 2013 at 6:25 PM

Quote -
Ah, yes, 'Poser does things a bit differently' again. This is the sort of shit that makes poser such a pain in the arse to use. I know it's built on old foundations (which were probably weird even when P1 first appeared), but if Poser wants to shake off its 'toy' reputation and be taken seriously as a 'Pro' application, it's time all these strange little quirks and 'things not behaving quite as expected' were cleaned up.

Yeah, that would be great.  I would like to have default I/O paths for each filetype Poser supports, or at least to have it remember more than one location, so that, say, loading a texture wouldn't take me to where I last saved an image.  I mentioned this in two separate "new features for Poser" threads at RDNA because I was under the impression it might be seen there by the right people.  But PP2014 still seems to behave in the same annoying way as previous versions.  I finally decided that I'd put in a support ticket and hope for the best.  If you can explain why the way Poser does something is wrong, submit it to the bug tracker.  They aren't going to bother with it if no one seems to care.  PP2014 is, IMHO, a great release with many impressive new features, but hopefully the SRs will polish it up a little more.


Cage ( ) posted Mon, 03 June 2013 at 6:53 PM · edited Mon, 03 June 2013 at 6:55 PM

I do hate the way Poser keeps changing the save path.  :lol:  Every time I import something, load a script, or run a script that saves something behind the scenes, I have to browse and browse to get back to my preferred pz3 folder.  Ah, well.  The list of things I hate about Poser is long, but ultimately the list of things I love is longer.  I'm pretty sure it must be.  :lol:  Otherwise I'm just a dummy who likes to be irritated all the time.  :unsure:

I should add that input from this thread has led me to adjust my camera settings.  I started at 25mm, tried 35, and eventually settled on 30 for the effect I want.  It feels right enough, at any rate.

===========================sigline======================================================

Cage can be an opinionated jerk who posts without thinking.  He apologizes for this.  He's honestly not trying to be a turkeyhead.

Cage had some freebies, compatible with Poser 11 and below.  His Python scripts were saved at archive.org, along with the rest of the Morphography site, where they were hosted.


mackis3D ( ) posted Mon, 03 June 2013 at 9:44 PM

Quote - I do hate the way Poser keeps changing the save path.  :lol:  Every time I import something, load a script, or run a script that saves something behind the scenes, I have to browse and browse to get back to my preferred pz3 folder.

That's somewhat like crossword puzzles for elderly people. It helps us exercise our memory.  :-)

