Human Eye FPS

SLAM: debunk creationism, pseudoscience, and superstitions. Discuss logic and morality.

Moderator: Alyrium Denryle

User avatar
BoredShirtless
BANNED
Posts: 3107
Joined: 2003-02-26 10:57am
Location: Stuttgart, Germany

Post by BoredShirtless »

kojikun wrote:
BoredShirtless wrote:Not in the context of this thread they ain't. Bitch.
Dumbass. Frames Per Second is a very singular word, meaning one and only one thing,
Bullshit. It takes a very special brand of idiot not to see there's a difference between the speed at which our eyes register images and the speed at which our brains process them.

Our eyes DO process light and it's fucking legit to analyse JUST THAT. And again:
Rye wrote:Well what FPS does the human eye register
EYE asswipe.
User avatar
kojikun
BANNED
Posts: 9663
Joined: 2002-07-04 12:23am
Contact:

Post by kojikun »

BoredShirtless wrote:Bullshit. It takes a very special brand of idiot not to see there's a difference between the speed at which our eyes register images and the speed at which our brains process them.
And that kind would be you, because you fail to realize that it's not our brain but the rods and cones themselves that send the signal down the optic nerves at ~30Hz.
Our eyes DO process light and it's fucking legit to analyse JUST THAT.
Irrelevant. Our eyes still use ~30Hz input cycles.
And again:
Rye wrote:Well what FPS does the human eye register
EYE asswipe.
Point being? This doesn't change the fact that rods and cones have input cycles lasting about 1/30th of a second, within which all input light is contained, so the eye DOESN'T register RATES >30fps, but it DOES register LIGHT lasting <1/30th of a second. There's a universe of difference.
Yes! We have a soul! But it's made of lots of tiny robots.
User avatar
Mad
Jedi Council Member
Posts: 1923
Joined: 2002-07-04 01:32am
Location: North Carolina, USA
Contact:

Post by Mad »

BoredShirtless wrote:Please point where I made the claim a digital camera CAN'T register a frame shown for 1/220th of a second.
Are you claiming that a 30 fps digital video camera can register a frame shown for 1/220th of a second?

You are discussing the ability to register light, which is a function of intensity over time. The higher the intensity, the less time is required to register the image.

However, that would have nothing to do with the concept of "frames per second," because both the eye and the camera receive that data as part of a frame. Since that 1/220th-of-a-second flash is the only data in the frame, it will make up the entirety of the frame.

So, then, it appears that you are not discussing anything related to framerate, but solely the smallest amount of time required to register an image. That depends entirely on the brightness of the image.

Theoretically, it would seem that a bright enough image shown for less than the response time you mentioned should still register, since photons are still interacting with the eye. They just wouldn't show up until the response time has occurred, even though the stream of photons isn't continuous over that period.

However, in anything other than very specific circumstances, that's meaningless (it'd have to be a dark environment with a bright flash), as it has everything to do with the brightness of the object. Something more meaningful would be "how many separate images can the eye register per second?"
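The intensity-over-time idea can be sketched as a toy model in Python. To be clear, the function name, the threshold, and the numbers here are purely illustrative assumptions, not measured values from any vision experiment:

```python
# Toy threshold model (illustrative assumptions only): a detector
# "registers" a flash once accumulated light (intensity x time) reaches
# a fixed threshold, so a brighter flash needs less exposure time.
def time_to_register(intensity, threshold=1.0):
    """Minimum exposure time needed for intensity * time to reach threshold."""
    return threshold / intensity

# A flash ten times brighter registers in a tenth of the time.
print(time_to_register(0.5))   # dim flash    -> 2.0
print(time_to_register(5.0))   # bright flash -> 0.2
```

That's all "register light" means here: brightness and duration trade off against each other, which is exactly why it isn't a framerate.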
Later...
User avatar
BoredShirtless
BANNED
Posts: 3107
Joined: 2003-02-26 10:57am
Location: Stuttgart, Germany

Post by BoredShirtless »

kojikun wrote:
BoredShirtless wrote:Bullshit. It takes a very special brand of idiot not to see there's a difference between the speed at which our eyes register images and the speed at which our brains process them.
And that kind would be you, because you fail to realize that it's not our brain but the rods and cones themselves that send the signal down the optic nerves at ~30Hz.
:lol: So explain why a game appears smoother at 60fps than at 30fps. And that's not even indicative of the output rate of our retina, but of our brain's processing speed.
kojikun wrote:
Our eyes DO process light and it's fucking legit to analyse JUST THAT.
Irrelevant. Our eyes still use ~30Hz input cycles.
Prove it.
kojikun wrote:
And again:
Rye wrote:Well what FPS does the human eye register
EYE asswipe.
Point being? This doesn't change the fact that rods and cones have input cycles lasting about 1/30th of a second, within which all input light is contained, so the eye DOESN'T register RATES >30fps, but it DOES register LIGHT lasting <1/30th of a second. There's a universe of difference.
Proof that retinas have input cycles of 1/30th of a second?

Our eyes have, depending on light intensity, a lower limit for flicker free sight of 50Hz. It's worth repeating: lower limit. It's actually worth repeating one more time: LOWER LIMIT. Knowing this, I want you to now prove to me our retinas have an UPPER LIMIT of 30Hz.
User avatar
Mad
Jedi Council Member
Posts: 1923
Joined: 2002-07-04 01:32am
Location: North Carolina, USA
Contact:

Post by Mad »

BoredShirtless wrote: :lol: So explain why a game appears smoother at 60fps than at 30fps.
Already explained repeatedly. Motion blur. The eye is used to seeing everything with some degree of motion blur. A game at 30 fps would have no motion blurring. A game at 60 fps would cause the eye to blend two frames together for each frame the eye sends, thus causing motion blur. Because of the motion blurring, the image looks much more natural, being much closer to what the eye is used to seeing.
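The blending described above can be mimicked with a quick sketch. This is a toy, not a model of actual retinal response; the "two frames per eye frame" assumption is just the claim under discussion taken at face value:

```python
# Toy sketch (illustrative): if the eye integrated over ~1/30 s, a 60 fps
# game would deliver two frames per integration window; averaging each
# pair approximates the blurred frame the eye would effectively receive.
def blend(frame_a, frame_b):
    """Average two equal-length frames of pixel brightness values."""
    return [(a + b) / 2 for a, b in zip(frame_a, frame_b)]

frame_1 = [0, 100, 200, 0]    # object at one position
frame_2 = [0, 0, 100, 200]    # object one step to the right
print(blend(frame_1, frame_2))  # [0.0, 50.0, 150.0, 100.0]
```

The averaged frame smears the moving object across both positions, which is the "natural" blur being described.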
Later...
User avatar
BoredShirtless
BANNED
Posts: 3107
Joined: 2003-02-26 10:57am
Location: Stuttgart, Germany

Post by BoredShirtless »

Mad wrote:
BoredShirtless wrote:Please point where I made the claim a digital camera CAN'T register a frame shown for 1/220th of a second.
Are you claiming that a 30 fps digital video camera can register a frame shown for 1/220th of a second?
Yep.
Mad wrote: You are discussing the ability to register light, which is a function of intensity over time. The higher the intensity, the less time is required to register the image.
Right.
Mad wrote: However, that would have nothing to do with the concept of "frames per second," because both the eye and the camera receive that data as part of a frame. Since that 1/220th-of-a-second flash is the only data in the frame, it will make up the entirety of the frame.
Define "frames per second".
Mad wrote: So, then, it appears that you are not discussing anything related to framerate, but solely the smallest amount of time required to register an image. That depends entirely on the brightness of the image.
Ah. I see we cannot apply the concept of a "framerate" to our eyes. Why?
Mad wrote: Theoretically, it would seem that a bright enough image shown for less than the response time you mentioned should still register, since photons are still interacting with the eye. They just wouldn't show up until the response time has occurred, even though the stream of photons isn't continuous over that period.
Yes, this would be our eyes' "theoretical" framerate.
Mad wrote: However, in anything other than very specific circumstances, that's meaningless (it'd have to be a dark environment with a bright flash), as it has everything to do with the brightness of the object.
It was a specific question, it wanted a specific answer.
Mad wrote: Something more meaningful would be "how many separate images can the eye register per second?"
You just answered this above:
Theoretically, it would seem that a bright enough image shown for less than the response time you mentioned should still register, since photons are still interacting with the eye. They just wouldn't show up until the response time has occurred, even though the stream of photons isn't continuous over that period.
We can conclude that our eyes' framerate depends on the intensity of light. The brighter the image, the faster our cones and rods get saturated, and the quicker the information gets sent to the optic nerve.
User avatar
BoredShirtless
BANNED
Posts: 3107
Joined: 2003-02-26 10:57am
Location: Stuttgart, Germany

Post by BoredShirtless »

Mad wrote: A game at 30 fps would have no motion blurring. A game at 60 fps would cause the eye to blend two frames together for each frame the eye sends, thus causing motion blur.
Rubbish. Do you have a CRT monitor? Set the refresh rate to 60 Hz and stare at it for a while. You can actually see the refreshes, which proves we can see beyond 60fps.

The critical flicker frequency is the highest frequency at which the flicker in a flickering light source can be detected. Source:
http://www.webvision.med.utah.edu/temporal.html#flicker
This means we require as a LOWER LIMIT around 50 Hz. That's a lower limit buddy.
Because of the motion blurring, the image looks much more natural, being much closer to what the eye is used to seeing.
We do not motion blur at 60fps.
User avatar
BoredShirtless
BANNED
Posts: 3107
Joined: 2003-02-26 10:57am
Location: Stuttgart, Germany

Post by BoredShirtless »

Yesh, I should stop drinking while posting. The flicker-free lower limit is in fact about 2Hz, but can get up to 45Hz:

Image

Remember, this doesn't mean we start motion blurring over 45fps. It just means we don't see a flicker. I take it this graph is not very accurate when applied to CRT monitors, as we can definitely see our monitors flickering at 60Hz.
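For what it's worth, the rising part of a graph like that roughly follows the Ferry-Porter law, which says the critical flicker frequency grows with the logarithm of luminance. Here's a toy Python sketch; the constants a and b are made up for illustration, not fitted to any data:

```python
import math

# Ferry-Porter law: critical flicker frequency (CFF) rises with the
# logarithm of luminance. Constants a and b are illustrative only.
def critical_flicker_frequency(luminance, a=12.0, b=9.5):
    """CFF in Hz for a given luminance (arbitrary units, must be > 0)."""
    return a + b * math.log10(luminance)

# Brighter light -> flicker stays detectable at higher frequencies.
print(critical_flicker_frequency(1.0))    # 12.0
print(critical_flicker_frequency(100.0))  # 31.0
```

Which is consistent with the point being argued: there's no single cutoff, because the threshold moves with brightness.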
User avatar
kojikun
BANNED
Posts: 9663
Joined: 2002-07-04 12:23am
Contact:

Post by kojikun »

BoredShirtless wrote: :lol: So explain why a game appears smoother at 60fps than at 30fps. And that's not even indicative of the output rate of our retina, but of our brain's processing speed.
Because with a 60Hz game, two images are superimposed on the retina, so they contain two pieces of information slightly offset from each other, blurring the motion ever so slightly and making it seem smoother.
Prove it.
That our eyes have 30Hz input cycles? I'm not the one making claims. Burden of Proof is on you.
Proof that retinas have input cycles of 1/30th of a second?
The fact that we perceive anything above that as being smooth motion.
Our eyes have, depending on light intensity, a lower limit for flicker free sight of 50Hz. It's worth repeating: lower limit. It's actually worth repeating one more time: LOWER LIMIT. Knowing this, I want you to now prove to me our retinas have an UPPER LIMIT of 30Hz.
Prove that we have a "framerate" of >220Hz. :)
Rubbish. Do you have a CRT monitor? Set the refresh rate to 60 Hz and stare at it for a while. You can actually see the refreshes, which proves we can see beyond 60fps.
All that proves is that during each refresh there's enough of a brightness difference over the course of one input cycle to be apparent.
The critical flicker frequency is the highest frequency at which the flicker in a flickering light source can be detected. Source:
http://www.webvision.med.utah.edu/temporal.html#flicker
This means we require as a LOWER LIMIT around 50 Hz. That's a lower limit buddy.
Notice in the chart you showed there's a distinctive drop after a certain point? You know what that indicates? That the peak is where the synchronisation offset between the light and our eyes is enough to be apparent. Anyone who knows the slightest thing about wave interaction knows that two non-identical waves can interact to produce distinct periods of peak activity and periods of no activity; it doesn't prove anything about a faster receiving frequency.
We do not motion blur at 60fps.
Sure we do. They're just not significant enough to be apparent.
Yes! We have a soul! But it's made of lots of tiny robots.
User avatar
BoredShirtless
BANNED
Posts: 3107
Joined: 2003-02-26 10:57am
Location: Stuttgart, Germany

Post by BoredShirtless »

kojikun wrote:
BoredShirtless wrote: :lol: So explain why a game appears smoother at 60fps than at 30fps. And that's not even indicative of the output rate of our retina, but of our brain's processing speed.
Because with a 60Hz game, two images are superimposed on the retina, so they contain two pieces of information slightly offset from each other, blurring the motion ever so slightly and making it seem smoother.
If we did blur frames at 60fps, BLUR would be produced, not smoothness. Have you ever played a game with blur? Imagine how hard it would be to aim a rocket launcher if your target was blurring all over the place.

In film, "motion blur" refers to blurred images on each frame of film. This allows us to see movies smoothly even at a frame rate of 24fps. Try running computer animation [frames aren't blurred in animation] at 24fps; the flicker is very noticeable.

Although "motion blur" is a film rendering technique, WE can see motion blur when looking at non-flickering things. Turn on your bedroom light, and rapidly shake your head from side to side; you will see blur.

So let's see. We have proven motion blur for non-flickering sources, and we know that motion-blurred frames give smooth movies [not applicable to digital cameras]. Now, I want you to prove your assertion that we motion blur images flickering at a rate > 30Hz.

kojikun wrote:
Prove it.
That our eyes have 30Hz input cycles? I'm not the one making claims. Burden of Proof is on you.
What kind of fucking idiot are you? On Tue Aug 12, 2003, at 1:50 pm, you wrote:
"Irrelevant. Our eyes still use ~30Hz input cycles. "
kojikun wrote:
Proof that retinas have input cycles of 1/30th of a second?
The fact that we perceive anything above that as being smooth motion.
Bullshit. Monitors refreshing at TWICE that speed aren't smooth. If you can see a CRT monitor flickering at 60Hz, you have NOT reached the critical threshold between "flicker" and "smooth". Claiming that threshold is 30Hz is clearly baloney.

Our eyes do not "cycle" the way you're suggesting. They respond to stimulus [light] at a rate dependent on light intensity. If you turn the brightness and contrast on your monitor down low enough, you can actually perceive smooth motion at frame times of 1/5th of a second.
kojikun wrote:
Our eyes have, depending on light intensity, a lower limit for flicker free sight of 50Hz. It's worth repeating: lower limit. It's actually worth repeating one more time: LOWER LIMIT. Knowing this, I want you to now prove to me our retinas have an UPPER LIMIT of 30Hz.
Prove that we have a "framerate" of >220Hz. :)
Why? I cited that Air Force experiment as proof that we can register and process an image shown for 1/220th of a sec. I also spent considerable effort explaining that the experiment does NOT prove our visual cortex can process frames at that speed. I did, however, assume that our retinas, having a response time of about a picosecond, would be fast enough to react to a frame rate of 220fps. That's our eyes' ability to register images; NOT our brains', which is what "Prove that we have a "framerate" of >220Hz. :)" is essentially asking.
kojikun wrote:
Rubbish. Do you have a CRT monitor? Set the refresh rate to 60 Hz and stare at it for a while. You can actually see the refreshes, which proves we can see beyond 60fps.
All that proves is that during each refresh there's enough of a brightness difference over the course of one input cycle to be apparent.
An input cycle has ended if we can DETECT the damn flickering.
kojikun wrote:
The critical flicker frequency is the highest frequency at which the flicker in a flickering light source can be detected. Source:
http://www.webvision.med.utah.edu/temporal.html#flicker
This means we require as a LOWER LIMIT around 50 Hz. That's a lower limit buddy.
Notice in the chart you showed there's a distinctive drop after a certain point? You know what that indicates? That the peak is where the synchronisation offset between the light and our eyes is enough to be apparent. Anyone who knows the slightest thing about wave interaction knows that two non-identical waves can interact to produce distinct periods of peak activity and periods of no activity; it doesn't prove anything about a faster receiving frequency.
My patience for sorting through your bullshit is up. Rewrite the above and place a little context with each of your scientific words.
kojikun wrote:
We do not motion blur at 60fps.
Sure we do. They're just not significant enough to be apparent.
Easily the worst non sequitur I've ever seen you make. That's it, I'm going to the pub.