I need to find the technical term for a specific visual/psychological
effect related to apparent motion.
Bear with me as I'm not very sure how to describe this. :-)
Apparent motion (sometimes also referred to as "persistence of
vision") is the psychological effect where the brain perceives a rapid
series of still images as motion; it is the principle upon which film,
television and other forms of animation are based.
In general, the more images displayed per second, the "smoother" the
motion looks... a game running at 60 frames per second is generally
perceived as "better" than a game running at 15 or 30 frames per
second. By frames, I mean unique images, and not the "flicker rate" or
refresh rate of the monitor or projector--which also has an effect,
but is independent of the effect I am talking about.
However--and this is the crux of my query--there are certain cases
where a higher frame rate does not automatically result in "smoother"
motion. In particular, the brain seems to somehow "anticipate" motion,
and equally sized temporal steps seem to have a strong effect
on the perception of smooth motion. I've proven this empirically on
friends and neighbors by showing them two different versions of the
same computer-generated animation: one which moves at a fixed time
interval, and another which moves at a slightly varying but smaller
time interval (higher frame rate). Both are mathematically accurate,
and the position of the objects is precisely where it should be for
the time interval. The refresh rate of the monitor is sufficiently
higher than the frame rate to ensure that no frames are skipped or
delayed due to vsync. However, the lower frame-rate animation is nonetheless
frequently perceived as "smoother" despite providing less visual data.
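To make the setup concrete, here is a simplified Python sketch of roughly how the two versions differ (this is not my actual test code; names like SPEED, draw_dot, and the frame counts are just placeholders). Both compute the position from elapsed time, but one advances time in a fixed step while the other advances it by the measured, slightly varying frame time:

    import time

    SPEED = 200.0          # pixels per second (arbitrary placeholder)
    FIXED_DT = 1.0 / 30.0  # the fixed-interval version nominally runs at 30 fps

    def position(t):
        # Dot moving in a straight line: position is exact for time t.
        return SPEED * t

    def draw_dot(x):
        # Stand-in for the real renderer; just report the position.
        print(f"dot at x = {x:.2f}")

    def run_fixed_step(num_frames):
        # Version A: equal temporal steps, so equal-sized spatial steps.
        t = 0.0
        for _ in range(num_frames):
            draw_dot(position(t))
            t += FIXED_DT

    def run_variable_step(num_frames):
        # Version B: advance by the measured frame time, which varies
        # slightly from frame to frame even though every drawn position
        # is still mathematically exact for its time.
        t = 0.0
        last = time.perf_counter()
        for _ in range(num_frames):
            draw_dot(position(t))
            now = time.perf_counter()
            t += now - last
            last = now

    if __name__ == "__main__":
        run_fixed_step(5)
        run_variable_step(5)

Version B has the higher (and slightly irregular) frame rate, yet version A is the one people tend to call "smoother."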
The perception of this effect differs among individuals, and it is
more pronounced for simple images: people seem more able to
perceive frame-rate variations in an animation of a dot moving in a
straight line than they are in, say, an animation of a group of horses
galloping.
The brain's temporal perception also seems to be of "lower resolution"
than its motion perception: an animation with slight time variations
that runs at roughly 24 frames per second will appear smoother if the
motion advances in even increments than if it runs in true time. For
example, if the animation varies between 22 and 26 frames per second,
it will look smoother if the dot (or whatever you are animating) moves
in same-sized steps--as though it were always running at exactly 24
frames per second, even though it isn't. The fact that the animated
object is slightly "off" from where it should be for the actual
elapsed time seems to have very little impact, whereas steps that are
not quite the same size over nearly equal time intervals seem to have
a very large impact.
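Roughly speaking, the "even increments" version just quantizes elapsed time to the nominal frame interval before computing position. Again, this is only an illustrative Python sketch, not my real code, and SPEED and the function names are placeholders:

    SPEED = 200.0               # pixels per second (arbitrary placeholder)
    NOMINAL_FPS = 24.0
    NOMINAL_DT = 1.0 / NOMINAL_FPS

    def position_true_time(t):
        # Exact position for the actual elapsed time t; the step size
        # varies with the frame time, but the dot is never "off" in time.
        return SPEED * t

    def position_even_steps(t):
        # Snap elapsed time to whole 24ths of a second before computing
        # the position, so every drawn step covers the same distance even
        # when the real frame time drifts between roughly 22 and 26 fps.
        # The dot can be slightly "off" for the actual time, but the
        # steps stay even.
        frame_index = round(t / NOMINAL_DT)
        return SPEED * (frame_index * NOMINAL_DT)

    if __name__ == "__main__":
        # Simulated jittery frame times around 24 fps (about 22-26 fps).
        for t in (0.000, 0.043, 0.084, 0.129, 0.170):
            print(f"t={t:.3f}s  true={position_true_time(t):7.2f}"
                  f"  even={position_even_steps(t):7.2f}")

The "even" column advances by a constant amount each frame while the "true" column does not, and the constant-step version is the one that reads as smoother, even though it is the less accurate of the two.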
I'm sure someone else *must* have researched this before. However, I'm
having a hard time finding research without knowing what this thing is
called. :-) There's got to be some sort of technical or industry
terminology for this phenomenon. I'd like to know the technical term
for "the brain guesses where the object will be next, and perceives
motion more accurately than it perceives time for small intervals".
Clarification of Question by tfpsoft-ga on 26 Oct 2004 14:33 PDT
That sounds close, but not quite. I'm betting you're right that it
does have something to do with saccades, which would explain why it
happens less in complex scenes: the eye is more inclined to stay
focused on the center of the screen instead of smoothly tracking the
moving object.
"Suppression" isn't right, though. Looking at the article you posted
and a couple of others, that just seems to refer to the mechanism by
which the eye filters out its own motion. I'm really looking for how
the brain/eye moves in *anticipation* of motion, especially in
relation to time.