See also: historically PAL regions. For some inexplicable reason broadcast TV in the UK is still 50Hz despite the fact that every display has been 60Hz+ for 15+ years.
Also a huge pain for old games. Lots of crummy conversions that straight up just run 17% slower on the PAL version.
Can somebody explain why this is the case? A 50FPS signal (max 25Hz) sampled at 60Hz should not produce any errors.
I can only imagine errors to show up by "doing it wrong", that is, aliasing artifacts due to not lowpass filtering the signal beforehand.
That said, I'm not totally sure that a 50FPS signal is actually capped to 25Hz, since a 50FPS GIF is usually not interpreted as a signal sampled at 50Hz but as a piecewise-constant signal with 50 "steps" per second. So the first question is whether this "jumpiness" is an inherent, desired property of GIFs or an artifact of bad renderers. Can somebody clarify this?
You're going to get judder. I don't understand what you mean by 50 FPS (max 25Hz), but if you've got 50 frames per second of data and 60 frames of display, about ten frames per second are going to be shown twice and the other forty only once; that's not going to look nice. If it's 25 frames per second of data, fifteen frames would be shown twice and ten shown three times. This is reminiscent of telecine judder, where 24 frames per second film material is shown on 60 fps screens using a 2:3 pulldown: half of the frames are shown twice (or as two fields if interlaced) and the other half are shown three times (or as three fields), but it's not as consistently one then the other with 25 fps source material.
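The repeat patterns are easy to verify: a naive display just shows, on each refresh tick, whichever source frame is current. A small sketch (function name is mine) that maps each refresh to a source frame index over one second:

```python
from collections import Counter

def repeat_counts(src_fps, display_hz):
    """How many refresh cycles each source frame stays on screen, over one
    second of naive (nearest-frame, no blending) playback."""
    shown = [k * src_fps // display_hz for k in range(display_hz)]
    # Map {source frame: times shown} down to {times shown: how many frames}
    return Counter(Counter(shown).values())

print(repeat_counts(50, 60))  # {1: 40, 2: 10} -- ten frames doubled
print(repeat_counts(24, 60))  # {2: 12, 3: 12} -- the regular 2:3 pulldown
print(repeat_counts(25, 60))  # {2: 15, 3: 10} -- fifteen doubled, ten tripled
```

The 24 fps case alternates 2, 3, 2, 3 perfectly evenly, which is why telecine judder is at least consistent; the 25 fps and 50 fps cases are not.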
Like the sibling comment, I wasn't sure what you meant when you wrote "50 FPS (max 25Hz)". "Hz" is literally "per second"; "50 frames per second" is "50 frame-Hz". Because we're talking about frame-oriented video, the "frame" is implied; FPS and Hz are equivalent units.
Based on this "divide by 2" confusion and your mention of lowpass filtering, I'm thinking that the confusion is that you're thinking of this as audio or other wave-based signals. With audio the "volume" (amplitude) is a relatively minor detail and the key thing-of-interest is the frequency at which that amplitude is oscillating; not so with video. The signal doesn't follow a wave-pattern and we don't (normally) care about the frequency one bit. To apply video-numbers to audio: to represent a 60Hz tone, you'd need to represent that the amplitude goes down and back up 60 times per second; a 60Hz tone for 1 second is 120 "discrete" events, 60 "down" and 60 "up". If the samples are only at 60Hz then it would seem that the amplitude is staying constant; you wouldn't be able to observe that it's oscillating so quickly. And that oscillation is how we perceive audio. But it's not how we perceive video: light amplitudes don't normally oscillate like that, they stay steady, and when they do change we notice that as a discrete event, not as part of a frequency. In fact, many displays (especially CRTs) do flicker, and are specifically counting on us not being able to notice this oscillation.
If the video signal emits a new frame at 50Hz but the screen updates at 60Hz, naively that means that 80% of the video's frames will be on-screen for ~0.0167s (one refresh), and 20% of them will be on-screen for ~0.0333s (two refreshes). While humans might not notice a 0.0167s difference on its own, we're pretty good at noticing changes in rate and will notice the jitter.
You could argue that similar to a lowpass filter for audio, that the display should interpolate between its samples to hide this jitter. It could; some TVs do this, but computer monitors don't because this necessarily introduces a delay of at least one frame (but realistically the software to do this will introduce more delay than that), and for interactive use we sure do care about lag/latency; even more than we care about jitter. TVs that do this will often have a "video game mode" that disables this and other features that introduce latency.
Excellent reply, though I'm going to be an ass and nitpick to say that the term that's typically used when you have repeated frames due to mismatched frame rates is "judder" (in American English at least).
Do you have a plan to transform a video into frequencies and back so you can do a lowpass filter?
The biggest issue with an FPS change is movement. To keep speeds consistent, which the human eye is very sensitive to, you need to calculate a continuous velocity for every piece of the image and then interpolate the position of every object separately. And then you have to fill in all the gaps that leaves.
If you don't want to throw tons of CPU and guesswork at the problem, there isn't a solution to changing the framerate. Instead get a screen with a framerate that is either variable or 200+ or both.
Replying here because the various replies seem to say more or less the same...
> With a 50 FPS GIF played back on a 60 FPS screen, some frames get shown twice
Frames shown twice is an aliasing artifact that happens without a lowpass filter. It's exactly the same as spatial aliasing when you show a 50px image on a 60px area, "some pixels get drawn twice". Anti-aliasing gets rid of that.
> Do you have a plan to transform a video into frequencies and back so you can do a lowpass filter?
Well, I (somewhat naively) assumed that this works the same as converting a still image into frequencies: Fourier transform. Only for an animated image the signal is 3d, not 2d.
edit: Filters normally don't do a FT explicitly. You use FT for the theory, but filters operate directly on the signal.
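Right: a temporal lowpass, for example, can be applied by convolving each pixel's intensity directly with a small kernel across neighboring frames, no explicit FT involved. A minimal sketch (the kernel, the clamping at clip boundaries, and the names are my own choices):

```python
def temporal_lowpass(frames, kernel=(0.25, 0.5, 0.25)):
    """Smooth each pixel across time by direct convolution with `kernel`.
    `frames` is a list of frames; each frame is a list of pixel intensities."""
    n, half = len(frames), len(kernel) // 2
    out = []
    for t in range(n):
        frame = []
        for p in range(len(frames[0])):
            # Clamp at the clip boundaries instead of wrapping around
            acc = sum(w * frames[min(max(t + j - half, 0), n - 1)][p]
                      for j, w in enumerate(kernel))
            frame.append(acc)
        out.append(frame)
    return out

# A single-frame flash gets spread across its temporal neighbors:
clip = [[0.0], [0.0], [1.0], [0.0], [0.0]]
print(temporal_lowpass(clip))  # [[0.0], [0.25], [0.5], [0.25], [0.0]]
```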
> Like the sibling comment, I wasn't sure what you meant when you wrote "50 FPS (max 25Hz)". "Hz" is literally "per second"; "50 frames per second" is "50 frame-Hz". Because we're talking about frame-oriented video, the "frame" is implied; FPS and Hz are equivalent units.
I was very sloppy with this. By 50 FPS (max 25 Hz) I mean that 50 FPS is equivalent to a sample rate of 50 Hz, which can only capture signals of less than 25 Hz. FPS and Hz are only equivalent when they refer to sample rate, because FPS can't really refer to anything else.
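To make the "max 25 Hz" claim concrete: a 50 Hz sampler literally cannot tell a 30 Hz tone from a 20 Hz one; the former folds down onto the latter. A quick sketch (names are mine):

```python
import math

FS = 50  # sample rate in Hz (analogous to 50 FPS)

def sample(freq_hz, n):
    """Value of a unit sine at freq_hz, taken at sample index n of a FS-Hz sampler."""
    return math.sin(2 * math.pi * freq_hz * n / FS)

# 30 Hz is above the 25 Hz Nyquist limit of a 50 Hz sampler, so its samples
# are identical to those of a 20 Hz tone (folded: 50 - 30 = 20, sign flipped).
for n in range(50):
    assert abs(sample(30, n) + sample(20, n)) < 1e-9
```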
> I'm thinking that the confusion is that you're thinking of this as audio or other wave-based signals
Doesn't most lossy image compression make the same assumption, e.g. JPEG?
> but computer monitors don't because this necessarily introduces a delay of at least one frame
I agree with this one, at least AFAIK you can't really lowpass-filter a realtime signal without adding delay.
Your idea is more or less right, but the problem here is that Nyquist only applies to continuous functions, which a GIF is not. At frame boundaries the frequency is potentially infinite, like an ideal square wave.
A GIF is already a sampled signal.
Yes, you can LP-filter the GIF but that's no longer reproducing the original signal, which wasn't band-limited to begin with.
You're right that, if the original signal that the GIF samples is band-limited to below half the sample rate, i.e. the Nyquist frequency (which is a big "if"), it can be accurately reproduced without aliasing by passing it through a reconstruction filter.
One could argue that the same applies to old pixelated graphics, such as the original Super Mario sprite. It has to be pixelated; if you lowpass-filter it, it looks ugly. So that one is non-continuous / not band-limited in the spatial direction.
This brings me back to this question:
> So the first question is whether this "jumpiness" is an inherent, desired property of GIFs or an artifact of bad renderers.
Like for pixel-y displays, I assume that this was never well-defined for the GIF format, so what is basically a property of the first GIF renderers became a de-facto property of the format.
I'd argue that GIFs that exploit this property and rely on the "jumpiness" to look good are not band-limited, require a "jumpy" renderer and will suffer from the 50Hz/60Hz problem. On the other hand, GIFs that don't exploit this property are meant to be continuous and are just sampled at a low frequency, and for those, re-sampling in time makes it possible to fix the 50Hz/60Hz problem.
Though another problem will turn up then: when the latter were sampled at a low frequency, they were likely not lowpass-filtered beforehand, so they already contain aliasing artifacts you cannot get rid of anymore.
> > I'm thinking that the confusion is that you're thinking of this as audio or other wave-based signals
> Doesn't most lossy image compression make the same assumption, e.g. JPEG?
Yes, but the key difference is that there's no temporal component; the wave is across the screen, not across time. When one speaks of a low-pass filter in image processing, it's generally operating on the horizontal- and vertical-axes, not the time-axis.
To put "not wave-based" into other words: unlike audio, video across time is neither continuous nor band-limited.
A framerate/refresh rate mismatch like this will result in one of two problems:
1: Frame judder, as a result of some frames from the 50 Hz content being displayed for more refresh cycles than others.
2: Image tearing, which is caused by the content being updated in the middle of the 60 Hz refresh cycle.
Ideally you want the framerate of your content to divide the display's refresh rate evenly. So for a 60 Hz display, that means you want the framerate of your content to be 60, 30, 20, 15, etc.
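Which content rates map to a whole number of refresh cycles per frame is a one-liner to check (a sketch; the names are mine):

```python
REFRESH_HZ = 60

# Content rates where every frame occupies the same whole number of refreshes:
clean_rates = [fps for fps in range(1, REFRESH_HZ + 1) if REFRESH_HZ % fps == 0]

print(clean_rates)  # [1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, 60]
print(50 in clean_rates)  # False -- hence the judder with PAL-rate content
```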
You either get duplicate frames and the video stutters a bit, or you interpolate and get either blurs or distortions. You're right that capturing 50 FPS video at 60 FPS is sorta OK in terms of data loss; you'll capture all the frames, and you can get a pretty good idea of which frames are duplicates, but that's different from high-quality playback.
I'm not even sure if Nyquist applies here since the GIF is already a non-continuous function and therefore has potentially infinite frequency components. The input itself is not band-limited.
Another way to put this is that a GIF is already a sampled function. For example, imagine the intensity of pixel x in a particular GIF is sin(2*pi*x*t), i.e. pixel x flickers at x Hz. Nyquist just says that at 50 FPS you can only accurately reconstruct the pixels with x < 25, i.e. a 25px-wide GIF, without aliasing.
And as you pointed out, you'd still have to band-limit the output.
Frame blending can help make judder less obvious but it is far from an ideal solution.
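In its simplest form, frame blending is just a weighted average of the two source frames straddling each refresh instant; this is what produces the blur/ghosting rather than the stutter. A minimal sketch (function and parameter names are mine):

```python
def blended_frame(frames, src_fps, t_display):
    """Linearly blend the two source frames straddling display time t_display.
    `frames` is a list of frames; each frame is a list of pixel intensities."""
    pos = t_display * src_fps            # display time in source-frame units
    i = min(int(pos), len(frames) - 1)   # frame just before t_display
    j = min(i + 1, len(frames) - 1)      # frame just after (clamped at the end)
    frac = pos - int(pos)                # how far between the two we are
    return [(1 - frac) * a + frac * b for a, b in zip(frames[i], frames[j])]

# Halfway between two 50 FPS frames (t = 0.01 s is source position 0.5),
# a pixel that jumps 0.0 -> 1.0 is rendered as the ghostly in-between 0.5:
print(blended_frame([[0.0], [1.0], [0.0]], 50, 0.01))
```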
Modern TVs do some clever work to detect 24 fps content in a 60 Hz signal, extract it, and present it properly to the panel. This takes a fair amount of processing power, and it isn't really feasible on computing platforms where the video content is not always presented full screen.