Post by unkinhead on Feb 17, 2020 14:34:37 GMT -8
So I figured this is a somewhat controversial subject among film and television nerds: frame rates on newer display technology, and the issues that have come up as a result.
Modern TVs (newer LED and OLED models) generally refresh at 60Hz or 120Hz (for simplicity's sake: they can smoothly display 60fps or 120fps content). Most films, however, are presented at 24fps, since that is the long-standing standard.
The irony is that as TVs get better, the 'stutter' (judder) you see when watching 24fps films on a >=60Hz screen gets worse. Films look choppy, basically because each frame sits static on screen for longer, and motion can look 'jerky'. There are 'fixes' for this, such as BFI (black frame insertion) and motion interpolation, but they introduce problems of their own.
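For anyone curious why a 60Hz panel specifically is the worst case: 24 doesn't divide evenly into 60, so the set has to hold one film frame for 3 refreshes and the next for 2 (the classic 3:2 pulldown cadence), while a 120Hz panel can hold every frame for exactly 5 refreshes. Here's a quick back-of-the-envelope sketch of that cadence in Python; it's purely illustrative, not how any particular TV's firmware actually schedules frames:

import math

def pulldown_cadence(film_fps: int, refresh_hz: int, n_frames: int = 8) -> list[int]:
    """How many display refreshes each of the first n_frames film frames is held for."""
    cadence = []
    shown = 0  # total refreshes emitted so far
    for i in range(1, n_frames + 1):
        # frame i should stay on screen until time i / film_fps;
        # convert that deadline into a refresh count and take the difference
        target = math.ceil(i * refresh_hz / film_fps)
        cadence.append(target - shown)
        shown = target
    return cadence

print(pulldown_cadence(24, 60))   # [3, 2, 3, 2, 3, 2, 3, 2] -> uneven hold times = judder
print(pulldown_cadence(24, 120))  # [5, 5, 5, 5, 5, 5, 5, 5] -> perfectly even cadence

The uneven 3-2-3-2 pattern is the judder people complain about; 120Hz (and multiples of 24 generally) sidesteps it, which is part of why the "fix it on the TV side" position is technically feasible.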
Anyway, because this now affects consumer decisions (a newer TV can actually make motion look worse), there's a debate over whether films should conform to the new standard (60fps) or TV manufacturers should conform to the older 24fps standard and resolve the issue on the technology side.
Those on the former side argue that the only reason viewers find 60fps content 'unfilmlike' is conditioning: the 'soap opera effect' is just your brain associating 60fps with home videos and soap operas (so it reads as cheap). They suggest filmmakers should shoot in 60fps and viewers will relearn what looks 'good'.
Those on the latter side argue that the technological issue should be resolved on the display side because, whatever the cause, films at 60fps DO look cheap and bad, and lose the 'magic' that cinema offers. For an example of this, you may recall watching The Hobbit in theaters and finding the motion disorienting (it was shot in high frame rate, 48fps). You can also watch this video here: youtu.be/SPZXR4sxfRc
Personally, I stand on the side that the technology should adapt to 24fps. Partly for the reasons outlined above, but also because I reject the claim that the 'cheap soap opera effect' is merely a socialized response. Films generally work because of artifice; they're not supposed to feel real at all. It's true that 60fps looks more fluid and 'realistic', but that was never the desired effect of cinema, and 60fps ruins the magic and artifice of film. This is especially apparent to me because I don't perceive high-quality documentaries as cheap when they're shot at 60fps... because there the point IS realism. I'd argue it's more of a convenient coincidence (from what I understand, 24fps was settled on in the sound era largely as the slowest rate that kept optical audio acceptable, so it was mostly economics) that 24fps ends up assisting the power of movies rather than weakening it. So I'd like to see partial adoption of the 60fps standard for films where it's appropriate (it would also be helpful to experiment so consumers can discover when it actually is appropriate), but with the brunt of the responsibility falling on the TV industry to resolve motion issues with 24fps content.
I'd be interested to hear if anyone here has thoughts on this (if any)!