@lololulu19 Camera noise.
In the past 10 years camera sensors have improved dramatically in terms of noise. Mainly thanks to Sony on the technology side, but driven by mobile demand on the consumer side.
By "noise" I mean higher usable sensitivity, cleaner low-light performance, and so on.
Thus, even theatrical movies from the late 2010s look noisy compared to today's crisp, noise-free TV shows shot in the past 18-24 months.
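If you want a rough feel for why capturing more light means less noise, here's a back-of-the-envelope photon shot noise sketch (the photon counts are made up purely for illustration, not measured from any real sensor):

```python
import math

# Photon shot noise: a pixel that collects N photons has signal N and noise
# ~sqrt(N), so SNR = sqrt(N). A sensor that captures more light per pixel
# (better quantum efficiency, bigger photosites, better microlenses) is
# simply less noisy at the same exposure.
def snr_db(photons):
    return 20 * math.log10(math.sqrt(photons))

# Hypothetical photon counts for the same dim scene and exposure:
old_sensor = 1_000
new_sensor = 4_000   # 4x the captured light

print(f"old sensor: {snr_db(old_sensor):.1f} dB")  # ~30 dB
print(f"new sensor: {snr_db(new_sensor):.1f} dB")  # ~36 dB (2x the SNR)
```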
Past production chain:
"Low quality" native resolution source sensor --> "low quality" lossy codec for production (4:2:0) --> poor reencoding low bitrate (MPEG2, QT, whatever). --> CRT displays /Early LCD (usually low res, low dpi)
Current production chain:
"High quality" high resolution source sensor --> "high quality" lossy or almost lossless codec in production (4:4:4 or 4:2:2) --> Video Downsizing (even more noise reduction) --> high quality reencoding with higher bitrate (H264, H265, VP9, AV1) --> IPS/OLED displays (higher res, higuer dpi)
So:
- Porn was usually shot in poor lighting and/or on far from top-of-the-line cameras.
- The cost of access to professional-grade or even cinema-grade cameras and production workflows has dropped 100-fold (maybe more) in the past 20 years.
- The average consumer display has 6-20 times the pixels, 4-10 times the contrast, and 3-5 times the colour gamut of the DVD era (quick arithmetic on the pixel count below).
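For the pixel multiple, assuming the "6-20x" range means NTSC 1080p at the low end and PAL 4K at the high end, the arithmetic works out like this:

```python
# Where the "6-20x the pixels" ballpark comes from:
dvd_ntsc = 720 * 480     # 345,600 px
dvd_pal  = 720 * 576     # 414,720 px
fhd      = 1920 * 1080   # 2,073,600 px
uhd_4k   = 3840 * 2160   # 8,294,400 px

print(fhd / dvd_ntsc)    # 6.0  -> 1080p vs NTSC DVD
print(uhd_4k / dvd_pal)  # 20.0 -> 4K vs PAL DVD
```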
There you have it.