DungeonMasterOne said: I had that using my phone to film Flying Scotsman in the dark the other week. But then I knew I was pushing things well past the limits with that shot, and it did stabilise once the lights of the coaches were going past; it just meant that the fire-lit cab of the locomotive wasn't in sharp focus as it passed me. Still looks OK though, and not many enthusiasts have the shot I got.
Isn't that annoying when it happens! Mind you, the fault isn't always on the lineside - sometimes it's on the track! I was at Bridgnorth for the SVR steam rally (2 years ago this week, I think) and witnessed a hilarious failure.
It had been an extraordinarily warm day for September (though not quite as impressive as a couple of weeks ago) but suddenly chilled right down at dusk for one of the last services of the evening at about 6pm, resulting in much condensation on the rails. I knew there was going to be trouble when Taw Valley emerged, tender-first, from the yard, completely unable to avoid slipping ... even as a light engine! Once coupled to a full train, that became far worse and it took over 5 mins before the last coach had passed the end of the platform, crawling away at about 4mph, even then! The scene would presumably be repeated at every stop. Astonishing that nobody thought to load the sanders, which all of us watching it were talking about! Spectacular but hilariously embarrassing for the train crew. I caught the whole lot from the footbridge on my DSLR.
Lizzie_Claymore said: To be honest 4K (or, more accurately, UHD, since true 4K is the 17:9 Digital Cinema Initiatives standard) is largely pointless, other than as a method for manufacturers to cajole people into buying new kit by bamboozling them with a load of marketing bollocks. The very act of calling it 4K demonstrates the cynical nature of the marketing: the figure refers to the number of pixels per picture *width* (4096 for DCI, and actually only 3840 for UHD), when all previous standards were named for lines per picture *height*. It's actually only 2160 lines high, not 4K.
Various research papers (e.g. BBC WHP092) have demonstrated that the visual acuity of the eye is insufficient to resolve the difference between even 1080 and 720 lines on an average-sized TV viewed at the distances available in a typical UK home. (Houses in the US are bigger, so can accommodate larger displays, on average.)
Therefore going to 2160 lines (i.e. 4K/UHD) provides 4x the number of pixels, which you're still unlikely to be able to see *unless* you're sitting closer than the intended minimum viewing distance! (SD: 5H, HD: 3H, UHD: 1.5H, where H is the image height.)
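To make those distance figures concrete, here's a quick Python sketch (my own arithmetic, not lifted from the papers): it checks whether a single pixel subtends the eye's roughly one-arcminute acuity limit at a given viewing distance. The 0.6 m picture height is an assumed example.

```python
import math

ACUITY_ARCMIN = 1.0  # approximate visual acuity: ~1 minute of arc

def pixel_resolvable(image_height_m, lines, distance_m):
    """True if a single pixel subtends at least ~1 arcminute at this distance."""
    pixel_height_m = image_height_m / lines
    subtended_arcmin = math.degrees(math.atan2(pixel_height_m, distance_m)) * 60
    return subtended_arcmin >= ACUITY_ARCMIN

H = 0.6  # assumed: a picture 0.6 m tall (roughly a 49" 16:9 set)
for label, lines, mult in (("SD", 576, 5), ("HD", 1080, 3), ("UHD", 2160, 1.5)):
    d = mult * H
    print(f"{label}: resolvable at {d:.1f} m: {pixel_resolvable(H, lines, d)}, "
          f"at {2 * d:.1f} m: {pixel_resolvable(H, lines, 2 * d)}")
```

At the design distance each format sits right at the acuity limit; sit any further back and the extra pixels simply vanish.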
UHD currently only comes into its own for fully immersive gaming / training simulators etc. or for being able to extract a subset as an HD image from the UHD original. (e.g. useful in fast moving sports to enable a digital 'zoom in'.)
It does, however, provide the basis for progressive (i.e. non-interlaced) images, high frame rate, high dynamic range, wide colour gamut and next gen audio. These are the factors that really increase quality. Just increasing the number of pixels is the con that's pushed by the manufacturers to make you buy the new kit. It also means you fill your disk at 4x the rate of HD and need higher bandwidth transmissions (and/or greater compression again, which can introduce unwanted artefacts, countering the supposed increase in quality).
Until all those other factors are available (which mostly aren't yet), I have no desire to fill up my disk at 4x the rate of HD for no significant benefit in most cases, unless I'm peering closely at the screen which for most of the time, I'm not. As David Wood (former director of technology for the EBU) once said "We don't need *more* pixels, we need *better* pixels"!
My recommendation would be to shoot at a higher non-interlaced frame rate in HD rather than moving to UHD. Remember that image quality is based on *dynamic* resolution, which is a function of both the static resolution (number of pixels) AND the refresh rate (frames/sec). Shooting progressive (i.e. non-interlaced) HD at 50 frames/sec (most of the world) or 60 (59.94) frames/sec (USA/Japan) will thus provide a significant improvement in perceived quality compared with 25 or 30 (29.97 fps) without the massive increase in cost and overheads that moving to UHD requires.
It also provides a perfect down-conversion to lower standards when required, since halving the frame rate again (or converting alternate frames into odd/even fields for interlacing) merely gets you back to conventional HD without introducing visible artefacts. Set your camera frame rate to match the mains frequency in your area to avoid strobing and aim for 1080p/50 or 1080p/60(59.94) accordingly.
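As a sketch of that down-conversion (function names are mine; frames assumed to be NumPy arrays with row 0 at the top, and field order is just illustrative):

```python
import numpy as np

def p50_to_p25(frames):
    """Halve the frame rate: keep every other complete frame."""
    return frames[::2]

def p50_to_i25(frames):
    """Weave pairs of p/50 frames into i/25 frames: alternate lines from
    the first frame of each pair, the remaining lines from the second."""
    woven = []
    for first, second in zip(frames[0::2], frames[1::2]):
        frame = first.copy()
        frame[1::2] = second[1::2]  # the later field replaces alternate lines
        woven.append(frame)
    return woven
```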
As has been stated above, there is no substitute for decent sized image sensors (better signal:noise ratio) and high quality glass (higher modulation transfer function). What phones can do is miraculous for their size and cost (and is often adequate for amateur use) but it's still not as good as proper sensors and good glass even when used at a lower resolution and it never will be, for those reasons.
Incidentally, there are still reasons to use a true video camera rather than a DSLR that can also shoot video. A sensor capable of resolving very-high-resolution stills means its lenses must pass spatial frequencies well in excess of what the video mode's sampling can resolve. This results in a phenomenon known as 'spatio-temporal aliasing', the effects of which become very visible when there is any movement involving fine patterns, which appear to move in the opposite direction to the main movement.
A true video camera (e.g. a 3-chip broadcast camera with a trichroic splitter block) has its optical path frequency response tailored to the video resolution in use to avoid this problem but it means that it can't be used to capture high-resolution stills. Having one type of camera to do one type of job provides the peak of quality. Having one type of camera to do all types of job always results in compromises. When the camera is merely a bolt-on to a multifunctional device (a phone) even more so!
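To see the 'wrong direction' effect numerically, here's a toy one-dimensional Python model of my own (not anything from a real camera pipeline): a grating finer than the sampling grid is sampled, and the alias drifts the opposite way when the pattern moves.

```python
import numpy as np

fs = 100.0     # assumed sensor sampling rate: 100 photosites per mm
f_pat = 70.0   # fine grating at 70 cycles/mm, beyond Nyquist (fs/2 = 50)

x = np.arange(0, 1, 1 / fs)               # photosite positions across 1 mm
for shift_mm in (0.000, 0.001, 0.002):    # grating moving to the right
    sampled = np.sin(2 * np.pi * f_pat * (x - shift_mm))
    spectrum = np.fft.rfft(sampled)
    peak = int(np.argmax(np.abs(spectrum)))  # alias lands at fs - f_pat = 30
    print(f"shift {shift_mm:+.3f} mm: alias at {peak} cycles/mm, "
          f"phase {np.angle(spectrum[peak]):+.3f} rad")
# the phase *increases* as the grating moves right, i.e. the sampled
# pattern appears to drift left - opposite to the true motion
```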
Most of the studies are outdated and some have been found to be possibly fabricated. It was once thought that human visual acuity was 20/20 (6/6 in metric), but when you take an eye test they correct your eyesight to 6/4.5, meaning that at a distance of 6 metres you can see what the outdated "average" sees at 4.5 metres. Some of the best, such as professional athletes, have 6/3 eyesight.
As for the idea that the average British living room is too small: when you walk round a neighbourhood at night (since many people don't draw the curtains in their living rooms) you can see how massive their TVs are.
I don't agree that 1.5x the image height is required to see 4K; you can do so from even further away than the screen's diagonal.
As someone pointed out here, sales are split between HD and 4K, and that's despite almost no mobile devices and very few laptops having 4K screens, so people are making the effort to watch it on their TVs.
Almost all the world's broadcasters have gone with 1080i instead of 720p, despite the menace of interlaced video, because of the higher static resolution. Only ESPN, Fox and ABC in the United States chose 720p; they switched earlier because the NTSC system has lower resolution.
Many mirrorless cameras omit the optical low-pass filter on the claim that the sensor out-resolves the lens and diffraction; that's often not true, and aliasing is often visible.
A 3-chip camera is much better, as a single sensor relies on Bayer interpolation, which results in a loss of spatial resolution and moiré artefacts; but it's getting impossible to align splitter blocks for today's higher-resolution, larger sensors, and people value shallow depth of field more.
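For anyone wondering what Bayer interpolation actually costs, here's a minimal NumPy sketch of my own (bilinear, green channel only, RGGB layout assumed); a real demosaicker is cleverer, but the estimating step is the same in kind:

```python
import numpy as np

def green_from_bayer(raw):
    """Fill in the green channel of an RGGB mosaic (interior pixels only).
    Half the photosites sampled green; the rest are estimated by averaging
    their four green neighbours - that estimate is where resolution is lost."""
    h, w = raw.shape
    green = raw.astype(float)
    missing = np.zeros((h, w), dtype=bool)
    missing[0::2, 0::2] = True   # red photosites: no green sample here
    missing[1::2, 1::2] = True   # blue photosites: no green sample here
    ys, xs = np.nonzero(missing[1:-1, 1:-1])
    ys, xs = ys + 1, xs + 1
    green[ys, xs] = (green[ys - 1, xs] + green[ys + 1, xs] +
                     green[ys, xs - 1] + green[ys, xs + 1]) / 4
    return green

# a one-pixel checkerboard - detail at the mosaic's own pitch - comes back flat:
raw = np.fromfunction(lambda y, x: (y + x) % 2, (6, 6))
print(green_from_bayer(raw))
```

A 3-chip camera sidesteps all of this: the splitter block delivers a true sample of every channel at every photosite, which is exactly where the alignment burden mentioned above comes from.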
DungeonMasterOne said: Right now people don't want to be wasting 5 gig of phone storage space on every WAM video they download. But in the future when everyone has a quantum smartwatch with a holographic projector display and multiple petabytes of on-board storage, it'll be good to be able to remaster Full-HD scenes to 4K.
... even though the eye cannot resolve such levels of detail (as proved by the research papers - visual acuity 1 minute of arc)? That's my point. Even if we ever manage to get to holographic projection, your original was still shot in 2D so will only ever be viewable in 2D, whether that's in HD or 4K.
We're now reaching the stage with video where manufacturers are doing what audio manufacturers have done for about 50 years (i.e. selling stuff to people that they can't actually detect while insisting that it's better). Hi-Fi nutters have long argued that they can hear nuances at 35-40 kHz but double blind controlled trials have proved that they can't. Nonetheless, they continue to believe it, so manufacturers continue to sell them more and more expensive kit that they lap up. "Emperor's New Clothes syndrome."
There are circumstances where 4K does make sense, as outlined above - but they're generally few & far between - and taking resolution above that is even more ridiculous and requires far better optics (at insane cost), otherwise the MTF of the lenses acts as a low pass filter on the input path to the sensor anyway, making the increased sensor resolution completely pointless (again)!
The engineers, meanwhile, know that the other factors that are harder to explain and sell to the public (HDR, HFR, progressive scanning only, NGA, WCG) are the things that will really improve the perceived quality over the simplistic 'numbers game' of pixels... but the latter point is the one that's easy to market so that's what the marketing people keep on doing.
DungeonMasterOne said: Always shoot in the highest resolution you can possibly afford.
I would agree with this but with the caveat "up to the point where it becomes silly!" Let's not forget that we're talking about largely transient fetish videos, not historical records of detailed scientific discoveries where even the tiniest detail might be important.
There is no proof in science; there is only theory backed by evidence, which can be flawed or overturned by later evidence. Proof exists only in mathematics and logic, so you cannot say something is "scientifically proven".
16/44.1 is insufficient for audio because of aliasing. You know what aliasing is from your previous post. For 16/44.1 you need a brick-wall low-pass filter that perfectly attenuates everything beyond 22.05 kHz without affecting the passband, and such a steep filter, analogue or digital, is impossible. The question then becomes whether the blurring caused by the low-pass filter is audible; a meta-analysis seems to suggest so. By moving to 96 kHz or 192 kHz, your low-pass filter can be much gentler. Don't forget you also need a low-pass filter on reconstruction, because of the mirror images generated. The SACD format samples at 64x the CD rate at 1 bit, so no low-pass filter is required on sampling, but 1 bit is extremely noisy; all that noise is shifted above 20 kHz, so on reconstruction you can use a gentler filter that doesn't brick-wall, hence why SACD can sound much better than CD.
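To put numbers on the aliasing point (a toy sketch of mine, not a claim about any particular converter): an ultrasonic component that leaks past an imperfect anti-alias filter folds back into the audible band at 44.1 kHz, but not at 96 kHz.

```python
def alias_hz(f_signal, f_sample):
    """Frequency at which an unfiltered component appears after sampling."""
    r = f_signal % f_sample
    return min(r, f_sample - r)

# a 30 kHz component leaking past a gentle low-pass filter:
print(alias_hz(30_000, 44_100))   # 14100 -> folds into the audible band
print(alias_hz(30_000, 96_000))   # 30000 -> below Nyquist, stays ultrasonic
```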
The MTF of the lenses does not make 4K pointless: today's lenses easily resolve 50+ megapixels. What is a problem for 8K, however, is diffraction. Getting sufficient depth of field requires stopping down, which increases diffraction; at 33 MP you begin losing contrast at f/5.6, although it's slight, and f/8 sees more pronounced diffraction. That's 8K territory; the vast majority of lenses easily resolve 4K.
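Putting rough numbers to the diffraction point (a sketch assuming green light and a full-frame 33 MP sensor; the figures are ballpark):

```python
def airy_diameter_um(f_number, wavelength_nm=550):
    """Diameter of the Airy disk (first null), in micrometres."""
    return 2.44 * wavelength_nm * f_number / 1000

pitch_um = 36_000 / 7008   # ~5.1 um pixel pitch for a 7008x4672 (33 MP) full frame
for n in (4, 5.6, 8, 11):
    print(f"f/{n}: Airy {airy_diameter_um(n):4.1f} um vs pixel {pitch_um:.1f} um")
# the blur spot roughly matches the pixel by f/4 and clearly exceeds it by f/5.6
```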
What is a con is phone makers claiming their phones shoot 4K, which they don't. Their phones have 12 MP in 4:3, so 4000x3000. The thing is, these are Bayer sensors, which require interpolation, so you're losing resolution. The same is true of 50 MP phones: they are Quad Bayer, so their Bayer pattern is only equivalent to 12.5 MP.
I disagree that HDR and WCG are more important: the dynamic range of the human eye without the iris opening and closing is possibly only around 6.5 stops, and the human eye is far more sensitive to luminance than chrominance. I'm not going to claim that 8-bit is sufficient for viewing and that 10-bit is only editing headroom, but 4K is a more important upgrade from HD. HFR is important too.
Nostalgic Erotica Prod said: FPS standard is 30 fps for broadcast.
Only for the Americas and Pacific Rim countries! [I think you guys are also now stuck with the 29.97 fps equivalent if you want to maintain backward compatibility with broadcast standards, and with having to work with drop-frame timecode, which originally resulted from the need to adjust the frame rate when NTSC colour was introduced because of interference with the colour subcarrier.]
The rest of the world runs at 25 & 50 Hz and doesn't need to deal with any drop-frame timecode. (You have to shoot to match the mains frequency (60 Hz or 50 Hz) in whatever part of the world you're in, to avoid a very obvious 10 Hz strobing from mains lighting: so 59.94 or 29.97 for the Americas and 50 or 25 for the rest of us.)
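The strobing figure is just sampling arithmetic; a small sketch of mine shows where the 10 Hz comes from:

```python
def beat_hz(flicker_hz, fps):
    """Apparent beat when a camera at fps samples light pulsing at flicker_hz."""
    r = flicker_hz % fps
    return min(r, fps - r)

# 50 Hz mains -> lamps pulse at 100 Hz (twice per cycle):
print(beat_hz(100, 30))   # 10 -> the obvious ~10 Hz strobing
print(beat_hz(100, 25))   # 0  -> locked to the mains, no visible beat
```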
Nostalgic Erotica Prod said: There have been attempts to run at 60 fps; however, the problem when you get to those frame rates is that sometimes it will actually make the footage look worse.
There's no real reason why that should be the case so long as the frame rate is an exact multiple and the sensor's integration time (a.k.a. 'shutter speed') has been adjusted accordingly. For years, the EBU's recommendation was to *shoot* at 1080p/50, which allows exact downconversion for *emission* to 1080i/25 (interlaced video look) or 1080p/25 (cinema look). These days, that can now be raised to 2160p/50 or 2160p/100 if you have the gear that can do it and don't mind the storage requirements and bitrates resulting from it. The important point is that both the resolution and the frame rate are exact multiples.
Since the fields for i/25 are made by merely throwing away the odd or even lines of alternate p/50 frames, the down-conversion is a perfect reconstruction (not that you'd *want* to introduce the hideous artefacts of interlacing, which become visible on modern progressive-scan screens (e.g. LCD), when you don't need to!). Similarly, if you go to p/25, you merely throw away every alternate complete frame. You can even make p/25 behave for interlaced emission by treating it like telecine, through creation of PsF, where the signal behaves as two interlaced fields derived from the same original frame; so there's lots of flexibility available.
If the 'shutter speed' is fixed to 1/50th or fractionally faster, it's very close to what would have been used in cine cameras anyway, so allows for both the video look and film look by merely converting the scanning standards downwards. (It is essential to understand in detail what the scanning standards are doing, however, in order to get the best out of it ... as is always the case with any technology!)
Obviously all these figures would be replaced with 59.94 and 29.97 for you guys over there, and the 'shutter speed' would then need to be fixed to 1/60th. Creating the cine look that way might result in fractionally more judder for you, because you'd be needing the slightly faster 'shutter speed' to cope with your faster frame rate, so as not to smear across frames; which conversely means that you can't take advantage of the slower shutter at 1/50th to reduce 'film judder', in exactly the same way that cine cameras do. (The use of stupidly short shutter speeds results in the masses of flicker and judder that are often seen, instead of performing what is, in effect, a low-pass temporal filter through the longer integration time.)
The whole world's computers and mobile-device displays refresh at 60 Hz, which is what most people watch on. If you shoot 24, 25 or 50 fps you are going to get motion judder, as they are not multiples of 60. With 29.97 and 59.94, the frame repeat every 1000 frames is imperceptible.
You should shoot at 29.97 or 59.94 to match the computer and mobile displays where the vast majority watch; even televisions that can display 25/50 are often not set up correctly, as people don't change the refresh rate on their computers or streaming sticks when connected to a TV for viewing 25/50 fps content.
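A quick sketch of the judder argument (my own illustration): count how many 60 Hz refreshes each source frame is held on screen for.

```python
def hold_pattern(fps, refresh_hz=60, n_frames=12):
    """Number of display refreshes each source frame occupies."""
    edges = [round(i * refresh_hz / fps) for i in range(n_frames + 1)]
    return [b - a for a, b in zip(edges, edges[1:])]

print(hold_pattern(25))  # mix of 2- and 3-refresh holds -> uneven motion, judder
print(hold_pattern(30))  # every frame held for exactly 2 refreshes -> smooth
print(hold_pattern(24))  # the familiar 3:2-style pulldown alternation
```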
As for light flicker, all modern fluorescent and LED lights run at a much higher rate than the 50 or 60 Hz of AC electricity, so there is no need to shoot at 25 fps in Europe to avoid lighting flicker. You should only shoot 25 fps if your footage is going to be broadcast on TV.
There are some Android phones with adaptive refresh rates, but for video they default to 60 Hz in most cases.
My advice would be to use 29.97 or 59.94 wherever you are in the world, as it matches your display's refresh rate.
A faster shutter speed at NTSC frame rates does not cause stutter, because there are more frames; what causes strobing and stutter is excessively high shutter speeds, which make each frame more distinct. This often can't be helped, especially with phones, as you can't fit an ND filter or close the aperture to reduce the amount of light and prevent blowing out. As long as you're using a 180-degree shutter, so 1/120 for 60 fps, you're good.
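For what it's worth, the 180-degree rule is trivial to compute (a sketch of mine, using exact NTSC rates as fractions):

```python
from fractions import Fraction

def shutter_180(fps):
    """180-degree shutter: exposure is half of the frame interval."""
    return 1 / (2 * Fraction(fps))

print(shutter_180(60))                     # 1/120 s
print(shutter_180(Fraction(60000, 1001)))  # 1001/120000 s, ~1/119.88 for 59.94 fps
```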
The number 1 reason I wouldn't use just a phone for video is the focal length of the main sensor, which is the largest one. To squeeze the larger sensors of modern phones into the body, the focal length is extremely wide-angle, leading to lots of perspective distortion. The other lenses are lower quality.
For video, because we tend to sit further from the screen than when viewing a photo, and because our eyeballs are round with a curved retina and our eyes scan to build a picture, you need to use a longer focal length for the image to seem natural compared with a photograph.
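To illustrate with rough numbers (the sensor and focal lengths here are assumptions, since phones vary):

```python
import math

def horizontal_fov_deg(focal_mm, sensor_width_mm):
    """Horizontal angle of view of a rectilinear lens."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

# assumed: a ~6 mm phone main lens over a ~9.8 mm-wide sensor,
# vs the classic 'natural' 50 mm on a 36 mm-wide full frame
print(f"phone main camera: {horizontal_fov_deg(6.0, 9.8):.0f} deg")   # ~78 deg
print(f"50 mm full frame:  {horizontal_fov_deg(50.0, 36.0):.0f} deg") # ~40 deg
```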