There are many cameras that shoot stills and video. I am primarily a stills photographer, but dabble in video, mainly for our church's YouTube channel. At home we have an HD TV. When I read reviews, e.g. on DPReview, there are often a lot of terms I do not understand.
Starting at the beginning, there are different codes used for video formats, such as MP4. Then there are codecs. More and more I see HD, 4K, 6K and 8K. I get that these are to do with resolution, but what devices will show these? 8K TVs seem thin on the ground. I also see comments that to future-proof videos, one should shoot in the highest resolution (8K at the moment?) and then downsample for current use. What software handles these resolutions and enables downsampling? What computers will handle these higher resolutions? What devices will show them?
Moving on, I see comments about cameras that can shoot 8-bit, 10-bit etc. The codes seem to have three numbers separated by colons, such as 4:2:0 or 4:2:2. What do these codes mean? What computers and software handle these different types of video, and what devices can display them once processed?
It all seems to be a rapidly changing environment and I feel like a dummy. I need a primer to explain it all. I have written to DPReview and asked if Jordan Drake can do an idiot's guide, and all they have said is that they will consider the request.
Any suggestions would be appreciated.
Jonathan