I don't have a 4K TV or monitor yet, but I'd like the video I make today to be compatible with future TVs and monitors. I assume I can get there either by oversampling later or by simply shooting 4K now.
However, I still cannot see the difference with 4K video as of today. Can anyone really see it? Maybe it's because the programs weren't shot on 4K cameras?
I could simply buy a digital camera with 4K video, but then another question comes up: there are two resolutions both claimed to be 4K, 4096 DCI and 3840 UHD. How do you choose?
There are two 4K numbers, and you've identified both of them.
One comes from TELEVISION, and that's the 3840.
It really comes down to aspect ratios. TV is 1:1.78 (16x9), whereas cinema is 1:1.85 or 1:2.35.
3840 is the 1.78 ratio for TV, sometimes called UHD.
4096 is the CINEMA standard, allowing for 1:1.85 (the full container is actually about 1.9, but they allow some wiggle room for masking in the cinema).
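If you want to see where those ratios come from, here's a quick back-of-the-envelope Python sketch (both standards share the same 2160-pixel height, only the width differs):

```python
# Comparing the two "4K" standards by aspect ratio.
uhd_w, uhd_h = 3840, 2160   # TV / UHD (16x9)
dci_w, dci_h = 4096, 2160   # Cinema / DCI 4K full container

uhd_ratio = round(uhd_w / uhd_h, 2)  # 1.78
dci_ratio = round(dci_w / dci_h, 2)  # 1.9

print(uhd_ratio)  # 1.78
print(dci_ratio)  # 1.9
```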
I wouldn't get too hung up on the numbers. Some people worry that their UHD camera isn't a "real" 4K camera, but that's kind of silly. Most people don't realise that the Arri Alexa, currently the most acclaimed high-end cinema camera, which shot most of The Revenant (winner of the cinematography Oscar), has a 2.7K or maybe 3.2K sensor.
It's moronic, but Netflix, when commissioning shows, insists on a 4K camera, which means the digital cinema camera that has pretty much been the gold standard for CINEMA PROJECTION isn't considered good enough for a low-data-rate streaming service on a home television.
It's total measurbating of the worst kind. So many things other than pixel resolution affect the end result.
So to answer your question directly: is your 4K TV a 16x9 TV? If so, you're fine with UHD, which will fill it nicely. If you shoot 4096, you'll either have to scale it down (leaving a black bar at the top and bottom) or crop it to... ummmmm... UHD.
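To put rough numbers on those two options, here's a small Python sketch of what fitting a DCI 4K frame onto a UHD display actually costs you (the figures are just arithmetic from the two standard frame sizes):

```python
# Fitting DCI 4K (4096x2160) onto a UHD (3840x2160) display.
src_w, src_h = 4096, 2160
dst_w, dst_h = 3840, 2160

# Option 1: scale to fit the width -> letterbox bars top and bottom.
scale = dst_w / src_w            # 0.9375
scaled_h = src_h * scale         # 2025 px tall after scaling
bar = (dst_h - scaled_h) / 2     # 67.5 px of black per bar

# Option 2: crop the sides down to UHD width.
crop_per_side = (src_w - dst_w) / 2  # 128 px thrown away each side

print(scale, scaled_h, bar, crop_per_side)
```

Either way the loss is small, which is another reason not to agonise over the choice.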
Don't get too hung up on it.