"8K UHD" I love the title lol.
There are no 4K TVs today. 4K is a standard @ 4096x2160; what we really have is UHD TVs @ 3840x2160, which the industry loves to term 4K. Regardless of the output resolution/bit depth, most TV sets are limited by their input electronics (e.g. HDMI 1.4, DisplayPort 1.1), which mostly run at 10.X Gbps. 4K @ 60 fps with 8 bits per colour channel requires roughly 12 Gbps. HDMI 2.0 and DP 1.2 will expand this, but are not commonplace yet. The terms UHD and 4K are used interchangeably, but incorrectly so.

Cinema (and other) projectors are true 4K, and have been for 15+ years. All cinemas that show Hollywood films are required to use top-spec projection units (that meet DCI requirements or better). The vast majority are based on Texas Instruments' 3-chip DLP, which is currently limited to 4K resolution. As of today, TI has no plans that I am aware of for producing 8K DMDs (chips). Wobulated 8K (a 4K DMD that takes up 4 separate positions) showed up 3 years ago, but isn't a slam dunk. "Faux-K" (wobulated HD) has shown up in cheaper projection units sold at big box stores and lower-end commercial installs, claiming 4K, but they are lying and frankly the image quality is crap compared to true 4K.
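To make the bandwidth claim concrete, here's a rough back-of-the-envelope calculation in Python. It ignores blanking intervals, chroma subsampling and link overhead (real interfaces need a bit more than the raw pixel rate), so treat the numbers as ballpark only:

```python
# Back-of-the-envelope uncompressed video bandwidth (raw pixel data only;
# ignores blanking intervals, chroma subsampling and link overhead).

def raw_gbps(width, height, fps, bits_per_channel, channels=3):
    return width * height * fps * bits_per_channel * channels / 1e9

print(f"DCI 4K @ 60 fps, 8-bit:  {raw_gbps(4096, 2160, 60, 8):.1f} Gbps")   # ~12.7
print(f"UHD    @ 60 fps, 8-bit:  {raw_gbps(3840, 2160, 60, 8):.1f} Gbps")   # ~11.9
print(f"8K UHD @ 60 fps, 10-bit: {raw_gbps(7680, 4320, 60, 10):.1f} Gbps")  # ~59.7
```

Either way you slice it, a ~10 Gbps HDMI 1.4 link can't carry 4K/60 uncompressed, and 8K is several times worse again.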
Semantics aside, why are there no 8K DLP plans? Japan is planning to broadcast the Olympics in 8K as a technology push, but a significant amount of infrastructure is required for it. Fortunately, as we move to video over IP (which the industry has been slow to adopt), it becomes easier to get that bandwidth without purpose-built hardware.
Let's take a step back first. If 8K is the next big thing, why is there so little 4K content from major studios available at home today, given that every cinema release is 4K? Outside of YouTube, GoPro footage and the like, it's actually hard to find.
Cost. 4 years ago, when "4K" TVs were hitting the mainstream market, the natural question of when studios were going to make 4K content available to the masses was posed. The answer? "We're not". The audience gasped, a few heart attacks ensued, but the reality is that 4K takes up 4x the space for studios, who keep all raw original footage. They often capture content at much higher bit depth (up to 16 bits per colour), which is why they can release re-mastered content years down the road as TV sets catch up.
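To put the storage cost in perspective, here's a hypothetical sketch. The runtime, frame rate and uncompressed-RGB assumptions are mine, not the studios'; real camera raw formats vary, but the 4x jump from 2K to 4K holds regardless:

```python
# Hypothetical raw-master storage estimate: bytes per frame * fps * runtime.
# Assumes uncompressed RGB at 16 bits per channel; actual camera raw formats
# differ, but going from 2K to 4K quadruples the pixel count either way.

def raw_terabytes(width, height, bits_per_channel, fps=24, minutes=120, channels=3):
    bytes_per_frame = width * height * channels * bits_per_channel / 8
    return bytes_per_frame * fps * minutes * 60 / 1e12

print(f"2K, 16-bit, 2-hour feature: {raw_terabytes(2048, 1080, 16):.1f} TB")  # ~2.3
print(f"4K, 16-bit, 2-hour feature: {raw_terabytes(4096, 2160, 16):.1f} TB")  # ~9.2
```

And that's only the final cut; studios keep all the raw takes, so multiply accordingly.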
The reality is, human eyes suck at distinguishing colour and resolution, but are adept at distinguishing light levels (in particular, light level changes). Cinema recently went to Rec. 2020 (a wider colour gamut, in particular with lasers), which looks good, but it is only a small improvement (IMO). Did you notice? Probably not. Think of how quickly your eyes notice a very short flash of light. It's a survival trait. We notice brightness much more than we do colour or spatial resolution. Most people cannot distinguish 4K from 8K on a television set. HDR is cheaper (adding 2 or 4 bits to a standard 8-bit pixel) and more impactful than quadrupling the number of pixels (and thus the data). 4K and 8K will come, in due time.
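A quick comparison of where the extra data goes, taking UHD 8-bit as the baseline (raw pixel data only; the 10-bit figure for HDR is my assumption):

```python
# Relative raw data size: adding bit depth vs adding pixels (ratios only).

def raw_bits(width, height, bits_per_channel, channels=3):
    return width * height * bits_per_channel * channels

base    = raw_bits(3840, 2160, 8)    # UHD, 8-bit SDR baseline
hdr10   = raw_bits(3840, 2160, 10)   # same pixels, 10-bit HDR
eight_k = raw_bits(7680, 4320, 8)    # 8K, still 8-bit

print(f"UHD 8-bit -> UHD 10-bit: {hdr10 / base:.2f}x the data")    # 1.25x
print(f"UHD 8-bit -> 8K 8-bit:   {eight_k / base:.2f}x the data")  # 4.00x
```

A 25% bump in data for a visible dynamic-range improvement, versus 4x the data for extra pixels most people can't see from the couch.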
So HDR (more specifically HDR10) to the rescue. Unlike 4K/UHD, HDR is not itself a standard. It is a general term (High Dynamic Range), implemented in various ways by different manufacturers. Currently there is much squabbling over whose variant will become the format standard. So just buying an HDR set today is not necessarily good enough; what matters is which HDR variant it uses.
Should you buy an HDR-compliant TV? I'll admit, I'm a fan, but until the format is settled and content is HDR-capable with newer compression standards, there isn't much value in paying an early premium. Notice how YouTube, Netflix and similar content is actually quite soft in appearance? Common compression standards (e.g. H.265/VP9) are typically limited to 8-bit by hardware decoders and will be the enemy for a while yet.
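As a rough illustration of why streamed content looks soft: assuming a ~16 Mbps UHD delivery bitrate (a ballpark figure I'm plugging in, not a quoted spec), the encoder has to discard an enormous amount of the raw signal:

```python
# Rough compression ratio for streamed UHD: raw 60 fps 8-bit pixel rate
# vs an assumed ~16 Mbps delivery bitrate (ballpark assumption, not a spec).

raw_bps    = 3840 * 2160 * 60 * 8 * 3   # ~11.9 Gbps uncompressed
stream_bps = 16e6                        # assumed UHD streaming bitrate

print(f"Compression ratio needed: ~{raw_bps / stream_bps:.0f}:1")  # ~746:1
```

Squeeze that hard at 8-bit and softness (plus banding in smooth gradients) is exactly what you get.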