I'm pretty sure you'll only ever be able to have either an open source TV or all the streaming apps. No reason for Amazon, Netflix, YouTube, or the others to support this platform.
The biggest hurdle is the DRM. Plasma Bigscreen ships with its own browser, but you still need the proprietary DRM blob built for ARM rather than the usual Intel/AMD one. Making things worse, even on Intel/AMD the video quality is often capped at 1080p on Linux.
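If you want to check whether your own Linux box even has the blob, you can look for `libwidevinecdm.so` in common browser install locations. A rough sketch (the paths below are assumptions; distros and browsers install it differently):

```shell
# Hedged check for a Widevine CDM blob in a few common locations.
# These paths are guesses; adjust for your distro/browser.
found=no
for p in \
  /opt/google/chrome/WidevineCdm/_platform_specific/*/libwidevinecdm.so \
  /usr/lib/chromium/WidevineCdm/_platform_specific/*/libwidevinecdm.so \
  "$HOME"/.mozilla/firefox/*/gmp-widevinecdm/*/libwidevinecdm.so
do
  [ -e "$p" ] && { found=yes; echo "Widevine CDM: $p"; }
done
echo "widevine_present=$found"
```

On ARM there is often nothing to find, which is exactly the problem: the x86 blob won't run, and an official ARM Linux build is rarely shipped.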
Funny, since DRM is never an issue if you play pirated content. Arrr, mateys!
But ironically, DRM is only a huge annoyance for paying, law-abiding customers.
Did you pay Ubisoft $50 ten years ago for Assassin's Creed 2, an amazing game? Great, because now you can go f*ck yourself instead of playing it, since Ubisoft took the DRM server for that game offline. Did you pirate the game? Great, because now you can play it for free indefinitely.
And there are countless horror stories of paying customer bases getting shafted on the products they (used to) own via DRM.
Even worse, they sacrifice quality for bandwidth. Even if you manage to meet all their silly DRM requirements, you get poor-quality video that's high definition in name only. There are titles on Netflix with visible compression artifacts in frames that are 90% black.
Meanwhile pirates enjoy Blu-Ray rips encoded by people known for taking pride in providing the highest possible quality.
Are you watching YouTube on Linux by any chance? AFAIK modern versions of Chrome and Firefox shouldn't have this issue even on Linux, but I could be wrong, as I'm out of the loop on modern DRM.
Fortunately, since 2015 it has been legal to circumvent DRM for games whose single-player mode has been rendered inoperable by the decommissioning of an activation server.
Yes, it still sucks when publishers make their customers jump through hoops like this, potentially exposing them to malware if the necessary DRM circumvention software comes from a dubious source.
Seriously? Well, not connecting them to the internet works less and less: Amazon is rolling out its 900 MHz mesh via Echo devices, and there's LTE-M, NB-IoT, and the like.
The idea of not letting a device connect to the Internet is slowly becoming a thing of the past, unless you live in a Faraday cage (which is becoming more and more tempting). Oh, correction: thanks to mesh networks like Alexa's, living in a Faraday cage is not actually enough if at least one device is connected. Yay.
I do just fine not connecting proprietary IoT shit to the internet. Sometimes I have to bust out a soldering iron, but if it's hardware you own, it's always a choice.
Even with a mesh, devices still have to be authorised to get access.
Just don't do that.
The more difficult case is when they come with cellular modems built in, thus bypassing any of your infrastructure (which requires authorisation). That is technically possible, though probably commercially unacceptable: modems cost roughly $15 to $20 in bulk, which is significant at smart-TV scales, not counting the data cost.
More expensive IoT items, like modern cars, ship with factory-activated cellular-modem-based tracking. Your options today are to buy an older base-model vehicle or learn to use a screwdriver and a soldering iron.
> Even with a mesh, devices still have to be authorised to get access. Just don't do that.
Nope, the whole point of Amazon Sidewalk is that it "just works." I don't think there's even a way to know what devices are connected, let alone any kind of authorization.
LG webOS TVs are capable of image and sound fingerprinting at the OS level! Not only can they identify what you're watching, it's accurate enough to tell which scene you're on and provide "handy" information overlays.
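That kind of automatic content recognition (ACR) typically works by compressing frames into small perceptual fingerprints and matching them against a database. A toy sketch of the idea, not LG's actual pipeline (the hash, database, and titles here are all made up for illustration):

```python
# Toy ACR sketch: average-hash video frames and match against a tiny database.
# Real systems use robust audio/video fingerprints and a server-side index.

def ahash(frame, size=8):
    """Average-hash a grayscale frame (2D list of 0-255 ints) into a 64-bit int."""
    h, w = len(frame), len(frame[0])
    grid = []
    for by in range(size):            # downsample by block-averaging
        for bx in range(size):
            ys = range(by * h // size, (by + 1) * h // size)
            xs = range(bx * w // size, (bx + 1) * w // size)
            px = [frame[y][x] for y in ys for x in xs]
            grid.append(sum(px) / len(px))
    avg = sum(grid) / len(grid)
    bits = 0
    for v in grid:                    # one bit per cell: above/below average
        bits = (bits << 1) | (1 if v > avg else 0)
    return bits

def hamming(a, b):
    return bin(a ^ b).count("1")

# Hypothetical reference database: fingerprint -> (title, scene).
frame_a = [[(x * y) % 256 for x in range(64)] for y in range(64)]
frame_b = [[(x + y) % 256 for x in range(64)] for y in range(64)]
db = {ahash(frame_a): ("Show A", "scene 12"),
      ahash(frame_b): ("Show B", "scene 3")}

# "Capture" a slightly brightened copy of frame_a and look it up.
noisy = [[min(255, v + 3) for v in row] for row in frame_a]
best = min(db, key=lambda k: hamming(k, ahash(noisy)))
print(db[best])  # matches Show A: the hash is robust to small changes
```

The point of the hedged sketch is that the fingerprint is tiny compared to the frame, so a TV can cheaply compute and ship it home for matching, which is exactly why this is privacy-relevant.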