VU Premium Android 4k TVs aren't really HDR! Please go through this thread before buying one.

Vu just sells rebranded Hisense TVs; you can read the Hisense reviews to get an idea of how its HDR performs. The Premium Android is Hisense's H8E.

Without high peak brightness (1000 nits is ideal), HDR is pointless, though you can get decent results with efficient tone mapping at close to 500 nits too. Panasonic's 2019 range seems to show some promise there, specifically the GX800, if they launch it here with VA panels. Full-array local dimming, or efficient local dimming in an edge-lit set such as Sony's X930E, also helps. Hisense's 2018 models aren't good for HDR.

The 2019 Hisense models, however, the H8F and H9F, seem promising. So we'll have to wait and watch whether Vu gets these sets, at least the H8F, in all their glory next year. I say that because the H8E does have local dimming, even if it's a poor implementation, but the Vu Premium Android doesn't.
 
Okay, so I just discovered that the TV can play Dolby Vision content on Netflix. I created a Netflix Premium account yesterday with the 30-day trial and, man, the quality is stunning! The only downside is this warning message: 'This device is not optimized for Netflix (-14).' Check this video I made while playing the last Black Mirror episode: it shows Dolby Vision but no mention of HDR10 whatsoever: https://photos.app.goo.gl/hdiNih7PEa9UKXCU8. The playback quality is very similar to my downloaded 1080p x264 Blu-ray rip (4.5 GB) playing via Plex; I could hardly distinguish the black levels and vividness of this episode between Netflix and Plex. This TV is not Netflix-certified, yet the app still works; people say some kind of patch was used, I don't know. The app UI is insanely ancient-looking, though: the only options are Sign out, Change account and Help. Like, wtf!? At least give an option to change the resolution or audio quality like Plex does! I will check the version and report here later. Standby...
 
Update: The version is the latest, 6.2.2 build 2573 or something like that, released on May 21. Netflix for Android TV hasn't been updated since, but I'm happy for now that Dolby Vision works, no matter how. My anger at this company has come down drastically, lol, but it's still there.
 


Rule of thumb #1 when buying a low-cost TV: never buy it for its playback ability/smarts, but always for the panel quality.
Always couple it with a good-quality playback device.

If you just want local LAN playback, pick a cheap Android Amlogic box and flash it to CoreELEC.
If you want LAN + streaming and are OK with some incompatibilities (e.g. no HDR10 on Netflix although DV works), get an Amazon Fire TV Stick 4K.
If you want everything with ease of use, get an Apple TV 4K.
If you want everything coupled with hours of tweaking and head-scratching, get an Nvidia Shield.

On second thought, this rule of thumb applies to expensive TVs too.

PS: In all likelihood, your TV has a 10-bit panel. As mentioned before, the problem was with your rendering choice back when you were ready to throw out the TV.
You still have a problem (although a less severe one), as I can tell you with near certainty that the sideloaded Netflix app won't come close to what you get with true Netflix support.
 

Well said :)

BTW I downloaded Netflix from the Play Store officially, but this TV isn't Netflix-certified. Now what bugs me is this: if you're suggesting this panel is 10-bit, how come only the Dolby Vision logo appears with Netflix content? Why isn't HDR10 appearing alongside it? I know it might sound stupid, and I'm aware that DV and HDR10 are two sides of the same coin, but doesn't a TV first need to support HDR10 to then support DV, or is my case just DV and no HDR10? I get that they implemented DV in this TV because of DV's backward compatibility with 8-bit and 10-bit HDR panels.

Coming to rendering via my PC: why are Windows Color Settings and the Nvidia Control Panel still showing 8 bits of color depth if the panel, as you're asserting 'in all likelihood', is 10-bit?
 
Maybe Netflix is in bed with Dolby Labs to promote Dolby Vision.
On my LG 55C8, using native apps:
- Netflix always has only DV logo
- Prime shows DV or HDR logo depending on content

Cheers,
Raghu
 
I think you are mixing up 10-bit color with HDR10.
10-bit color (i.e. wide color gamut) is what you are after.
HDR10 and DV are just competing standards (like, say, Betamax vs VHS) for rendering 10-bit color (or what is often termed HDR).
DV, HDR10 and HLG are all high dynamic range formats, though, and all support 10-bit color (which I presume is what you are looking for); see the quick arithmetic below.
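To put numbers on that, a quick back-of-the-envelope of my own (not from any spec sheet):

```python
# Steps per channel and total colors at each bit depth. More steps means
# finer gradients and less visible banding, which is what HDR formats exploit.
for bits in (8, 10, 12):
    levels = 2 ** bits        # code values per color channel
    colors = levels ** 3      # R x G x B combinations
    print(f"{bits:2d}-bit: {levels:4d} levels/channel, {colors:,} total colors")
```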

As for Windows color settings showing 8 bits: sorry, can't help you there.
Just my opinion, but Windows for consumers in its current state is an outright disaster and I wouldn't touch it with a bargepole!
MS today can't even get basic retail customer expectations right, like a reliable BT stack or touchpad drivers, even on its own Surface devices.

Expecting them to get 10-bit color working right off the bat with third-party hardware (or at least without 24 hours of googling/driver swaps/registry edits etc.) is a bit rich if you ask me.
I had my fun whiling away my time fixing Windows, but that was in an era long gone by...

PS: If your panel were actually 8-bit, you wouldn't have got that washed-out color scenario you had earlier. That scenario is a clear indication of a 10-bit display switching to 10-bit mode but being fed 8-bit YCbCr data instead.
 
But my Dell S2240L SDR monitor also displays washed-out colors when I try to play any 10-bit content via both MPC-BE x64 (using madVR) and MPC Classic x64. Granted, it is 8-bit and displays both sRGB and YCbCr signals, but my TV shows the exact same response. Why is that?

PS: A guy emailed Vu tech support some questions, and his question about whether the panel used is 10-bit got this reply: "No, the panel is 8-bit, but the TV has hardware called a TCON which displays 10-bit data." So they're exploiting some hardware to display 10-bit data, but then why have I still not been able to play it to date? I tried playing off an external drive and a USB flash drive, tried many, many media players on the TV, tried Plex and Smart YT (selecting HDR in video quality), you name it, but every single instance is stuck displaying a decolorized picture. I don't know why you're asserting my panel is truly 10-bit.

Oh, and I forgot to mention that in the official TV spec sheet, the color depth is listed as 8-bit (see attachment).
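For anyone else stuck at this point: one way to confirm whether a file actually carries HDR10/PQ signalling is to inspect it with ffprobe. A minimal sketch, assuming FFmpeg is installed and on PATH (the helper name is mine):

```python
import json
import subprocess
import sys

def probe_hdr(path):
    """Print the color metadata of the first video stream and a rough HDR verdict."""
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-select_streams", "v:0",
         "-show_entries", "stream=color_transfer,color_primaries,color_space",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    ).stdout
    stream = json.loads(out)["streams"][0]
    print(stream)
    # smpte2084 is the PQ transfer function used by HDR10.
    is_hdr10 = (stream.get("color_transfer") == "smpte2084"
                and stream.get("color_primaries") == "bt2020")
    print("Looks like HDR10/PQ content:", is_hdr10)

if __name__ == "__main__":
    probe_hdr(sys.argv[1])
```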
 

He isn't exaggerating all that much, but he surely isn't well-researched. The specs he mentions aren't present in 99.8% of TVs. Obviously you need at least 700+ nits of brightness or so to display good contrast on a 55" panel, and the WCG requirement is somewhat true, but DV does not need at least 12-bit: DV is backwards compatible with 10-bit and 8-bit panels like mine.
No. VESA's DisplayHDR standard requires a minimum of 400 nits and an 8-bit panel for HDR. VESA DisplayHDR is the HDR standards and certification program to which many big as well as small companies are party.
The guy above said HDR requires 1000 nits, which is completely false.
VESA requires 400 nits because HDR is open and anyone can adopt it, but there has to be some good in it; that's why the floor exists.
It is HDR10 and above that requires a 10-bit panel, not HDR as such.
The guy above also says Dolby Vision requires 3000+ nits.
I mean, how wrong can you be?
Currently, not many (if any) consumer-class displays can produce more than 2000 nits.
Dolby's own Dolby Vision lab in the UK uses a display (the Dolby Pulsar) that reaches 4000 nits max to analyze videos and certify them for Dolby Vision.
If you observe, in this video, Dolby Labs uses a 400-nit display to compare against the Dolby Pulsar.
So even Dolby Vision can be used with 400-nit displays.
In July, TCL is launching the EP68 TV in Europe, which carries Dolby Vision with just a 400-nit display.
Forget HDR: if these overly exaggerated brightness requirements for Dolby Vision were true, how would Dolby grant a super-strict standard like Dolby Vision to a 400-nit TCL panel?
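For reference, this is roughly how VESA's tiers stack up (numbers from memory of the published DisplayHDR spec, so treat them as approximate):

```python
# Approximate VESA DisplayHDR tiers; the point is that "HDR" on a box can
# mean anything from a 400-nit 8-bit panel up to a 1000-nit local-dimming one.
displayhdr_tiers = {
    "DisplayHDR 400":  {"peak_nits": 400,  "min_depth": "8-bit"},
    "DisplayHDR 600":  {"peak_nits": 600,  "min_depth": "10-bit (8-bit + FRC allowed)"},
    "DisplayHDR 1000": {"peak_nits": 1000, "min_depth": "10-bit (8-bit + FRC allowed)"},
}
for tier, spec in displayhdr_tiers.items():
    print(f"{tier}: peak {spec['peak_nits']} nits, panel {spec['min_depth']}")
```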
 
When I researched before buying my LG 55C8, I remember learning this:
HDR10 is a baseline on which DV is built.
HDR10 allows metadata to be carried and applied to the content being displayed.
This is static metadata. DV builds on top of it using frame-by-frame or scene-by-scene metadata (see the sketch below),
so the brights and darks are realistic enough to match what the director intends.
Dolby, with all its muscle, is surely gonna push its magic through by working with studios and producers.
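To illustrate the static-vs-dynamic distinction with made-up numbers (the field names follow SMPTE ST 2086 / CTA-861 conventions, but every value here is hypothetical):

```python
# HDR10 ships ONE static metadata block for the whole title...
hdr10_static = {
    "mastering_display": {"primaries": "P3 (in a BT.2020 container)",
                          "min_nits": 0.005, "max_nits": 1000},
    "MaxCLL": 980,    # brightest single pixel anywhere in the title, in nits
    "MaxFALL": 300,   # brightest average frame in the title, in nits
}

# ...while Dolby Vision conceptually adds per-scene (or per-frame) trim data,
# so the TV can re-map a dark scene and a bright scene differently.
dolby_vision_dynamic = [
    {"scene": 1, "avg_nits": 40,  "peak_nits": 600},   # night interior
    {"scene": 2, "avg_nits": 220, "peak_nits": 4000},  # sunlit exterior
]
```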

Video processing has become so complicated now that it makes no sense to armchair-decode the mechs-n-specs.
Just take a demo, look at the features that matter, fix a price point and choose a panel.
Cheers,
Raghu
 
HDR is mastered to 1000 nits, in some cases 4000 nits. So on any TV with lower peak brightness, the content is tone mapped, and how effective that is depends on how well it's done. Panasonic and Sony are generally the leaders there.
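A minimal sketch of what such tone mapping does, assuming a simple Reinhard-style highlight roll-off (my own illustration, not Panasonic's or Sony's actual algorithm):

```python
def tone_map(scene_nits, display_peak=500.0, knee=0.75):
    """Pass dark/mid tones through untouched; roll off highlights above the knee."""
    knee_nits = knee * display_peak
    if scene_nits <= knee_nits:
        return scene_nits
    # Compress everything above the knee into the remaining headroom,
    # asymptotically approaching (but never clipping at) display_peak.
    excess = scene_nits - knee_nits
    headroom = display_peak - knee_nits
    return knee_nits + headroom * excess / (excess + headroom)

for nits in (100, 375, 1000, 4000):
    print(f"{nits:5d} nits mastered -> {tone_map(nits):5.1f} nits on a 500-nit TV")
```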

Nonetheless, for HDR to really pop, not only do you need higher peak brightness, you also need effective local dimming to deepen the blacks. Otherwise, the blacks will look washed out because of the limited contrast of even VA panels. That's why HDR looks meh even on Samsung TVs that cost three times as much as the Vu Premium Android.

Hopefully, next year we should get better TVs with FALD coming to India, from both TCL and Vu/Hisense.
 
4K 60 fps 10-bit color is not possible over HDMI 2.0 without chroma subsampling. Your Windows 10 PC should have the latest updates and drivers, 1903 being the latest build, and HDR should be turned on. You may try with the integrated graphics first if you have an 8th- or 9th-gen CPU. On the TV end, the HDR settings should be set properly, although I believe the problem is on the PC side. You can use VLC's default renderer, or test MPC with the madVR video renderer, with hardware decoding enabled for smooth playback. 1000+ nits is great, but 700+ should be OK.
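The arithmetic behind that HDMI 2.0 limit, as a rough sketch (standard 4K60 timings including blanking; 8b/10b coding leaves about 14.4 of the 18 Gbit/s for pixel data):

```python
EFFECTIVE_GBPS = 18.0 * 8 / 10    # HDMI 2.0 after 8b/10b TMDS encoding

def needed_gbps(bits_per_channel=10, chroma="4:4:4",
                h_total=4400, v_total=2250, fps=60):   # 4K60 incl. blanking
    # Average samples per pixel for each chroma subsampling scheme.
    samples = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return h_total * v_total * fps * samples * bits_per_channel / 1e9

for chroma in ("4:4:4", "4:2:2", "4:2:0"):
    need = needed_gbps(chroma=chroma)
    verdict = "fits" if need <= EFFECTIVE_GBPS else "does NOT fit"
    print(f"4K60 10-bit {chroma}: {need:5.2f} Gbit/s -> {verdict}")
```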
 

I assume you have bought the Vu Premium Android 4K TV. Apart from the HDR stuff, how do you find regular HD and Full HD content rendering on this TV? I am planning to buy it in a couple of months and am more interested in the HD and Full HD stuff.
 
Where do you get such great knowledge?
Please show me the path to that Alexandria.

Please find the link below for your reference. By the way, I was talking about true HDR10 & Dolby Vision, not so-called HDR & Dolby Vision.
 

The official Dolby Vision documentation says this:

"
The Dolby Vision Master must:
• Be graded on a display with a minimum peak brightness of 1,000 nits and a minimum contrast
ratio of 200,000:1 set for P3 color gamut and PQ (SMPTE-2084) EOTF. A Rec.2020 display may
also be used if required.
• Be delivered as an RGB, 16-bit, P3, D65, PQ, or Tiff image sequence....."
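For the curious, the PQ (SMPTE ST 2084) EOTF mentioned there is fully published; here is a sketch built from its standard constants:

```python
# PQ constants from SMPTE ST 2084.
M1 = 2610 / 16384          # ~0.1593
M2 = 2523 / 4096 * 128     # ~78.84
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(code: float) -> float:
    """Decode a normalised PQ code value (0..1) to luminance in nits."""
    e = code ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

# Half the code range is only ~92 nits: PQ packs most codes into shadows and
# mid-tones, which is why PQ content decoded as plain gamma looks washed out.
for code10 in (0, 512, 1023):
    print(code10, f"-> {pq_eotf(code10 / 1023):7.1f} nits")
```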
 
Please find the link below for your reference. By the way, I was talking about true HDR10 & Dolby Vision, not so-called HDR & Dolby Vision.
Yeah, the following highlighted words are mine, right?
(attachment 36669)
You were constantly using the word "HDR" instead of "HDR10", and now you say you were talking about "HDR10"? Why do I have to assume you meant "HDR10" when you wrote "HDR"? If HDR and HDR10 were not two different things, I would happily assume you meant "HDR10" when you wrote "HDR". But HDR and HDR10 are two different things. So if you write "HDR", I will read "HDR", and if you write "HDR10", I will read "HDR10". PERIOD.
Please find the link below for your reference. By the way, I was talking about true HDR10 & Dolby Vision, not so-called HDR & Dolby Vision.
My reference for what? Where in the article is it mentioned that HDR10 requires at least 1000 nits, and where in the article do you find any mention of Dolby Vision requiring 3000+ nits, as claimed in your initial comment above? NO. Dolby Vision doesn't require you as an end user to have a 3000+ nit or even a 4000-nit display. Dolby Vision just uses an up-to-4000-nit display (the Dolby Pulsar) as a REFERENCE monitor to grade other displays, other video content, or both. The article you've provided clearly says that Dolby Vision content is MASTERED with up to 12 bits and 4000-nit displays.
(attachment 36670)
What it says is that Dolby Vision is MASTERED at around 4000 nits and is future-ready for displays beyond 1000 nits. So these figures are for the GRADING and REFERENCE displays used, not for end users. They compare these reference displays with consumer displays. There is no absolute brightness requirement in nits for Dolby Vision or even HDR10; better numbers are simply better. Also, Dolby Vision supports UP TO (= maximum) a 12-bit panel, and it also supports 10-bit panels, while HDR10 supports only 10-bit, not 12-bit.

It seems you didn't even properly read the article that you are asking others to read.

This is my final message to you. I am not going to waste my time educating you when you yourself are unwilling to learn.

DON'T SPREAD MISINFORMATION.
 
There you go. I was comparing with the Vu TV, which has only an 8-bit panel, yet they say it's Dolby Vision. Just look above at what is mentioned for Dolby Vision: it supports 12-bit maximum and also works with 10-bit, but do you think it works with 8-bit as well? I was trying to explain the real difference, and that true 10-bit or 12-bit is needed to enjoy complete HDR10 & Dolby Vision content on screen, nothing more. By the way, I was talking about HDR10, not HDR.
 
Go to hell, man.
 
Guys, I have been following your thread with interest. I would request all gentlemen here to counter-argue with decorum and not breach any rules of the forum, in the hope that this discussion continues with civility.
 