Author Topic: The conundrums of calibrating a 4K display playing HDR content...  (Read 3256 times)

Offline desray

  • Global Moderator
  • Trade Count: (+4)
  • Full Member
  • *
  • Posts: 18598
  • Bring the "Cinema experience" back home...
I discovered that once again, we are screwed by the 4K UHD manufacturers and the UHD consortium when it comes to getting the best picture quality possible. As of this post, there is no hardware (i.e. no 4K UHD TV or projector) able to reproduce the full BT.2020 (Rec.2020) wide color gamut. Think of BT.2020 the way you think of an MKV media container, which houses various codecs and carries metadata for the video, audio and subtitles. In every sense of the word, BT.2020 is a "container", except that it acts as a repository for color gamuts rather than codecs. At present, BT.2020 is the de facto standard for HDR content. With 10-bit color depth you get HDR10, while the more advanced and refined Dolby Vision makes use of 12-bit color. Higher bit depth translates to better color gradation, richer color palettes, and practically eliminates color banding when viewed on a bigger screen of 100" and above. This is all good for future-proofing, but once again, just like the earlier state of affairs between HDCP 2.2, HDMI 2.0a (needed to benefit from HDR content) and the hardware specs of what constitutes true 4K (i.e. 4096 x 2160 vs 3840 x 2160), the confusion only delays the adoption rate. In the last quarter, we saw a big push and an amalgamation of standards, hardware and software specs coming together in the new de facto Ultra HD Premium standard released by the UHD Alliance, to give consumers the confidence of knowing what they are paying for with their hard-earned money.

For videophiles, the "new" kid on the block is not 4K anymore but the WCG and HDR that come with it. Yes, I have sold off my Sony VW500ES native 4K projector for an HDR-compatible one, the JVC X7000. In my opinion, nothing beats a true native 4K display, as there is no "artificial" upscaling algorithm at play. The biggest bugbear for the JVC 4K line-up has to be its "faux-4K" approach using its proprietary 4K e-Shift. This is the 4th generation of e-Shift, and with each successive refinement e-Shift has improved by leaps and bounds in reproducing near-4K quality images - and no, it is still NOT anywhere near true native 4K. But that's not my gripe. My gripe lies with the lack of HDR content, and heck, a basic 4K HDR calibration disc is nowhere in sight. This makes things very annoying and frustrating for videophiles who want the best visual experience from their display, ensuring color accuracy, gamma and so on. For Rec.709 content we have no issue, as there are plenty of calibration discs around, the likes of Avia, DVE, Spears & Munsil and AVSHD709, to name a few. But there is no 4K calibration disc mixing the BT.2020 color gamut, 10/12-bit color depth and HDR. This precludes the possibility of calibrating our 4K displays for HDR content. This is unacceptable, as HDR content triggers a whole new level of things here. You simply can't use a calibration disc authored to Rec.709 standards to calibrate a dual-function display (switching between Rec.709 and BT.2020 (HDR) content) without manually making finer adjustments or switching to a different profile altogether to get the right "fit". The difference lies in the gamma curve, as a result of the change in color luminance and intensity when projecting a wide color spectrum once we go into the realm of the DCI-P3 standard.
As long as there are no standard calibration patterns for HDR content, there is no way to correct the gamma, grayscale tracking, white balance and so on. All of this affects the overall viewing experience for HDR content, which is supposed to be the preferred way of viewing your 4K UHD Blu-ray titles. The end result is often disappointing, with comments like "picture too dim" and "black crush", and the image lacks contrasty punch, as we lose tons of detail if there are not enough lumens to produce it. The only way to get a better viewing experience for HDR content now (at least for me in the projector realm) is to use the most primitive method - i.e. eyeballing, by switching between Rec.709 content and HDR with BT.2020 (in DCI-P3) content to sift out the finer details and come to terms with what constitutes a good "trade-off" between shadow detail and black levels, and vice versa.

I am torn between a gorgeous-looking 1080p picture, upscaled to 4K with a perfect Rec.709 calibration, and lacklustre performance when switching to HDR content. The irony of it all is that I switched to an HDR-capable projector for the sake of WCG and HDR, and yet I am somewhat "crippled" by the fact that I can't get it to work in perfect condition and have to resort to the "eyeballing" method to calibrate it. Nevertheless, I am getting fairly good results viewing HDR content now, after a few weeks of tweaks and reading through the experiences, comments, tips and recommendations of other users on AVSForum and other AV communities.


 :-\
« Last Edit: April 02, 2017, 11:01 by desray »

Offline desray

  • Global Moderator
  • Trade Count: (+4)
  • Full Member
  • *
  • Posts: 18598
  • Bring the "Cinema experience" back home...
Chanced upon this link from AVSForum. This chap is offering some HDR10 test patterns. I will download them and do some basic measurements later. Those interested can visit: http://www.avsforum.com/forum/139-display-calibration/2463698-r-masciola-s-hdr-10-uhd-test-patterns.html



He has an official website of his own. Looks professionally done! Link: http://rmadvancedcaldisc.com/rm-uhdhdr-10.html
« Last Edit: April 02, 2017, 10:59 by desray »

Offline desray

  • Global Moderator
  • Trade Count: (+4)
  • Full Member
  • *
  • Posts: 18598
  • Bring the "Cinema experience" back home...
Reviewed his manual PDF and I am convinced that it will help me in some ways with calibration for HDR10 content. Hence I purchased the digital copy of the test patterns at US$25. For those who want to get a copy from me at a discounted rate, pls PM me. Otherwise you can go to his website to make the purchase. This is an invaluable tool for videophiles jumping on the 4K UHD (HDR) bandwagon.

Offline desray

  • Global Moderator
  • Trade Count: (+4)
  • Full Member
  • *
  • Posts: 18598
  • Bring the "Cinema experience" back home...
Here's another post that provides some HDR10 test patterns and, more importantly, it's free. For those looking to do some HDR testing on your OLED or projector display, give it a try.

Link: https://www.avsforum.com/forum/139-display-calibration/2943380-hdr10-test-patterns-set.html



Offline desray

  • Global Moderator
  • Trade Count: (+4)
  • Full Member
  • *
  • Posts: 18598
  • Bring the "Cinema experience" back home...
De-mystifying HDR

What is EOTF?
To put it in its simplest form, the Electro-Optical Transfer Function (EOTF) refers to the gamma setting most of us are familiar with. For HDR it is largely based on the SMPTE standard ST 2084. Recall the 2.2, 2.3, 2.4 etc. figures? Yep, that's right! EOTF is essentially a fancy term for "gamma". Now that you know it is gamma, you probably also know the fundamentals and its weighted importance in affecting the overall PQ of your display (be it an OLED TV or a projector).

Getting gamma right determines whether you can see shadow details in dark scenes and, likewise, whether bright scenes avoid clipping. For HDR, the EOTF (gamma setting) can be a little different from standard BT.709 due to the use of Wide Color Gamut (WCG), usually in the range of DCI-P3 or wider (e.g. BT.2020). Full BT.2020 is not achievable at the present moment for most conventional OLED TVs and projectors, hence DCI-P3 is usually used as the yardstick. Any display that covers at least 80% of the DCI-P3 colorimetry is considered capable and efficient enough to reproduce an HDR-quality picture. Remember that DCI-P3 is a subset of BT.2020 colorimetry, which is why, when we talk about HDR, most industry leaders tend to reference BT.2020 instead of DCI-P3 (Digital Cinema).

If EOTF is the same as gamma, we can of course expect the same principles of gamma to apply here. While both EOTF and gamma try to achieve the same "outcome" - e.g. great detail without clipping or crushed blacks (losing shadow details in dark scenes) - EOTF works in a different way. For the very first time, the engineers working on HDR wanted to do more than give gamma a "relative" meaning; they wanted to attach an "absolute" or specific value to how bright or dark a picture or an object should appear within a movie frame. For instance, a shimmering light in a distant corner can be assigned a value of say "600 nits" while the dark background can be assigned say "200 nits". This is contrary to a normal gamma setting, where the movie content does not play an "active" role in the overall PQ during playback and instead leaves it to the display (OLED TV or projector) to determine what you can see (after proper calibration).

So EOTF sounds cool, since it can make the little nuances in detail more "revealing" compared to old-school gamma. Well, it is not all sunshine and rainbows with this new way of setting gamma, as most OLED TVs and projectors are unable to reproduce the "required" level of nits to display HDR content optimally. The culprit? Well, you guessed it: not enough illumination (light source intensity) to peak at, say, a scene authored at 1,000, 4,000 or even as high as 10,000 nits! Projectors are the worst at reproducing HDR content faithfully; sometimes they even have problems hitting the 1,000-nit ballpark. So projectors are out, and I should get an OLED TV (or an LED TV with full-array local dimming) to maximize contrast, unless a laser-based projector solves the problem? Not necessarily... while most projectors can't hit 4,000 nits and beyond, 1,000 nits is attainable, with limitations.
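Since the HDR EOTF here is the SMPTE ST 2084 PQ curve, it is easy to see why displays run out of headroom: the curve maps code values to absolute nits, all the way up to 10,000. Here is a minimal Python sketch of the ST 2084 EOTF (the constants come from the published standard; the script itself is just my own illustration):

```python
# SMPTE ST 2084 (PQ) EOTF: normalized code value (0.0-1.0) -> absolute nits.
m1 = 2610 / 16384        # 0.1593017578125
m2 = 2523 / 4096 * 128   # 78.84375
c1 = 3424 / 4096         # 0.8359375
c2 = 2413 / 4096 * 32    # 18.8515625
c3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(code: float) -> float:
    """Display luminance in nits for a normalized PQ code value."""
    v = code ** (1 / m2)
    return 10000.0 * (max(v - c1, 0.0) / (c2 - c3 * v)) ** (1 / m1)

print(pq_eotf(0.0))   # 0.0 nits (absolute black)
print(pq_eotf(0.5))   # ~92 nits: the bottom half of the code range is all dim levels
print(pq_eotf(1.0))   # 10000.0 nits, far beyond what any consumer display can emit
```

Notice how half the signal range only reaches roughly 92 nits; that is why a display that mis-tracks this curve crushes shadows or clips highlights so visibly.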

What is “tone-mapping” in HDR?
This is where tone-mapping comes in to "fill the gap" for projector users. Because of the projector's inherent weakness - i.e. not enough illumination to reproduce HDR content faithfully on the display - the engineers came up with a "compromise" called "tone-mapping". It is really a less "dirty" word for "compression". Yes, you read that right: tone-mapping is a form of compression, much like a 4K clip being downscaled or compressed to fit into a video container (e.g. MKV) using an H.265/HEVC encoder. The result? A somewhat convincing 4K picture without too much sacrifice in overall PQ, save for some missing detail and texture. So now you know tone-mapping is essentially compression, but what does it have to do with the whole EOTF story? Well, this is the interesting bit. Lest you forget, EOTF correlates to gamma, which determines how contrasty (i.e. how bright and how dark) the movie looks. That contrasty look is what elevates the perceived image and thus the overall movie PQ. A dark scene with a bright, colorful object tends to have the "pop-out" effect that most OLED TVs and projectors like JVC's D-ILA or Epson's transmissive 3LCD displays are able to produce. The problem is "controlling" the amount of illumination (nits) in each scene so that all the low-, mid- and high-tone details and highlights can still be observed.

Recall I mentioned that not all displays can reproduce 1,000 nits faithfully, let alone 4,000 nits or higher; this is where "tone-mapping" the EOTF to produce a customized EOTF curve (based on the PQ curve) comes into play. Some projectors, like the latest JVC N series, come with an auto tone-mapping feature built into the display itself. It retrieves the nit values from the Mastering Display metadata embedded in 4K UHD media - e.g. a 4K UHD Blu-ray disc, or Netflix or iTunes 4K movie content - namely MaxCLL (Maximum Content Light Level) and MaxFALL (Maximum Frame-Average Light Level). These light-level metadata allow the projector or TV to manipulate the various movie scenes "on-the-fly" to achieve the closest look and feel to the director's intent for that particular movie.

But there is a caveat and a limitation to all of this... As I mentioned earlier, there is a "compromise" or trade-off to be made. Since the display is unable to achieve high luminance (nits) due to hardware limitations, we have to work around it by matching, or in this case "mapping", the source material to the display's capability. For example, a bright outdoor scene in a movie can easily be mastered at say 2,000 nits, but the display may only manage say 800 nits max. What tone-mapping does is "re-map" the EOTF (gamma) values using the MaxCLL and MaxFALL figures to make the overall scene "darker", so that all the other details in the scene can still be preserved. What it sacrifices is brightness for details. A highly efficient and capable OLED TV or projector will try to keep the re-mapped tone curve as close to the PQ curve as possible without sacrificing lumens (brightness).
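To make that trade-off concrete, here is a deliberately simplified tone-mapping sketch. This is my own toy roll-off curve, not any manufacturer's algorithm (the actual JVC/Panasonic curves are proprietary and far more sophisticated): everything below a knee passes through untouched, and the range between the knee and MaxCLL is compressed into the display's remaining headroom.

```python
def tone_map(scene_nits: float, max_cll: float, display_peak: float) -> float:
    """Toy tone-mapping curve: preserve low/mid tones, roll off highlights
    so that content up to MaxCLL still fits under the display's peak."""
    knee = 0.75 * display_peak           # below this, pass through unchanged
    if scene_nits <= knee:
        return scene_nits
    # compress [knee, MaxCLL] into the remaining [knee, display_peak] headroom
    excess = (scene_nits - knee) / (max_cll - knee)
    return knee + (display_peak - knee) * excess ** 0.5

# An 800-nit projector showing a disc mastered with MaxCLL = 4,000 nits:
print(tone_map(500.0, 4000.0, 800.0))    # 500.0 - mid-tones untouched
print(tone_map(2000.0, 4000.0, 800.0))   # ~728 - highlight dimmed, not clipped
print(tone_map(4000.0, 4000.0, 800.0))   # 800.0 - the hottest highlight lands at peak
```

The point of the MaxCLL/MaxFALL metadata is exactly to feed a value like `max_cll` into a curve like this, so a 4,000-nit master darkens gracefully instead of hard-clipping at 800 nits.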

Which one to get? Projector or OLED TV?!
So the million-dollar question here is... is a projector still worth buying, now that we know projectors are rated worse than their TV brethren when it comes to producing the rich, vibrant details of HDR content? For now, I can safely say a projector using a traditional lamp-based light source is not what you should be getting if all your movie collections these days are 4K UHD with HDR10 and Dolby Vision; you are probably better off getting an OLED TV for "now". Even hybrid laser-based 4K projectors (not talking about the ultra-short-throw UST projectors meant for the general market) like Sony's VW760ES, or JVC's own laser-based RS4500 (the Z1 in some regions), have their limitations in reproducing 4K HDR content faithfully. But we cannot deny the attractiveness of using a projector for everything else, meaning if you still want a big cinematic feel close to a cinema viewing, or you still have a lot of SDR content (BT.709) like normal Blu-rays, then the choice is harder to make.

 ;)

Offline desray

  • Global Moderator
  • Trade Count: (+4)
  • Full Member
  • *
  • Posts: 18598
  • Bring the "Cinema experience" back home...
Examples of scenes showcasing the benefits of tone-mapping on a projection display.









Stock images derived from a Panasonic Japan website featuring the HDR Optimizer at work here for their UB9000.



Online kaydee6

  • Trade Count: (0)
  • Full Member
  • Posts: 592
The availability of tone mapping lies on the player and not the projector?

Offline desray

  • Global Moderator
  • Trade Count: (+4)
  • Full Member
  • *
  • Posts: 18598
  • Bring the "Cinema experience" back home...
The availability of tone mapping lies on the player and not the projector?

No, it can be implemented in either the source or the display. In this case, the source is the Panny player.

I've been trying to get my hands on a UB820, but importing it is very expensive, not to mention the limited stock in the countries where it is sold. Panny Singapore informed me that they will not even consider bringing in Blu-ray players at all. Sigh.


Sent from my iPhone using Tapatalk

Offline ronildoq

  • Trade Count: (+2)
  • Full Member
  • Posts: 1469
(Quoting desray's "De-mystifying HDR" post above.)

Very nice informative write-up desray! And you are correct, even on the OLED I could only calibrate up to 70% for gamma in HDR mode; beyond that is meaningless, as the TV itself is not able to produce the required nits as authored on the 4K disc at 4,000 nits brightness. The tone-mapped encoded source Blu-ray files (Tekno3d), however, really do a great job when the display kicks into HDR mode. After watching these files, I pulled the handbrake on upgrading to a PJ. I was demoing a few PJs with the Mrs. She looked at me and went, "Are u sure u wanna upgrade to a PJ? The picture quality is so far off compared to the OLED, will u be happy, or upgrade again in another year?" I just paused, because at this stage I realised that though I'll enjoy a bigger picture, I'll not be contented with the HDR quality from the PJ. Even the Mrs is able to tell the difference. Then suddenly I saw the JVC N series with tone-mapping features and this piqued my interest again. Too bad you will not be reviewing this, was looking forward to that actually.

Offline desray

  • Global Moderator
  • Trade Count: (+4)
  • Full Member
  • *
  • Posts: 18598
  • Bring the "Cinema experience" back home...
For those using Apple TV 4K, HDR10 is one big mess at the moment. How so? It is the problem of Dolby Vision (DV) trying to derive the HDR10 base layer, and Apple still hasn't found a solution to get the correct static HDR metadata (MaxCLL and MaxFALL) passed to the display for proper tone mapping. Sigh...

Read here for info: https://www.avsforum.com/forum/39-networking-media-servers-content-streaming/3027066-atv4k-faulty-dv-hdr10-conversion.html
« Last Edit: February 10, 2019, 21:56 by desray »

Offline desray

  • Global Moderator
  • Trade Count: (+4)
  • Full Member
  • *
  • Posts: 18598
  • Bring the "Cinema experience" back home...
HDR10 and HDR10+

I just read that HDR10+ may be a revolution and perhaps even better than Dolby Vision. HDR10+ adds a dynamic metadata layer which allows frame-by-frame adjustments to the brightness (nits) of the display. What this all means is that we will get a more dynamic range of colors, highlights and details across different scenes, instead of the static metadata (MaxCLL and MaxFALL) that the current HDR10 (SMPTE ST 2086) is based on.

The takeaway here is that HDR10+ uses "dynamic" instead of "static" metadata. The Pioneer UDP-LX500/800 will get a firmware update for HDR10+ sometime in the 2nd quarter of this year.
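A back-of-envelope illustration of why dynamic metadata matters (the numbers below are hypothetical, chosen only to show the contrast): with static HDR10 metadata, one film-wide MaxCLL forces the same compression onto every scene, while per-scene peaks let dim scenes pass through untouched.

```python
def compression_needed(scene_peak_nits: float, display_peak: float) -> float:
    """Fraction by which highlights must be squeezed to fit the display."""
    return max(0.0, 1.0 - display_peak / scene_peak_nits)

display_peak = 800.0                    # a typical bright projector setup
film_max_cll = 4000.0                   # static HDR10: one value for the whole film
scene_peaks = [300.0, 1200.0, 4000.0]   # HDR10+-style: per-scene peak values

static = [compression_needed(film_max_cll, display_peak) for _ in scene_peaks]
dynamic = [compression_needed(p, display_peak) for p in scene_peaks]
print(static)    # every scene squeezed as if it hit 4,000 nits
print(dynamic)   # the 300-nit scene needs no squeezing at all
```

The 300-nit scene is the interesting case: under static metadata it gets dimmed for no reason, which is exactly the "picture too dim" complaint from earlier in this thread.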



With Samsung* now out of the Blu-ray player market, we have to rely on Oppo and Panasonic to push for HDR10+. Watch out for 20th Century Fox and Warner Bros titles in the coming months (should be in the 2nd quarter or later), since the HDR10+ firmware for most eligible players will only arrive sometime in the 2nd quarter or 2nd half of this year.

* Samsung did not actually abandon HDR10+, it simply diverted its attention to its mobile division. See the upcoming Samsung flagship, the Galaxy S10+. Source: https://www.whathifi.com/news/samsung-galaxy-s10-launches-with-hdr10-plus-and-a-headphone-jack




Who's the biggest culprit of poor 4K HDR images?!
The biggest problem for any tone-mapping 4K Blu-ray player right now is Dolby Vision. Dolby Vision is Dolby's product, and although it is built on an HDR10 baseline, the Dolby Vision layer kind of screws up the static metadata (MaxCLL & MaxFALL), which proves detrimental to the overall HDR image quality. The worst affected lot are projector users. :(

The irony of this whole pursuit of HDR purity, with Dolby Vision as the crown jewel, is that most tone-mapping technology at present is unable to tap into the Dolby Vision metadata layer. Projector users are again at the losing end, as most if not all projectors, including JVC's and even Sony's flagships, don't support Dolby Vision. So HDR10+, which allows the player to analyze each frame dynamically before porting it to the projector, could be something worth looking forward to...
« Last Edit: February 22, 2019, 18:16 by desray »

Offline desray

  • Global Moderator
  • Trade Count: (+4)
  • Full Member
  • *
  • Posts: 18598
  • Bring the "Cinema experience" back home...
I stand corrected... JVC N series projectors will NOT benefit from HDR10+ processing by the source - i.e. 4K Blu-ray players that support it!

It appears that HDR10+ dynamic metadata will NOT be supported by the new JVC N series projectors after all. I was corrected by AVSForum members (who have more knowledge in this aspect): an HDR10+ encoded 4K disc will continue to send only "static" metadata (MaxCLL and MaxFALL) to the projector, even though the source (i.e. 4K UHD players like the Oppo and Panasonic UB820) can process HDR10+ "dynamically"... it sounds kind of weird if you ask me, but for the moment I was *corrected* by some of the members that it doesn't. So I will take it that JVC N series projectors will NOT benefit from the upcoming HDR10+ encoded 4K Blu-ray titles. If so, then don't waste your time and money on a 4K player that can play back HDR10+ if you are going to use a JVC N series projector.


 :'( :'( :'(
« Last Edit: February 23, 2019, 09:59 by desray »

Offline badbad2000

  • Trade Count: (0)
  • Full Member
  • Posts: 1106
https://store.portrait.com/meters/spectracal-c6-hdr2000-with-add-ons.html

This meter should be able to calibrate HDR up to 2,000 cd/m²


Sent from my iPhone using Tapatalk

Offline sevenz

  • Trade Count: (+15)
  • Full Member
  • Posts: 2162
  • home theatre lover :D
Thanks for all these desray. Super useful read b4 i start calibrating HDR on my OLED.

So many technical things to learn for display calibration. Challenging...

Offline desray

  • Global Moderator
  • Trade Count: (+4)
  • Full Member
  • *
  • Posts: 18598
  • Bring the "Cinema experience" back home...
Thanks for all these desray. Super useful read b4 i start calibrating HDR on my OLED.

So many technical things to learn for display calibration. Challenging...

For calibration of a 4K HDR display, you just need the appropriate test patterns mastered in HDR at the correct peak brightness (1,000 - 4,000 nits), depending on the display's capabilities. There are a lot of 4K HDR industry standards, but the more important ones affecting consumer-grade products are the SMPTE standards, with ST 2084 (the PQ curve), ST 2086 (static metadata) and ST 2094 (dynamic metadata) being the prominent ones.
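For anyone curious which code values those patterns actually carry, the inverse of the ST 2084 curve tells you the 10-bit code for a given target luminance. A quick sketch (the constants are from the standard; note this is a simplification that prints full-range 0-1023 codes, whereas actual disc video is limited range, 64-940):

```python
# Inverse SMPTE ST 2084 (PQ) curve: target nits -> full-range 10-bit code value.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_code_10bit(nits: float) -> int:
    """Full-range 10-bit code value for a target luminance in nits."""
    y = (nits / 10000.0) ** m1
    code = ((c1 + c2 * y) / (1 + c3 * y)) ** m2   # normalized 0..1
    return round(code * 1023)

for target in (100, 1000, 4000, 10000):
    print(f"{target} nits -> code {pq_code_10bit(target)}")
```

A useful sanity check: a 1,000-nit window pattern sits at roughly 75% of the PQ code range, so if your meter reads far off the target at that stimulus, the display's EOTF tracking (or its tone mapping) is off.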

For most projector users, we can barely hit 1,000 nits, which is why a "tone mapping" feature is a must-have for all low-brightness displays. Without proper tone mapping, transitions from bright scenes to dark scenes and vice versa can make your eyes squint in the brighter scenes and leave you wanting more from the black levels in the darker scenes.


Sent from my iPhone using Tapatalk