PDA

View Full Version : Encoding 4K HDR 4:2:0 10bit BT.2020 (continuation)


surami
28th November 2016, 19:55
This is the continuation of the Encoding 4K HDR 4:2:0 10bit BT.2020 (http://forum.doom9.org/showthread.php?t=172724) thread.

The last valid post there was #202 (http://forum.doom9.org/showthread.php?p=1783773#post1783773); please continue the discussion here...

kabelbrand
1st February 2017, 19:28
Are there any Hybrid Log-Gamma HDR test sequences or reference encodes out there?

The BBC seems to have an HLG test clip available in iPlayer on some smart TVs:
http://www.bbc.co.uk/rd/blog/2016/12/hdr-4k-uhd-iplayer-trial-planet-earth

visionplushdr
8th February 2017, 03:23
Indiana Jones and the Last Crusade VISIONPLUSHDR-1000 (10,000-nit grading) sample at 1850 nits madVR output - BT.2020 display calibration colorspace, pure power curve gamma 2.60.

https://www.youtube.com/watch?v=zgWY4mCre84

The Martian VISIONPLUSHDR-1000 (10,000-nit grading) sample at 1700 nits madVR output - BT.2020 display calibration colorspace, pure power curve gamma 2.60.

https://www.youtube.com/watch?v=T8thnOe8KxY

Wanted to show it.

The videos are recorded with BT.2020 colorspace output, using saturation in madVR to expand the color so it can be seen on YT.

The videos are graded in BT.2020, so you can choose the BT.709 colorspace in madVR and get default output for any TV; there saturation is not even needed, it stays at 0, the default.

When you use DCI-P3 you need one of those newer TV sets with 97% or more coverage, and there you also don't apply saturation. If you have a panel with less DCI-P3 reproduction, saturation helps again.

And for native BT.2020 you must always apply saturation, because no existing panel can fully reproduce the 2020 palette.

Applying saturation in the player can sound odd, but the expanded native BT.2020 colorspace can look really good compared to "native" 709/DCI.

HDR Blu-rays use a DCI-P3 conversion via metadata, which restricts playback to DCI.
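The colorspace juggling described above (BT.709 vs DCI-P3 vs BT.2020 output) boils down to a 3x3 matrix on linear RGB. As a rough sketch, assuming nothing about madVR's internals (the function names here are my own), the conversion matrix can be derived from each gamut's published chromaticities:

```python
import numpy as np

# CIE xy chromaticities for the primaries, from the BT.709, Display-P3
# and BT.2020 specifications; all three share the D65 white point.
BT709  = dict(r=(0.640, 0.330), g=(0.300, 0.600), b=(0.150, 0.060))
P3_D65 = dict(r=(0.680, 0.320), g=(0.265, 0.690), b=(0.150, 0.060))
BT2020 = dict(r=(0.708, 0.292), g=(0.170, 0.797), b=(0.131, 0.046))
D65 = (0.3127, 0.3290)

def xy_to_XYZ(x, y):
    """Chromaticity -> XYZ column, with Y normalized to 1."""
    return np.array([x / y, 1.0, (1.0 - x - y) / y])

def rgb_to_xyz_matrix(prim, white=D65):
    """Standard derivation: scale the primary columns so RGB (1,1,1)
    maps exactly to the white point's XYZ."""
    P = np.column_stack([xy_to_XYZ(*prim[c]) for c in "rgb"])
    S = np.linalg.solve(P, xy_to_XYZ(*white))
    return P * S  # scale each column by its factor

def gamut_convert_matrix(src, dst):
    """Linear RGB in `src` primaries -> linear RGB in `dst` primaries."""
    return np.linalg.inv(rgb_to_xyz_matrix(dst)) @ rgb_to_xyz_matrix(src)

M = gamut_convert_matrix(BT709, BT2020)
# BT.709 pure red lands well inside the BT.2020 gamut (desaturated):
print(M @ np.array([1.0, 0.0, 0.0]))
# White survives unchanged, since both spaces use D65:
print(M @ np.ones(3))
```

Since BT.709's primaries sit inside BT.2020's, every legal 709 value maps strictly inside the 2020 gamut, which is why 709-derived content carried in a 2020 container looks undersaturated unless a boost like the one discussed is applied.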

Asmodian
8th February 2017, 17:40
Why would you use BT.2020 output from madVR?

If I understand what you are doing you are rendering into BT.2020 and then arbitrarily increasing the saturation so that it looks better when YT pretends it is BT.709. It would be a lot more correct to render to BT.709 in the first place.

If your display doesn't support a color space don't render into that color space.

visionplushdr
8th February 2017, 18:02
Why would you use BT.2020 output from madVR?

If I understand what you are doing you are rendering into BT.2020 and then arbitrarily increasing the saturation so that it looks better when YT pretends it is BT.709. It would be a lot more correct to render to BT.709 in the first place.

If your display doesn't support a color space don't render into that color space.

My display does 100% DCI coverage (PRM-4220). I don't render in 2020; I do HDR transfers which are 2020, just like the Blu-rays, then convert to DCI-P3 because no displays exist that cover 2020.

709 is a limited gamut, which is why HDR content works in DCI. Using 2020 with saturation can look a lot better than 709, and it depends on how you grade. Not every grading is the same; grading is art.

In this particular video there's a lot of yellow output for Indiana Jones when applying saturation to the 2020. When it's played in 709 the colors are not as natural.

When you watch a 709 movie the palette is completely limited, and you get used to seeing "red" skin. That doesn't happen when you use DCI or 2020 with saturation.

Overall, color is more natural in a wider colorspace. When your panel can't output the full palette it will output what it can, but compared to the 709 colorspace, when the grading is correct the output is much closer to natural colors.

kolak
8th February 2017, 18:45
One thread has been closed because of you and now you are back with your "HDR masters".
Stop spamming, we have already seen your work.

visionplushdr
8th February 2017, 19:24
One thread has been closed because of you and now you are back with your "HDR masters".
Stop spamming, we have already seen your work.

Where does it say I can't post? What work have you seen? YouTube? That's great. I will continue to show my work, as this is a forum for showing work in digital video.

This is the HEVC HDR thread and that is what I do.

Your irony makes you look dull when you have no arguments whatsoever about my work. You don't seem to add anything interesting to the thread either, so I recommend you leave.

nevcairiel
8th February 2017, 21:19
This thread is about the technical aspects of encoding HEVC HDR. Your discussion about taste can go into separate threads, thanks.

visionplushdr
8th February 2017, 21:56
This thread is about the technical aspects of encoding HEVC HDR. Your discussion about taste can go into separate threads, thanks.

What taste? There's no taste in encoding HEVC HDR. And I have seen zero posts about encoding 4K HDR on here with proof of the work either. Do you encode in 4K HDR? Can you show what you do? People here argue about other people's work, but I don't see any work from you, or from anybody in this topic.

Anyone can post "technical" copy-paste from other websites/forums/blogs on here, but it goes nowhere when nobody here except me shows their work and how the final video looks with your technical aspects.

kolak
8th February 2017, 23:41
Here we go again- so xxx boring.

Do you use LAV Filters? Nevcairiel is the LAV developer, so compared to you he has done quite an amazing job.

WhatZit
9th February 2017, 00:20
Guyz! Guyz! MusicVideos... err... sorry... "VisionPlus" is right! :stupid:

You CAN convert 8-bit BT.709 sources into PERFECT 4K BT.2020 for those times when overly-creative hollywood noobs can't even get the colours right themselves!

None of you "Doomers" with your extensive industry qualifications or decades of experience would EVER have thought of this!

Here's how! Read it and weep...

https://image.ibb.co/kgqMka/Colour.jpg

P.S. I'm the best!

P.P.S. This was sarcastic, if you couldn't tell :sly:

visionplushdr
9th February 2017, 01:02
Guyz! Guyz! MusicVideos... err... sorry... "VisionPlus" is right! :stupid:

You CAN convert 8-bit BT.709 sources into PERFECT 4K BT.2020 for those times when overly-creative hollywood noobs can't even get the colours right themselves!

None of you "Doomers" with your extensive industry qualifications or decades of experience would EVER have thought of this!

Here's how! Read it and weep...

https://image.ibb.co/kgqMka/Colour.jpg

P.S. I'm the best!

P.P.S. This was sarcastic, if you couldn't tell :sly:

So you are another kid?

Do you want me to post images, but in the correct scenario? What is "extensive industry qualifications or decades of experience" in HDR?

I will post some examples to prove you wrong if you want; give me some time.

visionplushdr
9th February 2017, 01:09
Here we go again- so xxx boring.

Do you use LAV filters? Nevcairiel is LAV developer, so he compared to you done quite an amazing job.

I didn't know LAV Filters was an HDR grading tool. Thanks for the tip. I believe if you keep arguing with me this will never end; are you rethinking your behavior?

visionplushdr
9th February 2017, 01:23
Guyz! Guyz! MusicVideos... err... sorry... "VisionPlus" is right! :stupid:

You CAN convert 8-bit BT.709 sources into PERFECT 4K BT.2020 for those times when overly-creative hollywood noobs can't even get the colours right themselves!

None of you "Doomers" with your extensive industry qualifications or decades of experience would EVER have thought of this!

Here's how! Read it and weep...


P.S. I'm the best!

P.P.S. This was sarcastic, if you couldn't tell :sly:

This is what I'm working on.

First SDR 709, then HDR DCI.

https://extraimage.net/images/2017/02/09/b8da95838b0b4fdb08ade0f592b5f8fe.md.jpg (https://extraimage.net/image/2Mca)
https://extraimage.net/images/2017/02/09/583581f3aef093251632a1dcecca0ce8.md.jpg (https://extraimage.net/image/2M7o)

I'm not using your software though. And I didn't know the industry had decades of experience in HDR, especially since they don't know how to do it even today. The HDR10 standard is crap, for example: completely dim in daylight scenes and poorly nit-graded as well. So?

And find a way to post images as thumbnails, so you're a better doomer.

Updated:

https://extraimage.net/images/2017/02/09/9c5490795ebe62a19a193d78a15e8c78.md.jpg (https://extraimage.net/image/2MOH)
https://extraimage.net/images/2017/02/09/1b357bf35ce43792243c623bfb37a8e7.md.jpg (https://extraimage.net/image/2MOF)

I don't know why this forum can't handle people showing their work. You're supposed to do HDR here, right? Then why does nobody show their work? I would love to see what you are doing in the HDR field.

I really don't care about the people saying "proper" HDR grading must be made from 10-bit. Why? HDR10 is crap. The colorspace of the master doesn't even matter when the HDR image is low-nits and dim in daylight scenes; it's a completely broken HDR standard.

Next will be "dynamic HDR metadata" (ST 2094), which can also be done without any metadata. The industry is using HDR as pure marketing.

And here's a 1200-nit output in DCI-P3 (a higher nits grade than the last images):

https://extraimage.net/images/2017/02/09/6ff084d3ac08016730d8fc4843d955f4.md.jpg (https://extraimage.net/image/2MSn)

You can see how the HDR "pops" a lot more.

1500 nits:

https://extraimage.net/images/2017/02/09/bd433a1b004376bd6c510be2245a52fe.md.jpg (https://extraimage.net/image/2MS6)

The colors are fine too; I didn't really understand what this guy meant earlier with the ironic sentence "overly-creative hollywood noobs can't even get the colours right themselves!"

What a world.

visionplushdr
9th February 2017, 04:11
Guyz! Guyz! MusicVideos... err... sorry... "VisionPlus" is right! :stupid:

P.S. I'm the best!

P.P.S. This was sarcastic, if you couldn't tell :sly:


Please do the same as I do with Avidemux. I await your work. You are a prodigy kid.

I'm also sarcastic.

kuchikirukia
9th February 2017, 05:01
Guyz! Guyz! MusicVideos... err... sorry... "VisionPlus" is right! :stupid:

You CAN convert 8-bit BT.709 sources into PERFECT 4K BT.2020 for those times when overly-creative hollywood noobs can't even get the colours right themselves!

None of you "Doomers" with your extensive industry qualifications or decades of experience would EVER have thought of this!

You call that HDR? This is HDR!

Boring, faded crap:
https://picload.org/image/roiaplir/big_buck_bunny_480p_surround-f.png


VAST IMPROVEMENT!
https://picload.org/image/roiaplii/capture.png

visionplushdr
9th February 2017, 05:07
You call that HDR? This is HDR!

Boring, faded crap:
https://picload.org/image/roiaplir/big_buck_bunny_480p_surround-f.png


VAST IMPROVEMENT!
https://picload.org/image/roiapllw/capture.png

Yes, it looks like there are a LOT OF KIDS on doom9 nowadays. Hopefully there are still grown-ups out there. Now I understand why people argue about the work; does nobody here do anything in HDR?

kuchikirukia
9th February 2017, 05:12
Yes then looks like there are a LOT OF KIDS in doom9 nowadays. Hopefully there are still grown ups out there. Now i understand why people argues on the work, nobody does anything in HDR here?

You certainly don't.

visionplushdr
9th February 2017, 06:40
You certainly don't.

I don't what? This forum is extremely weird.

visionplushdr
9th February 2017, 14:32
An update with a hue/luminance fix.

https://extraimage.net/images/2017/02/09/7fdf665dde1bc2f98a253afcd62f351f.md.jpg (https://extraimage.net/image/2MMj)

A really hard movie to get right in HDR.

This is an 1800-nit output at gamma 2.20.

You're supposed to be able to show work on here; why do people keep arguing? This other new guy says this is not HDR because he doesn't know what HDR is.

Anyway, I hope people begin to show their HDR work on here; it would be really good to see what people do. As you may know, almost nobody does HDR grading for anything, except this guy from Mystery Box, whom I don't like because he clips the details in every single video I have seen, plus the colors are extremely off for natural scenes.

Groucho2004
9th February 2017, 14:48
Rule 14) Multiple registrations are prohibited and are grounds for immediate account deletion.

Any mods around to kick this dimwit out? Just in case anyone missed it, this (https://forum.doom9.org/member.php?u=225059) is his alter ego.

visionplushdr
9th February 2017, 16:27
Any mods around to kick this dimwit out? Just in case anyone missed it, this (https://forum.doom9.org/member.php?u=225059) is his alter ego.

Do you encode in HDR? I'm visionplushdr. What are you talking about?

Can you show something of your work in HDR grading, or are you another guy with nothing to show?

Where is the rule for users with nothing to show who derail an entire thread with garbage?

I believe there should be a rule for that, and you should be banned along with the rest of the kids posting ridiculous HDR-software images without thumbnails.

Groucho2004
9th February 2017, 18:11
Do you encode in HDR?
No.

I'm visionplushdr.
There may be a cure for that.

What are you talking about?
Rule 14, try to read my post again.

Where is the rule for the users with nothing to show which offtopic an entire thread with garbage?
42

you should be banned among the rest of the kids with the HDR software ridiculous images without thumbnails.
You kinda lost it there, drifting off into unintelligible drivel...

visionplushdr
9th February 2017, 18:25
No.


There may be a cure for that.

Rule 14, try to read my post again.

42

You kinda lost it there, drifting off into unintelligible drivel...

Yes, you are kinda lost.

Then rule 42? You are filling the thread with garbage and you don't even do HDR. Double problem.

You are also trolling now.

May I ask how old you are? I see you posting off-topic all the time and threatening people with the mods, with no arguments; you also post in topics you have no knowledge of or interest in.

You also post thinking you sound smarter than others. Behavior like yours is exactly the opposite of an intelligent person's. Just a heads-up.

Groucho2004
9th February 2017, 18:45
Yes you are kinda lost.
I do sometimes feel lost. However, not today.

you don't even do HDR
Well, I'm straight.

You are also trolling now.
Just a bit, it's very tempting when you're around.

May i ask how old are you?
No, you may not.

You also post thinking you sound smarter than others.
Really? If that is the case, it's certainly unintentional. Maybe you're simply not used to someone having a reasonable command of the English language?

visionplushdr
9th February 2017, 18:50
I do sometimes feel lost. However, not today.


Well, I'm straight.

Just a bit, it's very tempting when you're around.

No, you may not.

Really? If that is the case, it's certainly unintentional. Maybe you're simply not used to someone having a reasonable command of the English language?

You do realize you are the one devastating the topic, right?

I wonder how you don't get banned when you argue with people like me who only show work, while you are here trolling, going off-topic, and not encoding anything at all. I doubt you even know how to encode a video in full HD properly.

Let's put you in the right place: you are the one trolling and messing up the thread, not me. You can continue to troll the whole topic; later, people and the moderators will know who the problem is here.

Your trolling attracts the other users who like filling topics with garbage, just for fun, like you just said: "I troll when you're around". So who is the problem, you or me?

visionplushdr
9th February 2017, 19:19
Anyway, I will avoid the kids.

Here's a screenshot at 1000 nits in DCI-P3.

https://extraimage.net/images/2017/02/09/9c2d6ab726d948dc2d0b79071a4c77e3.md.jpg (https://extraimage.net/image/2Me2)

I will post more images later from different scenes.

This is a movie from 1989; the source has heavy noise (not only film grain) on it, which is why it was hard to achieve a good HDR grade.

The ringing present in the image is from madVR's resizers; the image is being upscaled to 4K.

Groucho2004
9th February 2017, 19:30
Do you realize you are the one devastating the topic, right?
Do you realize that everyone who replied to you in this topic is making fun of you or doesn't care about your bullshit? Do you really think that people don't know that you're posting with multiple accounts? How retarded are you?

visionplushdr
9th February 2017, 19:38
Do you realize that everyone who replied to you in this topic is making fun of you or doesn't care about your bullshit? Do you really think that people don't know that you're posting with multiple accounts? How retarded are you?

No, I don't care, because as far as I know nobody here does HDR; nobody is showing anything.
And one guy doesn't even know what HDR is. Why would I care? Are you so small-minded that you care about such things?

I show my work, and if you believe it's bullshit then argue why it's bullshit. I can post image comparisons against the source to shut you up and stand as a warning on how to behave in the future.

You don't know how to grade in HDR. You can't even critique anything on here. You only feel good trolling me because it's the only thing you can do on here.

I believe you have issues, and a lot of them.

Like I said, I will continue to show the HDR grading I do. Some people (A LOT, thousands) love what I do; 3 or 4 trolls from here are not the world, did you know that?

Thousands of people (I mean, more than 10,000) have watched my work and every single one of them liked it, and that's because nobody else is doing this.

If you don't know what to do on here but troll, then go and create a topic called the Groncho Trolling Thread. You may get to know people like you and feel better.

kolak
9th February 2017, 19:54
Thousands of people ( i mean, more than 10.000 ) watched my work and every single people liked it, and that's because nobody else is doing this.

If you don't know what to do on here but trolling then go and create a topic called Groncho Trolling Thread. You may get to know people like you and feel better.


Watched on an illegal site, where you were charging money to watch your crap HDR made from ripped Blu-rays, etc.?

Oops, sorry, I assume you realised that you may get in big trouble and now you don't charge. Why are you not doing it anymore, scared?
Instead your site is full of adverts and some crappy links to scams.

Your website is a joke: "Real 4K content, Rec.2020" etc. You, master of HDR content, sell people this crap knowing that all your work comes from Rec.709 masters and is far from REAL HDR content.

I've already told you: buy/rent a good camera, go shoot something, grade it yourself, and then come back and show us what you have created. Ripping Blu-rays and making HDR versions of them is nothing to shout about :) Get real: if you have skills, get a job where you can use them and earn money from them.

Do you really not get it that no one here is interested in your next and next grab of some fake HDR?

visionplushdr
9th February 2017, 20:05
Watched on illegal site, where you are charging money to watch your crap HDR made of ripped Blu-rays etc?

ups sorry, I assume you realised that you may get in big trouble and now you don't charge. Why are you not doing it anymore, scared?
Instead your web if full of adverts and some crappy links to some scams.

Already told you- buy/rent a good camera, go shoot something, grade yourself and then come back and show us what you have created. Ripping blu-ray and making HDR version from them is nothing to shout about :) Get real, if you have skill get a job where you can use them and earn money from it.

What scams and what charges? Donations are not a charge. How are torrents illegal? In the UK a court has even made downloading movies legal now. The site you mention is not even hosted in the USA. Also, what about Popcorn Time?

How can an HDR grade in a torrent, one that doesn't even exist commercially, be illegal? Movies with no commercial HDR version are hosted worldwide, on your computer as well, since you have downloaded them, as you claim. Are you in trouble then?

Now let's move to the technical aspects.

You don't know how to grade a movie to HDR, and that's why you keep saying it's not a proper grading. It looks fine, and you can't even argue why it doesn't look good to you.

Compared to the SDR Blu-ray it has HDR: deep blacks, expanded gamut, better highlights, and it's even native.

The picture is better than the SDR version in every respect. So what's wrong with you?

You claim movies must be graded from a "master", yet you have never seen such a thing. Studios upscale from 2K and sell it as "4K". Where's the master there? If they used a proper master, the movie could easily be transferred to full native 4K resolution, but hey, that doesn't happen with most movies.

Prove to me and the rest of the people that the transfers I do look worse than the SDR version. And prove they are not "proper" HDR by... arguing it.

And there's no such thing as fake HDR. It's HDR or it's not. You really have no idea about HDR at all.

visionplushdr
9th February 2017, 20:12
Your website is a joke- "Real 4K content, Rec.2020" etc etc. You, master of HDR content sell people this crap knowing that all your work comes from Rec.709 masters and it's far from REAL HDR content.



Prove that claim then. "Real HDR content" looks better? Prove it then.

Your claims need some support, or they are just instant ignorance.

kolak
9th February 2017, 20:18
If you rip a Blu-ray and use it for your own "pleasure", that's sort of fine. If you then take it, make a fake HDR version, and ask people to pay money to see it on your website, then that is a serious crime.

Why have you stopped charging for access to your website?

If studios sell upscaled content, in most cases they get "punished" for it. People now are more aware of what is real/good and what is crap, and they talk about it. There have been many cases of BDs being re-done due to people's complaints.

This should be on your website: "I rip Blu-rays, make fake HDR of them and try to make money on it, by lying that it's real HDR content. I don't care about content owners, as the law in my country allows illegal distribution and copying."

Yes, in some sense there is no real or fake HDR, but it's a different story if you make one from RAW assets compared to some "up-conversion".

Can you take SD content and make nice HD from it?

kolak
9th February 2017, 20:26
Prove that claim then. "Real HDR content" looks better. Prove it then.

Your claims needs some support or are just instant ignorance.

Go to the NAB or IBC show; then at least you will be able to see HDR on good monitors.

One thing: have you upgraded madVR? It has improved algorithms, so your HDR may look even better than before, as you rely a lot on madVR processing :)

visionplushdr
9th February 2017, 20:27
If you rip Blu-ray and use it for your own "pleasure" that's sort of fine. If you then take it, make fake HDR version and ask people to pay money to see it on your website, then this is a serious crime.

Why have you stopped charging for access to your website?

If studios sell upscaled content they are in most cases get "punished" for it. People now are more aware what is real/good and what crap and they talk about. There were many case with BDs being re-done due to people's complain.

This should be on your website: "I rip Blu-rays, make fake HDR of them and try to make money on it, by lying that it's real HDR content. I don't care about content owners as law in my current allows for illegal distribution and copying"

I have never charged anything for any movie. And I do the movies for myself; people started to like the samples on YouTube and other forums where they were shown, then huge portals talked with me about letting the public get them.

The "scams" were just a system to keep the site running, as it costs money to host such a site. And they are not scams, just surveys or contests. You really have no idea about anything at all, right?

And there's no fake HDR. And I don't rip any Blu-rays either; I own the Blu-rays. You can't do an HDR grade from a ripped movie.

Blu-rays have a really high bitrate in AVC and work fine for most movies; the ones with film grain, like Indiana Jones, can also be graded.

You confuse quality of source with colorspace as well. The 10-bit of your "masters" just makes for an easier HDR grade and gamut expansion, with no dither processing needed, which results in an easy HDR grade.

A well-dithered image converted to 10-bit can look as good to the eye as native 10-bit.
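On the 8-bit vs 10-bit point: the bit-depth expansion itself is mechanical and lossless. A minimal sketch (my own helper, not taken from any particular tool) of the usual bit-replication mapping; what good dithering then buys is masking the original 8-bit quantization steps so gradients don't band:

```python
def expand_8_to_10(v8: int) -> int:
    """Scale an 8-bit code (0-255) to 10-bit (0-1023) by bit replication:
    shift left two bits and refill the low bits from the top of the value,
    so 0 maps to 0 and 255 maps to 1023 exactly."""
    return (v8 << 2) | (v8 >> 6)

print(expand_8_to_10(0))    # 0
print(expand_8_to_10(255))  # 1023
print(expand_8_to_10(128))  # 514
```

Note that an expansion like this adds precision headroom but no new information; the 256 original levels are still the only ones present until later processing (grading, dithering) fills the gaps.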

You confuse a lot of things because you don't know how HDR grading is done in the movie industry.

It's not made with DaVinci or Adobe Premiere either. It's not mandatory to grade HDR from a 10- or 12-bit source either; an HDR grade can be done from any source. HDR is a process that is also applied to still images. You can do HDR to anything you want.

Your "proper and real" HDR grade doesn't exist; it's just what you believe HDR in video is.

Here's Indiana Jones, the tank scene, at 1400 nits output in DCI-P3:

https://www.youtube.com/watch?v=Q3vd6V35-L4

The same scene from the SDR Blu-ray:

https://www.youtube.com/watch?v=2tGDSAs_uU4

Now please argue, with your technical aspects, why it's fake HDR. I honestly laugh a lot when I read such a thing as "fake HDR".

That video is at 1400 nits, and you can set up to 10,000, where you get a higher range.

Argue like a man, not like a kid now.

kolak
9th February 2017, 20:36
It's fake in the same sense that a 2K master upscaled to 4K is fake, which are your own words. You can't interpolate data from the Rec.709 gamut to P3. It will never be as good as making a P3 grade from RAW assets, which "hold" way more colors.

If you were doing SDR from HDR with some cool algorithm, then fine, that could work well, but not the other way around.

A Rec.709 master will have tons of data lost compared to the RAW assets the master was made from, and there is not a single algorithm in the world able to recover them. You know this perfectly well. This is why you will never be able to match the "real" HDR master which studios can make by going back to the RAW assets. It's as simple as a source-asset limitation problem.

No, you can't make an HD master (from an SD version) look as good as the original, and the same applies to your work. It is always going to be quite a big compromise. Doing a good job with what you have? Fine, but stop claiming some superiority. Also note that you mainly compare your work to SDR, not to HDR made by others. HDR will look better, as high brightness stimulates the eye differently, and the old Rec.709, 100-nit standard is very outdated. We know this; that's why there is HDR. Technology has improved, so the time has come to give people something better even at home. It's driven by money, like everything in the world, so again: you did not discover anything new here.

visionplushdr
9th February 2017, 20:41
It's fake in the same sense how 2K master upscaled to 4K is fake, which are your words. You can't interpolate data from Rec.709 gamut to P3. It will never be as good as making P3 grade from RAW assets which "hold" way more colors.

If you were doing SDR from HDR with some cool algorithm then fine- this could work well, but not other way around.

Rec.709 master will have tons of data lost compared to Raw assets which master was made from and there is not a single algorithm in the world which is able to recover them. You perfectly know this. This is why you will never be able to match "real" HDR master which studios can do by going to Raw assets. It's as simple as source assets limitation.

Again you have no idea, yet you post like you know what you are posting.

Argue about why it looks fake, my friend, not from your confused brain. Stop listening to your confused brain and argue about the LOOKS OF THE MOVIE.

Does it have HDR?


Deeper blacks - Checked.
Expanded gamut - Checked.
Better highlights - Checked.
Expanded gamma curve (PQ) - Checked.
More movie details - Checked.
It's a native HDR HEVC ST 2084 / 2020 grade - Checked.
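For reference, the PQ curve in that checklist is the SMPTE ST 2084 perceptual quantizer, which maps absolute luminance up to 10,000 nits into the signal range. A minimal sketch of its encoding side, using the exact constants from the spec (the function name is my own):

```python
# SMPTE ST 2084 (PQ) constants, exact rational values from the spec.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_encode(nits: float) -> float:
    """Map absolute luminance in cd/m^2 (nits) to a PQ signal in [0, 1]."""
    y = max(nits, 0.0) / 10000.0   # normalize to the 10,000-nit peak
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

# SDR reference white (100 nits) sits at roughly half the PQ code range,
# which is why so much of the signal is reserved for highlights:
print(round(pq_encode(100), 3))    # ~0.508
print(round(pq_encode(10000), 3))  # 1.0
```

This is also why "graded to 1400 nits" and "graded to 10,000 nits" are meaningfully different signals: the curve allocates code values on an absolute luminance scale rather than relative to the display.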


Compared to the SDR Blu-ray version, how does the SDR version look?

Movies are meant to be watched. HDR is meant to be a better picture. Did you know that?

Rec.709 can be HDR; 2020 is the expanded range/colorspace for the expanded PQ curve, which peaks at 10,000 nits, and you can grade up to that.

Rec.709 has a limited palette that can also be expanded with several methods; Technicolor's is one.

I believe you have no idea, and you can't argue why it looks bad or "fake". The colors are also a lot better. So where's the problem with the 709 source here?

kolak
9th February 2017, 20:53
I believe you have no idea and you can't argument on why it looks bad or "fake". Colors are also a lot better. So where's the problen with the 709 source here?

Because a studio (or you, if you had access) can take the RAW assets and make a master that will be way better than yours, as their assets hold tons more usable data. They have access to a huge gamut and high bit depth, and can push things a lot in any direction. They have massive possibilities to tweak it, as far as their assets will let them. You, with just an 8-bit BD source, are very restricted, and there is nothing you can do about it.

You know this, but yet refuse to acknowledge it. Again: I could give you just a DVD and ask you to make the best possible HD master from it. Then I would reveal the HD version of it and prove that your upscaled version is crap.

I could point you to some EXR Alexa footage which was shot with dual exposure, so it holds about 18 real stops of dynamic range. This is something which would let you play a lot and also do your own creative grading. It was shot by a German university for a study on HDR.

visionplushdr
9th February 2017, 21:01
Because studio can take RAW assets and make master which will be way better than yours as their assets hold tons of more usable data. They have access to huge gamut, high bit depth, can push things a lot in any direction. They have massive possibilities, to tweak it as their assets will let them do it. You, with just 8bit BD source, are very restricted and nothing what you can do about it.

You know this, but yet refuse to acknowledge.

I could point you to some EXR Alexa footage which was shot with dual exposure, so holds about real 18 stops of dynamic range. This is something which would let you to play a lot and also do your own creative grading. It was shot by German uni to do study on HDR.

For sure, but having a higher-quality source doesn't instantly make a better HDR grade, because people grade it, not robots or meta-humans.

HDR10 is crap, for example: you can have the best source in the world, and when it's graded to HDR10 the result is horrible. Why? A poor nits grade, and the colorspace doesn't even benefit from the quality of the source.

Dolby Vision movies look a lot better. Why? A higher grade: higher nits output, helped by the TVs' various processing to expand levels on today's lower-nit panels (the image is changed, so it's not native HDR either here).

Now, let's say you have a high-quality source and Groncho grades it to HDR. How will it look?

And another thing: you can take a 709 source and make it look better than a crappy HDR grade a movie studio made from a higher-quality source, because even Disney screws up the colors on their movies. So?


HDR grading is art. It's never fake and it's never perfect, because people grade it, and it depends on how they grade and what methods they use to bring the High Dynamic Range version of the movie to life. Every movie in HDR looks different; a lot of commercial movies in HDR look like crap.



You know this, but yet refuse to acknowledge. Again- I will give you just DVD and ask to make best possible HD master from it. Then I will reveal HD version of it and prove that your upscaled version is crap.



Upscaling has nothing to do with HDR grading. Upscaling is creating pixels where they don't exist; HDR grading is a completely different thing.

Now with 4K, a lot of commercial movies sold as 4K are 2K bilinear upscales, which look horrible as well.

Did you understand the whole point?

kolak
9th February 2017, 21:14
Send me 1 sec of some nice HDR scene from your file (original h265, not converted to anything).

visionplushdr
9th February 2017, 21:20
Send me 1 sec of some nice HDR scene from your file (original h265, not converted to anything).

Sure, name a scene from Last Crusade and I'll render it. Or I'll just find one myself.

kolak
9th February 2017, 21:22
Something colourful, Avatar etc.

visionplushdr
9th February 2017, 21:27
Something colourful, Avatar etc.

I was doing the scene where Jones jumps from the window in the middle of the storm, with rain and light everywhere.

If you want colorful video I'll send one later: I have a scene from The Martian, in space, where you can watch the astronauts in a heavily colorful palette.

kolak
9th February 2017, 21:28
Any- maybe 5 seconds.

kolak
9th February 2017, 22:35
Anyway- I just wanted to see if your video has an expanded gamut, and I assume it does. It's just a matter of how good the algorithm is.

Typical result for Rec.709 master:

https://s29.postimg.org/hmzdlyxaf/709.png


Samsung Chasing the Lights HDR sample:


https://s29.postimg.org/4k3r2p72f/samsung.png

visionplushdr
9th February 2017, 22:37
Anyway- I just wanted to see if your video has an expanded gamut, and I assume it does. It's just a matter of how good the algorithm is.

Typical result for Rec.709 master:

https://s29.postimg.org/hmzdlyxaf/709.png


Samsung Chasing the Lights HDR sample:


https://s29.postimg.org/4k3r2p72f/samsung.png

Just sent you over PM.

kolak
9th February 2017, 22:48
It looks fine, the gamut is around P3.
The problem is that the colors will be interpolated, so never as good as if you took them from a wider-than-P3 gamut source. Limiting is easy, expanding is not. I bet we would find issues with skin tones, etc. when watching on a good reference monitor.

Pushing Rec.709 "edge" colors to the P3 edge is easy, but what do you do with all the gradations in between? We're back to the same argument. You are interpolating a lot, and that always has its limits.
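For reference, the limiting direction kolak calls "easy" is plain linear algebra: a Rec.709 source can be represented exactly inside Rec.2020 with the standard ITU-R BT.2087 matrix on linear-light RGB. A minimal sketch (the matrix values are the published BT.2087 coefficients; gamma must be decoded before applying it):

```python
# Convert linear-light Rec.709 RGB to Rec.2020 RGB (ITU-R BT.2087 matrix).
# Values are assumed to be linear light in [0, 1]; decode gamma first.

M = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def rec709_to_rec2020(rgb):
    return tuple(sum(M[i][j] * rgb[j] for j in range(3)) for i in range(3))

# White stays white (each matrix row sums to 1.0)...
print(rec709_to_rec2020((1.0, 1.0, 1.0)))
# ...but pure Rec.709 red lands well inside the 2020 gamut, which is why
# the reverse direction (expansion) has to invent color information.
print(rec709_to_rec2020((1.0, 0.0, 0.0)))
```

This is exactly the asymmetry under discussion: containment is a fixed matrix, while expansion back out is interpolation with no unique answer.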

visionplushdr
9th February 2017, 22:58
It looks fine, the gamut is around P3.
The problem is that the colors will be interpolated, so never as good as if you took them from a wider-than-P3 gamut source. Limiting is easy, expanding is not. I bet we would find issues with skin tones, etc. when watching on a good reference monitor.

Pushing Rec.709 "edge" colors to the P3 edge is easy, but what do you do with all the gradations in between? We're back to the same argument. You are interpolating a lot, and that always has its limits.

Yes, but like I said, it can be done with a 709 source: expand it to make HDR look "good". It's pretty obvious that having a high quality source would result in higher quality HDR, but neither I nor anybody in the public has access to those masters. I do what I can with what's available.

That sample you saw still needs a lot of improvements anyway.

Skin is always a problem, but not because of the color expansion; it's a problem in HDR due to the hue/luminance changes against the original in SDR. People can look very weird even with the right color.

kolak
9th February 2017, 23:10
These are all the reasons why people call your masters "fake".
They are bound by roughly the same issue as making HD from SD- it's all based on interpolation.
Taking all of this into account you can't claim any superiority, as any decent job from a studio would easily beat your HDR master.
Another thing- you are not doing any grading, but applying fixed math on top of already graded masters.
Grading is about qualifiers, power windows, tracking etc. which separately adjust the image in many small areas, sometimes even per frame. You can keep refining it indefinitely, and it's the director's role to say what he wants and call stop when he is satisfied.
Download some raw assets (there are samples publicly available) and try doing it yourself with e.g. the free Resolve. With a good source you can push things a lot; an 8-bit BD source is very restrictive.

visionplushdr
9th February 2017, 23:19
These are all the reasons why people call your masters "fake".
They are bound by roughly the same issue as making HD from SD- it's all based on interpolation.
Taking all of this into account you can't claim any superiority, as any decent job from a studio would easily beat your HDR master.
Another thing- you are not doing any grading, but applying fixed math on top of already graded masters.
Grading is about qualifiers, power windows, tracking etc. which separately adjust the image in many small areas, sometimes even per frame. You can keep refining it indefinitely, and it's the director's role to say what he wants and call stop when he is satisfied.
Download some raw assets (there are samples publicly available) and try doing it yourself with e.g. the free Resolve.

It's the same grading movie studios do: a grading from the SDR master. What you are describing is a grading that changes the original movie's essence. That can easily be done, even with Resolve like you said. It changes the output and the essence.

What I do is an HDR grading from the SDR-graded movie = a transfer.

Movie studios don't re-grade, they transfer.

If you re-grade, it's basically a fan edit, which can look awesome to some and completely altered to others. Movies can't be touched in their essence, because that's how the director wanted them to look.

HDR means high dynamic range, where you get deeper blacks, expanded gamut, powerful highlights and more movie detail that in SDR you can't see, or see a lot less of. Plus the improved movie contrast.

When you re-grade a movie with Resolve you change the whole thing. Even if you change only the sky or clouds, it's now different and not how it was intended.

Not to mention re-grading a whole scene.

A Resolve re-grade is just for normal videos, home videos, fan edits. Not for movies.

I can show you later the grading of Exodus: Gods and Kings (the official HDR demo) against what I do, and you will be surprised: it's exactly identical in grading. Same with Life of Pi against my transfer.

Movie studios do transfers, they don't re-grade.

WhatZit
9th February 2017, 23:29
Guyz! Guyz! I don't know why I ever doubted him, because Vision is right AGAIN! :stupid:

You see, I shouldn't use a 15 year old program like Avidemux to convert my 1080p Rec.709 sources into 4K ST2084/Rec.2020 (even though it could actually do it).

Despite Vision having only JUST DISCOVERED the basics of upscaling & colour calibration, it's such an old fundamental concept that programs supporting the process are literally everywhere.

So, I should use more modern, oft-used and relied-upon video processing software, such as the sort of things that you know-nothing "Doomers" routinely develop.

I picked AVISynth. That's nice and fresh. Well, I actually used Vapoursynth instead, since AVISynth crashed. Anyway, sure enough, I was able to replicate ALL of his HD-to-HDR remastering examples with the following simple command:

vspipe --y4m I'mSoClever.vpy - | x265 --preset placebo --crf 1 --profile main10 --hdr --max-cll 0,0 --y4m - -o I_Hate_Lucy_s04e12_-_Lucy_Gets_Her_Gamut_Expanded.hevc

Now, I had to hack into Vision's computer and steal his spaghetti-coded VS HDR plugins, because he refuses to release them publicly (since all his work is so shit).

You can see them in action here, in I'mSoClever.vpy:

import vapoursynth as vs
core = vs.get_core()
video = core.ffms2.Source(source="I_Hate_Lucy_s04e12_-_Lucy_Gets_Deep_Colourised!.mkv")
video = core.reformat.JustLikeMinecraft(width=3840, height=2160, matrix_s="2020ncl", transfer_s="st2084", primaries_s="2020", fakebitdepth=12, phoneysampling=444)
video = core.colour.MakeMeAsPrettyAsIFeel(saturationgamut_s="walt_disney", ditherbits_s="more_than_10", rgbgamma_s="iseedeadpeople")
video = core.contrast.MoreNitsMoreNits(graph_s="IT'S_OVER_10000!", preserveoriginalfilmakerintent=0, conversion_s="constantlynon-linearoetf", function_s="linearnon-constantwtf")
video = core.encode.HDR10IsForPlebs(peasantsample=420, peasantquant_s="less_than_12", peasantsei_s="StarTrek2086")
video.set_output()

Wow! What results! Here's a grab of the exact same frame after running it through the above, original on top, remaster on bottom:

https://image.ibb.co/cwb8yv/vidplus.jpg

P.S. You really should listen to Vision! He's discovered a brilliantly inefficient, convoluted and abstract way to perform the same fundamental HD operations that everyone else has been successfully doing for a decade or more. That deserves some acknowledgement, surely?

kolak
9th February 2017, 23:36
I

Movie studios do transfers, not re grade.

Sorry, but you are very wrong. Studios (especially now) do new grades for HDR and they get the director to approve them. They may use the SDR as a reference, but not as a source for some automated transfer. That is nonsense.
There have been some attempts by Technicolor at an automated or semi-automated approach, but this is for the old/cheaper catalogue where there is no money for a new HDR grading session.
I was at a presentation from Dolby of a semi-automated system for HDR to SDR conversion, but colorists (of course) were not impressed with it at all. That direction is easier, and I assume it may happen a lot once HDR masters are common.
SDR to HDR is like SD to HD upscaling- not optimal at all. It makes more sense when the source is not the best anyway, so you can't really get much more from it.
This is exactly what I don't want to happen- some automated up-conversion to HDR. Do it properly or don't make an HDR master at all. A similar story happened with HD. At the beginning studios tried to use old film scans which were not good enough. After some time (and many complaints on forums) they started doing new scans, and now most BD releases are quite decent. I have made 500+ commercial BD discs, so I know this first hand.

visionplushdr
9th February 2017, 23:44
Sorry, but you are wrong. Studios (especially now) do full new grades for HDR.
There have been some attempts by Technicolor at an automated or semi-automated approach, but this is for the back/cheaper catalogue where there is no money for a new HDR grading session.
I was at a presentation from Dolby of a semi-automated system for HDR to SDR conversion, but colorists (of course) were not impressed with it at all. That direction is easier, and I assume it may happen a lot once HDR masters are common.
SDR to HDR is like SD to HD upscaling- not optimal at all. It may make more sense when the source is not the best anyway, so you can't really get much out of it.
This is exactly what I don't want to happen- some automated up-conversion to HDR. Do it properly or don't make an HDR master at all. A similar story happened with HD. At the beginning studios used old film scans which were not good enough. After some time they started doing new scans, and now most BD releases are quite decent.

There's no "properly"; a transfer to HDR is just that. It goes from SDR to HDR.

Re-grading a whole movie can also be done, and it depends on taste. It changes the whole thing.

Can you show me a movie that has been completely re-graded scene by scene?

I have seen almost every commercial HDR movie, and every single one is a transfer.

What you say about colorists comes down to the way they transfer. Some do it one way, others do it differently.

What movie has been completely re-graded frame by frame? Just so I can see what you are saying here.

And a grading to HDR from an SDR grade is a re-grade as well. Do you believe I throw the SDR movie into VirtualDub and apply gamma and contrast changes? Well, you are extremely wrong. I actually re-grade the whole thing until I get the correct output for each detail, contrast, gamut, highlight and so on. It takes me days to do a movie.

Btw, what's with this guy posting like that before you? How come this user is still on here? It's like a joke by now.

visionplushdr
9th February 2017, 23:51
Guyz! Guyz! I don't know why I ever doubted him, because Vision is right AGAIN! :stupid:

blah blah blah...

P.S. You really should listen to Vision! He's discovered a brilliantly inefficient, stupid and abstract way to perform the same fundamental HD operations that everyone else has been successfully doing for a decade or more. That deserves some acknowledgement, surely?

If "programs" that do what I do are actually everywhere, then do the same with your programs and upload a video like I just did.

It's so easy to troll on here.

kolak
9th February 2017, 23:59
Of course it will be a new look. It should be a new look, as it's a totally different technology, so why would you want to limit yourself to the SDR look?
Last time you did not care about the director's opinion; now you are paranoid about it?

You don't grade- stop calling it that. You apply fixed math on top of the whole movie.

You are starting your nonsense again.

Do you know how many A-class movies are graded with Resolve? A lot.
The fact that it became a free tool doesn't mean studios don't use it (or that it's bad). I prefer Baselight, but in the end it doesn't matter much.

Do you know, or have you been in, any of the bigger studios? Seen the work, are you aware of the budgets and time frames for post work, seen a good grading monitor or a whole setup, etc.?
I doubt it.

visionplushdr
10th February 2017, 00:04
Of course it will be a new look. It should be a new look, as it's a totally different technology, so why would you want to limit yourself to the SDR look?
Last time you did not care about the director's opinion; now you are paranoid about it?

You don't grade- stop calling it that. You apply fixed math on top of the whole movie.

You are starting your nonsense again.

Do you know how many A-class movies are graded with Resolve? A lot.
The fact that it became a free tool doesn't mean studios don't use it (or that it's bad). I prefer Baselight, but in the end it doesn't matter much.

Do you know, or have you been in, any of the bigger studios? Seen the work, are you aware of the budgets and time frames for post work, etc.?
I doubt it.

The problem is that you confuse re-grading with transferring. And I do re-grade; what you saw needed a lot more work, as I also said when you watched it.

I will send you other samples showing the progress so you can see there's a re-grade.

kolak
10th February 2017, 00:11
If you do re-grade (which is still more than you actually do) then don't call your work grading. The fact that you try to make it good means nothing (of course you want to make it good). You apply fixed math on top of the whole movie, so this is a transfer or conversion.

I'm not interested in your samples, I've seen enough.

Maybe you can offer a service for people who shoot with mobiles, etc. to transfer their recordings to HDR.

kolak
10th February 2017, 00:17
I will send you other samples showing the progress so you can see there's a re-grade.

Yes, because your algorithm is not adaptive(?), so you have to find average settings which work for the whole movie. This forces a massive compromise.
Compared to your work, some have adaptive algorithms which analyse scene after scene and use different math for each one.

visionplushdr
10th February 2017, 00:21
Yes, because your algorithm is not adaptive(?), so you have to find average settings which work for the whole movie. This forces a massive compromise.
Compared to your work, some have adaptive algorithms which analyse scene after scene and use different math for each one.

That's wrong too. I grade each scene, and that's why I have mentioned that what dynamic metadata does is also possible without it. Metadata is just info and restrictions movie studios add for the commercial ecosystem.

I render each scene differently: as you may know, AviSynth can trim sections and apply different scripts to each scene. The same happens with HDR, and you can do anything you want there. The dynamic metadata of ST 2094 changes the gamut and adapts the HDR to specific scenes. Same as I do.
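The trim-and-splice idea described above can be sketched in a few lines. This is a hypothetical illustration in the spirit of AviSynth's Trim() plus splice workflow, not his actual scripts: frame ranges and grading parameters here are invented for the example.

```python
# Hypothetical per-scene dispatch in the spirit of AviSynth's Trim()+splice:
# cut the movie into frame ranges and give each range its own grading
# parameters, instead of one fixed curve for the whole film.
# Scene boundaries and values below are invented for illustration.

scenes = [
    (0,    1199, {"gain": 1.00, "peak_nits": 800}),   # dim interior
    (1200, 2399, {"gain": 1.15, "peak_nits": 4000}),  # storm / lightning
    (2400, 3599, {"gain": 0.95, "peak_nits": 1500}),  # daylight exterior
]

def params_for_frame(n):
    """Return the grading parameters for the scene containing frame n."""
    for start, end, params in scenes:
        if start <= n <= end:
            return params
    raise ValueError(f"frame {n} not covered by any scene")

print(params_for_frame(1500))  # parameters of the storm scene
```

Dynamic-metadata systems like ST 2094 carry essentially this kind of per-scene table alongside the video, rather than baking it into the pixels.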

https://extraimage.net/images/2017/02/09/c19f6d73c675d616d4064f247c9c94c3.md.jpg (https://extraimage.net/image/2Mrb)

That got a different grade from what you watched.

If you don't know what I do, please don't assume things.

kolak
10th February 2017, 00:28
And you do it manually?

visionplushdr
10th February 2017, 00:31
And you do it manually?

Yes, with my fingers and the keyboard.

It's HDR without a metadata restriction; of course you have to do each scene manually. It's the same thing colorists do, but they put the info inside the restrictive metadata, so you and everybody else must buy HDR TVs and replace them later if they don't support ST 2094... marketing.

Later on they could invent yet another metadata format where movies make other changes, likewise manually authored and coded into the metadata. That's why HDR is in its infancy: not because they don't know how to do it, but because they want to keep selling new TVs and such. Do you believe they couldn't have done dynamic metadata before?

WhatZit
10th February 2017, 00:50
If you don't know what i do , please don't assume things.

That's because you don't show us what you do, you just upload dislocated remastered garbage without any exploration of the process behind it.

You leave us to guess what you do, and, as it turns out, I was pretty close!

Listen to me very carefully, junior. NO-ONE wants to see your videos! NO-ONE!

SOME PEOPLE MAY want to see what process you use for BT.709 upconversion, so they can try it for themselves! Relaying technical know-how is the entire point of this board, especially this particular HDR encoding thread!

This ISN'T a bragging social-media selfie e-peen site! This ISN'T a public art gallery for aspiring colourists. Stop treating it as such.

If you want to demonstrate the amazing new SDR->HDR grading method that you've concocted, post your scripts, post screenshots of the program settings, post instructions and/or post logs.

If all you want to do is post videos, go to some wanky fatuous social media site and do it. We're only interested in technical discussions, not childish "look-at-me" bragging, which is all you've contributed so far!

kolak
10th February 2017, 00:51
It's so time consuming.
Metadata still sounds better, as it allows for more adaptation based on TV capabilities.
It's all about money, don't you know it :)

kolak
10th February 2017, 00:55
That's because you don't show us what you do, you just upload dislocated remastered garbage without any exploration of the process behind it.

You leave us to guess what you do, and, as it turns out, I was pretty close!

Listen to me very carefully, junior. NO-ONE wants to see your videos! NO-ONE!

SOME PEOPLE MAY want to see what process you use for BT.709 upconversion, so they can try it for themselves! Relaying technical know-how is the entire point of this board, especially this particular HDR encoding thread!

This ISN'T a bragging social-media selfie e-peen site! This ISN'T a public art gallery for aspiring colourists. Stop treating it as such.

If you want to demonstrate the amazing new SDR->HDR grading method that you've concocted, post your scripts, post screenshots of the program settings, post instructions and/or post logs.

If all you want to do is post videos, go to some wanky fatuous social media site and do it. We're only interested in technical discussions, not childish "look-at-me" bragging, which is all you've contributed so far!

It's a secret. All you get are meaningless screen grabs and a lot of self-adoration :)
Besides- it's scene-by-scene work, so not very interesting anyway.

visionplushdr
10th February 2017, 01:01
It's so time consuming.
Metadata still sounds better, as it allows for more adaptation based on TV capabilities.
It's all about money, don't you know it :)

Yes, I know it's time consuming, but I like doing it. My videos carry ST 2084 metadata, not ST 2094.

As for how I do what I do, I have already mentioned it before: I use an HDR core and plenty of scripts I have put together to make the whole thing happen.

It's not a miracle or a non-existent process; it's a matter of using the right combination of HDR processing.

And I don't upload it because I'm improving the whole process with every movie I transfer. Why would I upload something incomplete? Plus it's not my goal either, since people who do HDR grading can do it in Resolve, for example. I prefer to do it all manually via AviSynth because I find it a lot more powerful.

As for your recommendation of offering a service to process home-made mobile videos, that would be good. I actually do the movies for fun; having a job doing this would be a lot better. Everybody wants to make a living doing what they like; the problem is I don't know where to offer it, and I haven't looked, because I was busy improving the process.

Now that HDR10 can be "ripped", here's my work against commercial HDR10: http://screenshotcomparison.com/comparison/200268

VISIONPLUSHDR-1000 ( up to 10K Nits Re Grade )
https://extraimage.net/images/2017/02/11/97504eef8b10b9f607b4991a310d6715.md.png (https://extraimage.net/image/2UBQ)

COMMERCIAL HDR10 Output
https://extraimage.net/images/2017/02/11/4abef5926ec5b6c41df924f55052d801.md.png (https://extraimage.net/image/2UBL)

In terms of color palette, the VisionPlus output shows about 40% more gamut than the HDR10.

This time, a 10-bit source was used.
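The "40% more gamut" figure can't be checked from screenshots, but for a rough sense of scale, the standard gamut triangles can be compared by area on the CIE 1931 xy chromaticity diagram using the shoelace formula. This is only a 2D comparison that ignores the luminance axis:

```python
# Compare standard gamut triangle areas on the CIE 1931 xy diagram
# with the shoelace formula. This ignores luminance entirely, so it is
# only a rough measure of "how much more gamut" a wider space offers.

PRIMARIES = {
    "Rec.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":   [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(pts):
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

a709 = triangle_area(PRIMARIES["Rec.709"])
for name, pts in PRIMARIES.items():
    print(f"{name}: {triangle_area(pts) / a709:.2f}x Rec.709")
# DCI-P3 covers roughly 1.35x and Rec.2020 roughly 1.9x the Rec.709 area.
```

So a claim of "40% more" than a Rec.709-derived palette would sit near the P3 boundary, consistent with kolak's earlier reading of the gamut plot.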

Video recorded in 1080p:

https://www.youtube.com/watch?v=XNrnCc8g1t4

Another one with 1500 nits MadVR output:

https://www.youtube.com/watch?v=KgWCUIOuovQ

This proves you don't need an HDR TV to process HDR content. You only need one for the restrictive commercial ecosystem, full of marketing moves.

kolak
11th February 2017, 19:21
When you grade to some standard you are actually limiting the gamut (and studios throw away lots of colors compared to their RAW sources, which you then try to recover by interpolation). This is the whole point of grading to a standard: so that 10 TVs which meet the standard will display the image about the same way (and the way the colorist saw it).
Going through some further processing (madVR in your case) almost negates the whole point of grading.
The latest madVR has an improved algorithm, so masters which you made some time ago will look different now. Next month they may look different again.

If you ask me to judge these grabs as the final look I would get on my TV with madVR, then yours is oversaturated.

We have already discussed this- some movies have 50% of their colors deliberately removed, as this is the director's intention. Your idea is that all movies should be crazy saturated and punchy (everything to the maximum). If you make all movies this way they will all look the same, and boring.

visionplushdr
11th February 2017, 19:28
When you grade to some standard you are actually limiting the gamut (and studios throw away lots of colors compared to their RAW sources, which you then try to recover by interpolation). This is the whole point of grading to a standard: so that 10 TVs which meet the standard will display the image about the same way (and the way the colorist saw it).
Going through some further processing (madVR in your case) almost negates the whole point of grading.
The latest madVR has an improved algorithm, so masters which you made some time ago will look different now. Next month they may look different again.

If you ask me to judge these grabs as the final look I would get on my TV with madVR, then yours is oversaturated.

We have already discussed this- some movies have 50% of their colors deliberately removed, as this is the director's intention. Your idea is that all movies should be crazy saturated and punchy (everything to the maximum). If you make all movies this way they will all look the same, and boring.

There's no change at all in the colors. Did you watch the last video and the images? No change whatsoever in any frame. Maybe you just can't understand it: I don't change the colors of the movie here, I re-grade to HDR1000, which I name VISIONPLUSHDR-1000, going up to the peak of the PQ in the grade. It results in higher quality HDR, because it goes to the limit of the actual PQ, which is the HDR standard.

I don't go anywhere else. And the colors are exactly the same as in the HDR10; the problem with the HDR10 standard is its low-nits grading, which limits the whole PQ range.

That's why you CAN'T play HDR10 content through MadVR: it will look like SDR. Whereas you CAN play HDR1000, as it has higher nits, and it will look even better than an HDR10 video played on an "HDR TV".

And MadVR's HDR processing works like a TV's. When it gets updated it will look better. That's it.

You can apply saturation, brightness, contrast and so on. Like on a TV.
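For context, the PQ curve being argued about here is the SMPTE ST 2084 transfer function, which encodes absolute luminance up to 10,000 cd/m² (nits). A minimal sketch of the inverse EOTF (nits to signal value), using the published ST 2084 constants:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> signal
# value in [0, 1]. The curve tops out at 10,000 nits, which is the
# "peak of the PQ" under discussion; a 1000-nit grade uses only part of it.

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits):
    y = max(nits, 0.0) / 10000.0
    return ((C1 + C2 * y**M1) / (1 + C3 * y**M1)) ** M2

print(round(pq_encode(100), 3))    # SDR reference white, ~0.508
print(round(pq_encode(1000), 3))   # an HDR10-style 1000-nit peak, ~0.752
print(round(pq_encode(10000), 3))  # full PQ peak -> 1.0
```

Note how strongly the curve is skewed: 100 nits already uses about half the signal range, and everything from 1000 to 10,000 nits fits in the top quarter.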

kolak
11th February 2017, 19:39
That's why you CAN'T play HDR10 content through MadVR: it will look like SDR. Whereas you CAN play HDR1000, as it has higher nits, and it will look even better than an HDR10 video played on an "HDR TV".



If you have an HDR TV you don't need madVR (or you have madVR's processing disabled); if you don't, then the fact that madVR output looks like SDR is actually what you want, and correct behaviour from madVR.

visionplushdr
11th February 2017, 19:46
If you have an HDR TV you don't need madVR (or you have madVR's processing disabled); if you don't, then the fact that madVR output looks like SDR is actually what you want, and correct behaviour from madVR.

Completely wrong. HDR10 TVs can't handle PQ video graded to 10,000 nits. I have tried on a lot of them; they don't recognize the metadata. When setting up to 1000 in the metadata it works, but the output looks wrong, because the TV's HDR software doesn't properly limit HDR video graded to higher nits.

Commercial TVs can only handle HDR10 and Dolby Vision video, both with restrictive metadata.

MadVR output looks like SDR with HDR10 because it's a low-nits grade; HDR10 is actually ALMOST SDR content. HDR10 is a subtle grade from the SDR movie. It peaks at 1000 when most of the time it averages around 300/400 nits.

HDR1000 is completely different behavior for a TV's processing software. Here you don't even need a TV preset to expand levels, which means it outputs native content without being messed up by the various TV processing out there, 100% of which makes the image look worse.

Tell me where in this image the colors are changed:

It's MadVR at 1500 nits and DCI. There's nothing more than an HDR1000 video. No changes with TV presets. And the levels are fine.

See the difference?

https://extraimage.net/images/2017/02/11/25addf4a7be5b8aacc92df9e2eeb85d7.md.png (https://extraimage.net/image/2UYn)

You can apply saturation or whatever you want from MadVR or your TV when playing this video. You just don't get it.

4K-resolution HDR details are also IMPROVED by the higher-nits PQ grade, as you can zoom into this image and see for yourself.

kolak
11th February 2017, 19:46
I don't go anywhere else. And the colors are exactly the same as in the HDR10; the problem with the HDR10 standard is its low-nits grading, which limits the whole PQ range.



Well, because the technology is where it is. Studios can see their work 1:1 at 1000 nits, but not at 10K. This is the reason why they do 1K grades, not 10K.
Is a 10K version, limited by each TV in a different way, better than a 1:1 1000-nit version, which TVs are just about able to display?

Studios can do 10K grades, but this is not good practice. Again- this is the whole point of reference TVs: you see what you do at 1:1.
You can do what Dolby is proposing and have 4K or 10K grades, but then you need each TV to apply fixed processing on top of the signal, so you know how it will look. The problem is that TV manufacturers are not very keen on it, as this would force their TVs to have the same chipset as the others and lose their uniqueness. Maybe LG or Sony or Samsung can do better than Dolby, so why would they restrict themselves to Dolby's way?

visionplushdr
11th February 2017, 19:51
Well, because the technology is where it is. Studios can see their work 1:1 at 1000 nits, but not at 10K. This is the reason why they do 1K grades, not 10K.
Is a 10K version, limited by each TV in a different way, better than a 1:1 1000-nit version, which TVs are just about able to display?

Studios can do 10K grades, but this is not good practice. Again- this is the whole point of reference TVs: you see what you do at 1:1.
You can do what Dolby is proposing and have 4K or 10K grades, but then you need each TV to apply fixed processing on top of the signal, so you know how it will look. The problem is that TV manufacturers are not very keen on it, as this would force their TVs to have the same chipset as the others and lose their uniqueness. Maybe LG or Sony or Samsung can do better than Dolby, so why would they restrict themselves to Dolby's way?

Are you sure? Then how do I see the 10K nits, with the video perfectly leveled, in a 10,000-nit grade?

I didn't know I was one of a kind. Come on, you still don't get it.

kolak
11th February 2017, 19:55
... HDR10 TVs can't handle PQ video graded to 10,000 nits. I have tried on a lot of them; they don't recognize the metadata. When setting up to 1000 in the metadata it works, but the output looks wrong, because the TV's HDR software doesn't properly limit HDR video graded to higher nits.

Commercial TVs can only handle HDR10 and Dolby Vision video, both with restrictive metadata.




Exactly, as these are the standards, and this is what studios will work to.
Some 10K grade which needs further madVR processing is purely your invention. Maybe that will change in the future. For now studios work with standards which exist, not custom versions of them.
You can't have a world without any standards, as that would end up a real mess. Even with standards we have enough problems.

kolak
11th February 2017, 19:59
Are you sure? Then how do I see the 10K nits, with the video perfectly leveled, in a 10,000-nit grade?

I didn't know I was one of a kind. Come on, you still don't get it.

You tell me how you do it.

The other way around:
I give you a movie, which you take and turn into your 10K HDR grade. How do I watch it so it looks great?

visionplushdr
11th February 2017, 20:09
You tell me how you do it.

The other way around:
I give you a movie, which you take and turn into your 10K HDR grade. How do I watch it so it looks great?

Are you kidding me? With MadVR !!!!!!!!!!!!!!

MadVR processes HDR content with different 3D LUTs, exactly as an HDR10 TV does, or as Dolby Vision does with 4000-nit graded movies played back on OLEDs with just 640 nits of real peak.

Dolby Vision looks better than HDR10. Why? Because of the higher-nits grade. The Dolby chip works like MadVR, limiting and compressing highlights so you watch a pleasant video, and you can even use the TV's controls to make it pop ridiculously, or not.

MadVR processes HDR in the same way as TVs do. The difference? You can set up any nits you want on the INPUT, as well as compressing, limiting, preserving hue and so on, UNLIKE an HDR TV, where you can't touch those things.

Understood?
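The highlight limiting and compressing attributed above to MadVR and the Dolby chip can be illustrated with a generic knee/roll-off curve. This is not MadVR's or Dolby's actual algorithm, just a common shape such tone mappers take: luminance below a knee passes through 1:1, and the master's remaining range is squeezed into the display's headroom.

```python
# Generic highlight roll-off from a high-nit master to a lower-nit display.
# NOT MadVR's or Dolby's actual algorithm - an illustrative knee curve:
# below the knee luminance passes through unchanged; above it, the rest of
# the master's range is eased into the display's remaining headroom.

def compress_highlights(nits, display_peak=1000.0, master_peak=10000.0,
                        knee=0.75):
    knee_nits = knee * display_peak           # start rolling off here
    if nits <= knee_nits:
        return nits                           # shadows/midtones untouched
    # map [knee_nits, master_peak] onto [knee_nits, display_peak]
    span_in = master_peak - knee_nits
    span_out = display_peak - knee_nits
    t = (nits - knee_nits) / span_in          # 0..1 position above the knee
    return knee_nits + span_out * (1 - (1 - t) ** 2)  # ease-out curve

for n in (100, 750, 4000, 10000):
    print(n, "->", round(compress_highlights(n), 1))
```

With these (assumed) defaults, everything up to 750 nits is untouched, a 4000-nit highlight lands partway up the roll-off, and the 10,000-nit master peak maps exactly to the 1000-nit display peak.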

kolak
11th February 2017, 20:15
How do studios monitor during their grading?

visionplushdr
11th February 2017, 20:18
How do studios monitor during their grading?

With reference HDR monitors. You are actually getting to the point. I'm surprised it's taking you so long to get there.

kolak
11th February 2017, 20:28
Exactly: a monitor which can display 1000 nits and has been calibrated, so there is a 1:1 preview without any additional processing.
Studios want to see the real thing, not just a "limited" version. What would be the point of having reference monitors otherwise?

This is what you can't see. You can't view your masters at 10K nits. All you see is some lower-nits version of them. What if they don't look good at 10K, so that when you have a 10K TV in the future, you won't like them?

Can you explain to me why a 10K master grade will look better on a 1000-nit TV compared to a 1000-nit grade?
The 1000-nit grade will be processed as-is, whereas the 10K one will have to go through some processing, e.g. madVR or the Dolby chip.
I just don't get it: if I can make an almost bespoke version for a given TV's capabilities, there is no way some automatically processed version from a "better master" will look better. It's just not possible. In the end, a TV can display what it can.

Your constant claim that your version looks better via madVR is purely based on the fact that madVR processing is not good for HDR10 masters (but who needs that, as HDR TVs will process them as-is); it doesn't mean your masters are any better. They just look different. You could actually get your 10K madVR-processed signal to look about identical to the original 1K grade. Likewise, any of your 10K grades watched on a 1000-nit TV could be matched by a direct 1K grade. Yes?

visionplushdr
11th February 2017, 20:36
Exactly: a monitor which can display 1000 nits and has been calibrated, so there is a 1:1 preview without any additional processing.
Studios want to see the real thing, not just a "limited" version.
This is what you can't see. You can't see your masters at 10K nits. All you see is some lower-nits version of them. What if it doesn't look good at 10K, so that when you have a 10K TV in the future you won't like it?

Can you explain to me why a 10K master grade will look better on a 1000-nit TV than a 1000-nit grade?
The 1000-nit grade will be displayed as is, whereas the 10K one will have to go through some processing, e.g. madVR or the Dolby chip.

Anyway, I really thought you were getting to the point.

Explain what?

Dolby Vision looks better than HDR10, and it goes through highlight compression, like madVR does with my HDR "standard". And it looks A LOT BETTER THAN HDR10, even on a lower-nits panel than the one playing the HDR10 content.


HIGHER NITS GRADE = HIGHER HDR QUALITY. HDR is not all about pure extreme highlights; the grade in High Dynamic Range content adds deep blacks, HDR movie contrast, more gamut expansion and more movie detail. Highlights are the part where a full native panel's nits throw impressively extreme light; as you may already be aware, a 10,000-nit TV in front of your eyes would burn your brain before your eyes.

Nits are encoded inside the PQ, whose curve peaks at the maximum and allows for the maximum range and all the rest I have already mentioned to you.

You can see how highlights react at high nits without actually burning your brain out.
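
The "curve" being argued about throughout this thread is the SMPTE ST 2084 PQ transfer function, which maps a 0–1 code value to absolute luminance up to 10,000 nits. A minimal sketch in Python (the constants come straight from the spec; the function names are mine):

```python
# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875
PEAK = 10000.0             # PQ is defined up to 10,000 cd/m^2 (nits)

def pq_encode(nits: float) -> float:
    """Absolute luminance in nits -> non-linear PQ code value in [0, 1]."""
    y = max(nits, 0.0) / PEAK
    yp = y ** M1
    return ((C1 + C2 * yp) / (1.0 + C3 * yp)) ** M2

def pq_decode(code: float) -> float:
    """Non-linear PQ code value in [0, 1] -> absolute luminance in nits."""
    cp = max(code, 0.0) ** (1.0 / M2)
    num = max(cp - C1, 0.0)
    den = C2 - C3 * cp
    return (num / den) ** (1.0 / M1) * PEAK
```

The point both sides keep circling: a 1000-nit grade only ever uses code values up to roughly pq_encode(1000) ≈ 0.75, while a 10,000-nit grade can use the whole range; pq_encode(100) ≈ 0.51 shows how much of the curve is spent on low luminances.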

kolak
11th February 2017, 20:44
Yes, but whatever you do is always LIMITED by the actual TV's capabilities. And if anything could be the best, it would be an actual grade for the specific TV. Studios did such tests with the old Dolby SDR reference TV, using its native gamut etc., just to see how good video could look. In the real world you have to stick with standards.

visionplushdr
11th February 2017, 20:45
But this doesn't mean your masters are any better. They just look different. You could actually get your 10K madVR-processed signal to look about identical to the original 1K grade. Likewise, any of your 10K grades watched on a 1000-nit TV could be matched with a direct 1K grade. Yes?


Not different: higher PQ. Which means higher range, higher gamut, deeper blacks, higher everything.

A 10,000-nit graded video in PQ has higher HDR quality, while 1K-graded HDR is already limited and can't be fixed by HDR processing.

A higher-nits grade shows everything improved, however you want to understand it.

The PQ is the standard whose curve peaks at 10,000 nits; of course a higher-nits graded video is going to be a lot better than an already-capped 1K graded video.

It's just common sense as well; I'm surprised you don't know that by now.

visionplushdr
11th February 2017, 20:48
Yes, but whatever you do is always LIMITED by the actual TV's capabilities. And if anything could be the best, it would be an actual grade for the specific TV. Studios did such tests with the old Dolby SDR reference TV, using its native gamut etc., just to see how good video could look. In the real world you have to stick with standards.

No it doesn't. It's not limited.

If you see an image that has been graded at 10,000 nits on the PQ curve on a 1000-nit panel, it will look far more realistic in every aspect than the same frame rendered and graded with a 1K ceiling.

Do you want to test it? Get an HDR10 frame and watch it on a 1000-nit TV. Then get HDR1000 (or VPLUS1000) and watch it on a 1000-nit TV.

The difference is abysmal. The only thing that gets "limited" is the highlights; the rest is improved. You don't get it?

Even when the highlights are compressed because the grade is higher, they still look more realistic everywhere, because of reflections and such.

Why don't you just watch a Dolby Vision movie and then watch any HDR10 movie on the best HDR10 TV? Then come back and tell me which looks best.

kolak
11th February 2017, 20:53
Whatever you throw into your 10K master meets the TV's capabilities at the end. If the TV has x deepest black point, z whitest peak and e.g. 95% of the P3 gamut, that's it. Whether you feed it a 10K grade or a 1K grade, none of the values will go outside the TV's capabilities, so I don't get how your 10K grade can have a better gamut. What is in the 10K grade that allows you to have colors which can't be included in the 1K one? Maybe I don't get it, but as far as I understand, in both cases I can have the P3 gamut plus the final restriction due to TV capabilities.

visionplushdr
11th February 2017, 20:55
Whatever you throw into your 10K master meets the TV's capabilities at the end. If the TV has x deepest black point, z whitest peak and e.g. 95% of the P3 gamut, that's it. Whether you feed it a 10K grade or a 1K grade, none of the values will go outside the TV's capabilities, so I don't get how your 10K grade can have a better gamut. What is in the 10K grade that allows you to have colors which can't be included in the 1K one? Maybe I don't get it, but as far as I understand, in both cases I can have the P3 gamut plus the final restriction due to TV capabilities.

The regrade has higher gamut expansion due to the expanded PQ curve. This is actually basic.

When you watch a limited-nits PQ video, everything gets limited, not only the gamut. The curve is much lower; you can't get the gamut expanded on a limited curve.

Do you even know what HDR is about?

The PQ works with BT.2020 for a reason.

What if you watch HDR10 with just BT.709? What happens? It will look like crap.

BT.2020 allows the PQ to expand the curve to the max, including the gamut.

kolak
11th February 2017, 20:56
No it doesn't. It's not limited.

If you see an image that has been graded at 10,000 nits on the PQ curve on a 1000-nit panel, it will look far more realistic in every aspect than the same frame rendered and graded with a 1K ceiling.

Do you want to test it? Get an HDR10 frame and watch it on a 1000-nit TV. Then get HDR1000 (or VPLUS1000) and watch it on a 1000-nit TV.

The difference is abysmal. The only thing that gets "limited" is the highlights; the rest is improved. You don't get it?

Even when the highlights are compressed because the grade is higher, they still look more realistic everywhere, because of reflections and such.

Why don't you just watch a Dolby Vision movie and then watch any HDR10 movie on the best HDR10 TV? Then come back and tell me which looks best.

If this is how it works, then it means the standard/TV processing chain is crap and imposes restrictions (due to current TVs' "small" capabilities) by supporting too small a part of a too-big standard (or the masters are bad).

kolak
11th February 2017, 20:59
The regrade has higher gamut expansion due to the expanded PQ curve. This is actually basic.

When you watch a limited-nits PQ video, everything gets limited, not only the gamut. The curve is much lower; you can't get the gamut expanded on a limited curve.

Do you even know what HDR is about?

The PQ works with BT.2020 for a reason.

What if you watch HDR10 with just BT.709? What happens? It will look like crap.

BT.2020 allows the PQ to expand the curve to the max, including the gamut.

How expanded? The colours fit into the P3 gamut, and in theory I can have all the colors from this gamut in both a 10K grade and a 1K grade.

You have to explain it some other way. How expanded? Where will you get more colors from (which the TV can't display anyway, so you won't see them)?
Sorry, but it makes no sense to me :)

visionplushdr
11th February 2017, 21:00
If this is how it works, then it means the standard/TV processing chain is crap and imposes restrictions (due to current TVs' "small" capabilities) by supporting too small a part of a too-big standard (or the masters are bad).

And that's what I was trying to get you to understand from the start.

It's not just that; HDR is being used as money-making magic of sorts.

Even today I can't believe how some 4K Blu-rays look against the crappy HDR10 version, which is extremely limited, and even then only in DCI output.

It's about the market: they needed HDR, they got it, and of course they won't show you the best it can look right from the start.

kolak
11th February 2017, 21:02
I know this perfectly, and this is exactly what I wanted to tell you next. Studios/TV manufacturers never put out the best straight away.

visionplushdr
11th February 2017, 21:03
How expanded? The colours fit into the P3 gamut, and in theory I can have all the colors from this gamut in both a 10K grade and a 1K grade.

You have to explain it some other way. How expanded? Where will you get more colors from (which the TV can't display anyway, so you won't see them)?
Sorry, but it makes no sense to me :)

No, you don't. A gamut on a low PQ curve looks one way. The same gamut on a high PQ curve looks better.

Can you go outside and look at a flower in full sun, and then look at the same flower when there's almost no sun?

Which looks more colorful, in the same GAMUT... of REAL LIFE? Which is a little better than "BT.9055" (?)

That is what HDR is trying to "simulate": a higher PQ curve that allows everything to look closer to real life.

visionplushdr
11th February 2017, 21:07
I know this perfectly, and this is exactly what I wanted to tell you next. Studios/TV manufacturers never put out the best straight away.

I really have doubts about this claim. They know how to make it best, they're just holding it back; worse, they have already released, and keep making, hundreds of HDR10 movies and TVs.

That's why they're now releasing ST 2094, to allow some small improvement on the same crap.

kolak
11th February 2017, 21:14
No, you don't. A gamut on a low PQ curve looks one way. The same gamut on a high PQ curve looks better.

Can you go outside and look at a flower in full sun, and then look at the same flower when there's almost no sun?

Which looks more colorful, in the same GAMUT... of REAL LIFE? Which is a little better than "BT.9055" (?)

That is what HDR is trying to "simulate": a higher PQ curve that allows everything to look closer to real life.

Yes, but that's the brightness, which stimulates your eyes differently (this was the first thing I noticed on the Dolby 2K-nit monitor). If the amount of colors displayed by a TV is fixed (which it is), and the maximum brightness also, then the only way a 10K grade can look better than a 1K grade is by having higher average scene brightness.
This brings me back to the point that you could go back and do a 1K grade which would match the processed 10K grade. There is actually a possibility that the 10K grade may suffer from the 10-bit signal, as that may not be enough and could cause banding and posterisation.

The only thing which could still somehow screw this up is if the processing inside the TV is optimised for the full 10K standard and can't do justice to such a "low" 1K signal. I'm just not convinced that this is the case.

I'm still not convinced whether HDR10 is the issue, or whether the real issue is that your 10K processed masters end up with much higher average light, which is why they look better.
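
The 10-bit banding worry can be put in numbers. With narrow-range 10-bit video (luma codes 64–940 carrying the 0–1 PQ signal, per the usual quantization convention, which I'm assuming here), the luminance gap between adjacent codes grows steeply toward the peak. A rough sketch with the ST 2084 constants:

```python
# SMPTE ST 2084 (PQ) constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(code: float) -> float:
    """PQ code value in [0, 1] -> absolute luminance in nits (ST 2084)."""
    cp = max(code, 0.0) ** (1.0 / M2)
    return (max(cp - C1, 0.0) / (C2 - C3 * cp)) ** (1.0 / M1) * 10000.0

def code10_to_nits(code: int) -> float:
    """10-bit narrow-range luma code (64..940) -> nits."""
    return pq_eotf((code - 64) / 876.0)

def step_nits(code: int) -> float:
    """Luminance gap between this 10-bit code and the next one."""
    return code10_to_nits(code + 1) - code10_to_nits(code)
```

Near the 10,000-nit end each code step spans on the order of a hundred nits, while at typical scene levels it is around a nit or less, which is exactly the distribution PQ was designed to have relative to contrast sensitivity.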

visionplushdr
11th February 2017, 21:20
Yes, but that's the brightness, which stimulates your eyes differently (this was the first thing I noticed on the Dolby 2K-nit monitor). If the amount of colors displayed by a TV is fixed (which it is), and the maximum brightness also, then the only way a 10K grade can look better than a 1K grade is by having higher average scene brightness.
This brings me back to the point that you could go back and do a 1K grade which would match the processed 10K grade.

The only thing which could still somehow screw this up is if the processing inside the TV is optimised for the full 10K standard and can't do justice to such a "low" 1K signal. I'm just not convinced that this is the case.

The gamut looks better; I just don't know what TV or monitor you are using to watch the image I just posted of Scarlett's face.

MadVR works like a TV: you optimise the madVR settings for the 10K output. When you play back the HDR10 Lucy movie at, let's say... 1500 nits, the same as in that image, not only will the gamut be a hell of a lot better in the higher-nits grade, but everything else as well.

If you play both at a 1000-nit output, the high-nits grade is going to make the 1K grade look like a DVD.

It looks like you didn't watch Lucy or any other HDR10 movie. Even from the image you can sort it out.

kolak
11th February 2017, 21:26
Even more: if your 10K grade processed down to 1K looks better than the original 1K (and you did not change colors etc.), then the only reason for it will be that your average light is much higher. The TV-processing argument is eliminated here because in both cases we are peaking at the same 1K.

Can you tell me what the average light is for some original 1000-nit movie compared to your version (limited to 1K)? I bet yours is much higher.
Try matching them and then there should be almost no difference between the two versions.

visionplushdr
11th February 2017, 21:30
Even more: if your 10K grade processed down to 1K looks better than the original 1K (and you did not change colors etc.), then the only reason for it will be that your average light is much higher. The TV-processing argument is eliminated because in both cases we are peaking at the same 1K.

Can you tell me what the average light is for some original 1K movie compared to your version (limited to 1K)? I bet yours is much higher.
Try matching them and then there should be almost no difference between the two versions.

Well then here's an uncompressed HDR10 output in DCI:
https://extraimage.net/images/2017/02/11/7dbaa0cfa4d0399cf3224c2fa6293b97.md.png (https://extraimage.net/image/2UrH)

And here's the 5000-nit output from the 10K grade, in the same DCI and with everything else as before:
https://extraimage.net/images/2017/02/11/a109438784e5507a77ac9ce41180a316.md.png (https://extraimage.net/image/2UrF)


Which looks better? You are making this really hard.

kolak
11th February 2017, 21:33
I just asked you:

Can you tell me what the average light is for some original 1K movie compared to your version (limited to 1K)?

visionplushdr
11th February 2017, 21:39
I just asked you:

Can you tell me what the average light is for some original 1K movie compared to your version (limited to 1K)?

709 vs 3941.

The HDR10 is uncompressed, therefore you see it at its best in that picture.

The 10K image is at 5000 out of 10,000 with 0% preserve hue, compressed. Do you want me to set it at 10,000?

Do you realize an uncompressed HDR output looks the best it can?

Did you compare the gamut? The HDR10 looks horrible.

Also, why are you talking about light when, set at 5000, it's already way above the 3941 of that exact frame?

kolak
11th February 2017, 21:40
No, to 1000, to see how it compares to the original.

As I said, I'm just trying to establish whether your processed 10K master, limited to 1000 nits, has a much higher average scene light compared to an original 1K master.

visionplushdr
11th February 2017, 21:49
No, to 1000, to see how it compares to the original.

As I said, I'm just trying to establish whether your processed 10K master, limited to 1000 nits, has a much higher average scene light compared to an original 1K master.

At 1000 nits it will be extremely compressed, and it depends on how good madVR is at this point. Same with a Dolby chip. And this is where the madVR updates you mentioned can make things look better and better.

Still, here's your 1000 nits output:

https://extraimage.net/images/2017/02/11/051cdf4218df1af104f27c6595ac0c2a.md.png (https://extraimage.net/image/2UGs)

If you know how to judge gamut/palette, you will already have noticed it.
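
The "compression" both sides keep invoking is tone mapping: squeezing a grade mastered for a high peak into a display with a lower one. This is not madVR's actual algorithm (which is far more sophisticated and changes between builds), just a minimal soft-knee sketch of the idea: luminance below a knee passes through 1:1, and everything between the knee and the mastering peak is rolled off into the remaining display headroom:

```python
def tonemap_nits(y: float, peak_in: float = 10000.0,
                 peak_out: float = 1000.0, a: float = 4.0) -> float:
    """Map scene luminance y (nits) from a peak_in master onto a peak_out
    display. Below the knee the signal is untouched; above it, a simple
    rational roll-off compresses [knee, peak_in] into [knee, peak_out].
    `a` controls how soft the shoulder is. Illustrative only."""
    knee = 0.75 * peak_out          # start compressing at 75% of display peak
    if y <= knee:
        return y
    t = (min(y, peak_in) - knee) / (peak_in - knee)   # 0..1 over the shoulder
    return knee + (peak_out - knee) * t * (1.0 + a) / (1.0 + a * t)
```

With these numbers, everything below 750 nits survives untouched, while the whole 750–10,000 range is squeezed into 750–1000, which is why detail in extreme highlights suffers the most.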

kolak
11th February 2017, 21:54
How does the average light compare to the original for this scene?
madVR shows the original value for the 10K version.

It doesn't look good. How do you watch your movies then? What do you limit them to?

visionplushdr
11th February 2017, 21:56
How does the average light compare to the original for this scene?
madVR shows the original value for the 10K version.

You have it measured. It's a high-nits grade; of course it will have higher average light everywhere in the same scene/frame, and that's the whole point.

A 10,000-nit graded video needs extremely precise HDR processing to output at 1000 nits. That's why Dolby Vision uses a chipset, even though those movies are just 4000 nits.

MadVR is powerful enough to render 10K-graded HDR movies at 2000/5000 and show a perfect output.

visionplushdr
11th February 2017, 21:58
How does the average light compare to the original for this scene?
madVR shows the original value for the 10K version.

It doesn't look good. How do you watch your movies then? What do you limit them to?

I use 4000 most of the time and, if I want, I set it higher and use TV presets to do whatever I want to the image. It will always look a LOT better than any commercial HDR movie, including Dolby Vision ones.

kolak
11th February 2017, 22:07
So you transfer to 10K and limit to 4K in madVR. On top of this the TV will do its own work. Is it much brighter on your TV compared to the original?

visionplushdr
11th February 2017, 22:11
So you transfer to 10K and limit to 4K in madVR. On top of this the TV will do its own work. Is it much brighter on your TV compared to the original?

The original looks dull compared to a 4000-nit output, or even a 6000-nit one.

Details, blacks, range and gamut are also improved at those settings compared to the original.

Let's say you get a scene in HDR10 where you see the sky with a lot of sunlight and clouds... in HDR10 I see it dull and over-exposed. I see no details. The contrast is crap. The gamut as well. Honestly, HDR10 looks to me almost like SDR with a subtle grade.

At, let's say, 5000 nits from the 10K grade, the clouds show perfectly, with even reflections and "body". More gamut is seen, in a more natural way as well. You see highlights that are even more realistic, and the blacks are even deeper.

That's the difference on the same TV, with the same TV output/preset, between watching an HDR10 grade and a 10K one.

kolak
11th February 2017, 22:17
What is your TV?

visionplushdr
11th February 2017, 22:21
What is your TV?

I have more than one. I like HDR content on VA panels; the 9800's output is pretty amazing.

And by the way, something I almost forgot to mention.

The images you saw had +14 saturation on both versions. This means the HDR10 looks even worse, and it also means the 10K grade has a higher gamut, as you can see the color is still in a natural state. I'm sure you know what that means.

Gamut expansion.

If you want, I can show you more sets where HDR10 at 0% saturation gets devastated, but that's a bit harsh on the HDR10 standard.

But whatever.

Uncompressed HDR10 0% Sat against 4000 nits 0% Sat.

https://extraimage.net/images/2017/02/11/a237ac27323a8ddf82b0d32515a05176.md.png (https://extraimage.net/image/2UP2)
https://extraimage.net/images/2017/02/11/b5eac73fb7b7891581195dfaf38549e4.md.png (https://extraimage.net/image/2UPw)

It feels like in HDR10 Scarlett is some sort of zombie.

And peaking at 10,000 nits doesn't mean every scene sits at a lot of nits. A correct grade shows the right levels of "nits" for each scene type.

kolak
12th February 2017, 00:30
Top one is not necessarily better.

WhatZit
12th February 2017, 02:11
Which looks better? You are making this really hard.

Are you familiar with William Friedkin?

Specifically, are you familiar with what William Friedkin did with his DVD/Blu-ray "remasters"?

I mention this because... he is YOU!

With the very tangible difference that Friedkin had a 10-year head start over anything and everything that you're currently doing, and that he'd already generated a back catalog of significant cinematic credibility before he did it!

His story is actually a case study in what happens when you do precisely what YOU'RE doing to films that people remember from their cinema experiences.

So, if you want to gauge the reaction to the exact same sort of alterations that YOU are making to original sources, you only need to google the public reactions to Friedkin's over-zealous application of new video technologies to original film stock.

Now, going towards the OTHER end of the spectrum, are you familiar with how James Cameron oversees his remasters...?

CruNcher
12th February 2017, 02:30
Top one is not necessarily better.

In terms of light transfer it is: subsurface scattering in combination with indirect lighting would not react that way.

In terms of the director, it's not necessarily what he wants you to see and feel, though ;)

I actually wonder which camera and sensor this scene, and specifically this frame, was captured with :)

Sony, ARRI or RED?

visionplushdr
12th February 2017, 05:32
Top one is not necessarily better.

It's a lot better and that's how i liked it. Which means a lot better to me, as how i setup the output. You forgot you can tweak , or better said... setup and calibrate the HDR output to your wishes. But not from the crappy HDR10, which is already limited. From an "uncapped" PQ video. Exactly as what happens with any content, where's not limited.

Guess what would happen when watching a limited BT 709 SDR video, the same goes for HDR10 and the rest of the PQ content, it's native and fully available to the final user when using the whole PQ.

visionplushdr
12th February 2017, 05:33
In terms of light transfer it is Subsurface Scattering in combination with indirect lighting would not react that way.

In terms of the Director not necessarily what he want's you to see and feel though ;)

I wonder actually from which Camera and Sensor this scene and specifically Frame was captured :)

Sony, ARRI or RED ?

What are you talking about? Unbelievable. You are talking about light, and there's no light at all in the HDR10, because it's completely dull and looks like garbage. More than unbelievable: incredible.

visionplushdr
12th February 2017, 05:36
Are you familiar with William Friedkin?

Specifically, are you familiar with what William Friedkin did with his DVD/Blu-ray "remasters"?

I mention this because... he is YOU!

With the very tangible difference that Friedkin had a 10-year head start over anything and everything that you're currently doing, and that he'd already generated a back catalog of significant cinematic credibility before he did it!

His story is actually a case study in what happens when you do precisely what YOU'RE doing to films that people remember from their cinema experiences.

So, if you want to gauge the reaction to the exact same sort of alterations that YOU are making to original sources, you only need to google the public reactions to Friedkin's over-zealous application of new video technologies to original film stock.

Now, going towards the OTHER end of the spectrum, are you familiar with how James Cameron oversees his remasters...?

WHAT.

What remaster? :eek: There are no remasters here. Just the unlimited PQ, in which the movie is encoded. Read above: encode a video in half of BT.709 SDR (?), then encode a video in correct BT.709 SDR. Is that a remaster? No. It's the correct way to use a format/standard.

The fact that you are watching limited HDR content doesn't mean it's right.

And my movies work fine on Dolby Vision TVs [via USB they turn on BT.2020 and HDR processing natively from the Dolby chip], but not on HDR10 ones. Guess why?

I'm out.

WhatZit
12th February 2017, 06:24
You are also talking about light and there's no light at all in the HDR10 ...

Oh, no, Guyz! He's discovered Hollywood's greatest secret!

"Lucy" was the FIRST EVER film to employ psychophoton technology, where every scene is filmed in PITCH DARKNESS, but with the whole environment being rendered using the electroencephalography of an MKUltra Remote Viewer.

In turn, these recorded mind control emissions are pumped back into the viewing audience (using converted microwave ovens) so that everyone THINKS they're watching "Lucy" on the screen with their eyes, when, in fact, they're re-visualising the whole thing entirely inside their brain's synapses.

Try it yourself: put on a tinfoil hat, and go to any screening of "Lucy". You'll be asking "When does the film start?" while everyone else is going "Wow! Did you see that?!"

P.S. Many Bothans died to keep the secret that there were no visible light photons used in the production of "Lucy", and now Vision just blurts it out... sheesh!

P.P.S. Please, no-one start talking about Maximum Content Light Level and Maximum Frame Average Light Level, and definitely don't mention Candela/m², otherwise Vision's head will explode, and that'd be a waste of a good tinfoil hat, wouldn't it?

Blue_MiSfit
12th February 2017, 08:57
You guys need to chill out :D

This is ridiculous. There's 6 pages of this back and forth crap in less than 2 weeks.

This thread is supposed to be about ENCODING HDR content, not grading / transferring / remastering content.

Knock it off.

visionplushdr
12th February 2017, 15:19
Oh, no, Guyz! He's discovered Hollywood's greatest secret!

"Lucy" was the FIRST EVER film to employ psychophoton technology, where every scene is filmed in PITCH DARKNESS, but with the whole environment being rendered using the electroencephalography of an MKUltra Remote Viewer.

In turn, these recorded mind control emissions are pumped back into the viewing audience (using converted microwave ovens) so that everyone THINKS they're watching "Lucy" on the screen with their eyes, when, in fact, they're re-visualising the whole thing entirely inside their brain's synapses.

Try it yourself: put on a tinfoil hat, and go to any screening of "Lucy". You'll be asking "When does the film start?" while everyone else is going "Wow! Did you see that?!"

P.S. Many Bothans died to keep the secret that there were no visible light photons used in the production of "Lucy", and now Vision just blurts it out... sheesh!

P.P.S. Please, no-one start talking about Maximum Content Light Level and Maximum Frame Average Light Level, and definitely don't mention Candela/m², otherwise Vision's head will explode, and that'd be a waste of a good tinfoil hat, wouldn't it?

Lol. You really have mental issues. Lucy, like any other movie, works and looks even better on Dolby Vision TVs than the Dolby Vision movies do. Again, guess why?

And the Maximum Content Light Level and MaxFALL are inside the metadata for the TV standards, which has NOTHING to do with the grading in the PQ. Why? MaxCLL and MaxFALL are meant, for example, to let the TV decide whether or not to turn on its HDR processing, depending on your CONTENT and how you graded it. For example, if you encode a 4000-nit PQ video with a MaxCLL of 1000, HDR will turn on on HDR10 TVs and show an incredibly wrong output. If you instead find the real MaxCLL for your content, which most of the time is extremely easy to do since you're supposed to have graded the video, and set it to the higher number, it will not turn on on HDR10 TVs, because the software inside them can only handle HDR10-graded movies.
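
For reference, MaxCLL and MaxFALL (the HDR10 static-metadata values carried per CTA-861 alongside SMPTE ST 2086) are measured from the content itself: MaxCLL is the brightest single pixel anywhere in the program, MaxFALL the highest per-frame average. A toy sketch over per-pixel luminance in nits (simplified: real measurement tools work per colour component, taking max(R, G, B) per pixel):

```python
from typing import List, Tuple

Frame = List[List[float]]   # 2-D grid of per-pixel luminance in nits

def maxcll_maxfall(frames: List[Frame]) -> Tuple[float, float]:
    """Return (MaxCLL, MaxFALL) for a sequence of frames.
    MaxCLL: brightest pixel anywhere in the program.
    MaxFALL: highest per-frame average luminance."""
    maxcll = 0.0
    maxfall = 0.0
    for frame in frames:
        pixels = [p for row in frame for p in row]   # flatten the frame
        maxcll = max(maxcll, max(pixels))
        maxfall = max(maxfall, sum(pixels) / len(pixels))
    return maxcll, maxfall
```

This is also why "graded to a 10,000-nit peak" and "every scene is bright" are different claims: a single specular glint can set MaxCLL near the peak while MaxFALL stays low.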

Please find another hobby. Maybe camping.

visionplushdr
12th February 2017, 15:21
You guys need to chill out :D

This is ridiculous. There's 6 pages of this back and forth crap in less than 2 weeks.

This thread is supposed to be about ENCODING HDR content, not grading / transferring / remastering content.

Knock it off.

Indeed, and why does encoding HDR content get a whole thread if you can do it in a millisecond? I just don't get why there's a whole thread on encoding HDR content; anybody can encode HDR content, the hard part is making the HDR content.

And that's why nobody here is showing their HDR content: because they don't know how to make it. You can even learn how to encode HDR from a Google search; if this thread is only meant for "encoding HDR", it's nonsense.

Have a great day.

CruNcher
12th February 2017, 15:43
What are you talking about? Unbelievable. You are also talking about light and there's no light at all in the HDR10 because it's completely dull and looks like garbage. More than unbelievable, incredible.

Easy, I paid you a compliment on the physically more correct output of that rim light ;)

Also, this is pretty nice if you expand the gamut to the maximum

https://extraimage.net/image/2UYn

That's what HDR is about: making every small detail better visible through the PQ :)

It appears very dark in SDR BT.709 (wrong skin/hair response) and very bright in HDR BT.2020 (DCI-P3) (correct skin/hair response).

visionplushdr
12th February 2017, 15:46
Easy, I paid you a compliment on the physically more correct output result ;)

Did you watch Lucy in SDR? Please post the same scene in SDR.

https://www.youtube.com/watch?v=UX37Dw1tqUw

Do you want me to post an image?

https://extraimage.net/images/2017/02/12/a9dcb9dbfbda06d54108c8b732f1b17f.md.jpg (https://extraimage.net/image/2z2r)

Closer now, to see how the "subsurface scattering in combination with indirect lighting" works:

https://extraimage.net/images/2017/02/12/48b357f4db5a2217dc9aee88c038953a.md.jpg (https://extraimage.net/image/2z2G)

Uncompressed HDR10 0% Sat against 4000 nits 0% Sat.

https://extraimage.net/images/2017/02/11/a237ac27323a8ddf82b0d32515a05176.md.png (https://extraimage.net/image/2UP2)
https://extraimage.net/images/2017/02/11/b5eac73fb7b7891581195dfaf38549e4.md.png (https://extraimage.net/image/2UPw)

Anyway, I guess it will take years of HDR threads for the people on doom9 to understand HDR.

And please don't talk about physically correct light when you don't even know what the SDR content looks like.

visionplushdr
12th February 2017, 16:08
Easy, I paid you a compliment on the physically more correct output of that rim light ;)

Also, this is pretty nice if you expand the gamut to the maximum

https://extraimage.net/image/2UYn

That's what HDR is about: making every small detail better visible through the PQ :)

That whole interior should be much brighter, and you should move towards the same skin response as in the rim-light scene

The output from madVR is just a simulation when there's a 1500-nit output from a 10K grade, and worse when you use DCI and your panel can't handle it.

Do you understand that HDR content must be played back through a TV, where you set up/calibrate the input/output depending on your TV?

YouTube/images are meant to show differences in the render, not how it will look on every TV panel, because that is impossible.

What you see in the last posted images, from the 4000-nit VISIONPLUS or even the HDR10, is not how it looks when played back on a TV with the correct TV settings.

The HDR10 will need a lot of help from the TV's processing to look "normal", while the VISIONPLUS will need almost no help, because of the higher-nits grade.

It looks like you really don't know what HDR is or how it works, at all.

From madVR you can get a lot (and believe me, A LOT) of different outputs from the same HDR video.

For example, read the settings:

https://extraimage.net/images/2017/02/12/33ccc051638eacd3f98f8327edaf7255.md.jpg (https://extraimage.net/image/2zNT)

You can calibrate the input and output (TV) to your wishes. It's content, like SDR, but in HDR.

CruNcher
12th February 2017, 16:51
I know. I can only tell you that I even see the dust particles in the light in your madVR tone-mapped simulation. Expanding the gamut and luminance compared with overcompressed SDR is bullshit; that's not what HDR is about, and neither is it about colors. The PQ is mainly about what data you can perceive and how to display it most efficiently; the rest is just marketing ;)
And HLG's transfer function is pretty fine for that; you don't really need Dolby Vision.

With HDR10 you need the hardware to create the result. It's like having a negative that gets displayed in the camera's RAW format, and the industry sells you the projector, currently at 1000 nits peak, which it can't even hold for very long, and only in a very small window at that.

Overall it's designed to keep being sold continuously along with economic progression.

visionplushdr
12th February 2017, 17:38
I know. I can only tell you that I even see the dust particles in the light in your madVR tone-mapped simulation. Expanding the gamut and luminance compared with overcompressed SDR is bullshit; that's not what HDR is about, and neither is it about colors. The PQ is mainly about what data you can perceive and how to display it most efficiently; the rest is just marketing ;)
And HLG's transfer function is pretty fine for that; you don't really need Dolby Vision.

With HDR10 you need the hardware to create the result. It's like having a negative that gets displayed in the camera's RAW format, and the industry sells you the projector, currently at 1000 nits peak, which it can't even hold for very long, and only in a very small window at that.

Overall it's designed to keep being sold continuously along with economic progression.

What tone mapping? You mean 3DLUT HDR processing?

To watch my movies you have to set up at least 4000 nits. My movies work and look perfect on Dolby Vision TVs. Do you know how a Dolby Vision TV processes a higher-nits HDR grade?

So basically you believe Dolby Vision movies are tone mapped, since they receive the same processing as with MadVR when played back on panels with less than 4000 nits.

Dolby Vision movies get played back on panels with even less than 1000 nits, from a grade with a 4000-nit peak light.

HDR10 is not hardware; it's just software inside a TV, exactly like MadVR, with limited capabilities because the input source is almost SDR content. That's why it doesn't even use a chipset.

Dolby Vision TVs have a chipset that works like MadVR's HDR processing, and does it exclusively: a chipset made exactly to process only the HDR content.

It really looks like you have no idea; actually I don't even know why I'm trying to explain this to you.

It's actually hilarious how you try to look like you have some little idea on the matter, but you have graded... zero videos.

Let's do this: get my videos and play them back on a Dolby Vision TV, then grade a video yourself and play it. Are you going to do that? I bet you won't.

CruNcher
12th February 2017, 17:45
The display is the HDR10 hardware; the whole panel defines how HDR gets displayed. It's not only software, especially not when we talk about things like specular reflections and how to represent them in the most efficient way; on such surfaces you need more control than just software.

Dolby Vision is primarily a complete HDR ecosystem you become part of, license or not: software and hardware built up largely on dynamic metadata, so the Hollywood grader can still decide how each scene gets displayed and switch the result whenever he wants.
HDR10 will also get dynamic metadata capabilities soon, pushed especially by Samsung.

visionplushdr
12th February 2017, 17:49
The display is the HDR10 hardware; the whole panel defines how HDR gets displayed. It's not only software, especially not when we talk about things like specular reflections and how to represent them in the most efficient way; on such surfaces you need more control than just software.

No it's not; there is no HDR10 panel. The panel inside an HDR10 TV is the same panel you can find in a TV with no HDR software inside.

Panels are not HDR; they just deliver more or fewer nits in the 2%, 5%, up to 100% peak and sustained windows.

To process HDR you need software inside the TV; panels are not HDR. A panel is just capable of throwing more or less light.

When processing HDR content with a high-nits grade, software is in charge of compressing, preserving, and leveling the input source to the panel's output.

HDR can be seen on a 200-nit 2%-peak panel with the correct HDR processing.

Why would HDR10 "panels" even exist when there are OLEDs, VAs with edge or full-array lighting, even IPS, which go from 600 to 1200 nits depending on the type?
And it's on the OLEDs, with lower nits output, that you get the higher-quality Dolby Vision movies.

DID YOU GET IT? You can watch HDR on any panel. You can even watch HDR images on any panel. Because HDR is just video/image processing.

When you get a panel with a higher peak, an image in HDR will look better. Same with a video. AND THE SAME WITH SDR CONTENT.

Please stop typing.
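For what it's worth, the claim is at least mechanically plausible: showing a 4000-nit grade on a 200-nit panel is a tone-mapping problem. A minimal sketch of the idea follows: a simple knee-plus-roll-off curve, purely illustrative, not madVR's or Dolby's actual algorithm (neither is shown in this thread):

```python
def tone_map(nits: float, src_peak: float, dst_peak: float, knee: float = 0.75) -> float:
    """Map scene luminance in [0, src_peak] into [0, dst_peak]:
    pass-through below the knee, rational roll-off of highlights above it.
    Illustrative only -- not madVR's or Dolby's actual curve."""
    k = knee * dst_peak                  # below this, luminance is untouched
    if nits <= k:
        return nits
    x = (nits - k) / (src_peak - k)      # 0..1 across the compressed region
    return k + (dst_peak - k) * (2.0 * x / (1.0 + x))
```

On a 4000-nit master mapped to a 200-nit panel, shadows and midtones below the 150-nit knee survive untouched; only the top of the range is squeezed into the remaining 50 nits of headroom.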

kolak
12th February 2017, 18:46
Yes, but studios work with 1:1 mapping, so when they make a scene with a 1000-nit peak spot they see it on a monitor which, at that point, will show 1000 nits (which they can measure with a probe).

CruNcher
12th February 2017, 18:58
I know what Dolby Vision's chip does: it tone maps the higher-nits grade down to the capabilities of the TV it's used in, like what MadVR can do.
Panels do or don't have certain capabilities which can enhance the HDR output result; I didn't mean there are specific HDR10 panels, just more and less capable HDR displays.

OLEDs are more expensive, more premium: the perfect target for Dolby licensing, getting the investment costs back with an additional win ;)

visionplushdr
12th February 2017, 19:05
I know what Dolby Vision's chip does: it tone maps the higher-nits grade down to the capabilities of the TV it's used in, like what MadVR can do.
Panels do or don't have certain capabilities which can enhance the HDR output result; I didn't mean there are specific HDR10 panels, just more and less capable HDR displays.

OLEDs are more expensive, more premium: the perfect target for Dolby licensing ;)

The Dolby brand doesn't matter. What makes Dolby Vision movies look better at lower nits is just the grade in the PQ.

I can honestly show you more than 1000 versions of the same frame's output in MadVR, for an HDR10 as well as for a higher-grade version.

You can calibrate anything in HDR, exactly as in SDR. What changes is the type of content.

What I was trying to explain is that you can watch HDR anywhere, just as you can watch SDR anywhere. With a higher nits peak, SDR can and will also look better.

With HDR you have a higher range and all the rest of the improvements HDR gives to video, and you can set up and calibrate the input/output to whatever you like.

You just start with a higher-range video.

You can make HDR content from MadVR output at anywhere from 200 to 10,000 nits, then preserve, compress, or restore details. You can calibrate the colorspace, and you can saturate or desaturate the gamut if you wish to do so.

You can select a pure power gamma curve from 1.80 to 2.60, you can tweak the input's brightness in Color & Gamma up to +100, the same for contrast, and you also get hue.

MadVR's HDR processing works like a TV. On your TV you can tweak several options to make the output look like complete garbage, a pop-out image, dynamic presets, or even a calibrated preset.

What you don't understand is that I don't remaster anything at all. The colors are exactly the same, with a higher grade in the PQ.
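The controls being described (a pure power gamma and a saturation slider) are simple per-pixel operations. A hedged sketch of both, assuming BT.2020 luma weights and a -100..+100 saturation scale like the slider mentioned above; the order in which madVR actually applies them is an assumption here:

```python
def grade_pixel(rgb, gamma: float = 2.2, saturation: float = 0.0):
    """Scale chroma around BT.2020 luma, then encode with a pure power curve.
    rgb is an (r, g, b) tuple of linear-light values in 0..1.
    Order of operations is an assumption, not madVR's documented pipeline."""
    r, g, b = rgb
    y = 0.2627 * r + 0.6780 * g + 0.0593 * b    # BT.2020 luma weights
    s = 1.0 + saturation / 100.0                # -100 -> grayscale, 0 -> unchanged
    saturated = (min(max(y + s * (c - y), 0.0), 1.0) for c in (r, g, b))
    return tuple(c ** (1.0 / gamma) for c in saturated)
```

With `saturation=0` only the power curve is applied; with `saturation=-100` every channel collapses to the luma value, i.e. grayscale, which is the same scaling idea (in the other direction) as the BT.2020 expansion described in this thread.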

visionplushdr
12th February 2017, 19:09
Yes, but studios work with 1:1 mapping, so when they make scene with 1000 nits peak spot they see it on monitor which at this point will show 1000 nits (which they can measure with probe).

There's no 1:1 mapping at all. What 1:1 are you talking about when Dolby Vision movies are graded at a 4000-nit peak and played back on 600-nit OLEDs?

Not every HDR10 TV is 1000 nits either; all the worse in the case of HDR10 movies, which are actually a scam.

You can grade up to 10,000 nits while measuring and testing the output levels/peaks perfectly fine, and also measure the SAME content at fewer nits, where your display can actually show it. If well done, it looks FINE on every panel.
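Measuring a grade's peak, as described above, comes down to the inverse of the ST 2084 (PQ) transfer: find the highest code value present in the stream, or ask which code value a target peak should hit. A self-contained sketch of the inverse direction, using the standard constants:

```python
# Inverse PQ (SMPTE ST 2084): luminance in nits -> normalized code value.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_inverse(nits: float) -> float:
    """Encode absolute luminance (0..10000 nits) as a PQ code value (0..1)."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1.0 + C3 * y)) ** M2
```

A 1000-nit peak sits near code 0.75 and a 4000-nit peak near 0.90, so a "4000-nit grade" only occupies about the next 15% of the code range beyond a "1000-nit grade"; the difference in stored signal is far smaller than the nit numbers suggest.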

kolak
12th February 2017, 19:21
It's a lot better and that's how I liked it. Which means a lot better to me, given how I set up the output.

Well, this is the point which you can't understand or refuse to acknowledge. It's nice for you, but the rest of the world doesn't give a crap about your taste. The correct version is the way the studio graded it, as this is what the director approved, and if you use 10 different reference TVs it will look about the same on all of them.
The way it looks in people's homes is always a compromise, as TVs don't keep to standards properly and cheaper ones are simply quite poor.
You can spend years making your own versions. I can do the same, and so can everyone else (even just with TV settings). It's irrelevant to the studio's work.

visionplushdr
12th February 2017, 19:27
Well, this is the point which you can't understand or refuse to acknowledge. It's nice for you, but the rest of the world doesn't give a crap about your taste. The correct version is the way the studio graded it, as this is what the director approved, and if you use 10 different reference TVs it will look about the same on all of them.
The way it looks in people's homes is always a compromise, as TVs don't keep to standards properly and cheaper ones are simply quite poor.
You can spend years making your own versions. I can do the same, and so can everyone else (even just with TV settings). It's irrelevant to the studio's work.

There's no remaster. No color change. What you see when I show the output in MadVR is just the config I used for that output.

You still don't get it, man?

https://extraimage.net/images/2017/02/12/25ba13190f60abd01845f20cb6f34e4d.md.jpg (https://extraimage.net/image/2zcX)

That's freaking 200 nits with 0% preserve hue, DCI output and 30 saturation, from a "Dolby Vision peak" grade.

Do you want me to show you this same scene 1000 times with different gamma curves, saturation, compression, preserve at 0 to 100%, and so on? It would take a long time.

You can freaking make the output look exactly as you want, and exactly as the director intended, because it is the SAME MOVIE with a higher PQ range.

Nothing is changed. You just can't really understand what HDR is and what PQ is in the most-used standard.

Believe me, this is very relevant to studio work, as it's exactly what they are going to offer you later. Or now, with Dolby Vision, as in this last image.

kolak
12th February 2017, 19:28
There's no 1:1 mapping at all. What 1:1 are you talking about when Dolby Vision are 4000 nits peak and playback in 600 nits Oleds.

Not every HDR10 tv is 1000 nits either. For the worse in the case of HDR10 movies that are actually a scam.

You can grade up to 10.000 nits with measuring and testing the output levels/peaks perfectly fine and also measure the SAME content in less nits where your display can actually throw it. If well done, it looks FINE in every panel.

Yes, but studios don't grade to 4K nits if they can't see a 1:1 version of it. They will do it once they get a Dolby 4K-nit monitor. Most grade to 1K nits, as that's what they can reliably monitor (at 1:1), and because that's where home technology is.

Dolby can be monitored at other than 1:1, but in this case it's more controlled, as Dolby enforces a fairly fixed pipeline for all devices (so you know what to expect).

It's very different from what you do: a 10K grade, limited to X or Y or Z by madVR and limited further by every TV. This is far from the 1:1 preview used by studios. I don't argue that you can't do it your way.

visionplushdr
12th February 2017, 19:31
Yes, but studios don't grade to 4K nits if they can't see a 1:1 version of it. They will do it once they get a Dolby 4K-nit monitor. Most grade to 1K nits, as that's what they can reliably monitor (at 1:1), and because that's where home technology is.

Dolby can be monitored at other than 1:1, but in this case it's more controlled, as Dolby enforces a fairly fixed pipeline for all devices (so you know what to expect).

It's very different from what you do: a 10K grade, limited to X or Y or Z by madVR and limited further by every TV. This is far from the 1:1 preview used by studios. I don't argue that you can't do it your way.

Home technology is 4000 nits? Then tell Dolby they are noobs.

I do it my way and it looks fine on every "HDR TV". A lot of people have claimed my movies look even better than what they'd seen before. A lot. So why do my movies look perfectly fine, with nobody seeing wrong colors or a "fan edit" feel on any movie?

Now with HDR10 sources things get even easier.

kolak
12th February 2017, 19:32
There's no remaster. No color change. What you see when I show the output in MadVR is just the config I used for that output.

You still don't get it, man?

https://extraimage.net/images/2017/02/12/25ba13190f60abd01845f20cb6f34e4d.md.jpg (https://extraimage.net/image/2zcX)

That's freaking 200 nits with 0% preserve hue, DCI output and 30 saturation, from a "Dolby Vision peak" grade.

Do you want me to show you this same scene 1000 times with different gamma curves, saturation, compression, preserve at 0 to 100%, and so on? It would take a long time.

You can freaking make the output look exactly as you want, and exactly as the director intended, because it is the SAME MOVIE with a higher PQ range.

Nothing is changed. You just can't really understand what HDR is and what PQ is in the most-used standard.

Believe me, this is very relevant to studio work, as it's exactly what they are going to offer you later. Or now, with Dolby Vision, as in this last image.

I get your 10K master and have X brand HDR TV.
How do I watch it?

visionplushdr
12th February 2017, 19:34
I get your 10K master and have X brand HDR TV.
How do I watch it?

As of today you can watch any of my movies on updated HDR10 TVs or Dolby Vision TVs.

Any of my movies works directly, natively over USB, on any Dolby Vision TV. Are you reading me or what?

If you want "higher" tweaking options you use MadVR.

The last image is not 10K nits; it's a 4000-nit-peak video, just like Dolby Vision movies. That's why it works fine at 200 nits in MadVR's output processing.

kolak
12th February 2017, 19:36
Home technology is 4000 nits? Then tell Dolby they are noobs.

Where did I say that?
4000 nits is not going to happen at home for the next few years. Too much energy usage.
Even in as controlled an environment as a studio, Dolby had massive issues making such a TV: huge cooling problems and energy consumption. That's why there are only a few such TVs, and currently this is all at the prototype stage.
Their 2K-nit TVs are more "normal".

visionplushdr
12th February 2017, 19:37
Where did I say that?
4000 nits is not going to happen at home for the next few years. Too much energy usage.
Even in as controlled an environment as a studio, Dolby had massive issues making such a TV: huge cooling problems and energy consumption. That's why there are only a few such TVs, and currently this is all at the prototype stage.
Their 2K-nit TVs are more "normal".

Yes, for sure. So then they are offering "tone mapped" movies to the public?

kolak
12th February 2017, 19:38
As of today you can watch any of my movies on updated HDR10 TVs or Dolby Vision TVs.

Any of my movies works directly, natively over USB, on any Dolby Vision TV. Are you reading me or what?

If you want "higher" tweaking options you use MadVR.

The last image is not 10K nits; it's a 4000-nit-peak video, just like Dolby Vision movies. That's why it works fine at 200 nits in MadVR's output processing.

Sorry, I'm a Mac user, I have no madVR. I'm a 60-year-old guy who is not good with PCs overall. Also, I'm a content owner, and I don't want to give out unprotected movies on USB.
Your technology is just for you.

visionplushdr
12th February 2017, 19:42
Sorry, I'm a Mac user, I have no madVR. I'm a 60-year-old guy who is not good with PCs overall. Also, I'm a content owner, and I don't want to give out unprotected movies on USB.
Your technology is just for you.

Well, that's another issue, and I completely understand you.

But then we get along better now.

The other user claimed it didn't have "saturation", but I have now explained and shown how HDR works in MadVR or any HDR TV.

This last image I will show is the same 200 nits as before, but with 20 saturation instead of 30.

https://extraimage.net/images/2017/02/12/ca3056739df114e00f75c62cb7c72899.md.jpg (https://extraimage.net/image/2z12)

From a 4000-nit-peak graded video.

kolak
12th February 2017, 19:43
This looks totally crap, so over-processed.
Show it to the director/colorist of Lucy and check if they will approve it :)

Groucho2004
12th February 2017, 19:47
This looks totally crap, so over-processed.

So far, almost everything I've seen from this guy has been overprocessed and oversaturated. That's his taste, fair enough, but what about the endless flood of screenshots which nobody cares about? What a waste of bandwidth.

kolak
12th February 2017, 19:47
In terms of light transfer, that is subsurface scattering; in combination with indirect lighting it would not react that way.

In terms of the director, though, it's not necessarily what he wants you to see and feel ;)

I actually wonder which camera and sensor this scene, and specifically this frame, was captured with :)

Sony, ARRI or RED?

Quite a few cameras were used:

Camera: Arri Alexa XT Plus, Cooke S4 and Fujinon Alura lenses (some scenes)
Red Epic, Cooke S4 and Fujinon Alura lenses (some scenes)
Sony CineAlta F65, Cooke S4, Fujinon Alura and Angenieux Optimo lenses

so it looks like the F65 was the main camera.

visionplushdr
12th February 2017, 19:54
This looks totally crap, so over processed.
Show it to director/colorist of Lucy and check if they will prove it :)

That's 200 nits. Of course it looks over-processed. Are you reading everything I post on here?

600 nits:

https://extraimage.net/images/2017/02/12/6782d4bb0b7ccc1c55beb2730d270f7d.md.jpg (https://extraimage.net/image/2z1s)

Do you really want me to post 1000 images for you to finally get the point?

Showing the director what? How I set up the output? Are you freaking listening to what you're saying?

kolak
12th February 2017, 20:01
Also crap: posterisation and oversaturation, with some spots of discolouring. This is obvious given your quite aggressive processing chain; it will happen on every one of your masters.
Keep working and making your own versions, just don't post every version here, as people are not interested in this garbage, regardless of whether it's good or bad.

kolak
12th February 2017, 20:09
Now that HDR10 can be "ripped" here's my work against commercial HDR10: http://screenshotcomparison.com/comparison/200268

VISIONPLUSHDR-1000 ( up to 10K Nits Re Grade )
https://extraimage.net/images/2017/02/11/97504eef8b10b9f607b4991a310d6715.md.png (https://extraimage.net/image/2UBQ)

COMMERCIAL HDR10 Output
https://extraimage.net/images/2017/02/11/4abef5926ec5b6c41df924f55052d801.md.png (https://extraimage.net/image/2UBL)

In terms of color palette, the visionplus output throws 40% more gamut than the HDR10.



The green bits start looking unnatural. The skin tones also start showing some posterisation and discolouring. Colors are messed up compared to the original (blue tint on the white highlights on the left; the reds are very different). I can tell you, for a colorist these are big differences.
You have to work harder.

visionplushdr
12th February 2017, 20:18
The green bits start looking unnatural. The skin tones also start showing some posterisation and discolouring. Colors are messed up compared to the original (blue tint on the white highlights on the left; the reds are very different). I can tell you, for a colorist these are big differences.
You have to work harder.

Then tell the director they released a wrong color palette. It's the same movie up and down.


No coloring scripts have been applied, at all. No matrix changes, at all.


LOL.

The fact that you don't like "pirated" HDR content being released publicly alongside your "protected content" doesn't mean you can just talk BS about other people's work.

You talking about wrong colors in the same, unchanged color palette shows how much of a noob you are. Or how mad you are?

Watch this video:

https://www.youtube.com/watch?v=7KkU7oomos4

In every "HDR" output you see completely wrong colors. That is wrong, not what I do. There's one scene where the view behind the window is white, and the "HDR" shows a completely over-processed blueish output. Same with any other.

The one at 0:23 is ridiculous, for example.

Have fun.

CruNcher
12th February 2017, 20:23
Quite a few cameras were used:

Camera: Arri Alexa XT Plus, Cooke S4 and Fujinon Alura lenses (some scenes)
Red Epic, Cooke S4 and Fujinon Alura lenses (some scenes)
Sony CineAlta F65, Cooke S4, Fujinon Alura and Angenieux Optimo lenses

so it looks like the F65 was the main camera.

Then most was done from an S-Log base, like in the Camp sample?

Though I don't really like Sony's HDR sample; it shows very mediocre encoding results, and the noise is really horrible

http://i1.sendpic.org/t/ak/akTCe5Lv1cjC6tqtJvkFYywDmIR.jpg (http://sendpic.org/view/1/i/mv4DguuVKyFXCUO3Zcj8waM3CbF.png)
http://i1.sendpic.org/t/8q/8qNKgiZPdYWNml9kt0WUEB2iCnA.jpg (http://sendpic.org/view/1/i/g8SA0QZ2ywQCbKNNWrGwcYo6rfo.png)

and the same frame transformed over to the 8-bit SDR encode

http://i1.sendpic.org/t/7j/7jrcyL2wbPssICIT2ux6MKEOYYj.jpg (http://sendpic.org/view/1/i/i2bW8fAF8gDxgkOApGuyelOPM7b.png)

kolak
12th February 2017, 20:29
Then tell the director they released a wrong color palette. It's the same movie up and down.


No coloring scripts have been applied, at all. No matrix changes, at all.


LOL.



Your processing boosts any imperfection, so it becomes very visible.
It's the same as pushing any video source during grading: every source has its limits, and the heavier the processing, the more likely they are to show up.

visionplushdr
12th February 2017, 20:31
Your processing boosts any imperfection, so it becomes very visible.
It's the same as pushing any video source during grading: every source has its limits, and the heavier the processing, the more likely they are to show up.

No, there isn't any difference; you are watching MadVR's processed output at a lot fewer nits than the video was graded at, and in DCI output as well.

To watch it properly, I have already explained step by step.

Plus, that's MadVR. If you watch that same video through a Dolby Vision HDR processor it will look completely normal, since it's the same commercial movie matrix.

kolak
12th February 2017, 20:32
Watch this video:

https://www.youtube.com/watch?v=7KkU7oomos4



This is some PR crap video for the public, where everything is 10x oversaturated. You should like it :)

visionplushdr
12th February 2017, 20:33
This is some PR crap video for the public, where everything is 10x oversaturated. You should like it :)

Lol. Show me a good HDR presentation, please, or a photo. Actually, show me how you like HDR output and I will give you the same identical image at higher nits. Just to end this BS.

And aren't movies meant for the public? So you are an alien?

kolak
12th February 2017, 20:39
It's an advert, not studio work. It will be displayed at some show with the TV set to Dynamic, which makes the whole thing unwatchable for people who have some idea about video, and amazing for the "public".

visionplushdr
12th February 2017, 20:42
It's an advert, not studio work. It will be displayed at some show with the TV set to Dynamic, which makes the whole thing unwatchable for people who have some idea about video, and amazing for the "public".

LOL.

Please tell me how many post-processing steps this image went through, then?

https://extraimage.net/images/2017/02/12/a679d7d330bcbea378ab29aba0af4e42.md.jpg (https://extraimage.net/image/2zEj)

kolak
12th February 2017, 20:44
I'm still amazed that you trust and believe in HDR core and madVR so much.
It's all math, and processing any video has its limits; different things will start breaking sooner or later. You think that making a 10K version out of a 1K master is problem-free and will always be perfect? It's the very opposite. It sounds like quite heavy processing, so things will break and show up on your master. Not only that: your 10K master is then processed further by madVR, and further by the TV.
Remember: the less processing the better, not the opposite.

visionplushdr
12th February 2017, 20:45
I'm still amazed that you trust and believe in HDR core and madVR so much.
It's all math, and processing any video has its limits; different things will start breaking sooner or later. You think that making a 10K version out of a 1K master is problem-free and will always be perfect? It's the very opposite. It sounds like quite heavy processing, so things will break and show up on your master. Not only that: your 10K master is then processed further by madVR, and further by the TV.
Remember: the less processing the better, not the opposite.

Stop talking and avoiding questions.

Answer: how many post-processing steps did the last image have? :D

and further by the TV.
Remember: the less processing the better, not the opposite.

So basically you are claiming that commercial HDR IS BS, since HDR10 TVs process a LOT, and so do Dolby Vision ones. Exactly the OPPOSITE of my videos, where the TV doesn't have to process anything. What's better?

Still, I want your answer about the last image.

visionplushdr
12th February 2017, 20:51
Then most was done from an S-Log base, like in the Camp sample?

Though I don't really like Sony's HDR sample; it shows very mediocre encoding results, and the noise is really horrible

http://i1.sendpic.org/t/ak/akTCe5Lv1cjC6tqtJvkFYywDmIR.jpg (http://sendpic.org/view/1/i/mv4DguuVKyFXCUO3Zcj8waM3CbF.png)
http://i1.sendpic.org/t/8q/8qNKgiZPdYWNml9kt0WUEB2iCnA.jpg (http://sendpic.org/view/1/i/g8SA0QZ2ywQCbKNNWrGwcYo6rfo.png)

That's awful. And you are arguing about my work. Unbelievable. The encoding has nothing to do with it; the HDR grading is crap.

kolak
12th February 2017, 20:54
This was done in the very early stages of HDR.
Also, most TV manufacturers' demos are quite crap, so never use them as a reference.
I've seen this video on a Sony BVM HDR monitor.

kolak
12th February 2017, 20:56
LOL.

Please tell me how many post-processing steps this image went through, then?

https://extraimage.net/images/2017/02/12/a679d7d330bcbea378ab29aba0af4e42.md.jpg (https://extraimage.net/image/2zEj)

Show me the reference, then maybe I will be able to tell you more.

visionplushdr
12th February 2017, 20:56
This was done in the very early stages of HDR.
Also, most TV manufacturers' demos are quite crap, so never use them as a reference.
I've seen this video on a Sony BVM HDR monitor.

What early stages of HDR, when HDR is an extremely old image-processing method?

You are still not answering about the number of post-processing steps in the last image (?)

kolak
12th February 2017, 20:57
That's awful. And you are arguing about my work. Unbelievable. The encoding has nothing to do with it; the HDR grading is crap.

You said all your work is 10x better than the studios', so we have to be very hard on you.

visionplushdr
12th February 2017, 20:57
Show me the reference, then maybe I will be able to tell you more.

No, tell me how it looks, please.

Without "REFERENCE"

LOL.

This is gonna be good.

kolak
12th February 2017, 20:58
It doesn't look very good to me, but it's most likely the original SDR version of the movie.

visionplushdr
12th February 2017, 20:59
You said all your work is 10x better than the studios', so we have to be very hard on you.

If you watch what I do on a Dolby Vision TV and then watch, let's say, ANY DEMO in HDR, my video kills any other. What exactly does that say to you?

The Sony HDR camp video looks as bad as any other HDR demo they show at stands.

Why would they show crap? And why does mine look better?

This is wrong.

And tell me how the last image looks, please.

visionplushdr
12th February 2017, 20:59
It doesn't look very good to me, but it's most likely the original SDR version of the movie.

Awesome!!!!

That was the HDR10 version of the movie.

I will just go have a great rest of the day. Or my brain will explode with so much BS on here.

kolak
12th February 2017, 21:02
If you watch what I do on a Dolby Vision TV and then watch, let's say, ANY DEMO in HDR, my video kills any other. What exactly does that say to you?

The Sony HDR camp video looks as bad as any other HDR demo they show at stands.

Why would they show crap? And why does mine look better?

This is wrong.

And tell me how the last image looks, please.

Because the "public" is like you: they like saturation, contrast and punchy colors. It's a show for the public, not for colorists.
I have heard the opinion of one decent colorist about these videos :) Not going to swear here :)

These videos are approved by marketing people, not technical ones :) We already said what all this crap is about, so don't be so surprised that they look crap. Don't be so surprised either that some studio work is crap; it's very relative. When done properly it does look good, definitely better than your masters. You clearly have zero inside knowledge of studio reality. If you did, you wouldn't be so surprised by all you see.

visionplushdr
12th February 2017, 21:05
Because the "public" is like you: they like saturation, contrast and punchy colors. It's a show for the public, not for colorists.
I have heard the opinion of one decent colorist about these videos :) Not going to swear here :)

You just said colorists are noobs.

https://extraimage.net/images/2017/02/12/2f5a9f14f65477e7835a57a8a7ce7fc5.md.jpg (https://extraimage.net/image/2zEU)

Do you like the colors there? What about the palette and gamut? Hey, does Lucy look so horrible to the "colorists"?

I see an outstandingly, extraordinarily horrible image output there, and it's just the commercial HDR10 made by the "colorists".

Can you tell me how I can contact them and tell them Kolak the doomer said they are complete noobs?

But you made the Sony HDR camper video, right? Then you probably love the garbage.

visionplushdr
12th February 2017, 21:09
Because the "public" is like you: they like saturation, contrast and punchy colors. It's a show for the public, not for colorists.
I have heard the opinion of one decent colorist about these videos :) Not going to swear here :)

These videos are approved by marketing people, not technical ones :) We already said what all this crap is about, so don't be so surprised that they look crap. Don't be so surprised either that some studio work is crap; it's very relative. When done properly it does look good, definitely better than your masters. You clearly have zero inside knowledge of studio reality. If you did, you wouldn't be so surprised by all you see.

No, wrong: my "masters" look better than the "studios'". Also a higher-nits grade. If I compare with the camper HDR video, my masters are heaven.

Do you like how Lucy looks on there? You also said it was SDR, but it was HDR10. How do you feel?

Do you want a whole set of images from the AMAZING, PROPERLY MADE BY EXPERT COLORISTS, 1:1-MAPPED HDR10 video?

kolak
12th February 2017, 21:16
How are they better if you don't really change the colors? It's all relatively the same; well, that's what you claimed minutes ago.
Make up your mind.
Go to the raw assets, do your own grade scene by scene (you're already wasting time on this on top of the studio grade) and then try to prove it's better. Have you ever tried it: starting with some raw LOG assets and making a scene look "good"?

You don't GRADE; you apply some transfer on top of an existing grade. You don't really change the colors, so how can your versions be better? Stop this nonsense. Your creative part is very limited.

kolak
12th February 2017, 21:18
Do you want a whole set of images from the AMAZING, PROPERLY MADE BY EXPERT COLORISTS, 1:1-MAPPED HDR10 video?

And how are you going to show this to me without a reference HDR monitor which I can sit at and watch?

CruNcher
12th February 2017, 21:26
Awesome!!!!

That was the HDR10 version of the movie.

I will just go have a great rest of the day. Or my brain will explode with so much BS on here.

Too simple: you are comparing the HEVC encoder's PQ performance plus the HDR grading; both should really be viewed separately, but that's hard to do without the master, unless you know specific visual issues coming from wrong grading post-manipulation itself ;)

Though it's interesting that the HDR master of the Sony Camp sample should already show these noise levels without it even having seen the (I call it the unknown Japanese) encoder :D

LG/Samsung did way better overall with Ateme the whole time ;)

And I mean, it's old, come on

UTC 2016-02-03 07:59:49
UTC 2016-02-03 08:01:32

But yeah, LG's colors in Journey HDR: if I see some of those SDR conversions I wonder if such a colorist ever saw a tree in real life, or if he colors his reality like he does his grades ;)?

https://www.youtube.com/watch?v=vI91heLv_v4

I mean, that's like another reality, or maybe some people experience some kind of saturation syndrome, some kind of job effect: "I have to saturate it, I have to have full colors no matter what" :D

visionplushdr
12th February 2017, 21:27
And how are you going to show this to me without a reference HDR monitor which I can sit at and watch?

By the same MadVR HDR processing. Before, you said it had OTHER colors than the HDR10 version, plus you said the HDR10 was SDR, and then you didn't even mention my old images.

You are just very confused. And I understand, since you don't know what I do. Sometimes people argue about other people's work without having a clue about the matter.

Is this SDR too?

https://extraimage.net/images/2017/02/12/46b7387dafe1557767f47bdfba786fc6.md.jpg (https://extraimage.net/image/2zO2)

The other problem here is that you have a "taste" for how things should look. But commercial HDR- any of it- looks like garbage. Even the demos used to sell TVs. So?

kolak
12th February 2017, 21:35
Joking? Forget about madVR!

The only way to prove to me that the colorist did a poor job is to have a reference HDR TV, so we can all see the grade the way the colorist did. Not some MadVR-processed version, not even a version on a home HDR TV, but a proper preview, so our eyes see it the way the colourist did. You have to see the real thing at 1:1 in order to reliably judge it, not some SDR interpretation of it (especially when the technology is still in its early stages).
Got it?

kolak
12th February 2017, 21:37
Put simply, you compare the HEVC encoder's PQ performance plus the HDR grading; both should be viewed separately, but that's hard to do in reality without the master ;)

Though it's interesting that the HDR master of the Sony Camp sample should already show these noise levels without it even having seen the encoder :D

LG/Samsung did way better overall with Ateme the whole time ;)

And I mean, it's old, come on:

UTC 2016-02-03 07:59:49
UTC 2016-02-03 08:01:32

I've seen this master probably more than 2 years ago. The original recording is very old- I think from around when the F65 was released.

visionplushdr
12th February 2017, 21:37
Joking? Forget about madVR!

The only way to prove to me that the colorist did a poor job is to have a reference HDR TV, so we can all see the grade the way the colorist did. Not some MadVR-processed version, not even a version on a home HDR TV, but a proper preview, so our eyes see it the way the colourist did. You have to see the real thing at 1:1 in order to reliably judge it, not some SDR interpretation of it (especially when the technology is still in its early stages).
Got it?

MadVR doesn't change colors. MadVR is actually the best player on the entire planet for watching a movie. Are you freaking kidding me even more?

Is this SDR?

https://extraimage.net/images/2017/02/12/f2cab30b8f0c29dcf8b406b5702347c3.md.jpg (https://extraimage.net/image/2zOS)
https://extraimage.net/images/2017/02/12/54d207a06e59148357f7459efb8c33f3.md.jpg (https://extraimage.net/image/2zOy)

The colors are crap, right?

Go find help please.

You did the crappy HDR Sony camper video, and you even have the nerve to complain about my work?

kolak
12th February 2017, 21:45
The only issue is that at high brightness colors are perceived differently- they will look way more saturated (maybe that's why your madVR preview is so desaturated).
You can't simulate this! You need a real HDR TV, so your eyes are hit with these higher-than-SDR nits. You assume that the eye sees colors the same regardless of brightness. Well, not long ago you said a flower looks different in the sun :) Make up your mind.

Get real, stop judging HDR videos on an SDR madVR preview. Is your TV calibrated? You have 100s of variables, so what you see may be veeery different from reality (what the colorist saw on a good reference TV).
How can I judge these grabs if my Mac screen crushes blacks, is oversaturated, etc.?
You really don't get the idea of a reference preview!

Lost any further interest in this (anyway meaningless) discussion.
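[As an aside: the desaturation mentioned above has a mechanical explanation- a saturated BT.2020 primary simply has no in-gamut representation in BT.709. A minimal sketch, using the standard linear-light BT.2020-to-BT.709 primary-conversion matrix published in ITU-R BT.2087 (values rounded to four decimals):]

```python
# Sketch: why a saturated BT.2020 color desaturates on a BT.709 display.
# M is the linear-light BT.2020 -> BT.709 primary conversion matrix
# from ITU-R BT.2087, rounded to four decimals.
M = [
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
]

def bt2020_to_bt709(rgb):
    """Convert a linear BT.2020 RGB triple to linear BT.709 RGB."""
    return [sum(M[i][j] * rgb[j] for j in range(3)) for i in range(3)]
```

[A pure BT.2020 red converts to roughly [1.66, -0.12, -0.02] in BT.709; the negative components cannot be displayed and get clipped, and that clipping is the visible desaturation. White is preserved, since each matrix row sums to ~1.]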

visionplushdr
12th February 2017, 21:49
The only issue is that at high brightness colors are perceived differently- they will look way more saturated (maybe that's why your madVR preview is so desaturated).
You can't simulate this! You need a real HDR TV.

Get real, stop judging HDR videos on an SDR madVR preview. Is your TV calibrated? You have 100s of variables, so what you see may be veeery different from reality (what the colorist saw on a good reference TV).
How can I judge these grabs if my Mac screen crushes blacks, is oversaturated, etc.?
You really don't get the idea of a reference preview!

Lost any further interest in this (anyway meaningless) discussion.

And when have I said that you must watch the output as in the images?

You don't need any HDR TV- you need a TV in Blu-ray input mode to watch a movie with the correct output.

Just like what the TV does when plugging a USB movie directly into it in HDR- it turns on its HDR processing there.


Without an "HDR TV" you use MadVR and set up your TV with some personal preset for 4:2:0 Blu-ray output. And manage to calibrate it to your wishes, or set up whatever output you freaking want.

The images are ALWAYS simulated. You are watching a freaking 8-bit window screenshot- how could that not be simulated?

You have to play back the movie on your TV and then you get the HDR content at its full output. Where you set up the gamma you need.

You are weird, man.

kolak
12th February 2017, 21:57
You still don't get it.

You don't set up anything as you need/like- there is only one good setting, which is achieved by calibrating a reference TV in the actual room.
Anything else is wrong (more or less).
I can 100% guarantee you (not even knowing your settings) that your HDR preview is very different from a reference HDR TV- Sony or Dolby (especially knowing your taste). This means your claims about crap colors etc. simply have no ground, and that's why this whole debate is pretty much pointless.

visionplushdr
12th February 2017, 21:58
Is your TV calibrated? You have 100s of variables, so what you see may be veeery different from reality (what the colorist saw on a good reference TV).
How can I judge these grabs if my Mac screen crushes blacks, is oversaturated, etc.?
You really don't get the idea of a reference preview!

Lost any further interest in this (anyway meaningless) discussion.

You say you can tell what's HDR or SDR by judging from an SDR Mac monitor?

You claim your Mac screen also crushes blacks?

WTF is this.

You really have no idea about HDR- you are using a Mac, and you claim you want to see HDR content's proper "output". WTF.

This one has higher gamma, for example.

https://extraimage.net/images/2017/02/12/cac365c05b05f8c3bd20f08762c30939.md.jpg (https://extraimage.net/image/2zOk)

You can't watch HDR on your Mac, man. Are you sure you are 60 years old?

kolak
12th February 2017, 21:59
You still don't get it.

You don't set up anything as you need/like- there is only one good setting, which is achieved by calibrating a reference TV in the actual room.
Anything else is wrong (more or less).
I can 100% guarantee you (not even knowing your settings) that your HDR preview is very different from a reference HDR TV- Sony or Dolby (especially knowing your taste). This means your claims about crap colors etc. simply have no ground, and that's why this whole debate is pretty much pointless.

visionplushdr
12th February 2017, 22:00
You still don't get it.

You don't set up anything as you need/like- there is only one good setting, which is achieved by calibrating a reference TV in the actual room.
Anything else is wrong (more or less).
I can 100% guarantee you (not even knowing your settings) that your HDR preview is very different from a reference HDR TV- Sony or Dolby (especially knowing your taste). This means your claims about crap colors etc. simply have no ground, and that's why this whole debate is pretty much pointless.

No, it's not any different. It's better. I don't change any of the reference-made colors or matrix. Just raise PQ. A higher regrade.

And stop saying that I don't regrade anything- if you compare the HDR10 grade, the HDR contrast and movie details are CRAP compared to my version. You just have no clue about HDR- you claim you can watch it on a screenshot or your Mac. WTF, man.

kolak
12th February 2017, 22:04
You still don't get it.

You don't set up anything as you need/like- there is only one good setting, which is achieved by calibrating a reference TV in the actual room.
Anything else is wrong (more or less).
I can 100% guarantee you (not even knowing your settings) that your HDR preview is very different from a reference HDR TV- Sony or Dolby (especially knowing your taste). This means your claims about crap colors etc. simply have no ground, and that's why this whole debate is pretty much pointless.

visionplushdr
12th February 2017, 22:07
You still don't get it.

You don't set up anything as you need/like- there is only one good setting, which is achieved by calibrating a reference TV in the actual room.
Anything else is wrong (more or less).
I can 100% guarantee you (not even knowing your settings) that your HDR preview is very different from a reference HDR TV- Sony or Dolby (especially knowing your taste). This means your claims about crap colors etc. simply have no ground, and that's why this whole debate is pretty much pointless.

I don't set up anything to my liking. I don't change the calibrated reference from the actual room.

I only raise PQ. A higher regrade.

I don't change anything other than raising PQ. Which you don't know how to do. But that doesn't make it wrong. Since there's nothing wrong. It's the same as how studios in any field work with higher PQ. And balanced PQ.

Whatever claim you made on here about "my master" is a criticism of the actual colorists and studio. Since it was ALWAYS the same video.
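[Since "raising PQ" keeps coming up: PQ is the SMPTE ST 2084 transfer function, which maps absolute luminance (up to 10,000 cd/m2) to a nonlinear 0..1 signal. A minimal sketch using the constants from the spec; the 10-bit mapping assumes the conventional limited-range 64..940 luma codes:]

```python
# Sketch of the SMPTE ST 2084 (PQ) transfer function, using the
# constants from the spec. A "higher-nits regrade" means remapping
# scene luminance along this curve.
m1 = 2610 / 16384            # 0.1593...
m2 = 128 * 2523 / 4096       # 78.84375
c1 = 3424 / 4096             # 0.8359375
c2 = 32 * 2413 / 4096        # 18.8515625
c3 = 32 * 2392 / 4096        # 18.6875

def pq_oetf(nits: float) -> float:
    """Absolute luminance (0..10000 cd/m^2) -> nonlinear PQ signal (0..1)."""
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1.0 + c3 * y)) ** m2

def pq_eotf(signal: float) -> float:
    """Nonlinear PQ signal (0..1) -> absolute luminance in cd/m^2."""
    p = signal ** (1.0 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1.0 / m1)

def to_10bit_limited(signal: float) -> int:
    """Map a 0..1 signal to a 10-bit limited-range luma code (64..940)."""
    return round(64 + signal * (940 - 64))
```

[For example, the SDR reference level of 100 cd/m2 already lands around half the signal range (10-bit code 509), and 10,000 cd/m2 maps exactly to 1.0- which is also why PQ material displayed without the proper EOTF looks flat and washed out.]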

Groucho2004
12th February 2017, 22:07
Wow, this is getting really silly...

visionplushdr
12th February 2017, 22:09
Wow, this is getting really silly...

For sure. The guy kolak uses a Mac SDR monitor and wanted to see HDR output from a native ST 2084 PQ HEVC BT. 2020 video.
Then he said the Lucy colorists are complete noobs. But he made the crappy camper HDR video that looks worse than my old DVDs.

I think that covers everything from now on.

Groucho2004
12th February 2017, 22:11
The guy kolak uses a Mac SDR monitor

How do you know that?

visionplushdr
12th February 2017, 22:12
How do you know that?

He mentioned it a million times.

Groucho2004
12th February 2017, 22:13
He mentioned it a million times.

Give me just one link so I don't have to go through 150 posts.

kolak
12th February 2017, 22:14
... But he made the crappy camper HDR video that looks worse than my old DVDs.


Where have I said this?

I saw it a long time ago, so I know it's a very old clip. That's all I said. Stop assuming, because this is like not using proper monitoring for your work- very misleading.

You don't set up anything as you need/like- there is only one good setting, which is achieved by calibrating a reference TV in the actual room.
Anything else is wrong (more or less).
I can 100% guarantee you (not even knowing your settings) that your HDR preview is very different from a reference HDR TV- Sony or Dolby (especially knowing your taste). This means your claims about crap colors etc. simply have no ground, and that's why this whole debate is pretty much pointless.

visionplushdr
12th February 2017, 22:15
Where have I said this?

I saw it a long time ago, so I know it's a very old clip. That's all I said. Stop assuming, because this is like not using proper monitoring for your work- very misleading.

You don't set up anything as you need/like- there is only one good setting, which is achieved by calibrating a reference TV in the actual room.
Anything else is wrong (more or less).
I can 100% guarantee you (not even knowing your settings) that your HDR preview is very different from a reference HDR TV- Sony or Dolby (especially knowing your taste). This means your claims about crap colors etc. simply have no ground, and that's why this whole debate is pretty much pointless.

In the old thread it's posted that you "worked" on the HDR Sony demo.

So now you're lying to yourself?

kolak
12th February 2017, 22:16
He mentioned it a million times.

Once, in order to hint that without reference monitoring you can't reliably tell anything about someone else's grading job.
Also- any "grading" job you do is worthless, as your TV will be far from giving a proper preview.

You don't set up anything as you need/like- there is only one good setting, which is achieved by calibrating a reference TV in the actual room.
Anything else is wrong (more or less).
I can 100% guarantee you (not even knowing your settings) that your HDR preview is very different from a reference HDR TV- Sony or Dolby (especially knowing your taste). This means your claims about crap colors etc. simply have no ground, and that's why this whole debate is pretty much pointless.

visionplushdr
12th February 2017, 22:16
Give me just one link so I don't have to go through 150 posts.

For example

The only issue is that at high brightness colors are perceived differently- they will look way more saturated (maybe that's why your madVR preview is so desaturated).
You can't simulate this! You need a real HDR TV, so your eyes are hit with these higher-than-SDR nits. You assume that the eye sees colors the same regardless of brightness. Well, not long ago you said a flower looks different in the sun :) Make up your mind.

Get real, stop judging HDR videos on an SDR madVR preview. Is your TV calibrated? You have 100s of variables, so what you see may be veeery different from reality (what the colorist saw on a good reference TV).
How can I judge these grabs if my Mac screen crushes blacks, is oversaturated, etc.?
You really don't get the idea of a reference preview!

Lost any further interest in this (anyway meaningless) discussion.

visionplushdr
12th February 2017, 22:17
Once, in order to hint that without reference monitoring you can't reliably tell anything about someone else's grading job.

You can't watch HDR on a Mac monitor. You can't watch native HDR video, man. Are you sure you're mentally all right?

kolak
12th February 2017, 22:19
You still don't get it.

You don't set up anything as you need/like- there is only one good setting, which is achieved by calibrating a reference TV in the actual room.
Anything else is wrong (more or less).
I can 100% guarantee you (not even knowing your settings) that your HDR preview is very different from a reference HDR TV- Sony or Dolby (especially knowing your taste). This means your claims about crap colors etc. simply have no ground, and that's why this whole debate is pretty much pointless.

visionplushdr
12th February 2017, 22:21
You still don't get it.

You don't set up anything as you need/like- there is only one good setting, which is achieved by calibrating a reference TV in the actual room.
Anything else is wrong (more or less).
I can 100% guarantee you (not even knowing your settings) that your HDR preview is very different from a reference HDR TV- Sony or Dolby (especially knowing your taste). This means your claims about crap colors etc. simply have no ground, and that's why this whole debate is pretty much pointless.

Are you in some bot repeating mode now? You posted this like 4 times, lol.

I don't need to set up anything- I just play the same movie as the studios did, at higher nits. What's wrong with you?

visionplushdr
12th February 2017, 22:26
Give me just one link so I don't have to go through 150 posts.

The guy now also denies (because it looks like crap) that he worked on the HDR Sony demo.

Anybody on doom9 can remember his post about the "collaboration" on that video. So he likes lying to himself now? What a world.

CruNcher
12th February 2017, 23:04
Collaboration doesn't necessarily mean he graded it at all, or that he is responsible for those results ;)
Also, he said that sample is way older than we knew- from even before it was officially used for device presentations and its final leak to the public.
So it's really one of the very early overall 4K HDR workflow tests, most probably Sony's around the release of the F65. Depending on the calibration at the time, the noise could have gone unnoticed due to the PQ setup or some other circumstance- maybe they were right in the middle of testing their FX1 Reality Denoiser and didn't notice it at all on playback ;)
Don't forget Sony had their own internal idea of HDR and PQ as well, which partly survived in their own chip and processing ;)

kolak
12th February 2017, 23:13
I'm not a colorist, so I 100% did not grade it. Well, I had nothing to do with it, except seeing it way before it went public. I don't think it was ever shot with HDR in mind, but more as an F65 test (more of an S-Log3 test). A very early test which shows a lot of noise. I've seen other samples from the F65 which have a similar problem.

Blue_MiSfit
12th February 2017, 23:13
This thread, unfortunately, has devolved into madness just like the last one. Some folks have been struck and I'm closing the thread again. It's COMPLETELY off-topic.

Just a reminder to everyone: read the rules of the site. There are so many rule 4 violations in this thread it's not even funny. Please be nice to each other. Don't troll. This is supposed to be a place for civil discussion and the development of video tools and processes.

Feel free to start a new thread to actually discuss the encoding of 4K HEVC content.