External Display 10-bit/HDR Questions

Hey there, friends!

At long last, I received my FW16 (7940hs/7700s) today. This isn’t meant to be a first impressions post, but I’ll just quickly say that overall, I’m impressed! I have had a couple of minor issues, but it seems like it’s going to be a great rig so far.

My question is about external displays. I have an LG C2 42" OLED that I’ve used/loved for a couple of years. I’ve used it with a bunch of different machines and I always run 4k/120hz with HDR enabled and 10 (or 12, if supported) bit color. It looks magnificent for pretty much all content :smiley:

So far, I haven’t found a way to enable HDR or 10+ bit color with the Framework. I’ve hooked up directly to the GPU via both a USB-C to HDMI adapter and the HDMI Expansion Card. 4k/120 works slick, but the color is currently set to 8-bit, and though Windows display settings say the display is HDR capable, when I click the toggle, all of the screens just flash for a second and the toggle stays off. I’ve wrangled the color depth setting in the AMD software, but the only option is 8-bit.

Would love to get some more info from someone smarter than me about what I’m doing wrong and what options I might have to remedy it. I’ve tried some gaming and other tasks so far and it performs well, but just looks a little scuffed compared to other computers (of a similar performance level). Text also looks pretty rough.

Thanks for the consideration!

Did you get 4k@120Hz via the expansion card, too? I wonder, because it only supports HDMI 2.0b, which officially tops out at 4k@60Hz without chroma subsampling. I expect my FW16 to arrive mid to late May, but I’ve got an AMD card in my desktop, too. My latest addition to it was an ultra-wide monitor running 3840x1600@144Hz at 8-bit. For 10-bit I have to lower the refresh rate to 120Hz, even via DisplayPort.

Sadly I don’t know which version of DP Alt Mode it supports, but if it’s only 1.4(b), the bandwidth is sufficient for 4k@120Hz at 8-bit only. For 10-bit HDR, you might try a 60Hz refresh rate.

Also see Wikipedia on DisplayPort

The HDMI Expansion Card, like the vast majority of USB-C to HDMI adapters, only supports up to HDMI 2.0b.

HDMI 2.0b has insufficient bandwidth for what you want. In fact, even at 4k 120 Hz 8-bit it is using aggressive compression to squeeze the data into the limited bandwidth. It is likely using 4:2:0 chroma subsampling, which essentially means the computer sends a full-resolution black-and-white image plus a half-resolution color image that the display recombines, producing something in between in quality; the reduction in quality is most noticeable around text. The bandwidth is simply insufficient to achieve 10-bit even with the supported compression methods.
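
For a rough sense of the numbers involved, here’s a back-of-the-envelope sketch (active pixel data only, ignoring blanking, so the real requirements are somewhat higher; the per-pixel bit counts are the usual values for RGB vs. 4:2:0):

```python
# Back-of-envelope pixel data rates vs. HDMI 2.0b's usable bandwidth.
# Active pixels only; blanking overhead would push these numbers higher.
def data_rate_gbps(width, height, refresh_hz, bits_per_pixel):
    return width * height * refresh_hz * bits_per_pixel / 1e9

HDMI_20B_ACTIVE_GBPS = 14.4  # usable data rate after encoding overhead

print(data_rate_gbps(3840, 2160, 120, 24))  # 8-bit RGB 4:4:4 -> ~23.9 Gbps (too much)
print(data_rate_gbps(3840, 2160, 120, 12))  # 8-bit 4:2:0     -> ~11.9 Gbps (fits)
print(data_rate_gbps(3840, 2160, 120, 15))  # 10-bit 4:2:0    -> ~14.9 Gbps (still too much)
```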

To achieve 4k 120 Hz 10 bit at all will require an adapter with support for HDMI 2.1 or above, and achieving it without any compression will require the adapter to communicate with the computer over DisplayPort 2.0 or above (the laptop doesn’t support HDMI, so the adapter communicates with the computer over DisplayPort and then translates it to HDMI).

Many adapters (and laptops, IDK about the FW16 specifically) are also limited to DisplayPort 1.4b from the computer, which also restricts bandwidth. DisplayPort 1.4b does support DSC (a better compression method that many people say looks just as good as no compression), though, so that’s not a huge issue.

I appreciate the replies. So far, 4k/120hz is working fine. Incidentally, it worked fine with past fw13 laptops as well via the expansion card.

Kyle, that would explain why everything looks so scuffed. I have ordered a more robust USB-C to HDMI adapter that apparently supports 2.1, so we’ll see if that does the trick for HDR/10-bit + 4k/120. As one might expect, if I turn the resolution down to 1440p, it unlocks both HDR and up to 12-bit color. I’d imagine that using an actual DisplayPort expansion card/adapter may alleviate the constraints, but of course, the C2 only has HDMI since it’s technically a TV.

I’ll let you folks know how it goes with the new adapter. The Razer Blade 14 that the FW is replacing had HDMI 2.1 and everything looked just beautiful, so hopefully I can get it dialed in to look the same.


Ah, have you seen the artifacts caused by chroma subsampling (the compression method currently being used; the most noticeable degradation in quality is around text)?

Which adapter specifically?

Ah, that makes sense.

HDMI over USB-C was crippled when it was released (limited to HDMI 1.4, despite HDMI 2.0 already existing with nearly double the bandwidth and DisplayPort 1.4 already existing with over triple the bandwidth) and never got updated. So HDMI over USB-C never caught on, and any USB-C to HDMI adapter actually uses the DisplayPort over USB-C protocol and has a chip to translate the DisplayPort signals into HDMI signals.

In general that means that when using USB-C to HDMI adapters you will be limited to the constraints of both DisplayPort and HDMI.

Theoretically, 4k 120 Hz 10-bit requires at least 29.86 Gbps of bandwidth; however, there is also blanking (bandwidth that is intentionally wasted to improve compatibility with certain displays). Standard blanking would push the requirement to 44.95 Gbps total, although Reduced Blanking V2 (a feature that reduces the amount of blanking required; some displays use it by default, and even the ones that don’t will almost always work if you enable it in the GPU drivers) reduces the required bandwidth to 32.27 Gbps.
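
If it helps, here’s a minimal sketch of that arithmetic. The timing totals are approximations on my part (CVT-R2 uses a fixed 80-pixel horizontal blank, and I’ve estimated the vertical total), so treat the outputs as ballpark figures rather than spec-exact values:

```python
# Required bandwidth = total pixels per frame (active + blanking) x refresh x bits per pixel.
def required_gbps(h_total, v_total, refresh_hz, bits_per_pixel):
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

BPP_10BIT_RGB = 3 * 10  # three 10-bit channels per pixel

print(required_gbps(3840, 2160, 120, BPP_10BIT_RGB))  # active pixels only: ~29.9 Gbps
print(required_gbps(3920, 2287, 120, BPP_10BIT_RGB))  # with approx. CVT-R2 blanking: ~32.3 Gbps
# Standard (non-reduced) blanking adds roughly another 40%, landing near 45 Gbps.
```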

However, DSC can let you get away with less bandwidth than that and still get decent quality. DSC is a visually lossless compression algorithm (i.e. it technically reduces quality, but at least 75% of the time people fail to tell the difference even when comparing side by side) that is far better than chroma subsampling (what HDMI 2.0 uses). It can compress to a third as much bandwidth with almost full quality.

HDMI 2.0 supports 14.4 Gbps of active bandwidth (the raw 18 Gbps link rate minus encoding overhead). So HDMI 2.0 has only 45% as much bandwidth as you need and doesn’t support DSC.

DisplayPort 1.4 supports 25.92 Gbps of active bandwidth. So it has 80% as much bandwidth as you need, but it does support DSC, so it gets near full quality.

DisplayPort 2.0/2.1 supports up to 77.5757 Gbps of active bandwidth, although AFAIK AMD’s implementation of it is limited to 52.3636 Gbps. It can achieve full quality.

HDMI 2.1 supports up to 42.666 Gbps of active bandwidth. It can achieve full quality.

Of course if you use an adapter that communicates with the display over HDMI 2.1 (42.666 Gbps) but the computer over DisplayPort 1.4 (25.92 Gbps + DSC) you end up being limited to 25.92 Gbps + DSC.
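
For anyone curious where those active-bandwidth figures come from, here’s a small sketch deriving them from the raw link rates and the line-encoding efficiency of each standard (the raw rates and encodings here reflect my understanding of the specs, so double-check before relying on them):

```python
# Active bandwidth = raw link rate x line-encoding efficiency (Gbps).
links = {
    "HDMI 2.0b (3 lanes x 6.0 Gbps, 8b/10b)":       (18.0, 8 / 10),
    "DP 1.4 HBR3 (4 lanes x 8.1 Gbps, 8b/10b)":     (32.4, 8 / 10),
    "HDMI 2.1 FRL (4 lanes x 12.0 Gbps, 16b/18b)":  (48.0, 16 / 18),
    "DP 2.1 UHBR13.5 (4 x 13.5 Gbps, 128b/132b)":   (54.0, 128 / 132),
    "DP 2.1 UHBR20 (4 x 20.0 Gbps, 128b/132b)":     (80.0, 128 / 132),
}

for name, (raw_gbps, efficiency) in links.items():
    print(f"{name}: {raw_gbps * efficiency:.2f} Gbps active")
# -> 14.40, 25.92, 42.67, 52.36, 77.58
```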


First of all, THANK YOU for such a comprehensive reply. This has always been a bit of a hazy mystery to me, so thanks for bringing me up to speed. It sounds like the FW16 isn’t a great choice to drive a 4k/120/hdr panel?

Here’s the adapter I snagged.
https://www.amazon.com/gp/product/B08MSWMXT4/ref=ppx_yo_dt_b_asin_title_o00_s00?ie=UTF8&psc=1

It definitely isn’t a great choice to drive a 4k/120hz/10-bit hdr panel over HDMI with the currently available selection of USB-C to HDMI adapters.

Whether it is good for doing that with DisplayPort or with better adapters will depend on what version of DisplayPort the laptop supports.

According to AMD the 7840hs/7940hs (which control the ports on the side of the laptop) support the 38.79 Gbps variety of DisplayPort 2.1, which is sufficient to run such a display without compression. However, TechPowerUp claims that the integrated USB4 controllers are limited to only DisplayPort 1.4 (25.92 Gbps), so the ports on the sides of the laptop may not be able to achieve full quality.

The port on the rear of the laptop (only applicable to people with the 7700S) interfaces directly with the 7700S (bypassing the 7840hs/7940hs). The 7700S supports the 52.3636 Gbps variety of DisplayPort 2.1, although whether that works will depend on the implementation (I’d expect the real implementation is likely either the 25.92 Gbps or 38.79 Gbps variety; the 52.3636 Gbps one is tough to achieve due to parts availability).

However, even if the system is limited to DisplayPort 1.4, that is still enough to drive 4k/120Hz/HDR using a visually lossless compression algorithm (one that technically reduces quality, but in a way that fewer than 25% of people can tell in a side-by-side comparison).

TLDR: The laptop can definitely drive a 4k/120hz/10-bit HDR panel at near full quality (probably imperceptible for the vast majority of people). Whether it can actually reach full quality is unknown, but DSC’s near-full quality is usually satisfactory.

That seems fine. I was curious if you had found one that supports greater than DisplayPort 1.4 for the upstream connection to the computer, but that one appears to be limited to 1.4 like most that I’ve noticed. So it has 80% of the bandwidth required to achieve 4k/120hz/10-bit HDR, but with DSC that should be fine (far better than whatever you’re currently getting with chroma subsampling compression).

I can drive the 5120x2160 72Hz 10-bit LG DP 1.4 monitor just fine over DP from the older iGPU on a Ryzen 7000 desktop (using the USB-C port). The Framework iGPU should run DP 1.4 just fine over USB-C also. I should be able to test it next month…


Update: the adapter showed up, but surprisingly works worse than the existing options. Though it does allow me to set the color depth to 12-bit (which looks great), it won’t exceed 60hz at 4k (even at 8-bit).

The FW16 SoC supports HDMI 2.1, per AMD’s documentation.

I would have to imagine that such an expansion card (which, as pointed out above, would be an improvement over the currently available 2.0b card) is at the front of the line on Framework’s roadmap.

However that is not actually relevant.

The HDMI capabilities of the CPU and GPU are completely unused on all Framework laptops.

The HDMI Expansion Card actually taps into the DisplayPort capabilities of the laptop and has a chip built into the card that processes those DisplayPort signals and translates them to HDMI. AFAIK the HDMI 2.1 versions of those chips are currently rare and expensive compared to the HDMI 2.0b versions, which is likely why there’s not yet an HDMI 2.1 version of the expansion card.

The CPU does support DisplayPort 2.0/2.1 at 40 Gbps according to AMD, which is not enough to run full HDMI 2.1 at 48 Gbps but would still be higher bandwidth than the 18 Gbps that the current HDMI 2.0b card supports (and HDMI 2.1 can take better advantage of the bandwidth available to it, so even with limited bandwidth, HDMI 2.1 can be worthwhile).

Unfortunately, TechPowerUp indicates that the USB4 controllers in the AMD CPUs that Framework is using are limited to only DisplayPort 1.4 (32.4 Gbps), which, if true, would further limit the bandwidth of a potential future card on the current laptop. That is still a lot more bandwidth than the current HDMI 2.0b 18 Gbps card can take advantage of, though.


Thanks. To confirm: nothing explicitly keeps FW from releasing an HDMI 2.1 expansion card that leverages the underlying capabilities of the AMD 7X40 HS boards in the FW16, right? It’s just an issue of time/parts availability/money/etc. at present?

Interesting re: the TechPowerUp coverage. Would you please share the link?

I’ve decided to go with the shotgun approach and ordered 3 more adapters/cables.

They could definitely produce an HDMI 2.1 expansion card that is capped at 25.92 Gbps of active bandwidth (excluding encoding overhead), which is 80% higher bandwidth than the 14.4 Gbps limit of the current HDMI 2.0b card. Such a card could also take much better advantage of the available bandwidth thanks to better compression algorithms such as DSC.

It might be possible to produce an HDMI 2.1 expansion card that is capped at 38.8 Gbps of active bandwidth; however, AFAIK the chips that would be needed for that are not yet readily available, and as I noted, TechPowerUp indicates that the 7040 series AMD CPUs can’t handle that much display bandwidth through the USB4 controllers. So that is unlikely, and achieving the full 42.7 Gbps of active bandwidth that HDMI 2.1 is capable of is definitely not possible currently.

TechPowerUp releases these detailed images indicating the I/O capabilities of processors, including details such as the version of DisplayPort supported by the USB4 controllers.

Source

In that image, below the USB4 controllers, it shows that they support DP 1.4a, an older version of DisplayPort limited to an active bandwidth of only 25.92 Gbps, which would be the bottleneck for the connection between the laptop and any HDMI 2.1 expansion card.

By comparison, on some other processors, such as Intel’s Meteor Lake, they list DP 2.1 in that same spot. It’s possible they made an error in listing DP 1.4a; however, it’s also possible that the CPU is indeed limited to DP 1.4a (25.92 Gbps) through the USB4 ports.

If someone reading this has a 7040 series laptop, a USB-C to DisplayPort dongle that supports DP 2.1 (the Framework DisplayPort Expansion Card should work), and a display that supports DP 2.1, they could potentially test to confirm (share what it lists under “Current Link Status” in the “Gaming → Display” tab in AMD’s software; I think it should be “10 Gbps x 4” if DP 2.1 is being used).


For anyone visiting this thread in the future, I’m a little surprised, but THIS adapter is providing 4k/120/10-bit (edit: 12-bit, in fact!) directly into the dGPU port so far.

https://www.amazon.com/gp/product/B0BY3VS8LF/ref=ppx_yo_dt_b_asin_title_o00_s01?ie=UTF8&psc=1

Looks just fine and, assuming it’s stable, will be good enough for me!

Could you please share what it lists under “Current Link Status” under the “Gaming → Display” tab in AMD’s software? That may provide insight into what underlying DisplayPort version is being used.

Interesting. That indicates it is using DisplayPort 1.4 (which has four 8.1 Gbps lanes bonded together for 32.4 Gbps total, or 25.92 Gbps after encoding overhead). Unclear if that is a limit of the adapter or the laptop.

However, achieving 4k/120Hz/12-bit requires at least ~39 Gbps including blanking, so it is being compressed to about two-thirds of the bandwidth. Given that it is likely using DSC compression (which is rated to compress to a third of the bandwidth and still leave 75% of people unable to tell the difference), the compression isn’t likely to be noticeable.
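
A quick sketch of how that ~2/3 figure falls out (the timing totals are approximate CVT-R2 values on my part, so the result is ballpark rather than exact):

```python
# Uncompressed requirement vs. what DP 1.4 can actually carry.
required_gbps = 3920 * 2287 * 120 * (3 * 12) / 1e9   # 4k/120Hz/12-bit incl. blanking
available_gbps = 25.92                               # DP 1.4 active bandwidth

print(required_gbps)                    # ~38.7 Gbps needed uncompressed
print(available_gbps / required_gbps)   # ~0.67 -> DSC only needs to squeeze to ~2/3rds
```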

I’m sure I’m not among the 25% that can distinguish the difference. I feel like it might not look quite as crispy as the HDMI 2.1-capable computers I’ve plugged in, but that could just be psychosomatic. Either way, I’m straight up delighted to get even close to the potential of this lovely display with the Framework and will be content.

And that 25% figure is for people who can tell the difference in a side-by-side comparison of uncompressed output against DSC compressing to only a third of the original bandwidth.

In your case, DSC is compressing to two-thirds of the original bandwidth (i.e. it is compressing much less, so the reduction in quality is smaller) and you’re not comparing side by side. So chasing getting it to work without compression is unnecessary (although I would probably try anyway, because I’m a perfectionist about this kind of thing).

Assuming that DSC is being used, I agree that is likely the case (I told you that it is using compression, so the knowledge that quality is reduced causes you to feel that quality is reduced even if it’s not noticeable).

It is also possible that even though DSC could be used, the software may have defaulted to something stupid like chroma subsampling (where it sends brightness data for all pixels but only sends color data for half or a quarter of the pixels), which would definitely reduce quality.

If that’s happening it should be indicated under the “Pixel Format” drop-down in the same “Gaming → Display” section of AMD’s software. If that says something like “RGB 4:4:4 Pixel Format PC Standard (Full RGB)” then everything is at full quality. However if it says anything about 4:2:2 or 4:2:0 then chroma subsampling is in use (and if it says something like “Limited RGB” at the end it means the vibrancy of colors is reduced).

Edit: Here’s an image from AMD showing the Pixel Format drop-down. The top option is best. The middle and bottom use chroma-subsampling. The 2nd from the bottom has limited vibrancy. The 2nd from the top is unclear (definitely has no chroma subsampling, but I’m not sure if it has the full vibrancy).
