There is a thread on AMD's GitLab about this. At a high level, the issue is outside of the GPU driver; other parts of the stack need to improve. You can find more details in that thread.
In this case, how does Ubuntu get 8h YouTube streaming?
Read what I linked. It has all the technical details.
So the TL;DR version is that most Linux distros use the GPU's hardware acceleration as if the user were playing a video game? I guess that kind of makes sense, thank you for explaining it.
The problem is that in this case the hw acceleration seems to be somewhat broken and uses way too much power compared to what it uses on Windows (apparently they can do 1080p30 for around 1 W over idle there), to the point that software decoding uses less power than hw below roughly 1080p30.
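If anyone wants to measure this crossover themselves, here is a rough sketch. It assumes a Linux laptop that exposes battery draw in microwatts at `/sys/class/power_supply/BAT*/power_now` (some batteries only report `current_now`/`voltage_now` instead, in which case you'd multiply those); run it once at idle, then during playback with hw decoding on and off, and compare the deltas over idle:

```python
import glob
import time

def parse_uw(text: str) -> float:
    """Convert a sysfs microwatt reading (a string like '6500000') to watts."""
    return int(text.strip()) / 1_000_000

def read_power_w() -> float:
    """Instantaneous battery draw in watts, from the first battery found."""
    path = glob.glob("/sys/class/power_supply/BAT*/power_now")[0]
    with open(path) as f:
        return parse_uw(f.read())

def average_power_w(seconds: float = 30.0, interval: float = 1.0) -> float:
    """Average draw over a measurement window, sampling once per interval."""
    samples = []
    end = time.monotonic() + seconds
    while time.monotonic() < end:
        samples.append(read_power_w())
        time.sleep(interval)
    return sum(samples) / len(samples)
```

Usage would be something like `print(average_power_w(60))` while the video plays; only works on battery power, since `power_now` reads 0 or disappears on AC for many machines. `powertop` gives similar numbers with less effort, but this makes the idle-vs-playback delta easy to log.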
I am a bit worried about that, since in the issue they seem to be blaming it entirely on some unimplemented offloads and dismissing the part about the decoder itself using tons of power, but I hope that gets fixed at some point anyway.
Not a Framework user, but I can confirm this behaviour on my ThinkPad Z13 G1 running F40: software decoding (Chrome) actually draws less power than hardware decoding (Flatpak Firefox) on YouTube and Twitch, and YouTube playback still takes around 6-7 W on battery.
It’s the same for Firefox with hw decoding disabled, though keep in mind that’s only at lower resolutions; somewhere around 1080p is the crossover point, after which sw decoding starts losing badly very quickly.
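For anyone wanting to reproduce the hw-on vs hw-off comparison: Firefox's hardware decoding can be toggled via about:config (or a `user.js` in the profile directory). These are the pref names as they exist in current Firefox releases, but double-check them on your version, and note that Flatpak builds additionally need the VA-API drivers available inside the runtime:

```js
// user.js — restart Firefox after changing; set back to true to re-enable
user_pref("media.hardware-video-decoding.enabled", false); // disable hw decoding globally
user_pref("media.ffmpeg.vaapi.enabled", false);            // the Linux VA-API path specifically
```

You can confirm which path is actually in use during playback via the "hardware" line in YouTube's "Stats for nerds" overlay or in about:support under Media.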