Does anyone have any guess as to how much more power the AMD Radeon RX 7700S expansion module uses at idle compared to the expansion bay shell?
I don’t have much need for a dedicated GPU, but it would be nice to have it when needed. I’m just wondering whether the idle power usage difference is enough to justify powering down the system and swapping the modules, or whether I should just keep the GPU module in.
I’m not sure about the idle consumption of the 7700S, but in theory it shouldn’t matter: the system is supposed to automatically deactivate the dGPU at idle and just use the iGPU.
In practice, however, many users (of other laptops) have reported that poorly optimized software sometimes causes the OS to keep the dGPU active. For example, I’ve seen some people state that the Microsoft Phone app causes issues.
Early reporting and white papers suggest it pulls around 100 W (some report up to 115 W). That would make it more efficient than a desktop 3060, and a little under the mobile 7600M XT (about 20 W lower).
But we don’t know what Framework will clock it at. Estimates are that it will be perfectly stable at 75 W, but maybe it can go even lower? Some older GPUs were stable all the way down into the 30s, so… shrug, hard to say. Expect a 100 W draw, maybe up to 115 W, but Framework could potentially underclock it, perhaps even down to 75 W, which would be 5 W under the 7600M XT, give or take 5 for safety.
@Kody_Boen the poster is asking about the power consumption under little to no load. You seem to be talking about the power consumption under load, which Framework has already indicated will be around 100 W.
Ahh. Most GPUs today can run at around 85% of their rated power, so 85% of 100 W is 85 W. But there are also reports of it working at 75 W, so it might manage 75%. Eventually a GPU can’t stay stable if underclocked too far; it messes with memory timings, etc. So as I said before, likely 75 watts.
Unless this GPU is more like older GPUs, or more like APUs, in which case it could potentially go all the way down to around 33%, like in the old days, if it’s been designed specifically to run at extremely low wattage while still maintaining stability. You can technically get even a 4090 or 7900 XTX down to 30–33% of maximum power, but it takes a lot of work and they tend not to be very stable. Then again, it could be tuned specifically for this, so who knows. APUs can handle being extremely undervolted.
We’ll have to wait to see what Framework did with it. My bet, 75 watts. I’d put money on 55-60 watts, but I feel 75.
You might be a bit confused about underclocking vs. undervolting vs. power (watts) consumed at idle. See TechPowerUp: the 4090 can use up to 500 watts under load but still idles at 21 watts.
A quick test shows that my desktop 3070 draws around 20 W during light load (such as web browsing), despite consuming well over 200 W when actually under load.
Although even 20 W would be atrocious idle power consumption for a GPU in a laptop: against the Laptop 16’s 85 Wh battery, that alone would drain it at roughly 24% per hour.
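The back-of-the-envelope math, assuming the Laptop 16’s 85 Wh battery and a hypothetical constant 20 W idle draw:

```python
# Rough drain estimate: a constant 20 W idle draw against an 85 Wh battery.
battery_wh = 85.0   # Framework Laptop 16 battery capacity
idle_draw_w = 20.0  # hypothetical dGPU idle draw

percent_per_hour = idle_draw_w / battery_wh * 100
hours_to_empty = battery_wh / idle_draw_w

print(f"{percent_per_hour:.1f}% per hour")  # ~23.5%/h, i.e. roughly 24%
print(f"{hours_to_empty:.1f} h to empty")   # the dGPU idle draw alone would empty it in ~4 h
```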
My current laptop (a Razer Blade 15 with RTX 3070) will let the 3070 draw about 15W sitting at the desktop without anything running. I have spent a lot of time and energy trying to get it to turn off and stop drawing power unnecessarily, but alas any fix I find or implement is only temporary.
Now I simply run Arch Linux and the dGPU is disconnected via software and cannot be used unless I remove the block and reboot.
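For anyone wanting to do a similar software disconnect on Linux, two common approaches, sketched below. The driver name and PCI address are placeholders; check `lspci -k` for yours:

```shell
# Option 1 (persistent): stop the driver from binding at boot.
# Put this in e.g. /etc/modprobe.d/blacklist-dgpu.conf:
#   blacklist nouveau

# Option 2 (until next rescan/reboot): hot-remove the dGPU from the PCI bus.
# Replace 0000:01:00.0 with your dGPU's address from `lspci -D`.
echo 1 | sudo tee /sys/bus/pci/devices/0000:01:00.0/remove
```

Reversing option 2 is `echo 1 | sudo tee /sys/bus/pci/rescan`, or a reboot.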
With my 5700 XT, the idle power draw issues with multi-monitor configs only occurred if the monitors used non-standard timings. I was able to solve it by setting both of my monitors to use standard Coordinated Video Timings (CVT).
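For anyone hitting the same multi-monitor issue on X11, the usual route is generating a CVT modeline and applying it with xrandr. The resolution, refresh rate, and output name below are examples; substitute your own (see `xrandr -q` for output names):

```shell
# Generate a standard CVT modeline for 1920x1080 @ 60 Hz:
cvt 1920 1080 60
# ...then register and apply it (DP-1 is an example output name):
xrandr --newmode "1920x1080_60.00" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync
xrandr --addmode DP-1 1920x1080_60.00
xrandr --output DP-1 --mode 1920x1080_60.00
```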
There’s a Linux tool called nvtop that gives some power data and seems mostly accurate, but I got that figure by running the laptop idle at the desktop and reading the battery’s power draw from powertop, then disabling the GPU (and rebooting) and observing it again. There was a 15 W delta, which was roughly consistent with what nvtop reported.
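That delta method is easy to script once you have the two readings. The numbers below are hypothetical examples; in practice you’d take each one from powertop’s discharge-rate line in the corresponding boot:

```shell
#!/bin/sh
# Idle-power delta between two boots: dGPU enabled vs dGPU disabled.
# Example readings (watts) -- substitute your own powertop measurements.
with_dgpu=27.5
without_dgpu=12.5

delta=$(awk "BEGIN { print $with_dgpu - $without_dgpu }")
echo "dGPU idle draw: ${delta} W"   # prints: dGPU idle draw: 15 W
```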
There are tools on Windows that can do the same thing, like HWiNFO64.
I will note that the expansion bay modules don’t appear to be intended for regular swapping, as it involves removing screws inside the chassis. There are also some community concerns about how many mating cycles the connector can handle, but Framework doesn’t like us to speculate on that as they haven’t released an official port rating yet.
For my FW 16 I ordered it without the GPU because I’m sick of the workarounds it takes to disable it. I’ve got a desktop with a GPU if I need one, and there are always eGPU options.
Ars Technica answered this question in their review.
Battery life has always been a bit of a weak point for the Framework concept, and over the years, the company’s laptops have ranged from “passable” to “not great, actually.”
The Laptop 16 manages to be both; without the external graphics module, the 85 WHr battery manages a little over eight hours in our PCMark 10 battery test. This isn’t incredible, especially given the size of this battery, but it’s perfectly workable, roughly the length of a standard workday (if you’re traveling with the Laptop 16 in the first place).
The battery life with the graphics module installed, even with hybrid graphics enabled, was just over five hours. And this is in a test that doesn’t really use a dedicated GPU even when one is present. Other laptops with dedicated GPUs we’ve tested over the years have not been this affected by them in a general-productivity battery test; even a Lenovo ThinkPad X1 Extreme Gen 5 from a couple of years back does better, and it’s equipped with a power-sucking 4K display.
So if you’re traveling, maybe leave the GPU module at home or pack it separately. Between the size, the weight, and the hit to battery life, the best thing about its modularity right now is that you don’t have to live with all of its downsides all of the time.
On my current laptop, which has an RX 5600M, the dGPU drops to almost zero watts when it’s not in use and powered down.
The biggest issue is that sometimes an app grabs the dGPU when it really doesn’t need it; then the dGPU eats about 8 W by itself doing nothing.
If it didn’t happen on the review units, I suspect it’s due to a firmware/software-config issue, where possibly PCIe ASPM wasn’t set up correctly.
I use nvtop to measure it when the dGPU is in use, and battery-controller query tools like powertop when it isn’t. To get the GPU figure I look at the delta between GPU enabled and GPU disabled (via BIOS or kernel-module loading) under the same load conditions.
I’ve filed a feature request with nvtop to change the behavior. Mission center wraps nvtop and has the same problem, so if nvtop solves it mission center will too.
Another data point: in the LTT review, battery life was hardly affected by having the dGPU installed while doing common desktop work like watching a video. So if there’s a large difference in battery life, it means power management hasn’t been configured correctly; it can go down to nearly zero watts when not in use.