Btw, an interesting fact: configurations with more cores can sometimes be more efficient than ones with fewer cores at the same performance level, because running at lower clocks saves more power than feeding the extra cores costs.
This mainly applies to high-performance workloads and is usually the other way around at low utilisation. However, combined with something like a MUX switch that completely deactivates the dGPU in low-power scenarios (or, with this laptop, just taking the GPU out), higher-core-count silicon, e.g. a 9070 power-limited to 9060 XT wattage or both limited to 120W, might be more power efficient under load.
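To make that concrete, here is a rough back-of-envelope sketch (not measured data). It assumes dynamic power per core scales roughly with V²·f and that voltage scales roughly linearly with frequency under DVFS, so per-core power grows roughly with the cube of the clock; real chips have leakage and non-linear V/f curves, so the numbers are purely illustrative.

```python
# Rough sketch: why more cores at lower clocks can win at equal throughput.
# Assumes per-core dynamic power ~ C * V^2 * f and a crude linear V/f curve.
# Real silicon adds static/leakage power, so treat this as illustrative only.

def package_power(cores: int, freq_ghz: float, v_per_ghz: float = 0.25) -> float:
    """Relative dynamic power for `cores` cores each running at `freq_ghz`."""
    v = v_per_ghz * freq_ghz            # crude linear V/f assumption
    return cores * (v ** 2) * freq_ghz  # P ~ n * C * V^2 * f (C folded into units)

# Same nominal throughput: 8 cores * 4.5 GHz == 12 cores * 3.0 GHz
few_fast  = package_power(cores=8,  freq_ghz=4.5)
many_slow = package_power(cores=12, freq_ghz=3.0)

print(f"8 cores @ 4.5 GHz : {few_fast:6.1f} (relative units)")
print(f"12 cores @ 3.0 GHz: {many_slow:6.1f} (relative units)")
# The 12-core config comes out well under the 8-core one despite equal
# cores * GHz, which is the effect described above under full load.
```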
Well, I guess we do have new information on the Framework 16 :). A couple of thoughts and questions from my side.
Firstly, one has to say it, huge congratulations to the company for actually making upgradeable graphics happen - wow, what a milestone in the laptop industry.
Then, second point, I guess it just is what it is regarding gaming performance. Framework is not Razer; Razer customers will always have a 5090 option in the same form factor, while Framework doesn't seem to have a high-end objective - which is fair. But they actually are in touch with Nvidia, which means there is "hope" ;). I maintain that 8GB of VRAM is not helpful for "sustainability", but that's another debate.
Love the other innovations, 240W charger - wow.
A couple of questions:
If I purchase the new mainboard but keep gaming with an external GPU via OCuLink - how much would the new CPUs (the higher end one) increase my fps? Any ideas? (using RTX 3090 via OCuLink 4i).
How is the 240W charger useful? Can someone explain to me? If the GPU is 100W and the CPU 45W, who needs it? I am certainly missing something, but I am just not sure why Framework delivers a 240W charger but not a 175W graphics module ;). What is it for? Just to charge the laptop a bit faster? Who would carry around a brick just to charge a bit faster?
I had the (possibly mistaken) impression that that was an Nvidia requirement. If Nvidia allowed more VRAM, they might not sell as many of their more expensive desktop cards.
Right now, if you play a demanding game on Balanced or Performance mode, the battery drains. A 180W charger doesn't quite deliver 180W - it's a bit lower - and then there are efficiency losses, and where Nirav was saying "sustained", that's not to say that the parts don't pull a bit more. And until now, the computer market at large hasn't supplied a 240W charger that actually plays nice with the Framework 16. Some power-related issue somehow causes stuttering in games as far as I've read (and I've seen it first-hand with most of the 140W adapters I've tried personally).
A 180W adapter should deliver 180W to the laptop - a good one will allow a few percent over before cutoff, like 185W for example. But voltage may "droop" a bit at high current draw, over the cable or in the power conversion circuits of the adapter. Here's some low-level testing of the Framework 180W adapter showing it hitting the full 180W on the output: https://www.youtube.com/watch?v=w10htntCKow (it also introduces a bunch of other ideas about how to think about the performance and behavior of power adapters)
But consider that the power adapter potentially needs to supply the peak possible GPU power, the peak possible CPU power, and battery charging, all at the same time. GPU and CPU power may spike very quickly, too fast for the system to adapt the budgets for the different components. Also, the firmware needs to account for power adapters that don't quite meet their specs (which are very common and users expect them to work), by slowly ramping up power usage and backing off if voltage droop is detected.
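For a rough sense of the arithmetic, here is what the budget can look like when everything peaks at once. All of the individual numbers below are assumptions for illustration, not measured Framework specs:

```python
# Illustrative power-budget arithmetic for why a 180W brick can fall short.
# Every number here is an assumption for the sketch, not a measured spec.

gpu_peak_w       = 100  # graphics module peak draw (assumed)
cpu_peak_w       = 65   # CPU boost draw above its sustained rating (assumed)
battery_charge_w = 30   # charging the battery while gaming (assumed)
system_misc_w    = 10   # display, fans, SSDs, RAM, etc. (assumed)

demand_at_load = gpu_peak_w + cpu_peak_w + battery_charge_w + system_misc_w

conversion_efficiency = 0.92  # cable + on-board conversion losses (assumed)
adapter_needed = demand_at_load / conversion_efficiency

print(f"Demand at the load side : {demand_at_load} W")
print(f"Adapter output required : {adapter_needed:.0f} W")
# Well over 200W at the adapter, which a 180W brick cannot cover, so the
# firmware has to throttle or let the battery drain; 240W leaves headroom.
```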
And finally, the laptop firmware may just be a bit buggy and not very smart. @James3 has published his own fork of the EC firmware which fixes stuttering on 140W adapters and similar issues. It's not particularly easy to install, but see A call on 240w adapter - #396 by James3
Thanks for the detailed correction. I thought someone had measured it and, from the "laptop" end, it was only pushing 150W. But there are lots of threads - I am most likely wrong.
I've never seen 180W on any of the USB-C hardware power monitor things I've plugged into the laptop… chalking that up to conservative laptop-side behavior is probably right, I guess. (I wouldn't know.)
If you were wrong, I don't think you were very wrong.
You aren't wrong. I can only get the 180W PSU to max out at around 163W when trying to push it. And if I launch a demanding game (Helldivers 2, for example) which pushes the GPU to 100% and the CPU to around 75%, then launch stress to max out the CPU as well, it just throttles everything down to about 60W, which is pretty ridiculous.
And when playing heavy games for hours, even with the PSU "maxing" out at 164W, the battery slowly drains and that PSU becomes too hot to touch.
Some here may tell you it is "designed" for this or that, but unless they have the benchmarks and testing to prove it, take it with a grain of salt.
Nvidia would need to support 24Gbit (in other words, 3GB) modules or clamshell/double-stacked modules on the 5070, and then it's mostly about drivers.
The problem could also be "solved" on the hardware side (though it would still need driver support), but that would mean a complete redesign, or using a better chip in the first place, which hurts their margins. The 5070M has a physical memory interface of 128 bits, and each memory module uses 32 bits, so the maximum is 4 chips. If someone wanted to use more chips, they could only stack another 4 on the other side of the PCB sharing the same bus, which is both expensive and bad for compact devices. Otherwise Nvidia would have to give the card a wider memory interface, for example 192 bits (6 chips) or even 256 bits (8 chips), but as stated before, that would mean a costly (millions of dollars in R&D) redesign.
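As a quick sanity check on those numbers, here is the arithmetic in a few lines. The bus and per-module widths are from the post above, the module densities are standard GDDR6/GDDR7 sizes, and the helper function is just for illustration:

```python
# Quick arithmetic behind the VRAM options discussed above.

def vram_gb(bus_width_bits: int, module_width_bits: int = 32,
            module_gbit: int = 16, clamshell: bool = False) -> float:
    """Total VRAM in GB for a given memory bus width and module density."""
    modules = bus_width_bits // module_width_bits
    if clamshell:
        modules *= 2               # second set of chips on the back of the PCB
    return modules * module_gbit / 8  # Gbit -> GB per module

print(vram_gb(128, module_gbit=16))                  #  8.0 GB - current 128-bit config
print(vram_gb(128, module_gbit=24))                  # 12.0 GB - with 3GB (24Gbit) modules
print(vram_gb(128, module_gbit=16, clamshell=True))  # 16.0 GB - clamshell, expensive/bulky
print(vram_gb(192, module_gbit=16))                  # 12.0 GB - wider bus, full redesign
```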
Anyone else having USB issues? For example, if you 3D print and keep moving the same SD card back and forth, or do embedded development where a device goes on, off, on, off…
With AMD stuff it doesn't work well. It gets slower and slower with each detachment, no matter whether the disconnect happens in software or hardware.
Unfortunately, yes. I posted this somewhere else on these forums for a couple of other folks as well. I was a bit disappointed to learn that aspect, as I use both the 2280 and 2230 slots on my current 7940, plus the expansion bay with an additional 2280. I plan to swap to the AI 370 when it drops (pre-ordered) and have already swapped over to one larger 2280 and combined the drives, since the 2230 slot will be reduced. Good thing I have the expansion bay so I can come to terms with the fact that I still have additional storage. But I feel for the folks who will be using the dGPU and are limited (internally, anyway) because of this.
I mean, PCIe 4.0 x2 is still almost 4GB/s; I would not call that the end of the world, especially for a 2230, which tends to be on the slower side anyway since it has to use fewer, denser flash packages. Still, having the full width would have been nicer.
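For reference, here is where the "almost 4GB/s" figure comes from. This only accounts for the 128b/130b line encoding; protocol overhead is ignored, so real-world throughput lands a bit lower:

```python
# PCIe 4.0 per-lane bandwidth: 16 GT/s raw line rate with 128b/130b encoding.
# Protocol overhead (TLP/DLLP framing) is ignored here for simplicity.

GT_PER_S = 16          # PCIe 4.0 transfers per second per lane (giga)
ENCODING = 128 / 130   # 128b/130b line encoding efficiency

per_lane_gbps = GT_PER_S * ENCODING  # payload gigabits/s per lane
per_lane_gBps = per_lane_gbps / 8    # payload gigabytes/s per lane

for lanes in (2, 4):
    print(f"PCIe 4.0 x{lanes}: {lanes * per_lane_gBps:.2f} GB/s")
# x2 comes out to ~3.94 GB/s, x4 to ~7.88 GB/s - so the x2 slot is half
# the bandwidth of a full-width slot, not crippled.
```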