I got the dGPU over the weekend and am now able to see how much my battery drains under intensive gaming. Seeing how hot the power supply got, I couldn’t help but wonder if it was thermal throttling on power delivery. Does anyone know whether the PSU will thermally throttle its supplied wattage under heavy load at normal ambient temperatures? I’m more than happy to put a $10 heatsink on there if it means higher wattage output.
Update: I’ve decided to order a USB-C power meter; I intend to measure power delivery over time and use a thermal camera to monitor the temperature of the PSU. I ordered this meter, and while I’m not entirely convinced of its reliability as a sensor, I couldn’t find anything better.
I’m going to try cross-referencing it with a clamp-on ammeter.
I don’t think thermal throttling is the right behavior for a power supply, as it would cause confusion and make troubleshooting harder. Instead, it should shut down completely when overheated.
GaN power supplies can get incredibly hot and still work. While the main upside of GaN semiconductors is their electrical properties, they can also take a lot more heat than regular silicon parts, to the point where pretty much anything else will fail first. The caps still don’t like it warm, though XD.
With great power density comes great heat even if you are incredibly efficient.
Modern power supplies do have thermal protection, but that’s usually not throttling; they shut off entirely when they overheat.
I have this meter. It has been working well when I’ve used it. The reported power usage is reasonably accurate & reliable, though I would not call it a precision instrument (then again, it’s not sold as such, so …). Readings seem to be accurate within ±0.05V and ±0.01A. Definitely good enough for monitoring USB PD usage.
Thanks for sharing! I found that the manufacturer’s site states their testing showed a ±3% error at 20V, which (assuming the same margin holds at higher voltages, which I don’t necessarily believe) would give a maximum error of ±7.2W. That’s within what I consider an acceptable margin when measuring something as high as 240W. At the 180W this PSU is rated for, it drops to ~5.4W.
Definitely not good enough if I were doing professional power-sensitive work, but for something casual like this it may be fine. I anticipate my little thermal camera has similar limitations.
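For anyone who wants to sanity-check that math, here’s a quick sketch (it assumes the ±3% scales with the reading at higher voltages, which, as noted above, the manufacturer only states for 20V):

```python
# Sanity check on the meter's error margins.
# Assumption: the manufacturer's ±3% figure scales with the reading,
# even though they only quote it for 20V.
ERROR_FRACTION = 0.03

for watts in (240, 180):
    margin = watts * ERROR_FRACTION
    print(f"{watts}W reading -> ±{margin:.1f}W "
          f"({watts - margin:.1f} to {watts + margin:.1f}W actual)")

# Output:
# 240W reading -> ±7.2W (232.8 to 247.2W actual)
# 180W reading -> ±5.4W (174.6 to 185.4W actual)
```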
How hot a brick gets depends on a lot of factors: how hard it is loaded (as a percentage of max load capacity), but also output voltage, switching frequency, and input voltage.
I found that switching power supplies tend to run cooler on 120V than on 240V, despite being more efficient on a 240V grid.
I also have that same meter. I don’t use it a lot, but I do use it to check outputs from time to time. It seems rather fragile, and mine had some plastic rattling around inside when I first got it. I saw another one when I was looking but forget what it was called; it was a cable with the meter built in, and it was cheaper.
They tend to have lower on-resistances and allow for pretty fast switching speeds with lower losses, thanks to low gate capacitance and such.
They also tolerate high temperatures a whole lot better than their silicon counterparts.
That doesn’t really make logical sense: either it’s more efficient or it’s hotter; it can’t be both at the same time without some other external influence.
The efficiency difference is something like 7%. A very interesting discovery as well.
Although that was observed on Apple 10W bricks, which don’t have a published efficiency-load curve.
I can still test it. Our lab has a 230V transformer (since you won’t get a 208V hookup for your laptop).
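In the meantime, to put a number on why a ~7% efficiency gap matters: the heat a brick has to shed is input power minus output power, so a few points of efficiency change the dissipation a lot. A rough sketch (the 90% and 83% figures are illustrative guesses, since the curve isn’t published):

```python
# Heat dissipated by a charger = input power - output power.
# The efficiency figures below are illustrative guesses, not measurements.
def heat_watts(output_w: float, efficiency: float) -> float:
    input_w = output_w / efficiency
    return input_w - output_w

for eff in (0.90, 0.83):
    print(f"10W out at {eff:.0%} efficiency -> {heat_watts(10, eff):.2f}W of heat")

# Output:
# 10W out at 90% efficiency -> 1.11W of heat
# 10W out at 83% efficiency -> 2.05W of heat
```

So a ~7-point gap nearly doubles the heat at the same load, which is why running hotter on the more efficient grid would need some other explanation.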
My USB power meter arrived, so I ran some games and checked it out. As far as I could tell, the power supply was not thermal throttling; it put out about as much power the whole time despite the rising temperature. The highest temperature I saw on the power supply, according to my thermal camera, was in the 60°C range. It might have climbed a bit more if I’d run longer, but based on how hot it feels to my hands after hours of heavy use, I don’t think it would have gone much past that.
One thing I did find interesting, though, is that it seemed to max out at only 155-160W of output, at which point my battery was draining at up to 20W. If the error on the sensor is ±3%, then actual draw could be as high as about 168W, so let’s say 170W. This (somewhat arbitrary) error-adjusted supply figure is clearly lower than 180W, which by itself wouldn’t seem terribly concerning, were it not for the fact that an extra 10W of delivery would make a drastic difference in game time and battery draw: it would be the equivalent of doubling the runtime of a battery-dependent gaming load, and entirely eliminating battery draw for the loads that only pulled 10W. If I assume the sensor was accurate and the PSU was delivering 157W on average, then the missing 23W would account for all the power needed to handle the heavy loads during my testing.
Maximum power draw from the PSU was no higher than the measured 155-160W even when the PSU was cold (room temperature) after sitting idle for a while.
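To put the accounting in one place (using the full-scale ±3% figure from earlier for the worst case, which is my own assumption), here’s a rough sketch:

```python
# Power accounting for the gaming test, using my numbers from above.
RATED_W     = 180   # PSU rating
AVG_READ_W  = 157   # rough average meter reading under heavy load
PEAK_READ_W = 160   # top of the 155-160W range I saw
BATTERY_W   = 20    # peak battery drain observed at the same time
FS_ERROR_W  = 7.2   # ±3% of the meter's 240W full scale (my assumption)

print(f"Worst-case actual supply: {PEAK_READ_W + FS_ERROR_W:.1f}W (< {RATED_W}W rated)")
print(f"Shortfall if the meter is accurate: {RATED_W - AVG_READ_W}W "
      f"vs {BATTERY_W}W battery drain")

# Output:
# Worst-case actual supply: 167.2W (< 180W rated)
# Shortfall if the meter is accurate: 23W vs 20W battery drain
```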
Here’s a sample of the images I collected during a “heavy load” run of Space Marine 2. The USB-C power meter in this sample shows 156W of draw, while at the same time the battery reports a 19.3W load, which translates to just under 5 hrs of game time.
I’ll also note that I’m not at the extreme end of this power meter’s readable range, as it’s rated for up to 240W; I’m sitting at about 65% of its maximum throughput. I unfortunately don’t have a good way to test the meter’s accuracy or precision.
This might explain why Windows says “this device might have limited functionality” every time I connect my 100W charger, which is 20V 5A.
If it caps at 4.41A, then it does indeed have “limited functionality”.
This strengthens the argument for a “universal barrel-jack charger”, or a “universal charger port”, with vastly enlarged contacts for higher currents (10A?) and minimal voltage drop.
It would also be much easier to implement, since it’s “just power”, which lowers cost.
I’m not sure. I imagine either a safe shutdown, an unexpected shutdown, or performance throttling of the processors.
If there is a cap at 4.41A, that seems oddly specific to me. The device is supposed to be capable of 240W charging, granted that’s in no small part thanks to the use of a higher voltage. If 240W requires 48V, and the Framework is limited to 4.41A (we don’t know for sure that it is), then it would be capped at pulling just under 212W instead of 240W.
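For reference, here’s what a 4.41A cap would imply at the two voltages in question (the cap itself is unconfirmed, as noted):

```python
# What a hypothetical 4.41A current cap would imply at the two
# USB PD voltages discussed here (the cap itself is unconfirmed).
CURRENT_CAP_A = 4.41

print(f"20V x {CURRENT_CAP_A}A = {20 * CURRENT_CAP_A:.1f}W")  # 88.2W: a 20V/5A "100W" charger looks limited
print(f"48V x {CURRENT_CAP_A}A = {48 * CURRENT_CAP_A:.1f}W")  # 211.7W: just under 212W instead of 240W
```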
I wonder if that’s due to the voltage drop you’re seeing? That would make a lot more sense than a rather random current limitation. Also, resistance in the USB-C port wouldn’t cause a lower voltage to show at the meter, as another comment is suggesting.
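To give a rough sense of how much series resistance it would take to matter, here’s an I·R sketch (the milliohm values are illustrative guesses, not measurements):

```python
# Rough I*R voltage-drop sketch. The milliohm values are illustrative
# guesses for total series resistance, not measurements.
CURRENT_A = 4.41

for r_mohm in (30, 100):
    drop_v = CURRENT_A * (r_mohm / 1000)
    heat_w = CURRENT_A * drop_v
    print(f"{r_mohm} mOhm at {CURRENT_A}A -> {drop_v:.2f}V drop, {heat_w:.1f}W of heat")

# Output:
# 30 mOhm at 4.41A -> 0.13V drop, 0.6W of heat
# 100 mOhm at 4.41A -> 0.44V drop, 1.9W of heat
```

Whether the meter would even see that drop depends on whether the resistance sits upstream or downstream of where it measures, which is the crux of the disagreement above.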