Batteries are not charged at whatever power the supply can deliver; instead, via the PD protocol the laptop asks for a specific voltage (and current) at a given time.
As long as the power unit can provide the maximum the computer asks for, the battery will charge at its maximum rate, probably a max of around 80W. A lower-spec power unit will prolong the charging time, and if the laptop is being used it may lower the rate so much that the battery doesn't charge at all.
So the battery takes a maximum of around 80W; the rest is available for the laptop's use.
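As a very rough sketch of how that negotiation works (Python; the PDO list is a generic example of typical fixed PD offers, not what any particular adapter advertises):

```python
# Sketch of PD negotiation: the adapter advertises fixed voltage/current
# offers (PDOs) and the laptop requests one; the power available is just
# V x I of the accepted contract. The PDO list below is a typical example,
# not any specific adapter's advertisement.

typical_pdos = [(5.0, 3.0), (9.0, 3.0), (15.0, 3.0), (20.0, 5.0)]  # (volts, amps)

def negotiate(pdos, max_voltage_supported=20.0):
    """Pick the highest-power offer at or below the voltage the sink supports."""
    usable = [(v, a) for v, a in pdos if v <= max_voltage_supported]
    return max(usable, key=lambda va: va[0] * va[1])

v, a = negotiate(typical_pdos)
print(f"Contract: {v:.0f}V x {a:.0f}A = {v * a:.0f}W available to the whole laptop")
```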
A 240W power unit is fine as it won’t get stretched.
Downsides are:
It's not specific to the Framework, so if there is a support request around power it's not so easy to deal with.
It's a bit heavier to lug around, so maybe keep another, lighter one for day travelling.
It will draw more power in proportion to what is actually used. I.e. if efficiency is best at around 80% load, you would need to load the supply with the best part of 200W; lower power draws are usually less efficient. No worry unless you are out in the woods, in a cabin, with solar, in the winter. And I'm not concerned with the notion of 'waste' and the ecology, else I wouldn't buy a computer.
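A toy illustration of that last point (the efficiency curve here is invented purely to show the shape of the argument; real supplies vary):

```python
# Illustrative only: wall draw for a 240W supply whose efficiency peaks near
# 80% load and falls off at light load. The curve is made up, not measured.

RATED_W = 240.0

def efficiency(load_w: float) -> float:
    """Made-up efficiency curve: ~0.93 near 80% load, worse at light load."""
    fraction = load_w / RATED_W
    return 0.93 - 0.25 * abs(fraction - 0.8)

def wall_draw(load_w: float) -> float:
    return load_w / efficiency(load_w)

for load in (20, 80, 192):  # 192W is ~80% of the 240W rating
    print(f"{load:>4}W load -> ~{wall_draw(load):.0f}W from the wall "
          f"({efficiency(load) * 100:.0f}% efficient)")
```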
I think amoun is correct. It can allow up to 240W (PD 3.1), but I doubt the battery will be charged at that rate. Power usage is for the entire unit (battery charging, and the other laptop components such as CPU, GPU, memory, display, etc). Maybe 80W is safe like you said. If that’s the case, the rest of the power can be used elsewhere or just not used.
My current laptop has a 65W PSU. When completely dead, it can actually pull 50W to charge the battery (with laptop off). With it on, it uses the whole 65W (observed through a Kill-A-Watt meter). When fully charged, even if I max out CPU (it’s only iGPU), I only see 55W of usage. So there’s definitely power budgeting going on there.
You can provide 25W; the computer asks via the PD protocol for a voltage (and current), not a number of watts.
Not sure what the rest means; can you explain a bit more?
The laptop runs from the battery.
The power unit charges the battery via PD, which the computer negotiates.
If the laptop uses 25W and the power unit can only deliver 24W at some voltage, say 12V × 2A, then the battery's state of charge will slowly decrease, and since there are inefficiencies (the battery is only about 90% efficient at best), even less of that 24W is effectively available.
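To put rough numbers on that (Python; the 24W/25W/90% figures are the ones in this post, the 55Wh pack size is the Framework Laptop 13 battery, and the efficiency handling is a simplification):

```python
# Net battery drain when the adapter can't keep up with the system draw.
# Figures from the post above; treating the ~90% battery efficiency as a loss
# on energy drawn from the pack is a simplification for illustration.

adapter_w = 12.0 * 2.0      # 24W contract: 12V x 2A
system_w  = 25.0            # what the laptop is drawing
battery_efficiency = 0.90   # ~90% efficient at best, per the post

shortfall_w = system_w - adapter_w                     # 1W must come from the battery
drain_from_pack_w = shortfall_w / battery_efficiency   # ~1.1W of stored energy per hour

pack_wh = 55.0  # FW13 pack, for scale
print(f"Shortfall: {shortfall_w:.1f}W, effective drain: {drain_from_pack_w:.2f}W")
print(f"A full {pack_wh}Wh pack would last roughly {pack_wh / drain_from_pack_w:.0f} hours at this rate")
```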
For my Gen 11 Batch 1 FW13, a 30W charger will charge while in use (not heavy use, though). Anything smaller didn't really want to charge while the computer was in use; turned off, it would charge slowly.
I have not seen any reports of a minimum cut-off where the laptop will not charge if it's unable to negotiate high enough. In fact, an "emergency charge" mode exists where it will take 5V.
That being said, a 45W (and especially a 25W) adapter will charge more slowly because the voltages and currents available are lower than what Framework's supplied adapter can provide.
On both Framework Laptop 13 and Framework Laptop 16, the batteries are designed to charge at 1C, and the charging circuitry will only ever charge up to that rate, regardless of the power adapter wattage. That means up to 55W/61W on 13 and 85W on 16. Charging above 1C would require either putting extra wear on the battery that limits cycle life, or using a battery chemistry that has less energy density.
There is some loss in the charging circuitry, so you'll need something like a >60W/65W power adapter for 13 and a >90W power adapter for 16 to charge the battery at full rate if the system is in standby, or more if the system is in active use.
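Putting those figures together (Python; the pack capacities are the ones quoted above, and the ~88% charging-circuit efficiency is an assumption consistent with the "some loss" comment, not a stated spec):

```python
# 1C charging in watt terms: a pack charged at 1C takes roughly its Wh rating
# in watts. Adapter sizes then need headroom for charging-circuit losses.
# The 0.88 efficiency figure is an assumption for illustration only.

packs_wh = {"FW13 (55Wh)": 55, "FW13 (61Wh)": 61, "FW16 (85Wh)": 85}
charge_circuit_eff = 0.88

for name, wh in packs_wh.items():
    one_c_w = wh  # 1C in watts is roughly the capacity in Wh
    adapter_needed = one_c_w / charge_circuit_eff
    print(f"{name}: 1C ~= {one_c_w}W into the pack, "
          f"needs roughly {adapter_needed:.0f}W at the adapter (standby, no system load)")
```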
1C is the maximum, so 84W ÷ 16V is roughly 5A, but that rate is only held for part of the charge; forcing 5A into a battery that is really flat or nearly fully charged would damage it, perhaps even catastrophically, so a full charge can take two hours or more.
The rate is lowered not just to avoid that kind of failure but to reduce wear, which is largely down to heat.
Charging an empty or nearly full battery stresses the cells and generates heat, so at both extremes the input may drop to something like 5V at 0.9A, the old USB 2.0 baseline.
Now, a charger may be able to provide 5A at 20V, i.e. 100W.
85W divided by the battery voltage gives the maximum charging current.
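In code form, that last line of arithmetic (the 16V figure is the nominal pack voltage used above, not a measured value):

```python
# "85W divided by the battery voltage gives the maximum charging current."
def max_charge_current(charge_power_w: float, pack_voltage_v: float) -> float:
    return charge_power_w / pack_voltage_v

print(f"{max_charge_current(85.0, 16.0):.2f} A")  # ~5.31A at a nominal 16V pack
print(f"{max_charge_current(84.0, 16.0):.2f} A")  # ~5.25A, the '84W / 16V is some 5A' above
```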
I.e. there are benefits to having a 100W charger? 1C into the battery. If you have a 65W charger you have nothing left over. So if I draw 40W on the platform, only 25W can go into the battery?
If I have 100W, it means 60W can go into the battery while 40W is used by the system.
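As a quick sanity check of that arithmetic (Python; the 85W 1C cap is the FW16 figure from the quote above, and conversion losses are ignored for simplicity):

```python
# Does a bigger charger help? Battery power is whatever is left after the
# platform draw, capped at 1C. Losses ignored for simplicity.

ONE_C_CAP_W = 85.0   # FW16 1C figure from the quote above
platform_w = 40.0

for charger_w in (65.0, 100.0, 180.0):
    to_battery = max(0.0, min(ONE_C_CAP_W, charger_w - platform_w))
    print(f"{charger_w:.0f}W charger: ~{to_battery:.0f}W into the battery")
# 65W -> ~25W, 100W -> ~60W, 180W -> ~85W (1C cap reached)
```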
Here is another set of the same data; it does require some interpretation.
Can't remember why it dropped to 3W, but that is the base draw when plugged in and not in use. I see no drain when hibernating without power, etc.
I’ll do another test next time I run the battery down to zero
The main points are:
With a completely discharged battery, the initial charge rate for the first hour is low.
Then it starts ramping up, so it seems the battery has to reach around 15% before it's deemed safe to take a 1C charge.
After another hour it drops off . . . so that's some 65Wh of input over two hours, which at an efficiency of maybe 85% gives around 55Wh into the battery.
Note there are losses in: a) the power unit (I am measuring the power unit's input, not the battery), and b) the battery, which has inefficiencies too.
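In rough numbers, with those two losses called out separately (only the wall-side total is measured; the split of the ~85% overall efficiency between power unit and battery is a guess):

```python
# Rough energy accounting for the two-hour charge described above.
# Only the ~65Wh wall-side figure is from the measurement; the individual
# efficiency figures are assumptions chosen to give ~85% overall.

wall_in_wh = 65.0       # measured at the power unit's input over ~2 hours
power_unit_eff = 0.92   # assumed (a)
battery_eff = 0.92      # assumed (b)

into_battery_wh = wall_in_wh * power_unit_eff * battery_eff
print(f"~{into_battery_wh:.0f}Wh stored")  # ~55Wh, i.e. roughly a full FW13 pack
```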