Yes, it is the correct one. I just don’t like it when companies use third-party businesses to route and monitor the requests. kustomer.help has been going since 2015.
Next time I will email support directly. I’m usually more careful.
UPDATE: I have sent a letter noting my displeasure with the third-party use, whilst asking for exact details on how the BMS works. I imagine it’s part of the battery pack, and further, how that info is made available to the OS.
We use Kustomer as our ticketing system and have since our inception. Any frame.work domain linking to support is simply a mask as we’re utilizing Kustomer’s cloud platform. This is common practice. Knowledge Base/Support Tickets… Kustomer. Guides… Dozuki. Bugs… Jira. Welcome to 3rd Party tooling and platforms for business.
Here is a good video I found a while back explaining the various ways that lithium-ion batteries undergo physical damage and degradation. Very thorough, but I do not know for sure how many of these processes are applicable to the lithium polymer batteries sold by Framework.
I thought I’d had enough of monitoring the battery wear over the last year, but what is happening is clearly of interest.
After some three months of heavy unplugged use, and once a BIOS update let me ‘read’ the wear and the cycle count, I took a devoted interest.
a) I started to use the laptop plugged in >90% of the time and b) set the Battery Charge Limit (BCL) to 78%. This not only seemed to stop the rise in recorded wear, which had reached some 7% after four months, but the figure actually started to decline.
As it declined, I wondered whether the computer was simply reading it wrong. I have since discovered that the battery pushes the info to the OS and the OS does very little with it. So I thought maybe the battery itself could do with a reset.
So approximately each month I reset the battery by running it down until the laptop automatically turned off, then charging it to 100%. Coincident with that, each time the recorded wear dropped. I know it’s a bit uncanny.
Now, 13 months later, I thought it was time for a change, so I decided to lower the BCL to 69%. This was because at 78% the battery was still charging to nearly 4.165 V, which I can understand may wear the battery. With the BCL at 69% the battery now sits at 4.032 V.
I was expecting a change, but not the one I got. Today, 31st March, I did what was to be the last of this routine checking, and what happened . . .
The wear has just recorded 2.8%, which is lower than it was upon receipt. So either:
The reading is wrong now and there is a lot more wear, or
The reading is correct and the previous readings were way out.
The takeaway is this:
If measurements are not taken over a long period, with different settings and use patterns, and then tabulated and graphed, there is no easy way to see what is or may be happening. Here is the graph from 4th Jan 2023 to 30th June 2023.
UPDATE 19th April: Since I inadvertently let the charge drop a bit, I used the opportunity to reset the calibration: a full discharge > a full recharge > then discharge to my ‘new standard’ of 66%.
The cycles went up by 2, which I wasn’t expecting, and the wear roughly doubled, going from 2.6 to 4.8 ???
I just did a deep discharge and recharge cycle, followed by a couple more 90–100% cycles to see if it would gain any more at the top end. /sys/class/power_supply/BAT1/charge* reports 3084000/3572000 = 86.3% SOH. 1.4 years in service, usually charged to 75%; not sure how many full-cycle equivalents, but I’m going to guess maybe 300ish.
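That sysfs arithmetic can be sketched in a few lines of Python. This is a sketch under assumptions: `BAT1` and the µAh units come from the post above, and some batteries expose `energy_full`/`energy_full_design` (µWh) instead of the `charge_*` files.

```python
# Sketch: compute state-of-health (SOH) from the Linux sysfs charge
# counters mentioned above. Values are in microamp-hours (uAh).
# "BAT1" is an assumption; other machines may use "BAT0".

def state_of_health(charge_full_uah: int, design_uah: int) -> float:
    """SOH as a percentage of the design capacity."""
    return 100.0 * charge_full_uah / design_uah

def read_soh(bat: str = "BAT1") -> float:
    base = f"/sys/class/power_supply/{bat}"
    with open(f"{base}/charge_full") as f:
        full = int(f.read())
    with open(f"{base}/charge_full_design") as f:
        design = int(f.read())
    return state_of_health(full, design)

# The numbers quoted above: 3084000 / 3572000
print(round(state_of_health(3084000, 3572000), 1))  # 86.3
```

Wear, as discussed in this thread, is just 100 minus this figure.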
Glad replacement batteries are available-ish in the marketplace. Will be looking forward to upgrading to the 61Wh and moving this battery to the router/wireless AP/NAS unit instead.
I think you are reading too much into this. This data is mostly within measurement tolerances, and by undercharging so much you really should not have any measurable wear yet.
Since I don’t know exactly how your full-charge process works, I’ll just have to guess that you don’t discharge to completely empty with a constant load, so the zero point of the fuel gauge in the battery just keeps drifting downwards each time, which explains your graph.
And even if you do discharge until the battery turns off, different wear gives you different zero points because the voltage sag is different.
Would be neat if the Framework EC got the battery recalibrate mode ThinkPads have, where you can leave the laptop plugged in and it discharges and then fully charges the battery by itself, giving very consistent results.
For example, as I said, every few weeks I do a full discharge and then charge to 100%. Sometimes I do this twice and keep the battery at 100% for a few days.
The wear in the first few months was 7.5% at 45 cycles; this was when I was using it unplugged most of the time.
So sure, the readings are questionable, but then how else would the battery state be assessed?
As I’ve said you can see details of the changing charge rates etc.
The main issue is that I mostly use it plugged in, and if lowering the charge limit affects the recorded wear in the way I have seen, then clearly I will just keep testing to see how long it can keep this up.
As I said, all the details of charging and reported wear for 13 months can be viewed.
Sorry in advance, this may turn into a bit of a semantic debate but in this case it’s important to define terms accurately so we are not talking about different things.
Yeah, but not HOW you do a full discharge. This part is important because if you don’t go all the way to empty, those watt-hours will be missing from the total. If you just discharge until the laptop shuts down, that would pretty nicely explain your graph creeping downwards each time.
There are two ways to look at it. For one, there is actual wear (the one I am talking about), where the battery cells lose capacity and gain higher internal resistance and self-discharge. Then there is the wear reported by whatever tool you are using, which is the difference between the design capacity and the capacity the battery controller thinks it has (which I think is what you are talking about).
The fuel gauge in the battery basically works like this: it has a coulomb counter, so it counts the energy that goes in and out and updates the “capacity now” value based on that. When charging it increments; when discharging it decrements. Of course that process isn’t perfect, so it drifts over time and may over- or under-report.

When you fully discharge (like fully fully), it counts down until it reaches zero and then keeps going until the cell voltages reach the cutoff point, which is why, when you recalibrate a really confused ThinkPad battery or a battery where you swapped the cells, the battery percentage stays at 0% for a long time. Once it has actually emptied, it starts counting from actual zero and you should get a pretty accurate capacity. Note that the actual capacity of the battery did not change, just the measurement of it. And even that may not be exactly the same run to run if you don’t control the discharge rate, because the battery voltage sags based on load, which makes it reach the minimum faster under higher loads.

Also, some battery controllers get confused if the actual capacity is higher than the programmed design capacity XD. I once put 120Wh worth of cells onto a 24Wh ThinkPad battery and boy was that controller a dick about it; it just capped the measured capacity at 24Wh and then stayed stuck at 0% for hours.
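The drift-and-recalibrate behaviour described above can be illustrated with a toy model. This is purely illustrative (the class name, error size, and reset logic are invented for the example, not the actual gauge firmware):

```python
# Toy model of a coulomb-counting fuel gauge. Each partial cycle the
# counter picks up a small measurement error; a full discharge to cell
# cutoff re-anchors the zero point and clears the accumulated drift.

class ToyGauge:
    def __init__(self, true_capacity_wh: float):
        self.true_capacity = true_capacity_wh
        self.estimated_capacity = true_capacity_wh

    def partial_cycle(self, error_wh: float) -> None:
        # Error accumulates because the counter never sees a true zero.
        self.estimated_capacity += error_wh

    def full_discharge_recalibrate(self) -> None:
        # Counting all the way down to cutoff re-anchors the estimate.
        self.estimated_capacity = self.true_capacity

g = ToyGauge(55.0)
for _ in range(20):
    g.partial_cycle(-0.05)              # drifts 0.05 Wh low per cycle
print(round(g.estimated_capacity, 2))   # 54.0 -> reads ~1.8% phantom "wear"
g.full_discharge_recalibrate()
print(g.estimated_capacity)             # 55.0 again
```

The real battery did not gain or lose anything between those two prints; only the estimate moved, which is the pattern seen in the graphs discussed in this thread.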
Never going to 100% or fully empty does lead to accumulated error in the measured capacity, but it causes less actual wear.
I bet your test cycles put more wear on your battery at this point than your normal use XD
The discharge rate is always the same: browsing and the odd video, which takes around 5 to 6 hours.
Discharge is done with the same use, until auto power-off. Then I wait an hour or two, power on, and may get another half an hour of use. When it powers off I leave it for another hour, power on, and may get 15 minutes. Usually it will not power on a fourth time.
So if the gauge is coulomb counting, that is fine for measuring the actual energy.
As resistance increases while the battery charges, charging becomes increasingly inefficient.
I can also measure externally the amount of energy used to charge.
So the issue is: how representative of the battery charge is the coulomb count?
Regarding wear: I see it in relation to the cycles.
Framework state 1000 cycles for 20% wear, or 50 cycles for 1%.
Framework don’t say how the cycles are achieved or over what time, hence my checking to see if what I do correlates.
I’m aware that as a battery wears each cycle gets smaller, and yes, charging to 100% will be a bit stressful for the battery, but I can’t see how Framework’s 1000/20% statement would not include 100% charges.
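Framework’s 1000-cycles/20% figure reduces to simple arithmetic. A minimal sketch, assuming a purely linear extrapolation (which, as pointed out elsewhere in this thread, real wear curves are not):

```python
# Expected wear under the stated 20% loss over 1000 cycles,
# assuming (unrealistically) that wear is linear in cycle count.

def expected_wear_pct(cycles: float) -> float:
    return cycles * 20.0 / 1000.0   # i.e. 1% per 50 cycles

print(expected_wear_pct(50))    # 1.0
print(expected_wear_pct(45))    # 0.9 -- vs the ~7.5% observed above
```

The gap between that expectation and the observed readings is exactly what the calibration discussion in this thread is about.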
Yeah that probably explains the variance, you using windows or something else?
I’d bet it doesn’t discharge all the way down but just a little further each time which would explain your initial downward trend.
Framework really should implement the online recalibrate routine in the EC to eliminate that (hell, it’s open source, I might take a crack at it if they don’t).
That’ll just give you even messier data; the DC-DC efficiency changes a lot more than the charging efficiency does. You could use a slow charger to reduce the charging-efficiency differences even more if you want to do that.
Good enough, I suppose, since that is how it has been done for ages now, but it is certainly not perfect. Not that perfect is really required.
Since the wear curves are usually not linear, idk if you can extrapolate like that. They also afaik didn’t define what a cycle means, and your battery may actually have more than 55Wh from factory (even factory-new cells have varying capacities), which would hide the actual wear rate for a bit.
With 18650s the rule of thumb was about double the cycles for every 0.05V less max voltage (of course you lose capacity initially, but you’ll keep it longer). So if a 4.2V-max cell is rated for 80% capacity after 300 4.2→2.5V cycles, it would get 80% capacity after 600 4.15→2.5V cycles, 1200 4.1→2.5V cycles, and so on. Given that there usually wasn’t all that much capacity at the higher voltages, this tends to be a good tradeoff.
I have not seen much data on the spicy 4.40 or 4.45V chemistry the Framework battery uses, but I’d bet it has similar wear characteristics, so your charging to only 4V per cell should increase your longevity dramatically, assuming you aren’t grilling the battery, since temperature causes wear too, independently.
Anyway, the point is you should be careful drawing conclusions from data that is more noise than signal; humans are made to see patterns, so we often see them where they aren’t, or misinterpret stuff.
Would be neat to get a datasheet for the cells in the framework batteries but I doubt that’s going to happen.
The exercise is more about providing some background on what the readings represent under various charge limits and charging regimes, as the topic is a question.
While the results may not be truly representative of actual battery wear, they can be representative of use.
If the coulomb-count reading is fairly accurate, then I can compare the current capacity to what it was when the laptop was received, which gives me an idea of the wear.
UPDATE May 1st: I have another way of showing the wear, by comparing the recorded cycles with the wear values.
I hadn’t managed to get a record of the cycles until a specific BIOS enabled that, I think it was 3.09, so data is only available from then on.
The wear was 3.4% and dropped to 2.9%. However, it wasn’t until a few months later, when I was taking daily readings, that one day in May I saw the ‘wear’ at nearly 7% and the cycles around 50.
So I have compared the cycles to the wear based upon Framework’s claim that the battery will lose 20% capacity in 1000 cycles, or 1% every 50 cycles on average.
So the [Blue] line on the graph is the cycles divided by 50, hence starting at 1, given I started at 50 cycles.
The [Red] line is the wear minus 3.2 (a figure between the 3.4 and 2.9 that were shown via powercfg in the early days).
Ideally these lines would coincide, or at least run parallel. It is also worth noting that wear, i.e. capacity reduction, happens quite quickly in the first 100 cycles, although this is greatly influenced by temperature and time. Ref to be included, or see a link in one of my previous posts.
I usually use it plugged in with the limit set to 66%, but the mains plug inadvertently wasn’t secure and the charge dropped to 60%. So I reseated the plug and then thought I’d check the ‘wear’ again, having only done it 40 minutes before.
It could have gone either way, but it dropped from 4.1 to 2.8. UPDATE 16th May: The wear is recorded at 2.4 ???
So much for not reading too much into a couple percent, especially without a full discharge and charge cycle.
Running plugged in (especially with a charge limit) leads to drift in the coulomb counter over time. That, plus maybe a bit of battery self-discharge, would lead the battery to estimate its capacity a bit higher on a partial charge. That wear level is an educated guess by the battery, and with it never reaching one of the extremes it drifts more and more from “educated” towards “guess” over time.
Look at it this way: it’s like measuring the distance between work and home by counting steps. That can be quite accurate as long as you reach home or work from time to time and start counting from there. If you just move a couple of steps forwards and backwards over and over again, it gets less accurate.
Running an 1135G7 that I got in March 2022.
Using HWinfo64, my battery shows 10.2% wear after only 138 cycles.
I’m not really happy about this to be honest, as the claims by Framework that the battery is designed to retain 80% capacity after 1000 cycles seems to be a bit misleading. I’m not anywhere close to halfway there!
I’m not really using this laptop for heavy tasks. Light gaming while plugged in, but on battery it’s for document writing, web browsing, and watching videos. I don’t really use it off power that much.
So yeah, hoping to get some input. I’m not ready to be thinking about upgrading the battery, it’s still practically brand new!
Hi. Unless you have a reading from the beginning, it’s hard to say how much of the wear is down to initial capacity. The battery may say 55Wh nominal, which is used as the baseline, but could have been only 52.5Wh and so appear nearly 5% worn from the start.
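With the hypothetical numbers above (a 55Wh nominal pack that actually shipped with 52.5Wh), the apparent day-one “wear” works out as:

```python
# Apparent wear when the cells ship below the nominal design figure.
# 52.5 Wh vs 55 Wh are the hypothetical numbers from the post above.

def apparent_wear_pct(measured_full_wh: float, design_wh: float) -> float:
    return 100.0 * (1 - measured_full_wh / design_wh)

print(round(apparent_wear_pct(52.5, 55.0), 1))  # 4.5
```

So a few percent of reported “wear” can exist before the battery has degraded at all, which is why a baseline reading at delivery matters.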
Mine was apparently 2.9% and then rose to 7.4% in maybe 30 cycles (which extrapolates to 29.6% in 120 cycles). So I thought I’d see if it was a calibration issue. I won’t really know for a few years.
Then I tried calibrating by a full discharge and a full recharge. Every time I get a high reading I recalibrate, and over a year I have only 64 cycles and an average of 4.5%,
or maybe 9% for 128 cycles. In that sense what you have is comparable.
So I have a graph that compares recorded wear (I check a few times most days and take the lowest) to the wear that would/should have been seen given the cycles.
On the graph I’ve divided the cycles by 50 to get them on the same level.
The red is the expected wear given the number of cycles (1000 for 20%, i.e. 50:1); the other is the wear as recorded.
As you can see, the ratio is getting better. The data line was aligned to the cycles [red] line, which tracks expectations; being below the red means better than expected.
But note this data is from checking every day, so if you don’t do that, the odd recorded measurement can be way off, by a multiple of at least 4 in my case: 7.4% at 30 cycles.