The information you link to does not support your conclusions (in fact, it agrees with me).
If batteries really are good for as many charge cycles as that chart suggests, then I don't think this setting makes sense.
Ah, but the first chart doesn't say anything about how the charging was performed. The assumption is that the batteries were placed in an analyser and charged/discharged repeatedly. The point of the chart was to rebut torne's assertion that charging per se is detrimental.
The basis on which he made that statement still remains outstanding. Also, why do the BU docs take the trouble to explain in the Li-ion section HOW the charging should be done to preserve battery life? These latter points are what I've highlighted, and they are all linked in my first post. Do you want to comment on them?
My goal here is to establish that not charging to 100% and using partial charges is better than charging to 100% and letting the unit run flat.
The *chart* doesn't, but the article that contains it does indeed say that it's referring to full discharge/full recharge cycles. Also, the chart shows capacity dropping off with the number of charges, so it doesn't rebut my claim at all.
I never disagreed with any of the things BU claim; I just noted that they don't make much measurable difference with batteries in regular consumer electronics, and that therefore it's probably a waste of time to implement a more complex charging scheme that gives people less runtime.
I think the point you're missing is that if your suggested regimen is to charge to 70% and then only let it discharge down to 45%, you are throwing away a huge amount of your runtime *anyway*.
In a couple of years' time, your battery will probably be in better condition than that of someone who hasn't done that, but you haven't *gained anything* by doing so: their battery, even used randomly for the same length of time, probably has more 100-0% runtime than yours does 70-45% runtime.
My iPod Video's battery has not been replaced since the device was made in 2006; it was used by someone else before me, and neither I nor the previous owner took any care over its battery whatsoever. When I got it, halfway through its life, it lasted 22-23 hours; it now lasts 21-22 hours. If I only used it from 70% to 45% then even with a brand new battery I'd get way less runtime.
Charging per se is not responsible for lower capacity. I've found nothing in the BU docs to substantiate that normal charging is detrimental to a Li-ion battery's capacity.
"Throwing away" is an exaggeration; I'm choosing not to use it, but the battery can still deliver it, as opposed to being unable to. For me, ~4 hrs/session is good enough. Those who need more time should get a player with a battery of more capacity than 15 h to begin with. It's a sizing issue; ideally, a player that can deliver 3-4 times what you need is better.
An extra point is that of parasitic loads: charging a device while it's still on tends to trick the gauge into taking more charge. Since switching off the device isn't an option while charging it, accepting a lower charge is safer than 100%.
A person who charged to 100% and used it with full DoD has got more runtime out of the player, whereas I've deferred my use of it. But if they need more runtime, their battery isn't going to be up to it; mine will.
How do you know the battery actually has this much life left in it? Do you run battery benchmarks on it regularly?
There is another point here: yours is an anecdotal experience versus testing data from a battery company. I'm more inclined to give the latter the benefit of the doubt.
An iPod has better battery life to start with than a Clip+, which maxes out at 15 hours, and I've got very attached to it & RB.
The BU docs say it's good to do a calibration every 40 partial charges; if you use it every day, that's once a month. I prefer to do it once every 3 months. I'm not too bothered if the gauge goes out of sync by even 10% in that time, so long as I know the battery is still going strong.
Actually, the iPod Video's stock battery life as quoted by Apple is 20 hours, so mine after six years lasts *longer*, thanks to Rockbox being considerably more power efficient :p
Then maybe you need to go read more about battery chemistry rather than just relying on a single source? It is a fundamental fact of every battery technology that we have that they degrade with use no matter what you do with them. There is no free lunch. Usage pattern can change the rate of degradation, but nothing is going to stop it.
You and another person used your iPod Video for 6 years, and you still get 20 hours of runtime out of it today(!)
You and the other person took no care; that means you charged to 100%, ran it till it was dead, and recharged to 100% again, and you've been doing this regularly for the last 6 years too.
How is that possible? I'd have thought you'd be getting half the stated life out of it by now, if not less. OK, it looks like you got a 10% drop over the last three years, but that's nothing.
That's in about four-ish years' time. Whereas your iPod Video hasn't gone below 80% in 6 years, and that with no particular care. I will assume you use it regularly, say a few times a week, rather than a few times a month.
The point where I came in is your assertion that regular partial charges are worse than letting the device run down. I can't find any evidence that charging is bad, only that DoD is. It would seem you are using your experience with the iPod Video to claim that fewer charges are better than more?
As I said, the power consumption is lower, so the battery capacity probably has dropped quite a bit. I expect it has way less than 80% of its original capacity now, but I don't really know; I can't easily battery bench it in the Apple firmware, and I'm not motivated enough to do it the hard way, and I don't exactly know what Apple's stated life is based on, so without measurements from when the battery was newer it's hard to work anything out.
This just doesn't make any logical sense. If 4 hours a session is good enough for you and the full capacity of the battery is 15 hours of runtime, then you don't have a problem! You can use the battery however is convenient, and it'll be many many years before it's lost enough capacity that it can't deliver 4 hours any more. You are receiving no benefit from some complex scheme. Someone who wants to use the player for 12+ hours at a time might benefit from the battery having a longer life, but *can't achieve that* by having the charging behaviour change, because if they don't charge it as fully, or discharge it as fully, they can't get their desired runtime even while it's new.
The idea that you should have a battery that's 3-4 times the capacity you actually need is insane. Some devices happen to work out this way because their physical size/weight vs their power consumption allows for it, but for most consumer electronics this is basically impossible: nobody makes a high-end smartphone that lasts for most of a week on a single charge with regular usage, because you *can't do it* without making the device massive and heavy beyond what people will consider buying.
And, again, if you *have* got a battery that's 3-4 times the capacity you actually need, then caring very much about the charging behaviour is a waste of your time because it will have way more capacity than you need for a *very* long time whatever you do: it takes an extremely large number of cycles to lose 75% of capacity no matter how deep those cycles.
Always maintaining a Li-ion battery in a fully charged condition will shorten its lifetime. The chemical changes that shorten the battery lifetime begin when it is manufactured, and these changes are accelerated by high float voltage and high temperature. Permanent capacity loss is unavoidable, but it can be held to a minimum by observing good battery practices when charging, discharging or simply storing the battery. Using partial-discharge cycles can greatly increase cycle life, and charging to less than 100% capacity can increase battery life even further.
A 100-mV to 300-mV drop in float voltage can increase cycle life from two to five times or more. Li-ion cobalt chemistries are more sensitive to a higher float voltage than other chemistries.
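In charger terms, "charging to less than 100%" just means terminating at a lower float voltage. A minimal sketch of such a policy, assuming a value from the 100-300 mV range above (the names and numbers are mine, not from any real firmware):

Code:
/* Hypothetical targets based on the quoted 100-300 mV reduction. */
#define VFLOAT_FULL_MV 4200   /* the usual "100%" target            */
#define VFLOAT_LONG_MV 4100   /* -100 mV: less capacity per charge,
                                 but 2-5x cycle life per the article */

/* Crude check; a real charger also holds the target voltage and
 * lets the current taper before declaring the battery full. */
static int charge_target_reached(int cell_mv, int longevity_mode)
{
    return cell_mv >= (longevity_mode ? VFLOAT_LONG_MV
                                      : VFLOAT_FULL_MV);
}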
So, here's the next problem: our devices don't have "gauges". The batteries in almost all cheap consumer electronic devices do not have smart charging controllers built in, or even state-of-charge monitoring. The charge controller is part of the device, not the battery; it rarely stores any data whatsoever, and usually it's so dumb that you actually have to drive it in realtime in software to make it charge at all, which is why a lot of these devices can't charge without switching on.

The percentage of battery displayed in Rockbox is not the reading from some gauge; it's just a guess based on the current cell voltage read by an ADC, compared to a hypothetical discharge curve that's hardcoded into Rockbox for that player. It completely ignores internal resistance because we haven't got a good way to model it. That same cell voltage is also used to decide how to charge the battery (which phase of charging to be in, etc).

Smart batteries with real state-of-charge gauges are generally only found in laptops. So, yeah, all your comments about gauges and calibration and so on are completely irrelevant, because those things *do not exist* on any of the hardware we're talking about.
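To make that concrete, the whole "gauge" boils down to something like this (a simplified sketch, not the actual Rockbox source; the curve values are placeholders):

Code:
/* Simplified sketch of a voltage-based state-of-charge guess.
 * A per-player discharge curve is hardcoded; the values below
 * are placeholders, not a real player's data. */
static const unsigned short discharge_curve[11] = {
    /* cell millivolts at 0%, 10%, ..., 100% charge */
    3300, 3390, 3450, 3500, 3550, 3610, 3670, 3750, 3850, 3970, 4160
};

/* Linear interpolation between curve points -> percent (0-100).
 * Note it ignores internal resistance and load, as described. */
int battery_percent(int millivolts)
{
    if (millivolts <= discharge_curve[0])
        return 0;
    if (millivolts >= discharge_curve[10])
        return 100;
    for (int i = 1; i <= 10; i++) {
        if (millivolts < discharge_curve[i]) {
            int lo = discharge_curve[i - 1];
            int hi = discharge_curve[i];
            /* each table step spans 10 percentage points */
            return (i - 1) * 10 + ((millivolts - lo) * 10) / (hi - lo);
        }
    }
    return 100;
}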
Yes, it's an anecdote. The point was it's an example of real-world capacity loss. "Testing data by a battery company" here actually means "some graphs put up by some company that makes expensive advanced battery chargers", which if you think about it is someone with a vested interest in making you think that battery charging is complicated.. :p
Chargers for cellular phones, laptops, tablets and digital cameras bring the Li-ion battery to 4.20V/cell. This allows maximum capacity, because the consumer wants nothing less than optimal runtime. Industry, on the other hand, is more concerned about longevity and may choose lower voltage thresholds. Satellites and electric vehicles are examples where longevity is more important than capacity.
As I mentioned above, since there are no battery gauges in these devices, this is a waste of time and is just using your battery more for no reason. The Rockbox percentage is a completely fixed calculation and the only way to get it back "in sync" is to battery-bench your device with its battery in its current state, work out approximately what the discharge curve is as a result, and modify the source code with your readings.
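In practice that "recalibration" would mean editing a hardcoded table like the one sketched earlier: read the cell voltage off your battery_bench log at 0%, 10%, ..., 100% of the measured runtime and substitute those values in. For example (invented numbers for an aged cell):

Code:
/* Hypothetical replacement curve derived from a battery_bench run
 * on an aged battery; the values are invented for illustration. */
static const unsigned short discharge_curve_aged[11] = {
    3300, 3410, 3470, 3520, 3560, 3610, 3660, 3730, 3820, 3940, 4140
};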
Edit: Also, yaknow, I'm not stopping you (or anyone else) from implementing this and trying it, though obviously testing it objectively is going to be rather time-consuming and require comparing two devices. All I've tried to do is warn people that it may not be worth the effort. If you do implement it, test it, and can show that I'm wrong and the effect is significant, then that's great and I'll help you get your patch landed as soon as possible.
Hmm, I don't see any iPod Video logs posted for the battery runtime plugin, even though RB runs on the iPod Video.
It seems the runtime plugin is just not available in RB for the iPod Video.
A smartphone is two levels more complicated than a Clip+. First, there is the transmitter/receiver that needs to increase power as and when the signal weakens, and on top of that you might be doing some CPU-intensive computing on it. A lot of unpredictable and heavy pulsed loads in concert. Not charging to 100% might help.
Damn, so that makes it harder now to tell how your battery deteriorated over time.
I found out that the battery specs I quoted in my previous post were for a 0.2C discharge load; that means the battery runs down in 5 hours. That's what the graphs above are referring to. 0.2C appears to be a standard battery test. So to interpret those graphs, one must first consider how long the unit takes to run down. If it's 10 hours, that's a 0.1C load; if longer, then even less; 15 hours is a 0.067C load. The only way I know to get anywhere close to a 0.2C load is to play FLACs in a loop on the Clip+. Otherwise, with MP3s it will take longer, so the load is more like 0.1C or even lower. I'm assuming a 0.1C or lower load will take longer (twice as long?) to kill the battery than a 0.2C load, as it's more gentle.
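To spell the arithmetic out: for a roughly constant load, the C-rate is just 1 divided by the runtime in hours. A quick check (the Clip+ capacity figure is my assumption, used only for the current column):

Code:
#include <stdio.h>

int main(void)
{
    /* C-rate of a constant discharge = 1 / runtime in hours.
     * The 290 mAh Clip+ capacity is an assumption, used only
     * to translate the C-rate into a current figure. */
    const double capacity_mah = 290.0;
    const double runtimes_h[] = { 5.0, 10.0, 15.0 };

    for (int i = 0; i < 3; i++) {
        double c_rate = 1.0 / runtimes_h[i];
        printf("%4.1f h runtime -> %.3fC (~%.0f mA)\n",
               runtimes_h[i], c_rate, c_rate * capacity_mah);
    }
    return 0;   /* prints 0.200C, 0.100C, 0.067C as in the post */
}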
This is a good point. I was fearing that the Clip+ would be dead in about two years with normal use, so I was looking to see what could be done. There is a fixed amount of use before a battery gives up the ghost; you can take it in small chunks or in bigger chunks at a go. I guess this is your point: either way you end up with nearly the same total amount of work, so a deep DoD isn't going to make a tangible difference. Shorter DoDs are better, though not mandatory.
The idea comes from sizing lead-acid batteries for a home UPS. Oversizing is cost-effective in that case, as those batteries do not like deep discharges, and their cycle life can otherwise be considerably reduced.
OK, so I still have the point about not charging to 100%. Here is another article in favour.
Interesting, I did not realise there was so much control of charging in userland; I thought the device's firmware would be responsible, and the only option available to RB would be charge to 100% or nothing.
This means you have to individually code a charging algorithm for each device RB is ported to, that is, if you allow RB to do charging on that device in the first place.
Incidentally, what criteria do you use to decide the battery is full and to stop the charging? That 4.2V is reached, along with timing?
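To illustrate what I mean, here is the standard CC/CV scheme as I understand it (a generic sketch with made-up names and thresholds, not Rockbox's actual code):

Code:
/* Generic Li-ion CC/CV charging: constant current until the cell
 * hits the target voltage, then hold that voltage and stop once
 * the charge current tapers below a cutoff (commonly ~C/10).
 * Real chargers also run a safety timer in parallel. */
enum charge_phase { CHARGE_CC, CHARGE_CV, CHARGE_DONE };

#define VTARGET_MV 4200   /* "full" target voltage                     */
#define ITAPER_MA  30     /* ~C/10 cutoff for a 300 mAh cell (assumed) */

static enum charge_phase charge_step(enum charge_phase phase,
                                     int cell_mv, int charge_ma)
{
    switch (phase) {
    case CHARGE_CC:   /* full current until target voltage reached  */
        return (cell_mv >= VTARGET_MV) ? CHARGE_CV : CHARGE_CC;
    case CHARGE_CV:   /* hold voltage; current decays as cell fills */
        return (charge_ma <= ITAPER_MA) ? CHARGE_DONE : CHARGE_CV;
    default:
        return CHARGE_DONE;
    }
}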
The thing is, the battery benches tend to be close to what the user expects, assuming the battery is in good working order; I've not read any comments so far that the RB "gauge" is wildly off.
Ah yes, what a conspiracy.
Comparing with another device is not possible, though it would be ideal.
Can I prove to everybody here that this is a better strategy for longer battery life? No; all I can do is point to the sources, reputable ones, that say not charging to 100% is better. So I will do that for now.