
Batteries, energy, and regenerative braking - an odd discrepancy


jd


I had an interesting experience yesterday and I'm hoping someone can help me make sense of it. My IPS 121+ went out with me for my first serious uphill ride and I collected some numbers I can't quite explain.

My wheel has a nominal 350 Wh (Watt hour) energy capacity. I'm a pretty big guy (~100kg) and I went up a 75 meter (vertical) hill at a pretty slow pace, maybe around 6-8 km/h, total road distance about 1.5 km; parts of the hill are pretty steep. Initial battery state as reported by the IPS app was around 70%. By the time I got to the top of the hill, the app was reporting 35%.

Assuming that the percentage reported by the app refers to the energy left in the battery and not something else, my wheel used about 35% of 350 Wh = 122 Wh of energy bringing me up the hill. The actual physical (gravitational potential) energy of moving 115 kg (me + wheel) up 75 m is ~85 kilojoules, or about 24 Wh. Going up a pretty steep incline, with heavy power draw from the batteries, plus balancing and everything else using energy on the wheel, an overall efficiency of only ~20% seemed plausible. To recap:

  • 24 Wh: minimum energy required to move me up the hill.
  • 35%: battery capacity change
  • 122 Wh: 35% of the 350 Wh nominal battery capacity; the estimated energy actually used to move me up the hill.
  • 20% energy efficiency: seems low, but what do I know?
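
To make the numbers easy to poke at, here's the climb arithmetic as a few lines of Python (all inputs are just the figures from my ride above):

```python
# Back-of-the-envelope check of the climb (inputs from the ride above).
g = 9.81                   # gravitational acceleration, m/s^2
mass_kg = 115              # me + wheel
height_m = 75              # vertical climb
capacity_wh = 350          # nominal battery capacity
pct_used = 0.70 - 0.35     # app reading went from 70% to 35%

potential_wh = mass_kg * g * height_m / 3600   # 1 Wh = 3600 J
battery_wh = pct_used * capacity_wh            # naive: % taken as an energy fraction

print(f"Potential energy gained: {potential_wh:.1f} Wh")   # ~23.5 Wh
print(f"Battery energy implied:  {battery_wh:.1f} Wh")     # ~122.5 Wh
print(f"Implied efficiency:      {potential_wh / battery_wh:.0%}")  # ~19%
```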

Then what happened next surprised me.

I rode my wheel back down the hill, expecting to get back some of the lost energy through regenerative braking. I had picked up a 50% efficiency figure for regenerative braking somewhere, so I expected a modest recharge. And indeed, at the bottom of the hill my battery was back up to 52%. So I got back 17% of my battery, which seemed to make sense -- I used 35% going up, and got back 17% -- about half -- going down.

Except that would imply that my battery accumulated 60 Wh of energy going down a hill that only has 24 Wh of potential energy to give. Either IPS has managed 250% efficiency in regenerative braking (someone call the Nobel committee!), or my math or physics is wrong. To recap:

  • 24 Wh: maximum possible energy gained by rolling downhill
  • 17%: battery capacity change (a gain this time)
  • 60 Wh: 17% of 350 Wh, the estimated actual energy recovered by going downhill.
  • 250% energy efficiency: huh?

Anyone have any insight? Are the battery ratings optimistic? Is the charge meter in the app wrong or non-linear or something else? Perhaps the actual nature of battery chemistry means my battery underreported charge after a big power draw?

If I assume that my wheel really does recover 50% of the downhill energy, so that 24 Wh / 2 = 12 Wh represents 17% of my battery, then its true (usable?) energy capacity would be only about 70 Wh, a far cry from 350 Wh. (If this were common among EUCs, someone would have noticed by now, right?)
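
Here's the downhill side of the ledger in the same style, including the "true capacity" inference (again, all inputs are my ride numbers plus the assumed 50% regen figure):

```python
# Downhill side of the ledger (same inputs as the climb check above).
capacity_wh = 350           # nominal battery capacity
pct_gained = 0.52 - 0.35    # app reading went from 35% back up to 52%
potential_wh = 23.5         # mgh for 115 kg over 75 m, from the climb check

battery_gain_wh = pct_gained * capacity_wh       # ~59.5 Wh... more than mgh!
print(f"Apparent regen efficiency: {battery_gain_wh / potential_wh:.0%}")  # ~253%

# Flip it around: if regen really recovers 50% of mgh, what capacity
# would make those ~12 Wh show up as a 17% gain on the meter?
recovered_wh = potential_wh * 0.5                # ~11.8 Wh
print(f"Implied usable capacity:   {recovered_wh / pct_gained:.0f} Wh")    # ~69 Wh
```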

Thoughts? Explanations? Ideas? Errors?



4 hours ago, jd said:

I had an interesting experience yesterday and I'm hoping someone can help me make sense of it. My IPS 121+ went out with me for my first serious uphill ride and I collected some numbers I can't quite explain. [...]

The battery state display is actually only a voltage meter. Batteries tend to drop their voltage during discharge (the higher the current, the larger the voltage drop). Similarly, during regenerative braking the voltage goes up higher than it would be with no charging (i.e. "at rest"), and it settles back only after a longer while (hours? I've had a discharged battery come up a couple of volts overnight). So I wouldn't draw too many conclusions from the displayed percentage alone.
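
A toy model of the effect (the 16-cell pack layout, resting voltage, and 0.25 Ω pack resistance below are illustrative guesses, not IPS's actual values):

```python
# Toy model of why a voltage-based "fuel gauge" jumps around under load.
# Pack parameters are illustrative guesses, NOT values from IPS firmware.

def naive_percent(v_pack, v_empty=52.8, v_full=67.2):
    """Map pack voltage linearly to 0-100% (16S li-ion, 3.3-4.2 V per cell)."""
    return max(0.0, min(100.0, 100 * (v_pack - v_empty) / (v_full - v_empty)))

v_rest = 61.0    # resting pack voltage, V (the charge the cells "really" hold)
r_pack = 0.25    # assumed total pack resistance, ohms

for amps, label in [(0, "at rest"), (15, "climbing (discharge)"), (-8, "regen braking")]:
    v_seen = v_rest - amps * r_pack    # voltage sags under load, rises under charge
    print(f"{label:22s}: meter sees {v_seen:.2f} V -> {naive_percent(v_seen):.0f}%")
```

The same true state of charge reads roughly 57% at rest, 31% while climbing, and 71% under regen, which is about the size of the swing jd saw.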


5 hours ago, esaj said:

The battery state display is actually only a voltage meter. Batteries tend to drop their voltage during discharge (the higher the current, the larger the voltage drop). Similarly, during regenerative braking the voltage goes up higher than it would be with no charging (i.e. "at rest"), and it settles back only after a longer while (hours? I've had a discharged battery come up a couple of volts overnight). So I wouldn't draw too many conclusions from the displayed percentage alone.

It occurs to me that maybe the maker could make the battery meter more precise by simply compensating for the internal resistance voltage drop. It could be done with an op-amp circuit (hardware) or by subtracting the 'current times the internal resistance' from the voltage (software/firmware).
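
In software it could be as simple as the sketch below (r_internal would have to be measured or estimated per pack; the numbers continue the illustrative ones from the previous post):

```python
# Software-side IR compensation: estimate the "at rest" voltage by adding
# back the I*R drop. r_internal is an assumed/estimated value, not a real spec.

def compensated_voltage(v_measured, amps, r_internal=0.25):
    """amps > 0 means discharging (voltage sags); amps < 0 means charging/regen."""
    return v_measured + amps * r_internal

# Climbing at 15 A the pack reads 57.25 V, but compensation recovers
# the ~61 V resting voltage:
print(compensated_voltage(57.25, 15.0))   # 61.0
```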


8 hours ago, zlymex said:

It occurs to me that maybe the maker could make the battery meter more precise by simply compensating for the internal resistance voltage drop. It could be done with an op-amp circuit (hardware) or by subtracting the 'current times the internal resistance' from the voltage (software/firmware).

Wouldn't it also need to depend on temperature? The "elbow" at the end of this curve is a lot sharper at warm temperatures, and the meter would be too optimistic there. Or does that get reflected in the internal resistance?

[Image: Li-ion discharge voltage curves at various temperatures]


7 minutes ago, dmethvin said:

Wouldn't it also need to depend on temperature? The "elbow" at the end of this curve is a lot sharper at warm temperatures, and the meter would be too optimistic there. Or does that get reflected in the internal resistance?

[Image: Li-ion discharge voltage curves at various temperatures]

I believe that the temperature-related drop is indeed caused by the internal resistance changing with temperature. Maybe the solution would be to do what something like the Charge Doctor does: actually measure the watt-hours used, rather than trying to compensate for the voltage drop. It would probably need a pretty high sampling rate though, as the power usage varies wildly during riding. Even then you get fewer mAh out of the cells at higher discharge rates, but it would (probably) still be much more accurate than voltage-based measuring.

[Image: cell discharge capacity at different C-rates]

Graph of the total capacity obtained from a cell before the voltage drops to the cut-out (3 V in this case); fewer mAh in total come out at higher discharge rates (about 10% less at 2.0C vs. 0.2C).
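
A sketch of what that energy counting could look like in firmware (the sample data are invented; the point is integrating V*I over time instead of reading voltage alone):

```python
# Watt-hour counting: integrate V*I over time rather than inferring the
# state of charge from voltage. Sample data below are invented.

def net_wh(samples, dt_s):
    """samples: (volts, amps) pairs taken every dt_s seconds.
    Positive amps = discharge; regen (negative amps) counts back in."""
    joules = sum(v * i * dt_s for v, i in samples)
    return joules / 3600.0

samples = [(58.1, 12.0), (57.6, 18.5), (59.0, 4.2), (62.3, -6.0)]  # fake readings
print(f"Net energy used: {net_wh(samples, dt_s=0.1):.4f} Wh")
```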


6 minutes ago, dmethvin said:

Wouldn't it also need to depend on temperature? The "elbow" at the end of this curve is a lot sharper at warm temperatures, and the meter would be too optimistic there.

[Image: Li-ion discharge voltage curves at various temperatures]

That's true, the capacity of a battery pack depends very much on temperature. On accounting for the temperature effect, some thoughts:

  • The relationship between capacity and temperature is not linear, varies between cell makers, and is difficult to model.
  • The battery meter usually reads in %, not in Wh.
  • When the temperature drops, the capacity in Wh drops quickly.
  • However, if the capacity is represented in %, it may be meaningless. For instance, take a fully charged 260 Wh EUC (at 25°C) outside into -20°C for 2 hours: what should the meter read? Could it still say 100% even though the capacity may have dropped to 50 Wh?
  • This curve represents only the discharging case. What about open circuit (no discharge current)? People often read the battery meter when they stop riding. I have measured the open-circuit voltage of a fully charged cell at -10°C, and it doesn't change much.
  • Maybe modify the meter to something like the mock-up below, where the length of the bar represents Wh?

[Image: mock-up battery meter where the bar length represents Wh]
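
A rough rendering of that idea (the derating numbers just reuse the 260 Wh / 50 Wh example above; real temperature curves are nonlinear and vendor-specific):

```python
# Mock-up of a Wh-based bar meter. The derating table reuses the made-up
# 260 Wh at 25C / 50 Wh at -20C example; real curves are more complex.

DERATE = {25: 1.00, -20: 50 / 260}    # fraction of nominal Wh available

def wh_bar(fraction_charged, temp_c, nominal_wh=260, width=20):
    available = nominal_wh * DERATE[temp_c] * fraction_charged
    filled = round(width * available / nominal_wh)
    return f"[{'#' * filled}{'.' * (width - filled)}] {available:.0f} Wh"

print(wh_bar(1.0, 25))    # [####################] 260 Wh
print(wh_bar(1.0, -20))   # [####................] 50 Wh -- "full", but few Wh left
```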


Using the data from some of the best cells available as an example: the Samsung 30Q, with an IR of just 18 mΩ. The Joule heating generated by even this remarkably small IR still causes the cell temperature to rise toward 100°C at a sustained discharge of 20 A (quick calculation below).

With low-IR cells like these there is not much loss of capacity between 0°C and the optimal operating temperature. The problem when temperatures get too high is that the electrolyte becomes volatile above ~60°C, which causes unwanted reactions that degrade the cell's performance and increase the IR over time.
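
The dissipation behind that temperature rise is plain I²R per cell:

```python
# Joule heating in a single cell: P = I^2 * R (figures from the post above).
amps = 20.0          # sustained discharge current, A
r_internal = 0.018   # Samsung 30Q internal resistance, ohms

print(f"Heat per cell: {amps**2 * r_internal:.1f} W")   # 7.2 W, continuously,
# inside a small sealed cell with little airflow -- hence the climb toward 100 C
```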

[Image: Samsung 30Q cell data, part 1]

[Image: Samsung 30Q cell data, part 2]


Archived

This topic is now archived and is closed to further replies.
