Hi Matt, power is a challenge. I always think of those watches running off a tiny battery. How do they do it?
My take is this: a) what is the minimum energy it's going to collect across the really low-solar months, e.g. winter; and b) what is the longest I want the system to run with no solar power at all – usually through a storm of some sort.
Then the stored power can be budgeted into two sections: 1) enough for the system to keep working/sampling through the period with no solar energy input, and 2) the "excess" energy that can be used to transmit the readings to the cloud. An assumption here, of course, is that the readings are stored (on uSD) and can be reliably delivered to the cloud whenever energy is available (https://github.com/EnviroDIY/ModularSensors/issues/194).
The current Mayfly charging circuit assumes a relatively low impedance on the solar side, and for all practical purposes it's easy to over-specify the solar panel at 5-10W (for the Mayfly), so it is likely to deliver the 0.5A at 4.5V whenever there is solar power available. A 4Ah battery would then charge in about 8 hours – a summer's day – but in winter?
On the consumption side, I've put a spreadsheet together in mA-seconds or mA-minutes to estimate usage. Using mA-minutes, a 4Ah battery holds 4,000mA * 60 minutes = 240,000 mA-minutes.
For a 24-hour day, say sampling is every 15 minutes and takes 0.5 min at a running current of about 38mA, with the other 14.5 min at a sleep current of 2mA. Then every 15 min it consumes 38*0.5 + 2*14.5 = 19 + 29 = 48 mA-minutes, and in a day – 96 sampling periods – about 4,608 mA-minutes. So a 4Ah battery would likely last around 52 days without any solar power. However, with the battery power "budgeted" – the 4Ah split into 2Ah for communications and 2Ah guaranteed for core readings – the core budget gives about 26 days.
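To make it easy to re-run this budget with different assumptions, here's a minimal plain-C++ sketch of the same arithmetic. The currents and times are just the example numbers above, so substitute your own measurements.

```cpp
// Battery budget estimate (compile with: g++ budget.cpp && ./a.out).
// Numbers are the example values from the post; replace with measured values.
#include <cstdio>

int main() {
    const float batteryCapacity_mAmin = 4000.0f * 60.0f;  // 4 Ah in mA-minutes
    const float sampleInterval_min    = 15.0f;            // one reading every 15 minutes
    const float active_min            = 0.5f;             // time awake per sample
    const float active_mA             = 38.0f;            // running current
    const float sleep_mA              = 2.0f;             // sleep current

    // Consumption per 15-minute cycle and per day
    const float perCycle_mAmin = active_mA * active_min
                               + sleep_mA * (sampleInterval_min - active_min);
    const float cyclesPerDay   = 24.0f * 60.0f / sampleInterval_min;
    const float perDay_mAmin   = perCycle_mAmin * cyclesPerDay;

    printf("Per cycle: %.0f mA-min, per day: %.0f mA-min\n", perCycle_mAmin, perDay_mAmin);
    printf("Whole 4 Ah battery, no solar: %.0f days\n", batteryCapacity_mAmin / perDay_mAmin);
    printf("Half the battery reserved for core logging: %.0f days\n",
           (batteryCapacity_mAmin / 2.0f) / perDay_mAmin);
    return 0;
}
```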
I've created a Battery Management System (BMS) to partly implement this in a "simple" way – yet to be entirely proven. The basic premise is that at critical points in the program a call to the BMS asks: is there enough power for this operation? One of those points, after wake-up, is checking whether there is enough power to transmit readings. If the answer is no, it just does logging; if yes, it does logging and transmission. (It uses the current class entry points.)
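As a rough illustration of that wake-up decision – these are not the actual ModularSensors entry points or my real threshold values, just the shape of the logic:

```cpp
// Hypothetical sketch of the BMS gating idea at wake-up.
#include <cstdio>

enum PowerStatus { POWER_LOG_AND_TRANSMIT, POWER_LOG_ONLY, POWER_SLEEP };

// Illustrative thresholds; tune per battery size and board revision.
const float THRESHOLD_TX_V  = 3.8f;  // enough margin to run the modem
const float THRESHOLD_LOG_V = 3.6f;  // enough to take and store a reading

PowerStatus bmsCheck(float batteryVoltage_V) {
    if (batteryVoltage_V >= THRESHOLD_TX_V)  return POWER_LOG_AND_TRANSMIT;
    if (batteryVoltage_V >= THRESHOLD_LOG_V) return POWER_LOG_ONLY;
    return POWER_SLEEP;  // too low: skip this cycle and wait for charge
}

// Stand-ins for the real logger actions (logging to uSD, modem transmit).
void takeReadingsAndLog() { printf("log reading to uSD\n"); }
void transmitReadings()   { printf("transmit queued readings\n"); }

void onWake(float batteryVoltage_V) {
    switch (bmsCheck(batteryVoltage_V)) {
        case POWER_LOG_AND_TRANSMIT: takeReadingsAndLog(); transmitReadings(); break;
        case POWER_LOG_ONLY:         takeReadingsAndLog();                     break;
        case POWER_SLEEP:            /* go straight back to sleep */           break;
    }
}

int main() { onWake(3.9f); onWake(3.7f); onWake(3.4f); return 0; }
```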
So far I haven't considered the power cost of transmitting the readings to the internet – it's expensive in power, but it would only happen when there is enough power. So a separate power-budget exercise needs to be done for the cost of transmitting, but it can be gated so it only happens when there is sufficient power (in winter that might be after the sun has risen enough to be charging the battery).
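The same spreadsheet approach works for the transmit side. The modem current and on-time below are only placeholder assumptions – cellular current draw varies a lot with module and signal conditions – so treat this purely as the shape of the calculation:

```cpp
// Rough transmit-cost estimate. The modem current and on-time are placeholder
// assumptions; replace them with values measured on your own setup.
#include <cstdio>

int main() {
    const float modem_mA          = 200.0f;           // assumed average current while the modem is up
    const float txTime_min        = 2.0f;             // assumed time to connect and post one batch
    const float postsPerDay       = 4.0f;             // e.g. batch-upload every 6 hours
    const float commsBudget_mAmin = 2000.0f * 60.0f;  // the 2 Ah "communications" half

    const float perDay_mAmin = modem_mA * txTime_min * postsPerDay;
    printf("Transmit cost per day: %.0f mA-min\n", perDay_mAmin);
    printf("Days of comms budget with no solar: %.0f\n", commsBudget_mAmin / perDay_mAmin);
    return 0;
}
```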
In practice the LiIon voltage is partly dependent on ambient temperature (cold in winter), the Mayfly mega1284's Vref is tied to Vcc at 3.3V, and the reading is affected by charging/solar input, so Vbat does not reflect the LiIon battery at all times. With no solar, the Vbat ADC results are accurate from 4.2V down to about 3.6V, below which they become non-linear due to the power circuit's LDO dropout. With good solar, the Vbat ADC reflects how well the battery is being charged and is likely above 4.2V – mostly not a problem. So hopefully the Vbat absolute voltage could have a number, 3.8V (or maybe 3.9V), that is crudely interpreted as a 50% (or 75%) "fuel gauge" for a reasonably large LiIon battery.

The problem may be weak solar: when the Vbat readings are taken, they may show a solar charging voltage that doesn't reflect the power actually stored in the battery. The modem also draws power at a different time than when the Vbat ADC reads the LiIon voltage, so there could be misleading Vbat readings just when they matter most, when the LiIon is run down. That is, with a weak solar charging current, Vbat is the solar charge voltage, not the LiIon battery voltage. I've been investigating a separate battery monitor on the external ADC, and potentially the XBee LTE also has a voltage converter that can measure its Vcc (which should be ~3.3V).
https://batteryuniversity.com/learn/article/discharge_characteristics_li
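For reference, a crude Vbat check on the Mayfly could look something like the sketch below – assuming battery sense on A6 through the on-board divider (the multiplier differs between board revisions, so check your schematic), and keeping in mind the caveat above that with solar connected the reading may be the charge voltage rather than the true LiIon voltage:

```cpp
// Arduino-style sketch: crude Vbat "fuel gauge" check on the Mayfly.
// Assumptions: battery sense on A6, 10-bit ADC, Vref = Vcc = 3.3 V; the divider
// multiplier (4.7 here, 1.47 on older board revisions) must match your board.
#include <Arduino.h>

float readBatteryVoltage() {
    int raw = analogRead(A6);
    return (3.3f / 1023.0f) * 4.7f * raw;   // multiplier is board-revision dependent
}

bool batteryAboveHalf() {
    // ~3.8 V treated as a rough 50% marker for a healthy LiIon at moderate
    // temperature; only trust it between ~3.6 V and 4.2 V, with little or no solar.
    return readBatteryVoltage() >= 3.8f;
}

void setup() {
    Serial.begin(57600);
}

void loop() {
    Serial.print("Vbat: ");
    Serial.print(readBatteryVoltage());
    Serial.println(batteryAboveHalf() ? "  (above ~50%)" : "  (below ~50%)");
    delay(5000);
}
```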
To be able to do shorter test runs, I've been using a 500mAh battery for testing the BMS transition thresholds, and I've built in an option for different battery sizes (2Ah to 4Ah) with different thresholds.
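That option is essentially just a table of thresholds per battery size. A hypothetical version (the voltages are illustrative placeholders, not my tested values) might look like:

```cpp
// Hypothetical per-battery-size threshold table for the BMS.
#include <cstdio>

enum BatteryType { BATTERY_0_5AH, BATTERY_2AH, BATTERY_4AH };

struct BmsThresholds {
    float txOk_V;   // above this: log and transmit
    float logOk_V;  // above this: log only
};

// Smaller packs sag more under the modem load, so they get more conservative
// transmit thresholds (illustrative numbers only).
BmsThresholds thresholdsFor(BatteryType type) {
    switch (type) {
        case BATTERY_0_5AH: return {3.90f, 3.70f};
        case BATTERY_2AH:   return {3.80f, 3.60f};
        case BATTERY_4AH:   return {3.75f, 3.60f};
    }
    return {3.80f, 3.60f};  // fallback
}

int main() {
    BmsThresholds t = thresholdsFor(BATTERY_0_5AH);
    printf("0.5 Ah test pack: transmit above %.2f V, log above %.2f V\n", t.txOk_V, t.logOk_V);
    return 0;
}
```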
Well, hope it's not too much detail, but I'd be interested to hear whether this is going in the direction you think might answer your questions. 🙂