Maker Pro

NiMH solar battery charger

Dany

Joined: Apr 4, 2014 · Messages: 3
Hello everyone, I am currently working on a solar charger project (to charge a NiMH battery with a nominal voltage of 1.2 V and a capacity of 1300 mAh) and I have come across some problems that I don't fully understand. My solar panel provides 2.14 V and 1.4 A. However, when I connect the battery, the solar panel voltage drops to 1.5 V (which is actually better for the battery under charge, since the maximum charge voltage is 1.5 V as stated in the datasheet). Can someone explain to me why this voltage drop happens? Also, do I need a diode to protect the solar panel against reverse current from the battery, or do the MOSFET transistors provide this protection? The circuit is attached below. Thanks.

Attachments

  • Solar Battery Charger.jpg (circuit schematic, 68 KB)

davenn

Moderator
Joined: Sep 5, 2009 · Messages: 14,254
Hi Dany
welcome to the forums :)

The first and most likely reason is that the stated voltage of a solar panel is usually the OC (open-circuit) voltage.
The voltage under load will be somewhat less.

Just as an example, a panel you want 12 V from will typically have a stated (OC) voltage of around 17 - 20 V.

Also, the stated current is usually the short-circuit current, and again the current under a normal sort of load will be somewhat less.
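Just to put rough numbers on that, here is a quick Python sketch of a simple single-diode cell model. Every parameter is a made-up assumption for illustration, not a value from any datasheet:

Code:
import numpy as np

# Very rough single-diode model of one small solar cell.
# All numbers below are illustrative assumptions, not datasheet values.
I_PH = 0.44      # photo-generated current at full sun, A (roughly the short-circuit current)
I_0 = 1e-9       # diode saturation current, A
N = 1.3          # diode ideality factor
VT = 0.02585     # thermal voltage at about 25 degC, V

def cell_current(v):
    """Current delivered by the cell at terminal voltage v."""
    return I_PH - I_0 * (np.exp(v / (N * VT)) - 1.0)

v = np.linspace(0.0, 0.7, 1000)
i = cell_current(v)

v_oc = v[np.argmin(np.abs(i))]     # voltage where the current falls to ~0 (open circuit)
i_sc = cell_current(0.0)           # current with the terminals shorted

i_load = 0.35                      # example load current, A
v_loaded = v[np.argmin(np.abs(i - i_load))]

print(f"Open-circuit voltage : {v_oc:.2f} V")
print(f"Short-circuit current: {i_sc:.2f} A")
print(f"Voltage at {i_load} A load: {v_loaded:.2f} V")

With these invented numbers the open-circuit voltage comes out around 0.67 V, but once the load draws 0.35 A the cell only manages about 0.61 V. It is the same effect you are seeing, just scaled down to one cell.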

cheers
Dave
 

Dany

Joined: Apr 4, 2014 · Messages: 3
Thank you Dave. In that case, how can I determine the solar panel voltage when it is operating under load? In my design I have six solar cells, each providing 1.1 V (open circuit) and 440 mA (short circuit). I divided the cells into two sets, each with three cells connected in parallel, and then connected the two sets in series. This configuration gives me a panel output of 2.14 V and 1.32 A. For the load I am using a 1.3 Ah rechargeable battery with a 1.2 V nominal voltage. I don't think my control circuitry needs to be counted as part of the load too. Am I right?
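As a quick sanity check on that wiring, the ideal (lossless) rules are that parallel cells add current and series strings add voltage. A tiny Python sketch of that, assuming all six cells are identical:

Code:
# Ideal combination of identical solar cells (mismatch and wiring losses ignored).
V_OC_CELL = 1.1    # open-circuit voltage of one cell, V
I_SC_CELL = 0.44   # short-circuit current of one cell, A

CELLS_IN_PARALLEL = 3   # cells per parallel group
GROUPS_IN_SERIES = 2    # parallel groups stacked in series

# Parallel cells add current at the same voltage; series groups add voltage at the same current.
array_v_oc = V_OC_CELL * GROUPS_IN_SERIES
array_i_sc = I_SC_CELL * CELLS_IN_PARALLEL

print(f"Ideal array open-circuit voltage : {array_v_oc:.2f} V")   # 2.20 V
print(f"Ideal array short-circuit current: {array_i_sc:.2f} A")   # 1.32 A

Note that these combined figures are still open-circuit and short-circuit values, so the 2.14 V measured unloaded is believable, and the voltage under load will be lower again for the reasons Dave gave.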
 

Arouse1973

Adam
Joined: Dec 18, 2013 · Messages: 5,178
You are lucky you didn't try this with a power supply first. You don't have any current limit for the battery, so you could have damaged it. I suppose the only saving grace would have been the Rds(on) of the FETs, but having just looked at the datasheet, they are rated at 30 A with 40 mΩ Rds(on). They really are over-specified for what you are doing, and they are capable of pushing far too much current into the battery under the right conditions.
The best way to charge a NiMH is with a constant current of C/10 for 14-16 hours; this is the kindest thing you can do for your battery. Some batteries can be charged at greater than 1C, but it does them no good and reduces their life. You also very rarely get full capacity when fast charging unless you have a special charger which constantly monitors the battery and terminates the charge at the correct time.
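For a 1300 mAh cell that works out roughly as follows; the charge-efficiency figure is an assumption, and it is the reason C/10 needs more than ten hours:

Code:
# Rough C/10 charge figures for a 1300 mAh NiMH cell.
CAPACITY_MAH = 1300
CHARGE_EFFICIENCY = 0.7   # assumed; NiMH turns a fair chunk of the input charge into heat

c_over_10_ma = CAPACITY_MAH / 10                        # charge current, mA
hours = CAPACITY_MAH / (c_over_10_ma * CHARGE_EFFICIENCY)

print(f"C/10 charge current : {c_over_10_ma:.0f} mA")   # 130 mA
print(f"Approx. time to full: {hours:.0f} h")           # about 14 h, in the 14-16 h range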

The reason I suggest constant-current charging is that if you left the battery on charge in bright sunshine and forgot about it, no harm would be done. If you don't have constant-current charging and leave this in the sun for several hours, you could come back to a dead battery, or even worse, a battery that has caught fire.

Read this http://www.ehow.co.uk/list_6008872_dangers-overcharging-nimh-batteries_.html

Adam
 

Dany

Joined: Apr 4, 2014 · Messages: 3
Thank you Adam. I didn't put in a current-limiting resistor because the maximum current of the solar panel is 1.3 A, which is the same as the capacity of the battery, so charging it at 1C should not be unsafe. I have also designed two termination methods for when the battery is fully charged, so leaving the battery on charge should not pose any threat. One termination method is a temperature cut-off and the other is voltage peak detection. Once end of charge has been detected, charging is terminated by switching off one of the MOSFETs. Because of that there is no path from the battery to the solar panel, so current cannot flow back into it (am I right on this point?).
 

Arouse1973

Adam
Joined: Dec 18, 2013 · Messages: 5,178
OK, but make sure your battery can handle 1C. Also, delta-peak detection can be a bit tricky to get right; sometimes it will trigger too early and sometimes not at all. I would still have a current limit just in case. A good old run-of-the-mill resistor will do the job, and because it is passive it will just sit there doing its thing, as long as you get the rating correct, obviously.
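If you do add one, a first-pass sizing might look like this; the voltages are assumptions based on the figures quoted earlier in the thread:

Code:
# Rough sizing of a series current-limit resistor between panel and battery.
# The voltages are assumptions taken from figures quoted earlier in this thread.
V_PANEL = 2.0     # assumed panel voltage under load, V
V_BATT = 1.45     # typical NiMH terminal voltage while charging, V
I_LIMIT = 1.3     # chosen worst-case charge current (1C for a 1300 mAh cell), A

r = (V_PANEL - V_BATT) / I_LIMIT    # resistance that caps the current at I_LIMIT
p = I_LIMIT ** 2 * r                # worst-case dissipation in the resistor

print(f"Series resistance : {r:.2f} ohm")   # about 0.42 ohm
print(f"Power dissipation : {p:.2f} W")     # about 0.72 W, so pick a 1 W (or larger) part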
Adam
 

KrisBlueNZ

Sadly passed away in 2015
Joined: Nov 28, 2011 · Messages: 8,393
I'm afraid there are some problems with this approach because of the electrical characteristics of the solar panels.

An array of solar panels is a voltage source; the amount of voltage it generates depends mainly on two factors: the amount of sunlight, and the amount of current that's being drawn from it. I'm going to assume there will be a constant amount of sunlight falling on the solar array. In this case, the voltage decreases as you draw more current from it, as shown by the following graph.

[Graph: solar panel voltage versus load current]

(The graph is from http://www.mtmscientific.com/solarpanel.html.)

The voltage will be at maximum when no current is being drawn from the solar array; this is the unloaded state. As you start to draw more and more current from it, the voltage will drop fairly smoothly until the "knee" of the curve is reached, and the voltage starts to drop off quite quickly as the load current is increased. At the end of the curve, there's no voltage left; this is the short circuit current that davenn mentioned in post #2.

There is a point on the knee called the "maximum power point", which I will describe later. For the moment, just note how the voltage behaves depending on the amount of current drawn from the array.

When charging is active, your circuit simply connects the solar panel directly to the battery via the two MOSFETs. The MOSFETs do have a certain ON-resistance, Rds(on), but it's pretty small and can be pretty much ignored. Say the battery is charged to 1.2V, and the solar array's unloaded voltage is around 2V. Because of this voltage difference, current will flow. The solar panel will try to pull the battery's voltage up to 2V, but the battery will not allow this. Current will flow from the solar array to the battery. They will play a tug-of-war game, with the current being represented by the tension in the rope. The voltage will stabilise somewhere, probably around 1.4V or so, because the battery is more "stubborn" than the solar array - the internal resistance of a charged battery is lower than that of a solar panel, especially a small one.
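One way to put numbers on that tug-of-war is to model the battery as an EMF behind a small internal resistance and find where its line crosses the solar array's I-V curve. Every value in this Python sketch is a guess chosen only to illustrate the idea:

Code:
import numpy as np

# Illustrative operating point of a small solar array feeding a NiMH cell directly.
# Every parameter here is an assumption, not a measured or datasheet value.
I_PH = 1.32      # array short-circuit current at full sun, A
I_0 = 1e-9       # array saturation current, A
N_VT = 0.10      # combined ideality factor x thermal voltage for the whole stack, V

V_EMF = 1.30     # battery open-circuit (rested) voltage, V
R_INT = 0.08     # battery internal resistance plus wiring and MOSFET Rds(on), ohm

def panel_current(v):
    # Single-diode model of the array: current available at terminal voltage v.
    return I_PH - I_0 * (np.exp(v / N_VT) - 1.0)

def battery_current(v):
    # Current the battery accepts when its terminals are pulled up to v.
    return (v - V_EMF) / R_INT

v = np.linspace(1.0, 2.2, 2000)
k = np.argmin(np.abs(panel_current(v) - battery_current(v)))   # where the two curves meet

print(f"Operating voltage: {v[k]:.2f} V")                  # about 1.4 V
print(f"Charge current   : {panel_current(v[k]):.2f} A")   # essentially the full array current

With these guesses the pair settles at roughly 1.4 V with nearly the whole 1.32 A flowing into the cell, which is the behaviour described above: the battery largely wins the voltage argument, and the array supplies whatever current the sunlight allows.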

The amount of current that the solar array can supply depends on the voltage, according to that graph. It also depends very much on the amount of sunlight falling on the solar array, which is a separate factor that is not shown on the graph and which I've assumed to be constant so far. The result is that the amount of current that flows from the solar array to the battery will be almost completely dependent on the amount of sunlight falling on it. This will also affect the voltage to a lesser extent.

There is no explicit current limiting in this arrangement. The small ON-resistance of the MOSFETs is not a significant factor here. You would be relying on the solar array's own electrical characteristics to limit the current, and the current that the solar array can produce depends on the amount of sunlight falling on it.

To start with, this means that the rate of charge for the battery will vary constantly as the sunlight varies. The only way that I know of to work around this problem is to have a huge solar array that can deliver the required amount of voltage and current even at low sunlight, then waste most of its capacity most of the time. I'm going to assume that's not a workable option for you.

The next problem is detecting the charge termination condition. With the charging current varying over a fairly wide range, it becomes impossible to reliably detect charge termination. The "negative delta V" method, aka peak voltage, won't work, because the voltage will be rising and falling with the sunlight level; as soon as a cloud blocks some of the sunlight, the voltage will drop, and the falling voltage detector will terminate the charge. In any case, the negative delta V condition is only a reliable indication of full charge when an NiMH battery is being charged at a current of 0.5C or higher.
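A toy sketch of a naive peak-voltage detector shows the failure mode; the voltage samples are invented to mimic a cloud passing over the panel:

Code:
# Toy illustration of why naive negative-delta-V termination misfires on solar power.
DELTA_V_MV = 10   # assumed threshold: stop when the voltage falls 10 mV below its peak

def charge_terminated(samples_mv):
    peak = 0
    for v in samples_mv:
        peak = max(peak, v)
        if peak - v >= DELTA_V_MV:
            return True          # detector decides the cell has just passed full charge
    return False

# Battery nowhere near full, but a cloud reduces the charge current and the terminal voltage sags.
cloudy_spell_mv = [1400, 1405, 1410, 1412, 1395, 1390]
print(charge_terminated(cloudy_spell_mv))   # True -> false termination, charging stops early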

End-of-charge detection via rate of temperature rise will also be unreliable if the charging current is varying widely, because the battery will cool down during low current periods and could end up being overcharged because the target temperature gradient never occurs within the expected time period.

A few other comments on your design.

I suggest you show explicitly on the schematic that U4 is attached to the battery. This can be inferred, but I think it should be stated clearly.

You only need one MOSFET in series with the battery; you should combine the two Stop conditions into a single signal to control that MOSFET. Otherwise you waste a bit of energy in the extra MOSFET, and a bit of money too. But this is irrelevant because that approach isn't going to work anyway.

I don't know whether you need to prevent the battery from back-feeding in to the solar array. See whether the solar panel data sheet says anything about it. If you do, you'll need another MOSFET in series, but with the opposite orientation, so that its body diode will not be forward-biased in that situation.

Edit: Oops, I forgot to get back to the maximum power point. This is the point on the curve where the power available from the panel is highest. Power is voltage multiplied by current. Solar-powered equipment, especially large installations where extracting maximum power matters more than circuit complexity and cost, can detect and track the maximum power point in real time to ensure that the maximum energy is always extracted from the available sunlight.
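In code terms, finding the maximum power point of a measured or modelled I-V curve is just a matter of maximising voltage times current; this sketch reuses the made-up array model from the earlier example:

Code:
import numpy as np

# Locate the maximum power point on the illustrative array curve used above.
I_PH, I_0, N_VT = 1.32, 1e-9, 0.10   # same assumed array parameters as the earlier sketch

v = np.linspace(0.0, 2.1, 2000)
i = np.clip(I_PH - I_0 * (np.exp(v / N_VT) - 1.0), 0.0, None)   # array current at each voltage
p = v * i                                                        # power = voltage x current

k = np.argmax(p)
print(f"Maximum power point: {v[k]:.2f} V, {i[k]:.2f} A, {p[k]:.2f} W")

An MPPT controller repeatedly adjusts its input loading to stay near that point as the light changes.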
 