Trevor Wilson said:
**Yep. I've heard the same bullshit.
How much power
**About the same amount in one year as a two bar heater will use in a few
hours. Or as much in one week as a few halogen downlights will use in one
hour.
Are there any plans by major manufacturers to
**Yep. There are no plans to get consumers to switch to solar hot water,
however, which would REALLY reduce power consumption. Or to get people to
wear an extra sweater during the winter, instead of running that central
heating. Or to choose more efficient appliances (better air conditioners,
'fridges, etc.) instead of buggering around with inconsequential things.
Education and common sense will reduce electricity consumption far more
efficiently than turning off a DVD player.
Coming from an industrial background in New Zealand, I see this problem all
the time.
I often wonder how people measure the power used by devices in standby.
Most of the portable meters available are not sensitive enough to measure
below 1 W, nor can they measure true power when the current is out of phase
with the voltage.
Let me introduce you people to a little problem known as bad power factor.
In industry we can actually switch in water heating and capacitors to reduce
the total load on the system.
An even bigger factor to be considered is the bad power factor of our power
systems, caused by the switch-mode power supplies in modern electronics and
by heavy industrial motors. A bad power factor is where an appliance draws AC
current that is out of phase with the voltage. This causes high apparent
currents. Relating this back to your TV: the cheaper meters are not true-RMS
power meters, they simply multiply RMS volts by RMS amps, which gives
apparent power (VA), not true power (W). Such a meter might show the TV as
10 W when its true power draw could be only about 0.2 - 0.5 W. I agree that
solar hot water would reduce the total load on our system, but we also need
some large resistive loads to offset our terrible power factor.
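To make the apparent-vs-true-power distinction concrete, here is a small
sketch in Python. The supply voltage, standby current, and 85-degree phase
shift are all assumed illustrative values, not measurements; the point is
that averaging instantaneous v * i gives true power, while multiplying RMS
volts by RMS amps (what a cheap meter does) gives apparent power.

```python
import math

# Assumed illustrative values: a standby load drawing about 43 mA RMS
# at 230 V RMS, with the current 85 degrees out of phase with the voltage.
V_RMS = 230.0             # supply voltage, volts RMS (assumed)
I_RMS = 0.0435            # standby current, amps RMS (assumed)
PHASE = math.radians(85)  # current lags voltage by 85 degrees (assumed)
SAMPLES = 10000

# True power: average of instantaneous voltage * current over one cycle.
true_power = 0.0
for n in range(SAMPLES):
    t = n / SAMPLES                               # fraction of one AC cycle
    v = math.sqrt(2) * V_RMS * math.sin(2 * math.pi * t)
    i = math.sqrt(2) * I_RMS * math.sin(2 * math.pi * t - PHASE)
    true_power += v * i / SAMPLES

# Apparent power: what a cheap Vrms * Irms meter reports, in volt-amps.
apparent_power = V_RMS * I_RMS

print(f"apparent power: {apparent_power:.1f} VA")  # about 10 VA
print(f"true power:     {true_power:.2f} W")       # under 1 W
print(f"power factor:   {true_power / apparent_power:.3f}")
```

So a meter reading "10 W" on a badly out-of-phase standby load can be an
order of magnitude above the real consumption, which is exactly the trap
with the cheaper plug-in meters.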
As far as the supply authorities are concerned, this power factor is a way
for them to get more money out of industry. It has only recently been
enforced that new industrial installations include power factor correction
units to reduce the load. But the real problem is that as we get more
sophisticated electronics using switch-mode supplies, and as air
conditioning units become popular in our homes, the combined effect of these
loads gives our power supply a bad power factor.
The effects of a bad power factor: first of all, the tops and bottoms of the
AC supply sine wave are flattened, in effect a DC plateau on the supply.
This flat-topping makes motors and transformers less efficient, by as much
as 60% of their maximum capacity; the main symptom is excessive heating in
their windings.
Take, for example, an industrial welder: on its own it draws a maximum
current of 50 A, but if we put a capacitor across its supply for power
factor correction we can reduce its current to 30 A. The welder does the
same work either way, but that 40% reduction in line current means far
lower losses in the supply wiring and far less wasted line capacity.
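Here is a rough sketch of how that correction capacitor could be sized.
The 230 V, 50 Hz supply and the assumption that correction brings the
welder all the way to unity power factor are mine for illustration; only
the 50 A and 30 A figures come from the example above.

```python
import math

# Assumptions: 230 V, 50 Hz single-phase supply; 50 A uncorrected;
# 30 A after correction, i.e. corrected to unity power factor.
V = 230.0        # supply volts RMS (assumed)
F = 50.0         # mains frequency, Hz (assumed)
I_BEFORE = 50.0  # apparent current before correction, amps
I_AFTER = 30.0   # line current after correction, amps

s_before = V * I_BEFORE   # apparent power before correction, VA
p_real = V * I_AFTER      # real power, W (all of it, at unity pf)

# Reactive power the capacitor must cancel: Q = sqrt(S^2 - P^2)
q_reactive = math.sqrt(s_before**2 - p_real**2)

# A capacitor across a sinusoidal supply provides Q = 2*pi*f*C*V^2 VAR,
# so solve for C.
c_farads = q_reactive / (2 * math.pi * F * V**2)

print(f"reactive power to cancel: {q_reactive:.0f} VAR")    # 9200 VAR
print(f"capacitor required:       {c_farads * 1e6:.0f} uF")  # about 554 uF
```

Several hundred microfarads of mains-rated capacitance is a serious, pricey
component, which ties in with the cost point below.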
So why don't companies put this feature in? Because it is not a well-known
effect and the average person doesn't understand it. Another thing:
mains-rated AC capacitors are expensive. So if the consumer doesn't know
their appliance has a negative effect on the supply, why bother?
In my opinion, as the power factor of our supply load eats into the line
capacity, we need to install power factor correction on our supply.
In a nutshell: motors and similar inductive appliances draw current that
lags the voltage, peaking after the voltage maximum. Power factor
correction capacitors draw current that leads the voltage, peaking before
the maximum. So if we can measure how far apart the voltage and current
are, we can correct the lag by placing capacitors in the circuit.
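That "measure how far apart they are" step can be sketched in a few lines.
The 1.5 ms delay between the voltage and current zero crossings is an
assumed measurement, not a real one; the conversion from time delay to
phase angle and power factor is the general technique.

```python
import math

# Assumed: 50 Hz mains, and the current crosses zero 1.5 ms after the
# voltage does (a lagging, inductive load).
F = 50.0          # mains frequency, Hz (assumed)
DELAY_S = 0.0015  # measured lag between zero crossings, seconds (assumed)

# One full cycle at 50 Hz is 20 ms = 360 degrees, so:
phase = 2 * math.pi * F * DELAY_S  # phase angle, radians
power_factor = math.cos(phase)

print(f"phase lag:    {math.degrees(phase):.0f} degrees")  # 27 degrees
print(f"power factor: {power_factor:.2f}")                 # 0.89
```

From that angle you know how much reactive power needs cancelling, and
hence how much capacitance to switch in.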
I could keep going on about the effects, but I will leave it at this.