Am I right with these calculations:
200 watts per second = 200 j (1 second)
If you cram 200J into the 1 second duration:
200j = 340VDC x 0.58A x 1 second
Worst case scenario, if you cram the 200J into the max power flash duration of 1/300 second:
200j = 340VDC x 177A x 1/300 second
177A sounds like a lot of current, but consider that the duration is only ~3.3ms (1/300 second). Also consider that's the highest amperage this flash should ever see.
More likely situations are in the area of 1/30th of that current: roughly 6amps?
In either case, it's a surge of power through a circuit with little resistance from a bank of charged capacitors.
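A quick sanity check of those figures in Python (assuming the voltage and current are constant over the pulse, which a real capacitor discharge is not):

```python
# Sanity check: E = V * I * t, so the average current is I = E / (V * t).
def avg_current(energy_j, volts, seconds):
    """Average current (A) needed to deliver energy_j at volts over seconds."""
    return energy_j / (volts * seconds)

print(avg_current(200, 340, 1.0))      # ~0.59 A spread over a full second
print(avg_current(200, 340, 1 / 300))  # ~176 A over the 1/300 s flash
```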
Your MATH is correct, but your equations are NOT representing what you want. First, the Watt-second (Ws) is a measure of power TIMES time — i.e. ENERGY — and is equivalent to the Joule, similar to the kWh used on your power/hydro bill. It is NOT power per unit of time. @ Any Rate: 1 Watt = 1J/1sec, so 1Ws = 1J/s * 1s = 1J, and 200Ws = 200J.
The energy stored in a capacitor is Ej = (1/2)*C*V^2
Where:
Ej is Energy in Joules
C = Capacitance in Farads
V = Potential Difference in Volts
So IN THEORY:
C = 2 * 200J / 340^2 ==> C = 0.00346F or 3460uF
BUT this assumes the capacitor Starts @ 340V and Finishes @ 0V. IF the tube stops conducting @ say 100V (I have NO IDEA if this occurs, it is just a supposition that there is some minimum Voltage required.) then we have to do some more figuring to get our total CONSUMED energy to 200J. Rounding the bank up to 4000uF:
(1/2) * ((100V)^2) * 4000uF = 20J left in the bank @ cutoff
So we would actually need 220J @ the START of the discharge cycle.
220J = (1/2) * 0.004 * V^2 ==> V^2 = 110,000 ==> V ≈ 332V
Now, if the Flash Duration is 3.3ms (1/300s) then we have 200J/3.3ms (or more simply 200J * 300) --> 60,000W (60kW), with an average current during discharge of 60kW/((332V + 100V)/2) ==> ~278A.
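That discharge arithmetic can be sketched in Python, using the standard capacitor energy formula E = (1/2)·C·V² (the 100V cutoff and 4000uF bank are the same suppositions as above, not measured values):

```python
# Bank sizing with E = 0.5 * C * V**2. The 100 V tube-cutoff voltage
# is a supposition, not a measured value.
E_FLASH = 200.0        # J delivered to the tube
V_STOP = 100.0         # assumed voltage where the tube stops conducting
C = 4000e-6            # F; rounded up from the ~3460 uF theoretical minimum

e_residual = 0.5 * C * V_STOP**2            # J stranded in the bank (20 J)
e_start = E_FLASH + e_residual              # 220 J needed at the start
v_start = (2 * e_start / C) ** 0.5          # ~332 V starting voltage

p_avg = E_FLASH / (1 / 300)                 # 60 kW over the 1/300 s flash
i_avg = p_avg / ((v_start + V_STOP) / 2)    # ~278 A average
print(f"{v_start:.0f} V start, {i_avg:.0f} A average")
```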
Now, ASSUMING you used the IGBT method of STOPPING the flash after a fixed interval of 1ms, but you still wanted FULL Power:
200J / 1ms = 200J * 1000 = 200kW (BUT remember Watts are a measure of Joules/Second, and here we only supply power for 1ms)
@ 1/9500s we are @ roughly 100uS (~1/10,000s). ASS_U_MEing you still want the FULL POWER delivered (and I KNOW this is NOT the case) ==>
200J * 10,000 = 2MW! (Again, the AVERAGE POWER is still only 200W; 2MW is the PEAK POWER.) And if the Voltage is the same as above, 2MW/((332 + 100)/2) ==> ~9,260A!
As per charging, 200J refreshed once per second = 200W. NO getting around that. Using a line driven Step-Up Transformer would require a 200VA transformer @ the MINIMUM, and should likely be > 250VA — a fairly MASSIVE transformer @ mains frequencies of 50-60Hz! As mentioned previously, a voltage doubler taking 120Vac to 240Vac would give you peak voltages of 240Vac * 2^(1/2) = 339V. BUT BE WARNED: a doubler leaves the circuit DIRECTLY CONNECTED to the MAINS with NO isolation — coupling MAINS into ANY CIRCUIT this way is DANGEROUS! MAKE SURE YOU ARE VERY CAREFUL! GALVANIC ISOLATION IS ALWAYS SAFER!
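The doubler's no-load output works out as follows (idealized: diode drops and sag under load are ignored):

```python
import math

# Ideal, no-load DC output of a voltage doubler fed from 120 Vac mains.
# A real doubler sags under load and loses a couple of diode drops.
v_peak = 2 * 120 * math.sqrt(2)   # ~339 V, same as 240 Vac * sqrt(2)
print(f"{v_peak:.0f} V")
```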
As a SIDE NOTE: Not all capacitors of the same voltage/capacitance value are EQUAL. You will need extremely LOW ESR with VERY HIGH current ratings. I certainly don't have any experience in the selection of capacitors like these; obviously there is a solution, I just don't have a clue about the "how".
As a quick example: the power dissipated in a purely resistive wire or PCB trace ==>
Pd = I^2*R
Where:
I is the current in Amps
R = Resistance in Ohms
20ga copper wire has a resistance of ~10.15mOhms/ft. So a 3in 20ga wire leading from your capacitor bank to your Flash Tube would have a resistance of:
0.01015Ohms/ft * 3in * 1ft/12in = 2.5375mOhms.
And the Power Lost in the wire would be:
@ 278A ==> 278^2 * 0.0025375 = ~196W
@ 9,260A ==> 9260^2 * 0.0025375 = ~218kW!
Inductance in the same piece of STRAIGHT wire would be roughly 79nH, and could certainly influence circuit behavior.
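The I^2*R figures above as a quick script (the two currents are the average and peak discharge estimates from earlier in this post):

```python
# I^2 * R loss in the 3 in, 20 AWG lead described above (~10.15 mOhm/ft).
R_PER_FT = 0.01015                 # ohms per foot, 20 AWG copper
R = R_PER_FT * 3 / 12              # 3 inches -> ~2.54 mOhm

for amps in (278, 9260):           # average and worst-case peak currents
    print(f"{amps} A -> {amps**2 * R:,.0f} W dissipated")
```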
As I said @ the end of my last post, I cannot imagine this project being fiscally viable; HOWEVER, I Certainly understand wanting to play with it! JUST BE CAREFUL! You are playing with very dangerous voltages & currents. A small mistake could result in a life threatening outcome.
I Guess now I AM on the same page as everyone else. I had NO IDEA a flash consumed so much power. I am still having a hard time wrapping my head around it. Powering a flash from a battery pack seems highly improbable if this Flash is representative of MOST flashes. A 1AH battery pack @ 12V = 12Whr ==> 12Whr * 3600s/hr = 43,200J. Assuming 20% efficiency in conversion, this would imply 8,640 delivered Joules, or roughly 43 "Flashes" per charge. "AA" Batteries have a 440mAH to 900mAH rating, so 4 * 1.5V = 6V, & 6V * 600mAH = 3.6Whr ==> 3.6Whr * 3600s/hr = 12,960J. Assuming a "cheap" camera flash uses roughly 1/4 the ENERGY of the flash in question (50J), and assuming the same 20% efficiency in power conversion: (12,960 * 0.2)/50 ~ 52 Flashes per battery set, which sounds optimistically high, but is at least the right order of magnitude.
Thanks! I have enjoyed playing with this
Fish