The specs say to use a 15-48v power supply and specify 19.8A... Power goes in on the top left of this image. I'm assuming the board has a 15v voltage regulator or buck converter of some kind, but I'm not 100% sure how it works.
Assuming that it's a regulator, how does power work in that kind of circuit? Do watts in equal watts out, or is the supply's current rating the maximum I can achieve through a voltage regulator? In other words, would a 24v 20A supply perform the same as a 48v 10A supply (both 480w), or would I get 20A out of the first one and only 10A out of the second?
Last Edit: Jul 17, 2018 7:39:03 GMT -5 by jimustanguitar
According to the spec sheet, the max current is 18.9 amps; it does not specify whether that is at min or max voltage, so it's safest to assume it could draw up to 18.9 amps regardless of the voltage. However, using the power formula (P = V x I), 800w at 48v works out to 16.6 amps, so that rating is probably at max voltage. If a 48v/10a supply has over-current protection built in, it will most likely shut down any time the current drawn exceeds 10 amps. Ideally you would want a 48v/20a supply. The 24v/20a supply should work, but it will be less efficient, generate more heat, and you will lose amplification: by the same formula, producing 800w at 24v would take about 33 amps. It probably wouldn't draw that much, but a 20 amp limit caps you at around 480w instead of 800w.
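The arithmetic above can be sketched in a few lines. This is just the power formula applied to the voltages discussed in this thread; the 800w figure is the amp's rated output as quoted above, not something I've verified against the datasheet.

```python
# Current drawn at a fixed power for various supply voltages.
# POWER_W is the 800w rating quoted in this thread (illustrative).
POWER_W = 800

for volts in (48, 24, 20):
    amps = POWER_W / volts
    print(f"{POWER_W} W at {volts} V -> {amps:.1f} A")
```

At 48v the draw stays under a 20A supply's limit; at 20-24v it would exceed it, which is why the higher-voltage supply is the safer pick.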
Some things are meant to be closed. Your mind isn't one of them.
Post by jimustanguitar on Jul 17, 2018 10:00:33 GMT -5
Yeah, I thought it was strange that they didn't specify which voltage the current rating was for. Most amp manufacturers will specify a 'typical' power draw under a more defined set of circumstances so that you can calculate the BTU load in an equipment rack...
It does look like the caps by the power inlet are only 25v, so I *think* that whatever I put in gets regulated down to something less than that... Basically I should still go with 48vdc and as many amps as I can afford instead of finding 15-24v with more current? That's kind of what I was scared of
Still a little curious about how voltage regulators affect circuits... Does power in still equal power out in watts, or does cutting the voltage change the rules - is the output current capped at whatever the input amperage was? I haven't worked with many power supply circuits or anything that manipulates voltage in the middle, so I'm not sure.
Last Edit: Jul 17, 2018 10:08:14 GMT -5 by jimustanguitar
Still a little curious about how voltage regulators affect circuits... Does power in still equal power out in watts, or does cutting the voltage change the rules - is the output current capped at whatever the input amperage was? I haven't worked with many power supply circuits or anything that manipulates voltage in the middle, so I'm not sure.
I don't like to look at it in terms of watts, as that only indicates how much "power" it has. What is 400 watts? It could be 1 volt at 400 amps, or it could be 400 volts at 1 amp. The latter is far more efficient to deliver - less heat dissipation, and you can use smaller wires - but just saying "400 watts" doesn't tell you that. I prefer to look at the volts and amps: instead of just stating "400 watts" it should read "400 watts @ 400 volts" or "1 amp @ 400 volts".
A regulator does just that: it "regulates". It makes sure the output voltage stays constant regardless of input fluctuations and load fluctuations. For example, a 12 volt, 1 amp regulator might accept anything from a little above 12 volts (the output voltage plus the regulator's dropout) up to a 48 volt max input. So even if the input varies up and down between 25 and 40 volts, the output will remain a steady 12 volts. It will also provide up to 1 amp, so even if the load fluctuates between 0 amps and 1 amp, the voltage will remain 12 volts. It also does not matter how much current the input can supply. I could feed it from a 50 amp source, but the output will still be 12v at up to 1 amp.
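One consequence of that behavior for a *linear* regulator is that the dropped voltage is simply burned off as heat: input current roughly equals output current, so the wasted power is the voltage difference times the load current. A minimal sketch, using the hypothetical 12v/1a regulator from the example above:

```python
def linear_reg_dissipation(v_in, v_out, i_load):
    """Heat dissipated by a linear regulator: the dropped volts
    times the load current (input current ~= output current)."""
    return (v_in - v_out) * i_load

# 24 V in, 12 V out, 1 A load -> 12 W wasted as heat,
# the same power the load itself uses (only 50% efficient).
print(linear_reg_dissipation(24, 12, 1.0))
```

This is why linear regulators fed from a much higher input voltage need big heat sinks.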
Power in does not always equal power out. It depends on many factors: what the regulator output is rated for, power lost as heat, step-up vs step-down, and so on. As for how it affects a circuit, the only thing you need to pay attention to is the output of the regulator - everything on the input side has no effect on the circuit. A regulator or power supply acts as a buffer or filter between the real world and your circuit. The regulator doesn't really affect your circuit, but your circuit can affect the regulator: it's possible to overdraw it. If your circuit demands more current than the regulator can provide, it will overheat and possibly be damaged.
Don't know if that helped or made it more confusing lol.
Post by jimustanguitar on Jul 17, 2018 11:45:52 GMT -5
I was thinking of a voltage regulator as kind of a solid-state transformer, where the supply and the load match each other and you could get a higher current out of the 'secondary' than you put in if you're dropping the voltage... I probably shouldn't equate those two.
Now can you explain it like a garden hose for Phil?
Last Edit: Jul 17, 2018 11:46:18 GMT -5 by jimustanguitar
Here is a video on the differences between a linear regulator and a buck converter. They both do the same job, but the buck converter is much more efficient and doesn't require a huge heat sink. (Your amp most likely uses a buck converter.)
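The efficiency gap the video covers can be put in rough numbers. For a linear regulator, efficiency is simply output volts over input volts; for a buck converter, it's typically in the high-80s to 90s percent regardless of the voltage drop. The figures below are illustrative assumptions, not measurements:

```python
V_IN, V_OUT, I_LOAD = 24.0, 12.0, 2.0

# Linear regulator: input current equals output current,
# so efficiency is just Vout / Vin.
linear_eff = V_OUT / V_IN          # 0.5 -> half the input power is heat

# Buck converter: assume ~90% efficiency (typical ballpark).
buck_eff = 0.90

p_load = V_OUT * I_LOAD            # 24 W delivered to the load
linear_heat = p_load / linear_eff - p_load   # 24 W of heat
buck_heat = p_load / buck_eff - p_load       # ~2.7 W of heat
print(linear_heat, round(buck_heat, 1))
```

Same load, roughly a tenth of the heat - hence no huge heat sink on the buck converter.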
Last Edit: Jul 17, 2018 14:46:22 GMT -5 by BeeAmaker