Forums › The Water Cooler › General Discussion › Electrical Advice, please
<blockquote data-quote="Perplexed" data-source="post: 1542158" data-attributes="member: 7157"><p>My apologies, I'm not expressing myself very well, and I should've re-read my old, dusty physics textbook before saying anything :P I do understand (well, kind of!) the relationship between voltage and amperage (I = V/R, a.k.a. Ohm's Law).</p><p></p><p>My plasma cutter is set up so that it can operate on either 120V or 240V, much like SS's table saw. However, the amperage draw can be adjusted via an "Output Current Control" knob to suit different thicknesses of metal. The knob is calibrated from 20 to 30 amps: the thicker the metal, the higher the setting (i.e., more amperage and less voltage). So if I'm reading Ohm's Law correctly, the cutter would actually have a higher maximum amperage draw at 120V than at 240V - right? If that's the case, then why do heavy-duty cutters (the kind that cut 1/2" or thicker metal), and welders, require a 240V supply?</p><p></p><p>I realize this is getting off on a tangent, but I'd like to get my facts straight.</p></blockquote><p></p>
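The arithmetic behind the question can be sketched with the usual fixed-power model: for a given cutting job the machine needs roughly the same input power P, and since P = V × I, doubling the supply voltage roughly halves the current drawn from the wall. The wattage figure below is purely illustrative (not from any cutter's spec sheet):

```python
# Fixed-power sketch: P = V * I, so for the same input power,
# a 240 V supply draws about half the current of a 120 V supply.
# The 4000 W figure is an assumed, illustrative input power.

def input_current(power_watts: float, supply_volts: float) -> float:
    """Current drawn from the wall for a given input power and supply voltage."""
    return power_watts / supply_volts

power = 4000.0  # hypothetical input power in watts

print(input_current(power, 120.0))  # 33.33... A on a 120 V circuit
print(input_current(power, 240.0))  # 16.66... A on a 240 V circuit
```

This is why heavy-duty cutters and welders specify 240 V: a typical 120 V branch circuit is limited to 15–20 A (roughly 1.8–2.4 kW), so a machine needing several kilowatts simply cannot draw enough current at 120 V, while at 240 V the same power fits comfortably within a common breaker rating.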