(@circuitdriver)
New Member
Joined: 8 years ago
Posts: 1
Topic starter
November 5, 2017 9:02 AM
I've gotten a WS2812 strip working well, thanks to some of the postings here. Bought a 5V 10A power supply to run it.
Now I want to join three one-meter strips (144 pixels each), using one data line but a separate power supply for each strip. Looking at the power requirements, I need about 8.6 amps available per strip to handle the case when all or most of the pixels light up.
When I do the math, that adds up to a total of 25+ amps. I doubt my home's circuits would handle that, since they are only 15A and 20A circuits.
I could run a couple of extension cords to tap other circuits in the house just to see if it works. But that's not a good long-term solution, especially if I want to throw it all in the truck and go show off somewhere.
Any ideas out there?
Thanks.
John
(@hans)
Famed Member Admin
Joined: 12 years ago
Posts: 2859
November 19, 2017 11:30 AM
Hi John,
this might be helpful for your calculation;
For the LEDs we use (for example) a 5V 10A power supply. Since Watts = Volts x Amps, this would theoretically draw at most 50 Watts.
Now translated back to your (I assume) 110V power outlets, the load would still be 50 Watts — power stays the same on both sides of the supply, only voltage and current change. 50 Watts divided by 110V comes to about 0.45 A at the power outlet, so even three such supplies together draw well under 1.5 A from the wall.
Your home circuits will therefore be just fine.
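To make the conversion concrete, here is a minimal sketch of the calculation, using the figures from this thread (three 144-pixel strips, an assumed worst case of ~60 mA per pixel at full white, 5V supplies, 110V mains):

```python
# Figures from the thread; 60 mA/pixel is the usual WS2812 worst-case estimate.
PIXELS_PER_STRIP = 144
STRIPS = 3
MA_PER_PIXEL = 60          # worst case: all pixels full white
LED_VOLTAGE = 5.0          # volts on the strip side
MAINS_VOLTAGE = 110.0      # volts at the wall outlet

# Current the strips draw on the 5V side, in amps.
led_current = PIXELS_PER_STRIP * STRIPS * MA_PER_PIXEL / 1000.0

# Power is (roughly) the same on both sides of the supply.
watts = led_current * LED_VOLTAGE

# Current drawn from the 110V wall outlet.
mains_current = watts / MAINS_VOLTAGE

print(f"{led_current:.1f} A at 5 V -> {watts:.0f} W -> {mains_current:.2f} A at 110 V")
```

So roughly 26 A on the 5V side becomes only about 1.2 A at the wall (a bit more in practice, since real supplies are not 100% efficient) — far below a 15A breaker.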