With 220V, you're splitting the load between two hot wires (each 110V, 180° out of phase, meaning the peak of each leg's sine wave lines up with the other's trough, so current returns through the other hot wire the way it would through the neutral in a 110V circuit). You're still drawing the same amount of power. That's why the wire used to feel hot and now it doesn't: you're running half the load through that same wire (at 110V) and the other half through an additional hot wire (at 110V).

The small percentage I'm talking about comes primarily from power factor. Look up the specs for a Mean Well HLG-240 driver (the most common LED driver I know of) and you'll see a 3% difference in power factor; these drivers actually post a better power factor at 110-115V than they do at 220-230V.
POWER FACTOR (Typ.)
PF≧0.98/115VAC, PF≧0.95/230VAC @ full load
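To put rough numbers on it, here's a quick sketch. The ~240 W load is just an assumed example for an HLG-240 at full output, and the only other inputs are the power factor figures quoted above:

```python
# Rough illustration: line current for an assumed ~240 W load,
# using the power factor figures quoted from the datasheet above.
P = 240.0  # watts delivered, assumed for the example

def line_current(power_w, volts, pf):
    """I = P / (V * PF): the current the hot wire actually carries."""
    return power_w / (volts * pf)

print(f"@115 V, PF 0.98: {line_current(P, 115, 0.98):.2f} A")  # ~2.1 A in the single hot
print(f"@230 V, PF 0.95: {line_current(P, 230, 0.95):.2f} A")  # ~1.1 A in each hot
```

Same power either way, but roughly half the current per conductor at 230V, which is where the "wire doesn't feel hot anymore" observation comes from.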
Your calcs are still gonna be wrong if you keep leaving wire resistance out of the math.
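For what it's worth, here's what folding wire resistance in looks like. The 50 ft run of 14 AWG copper (about 2.5 ohm per 1000 ft) is just an assumed example; swap in your own gauge and length:

```python
# Illustrative only: assume a 50 ft circuit in 14 AWG copper (~2.525 ohm per 1000 ft),
# i.e. ~100 ft of conductor out and back, for roughly 0.25 ohm of loop resistance.
R_loop = 2.525 / 1000 * 100  # ohms

def wire_loss(power_w, volts, pf, r_ohm):
    """I = P / (V * PF), then the heat the wire itself burns off is I^2 * R."""
    i = power_w / (volts * pf)
    return i ** 2 * r_ohm

print(f"I^2R loss @115 V: {wire_loss(240, 115, 0.98, R_loop):.2f} W")  # ~1.1 W
print(f"I^2R loss @230 V: {wire_loss(240, 230, 0.95, R_loop):.2f} W")  # ~0.3 W
```

Half the current means roughly a quarter of the I²R heating in the same copper, which matters a lot more to how warm the wire feels than the 3% power factor spread does.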