I have two LED strips used as bias lighting for two 27" monitors. Each strip has its own USB cable for power, and both cables are currently plugged into a 2-port phone wall charger. Because of space constraints, I would like to use a 2x female to 1x male adapter to join the two USB cables into one and plug that into a much smaller 1-port USB wall charger.

Will using an adapter like this affect energy efficiency?

https://www.aliexpress.com/item/1005004340044483.html

  • WhiteHotaru@feddit.de · 1 year ago

    There are two questions to answer:

    • Which current and voltage do the LED strips need? (5 V at 2 A would be something you could supply over USB 2.)
    • What maximum output does the charger provide? Is it enough to power both strips (5 V, and twice the amperage)?

    If these match, it should work. Another concern is the cable itself: in theory it could start to burn if you push too much current through it, but with plain USB I doubt it. If it gets hot after some time, scrap the setup; that would be a fire hazard.
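
    Something like this back-of-the-envelope check, with placeholder numbers until you know the real ratings:

    ```python
    # Rough check: can one charger power both strips?
    # All numbers are placeholders - substitute the real ratings from the labels.
    strip_voltage = 5.0        # V, both strips
    strip_current = 2.0        # A drawn by one strip (example value)
    charger_voltage = 5.0      # V output of the single-port charger
    charger_max_current = 3.0  # A the charger can supply at that voltage (example)

    total_current = 2 * strip_current
    total_power = total_current * strip_voltage
    print(f"Strips need {total_current:.1f} A at {strip_voltage:.0f} V ({total_power:.0f} W total)")

    if charger_voltage == strip_voltage and charger_max_current >= total_current:
        print("Charger rating covers both strips.")
    else:
        print("Charger rating does NOT cover both strips.")
    ```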

    • SoySaucePrinterInk@sh.itjust.works (OP) · 1 year ago

      The LED strips use SMD 3528 LEDs, which need 5V, and the wattage is listed as 11.52W/min. The amperage isn’t listed, but for those LEDs I’m seeing 5Ah online. The charger provides 40W.

      • ChaoticNeutralCzech@feddit.de · 1 year ago

        You’re confused. W/m means watts per meter, and the “5Ah” is probably actually 5 A, the maximum current you can push through the strip (which limits the usable length to roughly 2 m).
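
        The arithmetic, assuming the listed 11.52 W/m at 5 V and a 5 A limit:

        ```python
        # Maximum strip length the 5 A figure allows, assuming the listed
        # 11.52 W/m at 5 V (both numbers taken from this thread).
        watts_per_meter = 11.52   # W/m
        supply_voltage = 5.0      # V
        max_current = 5.0         # A

        max_power = supply_voltage * max_current      # 25 W
        max_length = max_power / watts_per_meter      # about 2.17 m
        print(f"Max power {max_power:.0f} W -> max length about {max_length:.2f} m")
        ```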

      • UnlimitedRumination [he/him]@sh.itjust.works · 1 year ago

        Watts, in a resistive example like yours, are volts times amps. I would have been able to answer this question much better a year ago, so forgive me if I’m misremembering the specs, but I’ll answer since nobody else has. Two things suggest to me that this might be a bad idea:

        • The charger is 40W, so it’s probably USB PD (I don’t know anything about QC, so maybe I’m wrong). PD supplies more than 15W (5V x 3A) by stepping up the voltage, not the amperage. Stepping up either would likely be bad or very bad for some part of your circuit, but don’t worry about that: without the powered device negotiating for it, PD won’t activate. It should max out at 15W… I think. It depends on the resistance on the CC lines, and using a splitter could screw up the resistance that tells the power supply which USB version to support and whether it can go up to 3A (15W). Sorry, it’s been a while since I’ve worked with USB power. Two strips of 11W will need more power than that. Basically, my concern is that you won’t get adequate power out of the charger for one reason or another (rough numbers in the sketch after this list).
        • Where are you getting the 11.52W/min number? Watts don’t have a time unit, and that much precision sketches me out; it’s almost as if someone who isn’t adequately educated measured the power straight off a multimeter once and just wrote that on the product page. Is the LED strip from a reputable manufacturer?
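
        Rough numbers for what I mean, treating 15 W as the ceiling and guessing the strip length per monitor:

        ```python
        # Rough budget check: one 5 V / 3 A port vs. two bias-light strips.
        # Strip length per monitor is a guess - measure yours.
        watts_per_meter = 11.52   # W/m, from the product listing
        strip_length_m = 1.5      # m of strip per monitor (assumed)
        num_strips = 2

        demand_w = watts_per_meter * strip_length_m * num_strips   # about 34.6 W
        budget_w = 5.0 * 3.0                                       # 15 W without PD negotiation

        print(f"Demand about {demand_w:.1f} W vs. budget {budget_w:.0f} W")
        print("Not enough." if demand_w > budget_w else "Might be OK.")
        ```
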
        • ChaoticNeutralCzech@feddit.de · 1 year ago

          He meant watts per meter, not minute – the strips can be cut and rejoined. As for the 5 Ah, no clue.

          The perimeter of a 27″ monitor is well over 1 m (close to 1.9 m for a 16:9 panel), so at 11.52 W/m each strip will want a charger rated for at least 3 A. This is why I prefer higher-voltage strips: less current is required and more resistance can be tolerated. In any case, the power is quite high, and that could cause overheating problems.
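
          Back-of-the-envelope, assuming the strip runs the full perimeter and the 11.52 W/m figure is right:

          ```python
          import math

          # Current per monitor if the strip wraps the full perimeter of a 27" 16:9 panel.
          # 11.52 W/m and 5 V are the figures from earlier in the thread.
          diagonal_in = 27.0
          width_in = diagonal_in * 16 / math.hypot(16, 9)
          height_in = diagonal_in * 9 / math.hypot(16, 9)
          perimeter_m = 2 * (width_in + height_in) * 0.0254   # about 1.87 m

          watts_per_meter = 11.52
          supply_voltage = 5.0

          power_w = perimeter_m * watts_per_meter    # about 21.5 W
          current_a = power_w / supply_voltage       # about 4.3 A
          print(f"Perimeter {perimeter_m:.2f} m -> {power_w:.1f} W -> {current_a:.1f} A per monitor")
          ```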

    • ChaoticNeutralCzech@feddit.de · 1 year ago

      Fire hazard? Not really. The charger will not provide more than 5 V at 3.1 A, or whatever its rating is. Even if the strip fails as a short circuit (unlikely; most LEDs blow open, and there is at least a resistor in series anyway), nothing will receive more than about 15 W of power. A 1 m cable can safely dissipate that.

      Worst-case scenario: an LED fails short and its series resistor receives over 6x its intended power (with 5 V across it instead of about 2 V, its current increases correspondingly). The resistor is soldered onto heat-spreading flexible PCB and will probably not blow open. It will make the area fairly hot and could potentially melt the plastic on the back of the monitor while the rest of the strip keeps glowing, just more dimly. It’s hard to tell whether it could exceed flammable temperatures with moderate heatsinking and only a few watts of power.
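
      For the numbers behind the 6x figure (assuming a typical 5 V segment with one roughly 3 V LED and its resistor in series):

      ```python
      # Extra power in the segment's series resistor if its LED fails short.
      # Assumes a typical 5 V segment: one LED dropping about 3 V plus a resistor.
      supply_v = 5.0
      led_drop_v = 3.0

      normal_v = supply_v - led_drop_v   # about 2 V across the resistor normally
      fault_v = supply_v                 # full 5 V across it once the LED is shorted

      # P = V^2 / R, so for the same resistor the ratio is (V_fault / V_normal)^2
      overload_factor = (fault_v / normal_v) ** 2
      print(f"Resistor dissipates about {overload_factor:.2f}x its normal power")   # 6.25x
      ```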

      To keep things safe, I would deliberately add resistance before the LED strip, using something like a 1 Ω 5 W resistor (or a shitty long cable). That way, the voltage at the strip drops significantly in case of a short. The LEDs will also run at lower-than-intended current, which prolongs their life and decreases the risk of failure.
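
      To illustrate the idea with made-up fault currents:

      ```python
      # Effect of the added series resistance during a fault: whatever current the
      # fault pulls, the extra resistor drops V = I * R before the strip, so the
      # voltage at the faulty section collapses instead of it getting the full 5 V.
      supply_v = 5.0
      series_r = 1.0   # added 1-ohm resistor (a long, thin cable acts similarly)

      for fault_current in (1.0, 2.0, 3.0):   # A, example fault currents
          strip_v = supply_v - fault_current * series_r
          print(f"{fault_current:.0f} A fault -> strip sees only {strip_v:.1f} V")
      ```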

      Edit: Some microwaves have a 0.8 Ω 25 W resistor as part of the inrush-limiting circuit, at least in 230 V regions. Feel free to use one of those; it will happily handle a semi-short circuit. Or, you know, an automotive fuse.

      The safest option is replacing the whole setup with an LED strip that has no resistors (bare LEDs) and a constant current driver.

      • WhiteHotaru@feddit.de · 1 year ago

        Well, it is nice that OP got an answer from someone who is clearly more knowledgeable than I am. Thanks!