
Dimming LED strips with Resistor vs PWM

So I'm working on a project and I need to add a light strip to it, but I don't want the strip at maximum brightness. I know PWM is the proper way to reduce brightness, but I'm only looking to dim the strip to one fixed level; no adjustment needed. Is it going to be bad to simply put a resistor in line on the power side?

 

I'm working with a white BitFenix 6" strip that has 6 LEDs on it. It's going in a car to illuminate the center console area. The car is a '58 Corvette, which lacks interior/under-dash space, so I'm aiming to keep whatever I do as streamlined as I can manage.

 

So what is the worst case here? Will the constant lowered voltage hurt the LED chips? Or should I be worried about the resistor getting hot? I test-wired it with a random 1 kΩ resistor I had lying around, and the brightness seems like what I want. However, I don't have a bench power supply on hand to leave it on for an extended time with some circuit protection.


Ryzen 7 2700X , Asus Prime X570-Pro, Bykski CPU Block, AMD Vega 56, Barrow GPU block, g.Skill Ripjaws V 32GB PC2800, Dual EKWB SE360 Radiators, Corsair RM750x PSU. All in a Lian-Li PC011 Dynamic XL case.


Lowering the voltage doesn't hurt and is better than PWM. PWM is used because it's cheaper than voltage regulation and works fine at high frequencies.

For the resistor, do the math:

P = U × I

U = R × I

 

=> P = R × I²

P is the power the resistor will dissipate as heat.
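As a sanity check, the resistor power formula P = R × I² can be run with the thread's numbers. The 2.4 V drop is an assumption borrowed from the 12 V example later in the thread; the 1 kΩ value is the poster's test resistor:

```python
# Resistor power dissipation via P = R * I^2 (from P = U*I and U = R*I).
# Example numbers are assumptions: 2.4 V dropped across a 1 kOhm resistor.
voltage_across_resistor = 2.4   # volts dropped across the resistor (assumed)
resistance = 1000.0             # ohms (the 1 kOhm test resistor)

current = voltage_across_resistor / resistance   # I = U / R
power = resistance * current ** 2                # P = R * I^2

print(f"I = {current * 1000:.2f} mA, P = {power * 1000:.2f} mW")
```

A few milliwatts in a part rated for an eighth of a watt or more, so heat is a non-issue at these currents.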


I'm only in my first year of electronics in college, so take what I say with a few grains of salt.

As far as I'm aware, running less voltage through your LEDs won't damage them; only a higher voltage would be a problem.

And if the resistor does get hot, you can replace it with two 2 kΩ resistors in parallel to spread the load. I've been told this is done in quite a few circuits.
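The parallel trick is easy to verify: two equal resistors in parallel halve the resistance, and since each one sees the full voltage, each dissipates half the total power. A quick sketch (the 2.4 V figure is an assumed voltage drop for illustration):

```python
# Two equal resistors in parallel: same equivalent resistance as one part of
# half the value, but the heat is split between them. Values are assumptions.
def parallel(*rs):
    """Equivalent resistance of resistors wired in parallel."""
    return 1.0 / sum(1.0 / r for r in rs)

r_equiv = parallel(2000.0, 2000.0)     # two 2 kOhm in parallel -> 1 kOhm
voltage = 2.4                          # volts across the pair (assumed)

total_power = voltage ** 2 / r_equiv   # P = U^2 / R for the combination
power_each = voltage ** 2 / 2000.0     # each 2 kOhm sees the full voltage

print(r_equiv, total_power, power_each)
```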

8 minutes ago, MandicReally said:

So I'm working on a project and I need to add a light strip to it... Is it going to be bad to simply put a resistor in line on the power side?

Using PWM is "ideal" in the sense that power is actively being shut off. If the PWM duty cycle equals the brightness percentage, then 50% brightness uses 50% of the power the LED would otherwise draw. A resistor, by contrast, expends extra energy because it's still a load on the circuit. However, it may not really matter: LEDs consume little power as-is.

 

As far as your questions go:

  • Lowered voltages don't hurt LEDs; it's voltages higher than what they're rated for that do.
  • Resistors have what is called a wattage rating. As long as the power (voltage times current) dissipated in the resistor doesn't exceed that rating, it's fine, even if it gets warm. There is a maximum temperature, but for what you're doing the power dissipation is likely too small to matter.

Thank you folks, those were the answers I needed. In an automotive application the slightly increased load from the resistor is a non-issue, and the brightness seems perfect. I'll try to bench-test it to check for a hot resistor before I finalize, but I'm thinking I'll be fine.

 

This was one of those simple things I couldn't quite wrap my head around.



Here's some LED basics.

 

An LED doesn't work at a specific voltage; it's a current-driven device. However, there is the notion of forward voltage, which, very simplified, means the minimum voltage at which the LED is guaranteed to be fully turned on and producing light.

 

So basically this means that the LED has three regions of operation :

 

1. Circuit voltage is below the forward voltage of the LED (Vf from now on): it's as if the LED isn't in the circuit at all; it blocks current flow completely.

2. Circuit voltage is within a very narrow region just below Vf, say Vf − 0.1–0.2 V. In this region the LED starts to produce some light but acts like a resistor, limiting the current flow.

3. Circuit voltage is above Vf: the LED is fully open and produces light... BUT if you don't limit the current flowing through it, the LED eventually heats up and burns out.

 

You can't rely on that very narrow region to limit the current going through the LED, because it varies from LED to LED due to manufacturing variation and also shifts with other factors, like how warm the LED is. That region drifts significantly with temperature. For example, a white or blue LED could have a forward voltage of 3.0 V and start to light up from around 2.8 V at 20–40 degrees Celsius, but from around 2.6 V when warm at 60–70 degrees Celsius.

 

So what you normally want is to pick a voltage that's guaranteed to be slightly above the forward voltage, so that if you take 10 random LEDs out of a bag, all of them are guaranteed to be fully open and letting current flow through. In my example above, I would go with 3.2 V and then figure out a way to limit the current going through the LED or through a strip of LEDs.

 

The easiest method is a resistor, using the simple formula known as Ohm's law: voltage equals current times resistance (U = I × R).

 

So let's say we have this circuit 

 

[ power supply  +12 v ] ------- [ resistor ] ------- [ led 1 ] ---- [ led 2 ] ----[ led 3 ] ----- [ power supply GND ]

 

I know I want each LED to receive at least 3.2 V to make sure they are all fully open and letting current flow, not stuck in the narrow region described in [2.].

So the 3 LEDs in series drop 3 × 3.2 V = 9.6 V in total, which means the resistor has to absorb the difference between 12 V and 9.6 V.

Let's put the numbers in the formula

 

12 V − 9.6 V = Current × Resistance, or 2.4 V = Current × Resistance

 

If you have a resistor value, you can figure out the maximum current that would go through the LEDs by simply rewriting this as Current = 2.4 / Resistance. With your 1000 Ω resistor, that's Current = 2.4 / 1000 = 0.0024 A, or 2.4 mA through the LEDs (in series, the same current flows through every LED).

If you want to set the maximum current through the LEDs, rewrite the formula as Resistance = 2.4 / Current. For example, for 10 mA through the LEDs: Resistance = 2.4 / 0.01 = 240 Ω.

 

The only other thing you need to be aware of is how much heat is produced in the resistor. The formula is Power = Current² × Resistance, so for 10 mA and a 240 Ω resistor we get P = (0.01)² × 240 = 0.0001 × 240 = 0.024 W, which means a plain 0.1 W surface-mount resistor or a 0.125 W axial resistor is guaranteed not to overheat.
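The whole walkthrough can be condensed into a short sketch. The supply voltage, per-LED forward voltage, and target current below are the example's assumed numbers, not values from a datasheet:

```python
# Series-resistor sizing for an LED string, following the steps above.
# Assumed example values: 12 V supply, three white LEDs at ~3.2 V forward
# voltage each, 10 mA target current.
supply_v = 12.0
vf = 3.2           # assumed forward voltage per LED
n_leds = 3
target_i = 0.010   # 10 mA desired through the string

drop_v = supply_v - n_leds * vf    # voltage the resistor must absorb
resistance = drop_v / target_i     # R = U / I  (Ohm's law)
power = target_i ** 2 * resistance # P = I^2 * R  (heat in the resistor)

print(f"R = {resistance:.0f} ohm, P = {power:.3f} W")
```

Swap in your own supply voltage and LED count; the structure of the calculation stays the same.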

 

Now, the resistor sets the maximum current flow, so that's the maximum brightness the LEDs can reach. If you want to adjust the brightness, you'd have to change the resistor or use PWM.

The resistor is simply a safety limit; it sets the maximum and is hard to change.

 

PWM turns the LEDs on and off a few thousand times a second, and by adjusting how long the LEDs stay on versus off, you perceive a different brightness. Note that it's not linear: at 10% duty cycle the light won't look only 10% as bright compared to no PWM. Human eyes don't work like that.
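A common way to compensate for that nonlinearity is gamma correction: run the desired perceived brightness through a power curve before turning it into a duty cycle. This is a generic sketch rather than anything from the thread, and the gamma value of 2.2 is a conventional assumption you'd tune by eye:

```python
# Map perceived brightness (0.0-1.0) to an 8-bit PWM duty value using a
# gamma curve, a common approximation for how eyes judge LED brightness.
GAMMA = 2.2  # conventional assumed value; tune by eye for a given LED

def duty_for_brightness(perceived: float, resolution: int = 255) -> int:
    """Return the PWM compare value for a perceived brightness in [0, 1]."""
    perceived = min(max(perceived, 0.0), 1.0)
    return round((perceived ** GAMMA) * resolution)

# Half the perceived brightness needs far less than half the duty cycle.
print(duty_for_brightness(0.5))   # well below 128
print(duty_for_brightness(1.0))   # full on -> 255
```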


If you're into this kind of thing, get an Arduino Nano and build a small MCU circuit. That way you can program it any way you want and make changes whenever you like; it's a cheap way of controlling LEDs.


