THE GREATEST TURN SIGNAL MOD....EVER
I have video. I made this.
#13
OK, this is how I did it. I took some "hot bodies" flush-mount signals, cracked them open, replaced the small surface-mount LEDs with two 1 watt LEDs, and then hid a 20 watt power resistor behind the fairing. I am looking into getting four of those bad boys in there, but I will have to make my own heat sink for that.
#18
Sanders - To be seen better on the street?
Wirewalker - I guess I don't understand the principle. I thought that LEDs are rated at 2.4 volts but will run on whatever you give them; if you give them more than they're rated for, they burn out. Adding a resistor reduces the wattage, because the resistor dissipates most of it as heat. I think that a 1 amp, 2.4 volt LED would use about .850 watts, while the resistor would use about 3.5. Also, I've never heard of resistance being measured in anything but ohms, but you say you have a 20 watt resistor? Obviously it works very well; I'm just trying to understand how it works a little better.
#19
You are right most of the way; your calculations look right. But resistors are rated for their resistance (ohms) and for their power, i.e. heat, dissipation (watts). If you feed an LED too much voltage, the junction current becomes too great and it will burn out. No one ever worries about or mentions the resistor wattage, because LEDs usually draw only a little current and therefore only need a 1/2 or 1/4 watt resistor. Also, if you start putting these guys in series, you add up their combined forward voltages and you may not even need a resistor; it all depends on the design. I started with a 10 watt resistor; it may have been OK, but it got real hot and I didn't want to melt anything.
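The series-string sizing described above can be sketched numerically. This is a rough Python illustration under assumed values, not the actual circuit from the mod: the 12 V supply, 3.5 V forward drop, and 350 mA drive current are hypothetical example numbers for a generic 1 W LED.

```python
# Rough sketch: sizing a ballast resistor for LEDs on a ~12 V vehicle supply.
# The forward voltage (3.5 V) and current (350 mA) are illustrative assumptions.

def ballast_resistor(supply_v, led_vf, num_leds, current_a):
    """Return (resistance_ohms, dissipation_watts) for a series LED string."""
    drop = supply_v - led_vf * num_leds   # voltage the resistor must absorb
    if drop <= 0:
        return 0.0, 0.0                   # string drops the whole supply: no resistor needed
    r = drop / current_a                  # Ohm's law: R = V / I
    p = drop * current_a                  # heat the resistor must shed: P = V * I
    return r, p

# One hypothetical 1 W LED at 350 mA from 12 V: the resistor eats 8.5 V,
# so it must shed roughly 3 W -- hence a chunky power resistor, not a 1/4 W part.
r1, p1 = ballast_resistor(12.0, 3.5, 1, 0.35)
print(f"1 LED:  {r1:.1f} ohms, {p1:.2f} W in the resistor")

# Three of the same LEDs in series drop 10.5 V, so the resistor shrinks
# to about half a watt of dissipation.
r3, p3 = ballast_resistor(12.0, 3.5, 3, 0.35)
print(f"3 LEDs: {r3:.1f} ohms, {p3:.2f} W in the resistor")
```

This is why stringing LEDs in series can shrink or even eliminate the ballast resistor: the more forward voltage the string drops, the less the resistor has to burn off as heat.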
#20
I think you need some basic EE101 here. What laws of physics are you following?
If you have a 1 amp, 2.4 V LED, it uses exactly 2.4 W of power, by power = voltage × current. So how did you come up with .850 W?
A resistor doesn't consume power in any useful sense; it just turns it into heat. Where are you getting this 3.5 number from?
You are right that resistance is in units of ohms. But a resistor also has a maximum voltage and a maximum current at which it will deliver its rated resistance. So if a resistor has a max voltage of 15 V and a max current of 2 A, then it has a power rating of 30 W. If a 100 ohm resistor is required but you use, say, a 1/4 W 100 ohm resistor, you'll likely blow the resistor (and maybe some other circuitry).
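The power-rating point above can be checked with P = I²R arithmetic. A minimal sketch, with one assumed value: the 100 ohm and 1/4 W figures come from the post, but the 200 mA current is an illustrative assumption.

```python
# Sketch: why a correct resistance value can still be the wrong part.
# The 200 mA current below is an assumed example, not a measured figure.

def resistor_dissipation(resistance_ohms, current_a):
    """Power the resistor must shed as heat: P = I^2 * R."""
    return current_a ** 2 * resistance_ohms

def is_safe(resistance_ohms, current_a, rated_watts, derate=0.5):
    """Conservative check: keep dissipation under half the rated wattage."""
    return resistor_dissipation(resistance_ohms, current_a) <= rated_watts * derate

# 100 ohms at 200 mA dissipates about 4 W: fine for a 10 W power resistor,
# far beyond what a 1/4 W resistor can survive.
p = resistor_dissipation(100, 0.2)
print(f"{p:.1f} W  10 W part ok: {is_safe(100, 0.2, 10)}  1/4 W part ok: {is_safe(100, 0.2, 0.25)}")
```

The 50% derating here is just a common rule of thumb; the original poster's move from a 10 W to a 20 W resistor after the 10 W "got real hot" is the same margin applied in hardware.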
ORIGINAL: HurricaneForce
I think that a 1 amp, 2.4 volt LED would use about .850 watts,