I've also noticed that most people use 6 ohm / 50 W resistors but then complain that they get very hot...I figured a higher resistance would solve the heating problem, yet I haven't seen anyone use 25, 50, or 100 ohm resistors rated at 25 W...everyone seems to stick with the 6 ohm ones.
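To put rough numbers on the heat, here's a quick Python sketch (assuming a nominal 12 V supply and the load resistor wired straight across it, which is how these kits are normally installed):

```python
# Heat dissipated in a load resistor wired straight across the supply:
# P = V^2 / R  (assumes a nominal 12 V system voltage)
SUPPLY_V = 12.0

for r_ohms in (6, 25, 50, 100):
    power_w = SUPPLY_V ** 2 / r_ohms
    print(f"{r_ohms:>3} ohm resistor dissipates about {power_w:.1f} W")

# 6 ohm   -> ~24 W  (close to its 50 W rating, hence the heat complaints)
# 100 ohm -> ~1.4 W (barely warm)
```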
I could be wrong, but here's how I see things:
The LED bulbs + resistors are supposed to draw less current than the stock 21 W bulbs. Based on the power law (P = V*I), a quick calculation shows that when signaling, the stock bulbs draw at most 21 W / 12 V = 1.75 A, while at night (when they're on all the time) they draw a constant ~525-700 mA, since they run at roughly 30-40% of nominal power.
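Quick sanity check on those bulb numbers (again just a sketch at the nominal 12 V, not the ~14 V you'd actually see with the engine running):

```python
# Stock 21 W incandescent bulb current from P = V * I  ->  I = P / V
SUPPLY_V = 12.0
BULB_W = 21.0

full_current = BULB_W / SUPPLY_V              # signaling, full brightness
print(f"Full power: {full_current:.2f} A")    # ~1.75 A

# At night the same bulb runs at roughly 30-40% of nominal power
for fraction in (0.30, 0.40):
    dimmed = BULB_W * fraction / SUPPLY_V
    print(f"At {fraction:.0%} power: {dimmed * 1000:.0f} mA")  # ~525-700 mA
```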
So, if the resistor goes in parallel, then based on Kirchhoff's current law (I = I1 + I2) I'd want to make sure the total draw stays below the stock figure: I1 (LED) + I2 (resistor) < 1.75 A. With the 6 ohm / 50 W resistor, I2 = 12 V / 6 ohm = 2 A, which exceeds 1.75 A all by itself, so no good...with a 100 ohm / 25 W resistor, I2 = 12 V / 100 ohm = 120 mA, which is OK...so, basically, the LED would only have to stay under about 1.63 A, and I'm pretty sure it draws nowhere near that. So wouldn't the LEDs actually put less stress on the LCM if we used a resistor with a higher value (not to mention solve the heat problem)?
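And the parallel-branch math itself (same 12 V assumption; the LED current below is just a placeholder, the real figure would come from the LED bulb's spec sheet):

```python
# Kirchhoff's current law for the two parallel branches: I_total = I_led + I_resistor
# Goal: keep I_total at or below the stock bulb's 1.75 A so the LCM never sees
# more load than the original bulb presented.
SUPPLY_V = 12.0
STOCK_A = 21.0 / SUPPLY_V   # 1.75 A baseline

LED_A = 0.35                # placeholder: many LED signal bulbs draw a few hundred mA

for r_ohms in (6, 100):
    resistor_a = SUPPLY_V / r_ohms
    total_a = LED_A + resistor_a
    verdict = "OK" if total_a <= STOCK_A else "exceeds stock draw"
    print(f"{r_ohms:>3} ohm: resistor {resistor_a:.2f} A + LED {LED_A:.2f} A "
          f"= {total_a:.2f} A -> {verdict}")

# 6 ohm:   2.00 A from the resistor alone already exceeds 1.75 A
# 100 ohm: 0.12 A plus the LED leaves plenty of headroom under 1.75 A
```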
I'm still debating this one...any electricians around?