To work out the value of a series resistor for a given LED, the equation is as follows:

R = (supply voltage - LED forward voltage drop) / LED current
So if, for example, the supply voltage is 12 V, the voltage drop of the LED is 3 V, and the LED current is 20 mA maximum, the equation gives (12 - 3) / 0.02 = 450 ohms, which makes the LED shine at its brightest. This is, however, the minimum value for the resistor: any lower value could damage the LED. It is good practice to move the value up, to say 1k in this example, to stay well within the limits unless maximum brightness is essential. The figures for a given LED are found on the data sheet for the device.

The usual arrangement is a resistor in series with the LED, wired across the supply with the anode towards the + rail. The resistor can connect to either the cathode or the anode of the LED. The cathode is denoted by a flat part on the LED body, and the anode usually has the longer leg.

Power rating is also a consideration for higher-current types of LED, as the extra current raises the power the resistor must dissipate and so calls for a higher-wattage resistor.

If a 12 V LED is used on a 12 V supply, generally no series resistor is required. Another way to work without a resistor is this: if four 3 V LEDs are wired in series across a 12 V supply, the PN junctions drop 3 V each, so 3 V x 4 LEDs equals 12 V. This is for low-power types only, not heavy-duty types, as a condition called thermal runaway can occur with heavy-current types; the series resistor regulates and maintains the working conditions of the LEDs.
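The calculation above can be sketched as a short Python snippet. The function name and structure here are illustrative, not from the original text; it simply applies the series-resistor equation and also works out the power the resistor has to dissipate (voltage across the resistor times the current), which covers the power-rating point as well:

```python
def led_series_resistor(v_supply, v_led, i_led):
    """Return the minimum series resistance (ohms) and the
    power the resistor dissipates (watts) at that current."""
    v_resistor = v_supply - v_led          # voltage the resistor must drop
    r_min = v_resistor / i_led             # Ohm's law: R = V / I
    p_resistor = v_resistor * i_led        # P = V x I for the resistor
    return r_min, p_resistor

# Worked example from the text: 12 V supply, 3 V LED, 20 mA maximum
r, p = led_series_resistor(12.0, 3.0, 0.020)
print(f"minimum resistor: {r:.0f} ohms, dissipating {p:.2f} W")
```

Remember this gives the minimum resistance; in practice you would round up to the next preferred value (or higher, such as the 1k suggested above) unless full brightness is needed.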