LEDs are never driven directly from a pure voltage source. Unlike a resistor, an LED has a voltage-current curve that is nonlinear and extremely steep, so slightly too much voltage causes the LED to draw a huge current and burn out, while slightly too little leaves it unlit. Worse, the forward voltage needed for a given current drops as the LED warms up, which makes direct voltage drive unstable. If you had a precisely regulated adjustable voltage supply and set it to just the right value to light the LED, the LED would warm up a bit, draw more current, warm up further, and draw still more current, in a "death spiral" of ever increasing current and heat until the LED burned out.
The usual solution is to use a resistor in series with the LED to stabilize the current at the desired value: the supply voltage minus the LED's forward drop appears across the resistor, so the resistance sets the current. A constant-current regulator is sometimes used instead; it is more precise, but more expensive.
As a rule of thumb, most small indicator LEDs can be driven at about 20 mA, but use the figure from the LED's datasheet if you can.
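To size the series resistor, subtract the LED's forward voltage from the supply voltage and divide by the target current. The sketch below shows that arithmetic; the 5 V supply, 2 V forward drop, and 20 mA target are assumed example values, not figures from any particular datasheet.

```python
# Series-resistor sizing for an LED: R = (V_supply - V_forward) / I_target.
# The values below are illustrative assumptions, not datasheet figures.

V_SUPPLY = 5.0    # volts, e.g. a 5 V logic supply (assumed)
V_FORWARD = 2.0   # volts, a roughly typical red-LED forward drop (assumed)
I_TARGET = 0.020  # amps, the 20 mA rule-of-thumb figure

def led_series_resistor(v_supply: float, v_forward: float, i_target: float) -> float:
    """Return the series resistance (ohms) that sets the LED current to i_target."""
    if v_supply <= v_forward:
        raise ValueError("Supply voltage must exceed the LED forward voltage.")
    return (v_supply - v_forward) / i_target

if __name__ == "__main__":
    r = led_series_resistor(V_SUPPLY, V_FORWARD, I_TARGET)
    p = (V_SUPPLY - V_FORWARD) * I_TARGET  # power dissipated in the resistor
    print(f"Series resistor: {r:.0f} ohms")          # 150 ohms for these values
    print(f"Resistor dissipation: {p * 1000:.0f} mW")  # 60 mW, fine for a 1/4 W part
```

If the computed value falls between standard resistor values, rounding up to the next standard value gives a slightly lower, safer current.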