Electricity is governed by two simple rules:
W = V x A
A = V / R
where W is Watts (power), A is Ampères (current), V is Volts, and R is Ohms (resistance).
So a device that wants to charge at 2.5 W needs to draw 0.5 A (500 mA), which at 5 V works out to an effective resistance of 10 Ohms, while a device that wants to charge at 5 W needs to draw 1 A, i.e. an effective resistance of 5 Ohms.
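For a quick sanity check, here is the same arithmetic as a small Python sketch (the 5 V supply and the two power targets come from the text above; the function names are just for illustration):

```python
# Ohm's law and the power law for a purely resistive load on USB's rated 5 V.
USB_VOLTS = 5.0

def current_for_power(watts, volts=USB_VOLTS):
    # W = V x A  ->  A = W / V
    return watts / volts

def resistance_for_current(amps, volts=USB_VOLTS):
    # A = V / R  ->  R = V / A
    return volts / amps

for watts in (2.5, 5.0):
    amps = current_for_power(watts)
    ohms = resistance_for_current(amps)
    print(f"{watts} W at {USB_VOLTS} V -> {amps} A, {ohms} Ohm")
# 2.5 W at 5.0 V -> 0.5 A, 10.0 Ohm
# 5.0 W at 5.0 V -> 1.0 A, 5.0 Ohm
```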
However, if you hook up that 5 Ohm device to a 2.5 W USB port or charger, then that port or charger will have a problem, which can be solved in one of two ways (a rough sketch of both follows the list):
- The computer decides that the USB port has been shorted and turns off the power.
- The charger or computer supplies as much power as it can, but the voltage will drop until an equilibrium is reached, for instance 2.5 V x 1 A = 2.5 W if the device keeps drawing 1 A.
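A toy model of those two behaviours, assuming an idealized port that either trips on overload or droops its voltage down to its power limit (the constant-current assumption and the `trip` flag are mine, not from any USB specification):

```python
def port_response(port_max_watts, requested_amps, rated_volts=5.0, trip=False):
    """Idealized port: either shut off on overload, or let the voltage droop
    until volts * amps equals the port's power limit."""
    requested_watts = rated_volts * requested_amps
    if requested_watts <= port_max_watts:
        return rated_volts                      # within budget: full 5 V
    if trip:
        return 0.0                              # behaviour 1: power switched off
    return port_max_watts / requested_amps      # behaviour 2: voltage droops

print(port_response(2.5, 1.0))             # 2.5 (V): 2.5 V x 1 A = 2.5 W
print(port_response(2.5, 1.0, trip=True))  # 0.0 (V): port shut down
```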
In practice, things are more complex because we're not talking about simple loads such as a DC motor or a light bulb, which have a fixed resistance and turn or burn harder as the voltage increases until they burn out. Instead, your ereader or phone converts the 5 V USB power to the ~4 V the battery needs using a DC/DC converter or switched-mode power supply, which lowers its effective resistance as the voltage drops in an attempt to keep the Watts the same, until it decides the voltage is too low and stops charging.
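A sketch of that constant-power behaviour: the effective resistance is V² / W, so it falls as the voltage falls. The 4.0 V cutoff below which the device gives up is a made-up number for illustration, not a value from the text or any specification.

```python
def effective_resistance(target_watts, supply_volts, cutoff_volts=4.0):
    """Constant-power load: R_eff = V^2 / W, so the effective resistance
    drops as the voltage drops -- until the converter gives up."""
    if supply_volts < cutoff_volts:
        return float("inf")   # device stops charging, draws (almost) nothing
    return supply_volts ** 2 / target_watts

for volts in (5.0, 4.5, 4.0, 3.5):
    print(f"{volts} V -> R_eff = {effective_resistance(5.0, volts):.2f} Ohm")
# 5.0 V -> R_eff = 5.00 Ohm
# 4.5 V -> R_eff = 4.05 Ohm
# 4.0 V -> R_eff = 3.20 Ohm
# 3.5 V -> R_eff = inf Ohm
```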
To reiterate: the rated voltage for USB power is ALWAYS 5 V. However, the actual voltage will drop if the device tries to draw more than the USB port or charger can deliver. The power that ends up being supplied is the minimum of what the computer or charger can deliver and what the device will draw.
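In code, that rule is just a `min()`; the numbers below are hypothetical examples, not measurements:

```python
def power_delivered(supply_limit_watts, device_draw_watts):
    # Whichever side is more restrictive wins.
    return min(supply_limit_watts, device_draw_watts)

print(power_delivered(2.5, 5.0))   # 2.5: a 5 W device on a 2.5 W port
print(power_delivered(10.0, 5.0))  # 5.0: the same device on a 10 W charger
```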
Note also that devices hooked up to a computer tell the computer how much power they need. For instance, my iPhone 6 tells my MacBook Pro that it wants the normal 500 mA plus an extra 1600 mA, so it could draw a total of 10.5 W (in practice it seems to draw about 6 W). Chargers, however, don't run the USB communication protocols; there's a separate specification for how they're supposed to behave.
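The arithmetic behind those numbers, assuming only the figures quoted above (500 mA baseline, 1600 mA extra, 5 V):

```python
BASE_MA = 500     # the standard current every USB device may draw
EXTRA_MA = 1600   # the additional current the iPhone requests from the MacBook
VOLTS = 5.0

budget_watts = VOLTS * (BASE_MA + EXTRA_MA) / 1000
print(budget_watts)   # 10.5 -- the negotiated ceiling, not the actual draw (~6 W)
```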