There are a number of parts to the answer.
Number one - it depends. There is no simple answer.
Number two - the distinction between classes is probably misplaced.
To be clear: Class B means the output devices are biased so that they turn completely off when not driven, and there is no quiescent current through the output stage. Class AB means the output devices are biased to be slightly on, even when not driven, so there is some quiescent current through the output stage. (Class A means the output devices are biased so that even at full power neither device ever turns off; the quiescent current is thus very high, and the output stage dissipates similar, or more, power when idle as a class B amp does when run at full power.)
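To put rough numbers on that, here is a small Python sketch of idle dissipation for the three classes. The rail voltage and bias currents are illustrative assumptions, not figures from any particular amplifier.

```python
# Idle (quiescent) dissipation of a complementary output pair for the three
# bias classes. All numbers are illustrative assumptions, not measurements.
rail_voltage = 35.0        # V per supply rail (assumed)
quiescent_current = {
    "Class B":  0.0,       # devices fully off at idle
    "Class AB": 0.05,      # a typical-ish few tens of mA of bias (assumed)
    "Class A":  1.0,       # must at least equal the peak load current (assumed)
}

for bias_class, i_q in quiescent_current.items():
    # Each device sits across roughly one rail, so the pair dissipates ~2 * V * Iq.
    print(f"{bias_class}: idle dissipation ~ {2 * rail_voltage * i_q:.1f} W")
```

With those assumed numbers the class AB stage idles at a few watts, while the class A stage idles at tens of watts, which is why the class A comparison above is so unflattering.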
There seems to be an implicit idea in the OP that class B is the province of signal-level stages and AB the province of power-amplifier stages. This isn’t really the case. As a rough approximation it may turn out to be true, but there is no intrinsic reason for it to be so, and in more modern devices and amplifiers things may become distinctly different. Indeed there is something of a resurgence in class B for high-quality power amplifiers (led by Douglas Self).
A class B output stage on a power amplifier is about as efficient as it gets for the conventional topologies. AB is slightly less so, mostly because the bias current dissipates additional power; other than that they are about the same. However, even class B still dissipates power as heat and is nowhere near 100% efficient. The theoretical maximum is π/4, about 78.5%, and that only at full output with a sine wave; with real programme material the average efficiency is considerably lower. At higher volumes you are losing more and more power to this inefficiency. Hence the nice big heat sinks on high-power amplifiers.
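A back-of-envelope sketch of an ideal class B stage makes the point. The rail voltage and load impedance below are assumed values chosen only to illustrate the shape of the numbers; real devices do somewhat worse.

```python
import math

# Ideal class B stage driving a sine wave into a resistive load.
# Rail voltage and load are assumptions chosen only to make the point.
v_rail = 35.0   # V (assumed)
r_load = 8.0    # ohms (assumed)

for fraction in (0.1, 0.25, 0.5, 0.75, 1.0):
    v_peak = fraction * v_rail
    p_out = v_peak ** 2 / (2 * r_load)                    # power into the load
    p_supply = 2 * v_rail * v_peak / (math.pi * r_load)   # average power from the rails
    p_heat = p_supply - p_out                             # heat in the output devices
    print(f"{fraction:4.0%} of full swing: out = {p_out:5.1f} W, "
          f"heat = {p_heat:5.1f} W, efficiency = {p_out / p_supply:5.1%}")
```

Even in this idealised model the 78.5% figure only appears at the very top of the swing; at ordinary listening levels tens of watts end up in the heat sink.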
Modern amplifiers are swinging more and more to class D. Since a class D amplifier works by switching the output stage fully on and off, with no intermediate levels, there is, in principle, no power loss. (Obviously the output devices are not perfect: there is some loss during the switching transitions, and some loss because the output devices still have some resistance when turned on. However, modern switching devices are remarkably good.) The efficiency of class D can be very good: significantly better than class B, and often so good that only tiny heatsinks are needed.
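As a toy illustration of the switching idea (not of any real modulator design), the sketch below encodes a sine wave as pulse widths against an assumed triangle carrier. The output is only ever at one rail or the other, yet the per-period average, which is what survives the output filter, tracks the audio.

```python
import math

# Toy sketch of the class D principle: the output only ever sits at +V or -V,
# and the audio level is carried in the duty cycle (pulse-width modulation).
# The carrier ratio and step counts are illustrative, not from a real design.
v_rail = 1.0
carrier_periods = 40       # switching periods per signal cycle (real amps use far more)
steps_per_period = 200     # time resolution within one switching period

for k in range(0, carrier_periods, 8):          # sample a few switching periods
    t_centre = (k + 0.5) / carrier_periods      # position within the signal cycle
    target = math.sin(2 * math.pi * t_centre)   # audio level to encode this period
    on_steps = 0
    for n in range(steps_per_period):
        phase = n / steps_per_period
        triangle = 4 * abs(phase - 0.5) - 1     # -1..+1 triangle carrier
        if target > triangle:                   # comparator: output is fully on or off
            on_steps += 1
    average_out = v_rail * (2 * on_steps / steps_per_period - 1)
    print(f"target {target:+.3f}  ->  filtered (average) output {average_out:+.3f}")
```

Because the ideal switch is either fully on (no voltage across it) or fully off (no current through it), it dissipates nothing; real losses come only from the imperfections noted above.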
The design of any MP3 player's output stage is heavily constrained by the very low supply voltages available, often only 3.3 V. For cheap players this is further complicated by the need for the output to serve duty as a headphone driver. iPods have a separate line-level output available on the dock connector, and there is a significant step up in sound quality if this is used, instead of the headphone output, to drive an external amplifier.
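To see how tight that constraint is, here is a rough upper bound on the output power available from a single 3.3 V rail into common headphone impedances. It ignores driver headroom and coupling losses, so real players manage less.

```python
# Rough upper bound on headphone output power from a single 3.3 V supply,
# ignoring driver headroom and output coupling losses (illustrative only).
v_supply = 3.3                       # V, typical portable-player rail
v_peak = v_supply / 2                # output can swing roughly half the rail each way
v_rms = v_peak / 2 ** 0.5

for r_headphone in (16, 32, 300):    # common headphone impedances, ohms
    p_max = v_rms ** 2 / r_headphone
    print(f"{r_headphone:3d} ohm headphones: at most ~{p_max * 1000:.0f} mW")
```

A few tens of milliwatts is fine for earbuds, but it leaves very little headroom for the output stage to work with.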
We can assume that when driving an external amplifier the impedance seen is very high, and thus no power of any significance is delivered by the MP3 player. This is probably the most important part of the answer: the power draw of the MP3 player will be reasonably independent of the volume level it is set at. The internal stages of its output driver may arguably dissipate a tiny bit more when driven harder, but this will be insignificant.
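Some assumed but typical numbers show how little power that is compared with driving headphones; the line-level voltage and input impedance here are illustrative, not measured.

```python
# Power delivered into a typical line-level amplifier input versus headphones.
# The output level and impedances are assumptions, not measured values.
v_rms = 1.0            # V rms, a plausible full-scale line-level output (assumed)

for name, impedance in [("amplifier line input", 10_000), ("32 ohm headphones", 32)]:
    p = v_rms ** 2 / impedance
    print(f"{name}: {p * 1000:.3f} mW delivered")
```

A tenth of a milliwatt into the amplifier's input is lost in the noise of the player's total power budget, which is why the volume setting barely matters to its battery life.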
MP3 players usually use a digital volume control. This can mean that at low volumes the sound is quantised to fewer than 16 bits, which can lead to a remarkable loss of quality. In general, for the best sound quality it is best to turn the player's volume up and turn the input volume on the amp down. This also results in better noise performance. The devil is in the details, but as a first approximation this is a good rule. Gain staging can get complex, and some modern DACs with internal digital volume controls are resistant to loss of resolution over a large part of their range. But it isn't hard to get caught out, and MP3 players are usually not at the vanguard of quality.
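As a simplified illustration (ignoring dither and the cleverer DAC volume schemes just mentioned), each 6 dB of purely digital attenuation throws away roughly one bit of a 16-bit source:

```python
# Effective resolution left after a purely digital volume cut of a 16-bit source.
# Simplified model: roughly one bit is lost per ~6 dB of digital attenuation.
source_bits = 16

for attenuation_db in (0, 12, 24, 36, 48):
    bits_lost = attenuation_db / 6.02            # 20*log10(2) dB per bit
    effective_bits = source_bits - bits_lost
    print(f"{attenuation_db:2d} dB of digital attenuation: "
          f"~{effective_bits:.1f} bits of effective resolution")
```

Run the player near full scale and that resolution is preserved; run it 40-odd dB down and you are effectively listening to a 9- or 10-bit source, which is why the "player up, amp down" rule works.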