iTunes song file (AAC format) question

Owing to inattentiveness on my part, I have two copies of “Here’s Looking at You, Kid” by The Gaslight Anthem in my iTunes library. One is purchased through the iTunes Store, and the other is (I believe) ripped from a CD. There is a 6-second difference in the track length, with the purchased song clocking in at 3:42 and the ripped song at 3:36.

However, the file sizes of the two songs are extremely different - the ripped version is 3.3 MB, which is about average for my library, whereas the purchased file is a massive 7.2 MB!

As I said, the only two differences between them are A) the track length, a difference of 6 seconds, and B) the fact that one was purchased from iTunes and the other was ripped from a CD.

So what’s up with this? Six seconds of music (well, silence really) isn’t enough to more than double a file size, is it? Does iTunes ‘bloat up’ the file for some reason?

It’s likely a difference in bit rate between the settings you used to rip your CD and the settings Apple used to encode the purchased AAC. Especially if you bought the “iTunes Plus” version (or whatever it’s called), it has a much higher bit rate (and therefore a larger file size). Also, did you rip your CD to MP3? That might be more compressed than AAC, I don’t know.

What are the respective bitrates of the purchased track and the ripped track?

This is true almost by definition - bitrate is a measure of data used per second of content, so the question really translates into ‘why are the bitrates of these files so different?’

3.3 MB over 3:36 works out to around 128 kbps, which is decent for MP3 but hardly great quality. 7.2 MB over 3:42 is over 256 kbps, which is generous for a CD rip but reasonable for a digital purchase, where you can’t go back to the original CD later for a higher-quality copy.
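
If you want to sanity-check those numbers, the arithmetic is simple enough to script. Here’s a quick Python sketch (my own, not from anyone in the thread); it assumes iTunes reports sizes in binary megabytes and ignores tag/artwork overhead, so treat the results as approximate:

def avg_bitrate_kbps(size_mb, minutes, seconds):
    # file size in bits, divided by duration in seconds, in kilobits
    bits = size_mb * 1024 * 1024 * 8
    return bits / (minutes * 60 + seconds) / 1000

print(avg_bitrate_kbps(3.3, 3, 36))  # ripped copy    -> ~128 kbps
print(avg_bitrate_kbps(7.2, 3, 42))  # purchased copy -> ~272 kbps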

So - the iTunes version is better quality. It uses twice the space to preserve more of the fine detail of the original recording. You may or may not be able to hear the difference, even listening closely.

Hope that explanation helps.

ETA: These bitrate calculations are averages over the whole file - if either file uses variable bit rate (VBR), it may report a different nominal number.

AAC is not a proprietary codec owned by Apple or iTunes; they just popularized it. Nero and others make AAC codecs (.mp4, .m4a, and .m4b are all the same MPEG-4 container).

An MP3 tagging tool will show you bitrates for MP4 files as well as MP3s and other formats.
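
If you’d rather script it than eyeball a tagger, here’s a minimal Python sketch using the mutagen library (my choice of tool, and the filenames are hypothetical) that reads both values straight from the file headers:

from mutagen import File  # pip install mutagen

for path in ("ripped.m4a", "purchased.m4a"):  # hypothetical filenames
    audio = File(path)  # works for MP3, M4A, and most other tagged formats
    print(path, round(audio.info.length), "seconds,",
          audio.info.bitrate // 1000, "kbps")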

I just did a couple of tests.

Olivia Newton-John’s “Love And Let Live”, EAC rip at 100% quality:

WAV file (1411 kbps) 3:27
Using Nero AAC codec (vbr 192) 3:27

WAV file (1411 kbps) 3:29
Using iTunes AAC codec (vbr 192) 3:29
Using iTunes AAC codec (cbr 64) 3:29

WAV file (1411 kbps) 3:28
Using dBpoweramp and Nero’s AAC codec (vbr 128) 3:28
Using dBpoweramp and Nero’s AAC codec (cbr 382) 3:28

WAV file (1411 kbps) 3:27
Using LAME MP3 encoder (vbr 320) 3:27
Using LAME MP3 encoder (vbr 128) 3:27

Using dBpoweramp and LAME’s MP3 codec (vbr 192) 3:27
Using dBpoweramp and Monkey’s Audio codec (vbr 192) 3:27

WAV file (1411 kbps) 3:27
WMA Lossless (893 kbps) 3:27
WMA (cbr 192) 3:27

It seems to me the bit rate has nothing to do with the track length. The original WAV files measure anywhere from 3:27 to 3:29.

Now here’s the interesting part. I looked at my Greatest Hits CDs by Olivia Newton-John, Heart, and Madonna.

The Greatest Hits tracks are not always the same length as on the original CDs. And here’s even more interesting info: I have Madonna’s “True Blue” CD and the digitally remastered version of “True Blue,” and the track times are different. Some tracks on the original “True Blue” CD are longer, and some are shorter, than the digitally remastered versions. (I am referring to the unconverted WAV files.)

So it doesn’t seem to me that it’s the bit rate, but rather which source the WAV file was recorded from.

And as you can see, iTunes measured my WAV as 2 seconds longer than Windows did, and dBpoweramp reported varying times for the same WAV.

So the purchased song was probably taken from one album and converted, and that source was different from your CD.

While all of that is very interesting, I think the main question was not about the small difference in duration but about the large difference in file size.

Just a note that you can choose the bit rate for songs you add to your iTunes library (i.e. 128 kbps vs. 256 kbps), but anything you buy from the iTunes Store will come in at 256 kbps (you can make your own 128 kbps AAC copy from it).

I remain unconvinced that 256 kbps sounds better than 128 kbps. And a lot depends on the sound quality of the recording to begin with. I could encode some MC5 live recordings I have at 1411 kbps and they’d still sound like the last moments at Jonestown. It’s like taking photos of a blurry picture - you can use the best equipment possible, but a sharp photo of blur is still blur.

And yes, a 256 kbps file uses about twice the space of a 128 kbps one.
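
You can run the relationship the other way, too - predict the file size from bitrate and duration. Another small Python sketch (real files come out slightly larger because of tags and artwork):

def size_mb(kbps, seconds):
    # kilobits per second -> bytes -> binary megabytes
    return kbps * 1000 * seconds / 8 / (1024 * 1024)

for kbps in (128, 256):
    print(kbps, "kbps for 3:42 ->", round(size_mb(kbps, 222), 1), "MB")
# 128 kbps -> 3.4 MB, 256 kbps -> 6.8 MB, right in line with the two files above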

MrSquishy, you’re exactly right. The track with the larger file size has a bit rate of 256 kbps, and the smaller file is 128 kbps.

If you listen to the two files in rapid succession, can you hear a difference?

Or, if you listen to both files simultaneously (for a 384 kbps bit rate) does it sound better than either file by itself?

In this case the blur might not be in the original recording - it might be in the performance. I’ve heard me some MC5.

My own ears can distinguish the difference up to about 192 kbps. 128 kbps sounds a bit warbly or “sandpapery” or something in certain frequencies. I personally encode everything at 192 kbps as a balance between sound quality and file size. Anything higher, and I really can’t tell the difference.