One bit is equal to* an energy of kT ln 2 joules (k = Boltzmann’s constant, T = temperature), or about 2.9×10[sup]-21[/sup] J at room temperature (25°C). An ASCII character is coded in 7 bits, meaning a 100 character email (quite terse) contains 700 bits of information (in just the text; this excludes things like the header and whatever else is present in the file), equal to about 2×10[sup]-18[/sup] J, which by mass-energy equivalence works out to roughly 2.225×10[sup]-35[/sup] kg. Thus, to get a ton, you’d need about 4.5×10[sup]37[/sup] emails.
*‘Equal to’ in the sense that this is the minimum energy needed to store (strictly speaking, to erase) a bit, per Landauer’s principle; an ordinary computer uses far more than that, of course.
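For anyone who wants to check the arithmetic, here’s a minimal sketch in Python. It just reproduces the numbers above; the 100-character, 7-bits-per-character email is the same assumption as in the text.

[code]
from math import log

# Physical constants (SI units)
k = 1.380649e-23   # Boltzmann's constant, J/K
c = 2.998e8        # speed of light, m/s

T = 298.15         # room temperature (25 °C) in kelvin

# Landauer limit: minimum energy per bit
e_bit = k * T * log(2)
print(f"Energy per bit:   {e_bit:.3g} J")      # ~2.9e-21 J

# A 100-character email at 7 bits per ASCII character
bits = 100 * 7
e_email = bits * e_bit
print(f"Energy per email: {e_email:.3g} J")    # ~2e-18 J

# Mass-energy equivalence: m = E / c^2
m_email = e_email / c**2
print(f"Mass per email:   {m_email:.3g} kg")   # ~2.2e-35 kg

# Emails needed to reach one metric ton (1000 kg)
print(f"Emails per ton:   {1000 / m_email:.3g}")  # ~4.5e37
[/code]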
But during transmission, while it’s traveling through the fiber, the email itself (that is, the information) is being carried in the form of photons.
You wisely noted that this is a minimum; in practice, the true energy usage depends on how new your computer is. Each transistor in a digital circuit leaks a little, and that leakage is current, i.e., electrons. In the good old days, around 1990, an ASIC chip might leak a couple of microamps with the clock turned off. However, the newer process technologies that increase speed also increase leakage, so today that number can be in the amp range. So reading email on a new computer takes more electrons than doing it on an old one, and far more than an information-theoretic accounting would suggest.
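To put that in perspective, here’s a rough back-of-the-envelope sketch. The 1 A leakage, 1 V supply, and 1 s reading time are illustrative assumptions, not measurements of any particular chip:

[code]
# All figures below are assumptions for illustration only
leakage_current = 1.0   # A -- modern process, "amp range" as above
supply_voltage = 1.0    # V -- a typical core voltage
read_time = 1.0         # s -- time spent displaying the email

# Energy dissipated by leakage alone: E = I * V * t
e_leakage = leakage_current * supply_voltage * read_time

# The 700-bit information-theoretic figure from above
e_landauer = 2e-18  # J

print(f"Leakage energy: {e_leakage:.3g} J")
print(f"Ratio to Landauer minimum: {e_leakage / e_landauer:.3g}")
# ~5e17 -- many orders of magnitude above the theoretical bound
[/code]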