Banks rounding down to the nearest cent

This is a plot device I’ve seen in a couple of stories (the movie “Superman III” and the novel “A Stainless Steel Rat Is Born”):

When computing interest, transfers, etc., banks round the amounts down to the nearest cent and keep all the fractions of a cent for themselves. The theory set forth in the above plots is that all the cent fractions together would total in the millions or billions of dollars. In the stories, supervillains and such were somehow able to tap into this source of income that nobody (not even the banks) was aware of.

How much of this is based on fact? Do banks, indeed, round down to the nearest cent? If so, how much money does this represent worldwide?

No idea about the answer to the question, but I did once see a film where a guy gets rich because of this premise, but it was utter crap.

Well, let’s say that a bank can make an average of one-half cent per month on a $10,000 account. That’s only six cents per account per year, so in order to make a billion dollars the bank would have to have something like seventeen billion accounts (this would be a heck of a BIG bank). Interest is often credited on a quarterly or yearly basis, anyway. And one-half cent per month on a $10,000 account would be a terrible return on investment.
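If you want to check that back-of-envelope math, here it is in Python (all figures are the assumptions above, not real bank data):

```python
# Back-of-envelope check of the half-cent-per-account claim above.
skim_per_account_month = 0.005                        # half a cent per account per month
skim_per_account_year = skim_per_account_month * 12   # $0.06 per account per year

target = 1_000_000_000                                # one billion dollars
accounts_needed = target / skim_per_account_year
print(f"Accounts needed: {accounts_needed:,.0f}")     # ~16.7 billion accounts
```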

Not to say that this isn’t done, but I wouldn’t expect this to be a bank’s mainstay of generating capital.

AHA! That’s why they’re all consolidating.
The conspiracy is becoming perfectly clear.
Fnord.

Let’s say you deposit $10,000 in a bank, and they give you 3% interest. The bank makes money by loaning your money out at higher than 3%.

If they give somebody a car loan at 8%, they make 5%.

The day after you deposit the $10,000, your account is theoretically worth $10,000.82191781. The $0.82191781 is one day’s worth of interest.

Except they don’t do it this way. They credit your interest at the end of the month. So (not allowing for compounding), if you deposited your money on the 31st of the previous month, it’s worth $10,025.479452. Right?

Wrong.

If it’s a savings account, the bank takes $25.47 or $25.48 (depending on the bank) from their working capital and just credits it to your account. The fractional cents are simply ignored. Since the money comes out of the bank’s working capital, and it’s rounded off before it comes out, there are never any fractional cents.
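In code, that crediting step looks roughly like this (a sketch using the figures from the example above; Python’s decimal module stands in for whatever the bank’s software actually does):

```python
from decimal import Decimal, ROUND_DOWN, ROUND_HALF_UP

# One month (31 days) of simple interest on $10,000 at 3%, per the example above.
interest = Decimal("10000") * Decimal("0.03") * 31 / 365   # 25.47945205...

# The bank credits a whole number of cents, either truncated or rounded
# depending on the bank; the fraction is never created in the first place.
truncated = interest.quantize(Decimal("0.01"), rounding=ROUND_DOWN)      # 25.47
rounded   = interest.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)   # 25.48
print(truncated, rounded)
```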

If it’s a money market, they take the total amount of interest earned by the money market fund, divide it by the number of shares, and credit it to the accounts. If the numbers don’t work out exactly, they don’t create fractional cents. They just take the remainder, let it ride for another month, and then divide that up as closely as they can without creating fractional cents.
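And the money-market version, carrying the remainder forward (again just a sketch; the share count and interest amount are invented):

```python
from decimal import Decimal, ROUND_DOWN

def distribute(total_interest: Decimal, shares: int, carry: Decimal) -> tuple[Decimal, Decimal]:
    """Split interest per share into whole cents; return (per-share credit, new carry)."""
    pool = total_interest + carry                      # add last month's leftover
    per_share = (pool / shares).quantize(Decimal("0.01"), rounding=ROUND_DOWN)
    new_carry = pool - per_share * shares              # leftover rides to next month
    return per_share, new_carry

# Hypothetical month: $1,000.00 of fund interest across 7 shares.
credit, carry = distribute(Decimal("1000.00"), 7, Decimal("0"))
print(credit, carry)   # 142.85 per share, 0.05 carried to next month
```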

The half pennies are a myth.

I think it’s based on an old programming urban legend.

In COBOL there are two ways to perform division (the capitalized words are the COBOL commands, the lower case words are just names given to fields):

COMPUTE quotient = dividend / divisor [ROUNDED]

or

DIVIDE divisor INTO dividend GIVING quotient REMAINDER extra-stuff

If you use the first method, there is no way to select or store any “remainder”. The machine simply records the single answer that it arrives at. Depending on the particular hardware/operating system/compiler you use, there may or may not be automatic rounding or truncation. (The extra ROUNDED keyword forces that issue.) Generally, when rounding, if you are using a quotient field defined with two decimal positions and you divide 1.00 by 3, you will get the answer .33, while if you divide by 6 you will get .17. This simply follows the basic rule that if the digit after the last significant digit is less than 5, round down; otherwise, round up.

The bank can’t touch the .0033333333333… using the first method because it does not exist. The machine did away with it to satisfy the two-digit decimal answer.

However, looking at the second answer, we find a method to store the remainder. It does not go away. The story goes that the DIVIDE verb preceded the COMPUTE verb when COBOL was developed and some larcenous programmer wrote all his routines using the REMAINDER option, then wrote a routine to accumulate the values in the remainder field and transfer them to his account each evening.
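For what it’s worth, the alleged trick is easy to sketch. Here it is in Python rather than COBOL (purely illustrative, not anyone’s real code; the unit scale and function are made up, with divmod playing the role of DIVIDE ... REMAINDER):

```python
# The legend, sketched in Python. Working in thousandths of a cent,
# so the sub-cent leftover is an integer, like COBOL's REMAINDER field.
UNITS_PER_CENT = 1000   # 1 cent = 1000 thousandths of a cent

skim_account = 0        # where the larcenous programmer accumulates the leftovers

def credit_interest(balance_units: int, rate_per_period: float) -> int:
    """Return whole cents to credit; quietly bank the sub-cent remainder."""
    global skim_account
    interest_units = int(balance_units * rate_per_period)
    cents, remainder = divmod(interest_units, UNITS_PER_CENT)  # like DIVIDE ... REMAINDER
    skim_account += remainder       # the part COMPUTE ... ROUNDED would have discarded
    return cents

# One $10,000 account at 3%/365 daily interest: credits 82 cents,
# leaving 191 thousandths of a cent in the skim account.
print(credit_interest(10_000 * 100 * UNITS_PER_CENT, 0.03 / 365), skim_account)
```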

While grabbing a hundred-thousandth of a cent on various transactions would not amount to even a rounding error on the bank’s books, it could, conceivably, bring in a tidy sum for an individual.

I have never had much faith in the story, myself. It seems more like an example of some new coder asking out loud whether anyone else had thought of it, which led to stories that it had happened.

My problem with the story is that auditors are programmers (usually young ones), and I would expect anyone who tried this to get nailed rather quickly. And of course, few of us can arrange that no other programmer will ever pull maintenance on one of our programs. How long could the code stay there before the guy was caught?

It could have happened, but I have never seen names or dates.


Tom~

I work with a utility company, and the rates per KWH for electricity or per therm for gas are determined to 5 significant digits.

So, for example, electricity might be something like .07012 (or 7.012¢) per KWH, and gas might be .71866 (or 71.866¢) per therm.

When you get your bill, though, it’s all rounded to the closest cent (with half-cents rounded up).

All the other financial software (at least the stuff written in-house) performs similarly.
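In sketch form, the billing step looks like this (Python’s decimal module standing in for our in-house code; the usage figures are invented, the rates are the examples above):

```python
from decimal import Decimal, ROUND_HALF_UP

ELECTRIC_RATE = Decimal("0.07012")   # $/KWH, 5 significant digits
GAS_RATE      = Decimal("0.71866")   # $/therm

def bill(usage: Decimal, rate: Decimal) -> Decimal:
    """Extend at full precision, then round once to the cent (half-cents up)."""
    return (usage * rate).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

print(bill(Decimal("850"), ELECTRIC_RATE))   # 850 KWH -> 59.6020  -> $59.60
print(bill(Decimal("72"), GAS_RATE))         # 72 therms -> 51.74352 -> $51.74
```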

We don’t have any missing half-cents that get transferred to the “bit bucket.”

I believe you will find, if you ask, that most banks compute interest by rounding to the nearest cent. My spreadsheet calculated it this way back when I used to track such things, and it never disagreed with the banks on the interest.

Why not call up B of A and ask? :)

Rounding is the most fair way to do it, of course, and that’s the way most of the stuff I’ve seen is done.

[ hypothesizing ] Let’s not think of a fixed savings account. Let’s think of an active portfolio that has twenty or thirty transactions a month, from dividends coming in and reinvested, etc. Say there are 30 transactions a month on a $100,000 account (that’s a pretty high estimate). Then, if the bank always drops the fractional cents, the average it would earn in a month is around 15 cents… about $1.80 a year. If the bank has 100,000 such accounts, that’s a chunk of change.

Now, we’ve stacked the deck, because of those 30 transactions a month, some wouldn’t involve any rounding at all (for instance, a dividend paid that was already rounded), and some would be money paid out rather than money credited to the account, where dropping the fraction would favor the customer instead of the bank.

But still, it’s not the billions of dollars that these movie plots would have you believe. [ /hypothesizing ]
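Running those hypothetical numbers (same assumptions as in the paragraph above):

```python
# The hypothetical above: 30 transactions/month, average half a cent dropped each.
transactions_per_month = 30
avg_dropped = 0.005                    # dollars lost per transaction, on average

per_account_year = transactions_per_month * avg_dropped * 12   # $1.80 per account
print(f"Per account: ${per_account_year:.2f}/yr")
print(f"Across 100,000 accounts: ${per_account_year * 100_000:,.0f}/yr")  # $180,000
```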