Calculating required network capacity

I swear this isn’t a homework assignment!

I am studying for a comprehensive exam in an MIS program. There seem to be a number of questions that go along these lines: given, say, 3000 sites with 10 computers each, where each computer submits 10 surveys per hour at 50 KB per survey, what network capacity is required?

Is this simply:
3000 sites x
10 computers x
10 surveys/hr x
50 KB per survey x
8 bits/byte x
1.1 (for ten percent overhead) = 132,000,000,000 bits (= 132 Gbit) per hour
(or, from each site's perspective, 44 Mbit/hr)?
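
To sanity-check my own arithmetic, here is the same product as a quick Python sketch (the variable names are mine, and I'm assuming decimal units, i.e. 1 KB = 1,000 bytes):

    # Hourly data volume: sites x computers x surveys x size x bits, plus overhead
    sites = 3000
    computers_per_site = 10
    surveys_per_hour = 10          # per computer
    survey_size_bytes = 50_000     # 50 KB, assuming 1 KB = 1,000 bytes
    overhead = 1.10                # 10% protocol overhead

    bits_per_hour = (sites * computers_per_site * surveys_per_hour
                     * survey_size_bytes * 8 * overhead)
    print(f"Aggregate: {bits_per_hour / 1e9:.0f} Gbit/hr")           # 132 Gbit/hr
    print(f"Per site:  {bits_per_hour / sites / 1e6:.0f} Mbit/hr")   # 44 Mbit/hr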

The next part of the question is:

Is it then simply:
132 Gbit/hr divided by
3600 sec/hr ≈ 36.7 Mbps
(or, from each site's perspective, 44 Mbit/hr ≈ 12.2 Kbps)?
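
And the conversion step, continuing the same sketch (kept self-contained by starting again from the 132 Gbit/hr figure):

    # Convert the hourly volume to an average sustained rate
    bits_per_hour = 132e9          # aggregate volume from the first step
    sites = 3000
    seconds_per_hour = 3600

    aggregate_bps = bits_per_hour / seconds_per_hour
    per_site_bps = aggregate_bps / sites
    print(f"Aggregate: {aggregate_bps / 1e6:.1f} Mbps")   # ~36.7 Mbps
    print(f"Per site:  {per_site_bps / 1e3:.1f} Kbps")    # ~12.2 Kbps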

I could also throw in a fudge factor for the non-uniform flow of data: say, an extra 25 or 50% over the required throughput, to arrive at an acceptable bandwidth.
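
For example (the 25% and 50% margins here are just my fudge factors, not anything from the question):

    # Add headroom over the average sustained rate for bursty traffic
    sustained_mbps = 36.7
    for margin in (0.25, 0.50):
        print(f"+{margin:.0%} headroom: {sustained_mbps * (1 + margin):.1f} Mbps")
    # +25% headroom: ~45.9 Mbps
    # +50% headroom: ~55.1 Mbps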

BTW, I realize “bandwidth” is being misused here to mean “data throughput capacity,” which is not, strictly speaking, correct: bandwidth is measured in Hz, not bps. Unless they want me to do something like assume a bit rate and compute the actual required bandwidth of the medium, which I sort of doubt.

Thanks!