What causes the hamsters to work hardest?

For example, if I post something that’s 100 characters, then realize I forgot something and make another 100-character post, does that strain the hamsters more than a single 200-character post? I would think it does, because each post has to be counted toward my post total and indexed for searching, but I really don’t know. Also, when I preview a post, it sends the information to the hamsters so they can process any codes I used, then shows me what it would look like. Then when I submit, they process the codes all over again. How much extra bandwidth does that use up? And what about searching? I assume that is the worst of all.

O, dopers, please educate me in the ways of the industrious, yet presumably cute, hamsters, that I may ease their toil.

I don’t know about your other questions, but yes, searching does put the greatest strain on the hamsters. So we ask that people please not search unless it’s necessary. Vanity searches are almost never necessary.
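
To put a rough shape on that, here is a toy Python sketch (invented for illustration, not the board’s actual code): a search has to look through every post ever made, while most other actions only touch a handful of rows.

```python
# Toy illustration of why searching scales so badly compared with posting.
posts = [
    {"id": 1, "text": "Why do the hamsters squeak?"},
    {"id": 2, "text": "Another thread about the hamsters."},
    # ...imagine a few million more rows here.
]

def naive_search(term):
    # Work grows with the total number of posts on the board.
    return [p for p in posts if term.lower() in p["text"].lower()]

def add_post(text):
    # Work stays roughly constant: one new row, however big the board is.
    posts.append({"id": len(posts) + 1, "text": text})

print(naive_search("hamsters"))  # has to scan every row to find its matches
```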

You are correct. A single large post takes less CPU time and disk access than multiple smaller posts containing the same information.
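
As a rough sketch of why (all the names below are invented, not the real board software): most of a post’s cost is fixed bookkeeping, a row write, a post-count update, and a search-index update, so splitting one post into two does that bookkeeping twice.

```python
class FakeDB:
    """Stand-in database, just to make the sketch runnable."""
    def __init__(self):
        self.posts = []
        self.post_counts = {}
        self.search_index = set()

    def insert_post(self, user, text):
        self.posts.append((user, text))

    def bump_post_count(self, user):
        self.post_counts[user] = self.post_counts.get(user, 0) + 1

    def index_for_search(self, text):
        self.search_index.update(text.lower().split())


def submit_post(db, user, text):
    # Fixed bookkeeping that happens once per post, regardless of its length.
    db.insert_post(user, text)
    db.bump_post_count(user)
    db.index_for_search(text)


db = FakeDB()
# Two 100-character posts: the bookkeeping runs twice...
submit_post(db, "me", "first half of what I meant to say")
submit_post(db, "me", "the part I forgot")
# ...where one 200-character post would have run it once.
```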

That’s hard to answer from a quantitative standpoint. Suffice it to say that multiple “Previews” do use bandwidth and CPU time to parse the code and return the PHP-generated page, but since the post is not submitted to the database, a Preview doesn’t hit the server as hard as a “Submit” does. There is some interest in offline post composers, and I almost wrote one once as freeware, but decided that no one really cared enough to justify the effort it would take.
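
A minimal sketch of that difference, with a toy parser standing in for the real code handling: Preview does only the parsing and page-building work, while Submit repeats the parsing and then also writes to the database.

```python
import re

def render_codes(text):
    # Toy stand-in for the real code parsing: handles only [b]...[/b].
    return re.sub(r"\[b\](.*?)\[/b\]", r"<b>\1</b>", text)

def preview(text):
    # Preview: parse the codes and build the page, but write nothing.
    return render_codes(text)

def submit(text, database):
    # Submit: the same parsing work, plus the database write,
    # which is why it hits the server harder than a Preview does.
    html = render_codes(text)
    database.append(text)
    return html

database = []
print(preview("[b]Did I spell that right?[/b]"))   # parsing only
print(submit("[b]Final version.[/b]", database))   # parsing plus a write
```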

Yes.