Programmers: please learn when not to use while loops

Fairly lame, and fairly geeky, but I’ve just been reviewing too much bad code lately, and this style’s a particular pet peeve of mine.

While loops have their place. They’re great when you want to keep running until some condition changes. Say you need a particular lock but there’s no “block until it’s yours” mechanism, so you go


while (get_lock() == false)
{
      sleep(10);
}

Or say that you want to keep running until you run out of things to work on in an incoming queue:


while ((job = queue.get()) != NULL)
{
      //do stuff
}

But if you need to initialize your control variable before the loop, test it at the top of each pass, and assign it a new value at the end of the loop body, use a bloody for loop. Going:


value = initial_value;
while (value != 0)
{
     //do stuff
    value = get_next_value();
/* or 
    value = next_value; or
    value = thingy->next_value;
*/
}

makes about as much sense as going:


i = 0;
while (i < 10)
{
    //do stuff
    i++;
}


All programmers seem to know instinctively that the second case reads better as


for (i = 0; i < 10; i++) {
    //do stuff
}


But it’s apparently too big a leap of logic to realize that the first case is cleaner and more readable as


for (value = initial_value; value != 0; value = get_next_value()) {
    //do stuff
}


-lv

Damn I miss programming.

I try so hard to instill good programming habits into my students, and then I catch them doing stupid shit like this:

When x can only ever be 1 or 0

if(x == 1)
{
//do stuff
}

else if(x == 0)
{
//do other stuff
}

else
{
//do something else entirely
}

Administration thinks that deducting ‘stupid fuckhead’ points is unfair.

In C++, x can sometimes end up equal to something unexpected. The last else would catch this, and maybe display a message: “er, x is not 1 or 0. Something’s gone tits up.”

The “for” loop version and the “while” loop version do have different semantics so it’s possible that the programmer genuinely needed the while loop.

But given the nature of most programmers, they probably just missed the obvious and wrote unnecessarily obscured code.


switch(x)
{
    case 0:
      // do stuff
      break;
    case 1:
      // do other stuff
      break;
    default:
      // commence screaming.
      break;
}


You can say “if this equals something, then do this, otherwise do that”. You can say “if this equals something, then do this, otherwise if this other thing equals something else, do that, otherwise do a third thing”. But if you’re saying “if this equals something, do this; if it equals something else, do that; otherwise do the other thing”, use a switch statement.

-lv

Or even:



assert(x == 1 || x == 0);
if(x == 1) {
    //do stuff
} else {
    //do other stuff
}


I’d find that the clearest way to express “x should always be 0 or 1 here”…

I’m talking about an instance where x is a variable that cannot ever be anything other than a 1 or a 0.

Of course I teach them how to deal with instances in which they have a variable, like y, which could be 1 for ‘do something’, 0 for ‘do other’, or some other value entirely for ‘an error happened.’

But in cases where, for example I’m having them do a modulo division of an integer by 2, the answer can only be 1 or 0. There is no other case, and yet they still write unnecessary code.
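
Something like this, say (just a sketch; n here is a stand-in for whatever int they’ve already verified to be non-negative):


if (n % 2 == 1)
{
    //n is odd: do something
}
else
{
    //n is even: do other stuff
}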

What? In the code fragment I gave, both loops initialize, test, do stuff, and then advance the condition variable, in that order, repeating until the test fails. They (should) compile to the exact same set of opcodes.

Granted, a “continue/break” statement would work differently in the while case (it wouldn’t increment the condition variable), and I’ll assume that’s what you were referring to, but that’s not an issue in the vast majority of programs.

-lv

I am too rusty to really know what the hell I am talking about, but in my experience it was often convenient to put strange code in temporarily to debug the program.

>>I’m talking about an instance where x is a variable that cannot ever be anything other than a 1 or a 0.

huh. i take it you haven’t maintained much code in the real world, then?

the RISKS digests are full of stories about code that assumed variables would always be in a certain range, only to have later modifications violate those boundary conditions and send rockets spiraling out of control.

from a performance standpoint, the extra else costs nothing, and as pointed out it’s a useful catch for when (not if) something changes later on.

while i won’t argue that your students write great code, you certainly aren’t teaching them how to write maintainable software.

catsix, in that case I’d suggest making your students write a test plan for their code that ensures 100% code coverage. Not only will I then have a chance of hiring a junior programmer that knows how to write a bloody test plan (since that aspect of coding seems to not be covered in college), but they’ll have an incentive to not add unneeded code.
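
Even a bare-bones version would do the trick. A minimal sketch (is_odd() is a hypothetical helper, purely for illustration):


#include <assert.h>

/* hypothetical helper under test; n assumed non-negative */
static int is_odd(int n)
{
    return n % 2;
}

int main(void)
{
    /* test plan: one case per branch, even and odd */
    assert(is_odd(0) == 0);
    assert(is_odd(7) == 1);
    /* no input can ever reach a "neither 0 nor 1" branch,
       so writing one just leaves untestable dead code */
    return 0;
}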

-lv



i = 0;
while (i < 10)
{
    //do stuff
    i++;
}


I just can’t get worked up over this. Sure, it’s shorter if you do it as a for loop, but they’re equivalent, and both clear, so what’s the big deal?

“break” behaves the same, I was thinking of “continue”.

No, it won’t crop up very often, but the fact that it could crop up means you have to pay that much more attention to the “while” loop when reading the code. Which makes using “while” when “for” will do that much lamer…

(Personally, I’d rather see neither–



(loop for i from 0 below 10 do
  (something))
; or
(loop for item in some-list do
  (something))


look far nicer to my eyes ;))

Actually, I have thanks.

I’ve dealt with massive piles of shit code left behind by other programmers who tested for everything under the sun when a goddamn int modulo 2 cannot possibly result in any answer other than 1 or 0, especially in a program that verifies the data before the modulus is performed.

And this is a rather large problem with programmers today. Extra code doesn’t cost anything, so they throw it in wherever, in the most haphazard manner, not knowing whether it actually covers their asses or does absolutely nothing, thereby constantly increasing the size of programs.

I teach them to write concise, efficient code that is not such a mess that it becomes an unmanageable chore to figure out just why the hell a program contains an entire function that’s never called anywhere, by another function or in main. You ever tried to maintain code where people left in absolutely everything they typed?

Yeah, they tend to hate the fact that they fail projects for such things. They of course also think there is no actual need to ever begin any kind of plan before writing a program, which is a belief I cure as quickly as possible. Shit, lots of them will actually try to begin coding before they even understand what the hell problem they’re being asked to solve.

For sufficiently large sizes of “//do stuff” they’re not equally clear.

The first line of the “for” loop tells you instantly that you’re iterating over some sequence. The “while” loop may require you to read through a significant chunk of code to figure that out. And as I mentioned previously, they have different semantics under some circumstances, so the “while” loop might require more careful analysis.

The while case is less clear. It leads to infinite loops like:



i = 0;
while (i < 10)
{
    //do stuff
    if ((something_else == SOME_SPECIAL_CASE) && (i == 8))
    {
         //We don't want to do "other stuff" for this special case
         continue;
    }
    //do two screenfuls of other stuff
    i++;
}
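
Write the same thing as a “for” and that trap goes away, since “continue” still jumps to the increment expression (same hypothetical fragment, just restructured):


for (i = 0; i < 10; i++)
{
    //do stuff
    if ((something_else == SOME_SPECIAL_CASE) && (i == 8))
    {
         //We don't want to do "other stuff" for this special case;
         //continue still runs i++, so no infinite loop
         continue;
    }
    //do two screenfuls of other stuff
}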


That said, I believe I did mention that this was a lame rant, right?

-lv

I agree with the OP, but disagree with catsix. Good programming requires both good standard style and protection from the unexpected; having apparently impossible else clauses that lead to error handling is of great importance in good programming. These ensure that when your code is misused or incorrectly updated, it will give meaningful messages that allow the users of your code to fix the problems they create. Also, such code is a godsend if there is a memory leak from another application into your own.

Hi, did I talk about a program that interacts with any other application?

I talked about modding a freaking integer by two. If you check to make sure the number you’re going to mod by 2 is an integer already, what other results are you going to get from that particular division?

I’m with you catsix.

I see useless, redundant idioms all the time. I spend a lot of time wearing an “Oracle DBA” hat, and one of my pet peeves is when I see people doing stuff like:



create table foo(a number);
commit;
select * from foo;
commit;
drop table foo;
commit;


No matter how many times I tell people that there’s no point in committing after a DDL operation, they still do it. And regardless of the fact that doing so completely defeats the purpose of transactions, they commit after every single fricking statement. It’s almost like people insist on not “getting” transactions (kind of like how many people insist on not “getting” relational databases in general…)

Sometimes, it’s the little things that grate for no rational reason. :)