A C programmer's rant -- or -- How to get the current year in 8 easy steps

(This is a mild, geeky rant and probably won’t amuse you much if you’re not a software developer.)

So then. Some other poor saps at work and I have been tasked with migrating this large, 18-year-old software product, written primarily in C, to a modern platform. For the most part it must remain as it is, in C. The people who wrote this old code are long gone, and I hope for their sakes — after my experience with their handiwork — that they’ve left the profession entirely.

I came across this specimen a few days ago. As you can see, it is a handy function for returning the current year as an integer. I’ve added numbered comments for reference.



int get_current_year(void)
{
       time_t  current_time;
       char    time_string[26], current_year[4];
       int     totlen, pos, i, int_year;

/*1*/  time(&current_time);
/*2*/  current_time = time(NULL);

/*3*/  strcpy(time_string, ctime(&current_time));
/*4*/  totlen = strlen(time_string);
/*5*/  pos = (totlen - 5);

/*6*/  for (i = 0; i < 4; i++)
       {
         current_year[i] = time_string[pos+i];
       }

/*7*/  int_year = atoi(current_year);
/*8*/  return (int_year);
}


Works like a charm too. So far, anyway. In case your C is a little rusty, allow me to reveal the magician’s secrets:
1. Get the current time as an epoch value: the number of seconds passed since January 1, 1970.
2. Get the current time as an ep — hey, didn’t we just do this?
3. Convert the epoch time to a date-time string, and make a local copy of that string.
4. Get the string’s length, even though C tells us it’s going to be exactly 26 characters, pretty much always. Still, you can’t be too sure of anything in this crazy, mixed-up world of ours, so better go find out.
5. Get the position in the string where the four-digit year starts. Use needless parentheses for flair.
6. Copy the year’s digits to make another string. Is this new string properly terminated, for what is to follow? Our author, ever the optimist, apparently had no doubts.
7. Scan the four-digit string to get its integer value. Store the integer in case we need it later.
8. Hey, where’s that integer? We need it! Return it to the caller, who’s probably wondering where we’ve been all day.

It would be hard to improve on this code — assuming its purpose is to completely crush the spirit of all who read it. If on the other hand its purpose is to return the current year as an integer, which would be the conventional interpretation I suppose, then I’d like to have a crack at the problem myself. You know, just a few tweaks maybe. Just a little polishing.



int get_current_year(void)
{
  time_t  current_time = time(NULL);

  return localtime(&current_time)->tm_year + 1900;
}


Hope you don’t mind, old programmer, wherever you are. I did keep the name at least.

A thing of beauty. They’d appreciate it at TheDailyWTF.com, I’m sure.

This is pretty much going to be a programmer’s thread, so I’ll include some of my horror stories as they become unrepressed.

Back when Windows 95 was new I was contracted to modify a legacy app that was even older (Clipper, Summer ’87). *shudder*

Far enough back that I have blissfully forgotten the details, but I was replacing thousand-line procedures with 10-20 lines of code. I specifically remember tossing code that used filter commands to locate one record in a 100k-record database, replacing it with simple find commands that jumped straight to the record via an index that already existed! 10 minutes of execution accomplished in fractions of a second.

Documented code? Right. The original programmer was a charter member of the “If it’s hard to write, it should be hard to read” club. Much of the time I’d just look at the input and output and figure out what he was trying to accomplish.

He was family to the clients and had expired of a sudden heart attack, so I had to use a great deal of diplomacy explaining my improvements.

When I was done their code base was less than a third of what I started with and was orders of magnitude faster.

Not to defend the code in the OP, but the C functions for getting the time are needlessly convoluted anyway. Instead of being able to just get the current year directly, I have to first manually allocate an object. Then, I have to call a function which outputs the same data in two redundant ways (both returning it directly onto the stack and storing it in the location referenced by the pointer it is passed, unless I indicate to skip this second step by passing that special non-referencing pointer called NULL), one of which I will almost certainly elect to ignore/disable, while the other is directed to the object I just allocated. Then, I call a further function which needs to be passed a pointer to this object (again), and which returns a pointer to a new object. But this new object is statically allocated, so if I ever call this function (or other related ones) again, watch out; that data I just got a reference to gets overwritten without warning. Finally, even when dereferencing the appropriate field from this structure, I still do not directly get the year; rather, I get the year in terms of an offset from the year 1900, which I must manually shift back into the representation I care about.
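To make the indictment concrete, here is the whole dance in one place (a minimal sketch of exactly the sequence just described; the function name is mine):


#include <time.h>

int current_year_the_long_way(void)
{
    /* First, manually allocate the object. */
    time_t now;

    /* time() reports the result twice over: as the return value and
       through the pointer. Passing NULL disables the second channel. */
    now = time(NULL);

    /* localtime() hands back a pointer to a statically allocated
       struct tm, which a later call to localtime() or gmtime()
       silently overwrites. */
    struct tm *t = localtime(&now);

    /* Even then, tm_year is an offset from 1900, not the year. */
    return t->tm_year + 1900;
}
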

Anyone who doesn’t respond to that with “What a mess!” has been blinded by familiarity.

Time-related APIs are to programmers what printers are to IT dudes: always unnecessarily large and convoluted to work with. Why? I have no idea.


I had a project working on some code that broke up Japanese text into its syntactic elements. It was written in Visual Basic as a web page and I had to convert it to C.

It didn’t rely on any sort of clever data structures or paradigms that modeled real language and the guy who wrote it wasn’t really all that hot a programmer to begin with. He pretty much just broke the language up using common sense and brute force.

While I can respect that this worked perfectly well (possibly better than something which used clever data structures and language-modeling paradigms, since those presume that language speakers will follow something like rules), it meant that the code was quite large, and functioned like an old goto/spaghetti QBasic program. Every function was a switch with 5-10 possibilities, and for each possibility it called another function that was itself principally just a switch statement. And there were a good 20 functions in each file, for a total of something like 40 files.

Laboriously I port this thing: inventing my own self-resizing string type in C, and all the necessary string functions thereof, since he was overrunning all the buffer sizes he said I’d need for everything; going over every single line of code several times to make sure it lines up exactly with his code (which he was modifying daily as I continued the port); and finally getting it working.

So now all the guys in management, they start coming up to me all rubbing their hands together sort of hushed and nervous like, “So…you understand how it works now? The code?”

FUCK no. I could port it another 80 times and spend a year tracing each line in a debugger and I’d still have no idea what was going on in that code. I might be a better programmer than the guy who wrote this mess by orders of magnitude, I might be able to make it go faster by a few hundred times, but you’re stuck with him and him alone for the rest of eternity or until you create or purchase a different language parser, cause aaaain’t nobody that’s going to take that spaghetti apart.

At least he had the decency to write in actual C. I have seen this (in a program I fortunately did not have to support):



#define BEGIN {
#define END }
#define AND &&
. . .

I suppose someone was happier writing pseudo BASIC.

This looks like the pseudo-Algol the original Bourne shell was written in. Since that original implementation (the Bourne shell syntax has been re-implemented multiple times, and often expanded in the process) did mildly unholy things to allocate memory, it often crashed and had to be debugged by people who weren’t fans of the Algol-esque macrology, making it infamous in Unix-oriented C circles.
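For the curious, Bourne’s infamous mac.h went well beyond BEGIN and END. Quoting from memory, so the details may be off, it contained definitions along these lines:


/* Approximate reconstruction of Bourne's mac.h, from memory */
#define IF      if (
#define THEN    ) {
#define ELSE    } else {
#define ELIF    } else if (
#define FI      ; }
#define WHILE   while (
#define DO      ) {
#define OD      ; }
#define LOOP    for (;;) {
#define POOL    }
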

  • 1900?
    What if your system is running pre-1900?
    Huh?
    Huh?

6 years ago I was primarily writing C++. Now I can barely read the code in the OP.

Oh, I’ve got a million of these, most from the past year.


typedef int int32;
typedef long long32;

One can only presume that long32 is a type for the indecisive (or perhaps those who subscribe to the Forrest Gump philosophy of writing code).
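(Assuming fixed-width types were what the author of those typedefs was after, C99’s <stdint.h> later made the whole genre unnecessary; a quick sketch:)


#include <stdint.h>

/* Exact-width type: presumably what both int32 and long32 were after. */
int32_t definitely_32_bits;
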


raii_mutex_grab raii(the_mutex);
//do things with the mutex
delete &raii;
//do things while not holding the mutex

I’m still in awe that there’s a programmer out there who understands raii but doesn’t understand the difference between the stack and the heap.


if((int)ptr == NULL) { //...

Why, God? Why?


struct bar {
   //.. members go here
}; 

static int foo[FOO_SIZE];

//...

for(i = 0; i < FOO_SIZE; i++)
    foo[i] = (int)malloc(sizeof(struct bar));

//...

manipulate_bar((struct bar*)foo[index]);

//...
more_stuff_with_bars((struct bar*)foo[index]);

//...
get_the_point_yet((struct bar*)foo[index]);

I fixed it in the obvious way, of course: s/int/uintptr_t/ :smack:

Fortunately, one of the more senior people noticed and actually fixed it.
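Presumably the real fix looked something like the sketch below (names and sizes assumed, not the actual patch): store the pointers as pointers, and the casts disappear along with the word-size bug.


#include <stdlib.h>

#define FOO_SIZE 100   /* assumed; the original value isn't shown */

struct bar {
    /* .. members go here */
    int placeholder;   /* stand-in member so the sketch compiles */
};

/* Pointers stored as pointers: no casts, no width assumptions
   to break on x86-64. */
static struct bar *foo[FOO_SIZE];

void init_foo(void)
{
    int i;
    for (i = 0; i < FOO_SIZE; i++)
        foo[i] = malloc(sizeof(struct bar));
}
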
The worst I have ever seen, though, happened after I was several weeks into a project to port one of our applications to x86-64. There were the usual errors in our code, but we also had some vendor code that we linked against that was the most god awful code I’ve ever seen. After a couple weeks of fixing 64-bit cleanliness issues you sort of go into a fugue state and start fixing things by rote. When I saw about 6500 of the following errors, I knew that madness had finally taken me:

I’m currently helping with C labs for first year CS students who have only used things like Java and Haskell. The exercise, unsurprisingly, focuses on pointers and the manipulation thereof. It’s a basic dictionary exercise. The (unchangeable) header file they must all use contains:


typedef struct dictionary *dict;
typedef char word[MAXWORDSIZE+1];

struct dictionary {
    dict left, right;
    word somedata;
};


Now, I’m no great fan of excessive Hungarian notation, but what they’ve done there right at the start is typedef away any syntactic indication that these “dict” things that we’ll use all over the place are in fact pointers. It’s like they’re deliberately trying to ingrain shit habits into innocent first years. Every student I’ve sat down with I’ve had to say, “you see this here? Do not do this.”
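If the header were changeable, a hypothetical rewrite that keeps the pointer in plain sight might look like this:


#define MAXWORDSIZE 63   /* assumed; defined elsewhere in the real header */

typedef char word[MAXWORDSIZE + 1];

struct dictionary {
    struct dictionary *left, *right;   /* visibly pointers at every use site */
    word somedata;
};
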

Not nearly as spectacular as some of the examples in this thread, but I’m just a bit horrified to see it foisted on students at a top three UK university.

I recently had to write a C function that returned the number of the current day of the year. It’s sort of a pain in the ass. In RealBasic (where I wrote a companion program) you just say day.DayOfYear.

There’s something to be said for rapid prototyping environments.

The ‘struct tm’ structure that’s filled in by functions like ‘localtime’ and ‘gmtime’ includes a day-of-year field. So I think you might have re-invented the wheel here — unless you were forbidden to use the standard library?
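Something like this sketch, assuming the standard library is available (tm_yday is zero-based, hence the + 1):


#include <time.h>

/* Day of the year, 1-366, via the standard library. */
int get_day_of_year(void)
{
    time_t now = time(NULL);
    return localtime(&now)->tm_yday + 1;
}
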

Agreed. On the other hand, the DailyWTF site that friedo mentions shows that every language attracts people who are determined to use it ineptly.

I’m not sure Python or Ruby, say, had they been around in the early 90s, would have helped the guy whose work I’m having to deal with now.

Fortunately the C library designers anticipated time travel, and were foresighted enough to make the ‘tm_year’ field a signed integer. So you’re good to go as long as you don’t stray too far. Plus or minus 32,767 years is certainly safe.

Unfortunately, I was writing this for an embedded controller, and the compiler (Dynamic C) uses an odd-ball subset of the standard C libraries. So that feature wasn’t available.

People often seem to assume that things won’t exist in the language libraries, and just grab the closest function they see and run with it.

I agree in general - that’s a pretty poor piece of code, obviously written by someone who doesn’t understand the time functions. However, there are a couple of things you pick up on which I don’t think are all that bad.

The alternative is to code in some hard constants, which leaves us with a function that depends on the length of the string always being 26 characters. Admittedly there’s very little risk that that’s going to be an incorrect assumption, but I think it’s generally good practice to use strlen when working with strings. Assumptions are bad - especially when the alternative is so simple and easy.

For me, parentheses = clarity. I don’t think programmers should ever be discouraged from parenthesising their statements, particularly in C. I’ve maintained lots of code with a C precedence chart sitting in front of me, and my job would have been much easier if the smartarse who wrote it had used parentheses appropriately.

If I’m going to be running my code through a debugger then I like to do this sort of thing too - it makes it much easier to check the values that functions are returning than having to jump out of the function and see what it’s returning.
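That is, of the two hypothetical forms below, the second gives the debugger a named value to stop on and inspect:


#include <stdlib.h>

int terse(const char *s)
{
    return atoi(s);        /* nothing to examine but the return itself */
}

int debugger_friendly(const char *s)
{
    int value = atoi(s);   /* breakpoint here and inspect 'value' */
    return value;
}
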

Coding style is always going to cause no end of holy wars among programmers, but my personal preference is (within reason) to go for clear, well-structured code, and let the compiler do the job of optimising it for me.

The example here is trivial, and it’s obvious what the function is trying to do, but if you dial up the complexity a few notches then a clear coding style makes all the difference. (And I’m not suggesting that your approach is unclear - I’m just saying that some of the examples you pick out aren’t necessarily examples of poor coding.)

Don’t laugh. I logged onto one of my OpenVMS servers a couple of months back and was greeted with a screen that included the statement:

Last Login 15-DEC-1856

I thought I had done it more recently. But evidently when you get to be my age time flies by.

My C is a bit rusty, but isn’t there a problem in the OP code?

current_year is defined as 4 characters. 4 characters of year are copied to it, and then atoi() converts these to an integer. But isn’t a terminating null character needed? Will atoi() work reliably without it?

Steam Punk programmers would revolt!

I’m curious about this, too.

It’s been years since I’ve written C, but it looks like this would only work as long as the adjacent memory happens to contain non-numeric junk.
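Right: current_year has no room for a terminator, so atoi() will keep reading whatever happens to sit past the end of the array until it hits a non-digit. The minimal repair of that step would be something like this sketch (note the fifth byte):


#include <stdlib.h>
#include <string.h>
#include <time.h>

int get_current_year_fixed(void)
{
    time_t now = time(NULL);
    char   time_string[26];
    char   current_year[5];   /* four digits plus the terminator */

    strcpy(time_string, ctime(&now));   /* e.g. "Wed Jun 30 21:49:08 1993\n" */
    memcpy(current_year, time_string + strlen(time_string) - 5, 4);
    current_year[4] = '\0';             /* the byte the original never wrote */
    return atoi(current_year);
}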