Doper developers: Please tell me what I am going through is unique

Unfortunately, I doubt it is. Programmers are some of the quirkier people around, and when you slap a PhD on top of that, it becomes…crazy.

  1. The aforementioned PhD likes to come up behind me and poke his face at my screen. “A-ha!” he will exclaim if I happen (heaven forbid) to have a compiler error. “Check the .h file!”, “Did you declare the function signature?” and so on and so forth.

Is it just me, or is it weird to loathe people peering over my shoulder uninvited as I try to code? Or debug a problem? No big deal with pair programming (the other coder is invited, at any rate). But I just hate people who sneak up behind you and then ask, “Why aren’t you inheriting/using virtual functions/using composition/using pointers/allocating off the stack/using vectors?”

Ugh

  2. Superstitious DLL beliefs

“DLLs areeee evvilll!”

I got into an extended argument over static builds versus DLLs in Visual Studio. Needless to say, I understand what the PhD is saying about DLL hell: DLLs not found, wrong versions, and so on. But what drives me into a frenzy is the unwillingness to tackle the problem. There are ways to solve it: manifest files and researching deployment strategies. But to him it is like some sort of arms race. I pull up material to show it will work, but he retorts, “It didn’t work that other time with OpenCV!”

The one incident which almost had me vomiting blood was when I introduced him to Dependency Walker, which can find out which DLLs are missing, allowing me to identify them. He dropped this question: “Does Dependency Walker need a DLL?”

“Yes”

“Then what if the DLL it needs couldn’t be loaded?”

“It’s in the same folder as the .exe!”

“Yeah, but what if it cannot be loaded? Or it depends on something else not found on the system?”

Me: “Then the system is fucked and there’s no way it would run our software anyway.”

  3. Meaningless jargon arguments

“I try not to use inheritance when possible,” I said. Big mistake.

“Ah-ha! Then you are using object-based, not object-oriented. Blah blah blah”

I need cotton wool for my ears.

Your coworker sounds rather insecure. Peering over your shoulder and making comments would drive me batshit too. Doesn’t he have any work of his own to be getting on with, or what?

Nope, it’s just that sometimes he likes to hover behind me, or he just happens to walk past and ‘see’.

And I’ve just had another round of number 3)

Please, please, please. Do you see the breakpoints on my screen? Do I look distracted or otherwise occupied? That’s because I am debugging. This is a bad moment to drop in on me and lecture me on the correct use of data structures or question my design decisions.

Why don’t you just ask me ‘So, how are you handling this problem?’ instead of ‘So, how are you handling this problem? If you are using an array, then it won’t work. You have to use a list or a map. I promise you that once you have dealt with this, your experience with data structures will improve. I ran into this problem myself when I worked on a similar project. You’ve got to be careful…’

SHUT UP!!!

Please tell me this is not normal.

Though the truth may suck.

And I wish he were my co-worker. He is my supervisor. (Me: bachelor’s; him: PhD. Any attempt to argue with him usually devolves into ‘When I was doing my dissertation…’ and there’s nothing with which I can gainsay him, even in areas he has no clue about.)

Get one of those polarized screen covers and angle your monitor so he can’t see it from anywhere but your chair?

Go to HR and tell them he is making you uncomfortable to the point of harassment; you need to be able to think without being interrupted. You have no issue discussing programming with him, BUT it needs to be by appointment, not at random, because he is breaking your concentration.

I don’t think (1) is all that common – at least, not due to PhD status. My guess is he’s just a supervisor who simply isn’t very good at supervising (which is pretty common, I think). At least he’s not ignorant and intruding on your space – that’s cause for a ball-point to the jugular.

I’d say that (2) isn’t common specifically, but is a common general characteristic that has different incarnations for everyone. (That is, for this guy, it’s dlls. For someone else, it’ll be IDE choice, or text editor, or preferred programming language, or whatever. Needless to say, his argument is ridiculous.)

In my experience, (3) is fairly common among PhDs, and often for good reason. For those working in a particular field, there’s a reason that jargon exists – the use of well-defined and nuanced terms allows faster, more efficient, and very precise communication of the broader abstract ideas, methods, etc. What appears as mere pedantry and obstinacy often really does have a point, even if one doesn’t initially see it (or ultimately accept it).

Does he actually write code, or is he more of an academic, someone who used to write code but hasn’t had code in production for ten years or so? In the latter case, I can understand it, and have seen it before. In the former case, that’s pretty puzzling. People who write code and put it into production tend to be pragmatic rather than pedantic. In my experience, the people who talk about how to make things happen are the ones you want to listen to. The people who talk about how many things can derail a process or might theoretically go wrong are the ones who have very little confidence in their own ability and are floating trial balloons for when they fail to achieve their task.

This doesn’t mean everyone who points out possible pitfalls is an insecure and possibly ineffectual person; they may just be being prudent. But trends will tell.

Enjoy,
Steven

When I was in a cube-farm, I mounted a little mirror on my monitor so I could see when someone was coming up behind me. Several of my colleagues did this, as well, so it wasn’t just me.

I second the comment about getting a privacy filter for the monitor. Say you need it because the glare from the overhead lights was giving you headaches.

The behavior shown is not limited to just those with PhDs. Unfortunately, we are in a profession where many of us tend to believe in our God-like powers. Think about it … we create stuff out of nothing using only our minds and our fingers (and a good IDE, and a decent programming language, and frameworks/toolkits built by others). Thus, you are going to run into a huge number of people who will argue about every little thing in the programming realm.

Many of us suffer from Not-Invented-Here syndrome. That syndrome manifests itself in statements such as, “You didn’t do it the way I would have done it; therefore your way is absolutely wrong!”

The key is to not become one of these jerks yourself. As you progress in your chosen field, try to remember that things are changing rapidly and the stuff that you know today may very well be replaced in the future.

One thing to watch out for is the belief that a particular tool is always the correct tool in every situation. The programming universe is not so simple that there is one and only one correct way to do things in all situations. For example, that is why we have different sorting algorithms (Bubble, Heap, Insert, Merge, Quick, etc.) … some algorithms work better in some situations than others.

Now, regarding your boss … go with the flow. Put up your barriers, both physically and mentally. Thank him for his help when he offers it. Say something like, “You know, I hadn’t really explored that particular aspect in detail, but you’ve given me a lot to consider.”

Avoid arguing with him, because he certainly has more experience at being an opinionated, overbearing, pinheaded jerk than you do. Plus, he has seniority.

No, your situation is not unique. It will occur to a greater or lesser degree everywhere you go.

Just wait until you write the code, debug it, get it working exactly as specified (and desired even if not specified), then have someone go over your code to check your naming of variables and alignment of indentations, then expect you to rewrite it to conform to the shop spec and start the whole process over again.

Get used to it. It’s real life and real world. You put up with it and try not to kill them.

It also sounds like he doesn’t really know what he’s talking about. Object-oriented programming does not imply inheritance, nor even classes for that matter. If your program solves a problem by considering elements of the problem domain in terms of objects and their relationships, you have an object-oriented program. It might be written in C++ or Java using classes to taxonomically organize objects, or it might be written in JavaScript using prototypes, in Lua using tables and metatables, or even in C using structs and functions.

Inheritance is a technique that is sometimes useful in object-oriented programming, but even in languages that support classes, composition is just as “object-oriented” as inheritance, and for many situations is a better choice.

The idea that OOP means “always define the problem in terms of classes and subclasses” is the kind of sad misconception all too often foisted on students in introductory Java programming classes, and I would have expected someone with a PhD in computer science (that is what he has his PhD in, right?) to know better.

As far as I am concerned, he’s always been a researcher. I have never worked in a research field before, so I am not aware of his job scope, but I doubt he has experience working on the front end. He doesn’t know UML, for one thing.

For all the things he does, I am trying not to attribute them to malice; maybe he’s just overly concerned. I am not sure how to raise changes to the working patterns: when I worked in a software house before, instead of discussing design decisions by suddenly popping up behind someone’s back, we would go to a room and draw flow charts and so on. And this is an academic setting (the research group wishes to create a consumer product from its research). I kept asking for more GUI developers, but the PhD kept rebuffing me: “What you do is easy! Anyone can do it!”

As for the mirror, I’ll get one. Sometimes I use the shiny black surface of my computer case to see who is behind me, but argh, he’s everywhere!

He’s a PhD in computer science, specifically in computer vision and imaging. He got a superiority complex from it. “Nothing beats 3D and computer vision” he likes to boast. I understand that you need raw power when you are doing image processing, but I am coding the GUI and he was asking me, “Why are you making things complicated? Using a vector? It slows things down!”

As for not going axe-crazy and killing my co-workers, I usually manage, as long as the agitation level is kept down. Thanks for the advice thus far!

<dons dusty, pedantic, jargon-filled grad-school cap>
Personally, I don’t think the quoted statement is accurate – if you’re using the term OOP as I believe it is commonly understood. However, I’d agree with it wholeheartedly if you substitute “necessitate” for “imply” (which is how I read most of the rest of your post). So for instance, using the distinction put forth by the PhD guy, your example of composition would assuredly be “object-based” but not “object oriented” programming.

Why on earth this would be a point of contention is beyond me – without more information, it does indeed seem like academic pedantry. But researchers have built careers on exactly that type of thing (cf. Chinese room debates, agents, cloud computing, etc.), so my gut response is that he likely had some point to make, valid or not.
</doffs grad-school cap>

That is in fact what I meant by “imply.” I was using it in the formal logical sense. :wink:

If you take “imply” to mean “suggest, especially in connection with common usage,” then yeah, object-oriented programming does generally imply classes, since they are absolutely part of the approach to OOP that is most commonly used and taught. But this isn’t a valid foundation for making any kind of “academic” distinction between what’s OOP and what’s not.

I don’t see it as academic pedantry. If anything, the academics should know what is really meant by “object-oriented,” and should have a clear theoretical understanding of the fact that classes are just one of many possible ways of constructing object-oriented programs. Classes and inheritance don’t even really intersect with the real concern of OOP; they’re just a way of taxonomically organizing objects, an organizational principle rather than an approach to modeling problems as interactions between objects. There are object-oriented languages that do not support classes at all.

I’d say that being a good object-oriented programmer definitely requires an understanding of concepts beyond classes. For too many people, classes and inheritance are the only OO tools they have, and their code suffers for it.

Programmer with a Ph.D. here (although my Ph.D. is in physics, not in programming) . . . and I have to say that I’m pretty sure if I ever tried to do that at my job they’d take me out in the parking lot and kick my ass.

Come to think of it, have you tried that?

He sounds like an insecure jerk who just wants to show off. That kind of behaviour will also result in a severe arse kicking where I work. I work in a very small company where almost everyone does something highly technical, though.

For my part, I am also trying to look at it from a different angle. Am I being too pedantic, or just plain touchy? It’s normal for a supervisor to come along and ask, “So how are you going to implement this feature?” Maybe next time he drops by I should just stop whatever I am doing and grab a pen and notebook, so that I won’t be multi-tasking and will be less likely to be annoyed.

I understand that many people have different definitions of OOP, but the functionality of the software does not rely on it being OOP or object-oriented. It’s just pointless wrangling in the context of developing software for end-users, not academic research or a paper. Is the code easier to maintain if I’ve used composition or inheritance? That, to me, is what’s more important.

I’ll also keep a watch on myself when I interact with other programmers.

Another programmer with a PhD (nearly) chiming in. I’ll side with the majority here: you met a quirky person who happens to have a PhD. Doing a PhD might stress you out a lot (I have a friend who went into a depression for some time), but PhDs don’t make you behave like that by default.

On the other hand, though, I’ve noticed many students pick up the attitudes and quirks of the people in their research group, and I did meet quite a few lecturers and researchers behaving a bit like that (well, more like behaving a lot like that), so it could be the case that your colleague, while being an otherwise perfectly normal person, just got influenced by socially bad company.