Yeah, he learned them, which means he acquired them through use and building upon his current knowledge. What you are advocating is memorizing a bunch of stuff, pretty much in the same exact way people cram before a test. The former is nearly effortless. The latter takes a lot of work.
And, based on what you say, count me as someone who doesn’t understand why he needs to know more about these methodologies, for one simple reason. It sounds like learning it would be automatic at any job where it was in force, since they’re going to have to inform you of the particulars anyway. This does not sound like something that he needs to memorize. Just read a little bit, get familiar with the concept enough that it makes sense in your head, and then say you know about it when asked.
The only benefit I can see of memorizing is what was stated above: to know a few buzzwords to parrot back.
And, yeah, I know I don’t have experience in this area. But some of you do, and you haven’t made it clear enough why he needs to memorize this stuff.
Well, that interview was uncomfortably short. Perhaps I so strongly impressed him that he didn’t need to talk to me any further! Somehow I don’t think so. Anyway, design methodologies didn’t come up.
Except it’s pretty easy to tell during an interview if someone has really done agile development or not, especially in a technical interview. Buzzwords will only get you so far. But if you can demonstrate that you understand the concepts and can offer opinions on which varieties of agile would be appropriate for different projects, you can convince a good interviewer that you can be trained.
It’s not all buzzwords, and it’s not just like people used to develop code. The real problem is that most software development doesn’t use any formal methodology. If you move to an environment that does you’ll be dealing with a culture clash at minimum, and real work issues if things are bad. True agile development isn’t what most people are familiar with - the iterations are much shorter, the tasks need to be broken up into bite-sized deliverables, and it relies on developers giving realistic estimations of work efforts. You’re also working closely with QE and Doc (and potentially other groups) on a day-to-day basis.
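Just to make the “bite-sized deliverables” part concrete, here’s a toy sketch in Python (the task names, estimates, and 40-hour capacity are all made up for illustration, not anyone’s real process) of what one feature might look like once it’s been carved up and estimated by the people actually doing the work:

```python
# Hypothetical illustration: one feature broken into bite-sized sprint tasks,
# each with its own realistic estimate, instead of a single "build the feature" item.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    estimate_hours: float  # the developer's (or QE's, or doc's) own estimate
    owner: str             # dev, QE, or doc -- everyone is in the loop daily

backlog = [
    Task("Add 'export to CSV' endpoint", 6, "dev"),
    Task("Unit tests for CSV edge cases", 3, "dev"),
    Task("QE test plan and regression run", 4, "QE"),
    Task("Update user guide", 2, "doc"),
]

sprint_capacity_hours = 40  # assumed capacity for this slice of the team

committed = sum(t.estimate_hours for t in backlog)
print(f"Committed {committed}h of {sprint_capacity_hours}h capacity")
if committed > sprint_capacity_hours:
    print("Over capacity: push something to the next iteration.")
```

The point isn’t the code, it’s the habit: nothing on the list is bigger than a day or two of work, and the estimates come from the people doing it, which is what makes the short iterations actually trackable.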
Not sure what you mean by this. Usually learning a lot of technical terms involves a minimal exposure so that when one comes up you aren’t hearing it for the first time. So that’s not exactly memorizing, not like learning the state capitals for a geography test. Some jobs require memorizing lots of technical terms, such as billing and coding for medical systems, but even then you have resources to look up information, you don’t rely on memory alone.
I’m not suggesting the OP do a lot of memorizing, just getting some exposure and realizing they’re new names for old concepts in many cases.
But it’s also extremely rigid and difficult to make design changes once you get started. It’s not appropriate when you’re building a solution starting from incomplete knowledge or requirements.
Agile and Scrum are more iterative, making them more flexible, but they also require discipline to use effectively.
Developers like to think that this project management stuff is pointy haired bullshit. But the fact of the matter is that development projects cost money and companies need to have an idea how much they will cost and if it’s worth it. They also need to know if the projects are on track or if they need to hire more resources.
I’ve seen what happens when companies go with the “let’s have a couple really smart guys just code this” approach. You end up with unstable, shitty, unmaintainable spaghetti code.
Really I have to question what sort of development the OP was doing if he has never heard of ANY of these methodologies? How did you decide what to build? How did you architect it? How did you document the business logic or any special algorithms? What sort of QA process did you go through? How did you handle updates and bug fix releases?
Do you think that these methodologies have existed forever? What do you think programmers did before these things existed?
Do you think that they didn’t know what to build? How can that be? We had computers, and businesses ran software, decades before these methodologies existed.
The IT world didn’t come into being all at once including Waterfall, Agile, and Scrum. Somehow, regardless of your disbelief, banks managed to maintain accounts, and stores managed to do business, using software that was developed without these methodologies.
I’m not saying that they don’t serve a purpose. I’m saying that it’s silly for you to question how it’s possible to write good, workable code without knowing about them.
Of course it’s possible. They may make the process more efficient, but they don’t move it from the realm of the impossible to the possible.
It looks to me like you are going from a less formal work culture to a corporate professional culture, and you are realizing some gaps that may make that transition more difficult.
And that’s okay. If you haven’t been in the thick of things, what else could you expect? It’s never too late to engage with your industry. Start reading some blogs, talk shop with your buddies, watch a few TED talks and get a subscription to Wired (or whatever you guys read). Get yourself up to speed and engaged with what is going on in the wider world of your field. Grab the bull by the horns.
But you aren’t going to get much from being able to sit down and write good code. Not when Juan and Rajiv and Xiaofeng can write code just as well as you can. Well-run companies no longer hire routine employees. They want to hire indispensable employees: employees who are crucial and unique and make substantial contributions. Now and then you may find a mom and pop who just needs someone who can fix a computer. But if you want to be in a corporate or startup environment, you have to bring more to the table than writing good code. And one thing they’ll be looking for is someone who understands and is an active part of the entirety of software development, not just someone capable of writing what their manager has put in front of them.
And if you’ve been on the sidelines for a while, that may mean you need to play some catch-up. That’s not a reflection on you; what you’ve been doing has been working fine so far. But it’s a rough world out there and nobody gets to relax for long.
Organizations are more complex than ever. Our markets are increasingly global and connected. Customers expect increased performance and more responsive products. The need to streamline workflows and use your resources efficiently is more important than ever. Not that long ago, for example, supply chains were managed by looking at what you ordered last year and making an educated guess. Now we have predictions so accurate that Walmart successfully predicted an increased demand for strawberry Pop-Tarts ahead of a hurricane. Shipping has become so much more streamlined that “fast fashion” shops like H&M are possible. I’m not a techie, but there is no doubt that tech is facing the same pressures. Even people who manage little Etsy shops are using complex methods for managing their customers and providing service.
And that’s why management has become more systemized and more complex. Better management has brought us a lot of the really cool things we now take for granted.
So either you can fight it and stick to your line that you just want to write good code, or you can rise to the challenge.
I’m going to have to go with this assessment, although I’ll try not to be quite as harsh.
You’re understandably frustrated because you’re out of work. Your problem is that you haven’t kept up with about 20 years worth of innovation in how this work is done. Buzzwords have meanings, and while they can be (and often are) abused, if you can’t discuss them intelligently, you’re not going to have much luck finding work.
I wish that programming could consist of sitting down and just writing code all day, but it doesn’t. Software development is a team effort integrating dozens of disparate skills, and successful efforts require herculean levels of competent project management. Scrum, Agile, and so forth are categories of tools that can help do that. Like anything else, tools can be used to great effect or they can be misused with disastrous consequences.
I estimate that I have interviewed about 80 job candidates over the past six months, applying for jobs from summer internships to senior-level systems programmers. Only about 30% of my questions are directly programming related. And that’s just to make sure they understand your basic computer science concepts. (Languages are irrelevant; anybody can learn a new programming language in a couple weeks.) The rest of the questions are meant to assess how the candidate approaches a novel problem, how they break it down into steps, how they determine whether they have accomplished each step, and how they determine if they have solved the problem as a whole.
You know what that process is? Project management. For programming problems, it’s just a smaller version of what a full-time PM does. And if you don’t understand what those skills entail, you’re not going to be very useful as a member of a software development team.
You mentioned that you had spent much of your time running web stores. My guess would be that you installed some piece of turnkey e-commerce software and tweaked it as your clients requested. That’s not software development. It’s not really even programming. At best, it is software maintenance. The kinds of problems encountered in maintenance rarely offer you the opportunity to grow as a programmer by learning new technologies, methodologies, or problem-solving skills. It’s the same shit day-in and day-out.
You do that too long, and before you know it you’ve become useless while the rest of the world has passed you by.
I understand completely the frustration of feeling inadequate or stupid. You’re obviously not stupid, but you need to take a few breaths, relax, and get up to speed with what you’ve missed out on.
I’ve moved around from job to job a lot. What I constantly encountered were people who were content to become minimally competent in one skill and then barely perform their job adequately. I was the guy who was always reading new programming books on the subway, learning new techniques, going to conferences and Meetups, talking to people who were doing really innovative and amazing stuff and picking their brains, and eventually developing a reputation for high-quality work in the open source world until I could get hired based on my public reputation alone. I could do that because I fucking love this stuff. Reading books and talking to smart people does not require effort on my part because learning about it is so damn enjoyable for me. The worst jobs I’ve had were the ones where I was the smartest guy in the room. The best ones were the ones where I felt like a moron. That means there was an opportunity to learn.
I was laid off in August and during my job search I think I was asked about scrum in every single interview I had. So, IMO, yes, you’re going to want to bone up on these things.
How did it get done? Not very well. Please don’t tell me that you’ve never read The Mythical Man Month. When I was in college, in the early 1970’s, we studied software disasters - and software was a lot smaller back then.
Case in point - you said that you wrote the test plan. Did you do the testing also? Developers are the very worst people to do QA; they tend to test for the things they have thought about. Much of microprocessor design is like software these days, and we’ve found the best way of doing design verification is to throw tons of random mini-programs at simulation models of the design. It finds stuff the designers would never think of testing.
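The software analogue of that idea, roughly, is randomized testing against a simple oracle: hammer the code under test with random inputs and compare against a slow-but-obviously-correct reference. A made-up sketch in Python (the sort example is purely illustrative, not anything from an actual verification flow):

```python
# Randomized testing against an oracle: random inputs, compared to a naive
# reference implementation that's easy to convince yourself is correct.
import random

def sort_under_test(xs):
    # Stand-in for the implementation being verified.
    return sorted(xs)

def reference_sort(xs):
    # Naive insertion sort used as the oracle.
    out = []
    for x in xs:
        i = 0
        while i < len(out) and out[i] <= x:
            i += 1
        out.insert(i, x)
    return out

random.seed(0)  # reproducible runs
for trial in range(10_000):
    data = [random.randint(-1000, 1000) for _ in range(random.randint(0, 50))]
    assert sort_under_test(data) == reference_sort(data), f"Mismatch on {data}"
print("10,000 random trials passed")
```

It turns up the boundary cases (empty inputs, duplicates, weird sizes) that the author never thought to write a hand-picked test for, which is exactly what the hardware folks figured out.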
PM checking in. I won’t echo the same points already made here about learning the current lingo and about how each methodology works - those points are made well upthread. I just wanted to add that from a management perspective, Agile poses some issues with regard to predictability. A lot of organizations and projects have budget and time constraints, and Waterfall lends itself to a certain amount of predictability, which is why it persists today. I think Agile delivers a better end product, but it is more difficult to tell someone how much it will cost and when it will be done. Some organizations are OK with that uncertainty; others are not. So, that may be a question the OP could ask back when an interviewer poses the question.
For the OP, I think you will be okay thru all this, as you are already more informed than you were at the start of the thread, which means when this question comes up again you will be better prepared, and the interviewer can move on to other questions without raising an eyebrow at you.
No. I’ve been in jobs where those things weren’t used. Gotta earn a paycheck, and I can’t force an employer to move out of the stone age.
You’d be guessing wrong. :rolleyes:
Why the hell would I think something stupid like that? Where the hell did you get the idea that I would think something stupid like that? Not knowing the names of system design methodologies is not synonymous with not writing readable code.
You’re probably one of those gals who thinks it’s hilarious to pretend that she can read minds and know what people think and do and how they code. Not only that but you’re one of those gals who does that outside of the pit which is the only place where this belongs.
Jesus people. I sincerely asked about this stuff because I wanted to know how common it is for employers to require it, and what’s the best way to respond when asked about it. Some of you are trying to be helpful but some of you are jumping all over me for asking for help. This isn’t the pit.
We didn’t write test plans for our own code, we wrote test plans for each other’s code, and of course I didn’t test code. There were other people, not developers, who did that.
The last large development project I was on was in the 90s and the employer didn’t use these methodologies. My employment since then has been doing Business Intelligence (this was at a major health insurer and our department did not use these methodologies), web design firms that did not use them, and, most recently, running a corporate IT department, which involved almost no coding.
Maybe not “memorize”. I’d say he has to be familiar with them. If for no other reason, that’s how a lot of managers like to run their projects.
In all fairness to the OP, a lot of managers/project managers don’t really have ANY sort of methodology. Some are totally non-technical and just fill out weekly status reports and meeting minutes. Or everything is just ad hoc requests that get filtered down to the next level until there is no one else to filter down to. Or they are dependent on some super-human coder to just come up with miracle shit at 3am until he gets burned out and takes a job with Google or some other startup.
Maybe my problem is that I’ve never liked large corporate environments. At least not ones where I was just another cog in the machine (in my most recent job I largely was the corporate IT department, at least onsite).
Maybe I should just continue avoiding that kind of environment until I can retire in 10 years. I do have a number of small web development firms showing interest so maybe I should just do that, and of course read up on some of this stuff in case it comes up.