Computer security: Should you be accountable for your own system?

For the moment, let’s ignore being held legally accountable and focus more on being ethically/morally accountable. I mean we can talk about legal stuff too, but I imagine the answer is that it hasn’t been adequately tested in court, it differs greatly by jurisdiction and in enforcement, etc. etc.

Here’s what I’m thinking:

It’s very difficult to track down or catch spammers, spyware/virus authors, DDoS’ers, and other general internet bad guys, since they commonly operate through “bot-nets” of “zombie computers” — the machines of people who have lost control of them through trojan horses, security flaws, and other means. A person’s computer becomes “hijacked” and is used to mount various attacks, or simply as a relay point for massive amounts of spam. Often these people are completely unaware that this is happening.

Surely people would say Microsoft is to blame for purveying vulnerable software. But is that it, end of story?

Another example that comes to mind might be people running unsecured wireless access points. If someone parks in front of your house and does some Bad Shit™ using your internet connection, should you be responsible/accountable? The trace comes back to your IP address. And while it may not have been you who did the bad deeds, you didn’t take adequate measures to protect/secure yourself.
If computers are tools, somewhat potentially damaging tools, shouldn’t you be responsible for taking the needed steps for the good of society (in an abstract sense)? Is it enough to just say “Oh, I don’t know enough about computers and technology”, or “I shouldn’t have to deal with this”?

Are there non-hyperbolic parallels to be made regarding other societal things for which you are responsible even if you aren’t the one directly acting in the problem? (you don’t take proper care to maintain your vehicle and a metal piece flies off on the highway and hurts someone; you don’t adequately fence off your property, and some local kid drowns in your pool).

Where is the blame, responsibility, and accountability?

No interest, eh? Oh well…

If you leave a circular saw in your backyard unattended and someone proceeds to take the saw and butcher people with it, are you held accountable for leaving a dangerous weapon somewhat accessible in your yard, or is the offending person charged with murder and theft of the circular saw? Is the manufacturer of the saw held accountable because he didn’t build in a safety device that ties itself to the person who purchased it?

I think for the most part Microsoft is selling such a flawed product that they have the moral culpability. I’m sure some people with infected systems are running anti-virus products, but these are always slow in detecting new attacks. Remember, PCs are not marketed to experts, but to the average person. If there were something in the OS to inform you of an intrusion, I could see some responsibility for the user to act, but that doesn’t happen.

As for the vehicle analogy, what if a design flaw made it possible for the brakes to go out all of a sudden? This could be countered by going under the hood and adjusting something every few weeks - but often there will be nothing to adjust. For the most part though, wouldn’t the company responsible for the flaw be more culpable than the driver for problems?

Wireless access points are a different story, since the documentation clearly describes setting up security. Setting up a password once is not too much to ask, especially when doing a setup that takes some degree of expertise. I hope the install software is a bit more insistent on the need for passwords today than it was when I set mine up.

Here’s a case study. In the good old days UNIX machines would come with root unpassworded, or with some standard install password. People discovered that they could get root access on lots of machines by using default passwords, since sysadmins, who should know better, never changed them. Today all systems have immediate password aging, so this isn’t a problem. Yeah, we can blame the idiot sysadmins in this case, but the vendors are the ones who could and did really fix the problem. I haven’t heard of this as an issue for well over a decade.
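The “immediate password aging” fix described above can be sketched in a few lines. This is a hypothetical illustration, not any real system’s login code: an account created with a known vendor default is marked expired, so the first login forces a password change before access is granted.

```python
# Hypothetical sketch of expire-on-first-login password aging.
# DEFAULT_PASSWORD stands in for a vendor's standard install password.

DEFAULT_PASSWORD = "changeme"

class Account:
    def __init__(self, name, password):
        self.name = name
        self.password = password
        # Shipping with a known default? Expire it immediately.
        self.expired = (password == DEFAULT_PASSWORD)

    def login(self, password):
        if password != self.password:
            return "denied"
        if self.expired:
            return "password change required"
        return "ok"

    def change_password(self, old, new):
        # Refuse to "change" back to the known default.
        if old == self.password and new != DEFAULT_PASSWORD:
            self.password = new
            self.expired = False
            return True
        return False

root = Account("root", DEFAULT_PASSWORD)
print(root.login(DEFAULT_PASSWORD))   # password change required
root.change_password(DEFAULT_PASSWORD, "s3cret!")
print(root.login("s3cret!"))          # ok
```

The point is that the vendor, not the sysadmin, closes the hole: the machine simply refuses to operate normally under the default credential.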

Nopers. No one forces you to use Windows. Neither Microsoft nor any other manufacturer has any moral culpability for people who misuse their products.

Maybe if you changed the analogy to include needing someone to maliciously tweak the brakes in order for the defect to manifest itself, it would be a fitting analogy.

What if someone left their car unlocked on the street, and it got stolen and used in a crime? We might not have much sympathy for the person who left their car unlocked, but I don’t think anyone would hold them even partly responsible for the crime committed.

Now, if someone left a loaded gun sitting unattended, it might be a different story. But I think a computer is more like the car. It merely facilitates all kinds of activities, some of which could be harmful; but there’s nothing inherently harmful about it.

Car analogies for computers are like Hitler - you can’t get a speeding ticket when somebody’s breaking into your house.

I’m of the opinion that if you sell to consumers, none of the traditional threat vectors in your product should be enabled by default.

That means no default admin user in XP, no preconfigured SSID in the wireless router, and no macros in Office.

If the user wants to enable any of those features, they should be forced, to the greatest extent possible, to research the risks that they are subjecting themselves to before turning the functionality on.
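The policy described above — risky features off by default, with an informed opt-in — can be sketched as code. Everything here is illustrative (the feature names, warnings, and `ProductConfig` class are invented for the example), not a real vendor API:

```python
# Hypothetical "off by default" configuration policy: every known
# threat vector ships disabled, and enabling one requires explicitly
# acknowledging the associated risk.

RISKY_FEATURES = {
    "office_macros": "macros can run arbitrary code from documents",
    "remote_admin": "remote admin exposes a login to the network",
    "open_wifi": "an open access point lets strangers use your connection",
}

class ProductConfig:
    def __init__(self):
        # Every threat vector starts disabled.
        self.enabled = {name: False for name in RISKY_FEATURES}

    def enable(self, name, acknowledged_risk=False):
        if not acknowledged_risk:
            # The lazy path is rejected; the user must opt in knowingly.
            raise PermissionError(
                f"refusing to enable {name}: {RISKY_FEATURES[name]}"
            )
        self.enabled[name] = True

cfg = ProductConfig()
try:
    cfg.enable("office_macros")                      # lazy path fails
except PermissionError as e:
    print(e)
cfg.enable("office_macros", acknowledged_risk=True)  # informed opt-in
print(cfg.enabled["office_macros"])                  # True
```

The design choice is that the friction lands on the user who wants the risky feature, not on the user who never touches it.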

There’s no god-given right to not have to RTFM. Consumers shoulder most of the blame for being lazy in the first place, but vendors share some of it for playing on the laziness.

Sure, we’re lazy, but that doesn’t make us morally or legally responsible for the actions of another person.

We’re not talking about people misusing their products. Spammers and other lowlifes have culpability - no question about that. We’re talking about Joe or Jane Average, who buys a PC at their local Best Buy, turns it on, and, with no effort on their part, has it taken over. I’m hard pressed to call not purchasing or downloading a set of security products to patch up the inherent flaws in Windows misuse. And many people in businesses are forced to use Windows products, because of the availability of software, for instance.

Hacking is now a part of the computing environment. It can no more be ignored than red lights. If no one ever had accidents we wouldn’t need airbags, but since they are inevitable, a car company that installed airbags that didn’t work could not defend itself by saying that the driver should have driven more carefully.

All our problems would go away if we did away with domain addressing. If you don’t know a path to the machine you’re sending mail to, like we did in the good old days, there would be no spammers. grumble grumble.

I agree, and my example of root logins was along those lines. Having an architecture where office macros can do anything to system settings is one of the basic issues I was talking about. Having them turned on by default is just stupid given the basically flawed architecture.

How about the old kind of car with the type of locks that could be opened with a coat hanger? The driver in this case was vulnerable through the inaction of not replacing the locks with more secure ones.

Though it doesn’t usually apply to computing, there are plenty of occasions where one can be held legally responsible for the actions of another. In my county, parents of exceptionally truant children regularly do jail time. Most jurisdictions have some kind of attractive nuisance statute, under which a property owner could be held liable if some neighborhood kid drowns in their unfenced pool. Thanks to SOX, executives at public companies can now do prison time for security breaches for which they or their employees did not exercise due diligence.

The notion that somebody can be held responsible when a clueless user’s box gets turned into a spam relay isn’t completely outlandish - at least morally, in an IMHO sort of way.

If you put a completely unpatched Windows box on the net, with a public IP address, running as Admin, and start visiting shady Eastern European porn sites using IE with ActiveX enabled, you’ll probably get hacked. If your machine starts spewing out a large amount of spam, your ISP may very well “hold you responsible” by shutting down your account, and they would be perfectly within their rights to do so.

The question posed by the OP is whether, in addition to the hackers/script kiddies, the consumer or the software publisher should be the one who exercises due care to make security breaches less likely.

I say it’s maybe a 60/40 split, with the consumers bearing most of the onus for being lazy in the first place, with the companies picking up the balance for making marketing decisions that reinforce bad consumer behavior.

I think the analogies are tough because we’re not so much talking about a tool that’s been negligently left out (gun in the street), but rather a conduit through which bad stuff can happen (in the case of not securing a wireless network).

The other difficulty is that, especially in the case of viruses, the ramifications are…viral. It’s not just that your computer is taken over and used in a crime, but rather that you’re now adding to the problem by infecting other machines.

How about this for an analogy – Why does the government require people to take certain precautions (e.g., get immunized against certain diseases) before travelling to a given country? My presumption is that it’s partly so that you don’t contract some nasty disease and die, but more to the point, so that you don’t contract some nasty disease and then spread it further.

Don’t you have responsibility to society to “secure” yourself?