Engines of Creation

Scylla:

Um… nothing happened there. JP is a work of fiction, as I’m sure you’re aware.

<shrugs> There will certainly be problems, accidents, and deaths, as with any technology. However, if we limit the lifespan of the race itself, such accidents would be self-limiting. And we can make the probability of that one limitation being overcome arbitrarily small.

There are other limitations we can implement, such as dependence on a specific artificial chemical, or weakness to another, which we could use to limit the scope of an accident.

A much more serious and plausible threat, and I have no refutation. If nanotechnology is possible, we will make it happen. We can only hope that our ethics evolve to the point where such technology does not cause our extinction.


I sucked up to Wally and all I got was this lousy sig line!

Singledad:

Jurassic Park isn’t real?

You don’t have a cite for that do you? :slight_smile:

I’m glad you agree hacking might be a problem.

The real problem is that, in reality, when we work with nanotechnology we are creating life (for all practical purposes).

We will be creating very powerful and fast breeding life.

I would submit that it is practically impossible to engineer and release nanotechnology that wouldn’t evolve toward overcoming the constraints and safeguards you place into it.

You say “limit the # of generations that they can reproduce.” I say “The first nanobot with a broken generational counter overcomes this.” Another way would be to hijack the materials of other nanobots. That way it could reproduce without setting off the counter (or program, or whatever you are using).
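The broken-counter objection can be made concrete with a toy simulation. This is a minimal sketch, not anyone's actual design: a "bot" is modeled as nothing but its generation number, and a copy error is assumed to reset that counter, which is enough to let a lineage sail past the limit.

```python
import random

GENERATION_LIMIT = 10  # bots refuse to replicate past this many generations


def replicate(generation, mutation_rate=0.0):
    """Copy a bot's generation counter, with some chance the copy is corrupted.

    Toy assumption: a corrupted copy resets the counter to zero, so the
    lineage never accumulates enough generations to hit the limit.
    """
    child = generation + 1
    if random.random() < mutation_rate:
        child = 0  # broken counter: the lineage starts over
    return child


def lineage_length(mutation_rate, max_steps=1000):
    """Count replications before the generation limit stops the lineage."""
    gen, steps = 0, 0
    while gen < GENERATION_LIMIT and steps < max_steps:
        gen = replicate(gen, mutation_rate)
        steps += 1
    return steps
```

With a zero mutation rate, every lineage dies after exactly `GENERATION_LIMIT` replications; with any nonzero rate, some lineages keep replicating far beyond it, which is the whole point of the objection.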

If you build in a dependence on a chemical, they may find an alternative that serves just as well.

The problems with getting rid of unwanted microscopic organisms are well documented throughout history. That these organisms can indeed evolve to thwart our most Machiavellian assassination methods can be demonstrated by the fact that we are losing the war with bacteria. Antibiotics do not have the same effect they did 40 years ago. It is becoming more and more difficult to engineer effective new antibiotics against today’s resistant bacteria.

The parallel should be obvious.

I’m now writing a short story dealing with nanotechnology. The protagonist (a nanotech engineer) tries to avoid the mutation factor by building into his nanobots a checksum routine to verify that each successive generation is an exact copy of the original.

However… the routine that checks the new generation has a bug: it verifies everything in the new generation except the code that will perform the checksum on the generation after that. (Does that make sense? I explain it in more detail in the story.) So, after a few generations, a copy error creeps into the code that checks for mutations. From then on, later generations can’t check for errors anywhere, and mucho mutations are possible.

The point I’m trying to make is that the nanites will be a product of software design, which has traditionally been a great source of error. I should know, I’ve spent the better part of the last decade making my living by finding software errors.

And I’ve spent the better part of mine making them! :smiley: Your point is well taken. We will have to be very careful.

Your scenario is reasonably plausible for SF, but it would never fly in science. The key is multiple redundant backups. First, most mutations in the checksum-checking routine would tend to fail safe, in that they would kill an ordinarily good bot. Second, you would normally have three checksum-checking routines; if they disagreed, the bot would again self-destruct, killing the mutation. You would have to have three (or more) simultaneous mutations in very specific places to unleash a disastrous bot.
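The triple-redundancy argument is essentially majority voting. Here is a minimal sketch, with all names hypothetical: three independent copies of the reference checksum vote on each new bot, any disagreement among the checkers fails safe, and only a unanimous-or-majority match lets the copy live.

```python
import hashlib
from collections import Counter


def checksum(code: bytes) -> str:
    return hashlib.sha256(code).hexdigest()


GOOD_CODE = b"replicate responsibly"
GOOD_SUM = checksum(GOOD_CODE)


def vote(reference_sums, code):
    """Three redundant copies of the reference checksum vote on a new bot.

    - No majority among the copies: a mutation hit the checkers themselves,
      so fail safe and self-destruct.
    - Majority exists but the code doesn't match it: ordinary copy error,
      destroy the copy (sometimes this kills a perfectly good bot, which
      is still the safe direction to fail).
    """
    majority, count = Counter(reference_sums).most_common(1)[0]
    if count < 2:
        return "self-destruct"  # checkers disagree with each other
    if checksum(code) != majority:
        return "destroy copy"   # code disagrees with the majority
    return "accept"
```

A single mutated checker is simply outvoted; defeating the scheme requires simultaneous, matching corruption in two of the three copies, which is the "three mutations in very specific places" condition.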

And we can make reliable software. The NASA group that writes the Space Shuttle avionics software has been studied extensively, and its reliability record really is impressive.

Whether it’s incompetence through cutting corners or outright malice, the problems we’re most likely to face are ethical, not scientific or technical limitations.

I recommend that you steal a plot point from JP: Have part of the problem due to malice or administrative negligence.


Time flies like an arrow. Fruit flies like a banana.