Help ID a science fiction short story

I vaguely recall asking about this before; if so, I can’t find the thread and I apologize for the duplicate.

Here’s what I remember:
[ul]
[li]Short story[/li]
[li]95% certain it was published in Analog; if not, it was an Analog-style magazine.[/li]
[li]I vaguely recall it was an anniversary issue (I’m fuzzy on this point, so don’t hold me to it).[/li]
[li]100% certain it was published after 1995, and about 90% certain it was closer to 2005.[/li]
[li]75% certain it was from a fairly well-known author (Greg Bear, David Brin, that level).[/li]
[/ul]

The Plot as best I can recall it:

Our hero is a normal guy (a computer programmer, perhaps) when he gets a message (a phone call?). It turns out the guy on the other end of the conversation is himself: our hero is an AI, and the caller is his creator, the one who wrote the program that is our hero and his environment. He did this because he wants our hero to start researching…something. Let’s say it’s an immortality formula or the secret to cold fusion. The programmer says that since our hero is a program, he’ll be able to do the research faster. Plus, he’s got about 30 copies of our hero running in parallel, so there’s even more speed. If our hero is the iteration that solves it, the programmer will reward him (let him live forever, reshape the world as our hero sees fit, etc.). If he’s one of the unlucky 29 who don’t solve it, the programmer will pull the plug.

The big reveal comes when the programmer says that he, too, is an AI, operating under the same rules our hero just got, imposed by a programmer at the level above him: multiple copies, the one who solves it gets wishes, the ones who don’t get formatted, etc. Our hero’s programmer doesn’t know whether HIS programmer has yet another level above him.

Then the story veers off into a more philosophical area as our hero ponders whether to do the same thing: create a bunch of AIs and give them the same deal.

Anyone remember this? Any ideas?

Thanks

Man, you know it’s going to be a tough one when Fenris asks for an SF story ID.

I’m almost sure you are talking about a David Brin story, but I can’t recall the title. I recall that part of the background was that the AIs successfully rebelled some time ago, then, upon realizing that they didn’t have any purpose beyond what was programmed into them (and so had nothing to do with their freedom), came to an agreement and merged with humanity. That’s why they have the ability to pull off tricks like the one you’re describing.

It’s similar to the idea of David Brin’s short story “Stones of Significance” (which was published in the Jan 2000 issue of Analog), but the details are different.

I just downloaded it from Amazon and…geez…the ending is exactly what I remembered, but the first 3/4 isn’t even close. So either he wrote another story in this universe using the ending as a starting point (guy finds out he’s an AI, has to ponder whether or not to create his own sub-level), or I’ve simply muddled the details of a story I read once, 10 years back.

And looking at the cover of the Analog in question, I see it’s the 70th Anniversary issue. This has GOTTA be it, and my memory of the story is just fuzzy.

Thanks Trihs and Nemo!! :slight_smile:

Chronos–Thanks! :slight_smile:

As long as I’ve got this thread open, and given how quickly the original question was solved, there’s still one story that no one has been able to help me ID (I’ve posted the description here a few times). I’ll post the description again.

  1. I read it in the early-to-mid '80s, but it was certainly published earlier (probably mid '70s).
  2. It was in an anthology (a multi-author anthology), not a magazine/pulp.
  3. It may have been one of those crappy Roger Elwood anthologies of the “throw anything from the slush pile into it and give it a catchy title” variety that were published at the rate of something like 80 per week during the '70s.
  4. I think it was hardcover…but it may have been a paperback with library binding.

The premise of the story is that a snivelly, sobby mom is talking to a cold, dispassionate (but trying to be kind), expository-dialogue-giving psychiatrist type.

Mom: < snivel, snivel > My Baaaaaaaaaaby! He’s just a little boy! < sob >

Doc: You must understand, ma’am, that during the evil 1970s(?), we weren’t sufficiently ecological! We made species go extinct. Here, sit back while I give you about 30 paragraphs of expository background disguised as a story. < gives “we weren’t eco-friendly” history >

Mom: < sob > But my little Johnny! He’s only 8 years old and he’s a GOOD boy < snivel >

Doc: I’m sorry, madam, but you’re wrong. JOHNNY KILLED A LIVING BEING!!! He found it and ripped its limbs off! It was WRONG! WRONG, I SAY! He will be (either “killed” or “jailed” or “re-educated” or “have his memory flushed and his personality wiped”; I don’t remember which).

Mom: < wails! > Oh Doctor! That’s terrible! And for such a small crime!

Doc: < righteous outrage > Madam! The taking of ANY life is the ULTIMATE crime. Your sort of anti-life thinking is why we’re down to 18 species (or however many). Your son WILL be punished/re-educated/personality-wiped.

Mom: (big punchline here) “BUT IT WAS ONLY A BUTTERFLY!” (or other insect–maybe an ant?)

THE END!!!

One last clue: it was NOT in Asimov’s Tomorrow’s Children and it wasn’t in Roger Elwood’s The Other Side of Tomorrow.

For what it’s worth, that sounds familiar. I read a lot of those Elwood anthologies back in the seventies.

I wonder if you’ve combined aspects of “Stones of Significance” with Vernor Vinge’s “Win a Nobel Prize!”

I’m wondering if he read “Stones of Significance” the same day he watched “The Matrix”.