Chip-to-chip optical interconnects

People are forecasting the use of optics to connect chips on the same PCB. My question: how would that work?
What I mean is that today I can use a free program to draw a bunch of lines (special lines, but still lines) and, for 35 bucks, get someone to print those on a piece of plastic (and it’s way less in volume, of course). I can then use a toaster oven (or an industrial reflow machine… same thing) to melt a couple of chips onto the PCB. And after all that, this cheap contraption can contain hundreds of fine copper wires reliably carrying state-of-the-art 5 GHz serial electrical links.

Compared to that, how practical would optical interconnects be?

I think the key points are size, speed, and heat. It might cost more, but technology needs to keep miniaturising, and eventually the cost will go low enough to be mass marketable.

I forgot to mention that when soldering a chip to a PCB, the liquid metal even actively pulls the chip into the right alignment.

PCB technology is amazing in its simplicity, cost, and resilience.

I’m just wondering… how would that be matched by optics? E.g., how would you integrate waveguides into a PCB? What would you need at the interface between your optochip and the PCB? Is it conceivable that this could all be done with cheap, low-tolerance tools? E.g., a piece of fibre stuck into a drilled trench that’s very roughly pointed at a hole in the chip?

Because if not, then there would just be no use for optics outside a few niche applications. If it takes a lot of money and effort to make basic prototypes, designers won’t bother. And if optics have to be used anyway and remain that difficult, that would hurt the whole industry.

Y’know, many years ago I said the same thing about surface-mount technology. When will they ever learn??? :smiley:

Wire wrap forever!

Well, it turned out SMT wasn’t really that hard. If it had been, we’d have been screwed.

Btw, what’s wire wrap?

Wire wrap is a short-run circuit board manufacturing technique that uses wires wrapped around posts. Still in use, but just barely.

Yeah, when I started playing with computers you could still buy wire wrap tools and supplies at Radio Shack (hey, I was naive). For that matter, they still had tube testers in some stores. Computer magazines had articles arguing the benefits of wire wrap, and detailing good wire-wrap technique.

I’ve used a lot of wire wrap, but not for wire wrapping. It used to be good for haywiring modifications onto printed circuit boards, back when you could do that kind of thing without a microscope.

We used to call these white-wire changes, since the fixes were done with white wires (duh). They were bad things. I never wire wrapped, because the board I had to make for my bachelor’s thesis was too damn fast: 150 MHz, back in 1973.

As for the OP, I found an article in EE Times about this, giving Intel’s view, which is that while there are issues, in a few generations inter-chip signaling is going to be too fast for standard interconnect (once it gets above 15 Gbit/s). It is a problem already, since we are forced to use serial interconnect (SerDes) because we can’t do high-speed parallel interconnect any more. This is a real pain for some of us.
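
Roughly, the SerDes idea in a toy Python sketch (my own illustration, nothing from the article, and nothing like a real implementation, which adds line coding, clock recovery, equalization and so on):

```python
# Toy SerDes: ship a wide parallel word one bit at a time over a single
# fast lane, then rebuild it on the far end. Names and widths are made up.

def serialize(words, width=16):
    """Flatten parallel words (ints) into a serial bitstream, LSB first."""
    bits = []
    for w in words:
        bits.extend((w >> i) & 1 for i in range(width))
    return bits

def deserialize(bits, width=16):
    """Regroup the serial bitstream back into parallel words."""
    words = []
    for start in range(0, len(bits), width):
        chunk = bits[start:start + width]
        words.append(sum(b << i for i, b in enumerate(chunk)))
    return words

data = [0xBEEF, 0x1234, 0x00FF]
assert deserialize(serialize(data)) == data  # round-trips cleanly
```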

That’s very interesting. Intel is talking about the same concerns I raised.

I found some work that’s already been done. It seems the best candidate for cutting waveguides is to use lasers (laser ablation). At the ends of the waveguides you’d cut holes and stick in pre-fabricated mirror modules. These modules can hold a single mirror to redirect vertical light from an optochip, or two mirrors to act as a via. They’re little plastic doohickeys that should be mass-producible. All these things are a couple of thousandths of an inch across.
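
To get a feel for the alignment question, here’s a crude back-of-envelope model: just the geometric overlap of two equal circular cores, laterally offset. It ignores angular error, gaps, mode structure, and Fresnel loss, and the 50 µm core diameter is my guess at a typical multimode polymer waveguide, so take it as a rough sketch only.

```python
# Rough tolerance check for butt-coupling two multimode waveguide cores:
# what fraction of one circular core overlaps an identical, laterally
# offset core? Uniform illumination assumed; purely geometric.
import math

def overlap_efficiency(offset_um, core_radius_um=25.0):
    """1.0 = perfectly aligned, 0.0 = no overlap at all."""
    r, d = core_radius_um, offset_um
    if d >= 2 * r:
        return 0.0
    area = 2 * r**2 * math.acos(d / (2 * r)) - (d / 2) * math.sqrt(4 * r**2 - d**2)
    return area / (math.pi * r**2)

for d in (0, 5, 10, 25):  # micrometres of lateral offset
    print(f"{d:>3} um offset -> ~{overlap_efficiency(d):.0%} geometric overlap")
```

On that crude model a few microns of slop costs surprisingly little light, which at least hints at how precise (or not) those drilled trenches and mirror pockets would have to be.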

It all seems reasonable. Nowhere as easy as making a PCB with a printer. But reasonable. Sort of.

If the inter-chip signals are all modulated and multiplexed in different ways, they could all travel through some kind of rather simplified, unified light-pipe, because the modulation and multiplexing would sort them out at their destinations, rather than each signal having to be routed precisely there and nowhere else.
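
In other words, something like wavelength-division multiplexing. The aggregate arithmetic is trivial, but to spell it out (channel count and per-channel rate are made-up illustrative numbers, not from any real part):

```python
# Shared "light-pipe" sketch: each chip-to-chip signal gets its own
# wavelength, one waveguide carries them all, and the receivers sort
# them out by colour instead of by physical routing.

channels = 8            # distinct wavelengths sharing one waveguide
per_channel_gbps = 10   # modulation rate of each wavelength

aggregate_gbps = channels * per_channel_gbps
print(f"{channels} wavelengths x {per_channel_gbps} Gbit/s "
      f"= {aggregate_gbps} Gbit/s down one light-pipe")
```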

But isn’t high-speed parallel still the king of chip-to-chip? The highest-speed thing you’ll find on a PCB, the 512-bit 2 GHz (DDR) link between a GPU and its DRAM, is parallel with a dedicated address bus (as plain-old as it gets). The second-best thing, HyperTransport, is also parallel at several GHz (but using two wires per signal and packets, like serial). Serial links (which top out at 5 GHz) are really only being used for board-to-board.
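
Putting rough numbers on that comparison (reading “2 GHz (DDR)” as 2 Gbit/s per pin, which is my interpretation):

```python
# Back-of-envelope: aggregate bandwidth of the wide parallel GPU memory
# bus versus a single fast serial lane, using the figures from this thread.

bus_width_bits   = 512   # GPU memory interface width
per_pin_gbps     = 2     # double-pumped 1 GHz clock -> 2 Gbit/s per pin
serial_lane_gbps = 5     # a state-of-the-art serial link, per the thread

parallel_gbps = bus_width_bits * per_pin_gbps
print(f"Parallel memory bus: {parallel_gbps} Gbit/s (~{parallel_gbps / 1000:.0f} Tbit/s)")
print(f"Equivalent 5 Gbit/s serial lanes: {parallel_gbps / serial_lane_gbps:.0f}")
```

So the memory bus works out to about a terabit per second, and you’d need a couple hundred of those fancy serial lanes to match it.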

Are they still using HyperTransport? I worked on some docs for it a few years ago. Then I got RIF’d and never heard if it amounted to anything.

Something strangely disturbing about the phrase “double-pumped.”

You mean HTX, the HyperTransport expansion slot? No, that’s dead. But chip-to-chip HyperTransport, now in its third version, is what every AMD processor uses. It’s a big advantage for the platform, and Intel is planning on doing something very similar in a year or two.

Btw, it seems that copper has been doing amazing things lately. The terabit memory bus on the latest video cards is the most incredible, but board-to-board PCIe 2.0 at 80 gigabit, or computer-to-computer InfiniBand at 40 gigabit (the last two both using the new 5 GHz PHYs), are all more powerful than almost anyone needs.
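
For what it’s worth, here’s roughly where that 80 gigabit PCIe figure comes from, assuming a x16 slot and the 8b/10b line coding PCIe 2.0 uses (the lane count is my assumption about which figure is being quoted):

```python
# PCIe 2.0 arithmetic: raw signalling rate per lane times lane count,
# then the 8b/10b coding overhead (8 data bits per 10 bits on the wire).

lanes          = 16       # a x16 slot (assumed)
raw_gtps       = 5        # PCIe 2.0 signalling rate per lane, GT/s
coding_payload = 8 / 10   # 8b/10b: 80% of raw bits carry data

raw_gbps  = lanes * raw_gtps
data_gbps = raw_gbps * coding_payload
print(f"x16 raw:     {raw_gbps} Gbit/s")
print(f"x16 payload: {data_gbps:.0f} Gbit/s after 8b/10b")
```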

We really have optimized copper thoroughly. But that just means it’ll hurt more when we hit the wall at 15 Gbit/s. You’d think we would be better prepared for optical, yet we barely have prototypes. It’s like copper has been so good to us that we’ve been too lazy to plan ahead.

No slot. I was working on AMD chipset docs. Seems to me an HT slot would be kind of unreliable, but what do I know? I’m pretty amazed by the things folks have done to increase the signal-to-noise ratio over copper: differential signalling, serialized data, even intentionally jittered clocks.
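
Differential signalling in particular is a neat trick when you see it in numbers. A toy illustration (purely illustrative, not modelled on any real PHY):

```python
# The same bit is driven as +v on one wire and -v on the other; noise
# couples onto both wires about equally; subtracting the wires at the
# receiver cancels the shared noise and leaves the signal.
import random

bits = [1, 0, 1, 1, 0, 0, 1, 0]
recovered = []
for b in bits:
    v = 0.5 if b else -0.5             # differential drive amplitude (volts)
    noise = random.uniform(-2.0, 2.0)  # big common-mode noise on the pair
    wire_p = +v + noise                # positive leg
    wire_n = -v + noise                # negative leg, same noise
    recovered.append(1 if (wire_p - wire_n) > 0 else 0)

assert recovered == bits  # the difference is immune to the shared noise
```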

There are a lot of bright people out there, always trying to milk Moore’s law just a little bit longer. So far, whenever they seem doomed to failure, someone thinks up a way to go one step further. Maybe the leap to optical signaling will bring it all to a halt. I wouldn’t want to bet either way.