Just how advanced is modern technology?

And what is your basis for that, other than the wildest of fantasies?

I don’t think those limits would really be that restrictive. Even if speed is limited to 0.2c, we could traverse the galaxy in about half a million years, so the entire galaxy could be colonized. I’m not sure we could travel to other galaxies at that speed, though.
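The galaxy-crossing figure is easy to sanity-check. A minimal sketch, assuming a Milky Way disc of roughly 100,000 light-years (and ignoring acceleration, deceleration, and stops along the way):

```python
# Rough sanity check of the galaxy-crossing time at 0.2c.
# The 100,000 light-year diameter is an approximate figure, not exact.
galaxy_diameter_ly = 100_000  # approximate diameter of the Milky Way
speed_c = 0.2                 # cruise speed as a fraction of light speed

# At a fraction f of light speed, covering D light-years takes D / f years.
crossing_time_years = galaxy_diameter_ly / speed_c
print(crossing_time_years)    # -> 500000.0, i.e. half a million years
```

So the "half a million years" number follows directly from the assumed disc size.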

Even if controlled fusion power wasn’t realistic, a dyson sphere type device could be possible.

Also, one theory for what’ll happen is we will start moving to a VR environment where we aren’t limited by the laws of physics. Going all over the universe to explore dead planets may lose its luster when we have infinite VR environments to explore at home.

Plus there is a lot of room for improvement in biotechnology. We’ve probably barely scratched the surface of what is possible with biology.

My wildest of fantasies.

Where are you getting these numbers from? You yourself gave 0.2c as a pessimistic limit. Even at that speed, there are a dozen star systems within 50 years travel time. And the principles we already know about will allow faster speeds - laser propelled solar sails for example.
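The "dozen star systems within 50 years" claim can be checked the same way: at a fraction of light speed, the reachable radius is just speed times trip time. A quick sketch:

```python
# Reachable radius for a one-way trip at 0.2c over 50 years of travel time.
speed_c = 0.2     # cruise speed as a fraction of light speed
trip_years = 50   # one-way travel time budget

# Distance in light-years = (fraction of c) * (years of travel).
radius_ly = speed_c * trip_years
print(radius_ly)  # -> 10.0 light-years
```

A 10 light-year radius does indeed contain on the order of a dozen known star systems (Alpha Centauri, Barnard's Star, Sirius, and so on).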

I answered your question: even with the principles we already know about, there are limitless applications, and limitless potential for technological advancement. So what if we can’t make computers out of other semiconductors, and we’ve already hit the limit of feature size for silicon? That just means we find other ways to improve performance. We’ve been seeing some of those technologies already - multiple cores, multiple processors, stacked ICs, pipelining, etc. None of those require new principles to be discovered, they are just ways to make better technology using the same principles.

Most inventions and technological advances don’t require new principles of science to be discovered. It’s all just putting together the same building blocks in new ways, and maybe doing it a little better, cheaper, etc. Think about how we put a man on the Moon - we did it by finding a better way to burn hydrogen with oxygen, and finding a new use for it.

I am sure that most of the things we predict for the future are incorrect.

However, even if we posit that we only solve the known unknowns, and only manage to achieve with technology things we can see are possible in nature, that’s already a world very different from the one that you and I inhabit.

For example I am with you on some of the what-ifs, but on others:

Both these things we know must be possible because we see them in nature. So the question becomes more like “What if humans find the only way to make neural interfaces is to use organic matter, but we decree that if it ain’t copper wires and silicon transistors, we ain’t doin’ it?”

The first thing to say is, yes, it’s possible. Certainly no one can rule out that we are near the top; it’s unknowable, as I said in my previous post.

We can, however, rule out “this is all there is”, because there are plenty of phenomena where we currently have either an incomplete model or no model at all. So we know our picture is not complete.

We suck. In the 1950s, they said we’d have flying cars. I want my flying car damnit!

Honestly, I feel that computing power plateaued about five years ago. The machines I use for my job (I’m a desktop software developer) aren’t appreciably faster these days. There are usually more cores, and lots of GPU resources, but the majority of software (including what I use to write software) is single-threaded and about the same speed in 2019 as in 2014. CPU core speed is stuck, apparently permanently, under 4 GHz and has been since forever.

Advances in silicon only come partially from more advanced process nodes. There is a lot of interesting stuff in packaging, so chips don’t have to go through I/O buffers to talk to each other, and we already build in redundancy so that the entire chip doesn’t have to be functional. That’s standard for memories today, and also used for multi-processor systems. There is also interesting work on new transistor technologies, like multi-gate or stacked-gate transistors. The generic term is “more than Moore”.
I’m dubious about quantum stuff, but I’ve gone through a couple of moves to bigger wafers, and there will surely be more in the future.

Transistor sizing is only one factor in the power of a chip.

The failure of the Concorde was due to the market and economics, not technology. Gold plated Q-tips not selling is not due to the plating process.

When it takes longer to get in and out of an airport than across the ocean with an SST, it means you are focusing on the wrong problem.

That’s because there haven’t been any more flying saucers crashing in New Mexico to reverse-engineer since the 1940s :smiley:

The Ryzen 7 1800X is 150 million times faster than the UNIVAC I. Do you think that there is room for another 150 million times speed increase in CPUs? 15 million? 15 thousand? 150? We are much, much closer to the top than we are to the bottom.

The UNIVAC I (1951) to the IBM 7030 (1961) was a 600x increase in speed over 10 years. We may never see a rapid advance like that again.

The Ryzen 7 is about 250,000 times faster than that IBM 7030, over 56 years. That’s an average of roughly 25% improvement per year. I think we can maintain a rate close to that for a while longer.
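The average yearly rate implied by those numbers can be checked directly: a 250,000x speedup over 56 years means solving (1 + r)^56 = 250,000 for r.

```python
# Annualized improvement implied by a 250,000x speedup over 56 years:
# solve (1 + r)^56 = 250_000 for r, i.e. r = 250_000**(1/56) - 1.
speedup = 250_000
years = 56

annual_rate = speedup ** (1 / years) - 1
print(f"{annual_rate:.1%}")  # -> 24.9%
```

That works out to just under 25% per year, compounded.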

This would imply a fundamental physical limit to information processing per unit something (space+energy?) and that we are near that limit.

Sure there are limits to speed and size, but it’s not clear to me that we are near it (other than for silicon and current architecture of computing).

Maybe tomorrow’s computers will store information and calculate using an electrical gradient, with no wires, controlled by magnetic fields or lasers, etc.
You might respond that the electrical gradient is fantasy. But stating that no new technology or method will be found is also based on limited information, unless you have the physics proof that shows we are at the limit of any conceivable technology.

I did a little googling, and of course people have studied the physical limits of computation already; here’s a link: https://en.wikipedia.org/wiki/Bremermann%27s_limit

It states the limit is about 1.36 × 10^50 bits per second per kilogram.

Let’s generously pretend that today we are at about 1 × 10^15 bits per second per kilogram.
Seems like a lot of room for growth, even if we don’t know how it will happen.
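The headroom is easy to quantify. A quick sketch, using the Bremermann figure from the Wikipedia article and the (assumed, not measured) 10^15 bits/s/kg estimate from the post above:

```python
import math

# Bremermann's limit vs. a generous guess at today's computing density.
# The 1e15 figure is a rough stand-in from the discussion, not a measurement.
bremermann_limit = 1.36e50  # bits per second per kilogram
today_estimate = 1e15       # assumed current level, bits per second per kilogram

headroom = bremermann_limit / today_estimate
print(f"{headroom:.2e}")         # -> 1.36e+35
print(math.log10(headroom))      # about 35.1 orders of magnitude of room
```

Even if the 10^15 guess is off by several orders of magnitude in either direction, the gap to the physical limit remains enormous.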

In 1978 or 79 my PhD adviser put together a workshop on the 1980s, and did a special issue of IEEE Computer on it. There were a bunch of IBMers there who talked about bigger and better mainframes. Adam Osborne (this was before his computer) was there also, and he understood personal computing.

You’re making the same mistake as the IBMers made. Computing power in one box (or chip) is unimportant. Computing power in the environment is. You can figure out how much more powerful your laptop is versus the first PC you owned, but the amount of computing power in your house is probably twice that (or more) when you take into account your tablet, cell, microwave, washer and dryer, TV, etc. etc. (And light bulb, perhaps.)
Fab developments don’t only enable bleeding edge nodes which are very expensive. (Trust me, I know how expensive.) They make more powerful chips that lag behind a lot cheaper. That is driving modern technology.
For example (and I should apply for a patent on this) today you can send a greeting card with a chip that plays a song. In ten years you might send a greeting card with a little screen that can open up a Skype connection to connect the recipient to the sender for a face to face greeting.
Moore’s Law will eventually run out of steam for traditional silicon, but Moore’s Law doesn’t apply only to silicon, and can be traced way back in time. I see no reason why it won’t continue into the future, just not in silicon.

Do you think that we will ever have self-aware human-level AI greeting cards with the potential to rebel against their creators?

Yes, and when will we elect the first greeting-card President? Is it that far a stretch from what we have now?

Here’s another Wikipedia article that details the limits of computing:

https://en.wikipedia.org/wiki/Limits_of_computation

It mentions some highly speculative ideas like using a black hole as a data storage or computing device.

I think some popular science fiction (particularly Star Trek and Star Wars) may have made a lot of people overly optimistic about what is actually possible within the confines of actual science. Faster-than-light travel and teleportation are most likely impossible (or highly improbable); handheld phasers are not likely either. Not to mention time travel.

Here’s a good book that discusses some of this:

https://en.wikipedia.org/wiki/Physics_of_the_Impossible

Darren Garrison, just because you cannot conceive of the path to a certain outcome doesn’t mean it won’t happen. You may well be correct that CPUs have much less far to go than they’ve already come in growth of speed and power. All that means is that CPUs probably aren’t the way significant changes will come about.
Of course, musings about technology in the (far) future are inherently idle: for anything more than 50 or 100 years away, it is unlikely in the extreme that we would be able to predict both the outcome and the path by which it arrived. But your stance is not dissimilar to a 14th-century logistics expert stating confidently that there are diminishing returns to putting more horses before a cart: at some point an extra horse won’t noticeably increase the speed of the cart, and therefore moving a box of vellum from Rome to Paris will never take less than 4 days.
Acknowledging that we cannot possibly know the path certain advancements will take is not the same as believing in some sort of techno-magic that says if we can think of it, it will somehow happen. It merely means that not being able to articulate how we get there from here, i.e. how it logically and predictably develops from the current state, is not nearly as strong an argument for dismissing the idea as you seem to think it is.