When will we stop being able to learn new things?

I almost hijacked another thread, but that’s not nice.

If we’re genetically identical to our ancient ancestors, then we should have the same capacity for abstract thought. Yet the knowledge of mankind has increased (seemingly) exponentially. There’s knowledge and there’s intelligence. We should eventually reach a limit on the knowledge we can keep in our heads, but what about intelligence? Are we going to hit a peak and stop (or dramatically slow down) being able to conceptualize new ideas?

From another perspective, the humans of, say, 10,000 years ago worshipped sticks and ate mud. They could be convinced that mermaids existed and they couldn’t draw with perspective. Was the only difference the quality of their (complete lack of an) educational system?

I think we already have. We use computers to handle simulations and masses of data our brains can’t; we use mathematics to describe concepts we don’t really grasp. I mean, one thing I hear fairly consistently about quantum mechanics, for example, is that nobody truly understands it; we can model it mathematically, but we don’t really understand it; it’s too alien.

We can’t grasp such things, but we plus our technology and techniques can, so far. In essence, we’ve expanded our intelligence by creating devices and techniques that act as a prosthetic extra brain lobe of sorts; we’ve offloaded some of our intellectual tasks onto the external world. And we’ve found ways to mathematically model that which we can’t really grasp in essence.

Until a grue eats us.

Seriously though, I think there will always be new things to learn, so in that regard the answer is never. If you mean whether the human brain has a limit to thought evolution, then that’s a much more difficult question, and one I have no non-WAG answer for, sadly.

An interesting analysis. My only argument against it, I guess, is that those models and such do indeed teach us something we didn’t know yet. Our capacity for absorbing the theoretical results, and more importantly, for using those results to create new and better models, doesn’t seem to be slowing down yet.