Is logic an evolving academic discipline? Even in the 21st century?

By “evolving” I mean: are new principles still being added or discovered? As time goes on, most academic fields increase the breadth of our knowledge of them.

For example, let’s take physics. In Isaac Newton’s era, the emphasis was mostly on problems in classical mechanics. Space and time were thought to be fixed and “concrete”. However, Albert Einstein showed that our everyday perceptions of space and time were, in fact, faulty. Time and space turned out to be more malleable than previously thought.

In this way, physics has undergone development. New concepts have replaced old, and traditional theories receive new light (for example from new experimental evidence).

Now I don’t know much concerning the academic discipline of logic. A simple definition IMHO would be to call it the science of reasoning.

But I wonder whether it evolves in a similar way to other fields, namely whether the informational output of the discipline keeps increasing. I know of various logicians through the ages (the triumvirate of Aristotle, Russell, and Gödel), so the field has clearly progressed and changed.

However, to what extent does that occur today? What are the latest challenges (if you will) in the world of higher-order logic? How are the principles of logic changing or being upgraded?
Also, who are the most renowned logicians in the world currently?

Apologies if the question is naive, but I would still like it answered.
Thank you.

Not my field, so I can’t really say how it’s evolving, or who the current “stars” are. But yes, logic is an active academic discipline. If nothing else, there seems to be a decent amount of publishing going on in the field.

Logic involving notions of time, causality, and knowledge is pretty new. A layperson could understand the basics of those, I think. Model theory is still active, but that’s pretty hairy stuff even as higher mathematics goes. The liar paradox is still hot, and probably will be for some time. Non-well-founded set theory (i.e., allowing something like X = {X}) is kinda big.
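For a taste of what a logic of time looks like, here is a minimal sketch (my own illustration, not taken from any particular formal system): formulas are evaluated over a finite trace of states, with an “eventually” operator that holds if the proposition is true at some point in the trace, and an “always” operator that holds if it is true at every point.

```python
# Toy temporal operators over a finite trace of states.

def eventually(prop, trace):
    """True if prop holds in at least one state of the trace."""
    return any(prop(state) for state in trace)

def always(prop, trace):
    """True if prop holds in every state of the trace."""
    return all(prop(state) for state in trace)

trace = [{"door": "closed"}, {"door": "open"}, {"door": "closed"}]
opened = lambda s: s["door"] == "open"

print(eventually(opened, trace))  # True  -- the door opens at some point
print(always(opened, trace))      # False -- but not at every point
```

Real temporal logics add operators like “until” and “next”, but the flavor is the same: truth relative to moments in time rather than absolute.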

I doubt this very much. The subject quickly becomes mired in topos theory.

Classical logic is closely related to set theory: the relevant topos is the category of sets. Equivalently, this is the category of functors from the trivial category to Set. Time-dependent logics are related to categories of functors from total orders to Set. The connection basically goes through the fact that in any topos one can functorially give a Heyting algebra structure to the lattice of subobjects of any given object. In Set, the Heyting algebras are actually Boolean, which gives rise to classical Boolean logic.
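To make the Heyting-algebra point concrete, here is a small sketch (my own illustration): the open sets of a topological space always form a Heyting algebra, with implication a ⇒ b defined as the largest open set U with U ∩ a ⊆ b, and negation ¬a = (a ⇒ ∅). For a three-point space with the chain topology, double negation fails to be the identity, so the algebra is genuinely non-Boolean, in contrast with the Boolean subobject lattices of Set.

```python
# Heyting algebra of opens of the three-point space {1,2,3} with
# topology {∅, {1}, {1,2}, {1,2,3}} (the Alexandrov topology of a chain).

OPENS = [frozenset(), frozenset({1}), frozenset({1, 2}), frozenset({1, 2, 3})]

def implies(a, b):
    """Heyting implication: the largest open U such that U ∩ a ⊆ b."""
    return max((u for u in OPENS if u & a <= b), key=len)

def neg(a):
    """Heyting negation: ¬a = (a ⇒ ∅)."""
    return implies(a, frozenset())

a = frozenset({1})
print(neg(a))       # frozenset() -- the only open set disjoint from {1}
print(neg(neg(a)))  # frozenset({1, 2, 3}) -- so ¬¬a ≠ a: not Boolean
```

Since ¬¬a ≠ a here, the law of excluded middle fails in this algebra, which is exactly the intuitionistic behavior that general topoi exhibit.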

I’m no expert, but there has to be at least a little bit the layperson can understand. It might differ from our notion of the basics, but it is there.

But that brings up two new, actively researched areas of logic: topos and category theories. These are inaccessible to the layperson, but they exist.

The study of infinitely large numbers is another area that’s still active.

Logic is pretty tightly tied to set theory, which is a growing discipline (but I won’t even pretend to know what the open questions there are). It’s also tightly tied to computation theory, where a fundamental question lies unanswered: is checking a proposed answer fundamentally easier than finding one from scratch (the P versus NP problem)?
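A toy illustration of that verify-versus-solve gap, using subset-sum as the example problem (my own choice of illustration): checking a proposed certificate takes one quick sum, while the obvious way to find a solution from scratch tries exponentially many subsets.

```python
# Verifying vs. solving, illustrated with SUBSET-SUM.
from itertools import combinations

def verify(nums, target, indices):
    """Checking a proposed certificate is cheap: sum a few entries."""
    return sum(nums[i] for i in indices) == target

def solve(nums, target):
    """Finding a solution from scratch: brute force over all 2^n subsets."""
    for r in range(len(nums) + 1):
        for idx in combinations(range(len(nums)), r):
            if sum(nums[i] for i in idx) == target:
                return idx
    return None

nums = [3, 9, 8, 4, 5, 7]
idx = solve(nums, 15)        # exponential search in the worst case
print(verify(nums, 15, idx)) # True -- but the check itself was instant
```

Whether every problem with fast verification also admits fast solving is precisely what nobody knows.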

There’s algorithmic information theory, but that’s also pretty heady.

Quantum logic is a new discipline. It’s still somewhat basic, but it will probably get very advanced very quickly.

The problem you run into rather quickly is that there are no clear-cut boundaries between the different branches of mathematics and their applications. So it’s hard to say what’s logic and what ain’t.

Well, I think topos theory is relatively inaccessible. Category theory, however, is very accessible. I think the canonical example of a category, though it is hardly ever mentioned in most introductions, is the category of matrices over a given ring, say R. The notion of a multiplication that is only defined for certain pairs of elements is right there for the seeing.

The problem is that category theory was developed through homology theory, and still carries the stigma of “metamathematics”. The alternate term coined by Steenrod and popularized by Lang is, of course, “abstract nonsense”. The category Mat(R) is a great example of why this couldn’t be further from the truth. My own work on the representation theory of categories, extending the theory of modules over rings, goes a long way in this direction.

If there’s a way to explain even the rudiments of category theory to someone with no background in algebra, it’s beyond me.

Try this:

A category is some collection of things with two labels on each one, along with a rule for combining two things into a third, as long as the second label of the first thing matches the first label of the second thing. The first label on the product is the “leftover” first label from the first thing and the second label on the product is the “leftover” second label from the second thing.
{honestly, the “thing” references go down a lot better when using a board}
Now, if a and b can be composed, write their product as ab. If b and c can be composed, then ab and c can be as well, to get (ab)c (remember the labels). We could also pair off b and c first to get a(bc). We’ll require that these two operations give the same result, just like multiplying regular numbers.

As an example, consider the set of all matrices with real entries. If a given matrix is m-by-n, say the first label is m and the second is n. Now if we’re given an n-by-p matrix, we can multiply the two (since the number of columns of the first equals the number of rows in the second) to get an m-by-p matrix. Since this product is associative, this set forms a category.
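The matrix example above can be sketched in a few lines of Python (a toy illustration of the label-matching rule, not a serious library): each “thing” carries its two labels, and the combining rule refuses to fire unless the inner labels match.

```python
# The category Mat(R) as described above: objects are the labels (row and
# column counts), and composition is matrix multiplication, defined only
# when the second label of the first matrix matches the first label of
# the second.

class Mat:
    def __init__(self, rows):
        self.rows = rows                         # list of lists of entries
        self.label = (len(rows), len(rows[0]))   # (first label, second label)

    def compose(self, other):
        m, n = self.label
        n2, p = other.label
        if n != n2:
            raise ValueError("labels don't match: cannot compose")
        return Mat([[sum(self.rows[i][k] * other.rows[k][j] for k in range(n))
                     for j in range(p)]
                    for i in range(m)])

a = Mat([[1, 2, 3]])      # labels (1, 3)
b = Mat([[1], [0], [2]])  # labels (3, 1)
print(a.compose(b).rows)  # [[7]] -- a 1-by-1 matrix with labels (1, 1)
```

Because matrix multiplication is associative, (ab)c and a(bc) agree whenever the labels line up, which is all the category axioms ask for (plus identity matrices, which play the role of the do-nothing elements).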

It’s interesting that something as abstract as category theory is actually being used by mathematical physicists in loop quantum gravity.

Mathocist, can you expand your post a little? It looks interesting. What is the formal definition of a category?

Category theory is a big deal in theoretical computer science. It’s a bit of a hijack from the topic of this thread to get into exactly what a category is, but if you start another thread, you’ll get more discussion than you probably wanted.

Or if you wanted to submit it for a staff report… :smiley:

Heh. That’d probably be about the least-popular staff report ever.

As ultrafilter suggests I’ll leave this for someone to open another thread in GQ.

[massive hijack]
Physics and mathematics are interacting in fascinating ways these days. My half-arsed understanding of recent history is that some time ago physicists (specifically theoretical physicists like Ed Witten and his contemporaries) needed mathematical language that they didn’t have (or just weren’t completely familiar with) to describe things like string theory. So they started making up their own formalisms. Then mathematicians, particularly topologists who had been happily doing their own thing for a while, wandered by and said “Wait a minute…what the heck are you talking about?” Many long conferences ensued, punctuated by statements from the mathematicians like “I get it, you’re talking about a connection!” and “You know, if you use category theory that really cleans up the language in this part here.”

The end result has been lots of fascinating mathematics, like gauge theory derived from recent particle physics, and lots of particle physics using high-falutin’ mathematical concepts such as categories, orbifolds, and groupoids, now that the two camps have started paying attention to one another again.

[/hijack]

So, what about the rising stars of logic? I haven’t yet heard of any renowned logicians (in recent times).

There’s no one on the level of Gödel or Russell. The closest I can think of is Greg Chaitin or Solomon Feferman (although both men are well past the point of being rising stars).