Google is finding old pages on my wiki. Is there a way to stop that?

When I do this Google search, the first two results Google returns link to old revisions of that wiki page. I want it to hit the main article, not the old revisions of that article.

Is there anything I can do about this?

(I am running MediaWiki for that site.)

Figure out the differences in the URLs between old and new pages (they're usually pretty different), and google "robots.txt" to learn how to block pages based on their URL.
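
For reference, on a default MediaWiki install the two kinds of URL look something like this (example.com and the page title here are placeholders):

https://example.com/index.php?title=Some_Page
https://example.com/index.php?title=Some_Page&oldid=12345

The first is the current article; the second, with the "oldid" parameter, is an old revision.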

You might be able to compose a robots.txt that tells Google not to index any URL containing the "oldid" parameter.

It looks like Google's crawler supports wildcards in robots.txt, so something like:

User-agent: *
Disallow: /*oldid=

might do what you want. (The * wildcard is a Google extension rather than part of the original robots.txt standard, but other major crawlers honor it too.)

Do you submit sitemaps to Google?
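
MediaWiki ships with a maintenance script that can generate one; a sketch, assuming your wiki lives at https://example.com and you run this from the wiki's installation directory:

php maintenance/generateSitemap.php --fspath=sitemap/ --server=https://example.com

Submitting the resulting sitemap to Google should at least steer it toward the canonical article URLs rather than the old revisions.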