How Does Google Search for Website Hits?

I am trying to increase the visibility of my website (for free). One tip is to make my webpage titles non-generic. From what I can tell, this involves changing common webpage labels (like “Home”, “Product Info”, “Contact Us”, etc.) into unique labels that describe my product. The problem with this advice is that it asks my labels to serve two masters: guiding my visitors around the site and feeding Google search terms.
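For what it’s worth, here is roughly the kind of change being suggested (a made-up example; the product and town are hypothetical):

```html
<!-- Generic title: conventional, but tells a search-results page almost nothing -->
<title>Home</title>

<!-- Descriptive title: unique, and contains terms people might actually search for -->
<title>Acme Widgets - Hand-Made Copper Widgets in Springfield</title>
```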

However, this does not work for me. I have kept my webpage labels generic (and conventional) to facilitate website navigation. Yet if I keep them that way, Google won’t find my webpages, and/or my website won’t stand out in a list of hits. I guess this is what I get for going with FREE promotions, huh?

My questions boil down to:
a) Why would the bright people at Google think webpage titles are a good term upon which to search?
b) Is there a way around this? Surely other keywords found elsewhere on my site would be ideal for Google to pick up. How can I make that happen?

Surely there are other things Google searches on. Or does Google merely go for the lowest-hanging fruit, so to speak?

Your thoughts?

Open a Google Webmaster Tools account and submit your site directly to Google using a sitemap.
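A sitemap is just an XML file that lists the pages on your site. A minimal one looks something like this (the URLs and dates are placeholders, obviously):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want Google to know about -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-01-15</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/products.html</loc>
    <lastmod>2012-01-10</lastmod>
  </url>
</urlset>
```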

If this is true, are you saying all websites get run through Google somehow, whether directly or indirectly? That does not seem right. Google must do more than simply search webpage names. Surely Google has some way of searching all parts of a website to return a decent list of related hits, as opposed to a simple word search that would return anything and everything that just so happens to match a search word or two.

There must be more to the story (without needing to understand the inner workings of their algorithms, of course).

Pigeons.

All submitting a sitemap does is let Google know that your site exists and what its structure is. They’ll send a bot your way to check it out, index it, and rank it.

At the most basic level, the way that Google ranks websites is through links: the bots will follow links to find new content and assign a certain value to it. If there are many links to a page or site, then that boosts the credibility of the site, but that is very far from the only criterion involved. There’s an entire industry that exists purely to optimize web content to figure prominently in Google search results.
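That link-following idea is the heart of Google’s original PageRank algorithm, published back in 1998: a page’s importance is, roughly, the importance of the pages linking to it, split among their outgoing links. Here is a toy sketch of that published idea only (the link graph is invented, and the real Google layers hundreds of other signals on top):

```python
# Toy PageRank: each page's score is spread among the pages it links to.
# This is just the 1998 published idea, not what Google actually runs today.

links = {  # made-up link graph: page -> pages it links to
    "home": ["products", "contact"],
    "products": ["home"],
    "contact": ["home"],
    "blog": ["home", "products"],
}

damping = 0.85                      # standard damping factor from the paper
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):                 # iterate until the scores settle
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outgoing in links.items():
        for target in outgoing:
            new_rank[target] += damping * rank[page] / len(outgoing)
    rank = new_rank

# "home" ends up ranked highest simply because the most pages link to it.
print(sorted(rank.items(), key=lambda kv: -kv[1]))
```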

The details of the various algorithms are a closely held secret, of course, but the key is to have relevant, unique, and fresh content. Depending on your site, this might actually be geographically unique content, e.g. plumbing services in Remoteville, Middleofnowhere County. If your site does something that loads of other people are already doing, then standing out is difficult.

As an example, the boards here have Google bots crawling constantly - it’s new content, at a decent lexical level, linked many, many times, on a well-known domain. Try an experiment with an unusual grouping of words on here, like beige sauerkraut speedos.

That’ll probably be found on a search by the time I wake up tomorrow.