SEO question about domains

Hi, I used to do a little SEO about a decade ago but have lost track. I really don’t want to offend the Google Gods with my new business but also want to do what I can legitimately … here’s the issue:

People search for the service I will be providing mainly using 3 keyword phrases:

‘X test’
‘Y test’
‘Z test’

The one site I have now is optimised for ‘X test’. However, if they search for Y test or Z test, even though it’s the same service, they won’t see my site. What I was minded to do was buy three domains:

www.X-test.London
www.Y-test.London
www.Z-test.London

Each domain would have exactly the same content except for the three keyword phrases (as reflected in the domain names). So, on the second site, I would replace the keyword phrase ‘X test’ with ‘Y test’, and so on.

Would anyone know if that is a legit way to go these days?

Without doing anything else SEO-wise, this would limit you to those search terms, which I don’t think is a good idea. In general, when you hire someone to do SEM (Search Engine Marketing) for you, the first thing they do is an analysis of what people are searching for as it pertains to your product and service. Then they build the SEO into the website around that for organic searches. You want organic searches to find your website.

Quoted for truth.

Far too many people, including those who should know better, view SEO as a magical additive to web sites - just graft it on and watch your traffic zoom.

SEO is as separate from web design and content development as safety and fuel economy are from a new car design.

It is a huge mistake to think SEO can be “added in” or to use most kinds of subterfuge to fool Ma Google. The rules they want you to play by are pretty plain, and the penalties for trying to trick them or use black-hat techniques can be pretty dire.

That’s why I’m asking. I don’t want subterfuge, I would like people who use other terms (Y test and Z test) to find the site, which they won’t currently do as the site is built around content using the keyword phrase ‘X test’.

You need to build content around those keywords. It should be real articles and real content. Don’t do what some have done, which is pump a bunch of keywords into their pages as lists, because Google will punish you for that.
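To make that concrete, here’s a rough sketch of what “one site, one real page per phrase” could look like structurally. Everything in it (slugs, titles, summaries, the clinic name) is a made-up placeholder – the actual articles still have to be written as genuinely different content:

```python
# Hypothetical sketch only: one domain, one distinct page per keyword phrase,
# instead of three near-duplicate domains. All data here is placeholder.

def slugify(phrase: str) -> str:
    """Turn a keyword phrase into a dash-separated URL slug."""
    return "-".join(phrase.lower().split())

pages = [
    {"phrase": "X test", "summary": "What an X test involves, who needs one, pricing."},
    {"phrase": "Y test", "summary": "How a Y test differs, how to prepare, turnaround time."},
    {"phrase": "Z test", "summary": "When a Z test is the right choice, common questions."},
]

for page in pages:
    slug = slugify(page["phrase"])
    print(f"/{slug}/")
    print(f"  title: {page['phrase']} in London | Example Clinic")
    print(f"  h1:    {page['phrase']}")
    print(f"  body:  {page['summary']}")
```

The point is that each page earns its own ranking with its own content on the same domain, rather than cloning the whole site three times.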

Yes. Google Search works on content, and it’s long since left behind simple keyword counting and the like. I can’t quite visualize what you’re trying to accomplish with three parallel sites differentiated by one term or whatever, but I am pretty sure no combination of tricks and foolery will accomplish much. If you really want people to find one set of information referenced by three different “facets” or “portals,” I don’t think your approach is going to work out.

Like I explain to my clients, SEO involves building two websites in one - one for the user and one for the search engine.

No offense intended, but this is a complete waste of time and the fact that you’re considering it kinda speaks to how far behind the times you are in SEO.

The reason this won’t work is that Google is all about content and identical content has no value. Your duplicate sites will never rank worth anything.

Don’t forget that Google is one of the world leaders in Artificial Intelligence and their search engine algorithms are way past the idea of keywords. Nowadays Google understands content way beyond keyword matching; Google doesn’t give a rat’s ass about keywords. Google cares most about user experience. Google wants their users to find what they’re looking for - that’s what makes people keep using the service… because it actually works.

So if they decide that your site might be relevant for certain searches (based on its content), they’ll send you some traffic, and if your site engages that traffic and those visitors actually stick around instead of bouncing right back to Google, then Google will start to believe that you’re giving people what they want, and they will send you more traffic.

IMHO, forget “keyword optimization” and work on user experience optimization.

If the search engine you’re optimizing for requires a completely different website from the one your users are actually using, then that search engine won’t give users what they want for most sites. Which in turn means that most users aren’t going to bother using that search engine, and your SEO will be useless.

Okay, that’s great guys - things have certainly moved on! Thanks.

Hmm. That’s one way to look at it, I guess. IMHO/IME, a careful architect can build one site that serves both. I guess for some needs the human site might need to be contra-SEO in organization and visible content, so the shadow site notion is useful.

Exactly. If I make a beautiful graphic that says Cad.Com, the search engine will never see the text, so if I want ‘Cad.Com’ to be found I need to include it as text. Normally this is not an issue, but with people doing WordPress and thinking they are actually writing a website, it could be a huge issue.

That’s not creating a page for the users and a page for the search engines; that’s just creating a page for the users. Not all users have connections fast enough to see images immediately when a page loads. Not all users have web browsers that display images. Not all users can even see.

In reply to the OP,

Don’t create duplicate domains with duplicate content; Google won’t like that. To evaluate keywords, use the Keyword Planner tool (free AdWords account required). That’s Google telling you which keywords are more popular than others, and it’ll recommend additional keywords for you.

Back on your own site, if you want to measure on-page behaviors and conversions (e.g. purchases, signups, etc.) rather than just new visits, you can use the Content Experiments tool inside Analytics. It lets you compare different sets of content against each other and benchmark their performance over time – without incurring a Google penalty.


As for whether you can just design a page for humans and ignore Googlebot’s limitations… a multimillion-dollar industry has grown up trying to answer that question, because Google is deliberately vague and secretive about its algorithms. Much of this industry consists of spammers and scammers, but there are legitimate practices you can use to help improve your SEO without hurting real users – and even if these things seem “common sense” to veteran web designers, they are not always obvious to small businesses and amateur coders.

For example, coding for users often means nice, clickable images that Google can’t read. It might mean organizing things above or below the fold, and making changes based on visual hierarchy and click maps. Google doesn’t really reveal how it weighs things like that vs. PageRank, bounce rate, keyword similarity, etc. There’s a lot out there that does affect Google but is not easy to optimize for, because the specifics are Google’s trade secrets. And Googlebot isn’t perfect; at work I’ve accidentally Googlebombed other companies merely by mentioning their brand on the same page as an image with no alt tag. In a few days, Google started showing that picture (a completely unrelated “click to call us now” phone icon) as that other company’s logo in Image Search. It didn’t understand that the two things were unrelated despite being on the same page. (We eventually fixed it by better describing our images.)
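For what it’s worth, checking for that sort of problem is easy to script. Here’s a rough sketch (standard-library Python, nothing fancy) that flags img tags with no usable alt text; the filename is just a placeholder for whatever page you want to audit:

```python
# Rough audit sketch: list images that have no usable alt text, i.e. the kind
# of thing Googlebot is left to guess at. "page.html" is a placeholder.
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []  # src values of images lacking alt text

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        if not (attrs.get("alt") or "").strip():
            self.missing.append(attrs.get("src", "(no src)"))

with open("page.html", encoding="utf-8") as f:
    audit = AltAudit()
    audit.feed(f.read())

for src in audit.missing:
    print(f"Missing alt text: {src}")
```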

In other words, Googlebot may be one of the world’s most advanced natural language parsers, but it still prefers clean data that it can easily understand, and that means providing a level of detail and consistency that real humans do not need.

To that end, even if you don’t want to engage in “black hat” SEO practices – and you shouldn’t, because their efficacy is dubious at best and the penalties are very severe – you can still engage in white-hat SEO practices like the ones recommended by Google. This usually involves adding additional descriptive information to content that you otherwise would not bother adding for normal visitors. Alt tags are the obvious one – even if very few of your visitors might be blind – but there are also titles, meta descriptions, H1s, canonicals, URLs, using dashes instead of wordsalltypedtogether, sitemaps, product and organizational structured metadata, YouTube captions, etc. It can also involve things like using the Keyword Planner tool above to better understand Google’s concepts of related keywords and how they are affected by stemming, pluralization, etc. (Google is good at this, but not perfect, and different keywords can still have an impact in competitive industries).
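To make a couple of those concrete, here’s a rough sketch of the kind of “describe it for Google” extras I mean, generated for one hypothetical page – the domain, copy, and clinic name are all made up:

```python
# Sketch of the extra description Googlebot likes but most visitors never notice.
# All values here are hypothetical placeholders.
PAGE = {
    "phrase": "Y test",
    "url": "https://www.example.com/y-test/",  # dashes, not wordsalltypedtogether
    "description": "Book a Y test in London. Same-day appointments, results within 24 hours.",
}

head = f"""<head>
  <title>{PAGE['phrase']} in London | Example Clinic</title>
  <meta name="description" content="{PAGE['description']}">
  <link rel="canonical" href="{PAGE['url']}">
</head>"""

print(head)
print(f"<h1>{PAGE['phrase']}</h1>")
```

None of that changes what a human visitor sees on the page, but it gives the crawler the clean, consistent detail it prefers.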

Arguably none of that is as important as “quality content”, but at the end of the day Google is not going to tell you exactly what that constitutes, so you do what you can there, then spend a little time optimizing for Google without hurting your real visitors, and hope for the best. As stated above, these are not mutually incompatible goals, but neither are they identical goals. It’s best to keep both in mind and explicitly optimize for both.

It basically boils down to “design for humans, describe for Google”.

Thanks for taking the time to explain so much. I guess I was looking for a kind of philosophy about this as much as a coding answer. The way you summarise it seems to nail a mindset and that’s really helpful. Thanks again :slight_smile: