In reply to the OP,
Don’t create duplicate domains with duplicate content; Google won’t like that. To evaluate keywords, use the Keyword Planner tool (free AdWords account required). That’s Google telling you which keywords are more popular than others, and it’ll recommend additional keywords for you.
Back on your own site, if you want to measure on-page behaviors and conversions (e.g. purchases, signups, etc.) rather than just new visits, you can use the Content Experiments tool inside Analytics. It lets you compare different sets of content against each other and benchmark their performance over time – without incurring a Google penalty.
As for whether you can just design a page for humans and ignore Googlebot’s limitations… a multimillion-dollar industry has grown up trying to answer that question, because Google is deliberately vague and secretive about its algorithms. Much of that industry consists of spammers and scammers, but there are legitimate practices you can use to improve your SEO without hurting real users – and even if these things seem like “common sense” to veteran web designers, they are not always obvious to small businesses and amateur coders.
For example, coding for users often means nice, clickable images that Google can’t read. It might mean organizing things above or below the fold, or making changes based on visual hierarchy and click maps. Google doesn’t really reveal how it weighs factors like that against PageRank, bounce rate, keyword similarity, etc. There’s a lot that does affect Google but isn’t easy to optimize for, because the specifics are Google’s trade secrets. And Googlebot isn’t perfect; at work I once accidentally Googlebombed another company merely by mentioning its brand on the same page as an image with no alt tag. Within a few days, Google was showing that picture (a completely unrelated “click to call us now” phone icon) as that company’s logo in Image Search. It didn’t understand that the two were unrelated despite appearing on the same page. (We eventually fixed it by describing our images properly.)
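Incidentally, the “image with no alt text” problem is easy to catch yourself. Here’s a rough sketch you can paste into the browser console (everything in it is standard DOM API, nothing Google-specific):

```typescript
// Rough sketch: flag images on the current page that carry no usable alt text.
// Run in the browser console, or adapt it for a crawler.
const undescribed = Array.from(document.querySelectorAll<HTMLImageElement>("img"))
  .filter((img) => img.alt.trim() === "");

undescribed.forEach((img) => {
  console.warn("Image with no alt text:", img.src);
});
```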
In other words, Googlebot may be one of the world’s most advanced natural language parsers, but it still prefers clean data that it can easily understand, and that means providing a level of detail and consistency that real humans do not need.
To that end, even if you don’t want to engage in “black hat” SEO practices – and you shouldn’t, because their efficacy is dubious at best and the penalties are severe – you can still follow white-hat practices like the ones Google itself recommends. This usually means adding descriptive information to your content that you otherwise wouldn’t bother adding for normal visitors. Alt tags are the obvious one – even though few of your visitors may be blind – but there are also titles, meta descriptions, H1s, canonicals, URLs with dashes instead of wordsalltypedtogether, sitemaps, structured data for products and organizations, YouTube captions, etc. (see the sketch below). It can also mean using the Keyword Planner tool above to better understand Google’s notion of related keywords and how they’re affected by stemming, pluralization, etc. (Google handles these well, but not perfectly, and the choice between closely related keywords can still matter in competitive industries).
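To make the “describe for Google” part concrete, here’s a minimal sketch of that head metadata as a TypeScript template helper. All the names, URLs, and organization details are made-up placeholders, and a real version would HTML-escape the values; the tags themselves (title, meta description, rel=canonical, JSON-LD structured data) are the standard ones from Google’s own guidelines.

```typescript
// Minimal sketch (not production code): emit descriptive <head> metadata and a
// dash-separated URL slug. All names, URLs, and organization details below are
// placeholders; real values should also be HTML-escaped before interpolation.

interface PageMeta {
  title: string;        // the clickable headline in search results
  description: string;  // often used for the snippet under that headline
  canonicalUrl: string; // tells Google which URL is the "real" one among duplicates
}

// Dashes instead of wordsalltypedtogether: turn a human title into a URL slug.
function slugify(text: string): string {
  return text
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-")
    .replace(/^-+|-+$/g, "");
}

function renderHead(meta: PageMeta): string {
  // schema.org "Organization" structured data as JSON-LD; check Google's
  // structured data documentation for the exact fields it wants.
  const orgJsonLd = {
    "@context": "https://schema.org",
    "@type": "Organization",
    name: "Example Widgets Ltd",              // placeholder
    url: "https://www.example.com/",          // placeholder
    logo: "https://www.example.com/logo.png", // placeholder
  };

  return [
    `<title>${meta.title}</title>`,
    `<meta name="description" content="${meta.description}">`,
    `<link rel="canonical" href="${meta.canonicalUrl}">`,
    `<script type="application/ld+json">${JSON.stringify(orgJsonLd)}</script>`,
  ].join("\n");
}

// Example usage:
console.log(slugify("Hand-Made Widgets & Gears")); // "hand-made-widgets-gears"
console.log(renderHead({
  title: "Hand-made widgets | Example Widgets Ltd",
  description: "A short, human-readable summary of what this page offers.",
  canonicalUrl: "https://www.example.com/widgets/hand-made-widgets-gears/",
}));
```

None of this is magic; it’s just the extra description Googlebot wants that a human visitor would never notice.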
Arguably none of that is as important as “quality content”, but at the end of the day Google is not going to tell you exactly what that constitutes. So you do what you can there, then spend a little time optimizing for Google without hurting your real visitors, and hope for the best. As stated above, these goals are not incompatible, but neither are they identical. It’s best to keep both in mind and explicitly optimize for both.
It basically boils down to “design for humans, describe for Google”.