After a few weeks of our responsive site not getting indexed, we decided to
try something different.
We have the details in this article:
http://www.fieldexit.com/forum/display?threadid=179
We posted this on LI too, but I know not everyone reads everything there (I
know I don't; I don't care for its UI at all).
In a nutshell, we changed our links to actual hyperlinks using anchor tags
with real hrefs, then used jQuery to stop the browser from following them.
Crawlers don't execute JavaScript, so the jQuery handler never runs for them;
they simply find the links in the anchor tags, follow the hrefs, and index
the pages.
But the site itself will behave like an SPA for real people using modern
browsers.
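A minimal sketch of the idea (the `spa-link` class, the `#content` container,
and the sample URL are hypothetical names for illustration, not our actual
markup):

```html
<!-- Real href for crawlers; jQuery interception for browsers. -->
<a class="spa-link" href="/forum/display?threadid=179">View thread</a>
<div id="content"></div>

<script src="https://code.jquery.com/jquery-3.7.1.min.js"></script>
<script>
  // Crawlers never run this handler, so they follow the plain href and
  // index the target page. Browsers cancel the navigation and load the
  // content in place instead, giving SPA-style behavior.
  $(document).on('click', 'a.spa-link', function (event) {
    event.preventDefault();            // stop the normal full-page load
    var url = $(this).attr('href');    // same URL a crawler would follow
    $('#content').load(url);           // fetch and inject the page body
    history.pushState(null, '', url);  // keep the address bar in sync
  });
</script>
```

Because the href is a normal, crawlable URL, the same link works with
JavaScript disabled, too; the jQuery handler is purely an enhancement.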
One hour after making the updates and resubmitting the site (and a sitemap)
to Google it was already being indexed. This morning it appears to be
completely indexed on Google as well as Bing.
What did I learn from this? Web page design is ahead of web crawling bots.
If your site requires indexing for Google searches, it's best to plan for
that and design for the lowest common denominator.
If your web page is truly an "application" where indexing isn't a need, go
crazy with your new design methods. :)
Brad
www.bvstools.com