Hey guys, I have a question about SEO. I have a client who is currently the top-ranking page with a decade of SEO behind them. The site is currently 37 pages, and they want to switch to a single landing page. I'm afraid this will destroy their SEO because it'll reindex everything.
Can I just lop off the index.html, replace it with a new one, and link to the other pages in the footer to maintain the indexing?
Anyone have any thoughts on this? Sorry, I know it's not exactly software development; I wasn't sure where else to post it.
> Hey guys, I have a question about SEO. I have a client who is currently the top-ranking page with a decade of SEO behind them. The site is currently 37 pages, and they want to switch to a single landing page. I'm afraid this will destroy their SEO because it'll reindex everything.
Yes, it will. It won't matter what "tricks" you try: search engines will ding the site hard and drop it from their results.
If the client has been around for 10+ years, they should not be acting like a startup. Single-page landing sites are indicative of startups and buzzy fly-by-night website creators. The client's SEM/SMM strategy should be geared toward useful content, not flashy landing pages.
However, if their marketing team has to have its landing page, one could just create a special route in the app (e.g. example.com/campaign_id), then use that URL in their marketing material and correspondence without killing their current traffic and ranking.
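That dedicated-route idea could look something like this, assuming an nginx-served site; the /campaign_id path, document root, and landing.html file name are all hypothetical, and this is a sketch rather than a drop-in config:

```nginx
# Serve the new single-page design only at a dedicated campaign URL,
# leaving the existing indexed pages untouched at their current URLs.
location = /campaign_id {
    root /var/www/example.com;
    # The marketing landing page lives in its own file; nothing else changes.
    try_files /landing.html =404;
}
```

Marketing material would then link straight to example.com/campaign_id, while the long-standing, already-indexed URLs keep ranking as before.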
> Sorry, I know it's not exactly software development; I wasn't sure where else to post it.
Yes it is; it's part of the software life cycle and deployment of a web application. Search engine marketing is a key part of any web application's life cycle and release to production.
I'd also like to add, for readers of this thread: when talking SEO/SEM/SMM, be sure to know the client's current strategy and marketing plan. This usually takes a few meetings with both the business and marketing sides.
Another thing to point out: when building a Single Page Application (SPA) or Progressive Web Application (PWA), there are extra issues to take into account.
Google’s SEO recommendations for Progressive Web Apps
John Mueller of Google provided a detailed update on how Google handles PWAs and JavaScript sites in general back in March 2016.
In John's post to Google+, he emphasizes the following:
- Don't cloak to Googlebot. It's important that the developer uses feature detection and progressive enhancement so all users have access to the content. It's also not a good idea to redirect to an unsupported-browser page.
- Use rel=canonical when serving content from multiple URLs so that you aren't guilty of duplicate content violations.
- Avoid the AJAX-Crawling scheme on new sites.
- Googlebot will not index URLs with "#" in them. Many Progressive Web Apps use the hash symbol in their URL structure, which means search engines will drop everything beyond the "#". The only way around this is to implement a URL structure using traditional SEO rules. This may be tricky for some sites and companies, but it's a necessary step as we move forward.
- Test to see how Googlebot sees the page. You can make use of Google Search Console's Fetch and Render tool to see your site exactly the way Google sees it.
- Make sure required resources aren't blocked by robots.txt.
- Reduce the number of embedded resources in the page (especially the number of JavaScript files required to render the page), since these might not be fully loaded.
- Use an accurate sitemap file to signal any changes to your website when using Accelerated Mobile Pages (AMP).
- Remember that some search engines and web service providers accessing content may not support JavaScript or might support a different subset.
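To make the "#" point concrete: a crawler that ignores fragments sees every hash-routed view as the same URL. Python's standard library shows the split directly; the example.com URL here is made up for illustration:

```python
from urllib.parse import urldefrag

# Googlebot ignores the fragment ("#...") portion of a URL, so two
# "pages" in a hash-routed SPA collapse to the same indexable URL.
full_url = "https://example.com/app#/products/42"
indexable, fragment = urldefrag(full_url)

print(indexable)  # https://example.com/app
print(fragment)   # /products/42
```

Both `example.com/app#/products/42` and `example.com/app#/about` reduce to `example.com/app`, which is why a crawlable site needs real paths instead of hash routes.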
For more information about the SEO implications of Progressive Web Apps, check out this great piece by Pete Wailes: “Introducing Progressive Web Apps: What They Might Mean for Your Website and SEO.”
I'd take a look at their Google Analytics (or other analytics - they are using analytics, right?) and see what Organic Search traffic is currently being driven to internal pages. It is possible that the bulk of their traffic is driven by their Home Page alone; this is common when most of their traffic comes from brand terms.
Direct Traffic (URLs typed in from advertising, print material, etc.) also typically lands on the Home Page; if they advertise using internal URLs, make sure those aren't lost in the redesign.
If the Home Page is the main driver, and:

- They are maintaining the bulk of the content on the Home Page,
- They will still be presenting it in a spider-friendly format (no graphical text, lazy-loaded JS-driven content, etc.),
- They keep other signals intact (same IP, same domain name),
- The quality inbound links to the site point to the Home Page rather than deep links to internal pages (look at Analytics Referral and Landing Pages details),

then the overall impact may be minimal.
If a significant portion of their Organic Search traffic is driven to internal pages which are being removed/significantly downgraded, then they may suffer a ranking/traffic hit.
Again, Analytics can help gauge the impact of this lost traffic.
Saving Old Linking Info
If the site's underlying architecture is changing, such that the URLs of the Home Page (and perhaps internal pages) are changing, set up "301 - permanent" redirects from each old URL to its new one. Google, other search engine spiders, and live users will all be able to follow an old link to the new content.
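As a minimal sketch of those permanent redirects, assuming an Apache server with mod_alias enabled; the file paths and domain are hypothetical:

```apache
# .htaccess - 301 (permanent) redirects from retired URLs to their replacements.
# Each old indexed page should map to the closest equivalent on the new site.
Redirect 301 /old-index.html https://example.com/
Redirect 301 /services.html  https://example.com/services/
```

Note that `Redirect` without an explicit status sends a 302, so spell out the `301`.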
Don't use "302 - temporary" redirects, even if they are easier to set up. Live users won't notice the difference, but 302s don't pass "link juice" and the rankings will suffer.
We Relaunched and Screwed Up - Now What?
You might consider having Google PPC as a backup plan - if the site suffers a dramatic, business-impacting drop in Traffic, a PPC campaign can replace that traffic more quickly than a site rebuild.