You’re about to get the strategy behind one of the most challenging SEO campaigns my SEO agency has ever run.

Why was it so challenging? 3 reasons:

First, the niche is massively competitive: a make-money-online infoproduct in the financial niche. Nuff said.

Second, we only had 5 months to pull this off.

Third, just like any other client, they were extremely hungry for results and demanded quality work.

In the case study below, you’re going to learn the technical playbook, the onsite content strategy, and the link building techniques we carried out to get this 45.99% revenue growth win for this infoproduct business.

The Case Study

Our client takes advantage of the wide reach of the interwebs to teach his students how to earn money trading online. We’re talking currencies, forex, stock markets, crypto, etc. The business’ revenue is generated solely through the sale of digital download products – in this case, trading guides in ebook format and video trading courses.

When the owner of this profitable business (which had already built some authority in the niche) approached The Search Initiative (TSI) about helping to grow their organic reach and find new students, we were excited to take on the challenge in one of the most competitive spaces there is.

To accomplish this, the game plan was to focus hard on a quick-win strategy, while setting the stage for long-term gains post-campaign. Our strategists were certain that the value we could provide would have a considerable impact on his business’ bottom line. How? Because…

Over the course of the campaign, our technically-focused SEO strategies grew organic traffic by 23.46%. But what did the most for the client’s business was the 45.99% increase in the number of conversions, comparing the first and last months of the campaign. Sales went up from just over 2,100 a month to 3,095 – this really bumped their monetization.

And we did it in time.
These gains were achieved within only 5 months of the client signing with TSI and our team starting the campaign. Here’s how we did it…

The SEO Playbook for Infoproduct Websites

Phase 1: A Comprehensive Technical Audit

I’ve said this in every TSI case study we’ve published so far… and I simply cannot emphasize it enough: before you begin any campaign, always start with a full technical audit.

Starting with…

Page Speed

First, our technical SEO strategists started at the bottom of the client’s tech stack… and you should too. This starts with digging into the web server’s configuration and running a series of tests to measure the site’s speed. This ensures that the performance of the web server itself isn’t causing a penalty or disadvantage on either desktop or mobile connections.

So, what tests do we run?

- PageSpeed Insights (PSI) – this should be everyone’s go-to tool and shouldn’t need an explanation.
- GTmetrix – it’s good to cross-check PSI’s results, so we use at least one other tool. In reality, we use GTmetrix together with Dareboost, Uptrends, and Webpagetest.
- HTTP/2 Test – HTTP/2 is becoming a standard and can greatly improve your page speed, so it’s definitely worth looking into. If you’re not HTTP/2 enabled, you might want to think about changing your server or using a CDN that supports it.
- Performance Test – I know it might sound like overkill, but we included this in our test suite earlier this year and use it for sites that can expect higher concurrent traffic. We’re not even talking Amazon-level traffic, but say you might get a thousand users on your site at once. What will happen? Will the server handle it or go apeshit? If this test shows you a steady response time of under 80ms – you’re good. But remember – the lower the response time, the better!
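To make the concurrency idea concrete, here’s a minimal self-contained sketch of such a performance test in Python – it spins up a throwaway local server and measures response times under 50 parallel clients. The server, worker count, and request count are illustrative only; they are not the tools or numbers TSI used.

```python
# Minimal concurrent-load smoke test (illustrative, stdlib only).
import http.server
import statistics
import threading
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):
        pass  # silence per-request logging

def timed_get(url):
    """Fetch the URL and return the response time in milliseconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()
        assert resp.status == 200
    return (time.perf_counter() - start) * 1000

# Throwaway local server standing in for the site under test.
server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/"

# 200 requests fired by 50 concurrent clients.
with ThreadPoolExecutor(max_workers=50) as pool:
    timings = list(pool.map(timed_get, [url] * 200))

print(f"median response: {statistics.median(timings):.1f} ms")
server.shutdown()
```

Against a real site you’d point the same loop at your production URL (gently!) or, better, use a dedicated load-testing service, and watch whether the response time stays flat as concurrency grows.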
In cases where transfer speeds or latency are too high, we advise you (and our clients) to consider migrating to faster servers, upgrading to better hosting or, better yet, re-platforming to a CDN. Luckily, most of the time you can achieve most of the gains through WPRocket optimization, as was the case with this client.

Your Golden WPRocket Settings

Cache → Enable caching for mobile devices
This option should always be on. It ensures that your mobile users are also served a cached version of your site.

Cache → Cache Lifespan
Set it depending on how often you update your site, but we find the sweet spot at around 2-7 days.

File Optimization → Basic Settings
Be careful with the first option – it may break things!

File Optimization → CSS Files
Again, this section is quite tricky and it may break things. My guys switch these options on one-by-one and test that the site still works after enabling each one. Under Fallback critical CSS you should paste your Critical Path CSS, which you can generate with the CriticalCSS site.

File Optimization → Javascript
This section is the most likely to break things, so take extreme care enabling these options! Depending on your theme, you might be able to defer Javascript here. Note that we had to use Safe Mode for jQuery as, without it, our theme stopped working. After playing with the Javascript options, make sure you test your site thoroughly, including all contact forms, sliders, checkout, and user-related functionality.

Media → LazyLoad

Preload → Preload

Preload → Prefetch DNS Requests
The URLs here hugely depend on your theme. You should paste in the domains of the external resources your site is using. Also, when you’re using Cloudflare – make sure to enable the Cloudflare add-on in WPRocket.

Speaking of Cloudflare – we got the final push for the site’s performance by using Cloudflare as the CDN provider (the client sells products worldwide).
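For the Prefetch DNS Requests field, the entries are protocol-relative domains, one per line. As an example only – these are common third-party hosts, not necessarily the ones this client’s theme loaded:

```
//fonts.googleapis.com
//fonts.gstatic.com
//www.google-analytics.com
//cdnjs.cloudflare.com
```

Open your site, check the browser’s network tab for external hosts being requested, and list those domains here.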
If you don’t want to use additional plugins (which I highly recommend), below is a .htaccess code I got from our resident genius and Director of SEO, Rad Paluszak – it’ll do the basic stuff like:

- GZip compression
- Deflate compression
- Expires headers
- Some cache control

So without any WordPress optimization plugins, this code, added at the top of your .htaccess file, will slightly improve your PageSpeed Insights results.

Internal Redirects

You know how it goes – Google says that redirects don’t lose any link juice, but the PageRank formula and tests suggest otherwise (there’s a scientific test run on 41 million .it websites showing that PageRank’s damping factor may vary). Whichever it is, let’s take all necessary precautions in case there is a damping factor and redirects do drop a percentage of their link juice.

As we investigated the configuration of the server, we discovered some misapplied internal redirects, which were very easy to fix but would have a considerable effect on SEO performance – a quick win.

You can test individual URLs with the simple tool httpstatus.io. But checking URL by URL would be the long way, right? So your best bet is to run a Sitebulb crawl, head over to the Redirects section of the crawl, and look at Internal Redirected URLs. There you’ll find a list of all internally redirected URLs; update each of them to point at the last address in its redirect chain. You might need to re-run the crawl multiple times to find all of them. Be relentless!

Google Index Management

Everyone knows that Google crawls and indexes websites. This is the bare foundation of how the search engine works. It visits sites, crawling from one link to another. It does this repetitively to keep the index up-to-date, as well as incrementally, discovering new sites, content, and information. Crawling your site over time, Google sees its changes, learns its structure, and gets to deeper and deeper parts of it.
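Going back to the .htaccess tip above – as a hedged sketch (not Rad’s original snippet), directives covering those four items might look like the following, assuming an Apache server with mod_deflate, mod_expires, and mod_headers enabled:

```apache
# GZip/Deflate compression for text-based assets (mod_deflate)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/plain text/css text/xml
  AddOutputFilterByType DEFLATE application/javascript application/json
  AddOutputFilterByType DEFLATE image/svg+xml
</IfModule>

# Expires headers so browsers cache static files (mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 year"
  ExpiresByType image/png "access plus 1 year"
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>

# Basic cache control for static assets (mod_headers)
<IfModule mod_headers.c>
  <FilesMatch "\.(css|js|jpe?g|png|gif|svg|woff2?)$">
    Header set Cache-Control "public, max-age=2592000"
  </FilesMatch>
</IfModule>
```

The MIME types and lifetimes here are sensible defaults, not the exact values from the campaign – tune them to how often your assets actually change.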
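The redirect-chain cleanup described above – pointing every internal link at the last address in its chain – can be sketched in a few lines of Python. The redirect map below is made up for illustration; in practice you’d export it from your crawler:

```python
# Resolve an internal redirect chain to its final destination.
def final_destination(url, redirects, max_hops=10):
    """Follow `url` through the redirect map to the last address in the chain."""
    seen = set()
    while url in redirects and len(seen) < max_hops:
        if url in seen:
            raise ValueError(f"redirect loop at {url}")
        seen.add(url)
        url = redirects[url]
    return url

# Hypothetical chain: http -> https -> new slug (two hops).
redirects = {
    "http://example.com/old": "https://example.com/old",
    "https://example.com/old": "https://example.com/new",
}

# Every internal link should be updated to point straight at this URL.
print(final_destination("http://example.com/old", redirects))
# → https://example.com/new
```

A URL that isn’t in the map is already final and comes back unchanged, so you can run every internal link through the same function.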
Google stores in its index everything it finds applicable to keep; everything considered useful enough for the users and Google itself. However, it sometimes gets to pages that you’d not want it to keep indexed – for example, pages that accidentally create issues like duplicate or thin content, stuff kept only for logged-in visitors, etc.

Google does its best to distinguish what it should and shouldn’t index, but it may sometimes get it wrong. Now, this is where SEOs come into play. We want to serve Google all the content on a silver platter, so it doesn’t need to algorithmically decide what to index. We clean up what’s already indexed but wasn’t supposed to be. We also prevent pages from being indexed, as well as making sure that important pages are within reach of the crawlers.

I don’t see many sites that get this one right. Why? Most probably because it’s an ongoing job, and site owners and SEOs just forget to perform it every month or so. On the other hand, it’s also not so easy to identify index bloat.

With this campaign, to ensure that Google’s indexation of the site was optimal, we looked at these:

- Site: Search
- Google Search Console
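One simple way to spot index bloat is to diff the URLs you want indexed (your sitemap) against the URLs Google actually reports as indexed (e.g. exported from Search Console or collected from a site: search). A hypothetical helper – the URLs below are invented for illustration:

```python
# Compare intended vs. actual indexation to surface bloat and gaps.
def index_audit(sitemap_urls, indexed_urls):
    """Return URLs indexed by mistake ('bloat') and wanted-but-missing URLs."""
    sitemap, indexed = set(sitemap_urls), set(indexed_urls)
    return {
        "bloat": sorted(indexed - sitemap),    # indexed, but shouldn't be
        "missing": sorted(sitemap - indexed),  # should be indexed, but isn't
    }

report = index_audit(
    sitemap_urls=["https://example.com/", "https://example.com/course/"],
    indexed_urls=["https://example.com/", "https://example.com/?s=test"],
)
print(report)
# → {'bloat': ['https://example.com/?s=test'],
#    'missing': ['https://example.com/course/']}
```

Bloat candidates (internal search results, tag archives, logged-in-only pages) get noindexed or removed; missing pages get checked for crawlability and internal links.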
The post 45.99% Earnings Increase in 5 Months for a Digital Infoproduct [SEO Case Study] first appeared on Diggity Marketing.