7 Tasks In 7 Days: Improving Your SEO
There’s no silver bullet for improving SEO. Fixing critical errors can correct visibility issues, but long-term growth comes from making SEO part of your overall business process. The basics belong in your long-term marketing strategy, something you work to improve over weeks, months and even years.
However, there are a lot of small things you can do that will have a significant long-term impact on your visibility. Here are seven simple steps you can take that can lead to a noticeable improvement in your SEO performance over the next seven days.
Day 1: Claim Your Local Listings
Google Business Profile (GBP) is essential for SEO. It’s a great way to increase visibility while adding legitimacy to your online presence. Adding your information (including a local number), social icons, hours and photos only takes a few minutes. If you haven’t set up GBP yet, consider making it your top priority, as Google sends the confirmation letter to your address to verify that you’re listing an accurate location.
Once your location is confirmed, Google will be more likely to show your website and address to customers doing local searches.
GBP also lets your customers create content for you through reviews, so new customers can learn about your brand by reading what others have said. While you’re setting up GBP, make sure you also claim your Yelp, Yellow Pages and Urbanspoon profiles, if needed.
Claiming local listings is important even for companies that exist only online. Think of them as another way to present your products and customer service to potential new clients.
Day 2: Scan and Repair Broken Links
Internal and external broken links can damage your SEO by affecting the usability of your website. Users who get frustrated by broken links are likely to bounce.
Search engine spiders treat broken links as stop signs that limit their progress when crawling your site. If a crawler discovers too many broken links, it may abandon your site, leaving valuable content out of the index.
If you haven’t scanned your website for broken links in a while, this might turn into a big project. Over time, old content, deleted pages and design changes can affect your links and make them stale. Even if you’re on top of your pages, you might have content that links to another website that no longer exists.
A good place to start hunting down old links is the 404 errors report in Google Search Console. While this is not a complete list, it will show you the broken links the search engine has already discovered. After fixing these errors, schedule some time regularly to go through links on your site and check them manually, or look into site-crawler software that can do this automatically.
The good news is that fixing links gets easier if you check them regularly. After this big fix, you just have to scan for problem links weekly (or monthly) to keep the number of errors low.
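If you want to script the check yourself, a minimal link checker only needs the standard library. This is a sketch, not a full crawler: it extracts the links from one page and reports each link’s HTTP status, and the crawl loop over your whole site is left out.

```python
# Minimal broken-link checker sketch using only the standard library.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(base_url, html):
    """Return all links on the page, resolved to absolute URLs."""
    parser = LinkExtractor()
    parser.feed(html)
    return [urljoin(base_url, href) for href in parser.links]

def check_link(url, timeout=10):
    """Return the HTTP status code for url, or None if the request failed."""
    try:
        req = Request(url, method="HEAD", headers={"User-Agent": "link-check"})
        return urlopen(req, timeout=timeout).status
    except HTTPError as err:
        return err.code  # 404, 410, etc. -- these are your broken links
    except URLError:
        return None  # DNS failure, timeout, refused connection

# Example usage (run against your own page):
#   links = extract_links("https://example.com/", page_html)
#   broken = [u for u in links if check_link(u) in (None, 404, 410)]
```

A HEAD request keeps the scan fast since you only need the status code, not the page body; a few servers reject HEAD, so fall back to GET for any link that fails.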
Day 3: Scan for Duplicate Content
When search engines read content, they’ll assign a value to it based on the information it provides. The more informative the content is, the more likely it is to perform well in search.
If the same content exists on multiple pages, however, search engines split that value among the copies, forcing those pages to compete with one another and making it less likely that any of them will appear in search.
Removing duplicate pages will consolidate the value and lead to improved organic visibility. Tools like Screaming Frog or Siteliner automate the discovery of duplicate pages, but be sure to click around on the site on your own to see if you can find any technical problems causing duplicates.
Once you’ve identified the duplicate content, it only takes a few minutes to set up canonical tags or 301 redirects that tell search engine crawlers which version to index, so the duplicates stop diluting your rankings.
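The core of what tools like Siteliner do can be sketched in a few lines of Python: pages whose normalized text is identical get grouped under the same fingerprint. This assumes you already have each page’s text in hand (the crawling step is omitted), and exact-match hashing only catches true duplicates, not near-duplicates.

```python
# Duplicate-content detection sketch: group pages by a content fingerprint.
import hashlib
import re
from collections import defaultdict

def fingerprint(text):
    """Hash the text with whitespace runs and letter case normalized away."""
    normalized = re.sub(r"\s+", " ", text).strip().lower()
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

def find_duplicates(pages):
    """pages: dict mapping URL -> page text. Returns groups of duplicate URLs."""
    groups = defaultdict(list)
    for url, text in pages.items():
        groups[fingerprint(text)].append(url)
    return [sorted(urls) for urls in groups.values() if len(urls) > 1]
```

Each group this returns is a candidate for a canonical tag: pick one URL as the primary version and point the others at it.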
Day 4: Add an XML Sitemap to Your robots.txt File
Step four involves two separate parts, but you should be able to complete them within the same day without an issue.
First, make sure you have a robots.txt file. This file tells search engines which pages they’re allowed to crawl and which they should avoid. Without one, search engines will try to crawl every page on your site. That can be a problem for eCommerce sites, which use sophisticated search and filter options to help customers find the products they want.
If you don’t want search engines to access your site search or filter options, you can disallow them in your robots.txt. If you already have a robots.txt file, the next step is to add your XML sitemap to it.
An XML sitemap lists the pages you want search engines to crawl. While Google and Bing let you submit the sitemap in their webmaster tools, placing a link to it in your robots.txt helps ensure it’s found every time a search engine visits your site.
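Putting both pieces together, a robots.txt along these lines covers the common case. The disallowed paths and the sitemap URL here are examples only; substitute the ones your site actually uses (note that the `*` wildcard in paths is supported by Google and Bing, though it isn’t part of the original robots.txt standard):

```text
# Allow all crawlers, but keep them out of internal search and filter URLs
User-agent: *
Disallow: /search
Disallow: /*?filter=

# Point crawlers at the sitemap (example URL - use your own domain)
Sitemap: https://www.example.com/sitemap.xml
```

The file must live at the root of your domain (e.g. `https://www.example.com/robots.txt`) for crawlers to find it.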
Day 5: Build Some Natural Links
When deciding what websites to show their customers, search engines have to rank the websites by how likely it is that someone will find the information they’re looking for on that page. They determine this by looking at how many other high-quality websites link to that content.
Link building is a term that has a lot of negative history associated with it. In the past, you could “trick” Google into ranking you first for almost any term if you bought the links to do it. Thankfully, those days are gone, but links are still one of the most important factors Google uses when choosing how to rank your site.
One natural way to ask for links is by reaching out to your partners, vendors and any organizations you work with and asking them to link back to your site. If you sell their products, they should be happy to direct customers looking to buy their products to you.
After getting links from businesses you work with, keep an eye out for ways to get your company mentioned in the paper or on websites that report on your industry. The days of buying hundreds of links to rank might be over, but successful businesses continually look for ways they can build high-quality, relevant links.
Day 6: Check for Poorly Optimized Pages
After running your website for a few years, your knowledge of SEO is likely far better than when you started. However, chances are those pages you wrote when you were new to optimization are still hanging around, and their lower quality can harm your site’s overall authority.
Basic SEO steps you follow now (like optimizing images, linking and tagging) might have seemed foreign back then.
This project won’t be finished overnight, especially if your weakest pages have thin content. But you can identify some easy wins – like adding heading tags and working in specific keywords – that boost a page’s value. Refreshing old content also signals to search engines that you care about what users will find when they land on that page, which can improve how your site performs overall.
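To find the weakest pages quickly, you can script a basic on-page audit. This sketch flags a page that is missing a title tag, a meta description, or an H1 heading – three of the basics mentioned above. It checks only for presence, not quality, so treat it as a triage tool rather than a full audit.

```python
# Basic on-page audit sketch: flag pages missing <title>, meta description, or <h1>.
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.has_title = False
        self.has_meta_description = False
        self.has_h1 = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.has_title = True
        elif tag == "h1":
            self.has_h1 = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.has_meta_description = True

def audit(html):
    """Return a list of missing on-page basics for the given HTML."""
    parser = OnPageAudit()
    parser.feed(html)
    problems = []
    if not parser.has_title:
        problems.append("missing <title>")
    if not parser.has_meta_description:
        problems.append("missing meta description")
    if not parser.has_h1:
        problems.append("missing <h1>")
    return problems
```

Run it over your oldest pages first; anything it flags is usually a quick fix with an outsized payoff.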
Day 7: Look for Ways to Increase Your Site Speed
A slow website is frustrating to everyone. If your pages take more than a few seconds to load, customers will likely leave and find a faster site, meaning fewer sales for you. A high bounce rate also signals that your website doesn’t have the information customers want. As a result, search engines will show your competitors to potential customers instead of you.
Google’s PageSpeed Insights is a great tool for identifying what’s slowing down your user experience, since it suggests how to fix the problems it discovers.
After completing these seven steps, you’ll have a website that’s a lot closer to aligning with Google’s best practices and one that provides a better experience for your customers.