Top Ten Tips: SEO Edition #6

SEO Tip 6

Good things come to those who wait: a great Heinz slogan for ketchup, but a horrible mantra for SEO. While patience is a virtue, if you’re employing it in relation to SEO rankings, you might want to rethink your course of action. Google, Yahoo, and Bing regularly crawl the entire internet looking for new content, and most website owners seem content to wait their turn, but there is a faster way to get your website noticed than a periodic site crawl. It’s time to introduce you to webmaster tools.

Webmaster Tools

Every major search engine company provides a set of tools to help owners manage the performance of their websites. Google’s is the most robust because it offers more tools than any other provider, but the others are worth your time as well. You can find these tools with a quick online search, or you can click the links in this sentence to navigate to Google, Yahoo, Bing, and Yandex’s respective pages.

To be honest, when you get there, it probably won’t be at all what you’re expecting. All of the SEO tips (#10, #9, #8, #7) leading up to this point have been reasonably achievable for novices, but webmaster tools are usually the “make or break” mark for the do-it-yourself crowd. That’s okay, because the page is genuinely designed for developers. Even signing into webmaster tools requires a developer account, so you know you’re officially moving up to the next level when you reach this step.

Google Search Console

Once you’re logged in, you’ll see lots of menus and options. If you’ve made it this far, don’t get distracted; find the area of the tools that lets you request a crawl of your website. In Google’s Webmaster Tools (now Google Search Console), which I’m sure is the one most readers care about, that would be the “URL Inspection” tool. But requesting that a search engine crawl your website is only one-third of the process of correctly indexing your site.

So What’s Next?

Before clicking around and instructing Google’s crawler to do its job, like it’s a college student on their first internship, you’ll want to have a robots.txt file and a sitemap ready for upload. By now, you’re probably noticing that the work required to manage your site’s SEO is adding up to a lot of hours. This is as good a time as any to think about reaching out to a professional for help because, spoiler alert, things are about to get a bit trickier.

Next Exit

Let’s start with the sitemap, a file that does precisely what its name suggests: it provides a machine-readable version of the site layout, so search engines know what to expect when they visit. Developers often create these files because they require proper XML formatting. After crawling a site, the search engine reconciles what the sitemap said should be there against what it actually found.
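To give you a sense of what a sitemap looks like, here’s a minimal sketch: an XML file listing each URL you want indexed. The example.com domain and the page paths are placeholders; yours would be your own pages.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want search engines to index -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2019-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>https://www.example.com/services</loc>
    <lastmod>2019-01-10</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

Once the file is uploaded to your site’s root (typically as /sitemap.xml), you can submit its address in Search Console’s Sitemaps section so Google knows where to find it.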

The robots.txt file is complementary to the sitemap mentioned above. It provides a set of instructions to the search engine, telling it what it can, or can’t, crawl. This file is especially important for those using WordPress, Wix, Squarespace, and other site-building applications, because those platforms generate a lot of menu and interface pages that aren’t relevant to search engines. In some cases, it may even be a security risk for those pages to appear in results.
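As an illustration, here’s the kind of robots.txt a WordPress site might use. The specific Disallow paths are common WordPress defaults, not a universal prescription; what you block depends entirely on your own site’s structure.

```text
# Apply these rules to all crawlers
User-agent: *
# Keep the admin interface out of search results
Disallow: /wp-admin/
# But allow the one admin file the front end legitimately needs
Allow: /wp-admin/admin-ajax.php

# Point crawlers at your sitemap
Sitemap: https://www.example.com/sitemap.xml
```

The file lives at the root of your domain (e.g. example.com/robots.txt), and crawlers check it before fetching anything else.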

Last Chance to Tap Out

If you’re still reading this and, more importantly, awaiting our next post in the series, it means you’re all in. You’re also halfway home, as there are only five more posts left before you’ve completed all ten tips. If you’re just waiting for the link to contact us, so we can do the heavy lifting for you, it’s right here.

RTR Digital has expertise in advanced SEO and digital solutions, and everyone who fills out the contact form is entitled to a complimentary SEO evaluation.

If you have any questions or comments related to this post, feel free to leave them in the appropriate section of this page.

Top Ten Tips: SEO Edition #10

Top SEO Tips 10

Welcome to the first article in our ten-part series discussing the top 10 Search Engine Optimization tips from 2018 that should govern your approach to ranking your site higher in 2019. I’m not going to assume that you’re familiar with Search Engine Optimization, or SEO as it’s commonly known (if you’re all in on tech acronyms), so I’ll give you a brief overview of what it is and why it’s essential.

“SEO is the practice of making online content more easily readable for search engines.”

SEO comprises many different elements, such as a page’s structure, metadata, and performance. The goal of the service is to balance these three aspects of page design to maximize efficiency without sacrificing too much of the design. There are a lot of companies offering SEO services in today’s marketplace, and that makes it harder for potential customers to distinguish the SEO experts from the SEO imposters.

The goal of this series is to provide education around some of the subtler points of SEO, so when you encounter a so-called “SEO Expert” lacking knowledge of these points, you can determine if the organization is possibly exaggerating their qualifications. So let’s get started.

#10: GOOGLE SEARCH PREFERS CUSTOM BUILT SITES

The first thing you need to know about Google Search, and search engines in general, is that they favor custom-built sites over ones created with web-builder applications. I’m not saying that designing your website with Wix, Squarespace, or WordPress is a nail in the coffin for your search rankings; I’m saying there is a clear preference for one over the other, and I’m going to explain why.

The truth is, a search engine can’t distinguish between a site built with a web-building application and one coded by a developer. What search providers can consider, however, is how long a website takes to load. While not all developers write perfectly efficient code, even sites created by developers on the lower end of the efficiency spectrum tend to load faster than those generated by applications, and the reason for this is site overhead.

Site Overhead


Site overhead is a development term describing the set of resources a site must load before the page can be displayed properly. The more resources there are, the longer the loading time. According to data from Akamai, a leading content delivery network (CDN), loading times in the three-second range dramatically increase a site’s abandonment rate.

Data points like the one mentioned above are not lost on search engine algorithm designers seeking every advantage in their own competitive markets, so loading time is something every web designer needs to consider when deciding which tools to use. With that in mind, it’s important to understand that websites created with web-building applications load significantly more tools than a site designed by a developer. The reason is that application-built sites have to load every resource their template contains, regardless of whether it’s actually used on a particular page.

For example, most templates include code that enables designers to create drop-down navigation menus. For large sites, drop-down menus make a lot of sense: each product or service gets its own link, making it much easier for visitors to navigate directly to pages. For smaller sites with only four or five top-level pages, every page link can easily fit into a single-row navigation bar, meaning the extra code required to create drop-down menus loads unnecessarily. The result is additional loading time for the smaller site.


Mitigating the Damage

Whether your website was created with an application or by a developer, there are steps organizations can take to reduce site overhead. Most hosting companies offer features like photo optimization and site compression through a visual interface, and taking those steps can cut site loading times by as much as 20%, but to make real headway, you’ll probably need a developer to perform some more advanced operations.

Intermediate tasks like minifying HTML, JS, and CSS require coding knowledge that most novices haven’t yet acquired, and expert-level tasks like configuring the browser cache and asynchronous loading require someone who knows the server-side configuration as well. Given what you now know about SEO, don’t be afraid to reach out for help; you’re not the only one who needs a bit of third-party assistance.
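To give a flavor of what those expert-level tasks look like, here is a sketch of an .htaccess fragment that enables compression and browser caching. It assumes an Apache server with the mod_deflate and mod_expires modules available; nginx, IIS, and other servers use different directives, and the cache lifetimes shown are illustrative, not recommendations.

```apacheconf
# Compress text-based assets before sending them (requires mod_deflate)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# Tell browsers how long they may cache static assets (requires mod_expires)
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```

Asynchronous loading, the other expert-level task mentioned, is often as simple as adding the `async` or `defer` attribute to a `<script>` tag (e.g. `<script src="app.js" defer></script>`), though scripts that depend on one another need careful ordering.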

RTR Digital provides complimentary SEO audits for any organization that requests one, and you can start your organization down the path to higher search rankings by clicking here to request a consultation. If you’re a bit timid about reaching out, we offer some more tips about full SEO integration here. If you have questions you think would benefit everyone by being answered publicly, feel free to add them in the comments section.