Search Engine Optimization

How Search Engines Work


People who need information turn to a search engine for answers. So how does a search engine surface thousands of websites related to what you are looking for?

Understanding how search engines work is critical to your web marketing campaigns. Use keywords to your advantage to help draw more qualified visitors to your site. There are over 1 billion websites on the internet. So how do search engines account for all of these sites, including their pages and content?

Crawling & Indexing
Search engines perform two core functions: crawling and indexing. This is how they gather the information that ends up on the SERP (search engine results page). By the way, only publicly available pages are crawled.
Search Engine Crawlers:
• Google: Googlebot
• Bing: Bingbot
• Yahoo: Slurp
• AOL: an interesting fact: as of January 2016, AOL’s search is powered by Bing; it previously used Google.
Crawlers find domains and sites, then “crawl” pages by going from page to page (link to link), “indexing” all kinds of information. Each search engine’s crawler indexes at different times and rates, paying particular attention to new sites and to changes on existing ones.

How Search Engines Crawl Websites
Think of search engines as tourists exploring a country with only a road map, where the map is their only reliable source of information about the attractions. The internet functions as the map, and websites are the attractions. Search engines send crawlers through every website online. Crawlers are small programs that identify and report web page components. They gather data about websites and index the information under separate categories and keywords, creating a list of sites that meet a query’s criteria.

How Websites Are Indexed
Search engines use their own proprietary algorithms to index sites. Some engines prefer sites with a lot of backlinks, others prefer older domains, and some prioritize social linking and related activities. Search engines regularly change their algorithms to keep up with searchers’ demands and user feedback and to provide better results than their competitors.
TIP: It’s important to allow, and even help, search engines to easily crawl your website. It can mean the difference between higher and lower SERP rankings. Site structure is key, and luckily, search engines like Google and Bing give site owners a few options to help them crawl a website: sitemaps and a file called “robots.txt.”
A sitemap is a list of a site’s available pages that crawlers love to use; it enables search engines to easily index the pages it points to so searchers can find them. Sitemaps make it easier to crawl websites and pages, so they need to be search engine-friendly. The second method to help search engines crawl your site is a file called robots.txt. This document supports stronger search engine optimization and lets site owners give instructions to search engine crawlers. They can direct how crawlers process individual pages, or even tell Google and Bing which pages should not be crawled (and indexed).
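To make this concrete, here is a minimal sketch of both files, generated with a short Python script. The domain, page paths, and disallowed directory are made-up placeholders, not recommendations for any particular site.

```python
# Hypothetical sketch: write a bare-bones robots.txt and sitemap.xml for a small site.
# The domain and page paths are placeholders; substitute your own.

DOMAIN = "https://www.example.com"
PAGES = ["/", "/about/", "/blog/seo-basics/"]

robots_txt = (
    "User-agent: *\n"                    # rules below apply to all crawlers
    "Disallow: /admin/\n"                # ask crawlers to skip this section
    f"Sitemap: {DOMAIN}/sitemap.xml\n"   # point crawlers at the sitemap
)

sitemap_entries = "\n".join(
    f"  <url><loc>{DOMAIN}{path}</loc></url>" for path in PAGES
)
sitemap_xml = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{sitemap_entries}\n"
    "</urlset>\n"
)

with open("robots.txt", "w") as f:
    f.write(robots_txt)
with open("sitemap.xml", "w") as f:
    f.write(sitemap_xml)
```

The robots.txt file lives at the root of the site, and the sitemap is typically placed there as well; crawlers check robots.txt before crawling and use the sitemap to discover pages they might otherwise miss.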

Search Engine Algorithms
“You want the answer, not trillions of webpages. Algorithms are computer programs that look for clues to give you back exactly what you want.”
Search engines like Google, Bing, Yahoo, etc., use different algorithms to help determine which sites are the best matches for a search query. Some measure a site’s popularity, the amount of time people spend on a site, or content quality and site authority. Each of these factors gives search engines a full-spectrum view of how valuable your website content is and how relevant it is to a user’s search query. Google has a few well-known algorithms with quirky names like Google Panda, Hummingbird, Pigeon, Pirate, Payday, Top Heavy and more. Each algorithm serves a particular purpose. Some, like Panda and Hummingbird, focus on content quality, while others, like Payday, concentrate on cleaning up search results for traditionally “spammy” searches (think “payday loan,” “credit loan” and other spammed keywords). Several other factors go into these algorithms, and search engine technicians and engineers are always updating and tweaking them to improve the quality of the results.
If you’re interested in Google’s algorithm change history, Moz does an excellent job of tracking Google’s algorithm updates by year, including any recent ones. Most search engines follow the general search parameters of the most popular engine, Google, but it’s important to be aware of each engine’s algorithms.

Improving Search Rankings
Search engines value websites that have the features their algorithms prioritize. Sites with strong backlink quality, social interactions, and content quality rank well on SERPs. “Quality” can be vague, so here are a few quality attributes that content-based algorithms look for:
• Original, high-quality, useful and compelling content (copy, visuals, media, documents/files, etc.)
• At least 500 words long, containing (intelligently used) keywords or phrases
• Written for human readers first, not for search engines (i.e., not over-optimized or keyword-stuffed)
• Trendy, newsworthy and highly informative
• Human interest stories that inform or teach
• Press releases that encompass the above
TIP 1: Over-optimized pages – those that are designed specifically for rankings and not for readers – actually have problems ranking well, so you need to write the majority of your content for people and not for Google, Bing or Yahoo. Additionally, search engines want to return the most current and up-to-date information to users, so their algorithms look for updated pages with fresh content.
TIP 2: Google’s Panda and Penguin updates changed search engine optimization (SEO) by adding social signals to the algorithm: the more social activity, the better. Supporting information with social links (Facebook, Twitter, LinkedIn, etc.) can help a page rank higher.

Wrap Up
The key to high rankings is to give the search engines exactly what they want. Include compelling content that captures readers’ attention so they’ll stay on your site. Make sure that the content you present is peppered with social linking so that people can share it. Focus on fresh, newsworthy and trendy information, and you’ll be well on your way to higher rankings.

Article by June Parent


SEO Tips to Fix Mistakes

Small Business Trends by Ronald Dod

Keyword Cannibalization

Keyword cannibalization can have very unfortunate results. It happens when two or more pages on your site compete for the same keyword(s). Sometimes people don’t realize that they have duplicate content or duplicate titles; other times, inexperienced SEO specialists deliberately optimize multiple pages for the same keyword, thinking it will make the website as a whole more authoritative. It doesn’t.

Keyword cannibalization hurts your website. Why? The SERPs are a list: number one, two, three, four, and so on. One result has to come before another. Google searches the web and chooses the pages that best match the keyword being searched in order to rank them.

When Google comes across two (or more) of your pages targeting the same keyword, it is forced to pick one of them to rank. What if it picks the wrong one? Something you must always remember when dealing with SEO: Google wants its users to be happy with their search results. If a user doesn’t like your website, then Google will feel the same way.

Google is just an algorithm — a machine. It’s not a human brain that can make connections or read your mind. Sure, it’s extremely advanced, but, at the end of the day, you have to help it understand your website based on the parts the algorithm can read. If you have multiple pages that all target the exact same keyword, Google might not understand that, and it may not rank any of your pages. That would be a complete disaster.

What do you do if your website suffers from keyword cannibalization? There are only two options, and which one you choose depends on your site and its content. First, you can merge multiple pages if it makes sense to do so. If not, your other option is to un-optimize every page except the main one you want to rank for each keyword; a quick way to spot candidates is to scan your page titles, as in the sketch below.
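As a rough, hypothetical illustration (the URLs and the target keyword below are invented), here is one way to flag pages whose titles all target the same keyword:

```python
# Hypothetical sketch: flag possible keyword cannibalization by finding pages
# whose <title> tags target the same keyword. URLs and keyword are placeholders.
import re
from collections import defaultdict

import requests

PAGES = [
    "https://www.example.com/blue-widgets/",
    "https://www.example.com/blog/blue-widgets-guide/",
    "https://www.example.com/about/",
]
KEYWORD = "blue widgets"  # the keyword you suspect is being cannibalized

hits = defaultdict(list)
for url in PAGES:
    html = requests.get(url, timeout=10).text
    match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    title = match.group(1).strip() if match else ""
    if KEYWORD.lower() in title.lower():
        hits[KEYWORD].append((url, title))

# More than one page optimized for the same keyword is a cannibalization candidate.
for keyword, pages in hits.items():
    if len(pages) > 1:
        print(f"Possible cannibalization for '{keyword}':")
        for url, title in pages:
            print(f"  {url} | {title}")
```

Titles are only one signal; headings and body copy matter too, but duplicate titles are usually the first giveaway.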

Duplicate Content

This is a controversial topic among SEO experts. Some will tell you that Google does not have a duplicate content penalty. Some will tell you that Google absolutely does have a duplicate content penalty. So, which one is right? In this case, kind of both.

Google has come out and said that there is no penalty for having duplicate content on your site, but that doesn’t mean it is a good practice or that it can’t hurt you. We see this a lot with eCommerce sites, which are some of the worst offenders: way too many of them take the manufacturer’s descriptions and titles and put them on their own site. In my expert opinion, this is an easy topic: don’t duplicate content.

While Google has said that there is no “penalty” for duplicate content, they have also said that they value uniqueness. So, will having duplicate content hurt you? Maybe, maybe not, but will unique content help you? Absolutely! Our advice to our eCommerce clients is to have unique categories and product descriptions for everything they sell, and if you have separate pages for color and size, then merge them together.
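As a small, hypothetical sketch (the product catalogue below is invented), one way to catch exact-duplicate descriptions before they go live:

```python
# Hypothetical sketch: group products by normalized description text so that
# copied-and-pasted descriptions stand out. The catalogue is a made-up example.
from collections import defaultdict

products = {
    "widget-blue": "A sturdy widget available in several colors.",
    "widget-red": "A sturdy widget available in several colors.",   # copied text
    "gadget-pro": "Our flagship gadget, hand-tested before shipping.",
}

by_description = defaultdict(list)
for sku, description in products.items():
    # Normalize whitespace and case so trivial edits don't hide duplicates.
    key = " ".join(description.lower().split())
    by_description[key].append(sku)

for description, skus in by_description.items():
    if len(skus) > 1:
        print(f"Duplicate description shared by: {', '.join(skus)}")
```

This only catches exact copies; near-duplicates with a word swapped here and there need a fuzzier comparison.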

Broken Links

Links are golden in this industry. They give your users more information, can help build trust in your authority, and can even build relationships with other people on the web. However, a broken link does nothing. In case it isn’t obvious, a broken link is a hyperlink that no longer goes where it was intended to go. Naturally, over time, links break and you will accumulate broken ones. Pages go down, sites change; it’s normal and natural.

However, if broken links are a natural part of websites, then why the fuss over them? One or two broken links might not kill your rankings, but you don’t want to accumulate a lot of them. No user appreciates clicking on a link only to have it go nowhere. We all know it’s frustrating, and it creates a bad user experience, which is something Google cares deeply about.

Google and users want sites that work. The good news is that it doesn’t take much to fix this. Use a tool such as Broken Link Checker to find your broken links, fix or remove them, and then repeat the check every now and then.
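If you prefer to script it yourself, here is a bare-bones, hypothetical sketch; the page URL is a placeholder, it only looks at absolute links on a single page, and a real crawl would need rate limiting and retries:

```python
# Hypothetical sketch: check the absolute links on one page and report any that
# respond with an error. The page URL is a placeholder.
import re

import requests

PAGE = "https://www.example.com/"

html = requests.get(PAGE, timeout=10).text
links = re.findall(r'href="(https?://[^"]+)"', html)

for link in sorted(set(links)):
    try:
        status = requests.head(link, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = None
    if status is None or status >= 400:
        print(f"Broken: {link} (status {status})")
```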

Incorrect Redirects

A redirect is how you forward one URL to a different URL; it sends users to a different URL from the one they originally requested. There are numerous reasons to use redirects, but the main thing to remember is that you don’t want to use the wrong type of redirect.

You want to use 301 redirects. These are permanent redirects that tell both the searcher and search engine that a page has been moved permanently. Almost more importantly, a 301 redirect will pass at least 90 percent of its ranking power to the new page.

However, you also have 302 redirects, which are temporary. Since Google knows these are temporary, it has no reason to pass on any authority. There are very few times when a 302 should be used. Most of the time you want a 301 redirect, and any 302 redirects you have could be hurting your rankings and should be changed over.
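To see which kind of redirect an old URL actually returns, a quick, hypothetical check like this works (the URL is a placeholder):

```python
# Hypothetical sketch: follow a redirect chain and report whether each hop is a
# permanent (301) or temporary (302) redirect. The URL is a placeholder.
import requests

OLD_URL = "https://www.example.com/old-page/"

response = requests.get(OLD_URL, allow_redirects=True, timeout=10)
for hop in response.history:  # each intermediate redirect response
    kind = "permanent" if hop.status_code == 301 else "temporary/other"
    location = hop.headers.get("Location")
    print(f"{hop.url} -> {location} [{hop.status_code}, {kind}]")
print(f"Final destination: {response.url} [{response.status_code}]")
```

Any hop reported as a 302 on a move that is meant to be permanent is a candidate for changing over to a 301.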

No Internal Linking

Not only do you want functioning outbound links, you also want to make sure you have internal links to other pages on your website. A lot of websites that I come across don’t do this, which is a shame. Either they think it’s a complete waste of time, or they don’t understand the SEO juice that internal linking can bring. Not only can it help a search engine find more of your pages, but it can also help your users find more of your pages and the information that they might find helpful, which makes them happy. Remember, a happy user, a happy Google. Check your internal links.

There are several ways to link internally. You can use anchor text that points to another page on your site, or you can use links like “read more” or “click here for more information.” Another great way is to have a “related pages” section at the end so that users can see where else to go. Generally, you want about 3-10 internal links per page, and pages you want to rank higher should have more internal links than others. Also, do a little housekeeping here: as you add new pages, go back to older pages and link them to any new content you have.
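As a rough, hypothetical sanity check against that 3-10 guideline (the page URL is a placeholder, and the regex-based link extraction is deliberately simplistic):

```python
# Hypothetical sketch: count how many distinct internal links one page contains.
# The page URL is a placeholder.
import re
from urllib.parse import urljoin, urlparse

import requests

PAGE = "https://www.example.com/blog/seo-basics/"
SITE_HOST = urlparse(PAGE).netloc

html = requests.get(PAGE, timeout=10).text
hrefs = re.findall(r'href="([^"#]+)"', html)

internal = {
    urljoin(PAGE, href)                                   # resolve relative links
    for href in hrefs
    if urlparse(urljoin(PAGE, href)).netloc == SITE_HOST  # keep same-host links only
}
print(f"{len(internal)} distinct internal links found on {PAGE}")
```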

Incorrect Title Tags and Descriptions

This has been stressed by almost every expert, even Google, and yet websites get it wrong all the time. All of the pages on your website should have unique, descriptive titles. The title is one of the most important SEO elements there is. You cannot overlook this. The title needs to reflect what is on the page, not repeat the same title as all your other pages.

Meta descriptions are another aspect of SEO at which many websites fail. Again, these need to be unique. Your meta description is the content that appears under your title in the search results. This is your sales pitch. Having a good meta description can greatly increase your CTR (click-through rate). You have about 160 characters to tell searchers what your page is about and how it relates to them. It needs to be persuasive, unique, and highly descriptive.
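A quick, hypothetical audit along these lines (the URLs are placeholders, and the meta-description regex assumes a simple name-then-content attribute order) can flag duplicate titles and missing or overlong descriptions:

```python
# Hypothetical sketch: collect titles and meta descriptions for a few pages and
# flag duplicates, missing values, and descriptions over ~160 characters.
import re
from collections import Counter

import requests

PAGES = [
    "https://www.example.com/",
    "https://www.example.com/about/",
    "https://www.example.com/blog/seo-basics/",
]

titles, descriptions = {}, {}
for url in PAGES:
    html = requests.get(url, timeout=10).text
    t = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    d = re.search(r'<meta\s+name="description"\s+content="([^"]*)"', html, re.IGNORECASE)
    titles[url] = t.group(1).strip() if t else ""
    descriptions[url] = d.group(1).strip() if d else ""

for title, count in Counter(titles.values()).items():
    if count > 1 or not title:
        print(f"Duplicate or missing title ('{title}') on {count} page(s)")

for url, desc in descriptions.items():
    if not desc:
        print(f"Missing meta description: {url}")
    elif len(desc) > 160:
        print(f"Description over ~160 characters ({len(desc)}): {url}")
```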

Non-Responsive Websites

Today more than half of all searches happen on mobile devices, and, unless something completely bizarre happens, that percentage is just going to increase. With all of this mobile activity, websites need to make sure that they can perform on those platforms. They need to be responsive — automatically resizing to fit whatever device the user is on. Google has even stated that websites can’t be complacent, and they have to update with the times. In today’s technology-driven world, that means being mobile responsive.

Again, this all goes back to the user experience. You want those who found your site to be able to access it seamlessly, regardless of what device they are on. Now, we will go ahead and let you know that having a responsive website is a pretty big undertaking. It takes a lot of work on the development side, but, in the long run, it is completely worth it.
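Responsiveness really has to be tested in a browser across screen sizes, but as a very crude first signal you can check whether a page even declares a mobile viewport. A hypothetical sketch (the URL is a placeholder):

```python
# Hypothetical sketch: check whether a page declares a viewport meta tag, a
# minimal prerequisite for mobile-friendly rendering. The URL is a placeholder.
import re

import requests

PAGE = "https://www.example.com/"
html = requests.get(PAGE, timeout=10).text

if re.search(r'<meta\s+name="viewport"', html, re.IGNORECASE):
    print("Viewport meta tag found")
else:
    print("No viewport meta tag: the page may not render well on mobile")
```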

Slow Site Speed

A slow website is a terrible website. End of discussion. In today’s technology-saturated and fast-moving world, people want things in a second, and you have about that long to deliver your website to them. It’s generally accepted that a website should load in two seconds or less; any longer than that, and the user is going to become frustrated and abandon it. Use a speed-testing tool to check your website’s speed.

Google wants the web to be functioning at the speed of light, and, if you can’t deliver that, then you could be punished for it. They have previously indicated that site speed is one of the variables in their algorithm.

At the end of the day, a slow website can affect not only your SEO, but also your conversions and your bottom line. So, you’ve got to speed it up. Test your site speed now; if it is too slow, then you’ve got some work ahead of you. You can minify your code by removing unnecessary characters, remove unneeded redirects (like I’ve already told you to do), compress large files, optimize the images on your site, or use a CDN (content delivery network). Whatever you do, you’ve got to get your site moving faster to give your users a better experience and to rise in the rankings.
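For a rough, hypothetical first pass against the two-second guideline (this measures only the server response and download time, not full browser rendering, and the URL is a placeholder):

```python
# Hypothetical sketch: time how long one page takes to fetch and compare it to
# the ~2-second rule of thumb. The URL is a placeholder.
import time

import requests

PAGE = "https://www.example.com/"

start = time.perf_counter()
response = requests.get(PAGE, timeout=30)
elapsed = time.perf_counter() - start

print(f"Fetched {len(response.content)} bytes in {elapsed:.2f} s")
if elapsed > 2:
    print("Slower than the ~2-second guideline: consider minification, compression, or a CDN.")
```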