Five signs your “Local SEO Expert” isn’t an Expert at all.
They put links back to their website in the footer of your website
Your website isn’t on the 1st page of Google
They push Paid Advertising more than Organic
They have been in the game for too long
They are limited in their knowledge
My name is Lukas Fanders (video SEO expert) and I have been freelancing for SEO agencies for the past few years. I am the person they pay when they need specialized knowledge, like video ranking and On Page Optimization. I can tell you that I am astonished at the lack of knowledge that 90% of “SEOs” have. I use the term very loosely, because they know more about selling their services to you than they could ever dream of knowing about ranking your website. Most know a lot about outsourcing backlinks to people in India for a few bucks, but little about the ranking algorithm Google actually uses. As Google moves more and more toward the technical side of website architecture and away from spam tactics, your website will lose rankings and traffic if you are dealing with this type.
What your business needs to rank locally in search engines:
Your website and Google Business page need to be optimized correctly to rank and drive customers to your business. Call us today (844) 600-5660
1. They put links back to their website in the footer of your website
This is probably the single worst thing an SEO or web designer can do when working on your website.
Why is this so bad? Well, let me explain.
Google doesn’t like what are called site-wide backlinks. What is a site-wide backlink? It is a link that is present on every page of your website. This was used a lot to cheat the algorithms: sneaky, underhanded web designers and SEOs would put their own link in the footer of a website to steal the juice from its pages. In other words, they are using your juice to rank their page, never mind that it hurts yours.
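If you want to check your own site, this is roughly what the problem looks like in a footer template. This is a hypothetical sketch; the agency name, URL and anchor text are made up:

```html
<!-- Footer template rendered on EVERY page of the client's site -->
<footer>
  <p>&copy; 2016 Your Business Name</p>
  <!-- The offending site-wide backlink, often with keyword anchor text -->
  <p>Website by <a href="http://example-agency.com/">Web Design Dallas</a></p>
</footer>
```

Because the footer is part of the site template, this one line becomes a link on every single page.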
(A sitewide backlink will negatively affect your entire website and hurt your rankings and traffic.)
What is juice?
Juice is the amount of power a website has to rank for a particular keyword. Every link on your page that leads to another page gives away some of your juice to the page it points to. So, for instance, if a page starts with 100 in juice and links out to one page on a different website, its juice might drop to, say, 90. If you have a link in the footer, that means every single page on your website is leaking juice to that one destination, which is very bad for your rankings. It makes your entire website weaker in the eyes of Google. Now you may start to understand why you aren’t ranking for your keywords.
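What this article calls “juice” corresponds roughly to PageRank. As a rough sketch (Google’s modern algorithm goes far beyond this), the formula from the original PageRank paper divides each page’s score among its outgoing links:

```latex
% PR(A): PageRank of page A;  d: damping factor (about 0.85)
% T_1 .. T_n: pages linking to A;  C(T_i): number of outbound links on T_i
PR(A) = (1 - d) + d \left( \frac{PR(T_1)}{C(T_1)} + \cdots + \frac{PR(T_n)}{C(T_n)} \right)
```

The key term is the divisor \(C(T_i)\): the more outbound links a page carries, the smaller the share each link passes. That is why a footer link repeated on every page drains value site-wide.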
What makes it even worse?
What makes it even worse is if they use what is called “anchor text”. Let’s say they want to rank for “Web Design Dallas”. They may put a link in the footer of your page that says “Web Design Dallas”. The problem is that if your website isn’t about web design, Google considers that a non-relevant link, and Google will penalize your site for linking to non-relevant content. If you have any non-relevant links on your pages, you should have them taken off immediately.
2. Your website isn’t on the 1st page of Google
If you are trying to rank for a local term and aren’t on the 1st page of Google, do you really think the person you hired is an expert? You have to understand that most agencies that rank for SEO terms generally buy those rankings. They spend top dollar on links that they are not going to buy to rank your website. So just because a website ranks well for SEO terms doesn’t mean they will be able to rank your website. A better determining factor is to look at their On Page Optimization.
3. They push Paid Advertising more than Organic
A good marketing strategy can include both pay per click and organic search engine rankings. Pay per click is not generally a good long term strategy for any business on its own. It is a good indicator of the specific terms people are searching for, but it should definitely be used alongside a plan to rank in Google for your target keywords. The wrong company will have you spending money forever on pay per click. Why pay for every click when, with a little work, you can set up a strategy to get those clicks for free?
4. They have been in the game for too long
I see a lot of SEOs that started many years ago and still use the same strategies. Search engine optimization is not like most industries; it is constantly evolving and changing. Most people would rather stick to the old strategies that once worked than learn the new ones that work now. What worked 10 years ago will get you thrown out of Google today. So it’s best to be very cautious of anyone who boasts about how long they have been in the game. Usually they still have bad habits that they imagine work.
5. They are limited in their knowledge
Anything with an algorithm can be ranked in, whether that be Google SEO, YouTube SEO, Facebook or Amazon, to name a few. If the agency you are using only knows how to rank in Google, or even worse only in Google Places, be very cautious. When they are limited in their knowledge of different platforms, they generally don’t understand how algorithms work. An algorithm is nothing more than a set of factors used to determine the best page to rank for the search being performed. Any website with a search bar has an algorithm.
There you have 5 good factors to determine if you are dealing with an expert, or being duped.
You are going to need an SEO audit
When you are looking to achieve a healthy website in terms of SEO, traffic and ads, you are going to need an SEO audit that works for you. An audit is performed by a professional in the field of SEO who can create a functional plan tailored specifically to your website. You can certainly find many automated SEO auditors online, but many of them give you only part of a report and then try to pressure you into buying more of their services. It is best to leave your SEO auditing to a professional.
While you may have a web designer…they may not know the first thing about getting your site ranked as number one on Google
Many people think that most of their SEO depends on the content they put on their site, and they may not know the many tricks of the trade that can earn them a proper rank in the search engines. A Dallas SEO expert will tell you that it is not just about content; it is about the coding and everything that happens behind the scenes of your website. While you may have a web designer, they may not know the first thing about getting your site ranked number one on Google or Yahoo searches. This is why we are here to help!
An SEO audit is crucial to your search engine ranking
An SEO audit is crucial to your search engine ranking, and to helping targeted customers find your business online no matter what they are searching for. Get a professional, experienced SEO audit from our company and you will be impressed by the tricks and tips of the trade your website might be missing. Your SEO website audit will tell you exactly what you, as a site and business owner, need to do to fix your website so it comes out number one in the search engines, along with the option of working with us for the best SEO experience for your website.
First, let me say that most search engine optimization “specialists” fall into two categories: The Outsourcer and The Do-Nothing. Look at the descriptions below and see if you are dealing with this type of SEO right now.
The Outsourcer – Outsources the work to other people. They usually don’t know how to rank; they just throw links at a website until it either RANKS OR TANKS. A lot of the time they use Fiverr, which is an absolutely terrible idea when it comes to ranking a website. Fiverr is a website where people do different tasks for $5. So the SEO you are paying $150/hr is outsourcing the job to someone (many times in a foreign country) for a $5 fee and pocketing the other $145 for doing no work. Any large company is outsourcing most of the work. This is a bad idea because if the wrong links are sent to a website, it will be removed from Google. Once you are hit with a backlink penalty it is virtually impossible to recover; your best bet is to buy a new domain and start over from scratch.
The Do-Nothing – They take your money but never really improve your rankings. Usually they want to push you into Google AdWords, because they don’t know how to rank or they are too lazy to put in the work. I know a lot of these types, and a lot of agencies also do this. If you aren’t seeing results, it’s because you hired a Do-Nothing SEO. They usually push contracts on businesses because they know they won’t deliver results; they just want to bilk you for all they can until you realize what’s going on. If you are paying for SEO you should be on the 1st page, PERIOD, especially for local terms.
On Page SEO
I have studied On Page SEO factors extensively and I use this site as a testing platform. I have purposely kept the backlinks to a minimum in order to showcase the power of On Page SEO. I regularly outrank sites with 3-4 times the authority of this website, just by following a simple 10-step On Page SEO checklist that I use on every site. I have detailed the list below.
1. Domain Name
The domain name is by far the most important On Page SEO factor for ranking. Anyone who tells you differently doesn’t understand SEO, and you shouldn’t trust them with your website. Exact Match Domains, or EMDs (for example, OrthopedicDoctorInDallasTX.com or FortWorthRoofing.com), were penalized by Google not too long ago. If you own an EMD you need to be careful about over-optimizing your website. Most companies go with the actual name of the company, not the keyword. With the right backlinks you can rank any domain for any keyword.
The URL is the entire web address for a page of the site; (www.nipeyegraphix.com/how-to-improve-search-engine-rankings-on-google) is the URL of this page. The URL is the second most important On Page SEO factor; it is really just an extension of the first factor, the domain. Some SEOs will tell you to stuff keywords into the URL, but I advise against that. Any time you add extra words to the URL, you weaken its ranking power for a given keyword. So this page has a much better chance of ranking for “how to improve search engine rankings in google” than for “improve search engine rankings in google”. Anything I tell you on this website has been investigated, tested and proven.
I wanted to post this article so the average business owner can get an idea of what goes into ranking a website on Google. There are over 200 factors to consider. There are factors that carry a lot more weight than others, but they all carry some type of weight with Google. The majority of SEO companies do nothing but spam a bunch of links to your website, cross their fingers and hope that it works.
I am an individual that outranks entire SEO agencies in Dallas.
How do I do that?
1. Because I know what I am doing
2. I don’t outsource the work to India or Pakistan
3. I am an expert when it comes to On Page SEO. If you watch the video posted below you will see just how bad the “top” SEO companies in DFW are at on page optimization. If their own site isn’t on page optimized, do you think yours will be?
4. I understand how the Google Algorithm works
5. I take pride in the work I do. It’s more than just a paycheck to me. I feel a partnership with the companies that I rank in search engines.
Brian Dean is the founder of the popular SEO training blog, Backlinko.com. His site showcases insanely practical strategies that will help you generate higher rankings, more traffic and increased conversions.
Well today you’re in for a treat because I’ve put together a complete list.
1. Domain Age: Matt Cutts states:
“The difference between a domain that’s six months old versus one year old is really not that big at all.”
In other words, they do use domain age…but it’s not very important.
2. Keyword Appears in Top Level Domain: Doesn’t give the boost that it used to, but having your keyword in the domain still acts as a relevancy signal. After all, they still bold keywords that appear in a domain name.
3. Keyword As First Word in Domain: Moz’s 2011 Search Engine Ranking Factors panelists agreed that a domain that starts with their target keyword has an edge over sites that either don’t have the keyword in their domain or have the keyword in the middle or end of their domain.
4. Domain Registration Length: A Google patent states:
“Valuable (legitimate) domains are often paid for several years in advance, while doorway (illegitimate) domains rarely are used for more than a year. Therefore, the date when a domain expires in the future can be used as a factor in predicting the legitimacy of a domain.”
5. Keyword in Subdomain Name: Moz’s panel also agreed that a keyword appearing in the subdomain boosts rank.
6. Domain History: A site with volatile ownership (via whois) or several drops may tell Google to “reset” the site’s history, negating links pointing to the domain.
7. Exact Match Domain: EMDs may still give you an edge…if it’s a quality site. But if the EMD happens to be a low-quality site, it’s vulnerable to the EMD update:
8. Public vs. Private WhoIs: Private WhoIs information may be a sign of “something to hide”. Matt Cutts is quoted as stating at Pubcon 2006:
“…When I checked the whois on them, they all had “whois privacy protection service” on them. That’s relatively unusual. …Having whois privacy turned on isn’t automatically bad, but once you get several of these factors all together, you’re often talking about a very different type of webmaster than the fellow who just has a single site or so.”
9. Penalized WhoIs Owner: If Google identifies a particular person as a spammer it makes sense that they would scrutinize other sites owned by that person.
17. Keyword Density: Although not as important as it once was, keyword density is still something Google uses to determine the topic of a webpage. But going overboard can hurt you.
18. Latent Semantic Indexing (LSI) Keywords in Content: LSI keywords help search engines extract meaning from words with more than one meaning (Apple the computer company vs. the fruit). The presence/absence of LSI probably also acts as a content quality signal.
19. LSI Keywords in Title and Description Tags: As with webpage content, LSI keywords in page meta tags probably help Google discern between synonyms. May also act as a relevancy signal.
20. Page Loading Speed via HTML: Both Google and Bing use page loading speed as a ranking factor. Search engine spiders can estimate your site speed fairly accurately based on a page’s code and filesize.
21. Duplicate Content: Identical content on the same site (even slightly modified) can negatively influence a site’s search engine visibility.
22. Rel=Canonical: When used properly, use of this tag may prevent Google from considering pages duplicate content.
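For reference, a minimal sketch of the tag in use (the URL shown is hypothetical): it sits in the `<head>` of the duplicate page and points at the version you want search engines to treat as the original.

```html
<head>
  <!-- Tell search engines which URL is the preferred version of this content -->
  <link rel="canonical" href="http://www.example.com/original-page/">
</head>
```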
23. Page Loading Speed via Chrome: Google may also use Chrome user data to get a better handle on a page’s loading time as this takes into account server speed, CDN usage and other non HTML-related site speed signals.
24. Image Optimization: Images on-page send search engines important relevancy signals through their file name, alt text, title, description and caption.
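As a quick illustration, a reasonably optimized image tag covers the file name, alt text and title (the file name and text here are made up):

```html
<img src="dallas-seo-audit-checklist.jpg"
     alt="Dallas SEO audit checklist"
     title="Our 10-step On Page SEO checklist">
```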
25. Recency of Content Updates: Google Caffeine update favors recently updated content, especially for time-sensitive searches. Highlighting this factor’s importance, Google shows the date of a page’s last update for certain pages:
26. Magnitude of Content Updates: The significance of edits and changes is also a freshness factor. Adding or removing entire sections is a more significant update than switching around the order of a few words.
27. Historical Page Updates: How often has the page been updated over time? Daily, weekly, every 5 years? Frequency of page updates also plays a role in freshness.
28. Keyword Prominence: Having a keyword appear in the first 100-words of a page’s content appears to be a significant relevancy signal.
29. Keyword in H2, H3 Tags: Having your keyword appear as a subheading in H2 or H3 format may be another weak relevancy signal.
30. Keyword Word Order: An exact match of a searcher’s keyword in a page’s content will generally rank better than the same keyword phrase in a different order. For example: consider a search for: “cat shaving techniques”. A page optimized for the phrase “cat shaving techniques” will rank better than a page optimized for “techniques for shaving a cat”. This is a good illustration of why keyword research is really, really important.
31. Outbound Link Quality: Many SEOs think that linking out to authority sites helps send trust signals to Google.
32. Outbound Link Theme: According to Moz, search engines may use the content of the pages you link to as a relevancy signal. For example, if you have a page about cars that links to movie-related pages, this may tell Google that your page is about the movie Cars, not the automobile.
33. Grammar and Spelling: Proper grammar and spelling is a quality signal, although Cutts gave mixed messages in 2011 on whether or not this was important.
34. Syndicated Content: Is the content on the page original? If it’s scraped or copied from an indexed page it won’t rank as well as the original or end up in their Supplemental Index.
35. Helpful Supplementary Content: According to a now-public Google Rater Guidelines Document, helpful supplementary content is an indicator of a page’s quality (and therefore, Google ranking). Examples include currency converters, loan interest calculators and interactive recipes.
36. Number of Outbound Links: Too many dofollow OBLs may “leak” PageRank, which can hurt search visibility.
37. Multimedia: Images, videos and other multimedia elements may act as a content quality signal.
38. Number of Internal Links Pointing to Page: The number of internal links to a page indicates its importance relative to other pages on the site.
39. Quality of Internal Links Pointing to Page: Internal links from authoritative pages on domain have a stronger effect than pages with no or low PR.
40. Broken Links: Having too many broken links on a page may be a sign of a neglected or abandoned site. The Google Rater Guidelines Document uses broken links as one way to assess a homepage’s quality.
41. Reading Level: There’s no doubt that Google estimates the reading level of webpages:
But what they do with that information is up for debate. Some say that a basic reading level will help your page rank because it will appeal to the masses. However, Linchpin SEO discovered that reading level was one factor that separated quality sites from content mills.
42. Affiliate Links: Affiliate links themselves probably won’t hurt your rankings. But if you have too many, Google’s algorithm may pay closer attention to other quality signals to make sure you’re not a “thin affiliate site”.
43. HTML Errors/W3C Validation: Lots of HTML errors or sloppy coding may be a sign of a poor quality site. While controversial, many in SEO think that W3C validation is a weak quality signal.
44. Page Host’s Domain Authority: All things being equal, a page on an authoritative domain will rank higher than a page on a domain with less authority.
45. Page’s PageRank: Not perfectly correlated. But in general higher PR pages tend to rank better than low PR pages.
47. URL Path: A page closer to the homepage may get a slight authority boost.
48. Human Editors: Although never confirmed, Google has filed a patent for a system that allows human editors to influence the SERPs.
49. Page Category: The category the page appears on is a relevancy signal. A page that’s part of a closely related category should get a relevancy boost compared to a page that’s filed under an unrelated or less related category.
50. WordPress Tags: Tags are a WordPress-specific relevancy signal. According to Yoast.com:
“The only way it improves your SEO is by relating one piece of content to another, and more specifically a group of posts to each other”
51. Keyword in URL: Another important relevancy signal.
52. URL String: The categories in the URL string are read by Google and may provide a thematic signal to what a page is about:
53. References and Sources: Citing references and sources, like research papers do, may be a sign of quality. The Google Quality Guidelines states that reviewers should keep an eye out for sources when looking at certain pages: “This is a topic where expertise and/or authoritative sources are important…”.
54. Bullets and Numbered Lists: Bullets and numbered lists help break up your content for readers, making them more user friendly. Google likely agrees and may prefer content with bullets and numbers.
55. Priority of Page in Sitemap: The priority a page is given via the sitemap.xml file may influence ranking.
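A sketch of how priority appears in a sitemap.xml file (the URLs are placeholders). Note that priority is only a hint to crawlers, not a directive:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <priority>1.0</priority> <!-- highest priority: the homepage -->
  </url>
  <url>
    <loc>http://www.example.com/blog/some-post/</loc>
    <priority>0.5</priority> <!-- the default value -->
  </url>
</urlset>
```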
56. Too Many Outbound Links: Straight from the aforementioned Quality rater document:
“Some pages have way, way too many links, obscuring the page and distracting from the Main Content”
57. Quantity of Other Keywords Page Ranks For: If the page ranks for several other keywords it may give Google an internal sign of quality.
58. Page Age: Although Google prefers fresh content, an older page that’s regularly updated may outperform a newer page.
59. User Friendly Layout: Citing the Google Quality Guidelines Document yet again:
“The page layout on highest quality pages makes the Main Content immediately visible”
60. Parked Domains: A Google update in December of 2011 decreased search visibility of parked domains.
62. Content Provides Value and Unique Insights: Google has stated that they’re on the hunt for sites that don’t bring anything new or useful to the table, especially thin affiliate sites.
63. Contact Us Page: The aforementioned Google Quality Document states that they prefer sites with an “appropriate amount of contact information”. There is a supposed bonus if your contact information matches your whois info.
64. Domain Trust/TrustRank: Site trust — measured by how many links away your site is from highly-trusted seed sites — is a massively important ranking factor. You can read more about TrustRank here.
65. Site Architecture: A well put-together site architecture (especially a silo structure) helps Google thematically organize your content.
66. Site Updates: How often a site is updated — and especially when new content is added to the site — is a site-wide freshness factor.
67. Number of Pages: The number of pages a site has is a weak sign of authority. At the very least a large site helps distinguish it from thin affiliate sites.
68. Presence of Sitemap: A sitemap helps search engines index your pages easier and more thoroughly, improving visibility.
69. Site Uptime: Lots of downtime from site maintenance or server issues may hurt your ranking (and can even result in deindexing if not corrected).
70. Server Location: Server location may influence where your site ranks in different geographical regions. Especially important for geo-specific searches.
71. SSL Certificate (Ecommerce Sites): Google has confirmed that they index SSL certificates. It stands to reason that they’ll preferentially rank ecommerce sites with SSL certificates.
72. Terms of Service and Privacy Pages: These two pages help tell Google that a site is a trustworthy member of the internet.
73. Duplicate Meta Information On-Site: Duplicate meta information across your site may bring down all of your page’s visibility.
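In practice, this means each page should carry its own title and meta description rather than a site-wide copy. A hypothetical example of unique per-page tags:

```html
<!-- Each page gets its own title and description, not a template copy -->
<head>
  <title>SEO Audits in Dallas | Example Company</title>
  <meta name="description"
        content="What a professional SEO audit covers and why your website needs one.">
</head>
```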
74. Breadcrumb Navigation: This is a style of user-friendly site-architecture that helps users (and search engines) know where they are on a site:
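A minimal sketch of a breadcrumb trail (the page names are placeholders); each level links back up the site hierarchy:

```html
<!-- Breadcrumb navigation: shows the user's position in the site hierarchy -->
<nav>
  <a href="/">Home</a> &gt;
  <a href="/services/">Services</a> &gt;
  <span>SEO Audits</span> <!-- current page: no link -->
</nav>
```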
77. Site Usability: A site that’s difficult to use or to navigate can hurt ranking by reducing time on site and pages viewed, and by increasing bounce rate. This may be an independent algorithmic factor gleaned from massive amounts of user data.
78. Use of Google Analytics and Google Webmaster Tools: Some think that having these two programs installed on your site can improve your page’s indexing. They may also directly influence rank by giving Google more data to work with (i.e. more accurate bounce rate, whether or not you get referral traffic from your backlinks, etc.).
79. User Reviews/Site Reputation: A site’s reputation on review sites like Yelp.com and RipOffReport.com likely plays an important role in the algorithm. Google even posted a rarely candid outline of their approach to user reviews after an eyeglass site was caught ripping off customers in an effort to get backlinks.
80. Linking Domain Age: Backlinks from aged domains may be more powerful than new domains.
81. # of Linking Root Domains: The number of referring domains is one of the most important ranking factors in Google’s algorithm, as you can see from this chart from Moz (bottom axis is SERP position):
82. # of Links from Separate C-Class IPs: Links from separate class-C IP addresses suggest a wider breadth of sites linking to you.
83. # of Linking Pages: The total number of linking pages — even if some are on the same domain — is a ranking factor.
84. Alt Tag (for Image Links): Alt text is an image’s version of anchor text.
85. Links from .edu or .gov Domains: Matt Cutts has stated that TLD doesn’t factor into a site’s importance. However, that doesn’t stop SEOs from thinking that there’s a special place in the algo for .gov and .edu TLDs.
86. PR of Linking Page: The PageRank of the referring page is an extremely important ranking factor.
87. Authority of Linking Domain: The referring domain’s authority may play an independent role in a link’s importance (ie. a PR2 page link from a site with a homepage PR3 may be worth less than a PR2 page link from PR8 Yale.edu).
88. Links From Competitors: Links from other pages ranking in the same SERP may be more valuable for a page’s rank for that particular keyword.
89. Social Shares of Referring Page: The amount of page-level social shares may influence the link’s value.
91. Guest Posts: Although guest posting can be part of a white hat SEO campaign, links coming from guest posts — especially in an author bio area — may not be as valuable as a contextual link on the same page.
92. Links to Homepage Domain that Page Sits On: Links to a referring page’s homepage may play special importance in evaluating a site’s — and therefore a link’s — weight.
93. Nofollow Links: Google’s official word on the matter is that, “in general”, they don’t follow them. Which suggests that they do…at least in certain cases. Having a certain % of nofollow links may also indicate a natural vs. unnatural link profile.
94. Diversity of Link Types: Having an unnaturally large percentage of your links come from a single source (ie. forum profiles, blog comments) may be a sign of webspam. On the other hand, links from diverse sources is a sign of a natural link profile.
95. “Sponsored Links” Or Other Words Around Link: Words like “sponsors”, “link partners” and “sponsored links” may decrease a link’s value.
96. Contextual Links: Links embedded inside a page’s content are considered more powerful than links on an empty page or found elsewhere on the page.
97. Excessive 301 Redirects to Page: Links coming from 301 redirects dilute some (or even all) PR, according to a Webmaster Help Video.
98. Backlink Anchor Text: As noted in this description of Google’s original algorithm:
“First, anchors often provide more accurate descriptions of web pages than the pages themselves.”
Obviously, anchor text is less important than before (and likely a webspam signal). But it still sends a strong relevancy signal in small doses.
99. Internal Link Anchor Text: Internal link anchor text is another relevancy signal, although probably weighed differently than backlink anchor text.
100. Link Title Attribution: The link title (the text that appears when you hover over a link) is also used as a weak relevancy signals.
101. Country TLD of Referring Domain: Getting links from country-specific top level domain extensions (.de, .cn, .co.uk) may help you rank better in that country.
102. Link Location In Content: Links in the beginning of a piece of content carry slightly more weight than links placed at the end of the content.
103. Link Location on Page: Where a link appears on a page is important. Generally, links embedded in a page’s content are more powerful than links in the footer or sidebar area.
104. Linking Domain Relevancy: A link from a site in a similar niche is significantly more powerful than a link from a completely unrelated site. That’s why any effective SEO strategy today focuses on obtaining relevant links.
105. Page Level Relevancy: The Hilltop Algorithm states that a link from a page that’s closely tied to your page’s content is more powerful than a link from an unrelated page.
106. Text Around Link Sentiment: Google has probably figured out whether or not a link to your site is a recommendation or part of a negative review. Links with positive sentiments around them likely carry more weight.
107. Keyword in Title: Google gives extra love to links on pages that contain your page’s keyword in the title (“Experts linking to experts”.)
108. Positive Link Velocity: A site with positive link velocity usually gets a SERP boost.
109. Negative Link Velocity: Negative link velocity can significantly reduce rankings as it’s a signal of decreasing popularity.
110. Links from “Hub” Pages: Aaron Wall claims that links from pages that are considered top resources (or hubs) on a certain topic are given special treatment.
111. Link from Authority Sites: A link from a site considered an “authority site” likely passes more juice than a link from a small, microniche site.
112. Linked to as Wikipedia Source: Although the links are nofollow, many think that getting a link from Wikipedia gives you a little added trust and authority in the eyes of search engines.
114. Backlink Age: According to a Google patent, older links have more ranking power than newly minted backlinks.
115. Links from Real Sites vs. Splogs: Due to the proliferation of blog networks, Google probably gives more weight to links coming from “real sites” than from fake blogs. They likely use brand and user-interaction signals to distinguish between the two.
116. Natural Link Profile: A site with a “natural” link profile is going to rank highly and be more durable to updates.
117. Reciprocal Links: Google’s Link Schemes page lists “Excessive link exchanging” as a link scheme to avoid.
118. User Generated Content Links: Google is able to identify links generated from UGC vs. the actual site owner. For example, they know that a link from the official WordPress.com blog at en.blog.wordpress.com is very different than a link from besttoasterreviews.wordpress.com.
119. Links from 301: Links from 301 redirects may lose a little bit of juice compared to a direct link. However, Matt Cutts says that a 301 is similar to a direct link.
120. Schema.org Microformats: Pages that support microformats may rank above pages without it. This may be a direct boost or the fact that pages with microformatting have a higher SERP CTR:
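A sketch of what microdata markup looks like for a local business, using the schema.org LocalBusiness type (the business name and address values are placeholders):

```html
<!-- Microdata markup machines can read: identifies the business, phone and city -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Example SEO Company</span>
  <span itemprop="telephone">(844) 600-5660</span>
  <span itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="addressLocality">Dallas</span>,
    <span itemprop="addressRegion">TX</span>
  </span>
</div>
```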
121. DMOZ Listed: Many believe that Google gives DMOZ listed sites a little extra trust.
122. Yahoo! Directory Listed: The algorithm might also have a special place for the Yahoo! Directory, considering how long it’s been cataloging sites.
123. Number of Outbound Links on Page: PageRank is finite. A link on a page with hundreds of OBLs passes less PR than a page with only a few OBLs.
124. Forum Profile Links: Because of industrial-level spamming, Google may significantly devalue links from forum profiles.
125. Word Count of Linking Content: A link from a 1000-word post is more valuable than a link inside of a 25-word snippet.
126. Quality of Linking Content: Links from poorly written or spun content don’t pass as much value as links from well-written, multimedia-enhanced content.
127. Sitewide Links: Matt Cutts has confirmed that sitewide links are “compressed” to count as a single link.
128. Organic Click Through Rate for a Keyword: Pages that earn a higher click-through rate may get a SERP boost for that particular keyword.
129. Organic CTR for All Keywords: A page’s (or site’s) organic CTR for all the keywords it ranks for may be a human-based, user-interaction signal.
130. Bounce Rate: Not everyone in SEO agrees that bounce rate matters, but it may be a way for Google to use its users as quality testers (a page that people quickly bounce from is probably not very good).
131. Direct Traffic: It’s confirmed that Google uses data from Google Chrome to determine whether or not people visit a site (and how often). Sites with lots of direct traffic are likely higher quality than sites that get very little direct traffic.
132. Repeat Traffic: They may also look at whether or not users go back to a page or site after visiting. Sites with repeat visitors may get a Google ranking boost.
133. Blocked Sites: Google has discontinued this feature in Chrome. However, Panda used blocked sites as a quality signal.
135. Google Toolbar Data: Search Engine Watch’s Danny Goodwin reports that Google uses toolbar data as a ranking signal. However, besides page loading speed and malware, it’s not known what kind of data they glean from the toolbar.
136. Number of Comments: Pages with lots of comments may be a signal of user-interaction and quality.
137. Dwell Time: Google pays very close attention to “dwell time”: how long people spend on your page when coming from a Google search. This is also sometimes referred to as “long clicks vs short clicks”. If people spend a lot of time on your site, that may be used as a quality signal.
139. Query Deserves Diversity: Google may add diversity to a SERP for ambiguous keywords, such as “Ted”, “WWF” or “ruby”.
140. User Browsing History: Sites that you frequently visit while signed into Google get a SERP bump for your searches.
141. User Search History: Search chains influence results for later searches. For example, if you search for “reviews” and then search for “toasters”, Google is more likely to show toaster review sites higher in the SERPs.
142. Geo Targeting: Google gives preference to sites with a local server IP and country-specific domain name extension.
143. Safe Search: Search results with curse words or adult content won’t appear for people with Safe Search turned on.
144. Google+ Circles: Google shows higher results for authors and sites that you’ve added to your Google+ Circles.
146. Domain Diversity: The so-called “Bigfoot Update” supposedly added more domains to each SERP page.
147. Transactional Searches: Google sometimes displays different results for shopping-related keywords, like flight searches.
148. Local Searches: Google often places Google+ Local results above the “normal” organic SERPs.
149. Google News Box: Certain keywords trigger a Google News box in the SERPs.
150. Big Brand Preference: After the Vince Update, Google began giving big brands a boost for certain short-tail searches.
151. Shopping Results: Google sometimes displays Google Shopping results in organic SERPs.
152. Image Results: Google elbows out organic listings with image results for searches commonly used on Google Image Search.
153. Easter Egg Results: Google has a dozen or so Easter Egg results. For example, when you search for “Atari Breakout” in Google image search, the search results turn into a playable game (!). Shout out to Victor Pan for this one.
156. Authority of Twitter Users Accounts: It’s likely that Tweets coming from aged, authority Twitter profiles with a ton of followers (like Justin Bieber) have more of an effect than tweets from new, low-influence accounts.
157. Number of Facebook Likes: Although Google can’t see most Facebook accounts, it’s likely they consider the number of Facebook likes a page receives as a weak ranking signal.
159. Authority of Facebook User Accounts: As with Twitter, Facebook shares and likes coming from popular Facebook pages may pass more weight.
160. Pinterest Pins: Pinterest is an insanely popular social media site with lots of public data. It’s probable that Google considers Pinterest pins a social signal.
161. Votes on Social Sharing Sites: It’s possible that Google uses shares at sites like Reddit, Stumbleupon and Digg as another type of social signal.
162. Number of Google +1’s: Although Matt Cutts has gone on the record as saying Google+ has “no direct effect” on rankings, it’s hard to believe that they’d ignore their own social network.
163. Authority of Google+ User Accounts: It’s logical that Google would weigh +1’s coming from authoritative accounts more than from accounts without many followers.
164. Verified Google+ Authorship: In February 2013, Google chairman Eric Schmidt famously claimed:
“Within search results, information tied to verified online profiles will be ranked higher than content without such verification, which will result in most users naturally clicking on the top (verified) results.”
Verified authorship may already be a trust signal.
165. Social Signal Relevancy: Google probably uses relevancy information from the account sharing the content and the text surrounding the link.
166. Site Level Social Signals: Site-wide social signals may increase a site’s overall authority, which will increase search visibility for all of its pages.
167. Brand Name Anchor Text: Branded anchor text is a simple — but strong — brand signal.
168. Branded Searches: It’s simple: people search for brands. If people search for your site in Google (e.g. “Backlinko twitter”, Backlinko + “ranking factors”), Google likely takes this into consideration when determining a brand.
169. Site Has Facebook Page and Likes: Brands tend to have Facebook pages with lots of likes.
170. Site has Twitter Profile with Followers: Twitter profiles with a lot of followers signals a popular brand.
171. Official Linkedin Company Page: Most real businesses have company Linkedin pages.
172. Employees Listed at Linkedin: Rand Fishkin thinks that having Linkedin profiles that say they work for your company is a brand signal.
173. Legitimacy of Social Media Accounts: A social media account with 10,000 followers and 2 posts is probably interpreted a lot differently than another 10,000-follower strong account with lots of interaction.
174. Brand Mentions on News Sites: Really big brands get mentioned on Google News sites all the time. In fact, some brands even have their own Google News feed on the first page.
175. Co-Citations: Brands get mentioned without getting linked to. Google likely looks at non-hyperlinked brand mentions as a brand signal.
191. Meta Tag Spamming: Keyword stuffing can also happen in meta tags. If Google thinks you’re adding keywords to your meta tags to game the algo, they may hit your site.
Off Page Webspam Factors
192. Unnatural Influx of Links: A sudden (and unnatural) influx of links is a sure-fire sign of phony links.
193. Penguin Penalty: Sites that were hit by Google Penguin are significantly less visible in search.
194. Link Profile with High % of Low Quality Links: Lots of links from sources commonly used by black hat SEOs (like blog comments and forum profiles) may be a sign of gaming the system.
195. Linking Domain Relevancy: The famous analysis by MicroSiteMasters.com found that sites with an unnaturally high amount of links from unrelated sites were more susceptible to Penguin.
196. Unnatural Links Warning: Google sent out thousands of “Google Webmaster Tools notice of detected unnatural links” messages. This usually precedes a ranking drop, although not 100% of the time.
197. Links from the Same Class C IP: Getting an unnatural amount of links from sites on the same server IP may be a sign of blog network link building.
198. “Poison” Anchor Text: Having “poison” anchor text (especially pharmacy keywords) pointed to your site may be a sign of spam or a hacked site. Either way, it can hurt your site’s ranking.
199. Manual Penalty: Google has been known to hand out manual penalties, like in the well-publicized Interflora fiasco.
200. Selling Links: Selling links can definitely impact toolbar PageRank and may hurt your search visibility.
201. Google Sandbox: New sites that get a sudden influx of links are sometimes put in the Google Sandbox, which temporarily limits search visibility.
202. Google Dance: The Google Dance can temporarily shake up rankings. According to a Google patent, this may be a way for them to determine whether or not a site is trying to game the algorithm.
203. Disavow Tool: Use of the Disavow Tool may remove a manual or algorithmic penalty for sites that were the victims of negative SEO.
204. Reconsideration Request: A successful reconsideration request can lift a penalty.
About the Author
Brian Dean is the founder of the popular SEO training blog, Backlinko.com. His site showcases insanely practical strategies that will help you generate higher rankings, more traffic and increased conversions.
Think of keywords as billboards. A billboard advertises your business to all the real-world traffic that passes by on the street. A keyword ranking advertises your business to all the cyber traffic that passes by on Google for the keyword you rank for. In the real world there are many factors that go into who sees a billboard, and one of the advantages of cyber traffic is that it is much more targeted. You may put up a billboard selling beer when 50% of the people passing it are under 21. That is terrible targeting.
High conversions and return on investment rely on the ability to target, and even hyper-target, potential customers. Because the cyber world is constantly collecting data, you can know ahead of time the amount of traffic to expect when you target keywords online. Google gets X searches per month for each individual keyword. If you are on the first page for that keyword, you will get a percentage of that traffic. The more keywords you rank for, the more traffic your site will receive.
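The volume-times-CTR reasoning above can be sketched as a back-of-the-envelope calculation. The CTR-by-position figures below are illustrative assumptions only (published CTR studies vary widely), not Google data.

```python
# Rough traffic estimate from a keyword's monthly search volume.
# The CTR-by-position figures are illustrative assumptions, not real data.
ASSUMED_CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def estimated_monthly_visits(search_volume: int, position: int) -> int:
    """Expected monthly visits if you hold `position` for a keyword."""
    ctr = ASSUMED_CTR_BY_POSITION.get(position, 0.02)  # rest of page 1
    return round(search_volume * ctr)

# 5,000 searches/month at position 1 vs position 5:
print(estimated_monthly_visits(5000, 1))  # 1500
print(estimated_monthly_visits(5000, 5))  # 250
```

The point is simply that ranking higher for the same keyword multiplies your share of a known, measurable pool of searches.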
Long Tail vs. Short Tail Keywords
A long tail keyword is a phrase made up of 3 or more words. “Buy Cheap TVs Online” or “Boston Car Accident Attorney” are examples of long tail keywords. These keywords generally have less search volume, but also less competition. Long tail keywords also tend to have higher conversion rates than short tail keywords.
Low Search Volume
A short tail keyword is a phrase made up of fewer than 3 words. “Accident Attorney” or “Buy Gold” are examples of short tail keywords. They generally have high search volume and high competition. They tend to have a lower conversion rate than long tail keywords because they aren’t as targeted.
High Search Volume
These rules are not always the case, but more often than not this is what you can expect.
Buyer Keywords vs Browser Keywords
A buyer keyword is a keyword that has commercial intent. Commercial intent is when somebody has the intention to spend money rather than browse. A browser keyword is a keyword that somebody would use to find information or something free. This keyword has little commercial intent so the person is not likely to buy. They are more likely to browse and get all their information, then buy at a later time. If your goal is to make money online you should focus on buyer keywords with commercial intent.
Buyer keywords:
“Buy iPhone 6”
“Canon 2500 prices”
“Tablets for sale online”
Browser keywords:
“Free software downloads”
“iPhone 6 information”
“Batman movie free download”
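To make the buyer-vs-browser distinction concrete, here is a toy Python classifier. The trigger-word lists are illustrative assumptions, not a real model of commercial intent.

```python
# Toy classifier for the buyer-vs-browser distinction.
# The trigger-word lists are illustrative assumptions only.
BUYER_TRIGGERS = {"buy", "price", "prices", "for sale", "cheap", "deal"}
BROWSER_TRIGGERS = {"free", "information", "how to", "download", "wiki"}

def classify_keyword(keyword: str) -> str:
    """Guess whether a keyword signals commercial intent."""
    kw = keyword.lower()
    if any(trigger in kw for trigger in BUYER_TRIGGERS):
        return "buyer"
    if any(trigger in kw for trigger in BROWSER_TRIGGERS):
        return "browser"
    return "unknown"

print(classify_keyword("Buy iPhone 6"))             # buyer
print(classify_keyword("Free software downloads"))  # browser
```

A real intent model would of course be far richer, but the principle is the same: words like “buy” and “prices” mark spenders, while “free” and “information” mark browsers.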
Ranking Different Platforms
When it comes to rankings there are numerous different platforms that you can use to achieve top results in the search engines. Some people favor one over the other, but the fact is you can rank all of them in Google.
Hangouts are videos that are created live and uploaded to Youtube. When it comes to Hangouts and regular Youtube videos or Youtube Live, keep this in mind: Youtube has a Domain Authority of 100, and Google favors products that it owns. Youtube videos and Hangouts can be used to achieve faster rankings than could ever be accomplished with a website. A video is 60X more likely to hit the 1st page of Google than a website. The biggest disadvantage with this type of ranking is that videos can be flagged and removed very fast, for no good reason. In competitive niches videos are regularly flagged by competitors, so the best strategy is to fly under the radar or to make a lot of accounts and spread the videos out among them.
Websites come in all designs, niches, platforms and sizes. With the right website you can rank for virtually any keyword you want, as long as you do the proper on-site optimization and link building. Once you can rate the competition, you will know what you can rank for with the site you have. WordPress is the most popular platform. A website is also much harder to get penalized than a video. Negative SEO does happen, but you have a window of time before you are hit by Google: whereas a video can be removed in 1 day, a website would take closer to a month due to the indexing factors of backlinks.
Amazon listings are good to rank if you are selling a product on Amazon. You can rank the listing in Amazon, but you can also rank the listing in Google. If you have a product page ranked in Google it’s a good way to earn affiliate commission or sales depending on the product.
How to Find Keywords
With so many keyword tools out there, how do you know which one is the best and which one will work for you?
Well, the only keyword tool that anyone needs is provided for free by Google, http://google.com/sktool/. This tool performs a number of functions that help when doing keyword research. See below.
1. Pulling Search Volume – First and foremost you need to know how many searches a keyword gets per month. Without this info you’re pretty much in the dark. This is where most experts will tell you to start when looking for the specific keywords that you are going to target. Simply put your keyword in the search box of the keyword tool and click search. Google will return the search volume for the keyword that you are researching.
2. LSI Keywords – Who better to tell you which keywords Google looks at as similar than Google itself? This is where broad match comes in if you are looking for LSI keywords for articles, websites, or any other type of ranking. Use the keyword tool provided by Google to find the similar terms. Just put your keyword in the box and click search. You will get a list of keywords that are related to your keyword.
3. Scraping Keywords – Have you ever wanted to find out what keywords your competitor is ranking for in Google? Now you can. Just enter the website that you want to pull keywords from in the website field of the keyword tool. This will pull the PPC, search results, and on-page keyword combinations for the website you are analyzing.
The only research that this tool isn’t great for is keyword generation. For that, I would suggest another free tool http://keywordshitter.com/. You just input a few keywords and it generates a list for you.
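The keyword-generation idea can be sketched in a few lines of Python: combine seed keywords with common modifiers to produce candidate phrases, which is roughly what generation tools like the one above do. The modifier list here is an illustrative assumption.

```python
from itertools import product

# Minimal keyword-generation sketch: pair each seed keyword with each
# modifier, in both orders. The modifier list is illustrative only.
def generate_keywords(seeds: list[str], modifiers: list[str]) -> list[str]:
    candidates = []
    for seed, modifier in product(seeds, modifiers):
        candidates.append(f"{modifier} {seed}")
        candidates.append(f"{seed} {modifier}")
    return candidates

ideas = generate_keywords(["car accident lawyer"], ["best", "cheap", "boston"])
print(len(ideas))  # 6 candidate phrases
```

You would then feed the generated list back into the keyword tool to filter by search volume.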
There are a ton of ways to use the Google keyword tool. Just experiment and see what you come up with.
Latent Semantic Indexing
LSI, or Latent Semantic Indexing, is a technical term for Google’s ability to associate related ideas without an explicit connection being present. A good example of this is if you type in “When was George Washington born?” The first result is a Wikipedia page about George Washington, even though the phrase “When was George Washington born?” doesn’t appear on the page. Google understands that this page has the information you need through a series of associations made about the website.
Google, through LSI, also sees some words as the same:
Top – Best
Lawyer – Attorney
Price – Cost
Accident – Wreck
These rules aren’t hard and fast as the results will change, but generally similar terms will be returned in the results.
Broad Match vs. Exact Match
Keyword competition research is an area that a lot of marketers struggle with. Sorting all the different numbers out can be confusing especially if you are just starting out. One of the big reasons is that they don’t understand the difference between broad match and exact match results when searching in Google. Knowing how to pull this number will definitely help you out when trying to determine the actual competition of a keyword that you want to rank for.
If you type the term Car Accident Lawyer Boston without quotes, you will get 487,000 results. Everything that relates to this term is being pulled into the results, which means that initially it looks like you will be going up against almost 500,000 competitors.
Now if you put the keyword in quotes, “Car Accident Lawyer Boston”, you will get back 57,500 results. That’s not as bad as the first number, but it still seems pretty competitive. Now click through to the last page of search results at the bottom of Google. You will know when you hit this page because Google shows its message about omitting entries very similar to the results already displayed.
At that point you get 85 results back. This is your real competition for that keyword. Dig deeper than the initially returned count to get the real competition for a keyword.
When looking at search volume in the keyword tool, remember that Google is set to broad match by default. You have to check the box under keyword options that says “Only show ideas closely related to my search terms”, or click on the tab that says “keyword ideas” next to the “ad group ideas” tab. Your keyword will be at the top, separated from the other keywords. This will give you a more accurate search volume number.
How To View Competition
There are many different levels of competition. With the right tools you can get an accurate number that will tell you the amount of competition involved in ranking the keyword. There are really only 2 things you need to consider when checking out the competition of a keyword.
1. Real Number of Competitors – The example above explains how to identify the real number of competitors fighting for a keyword. This number is extremely important; the broad match result number is virtually worthless. If you know how to target a keyword, the only number you need is the number of other websites actually targeting that keyword. Some experts say beginners should target keywords with fewer than 400 real competitors. This number used to be much higher, but Google has eliminated mass amounts of duplicate pages, so it is much lower today than it was in the past.
2. Authority of Competition – There are different ways to analyze the authority of the competition, and you need a realistic grasp of the authority of the sites you are going up against. Even if there were only 10 competing results for a keyword, if they are all authority sites it will be difficult to take over those rankings. Two of the most popular tools are Majestic SEO and Moz. Both of these tools use backlinks to grade websites on a scale of 0-100. A decent Majestic score for an average website is CF 20 TF 10; a decent Moz score would be DA 20 PA 20. Let me explain how the metrics are set up for each tool and then I’ll explain what they are used for.
Citation Flow – This metric focuses on the number of websites that link to that website. Generally Citation Flow is higher than Trust Flow.
Trust Flow – This metric focuses on the trust of the websites that link to that website, using the idea that trustworthy sites will naturally link to other trustworthy sites.
Domain Authority – DA is the number given to the overall domain and its strength when it comes to ranking. A site with a high DA will usually rank for terms that it isn’t specifically targeting, or for short tail / broad terms. Think of Youtube, Wikipedia, and Yelp.
Page Authority – PA is the number given to an individual page on the website. This predicts how likely it is for that page to rank for a particular keyword. PA seems to be secondary to DA for competitive terms. With a combination of high DA and PA, you have a website that will rank for competitive terms in the search engines.
MOZ stats for general competition – Average the top 5 results
Easy (DA|PA 1-20)
This would be a term like “Pawn Shops Sarasota”
Medium (DA|PA 21-30)
This would be a term like “Miami SEO service”
Difficult (DA|PA 31-60)
This would be a term like “Buy Gold”
Extremely Difficult (DA|PA 61-100)
This would be a term like “Buy Health Insurance”
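The “average the top 5 results” rule with the tiers above can be sketched in Python. The function simply encodes the table; the DA or PA scores are assumed to be looked up with Moz’s tools and passed in as plain numbers.

```python
# Encode the DA|PA difficulty tiers from the table above.
# Scores for the current top 5 results are assumed to come from Moz.
def keyword_difficulty(top5_scores: list[float]) -> str:
    """Bucket a keyword by the average DA (or PA) of its top 5 results."""
    avg = sum(top5_scores) / len(top5_scores)
    if avg <= 20:
        return "Easy"
    if avg <= 30:
        return "Medium"
    if avg <= 60:
        return "Difficult"
    return "Extremely Difficult"

# Average DA of the current top 5 for a hypothetical keyword:
print(keyword_difficulty([18, 25, 30, 22, 28]))  # Medium (average 24.6)
```

A single authority outlier can skew the average, so it is still worth eyeballing the individual top-5 scores rather than trusting the bucket alone.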
Strengths and weaknesses
Majestic is very good for finding high quality domains that are healthy in Google. It can be a little slow to index backlinks and off the mark on some search engine rankings.
MOZ has pretty much nailed competition in Google. The problem with MOZ is that a website can have a DA 60 PA 50 and be spammed to death or even deindexed from Google.
If you can, use both of the tools in conjunction with each other. The link below is for the free MOZ toolbar.
Check out this video with the MOZ numbers analyzing competition for different niches.