Styling Dynamically Created Content


After searching the Internet for a comparable article, I concluded that the following explanation is unique and hence worth posting, rather than a re-explanation of what other websites and blogs have already covered well. The topic of this post is how I found the specific configuration of cascading style sheet (CSS) attributes for my PHP-generated, holiday-themed order form. For those new to PHP, the name is a recursive acronym: PHP: Hypertext Preprocessor.

First, I made a version of the dynamically generated content without styling to identify the rough range of style values needed to reposition the elements where I wanted them. My holiday-themed web-to-lead form uses a sideways “Merry Christmas” graphic and a sideways “Happy Holidays” graphic. Unstyled, the images are displayed exactly in the order they are produced, i.e. immediately before the form.

<?php
echo "<img id='christmas' src='img/Merry_Christmas.gif' />
<img id='holidays' src='img/Happy_Holidays.gif' />
<form name='order_form' id='order_form' action='http://buymystats.com/order_received.php' method='post'>
[A bunch of form controls]
</form>";
?>

[Embedded image: BuyMyStats.com order form without styling (http://joeohlerjr.com/img/BuyMyStats_Order_Form_without_Styling.jpg)]

Because I want the “Merry Christmas” graphic to appear to the left of the form and the “Happy Holidays” graphic to appear to the right of it, I then wrote style attributes to change their positions. This placed the images where I wanted them relative to the document but bumped the form down by the height of the side-by-side images. The images are displayed above the form almost exactly in the order they are produced, but with considerable space between them, because the style attributes shift the locations of those images prior to display.

<?php
echo "<img id='christmas' src='img/Merry_Christmas.gif' style='position:absolute; left:72px; top:170px;' />
<img id='holidays' src='img/Happy_Holidays.gif' style='position:relative; right:-480px; top:132px;' />
<form name='order_form' id='order_form' action='http://buymystats.com/order_received.php' method='post'>
[A bunch of form controls]
</form>";
?>

[Embedded image: BuyMyStats.com order form with image styling only (http://joeohlerjr.com/img/BuyMyStats_Order_Form_with_Image_Styling_Only.jpg)]

Finally, I wrote style attributes for the form so that its position was bumped up into the space between the images. Everything is now as it should be! The images are displayed to the sides of the form as if the form had been produced just after the “Merry Christmas” graphic on its left and immediately before the “Happy Holidays” graphic on its right.

However, a peek at the source code reveals that both images were called before the form, and the earlier screen captures of the unstyled form prove this order of production. Modifying the position attributes was possible because the inline style sheet was written within the element tags; imported style sheets miss the dynamically generated content because that content does not exist when imported style sheets are applied during the initial page load.

<?php
echo "<img id='christmas' src='img/Merry_Christmas.gif' style='position:absolute; left:72px; top:170px;' />
<img id='holidays' src='img/Happy_Holidays.gif' style='position:relative; right:-480px; top:132px;' />
<form name='order_form' id='order_form' action='http://buymystats.com/order_received.php' method='post' style='position:relative; left:0px; top:-110px;'>
[A bunch of form controls]
</form>";
?>

[Embedded image: BuyMyStats.com order form with image and form styling (http://joeohlerjr.com/img/BuyMyStats_Order_Form_with_Image_and_Form_Styling.jpg)]

I used an inline style sheet because inline attributes of a dynamically generated tag are applied upon creation of that tag; there is no need for communication to or from a style-switching function. If you attempt to apply an imported style to page elements that have not yet been written to the output, e.g. elements dynamically generated within the HTML document, then the style will not be applied, because imported styles are applied upon initial page load and not to elements produced after page load via the JavaScript document.write() function or the PHP echo function. It is therefore easiest to code your intended style attributes within the output function, which guarantees that the HTML tags generated by that function are modified immediately by those style attributes.
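To keep those inline declarations from being scattered throughout the markup, one option is to gather them into a PHP array and interpolate them as each tag is echoed. The following is only a hypothetical refactoring of the form above; the array keys are my own names, and the values restate the styles already shown.

<?php
// Hypothetical refactoring: keep each element's inline style in one array
// so every dynamically echoed tag carries its positioning at creation time.
$styles = array(
    'christmas'  => 'position:absolute; left:72px; top:170px;',
    'holidays'   => 'position:relative; right:-480px; top:132px;',
    'order_form' => 'position:relative; left:0px; top:-110px;',
);

echo "<img id='christmas' src='img/Merry_Christmas.gif' style='{$styles['christmas']}' />";
echo "<img id='holidays' src='img/Happy_Holidays.gif' style='{$styles['holidays']}' />";
echo "<form name='order_form' id='order_form' action='http://buymystats.com/order_received.php' method='post' style='{$styles['order_form']}'>";
// [A bunch of form controls]
echo "</form>";
?>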

Update: Although I have taken the holiday light show off my main page as of January 8, you may enjoy it here.

52 Reasons Google Penalizes Websites

 
07-10-2014 Update:
The Panda 4.0 update to the Google search algorithm has worked its way through most websites with a non-zero PageRank. Due to the refinement of keyword-specific penalties included in this fourth major iteration of the Panda filters, sites that are already at the bottom SEO-wise are expected to remain there unless manually updated.

Alexa Rankings were bouncing all over but have settled into a jiggle, as the constant influx of new sites and the hundreds of minor modifications made by search engines make a steady placement somewhat rare. Nonetheless, the findings in this article appear to remain technically sound and practically effective.

12-10-2012 Original:
Determining website relevancy is a zero-sum game: what is more relevant to one cluster of keywords must necessarily be less relevant to keyword clusters having an opposite or nearly opposite semantic meaning. Therein lies the double-edged sword of search engine algorithms!

Opinions abound on the impact Google has had on the Internet. Some say it is on an ambitious, pseudo-insane mission to give users access to unique content, accurate data, precise information, cool merchandise, and the finest writers. Google periodically modifies and meddles with its algorithms in hopes that the best of the web gain the exposure they deserve.

Others decry the loss of human-curated directories such as the original Yahoo! Directory. Who better to evaluate the relevancy of a website to a particular set of human interest topics than an interdisciplinary team of human experts? Unfortunately, the vast expansion of Internet content means automation was bound to make inroads.

Instead of having your website sorted alphabetically within each germane categorical topic, you are now subject to a labyrinthine series of content analyzers that unit-test your web pages against potential topics of semantic association. Modern web browsing is less like walking through a library and more like using a lexical magnet to retrieve ambient content.

History of the Google Penalty

Although live since October 1998, Google did not begin changing its ranking algorithm until December 2000. That update also marked the debut of its toolbar extension, thereby setting off the annoying fad of every Internet search portal including a toolbar as part of its promotional efforts.

Google’s most aggressive modification of its PageRank algorithm to date debuted in 2012. Code-named “Penguin,” this edition of the Google algorithm is alleged to have penalized over 10% of page listings in just one day. Naturally, everyone who fancies himself an “SEO professional” has been hemming and hawing over the best way to exploit Google’s latest nuances. If a future update forces them to switch gears again, that means more work (and money) for them in helping panicked clients!

Identifying a Google Penalty

Available in both manual and automatic forms, the so-called “PageRank penalty” is Google’s means of encouraging conformance with its prejudiced worldview of how the Internet should be sorted. While Google publishes previews of the major changes in its algorithm updates, it rarely comes clean about all of its reasons for those changes.

Google has a policy of emailing site owners to whom a manual penalty has been applied. But because algorithmic penalties are ongoing calculations over many variables, it would cause excessive mail server activity if each site owner were to receive an email every time anything at all impaired their website’s PageRank.

Clear indicators of algorithmic penalties include the following:

1) Your website is no longer on the first SERP when Googling your brand name. If you’re not #1 for your own website name — especially if your website name is one-of-a-kind — then chances are your site has been punished.

2) You’ve maintained valid hyperlinks and continually fresh content on your website, but its PageRank has fallen. Hitting zero means Google definitely deep-sixed your site!

3) You find a listing for your website — but it is for a page other than your homepage. Inbound links from external websites having good Page Rank may have bolstered individual pages on which you have popular content — in contrast to the rest of your website, which has suffered the brunt of the Google penalty.

4) Your website is nowhere to be found when you search for “site:[yourURLhere]” or “inurl:[yourURLhere]”.

5) Your website no longer appears in cached search results despite your drop in SERP being recent.

Because so many site administrators worship at the Temple of Google — mistaking it for the be-all, end-all of search engines — here I lay out the petty parlor tricks of the wacky wizards behind the “Big G” curtain: any of the 52 following tactics will peg you down on Google, which has nothing better to do with its engineers’ lives than to continually tweak and revise the way it indexes content. (They are literally paid to be busybodies! And I’m certain their gold-digging spouses enjoy the lifetime pension they get just for hitching up with a Google employee.)

Causes of Automatic Google Penalties

A comprehensive review of strategic goofs, design gaffes, and other faux pas reveals 52 distinct ways to alert Googlebot that your site is “gaming the algorithm.” These various errors may be more or less categorized into 4 classes, each of which I address in turn.

Email this article to your friends — and shield it from your competitors!

Content Clustering

1) Word-for-word duplication of page content. Search Google for verbatim copies of your content before you publish; at least change a few words here and there if you’re set on going live with similar content.

2) Stolen content. You might not steal content, but anyone else can steal yours. Most site admins comply with DMCA takedown notices, but sometimes you’ll need to weigh the benefit of pursuing infringers through the courts against the time and expense of doing so.

    The plagiarism might not be verbatim, but if Google notices enough to penalize your PageRank, then you may as well ask it to remove the duplicate content. Time stamps come in very handy for showing your copy was first!

3) Clustered links in your footer. Don’t cram a bunch of links in the footer — or near the end of a page — because Google interprets this as a last-minute effort to append links onto otherwise unrelated content. So if you’re publishing citation-heavy research online, then consider using a different web page to list each chapter’s footnotes.

    Using endnotes can really deep-six your PageRank because you would have tons of hyperlinks on one page with only a few lines of text separating each. For the sake of Google’s algorithm, a page of hyperlinks is pretty much the same as a cluster-linked footer.

4) Too many meta keywords. The general consensus under Penguin is to include no more than five comma-delimited strings, totaling no more than 140 characters.
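If you want to audit a page against that guideline programmatically, a minimal PHP sketch follows; the five-string and 140-character limits are the figures quoted above, and the function name is mine.

<?php
// Sketch: check a meta keywords string against the limits cited above
// (no more than five comma-delimited strings, 140 characters in total).
function meta_keywords_ok($keywords) {
    $strings = array_map('trim', explode(',', $keywords));
    return count($strings) <= 5 && strlen($keywords) <= 140;
}

var_dump(meta_keywords_ok('css, php, order form, holiday, web-to-lead')); // bool(true)
?>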

5) Identical meta tags across pages. Although identical tags unify the theme of your site in theory, the reality is that Googlebot sees them as a failure to differentiate descriptors for discrete pages from your homepage descriptors. Be sure to change the metadata before pushing out any new content. Although some repetition is inevitable as your site grows, maintain a table that shows which pages were published when, and with which meta keywords.

6) Syndicated content. Unless your website is the only authorized publisher of the online counterpart to content originally published offline — such as the digital edition of a print newspaper — you should avoid content syndication because this produces duplicate online content. And whether the identical content is stolen or syndicated, Google takes away points for the redundancy.

7) Non-HREF portions of anchor tags are overused. The inner HTML and title text of anchor tags should be no more than 60 characters each. The inner HTML between the opening and closing tags should differ from the title attribute value of the same pair while communicating a similar idea within the same language. Avoid chunking the same keywords more than once every 1,000 characters.

8) More than one H1 tag every 4,096 characters. Through extensive testing of H1 frequency on long pages of text, Joseph Ohler, Jr. has inferred the golden ratio to be one pair of H1 tags per 4 KB, the maximum density that will not cause a penalty. If you place H1 tags more frequently than once every four kilobytes, then prepare to see your PageRank dip.
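As a rough self-test against that heuristic, consider the PHP sketch below; the one-pair-per-4,096-characters threshold is the inference stated above, not a published Google figure.

<?php
// Sketch: compare the number of H1 tags on a page against the claimed
// maximum of one per 4,096 characters of markup.
function h1_density_ok($html) {
    $h1_count = preg_match_all('/<h1\b/i', $html, $matches);
    $budget   = max(1, (int) floor(strlen($html) / 4096));
    return $h1_count <= $budget;
}
?>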

9) Keywords form awkward sentence fragments. Similar to the H1 rule, Google doesn’t want many semantically similar phrases adjacent to each other, or else they will be counted as keywords close enough in the markup to be “clusters” and therefore ripe for penalty. Ad copywriters are most guilty of this!

    Brainwashed by the “more is less” mentality, most copywriters cram everything into a few sentences such that keyword density cannot help but be high. Copywriting is truly the most doggerel form of “writing!”

    So if you’re writing text for a skate sharpening website, then avoid mentioning “figure skates,” “hockey skates,” and “ice skates” in the same sentence because the triplicate of any word that isn’t an article, pronoun, linking verb, etc. will trigger a penalty. Similarly, a list of cities served should be separated into passing references in narrative form, as if your site visitor were reading an excerpt from a novel.

10) Recycled content. Although not quite duplicates of other content such as stolen or syndicated material, reusing older content on newer pages is akin to hosting your own content farm. Even if you change a few words in each paragraph, modern search algorithms are sophisticated enough to match by word frequency in addition to exact string matching.

11) Gibberish, nonsense characters, and other text that doesn’t match Google’s semantic database on your site’s base language: All these give the appearance of hacked content.

    A surefire way to have your website marked as a hacker’s hangout is to use Trojan executables, ActiveX controls, Flash objects, or never-ending JavaScript loops that exhaust memory and effectively freeze the browser session.

12) Use of multiple homepages, indices, and default landing pages. Google wants website owners to publish only one greeting page because to have several or many would create more opportunities for misleading landing page text. Either you make the most of your homepage content, or you dilute your website’s apparent purpose and usefulness in the opinion of the Google algorithm engineers.

13) Over-optimization. Obeying Google’s SERP sorting rules to extremes results in sub-optimal search results, as the intended behaviors are adopted by marketers who minimize value-added content on their pages but are really good at following the rules. Google wants the greatest emphasis to be on the content itself, rather than on how content is arranged, and therefore punishes websites that inflexibly apply its algorithmic principles.

14) Spam-associated keywords. If your website contains phrases often used by marketers, such as “[earn | make | save] more money,” “[live | look | feel] better,” and “grow larger,” then you will need to compensate for the inevitable downgrade to your SERP that such words produce. Experiment with less spammy messages such as euphemisms.

15) Glitch-ridden mobile websites. A browser’s inability to detect mobile device features may result in failure to redirect users — and therefore in Googlebot missing your mobile website completely, especially if this glitch is present when your main website is accessed through Chrome!

    If you continue to have issues with mobile detection, then consider having a mobile-only landing page — but cap your number of landing pages at two (desktop and mobile) to minimize your “redundant homepages” penalty.

16) Images outnumber text blocks. Your “img” tags should number fewer than one for every two 280-character blocks of non-img HTML. Why 280 characters?

    Because this equates to four 70-character lines of code, a line length you typically see in non-maximized Notepad. You therefore have a standardized way of measuring textual quantity without having to count pairs of “p” tags, “br”-separated lines of text, etc.

    Also remember that “alt” text does not count toward either side of the image/text ratio; it is ignored in Googlebot’s tally of your img/non-img density. Whether or not the abundant images are advertising, they cause a SERP penalty unless counterbalanced by internally consistent text.
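For those who prefer code to prose, here is a minimal PHP sketch of that ratio test; it merely restates the 280-character rule above, and the crude tag matching is my own shortcut.

<?php
// Sketch: count <img> tags, strip them out, then compare the count against
// one-half of the number of 280-character blocks of remaining markup.
function img_ratio_ok($html) {
    $img_count = preg_match_all('/<img\b[^>]*>/i', $html, $matches);
    $non_img   = preg_replace('/<img\b[^>]*>/i', '', $html);
    $blocks    = strlen($non_img) / 280;
    return $img_count < 0.5 * $blocks;
}
?>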

Number and Type of Inbound Links

17) Many links from sponsored websites. Even if you haven’t paid for any sponsored inbound links, be vigilant for links from too many directories. A few links from high-traffic portals are desirable for flow-through, but toe the line on using Craigslist, eBay, or other easily proliferated pages to spam toward your site.

18) Dead links between pages in your site. Also known as “internal 404 errors,” any hyperlink that requests a resource on the same domain without resolving it to a valid URI results in punishment to your site’s placement in Google searches.

19) Links between sites in different languages. Some say Google doesn’t want you to be multicultural, so “do no evil” and stick with one language for your inbound links. “Take that, globalists!”

    But the truth is that different languages use different character sets in their URLs, and Google doesn’t want to transcode these unless it absolutely needs to because matching and resolving character equivalents on UTF-8 tables takes a lot of processing power on the server end.

20) More than a few inbound “rented” links. Unlike a permanent directory, some ads link to your website for a relatively brief period and then sever the connection. When the “here today, gone tomorrow” links accumulate, Googlebot detects you’re gaming the system and marks you down.

    The easiest solution is to boycott periodically erased inbound links. Another valid, but more time-consuming, solution is to purchase a few rented hyperlinks for very long periods (in the online sense of time) such as two or more years. This ensures longevity in the links, so long as you keep tabs on the click-throughs to ensure the advertising service is delivering for the entire length of the contract.

21) No designated sitemap. Although typically of trivial use to most visitors — especially if you embed a Google search box on each page — search bots rely on the XML sitemap to parse your site’s structure and assign importance to layers of nested pages accordingly.

    This tends to be a penalty from the start for many sites because it is easy to overlook this detail when basking in the satisfaction of finally populating all the subsection pages; a sitemap with multiple 404 errors would be useless or worse for SEO purposes.

    Whether or not you expedite recognition of your sitemap through the Google Webmaster sitemap submission form, search engines will catch on within a week of your sitemap being created or updated — rewarding you accordingly. Sites whose XML maps are deleted will decline at least a few spots in the Alexa Rank unless immensely popular.

22) Too many new links at once. Googlebot compares your present number of hyperlinks to your most recent number — perhaps also to even earlier counts as a measure of long-term change to supplement its calculation of your short-term bump. This applies to both inbound and outbound links, although Google penalizes a growth of inbound links more severely because it indicates a pollution of the wider Internet.

    Once again revealing the occluded, Google reverse-engineer extraordinaire Joseph Ohler, Jr. has devised the following rule of thumb for a Google-safe rate of link building: An increase of more than 10 percent of your total link volume — or of more than twelve links, whichever is greater — is the threshold for a Google link penalty; progressively more severe punishment is doled out according to any link creation over that within the same three-day period.

    For a self-promoting example, links were gradually added to this article on an ongoing basis. When something noteworthy came up after the original publication date — such as the SME consulting book Small Business Supercharged — the author made certain to re-populate the article page accordingly.
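Expressed as code, that rule of thumb might look like the following PHP sketch; the 10-percent and twelve-link figures are the estimates above, and the function name is mine.

<?php
// Sketch: the claimed safe allowance per three-day window is the greater
// of 10 percent of the prior link count or twelve links.
function link_growth_ok($previous_count, $current_count) {
    $allowance = max(0.10 * $previous_count, 12);
    return ($current_count - $previous_count) <= $allowance;
}

var_dump(link_growth_ok(200, 215)); // bool(true): 15 new links <= max(20, 12)
?>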

23) Manual reporting of your website as spam. The most efficient means of cutting someone down via black hat SEO is to post spammy links to your website in various online forms — none of which I’ll list here, for obvious reasons.

    Enough careless insertions of your website URL in places where it violates Terms of Service or otherwise annoys human users will result in it being reported manually to the forum owner and to the spam page in Google Webmaster.

24) Links in forum signatures. Although hyperlinks in your email signature do not cause an SEO penalty due to being inaccessible to Googlebot, such recurrent links when you post email-type “bulletin board” or “thread” messages online will be scanned and marked as duplicate.

    When this happens among pages, irrespective of whether the topic threads have very different keywords, the sorting algorithm will negatively factor these hyperlinks as shoehorned into a discussion and therefore unrelated to the page content.

    The “rel='nofollow'” trick is not a solution on its own because links will still display as searchable text and therefore count toward keyword frequency. The combination of 'nofollow' and URL obfuscation — transcoding your human-readable link into a quasi-random string of characters, usually shorter than a deep link — does ward off an SEO penalty because there are no site-related keywords for Googlebot to read. By injecting “rel='nofollow'” and obfuscating hyperlink text, Twitter provides a helpful means of distributing access points to your web page.
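For illustration, a signature link combining both measures could be emitted as below; the shortened URL is a made-up placeholder, not a real redirect.

<?php
// Illustrative only: a forum-signature link carrying rel='nofollow' plus an
// obfuscated short URL, so no site-related keywords appear in the anchor.
echo "<a href='http://t.co/AbC123x' rel='nofollow'>my site</a>";
?>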

25) Incorrect language designation. Putting foreign terms in your anchor text, when the surrounding non-linked text is of domestic tongue, indicates deception in the worldview of Google engineers. This penalty applies irrespective of whether you use the ‘hreflang’ attribute within your anchor tag.

26) Inability to establish a site connection. This differs from a 500-type HTTP error: Whereas a numbered HTTP error indicates the website has only some temporarily unavailable URIs, a non-response when your browser pings the site indicates a complete shutdown.

    Common causes include distributed denial of service (DDoS) attacks, server software hangs, and physical destruction of server disks. This is perhaps the most salient reason why site owners and webmasters look for an itemized list of protections against these maladies when examining hosting services that claim “99.9% uptime.”

27) Load times greater than 10 seconds. Besides annoying users and raising your “bounce rate” of visitors who leave without clicking on subsequent links within the site, longer loading periods subtract from your PageRank in proportion to how much of the 60-second connection request timeout elapses before the document layout is rendered.

28) Domains containing penalized keywords. The fear of accidental anchor text stuffing by linking to a site too often is overblown. Far more important is the absence of vice-related keywords in your inbound links, because Google designed its search algorithm to extract a vice tax in the form of a roughly five percent overall penalty when links to your page include “cash,” “money,” or other blatantly commercial words. Example: “Shop and buy here, you sheep!”

29) Temporary links from unrelated websites. Whether from pay-to-click (PTC) ads or from “site of the month” directories, temporary links may boost your Alexa Rank a few spots, so long as you either maintain the links indefinitely by purchasing longer “link rental” periods or wait a few months before re-activating those links.

    Otherwise, it will appear to Googlebot that you are receiving intermittent semantic connections from other websites that are abruptly cut off, a linkage pattern that mirrors the creation and removal of spam. And a resource that fails to load within 60 seconds will return a 500 error — so keep your scripting and embedded objects per page to a minimum!

    SEO experts advise having no more than 4MB of data on a desktop-optimized page and less than 200KB of data on a mobile page. This file size restriction on documents designed for rendering in viewports also helps mobile visitors remain within their data plans.
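Those ceilings are easy to restate as a check on rendered page weight; this PHP sketch uses the figures quoted above, and the function name is hypothetical.

<?php
// Sketch: 4 MB ceiling for desktop-optimized pages, 200 KB for mobile.
function page_weight_ok($bytes, $is_mobile) {
    $limit = $is_mobile ? 200 * 1024 : 4 * 1024 * 1024;
    return $bytes <= $limit;
}
?>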

Number and Type of Outbound Links

30) Invalid outbound links. Even a few hyperlinks to content-less URIs will trigger a penalty. Link maintenance is why manually checking that your hyperlinks match their intended destinations should be part of every SEO plan. An automatic ping bot can check for live-or-dead status but not whether a link still directs to the resource it originally did.

31) Question marks and ampersands in many of your outbound links. Although variable delimiters are common and necessary for many web forms, the use of such characters in outbound URIs indicates the transmission of variable data to another website. Google presumes most of these situations involve affiliate websites, so use external links containing such characters sparingly.

32) The same outbound links on multiple pages. Traditional menus of internal links to sections of a website are okay — even necessary to prevent a long load time from having everything on one extremely lengthy page — but relegate each external link to occasional frequency.

33) Duplicate hyperlinks and nonsense text in your comment section. Don’t rely on automatic spam detection to screen out such content; check each comment-able page at least weekly for possible encroachments upon the quality of your commenters’ content. Although visitors may own the rights to the comments they post, search engines hold you accountable for letting that content stand.

34) External links to sites that have poor SEO. Although inbound links from sites having a bad reputation — such as file-sharing sites and those rated as adult content — are largely beyond your control, it is critical to have zero outbound links to such domains. “Guilty by association” is what search engine algorithm engineers believe, so keep your nose clean.

35) Text that is heavily loaded with, and has multiple links to, a particular brand name, organization name, or other proper noun. Whether or not the page is an in-depth article on a sufficiently narrow subject — as indicated by its scope being limited to frequent mentions of that proper noun but not of other identities — Googlebot infers that such a page is an “advertorial” or opinion piece about a product.

    This classification can happen even if the page content is more like an anti-advertorial, such as one advocating the anti-university crusade. And NO — guest bloggers are NOT a cause for a penalty, so long as their text is posted through your site’s user handle and not reproduced elsewhere.

36) Use of redirection for more than a few days. Although a 301 redirect is acceptable to Googlebot for up to 72 hours, it will punish the perceived laziness of a site administrator who does not update inbound links to reference the new target URL. And any recent penalties on the redirected source site may be applied to the target site! Proceed with caution.

37) 500-class HTTP codes. Although not as penalizing as the dreaded 404 error, nor as transferring one site’s penalty to another the way a 301 redirect does, miscellaneous errors — anything other than code 200 — will factor into Google’s determination of how far down the SERP you go.

    The most common such error occurs when your resources are present but unavailable. Too much traffic at once will occupy all ports on the server hosting your site, which is why the ‘Net’s busiest websites use one or more mirror servers to deliver the same domain. This is different from a site mirror, which serves identical content at a different domain.

38) Your domain is newly purchased. Besides the penalty from the domain being offline before you populated the site, previous owners may have used SEO practices that violated the search engine sorting rules of the time. Any red flags that seemingly expired could still be present in the form of residual penalties that won’t go away until years have elapsed. That doesn’t mean the site isn’t worth having, though.

39) Ill-intentioned inbound links. The most famous example — since made more difficult due to algorithmic tweaking — is the so-called “Google bomb” in which mischief makers put unflattering words into the anchor text of one or more hyperlinks pointing to your website. The result was that the targeted website would appear when users Googled the keywords contained in the deviously devised hyperlink.

    If your website shows up in searches for questionable search terms, then chances are one or more bloggers, website administrators, social media users, etc. have produced enough hyperlinks to your website containing those very same keywords in the anchor text. Don’t sweat it, as the relevance of those links will decline as time goes by.

40) Participation in a blog network. Wonder why Millennial Blog Ring never took off? It’s because Google sees the reciprocal links and punishes your site accordingly.

41) Linking from a “content farm” or other website with dozens of interlinked articles will knock your website down a few pegs in the rankings. This means that too many inbound links from blogs will penalize you!

    Spammy copy gives you spammy SERPs. If you’re that hard up for original content, then hire an aspiring writer to be featured prominently on your website so that he or she will provide heart-felt content that is adequately researched and therefore substantive.

42) Infrequent outbound links. Wanting to promote ‘Net citizenship, Google penalizes websites having far fewer anchors to other domains than to one’s own pages. An inspiring entrepreneur has identified the minimum non-penalty outbound/inbound ratio of links to be 1:8. Your Page Rank won’t suffer too greatly if you have a slightly lower ratio such as 1:10, but your Alexa Rank may show slippage.

    It is possible for a webmaster to own multiple websites and selfishly link only to his or her domains without penalty, especially if the registration for those domains is private and therefore un-knowable by Googlebot or its overseers. So long as you don’t duplicate content or make too many hyperlinks reciprocal between pairs of specific pages, your hyperlinking should be penalty-free.

43) More than twice as many outbound links per page as inbound links. Whether you operate a product catalog, a news portal, or some other type of website, Google presumes the most relevant websites are linked from enough external pages to equal at least half the number of links to other sites.

    Otherwise, the site treads close to the “link farm” designation on the sliding scale of “organic” versus “link grid.” Trading links with people on unrelated semantic topics — however Google secretly defines those boundaries — implies the links have been exchanged for the sake of mutual SEO benefit rather than for tying similar content together.

44) One-to-one link reciprocity for over a third of your links. Although seen by some webmasters as a fair way to exchange value, Google thinks otherwise and penalizes your site for returning the favor to websites that link into yours.

    Through trial and error, SEO Master Joseph Ohler, Jr. has estimated that at least 30 calendar days must elapse between when a site is externally linked and when the target site “pings back” to the source site and creates a reciprocal link. Otherwise, “quid pro quo” and “logrolling” are how Google sees exchanges of links that occur too closely in time.

    The closer your inbound-outbound hyperlink ratio, the lower your percentage of reciprocal links should be, to compensate for the sheer volume of calculated co-promotion.

    For example, consider a website that cumulatively has 10 outbound links and 30 inbound links, with only 10 of those 30 incoming hyperlinks also being destination URLs in outbound hyperlinks. It will place higher than a comparable, non-clone website (controlling for the “copied material” penalty) having the same link statistics but with 20 of the 30 inbound links sharing URLs with outbound link href attributes.

    Conversely, the second site could raise its inbound links to 40 while keeping 20 reciprocal, such that half its links share source and destination URLs; but the latter site will not take precedence in the Alexa and Page Ranks, due to its absolutely greater quantity of reciprocal links despite having the same relative percentage of “inorganic” reciprocal links.

45) From Penalties 42, 43, and 44, we may infer another rule: the penalty-free ratio of outbound to inbound links ranges from 1:2 through 1:8; and the corollary: the penalty-free ratio of inbound to outbound links ranges from 2:1 through 8:1.
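Restated as a PHP sketch using the same bounds (the helper name is mine):

<?php
// Sketch: a penalty-free outbound-to-inbound ratio falls between 1:8 and
// 1:2, per the inference above.
function link_ratio_ok($outbound, $inbound) {
    if ($inbound == 0) {
        return false; // having no inbound links at all is its own problem
    }
    $ratio = $outbound / $inbound;
    return $ratio >= 1 / 8 && $ratio <= 1 / 2;
}
?>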

Compromised Data Visibility

46) Invisible links. If Googlebot infers that a human visitor cannot see a link on your page, then it will penalize your site accordingly. Characteristics of a “hidden” hyperlink include a z-index below that of an overlapping element; anchor text made transparent via an alpha / opacity value of 0; anchor text set to hold the same or similar color value as the background; and anchor text that is positioned absolutely or relatively off the rendered, scrollable body.
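A crude PHP sketch of how such signatures might be detected in inline styles follows; real detection would also need computed styles and a comparison of text color against the rendered background, so this covers only the obvious cases.

<?php
// Sketch: flag inline styles showing the hidden-link signatures listed
// above, i.e., zero opacity or far off-screen positioning offsets.
function looks_hidden($style) {
    $zero_opacity = preg_match('/opacity\s*:\s*0(\.0+)?\s*(;|$)/i', $style);
    $off_screen   = preg_match('/(left|top)\s*:\s*-\d{3,}px/i', $style);
    return (bool) ($zero_opacity || $off_screen);
}

var_dump(looks_hidden('position:absolute; left:-9999px;')); // bool(true)
?>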

47) Inline frames hiding your content. Embedded pages overlying your content may have much or little content of their own, but placing inline frames over text or keyword-stuffed image captions communicates deceptive intent. Use inline frames sparingly, and then only in non-lexical pixel areas.

48) Content about circumventing Googlebot. Formerly an urban legend, this has since been observed in the field: otherwise SEO-optimized websites have been pushed down the SERP for what should be their highest-ranking keywords — or given a PageRank of “0” altogether. Further examination reveals these almost-perfect websites committed the cardinal sin of advocating the very overthrow of the web’s Google overlords.

    Thinking itself to be the God of the Internet, Google casts out these rebellious websites from the lofty positions they once held in search results or had aspired to achieve. Meanwhile, these fallen websites create their own pecking order that is both informal and infernal, often twisting the Google spokesman’s last name into a vulgar term for the female anatomy.

49) Making outbound links to unrelated pages. Although few non-spammers would link to content not directly related to their own websites, Google interprets links to sponsors as precisely that. Simply add rel="nofollow" to such outbound hyperlinks and keep them out of your RSS feed.

50) Disallowing indexing. Although you have every right to block search bots from crawling any or all of your website, no more than a tenth of your URIs should lead to non-indexable files. If a file is not confidential, then keep it out of robots.txt.
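That one-tenth guideline reduces to a pair of counts; here is a minimal sketch, assuming you already know how many of your URIs robots.txt blocks.

<?php
// Sketch: no more than a tenth of a site's URIs should be non-indexable,
// per the guideline above.
function disallow_ratio_ok($blocked_uris, $total_uris) {
    return $total_uris > 0 && ($blocked_uris / $total_uris) <= 0.10;
}
?>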

51) Script-generated links. If a hyperlink is not present as an anchor tag when your page is loaded with scripts disabled, then it shouldn’t be there when your page is loaded with scripting enabled. Google engineers believe dynamically created links should not be used except on pages intended to be kept confidential; non-confidential pages will therefore be treated as if they are hiding something, which they are.

    This helps the web consumer not only in terms of sorting content relevancy but also in minimizing the number of ads and tracking beacons that are loaded via scripted construction of a dynamic URL. Note this differs from linking to an Amazon product page that has one or more variables in its URL; that is an argument-loaded affiliate or “sponsor” link.

52) Hiring any SEO consultant who claims these rules may somehow be circumvented. Anyone who so much as implies that your website “could potentially” jump more than a page or two in relevant SERPs, without refactoring the human-readable content of the website, is leading you on. Lead him or her out the door!

Handling a Penalty with Grace and Aplomb

Once you’ve determined the cause of your penalty, the fix should be implemented incrementally. Don’t go overboard in making sweeping changes, or else Google might punish you for swinging in the other direction! Here are steps to absolve your Google sins:

1) Keep calm, and remember that everyone suffers a penalty now and then. We’re humans, not Googlebots!

2) Appeal any manual penalties for which you can demonstrate unintentional error. Google knows from your update activity whether you’re a habitual offender or perhaps showed bad judgment only once.

3) Contact the administrators of websites linking to your website, and politely but assertively tell them to unlink from your domain.

4) Inform Google that you don’t want certain domains linking to your website. Be stoic about this; it’s not your fault if you petition for link removal and are denied, as Google has the final say.

5) Wait for Google’s nature to take its course. Re-indexing your newly compliant website may take a few business days, especially if you sent an email about what needs review.

It is often easier to wait for Google to modify its algorithm yet again — in hopes it will reverse some of its penalties — than to redirect a highly penalized website onto a new domain. But whatever you do, don’t become too invested in your SEO strategy because it can all come undone tomorrow if Google says so.

If you feel upset toward Google — or any search engine — for its heavy-handed approach to indexing, then politely explain why and how its engineers and quality assurance testers might not have foreseen the difficulties you and others are experiencing. Even the Google Apologists opine on how things could potentially improve, and public comment is part and parcel of total quality management.