A Darker Shade of Gray

http://www.seobook.com/darker-shade-gray

Google's original breakthrough in search was placing weight on links and using them to approximate the behavior of web users.

As the original PageRank paper put it: "The importance of a Web page is an inherently subjective matter, which depends on the readers interests, knowledge and attitudes. But there is still much that can be said objectively about the relative importance of Web pages. This paper describes PageRank, a method for rating Web pages objectively and mechanically, effectively measuring the human interest and attention devoted to them. We compare PageRank to an idealized random Web surfer. We show how to efficiently compute PageRank for large numbers of pages. And, we show how to apply PageRank to search and to user navigation."
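For the technically inclined, here is a minimal power-iteration sketch of the PageRank idea in Python. The four-page link graph is made up for illustration & 0.85 is the commonly cited damping factor, not a value taken from the paper's examples:

```python
# Toy link graph: each page lists the pages it links out to (illustrative only).
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

damping = 0.85  # probability the random surfer follows a link vs. jumping
pages = list(links)
rank = {page: 1 / len(pages) for page in pages}

for _ in range(50):  # power iteration: repeat until ranks stabilize
    new_rank = {page: (1 - damping) / len(pages) for page in pages}
    for page, outlinks in links.items():
        share = rank[page] / len(outlinks)  # rank flows out evenly along links
        for target in outlinks:
            new_rank[target] += damping * share
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```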

Back when I got started in the search game, if you wanted to rank you threw more links at whatever you wanted to rank & used the anchor text you wanted to rank for. A friend (who will remain nameless here!) used to rank sites for search queries in major industries without so much as looking at them. 😀

Suffice it to say, as more people read about PageRank & learned about the impact of anchor text, Google had to evolve their algorithms to counteract attempts to manipulate them.

Over the years since, as Google has grown more dominant, they have been able to create many other signals. Some signals might be easy to understand & explain, while signals that approximate abstract concepts (like brand) may be a bit harder to understand or explain.

Google owns the most widely used web browser (Chrome) and the most popular mobile operating system (Android). Owning those gives Google unique insights, so they do not have to put as much weight on a links-driven approximation of a random web user. They can see what users actually do & model their algorithms on that.

Google considers user experience an important part of their ranking algorithms. It was a significant part of the heavy push toward mobile-responsive web design.

And with your-money-or-your-life topics Google considers the experience so important they have an acronym covering the categories (YMYL) and place greater emphasis on the trustworthiness of the user experience. Some algorithm updates which have an outsized impact on these categories get nicknames like the medic update.

Nobody wants to die from a crap piece of health information or a matching service which invites a predator into their home.

The Wall Street Journal publishes original reporting which is so powerful they act as the missing regulator in many instances.

Last Friday the WSJ covered the business practices of Care.com, a company which counts Alphabet's CapitalG as its largest shareholder.

Behind Care.com's appeal is a promise to "help families make informed hiring decisions" about caregivers, as it has stated on its website. Yet Care.com largely leaves it to families to figure out whether the caregivers it lists are trustworthy. … In about 9 instances over the past six years, caregivers in the U.S. who had police records were listed on Care.com and were later accused of committing crimes while caring for customers' children or elderly relatives… Alleged crimes included theft, child abuse, sexual assault and murder. The Journal also found hundreds of instances in which day-care centers listed on Care.com as state-licensed didn't appear to be. … Care.com says on listings that it doesn't verify licenses, in small gray type at the bottom… A spokeswoman said that Care.com, like other businesses, adds listings found in "publicly available information," and that most day-care centers on its site didn't pay for their listings. She said Care.com will soon begin a program in which it vets day-care centers.

By Monday Care.com's stock was sliding, which resulted in prompt corrective action:

Previously the company warned users in small gray type at the bottom of a day-care center listing that it didn't verify licensing or certification information. Care.com said Monday it "has made more prominent" that note.

To this day, Care.com's homepage says…

"Care.com does not employ any care provider or care seeker nor is it responsible for the conduct of any care provider or care seeker. … The information contained in member profiles, job posts and applications are supplied by care providers and care seekers themselves and is not information generated or verified by Care.com."

…in a darker shade of gray.

Thus far it seems to have worked for them.

What's your favorite color?

Related: Google is testing black ad labels.

Update: Care.com recently removed most of the overt low-quality spam from their site.

Care.com, the largest site in the U.S. for finding caregivers, removed about 72% of day-care centers, roughly 46,594 businesses, listed on its site, a Journal review of the site shows. Those businesses were listed on the site as recently as March 1. … Ms. Bushkin said the company had removed 45% of day-care centers in its database, a number that hasn't been previously reported. She said the number differs from the Journal's analysis because the company filters day-care center listings in its database through algorithms to "optimize the experience," adding that the Journal saw only a subset of its full listings.

Read More

7 Inspiring Content Marketing Examples (And How to Replicate Them)

https://ahrefs.com/blog/content-marketing-examples/

Content marketing can be daunting, especially for small business owners. It's easy to see things like Coke's "Share a Coke" campaign (https://www.youtube.com/watch?v=_aa4NDD1fyk) and assume that you can't do content marketing without a six-figure marketing and advertising budget. This couldn't be further from the truth.

The article 7 Inspiring Content Marketing Examples (And How to Replicate Them) appeared first on SEO Blog by Ahrefs.

Read More

Google Florida 2.0 Algorithm Update: Early Observations

http://www.seobook.com/google-florida-20-algorithm-update-early-observations

It’s been some time since Google has had a major algorithm update.

They recently announced one that began on the 12th of March.

What changed?

It seems multiple things did.

When Google rolled out the original version of Penguin on April 24, 2012 (primarily focused on link spam) they also rolled out an update to an on-page spam classifier for misdirection.

And, over time, it was quite common for Panda & Penguin updates to be tied together.

If you were Google & had the ability to look under the hood to see why things changed, you would probably want to obfuscate any major update by changing multiple things at once to make reverse engineering the change much harder.

Anyone who runs a single site (& lacks the ability to look under the hood) may have almost no clue about what changed or how to adjust to the algorithms.

In the latest algorithm update some sites that were penalized in prior "quality" updates have recovered.

Though many of these recoveries are only partial.

Many SEO blogs will publish posts about how they cracked the code on the latest update, publishing charts like the first one without publishing the second chart that shows the broader context.

The first penalty any site receives may be the first of a series of penalties.

If Google smokes your site & it doesn't cause a PR incident & nobody actually cares that you are gone, then there's a very good chance things will go from bad to worse to worser to worsterest, technically speaking.

"In this age, in this country, public sentiment is everything. With it, nothing can fail; against it, nothing can succeed." - Abraham Lincoln

Absent effort & investment to grow FASTER than the broader web, sites which are hit with one penalty will often accumulate further penalties. It is like compound interest working in reverse – a pile of algorithmic debt that must be dug out of before the bleeding stops.

Further, many recoveries may be little more than an invitation to false hope: to put more resources into a site that is stuck in an apparent death loop.

The above-mentioned site, which had its first positive algorithmic response in a couple years, achieved it in part by deeply de-monetizing. After the algorithm updates had already demonetized the site by over 90%, what harm was there in removing 90% of what remained to see how it would respond? So now it will get more traffic (at least for a while), but what is the value of traffic to a site with no revenue engine attached to it?

That's ultimately the tricky part: getting a stable stream of traffic while monetizing at a decent yield, without the monetization efforts causing the traffic to disappear.

A friend who owns the above site had been working on link cleanup & content improvement off & on for about a half year without any results. Each month was a little worse than the prior one. It was only after I told him to remove the aggressive ads a few months ago that he had much chance of seeing any sort of traffic recovery. Now he at least has a pulse of traffic & can start looking into lighter-touch means of monetization.

If a site is consistently penalized then the issue may not be an algorithmic false positive, but rather the business model of the site.

The more something looks like eHow, the more fickle Google's algorithmic support for it will be.

Google doesn't like sites which sit at the end of the value chain & extract profits without having to bear the greater risk & expense earlier in the supply chain.

Thin rewrites, generally speaking, don't add value to the ecosystem. Doorway pages don't either. And anything that was propped up by a bunch of keyword-rich low-quality links is (generally) probably lacking in some other aspect.

Generally speaking, Google would like themselves to be the entity at the end of the value chain extracting excess profits from markets.

That is the purpose of the knowledge graph & featured snippets: to allow the search results to answer the most basic queries without third party publishers getting anything. The knowledge graph acts as a floating vertical that consumes a growing share of the value chain & forces publishers to move up the funnel & publish more differentiated content.

As Google adds features to the search results (flight price trends, a hotel booking service launched the day Airbnb announced they acquired HotelTonight, ecommerce product purchases on Google, shoppable image ads just ahead of the Pinterest IPO, etc.) it forces other players in the value chain to consolidate (Expedia owns Orbitz, Travelocity, Hotwire & a bunch of other sites) or add greater value to remain a differentiated & sought-after destination (travel review site TripAdvisor was crushed by the shift to mobile and the inability to monetize mobile traffic, so they eventually had to shift away from being exclusively a reviews site to offering event & hotel booking features to remain relevant).

It is never easy changing a successful & profitable business model, but it is even harder to intentionally reduce revenues further or invest aggressively to improve quality AFTER revenues have fallen 50% or more.

Some people do exactly the opposite & make up for a revenue shortfall by publishing more lower-end content at an ever faster rate and/or increasing ad load. Either of those typically makes their user engagement metrics worse while making their site less differentiated & more likely to receive additional bonus penalties which drive traffic even lower.

In some ways I think the ability of a site to survive & persist through a penalty is itself a quality signal for Google.

Some sites which are too reliant on search & have no external sources of traffic are ultimately sites which tried to behave too similarly to the monopoly that ultimately displaced them. And over time the tech monopolies grow stronger as the ecosystem around them burns down:

If you had to pick a date for when the internet died, it would be in the year 2014. Before then, traffic to websites came from many sources, and the web was a lively ecosystem. But beginning in 2014, more than half of all traffic began coming from two sources: Facebook and Google. Today, over 70% of traffic is dominated by those two platforms.

Businesses which have sustainable profit margins & slack (in terms of management resources & time to deploy) can better adjust to algorithmic changes & shift with the market.

Over the past half-decade or so there have been multiple changes which dramatically altered the online publishing landscape:

  • the shift to mobile, which gives publishers lower ad yields while making the core ad networks more ad-heavy in a way that reduces traffic to third party sites
  • the rise of the knowledge graph & featured snippets, which often mean publishers remain uncompensated for their work
  • higher ad loads, which further lower organic reach (on both search & social channels)
  • the rise of programmatic advertising, which further gutted display ad CPMs
  • the rise of ad blockers
  • increasing algorithmic uncertainty & a higher barrier to entry

Any one of the above can take a double-digit percentage out of a site's revenues, especially if the site was reliant on display ads. Add them together and a site which was not algorithmically penalized could still see a 60%+ decline in revenues. Mix in a penalty and that decline can chop a zero or two off total revenues.
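To make the compounding concrete, here is a quick illustrative calculation; the 15% per-factor hit & the 80% penalty are invented round numbers, not measured figures:

```python
# Illustrative only: assume each of the six shifts above shaves 15% off revenue.
factors = [
    "shift to mobile",
    "knowledge graph & featured snippets",
    "higher ad loads cutting organic reach",
    "programmatic display CPM compression",
    "ad blockers",
    "algorithmic uncertainty & barriers to entry",
]

remaining = 1.0
for factor in factors:
    remaining *= 1 - 0.15  # declines compound multiplicatively, not additively

print(f"revenue remaining: {remaining:.1%}")                   # ~37.7%, a 62%+ decline
print(f"after an 80% traffic penalty: {remaining * 0.2:.1%}")  # ~7.5% of original
```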

Businesses with lower margins can attempt to offset declines with increased ad spending, but that only works if you aren't in a market with 2 & 20 VC-fueled competition:

We don't necessarily know which channels they will choose or the particularities of how they will spend money on user acquisition, but we do know more or less what's going to happen. Advertising spend in tech has become an arms race: fresh tactics go stale in months, and customer acquisition costs keep rising. In a world where only one company thinks this way, or where one company is executing at a level above everyone else – like Facebook in its heyday – this tactic is extremely effective. But when everyone is behaving this way, the industry collectively becomes an accelerating treadmill. Ad impressions and click-throughs get bid up to outrageous prices by startups flush with venture money, and prospective users demand more and more subsidized product to give up their initial attention. The dynamic we've entered is, in many ways, creating a dangerous, high stakes Ponzi scheme.

And sometimes the platform claws back a second or third bite of the apple. Amazon.com charges merchants for fulfillment, warehousing, transaction-based fees, etc. And they've pushed hard into launching hundreds of private label brands which pollute the interface & force brands to buy ads even on their own branded keyword terms.

They've recently jumped the shark by adding a bonus feature where even when a brand pays Amazon to send traffic to their listing, Amazon may insert a spam popover offering a cheaper private-label branded product:

Amazon.com tested a pop-up feature on its app that in some instances pitched its private-label goods on rivals' product pages, an experiment that shows the e-commerce giant's aggressiveness in hawking lower-priced products, including its own house brands. The recent experiment, conducted in Amazon's mobile app, went a step further than the display ads that typically appear within search results and product pages. This test pushed pop-up windows that took over much of a product page, forcing customers to either click through to the lower-cost Amazon products or dismiss them before continuing to shop. … When a customer using Amazon's mobile app searched for "AAA batteries," for instance, the first link was a sponsored listing from Energizer Holdings Inc. After clicking on the listing, a pop-up appeared, offering less expensive AmazonBasics AAA batteries.

Buying those Amazon ads was quite literally subsidizing a direct competitor pushing you into irrelevance.

And while Amazon is destroying brand equity, AWS is doing investor relations matchmaking for startups. Anything to keep the current bubble going ahead of the Uber IPO, which will likely mark the top of the stock market.

As the market caps of big tech companies grow, they have to become more predatory to grow into their valuations & retain employees with stock options at ever higher strike prices.

They've created bubbles in their own backyards where every raise requires another. Teachers either commute hours to work or live in homes subsidized by loans from the tech monopolies that get a piece of the upside (provided they can keep their own bubbles inflated).

"It's an unusual arrangement — employer as landlord — that is starting to catch on elsewhere as school employees say they can't afford to live comfortably in areas awash in tech dollars. … Holly Gonzalez, 34, a kindergarten teacher in East San Jose, and her husband, Daniel, a school district I.T. specialist, were able to buy a three-bedroom house for $610,000 this summer with help from their parents and from Landed. If they sell the home, they will owe Landed 25% of any gain in its value."

The above kinds of dynamics have some calling peak California:

The cycle further benefits from the Alchian-Allen effect: agglomerating industries have higher productivity, which increases the cost of living and prices out other industries, increasing concentration over time. … Since startups raise the variance within whatever industry they're started in, the natural constituency for them is someone who doesn't have capital deployed in the industry. If you're an asset owner, you want low volatility. … Historically, startups have provided a constant supply of volatility for tech companies; the next generation is cannibalizing the previous one. So chip companies in the 1970s created the PC companies of the 80s, but PC companies sourced cheaper and cheaper chips, commoditizing the product until Intel managed to fight back. The OS turned PCs into a commodity, then search engines and social media turned the OS into a commodity, and presumably this process will continue indefinitely. … As long as higher rents raise the cost of starting a pre-revenue company, fewer people will join them, so more people will join established companies, where they'll earn market wages and keep pushing rents up. And one of the things they'll do there is optimize ad loads, which places another tax on startups. More dangerously, this is an incremental tax on growth rather than a tax on headcount, so it puts pressure on out-year valuations, not just upfront cash flow.

If you live hundreds of miles away, the tech companies may have no impact on your rent or home purchase price, but you still can't control the algorithms or the ecosystem.

All you really can control is your mindset & ensuring you have optionality baked into your business model.

  • If you are debt-levered you have little to no optionality. Savings give you optionality. Savings allow you to run at a loss for a period of time while also investing in improving your site and perhaps running a few other sites in other niches.
  • If you run one site that is heavily reliant on a third party for distribution then you have little to no optionality. If you have multiple projects you can shift your focus toward whatever is moving up and to the right while letting whatever is failing pass in time, without becoming too reliant on something you can't change. That is why it often makes sense for a brand merchant to run their own ecommerce site even if 90% of their sales come from Amazon. It gives you optionality should the tech monopoly become abusive or otherwise harm you (even if the intent was benign rather than outright misanthropic).

As the update evolves Google will gather more data on how users interact with the result set & figure out how to weight different signals, along with re-scoring sites that recovered based on the new engagement data.

Recently a Bing engineer named Frédéric Dubut explained how they evaluate relevancy signals used in updates:

As early as 2005, we used neural networks to power our search engine and you can still find rare pictures of Satya Nadella, VP of Search and Advertising at the time, showcasing our web ranking advances. … The "training" process of a machine learning model is generally iterative (and all automated). At each step, the model is tweaking the weight of each feature in the direction where it expects to decrease the error the most. After each step, the algorithm remeasures the rating of all the SERPs (based on the known URL/query pair ratings) to evaluate how it's doing. Rinse and repeat.
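Here is a minimal sketch of the iterative training loop Dubut describes, assuming a simple linear model with squared error over human-rated URL/query pairs; the features, ratings & learning rate are hypothetical stand-ins, not anything Bing has published:

```python
import numpy as np

# Hypothetical feature vectors for rated URL/query pairs
# (e.g. link score, anchor text relevance, engagement) & human ratings.
X = np.array([
    [0.9, 0.8, 0.7],
    [0.2, 0.9, 0.1],
    [0.5, 0.3, 0.6],
    [0.1, 0.2, 0.2],
])
ratings = np.array([1.0, 0.4, 0.6, 0.1])  # known URL/query pair ratings

weights = np.zeros(X.shape[1])
learning_rate = 0.1

for step in range(500):
    error = X @ weights - ratings
    # Tweak each feature weight in the direction expected to cut error the most
    gradient = X.T @ error / len(ratings)
    weights -= learning_rate * gradient
    # "remeasure ... rinse and repeat": the next loop re-scores with new weights

print("learned feature weights:", weights.round(3))
print("mean squared error:", float(np.mean((X @ weights - ratings) ** 2)))
```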

That same process is ongoing with Google today, and in the coming weeks there will be a next phase of the current update.

So far it looks like some quality-based re-scoring was done & some sites which were too reliant on anchor text got clipped. On the back end of the update there will be another quality-based re-scoring, but the sites which were hit for excessive manipulation of anchor text via link building efforts will likely remain penalized for a good chunk of time.

Update: It appears a major reverberation of this update happened on April 7th. From early analysis, Google is blending in results for related midtail concepts on core industry search terms & in some cases they are also pushing harder on internal site-level relevancy, ranking a more relevant internal page for a query where the homepage may have ranked previously.

Read More

DMOZ Shut Down

http://www.seobook.com/dmoz-shut-down

Last August I wrote a blog post about how attention merchants were sucking the value out of online publishing. In it I noted how the Yahoo! Directory disappeared & how even DMOZ saw a sharp decline in traffic rankings over the past few years.

The idea of a neutral web is dead. In its place is agenda-driven media.

  • Politically charged misinformation.
  • Ads cloaked as content.
  • Largely correct (but politically inconvenient) content being "fact checked" where a minor detail is disputed to label the entire piece as not credible.

As the tech oligarchs broadly defund publishing, the publishers still need to eat. Content quality declines to make the numbers work. Businesses which see their ad revenues slide 20%, 30% or 40% per year can't justify maintaining the strategic yet unmonetized side projects.

There's Wikipedia, but it is not without bias &, beyond the value extracted through the hidden bias, most of the rest of the value from it flows through to the attention merchant / audience aggregation / content scraper platforms.

Last month DMOZ announced they were shutting down on March 14th without much fanfare. And on March 17th the directory went offline.

A number of people have pushed to preserve & archive the DMOZ data. Some existing DMOZ editors are considering launching a new directory under a different name, but as of the 17th DMOZ editors had put up a copy at dmoztools.net. Jim Boykin scraped DMOZ & uploaded a copy here. A few other versions of DMOZ have been published at OpenDirectoryProject.org & Freemoz.org.

DMOZ wasn't without criticism or controversy:

Although site policies indicate that an individual site should be submitted to only one category, as of October 2007, Topix.com, a news aggregation site operated by DMOZ founder Rich Skrenta, had more than 17,000 listings.

Early in the history of DMOZ, its staff gave representatives of selected companies, such as Rolling Stone or CNN, editing access in order to list individual pages from their websites. Links to individual CNN articles were added until 2004, but were entirely removed from the directory in January 2008 due to the content being outdated and not considered worth the effort to maintain.

But by-and-large it added value to the structure of the web.

As search has advanced (algorithmic improvements, economic power, influence over publishers, increased bundling of distribution & user tracking) general web directories have not been able to keep pace. Ultimately the web is a web of links & pages rather than a web of sites. Many great sites span multiple categories. Every large quality site has some junk on it. Every well-known interactive site has some fantastic user contributions & user generated spam on it. Search engines have better signals about which pages are important & which pages have maintained importance over time. As search engines improved their link filtering algorithms and better integrated user tracking into rankings, broad-based manual directories had no chance.

The web-of-pages vs web-of-sites concept can easily be seen in how some of the early successful content platforms have since broken their content farms down into a variety of niche sites.

When links were (roughly) all that mattered, leveraging a site's link authority meant it was far more profitable for a large entity to keep publishing more content on the one core site. That is how eHow became the center of a multi-billion dollar company.

Demand Media showed other publishers the way. And if the other established sites were to remain competitive, they had to water down content quality to make the numbers work out. The problem with that is the glut of content drove ad rates lower. And the decline in ad rates was coupled with a shift from a links-only view of search relevancy to a model weighting link profiles against user engagement metrics.

Sites with lots of links, lots of thin content & terrible engagement metrics were hit.

Kristen Moore, VP of marketing for Demand Media, explained what drove the most egregious elements of eHow's editorial strategy: "There's some not very bright people out there."

eHow improved their site design, dramatically reduced their ad density, removed millions of articles from their site, and waited. But nothing they did on that domain was ever going to work. They dug too deep a hole selling the growth story to pump a billion-dollar valuation. And they created so much animosity from writers who felt overworked & underpaid that when they did rank, journalists would typically prefer to link to anything else.

The flip side of that story is the newspaper chains that rushed to partner with Demand Media to build eHow-inspired sections on their sites.

Brands that enjoy the Google brand subsidy were also quite hip to work with Demand Media, which breathed new life into once-retired content: "Sometimes Demand will dust off old content that has been published but is no longer live and repurpose it for a brand."

As Facebook & Google grew more dominant in the online ad ecosystem they aggressively moved to suck in publisher content & shift advertiser spend onto their core properties. The growth of time spent on social networks only made it harder for websites to be sought-out destinations. Google also effectively cut off direct distribution by consolidating & de-monetizing the RSS reader space, then shutting down a project they easily could have let run.

As the web grew more competitive, bloggers & niche publications that were deeply specialized managed to steal marketshare in key verticals by leveraging a differentiated editorial view.

Even if they couldn't necessarily afford to build strong brands through advertising, they were worthy of a follow on some social media channels & perhaps an email subscription. And the best niche editorial remains worthy of a direct visit:

Everything about Techmeme and its success seems to defy the modern wisdom of building a popular website. It publishes zero original reporting and is not a social network. It doesn't have a mobile app or a newsletter or even much of a social presence beyond its Twitter account, which posts dry product news with zero flair for clickability.

As a workaround to the Panda hits, sites like eHow are now becoming collections of niche-focused sites (Cuteness.com, Techwalla.com, Sapling.com, Leaf.tv, etc. will join Livestrong.com & eHow.com). It appears to be working so far…

…but they could be just one Panda update away from finding out the new model is not sustainable.

About.com has done the same thing (TheSpruce.com, Verywell.com, Lifewire.com, TheBalance.com). Hundreds of millions of dollars are riding on the hope that as the algorithms keep getting more granular they won't decide that moving the content to niche brands wasn't enough.

As content moves around, search engines with billions of dollars in revenues can recalibrate rankings for each page & adjust rankings based on user experience. Does a strong "how to" guide become irrelevant after a software or hardware update? If so, they can see it didn't solve the user's problem and rank a fresher document that reflects the current hardware or software. Is a problem easy to solve with a short snippet of content? If so, that can get scraped into the search results.

Web directories built around sites rather than pages have no chance of competing against the billions of dollars in monthly search ads & the full-cycle user tracking search companies like Google & Bing can perform with their integrated search engines, ad networks, web browsers & operating systems.

Arguably in most cases the idea of neutral publishing no longer works on the modern web. The shill gets the exclusive stories. The political polemic gets automatic retweets from those who identify with it. The content that lacks an agenda likely lacks the economics to pay for ads & buy distribution, unless people can tell the creator loves what they do enough that it moves them to visit repeatedly & perhaps pay for access.

Read More