Zuckerberg realizes the dangers of the social-media revolution he helped start

In early January, I went to see Mark Zuckerberg at MPK20, a concrete-and-steel building on the campus of Facebook's headquarters in Menlo Park, California. The Frank Gehry-designed building has a pristine 3.6-hectare rooftop garden, yet much of the interior appears unfinished. Many of the internal walls are unpainted plywood. The space looks less like the headquarters of one of the world's wealthiest companies and more like a Chipotle restaurant with standing desks. It's an aesthetic meant to reflect one of Facebook's founding ideologies: that things are never quite finished, that nothing is permanent, that you should always look for a chance to take an axe to your surroundings.

The mood in overwhelmingly liberal Silicon Valley at the time, days before US President Donald Trump's inauguration, was grim. But Zuckerberg is preternaturally unable to look anything other than excited about the future. "Hey, guys!" he beamed, greeting Mike Isaac, a New York Times colleague who covers Facebook, and me.

"2016 was an interesting year for us," he said as the three of us, plus a public relations executive, sat in a glass-walled conference room. No one, not even Zuckerberg, has a private office. It was an understatement and a nod to the obvious: Facebook had become a global political and cultural force, and the full implications of that transformation had begun to come into view last year.

"If you look at the history of Facebook, when we started off, there really wasn't news as part of it," Zuckerberg went on. But as Facebook grew and became a bigger part of how people learn about the world, the company had been slow to adjust to its new place in people's lives.

The events of 2016, he said, "set off a number of conversations that we're still in the middle of".


The News Feed team at Facebook headquarters.

Nearly 2 billion people use Facebook every month, about 1.2 billion of them daily. The company, which Zuckerberg co-founded in his Harvard dormitory room 13 years ago, has become the largest and most influential entity in the news business, commanding an audience greater than that of any American or European television news network, any newspaper in the Western world and any online news outlet. It is also the most powerful mobilizing force in politics, and it is fast replacing television as the most consequential entertainment medium. Just five years after its initial public offering, Facebook is one of the 10 most valuable public companies in the world by market capitalisation.

But over the course of 2016, Facebook's gargantuan influence became its biggest liability. During the US election, propagandists – some working for money, others for potentially state-sponsored lulz [mischief] – used the service to turn fake stories into viral sensations, such as the one about Pope Francis endorsing Trump (he hadn't). With its huge reach, Facebook has begun to act as the great disseminator of misinformation and half-truths swirling about the rest of media. It sucks up lies from cable news and Twitter, then precisely targets each lie to the partisan bubble most receptive to it.

After studying how people shared 1.25 million stories during the campaign, a team of researchers at Massachusetts Institute of Technology and Harvard implicated Facebook and Twitter in the larger failure of media in 2016, finding that social media created a right-wing echo chamber: a "media network anchored around Breitbart developed as a distinct and insulated media system, using social media as a backbone to transmit a hyperpartisan perspective to the world". After the election, former president Barack Obama bemoaned "an age where there's so much active misinformation and it's packaged very well and it looks the same when you see it on a Facebook page or you turn on your television."

Zuckerberg offered a few pat defenses of Facebook's role. "I'm actually quite proud of the impact that we were able to have on civic discourse overall," he said in January. Misinformation on Facebook was not as big a problem as some believed it was, but Facebook nevertheless would do more to battle it, he pledged.

It was hard to tell how seriously Zuckerberg took the criticisms of his service and its increasingly paradoxical role in the world. Across the globe, Facebook now seems to benefit actors who want to undermine the global vision at its foundation. Supporters of Trump, the European right-wing nationalists who aim to turn their nations inward and dissolve alliances, and even ISIS, with its skillful social-media recruiting and propagandizing, have sought to split the Zuckerbergian world apart. And they are using his own machine to do it.

Since election day, Silicon Valley has been consumed with a feeling of complicity. Trump had benefited from a media environment that is now shaped by Facebook – and, more to the point, shaped by a single Facebook feature, the same one to which the company owes its remarkable ascent to social-media hegemony: the computationally determined list of updates you see every time you open the app. The list has a formal name, News Feed. But most users are apt to think of it as Facebook itself.

If it's an exaggeration to say that News Feed has become the most influential source of information in the history of civilization, it is only slightly so. Facebook created News Feed in 2006 to solve a problem: In the social-media age, people suddenly had too many friends to keep up with. To figure out what any of your connections were up to, you had to visit each of their profiles to see if anything had changed. News Feed fixed that. Every time you open Facebook, it hunts through the network, collecting every post from every connection. Then it weighs the merits of each post before presenting you with a feed sorted in order of importance: a hyperpersonalised front page designed just for you.

Scholars and critics have been warning of the solipsistic irresistibility of algorithmic news at least since 2001, when the constitutional law professor Cass Sunstein warned, in his book Republic.com, of the urgent risks posed to democracy "by any situation in which thousands or perhaps millions or even tens of millions of people are mainly listening to louder echoes of their own voices". (In 2008, I piled on with my own book, True Enough: Learning to Live in a Post-Fact Society.) In 2011, the digital activist and entrepreneur Eli Pariser gave this phenomenon a memorable name in the title of his own book: The Filter Bubble.

Facebook says its own researchers have been studying the filter bubble since 2010. In 2015, they published an in-house study, which was criticized by independent researchers, concluding that Facebook's effect on the diversity of people's information diet was minimal. When News Feed did show people views contrary to their own, they tended not to click on the stories. For Zuckerberg, the finding let Facebook off the hook.

Then, last year, Facebook's domination of the news became a story itself. In May, Gizmodo reported that some editors who had worked on Facebook's Trending Topics section had been suppressing conservative points of view. To smooth things over, Zuckerberg convened a meeting of conservative media figures and eventually significantly reduced the role of human editors. Then in September, Facebook deleted a post that included the photojournalist Nick Ut's iconic photo of a naked nine-year-old girl, Phan Thi Kim Phuc, running in terror after a napalm attack during the Vietnam War, on the grounds that it ran afoul of Facebook's prohibition of child nudity.

Facebook, under criticism, reinstated the picture, but the photo incident highlighted the difficulty of building a policy framework for what Facebook was trying to do. Zuckerberg wanted to become a global news distributor that is run by machines, rather than by humans who would try to look at every last bit of content and exercise considered judgment. "It's something I think we're still figuring out," he told me in January. "There's a lot more to do here than what we've done. And I think we're starting to realize this now as well." It struck me as an unsatisfying answer, and Zuckerberg seemed to feel the same way. A month after our first meeting, he wanted to chat again.

The Zuckerberg who greeted us was less certain in his pronouncements, more questioning. Earlier, Zuckerberg's staff had sent me a draft of a 5700-word manifesto that, I was told, he spent weeks writing. The document, "Building Global Community", argued that until now, Facebook's corporate goal had merely been to connect people. According to the manifesto, Facebook's next focus will be developing "the social infrastructure for community – for supporting us, for keeping us safe, for informing us, for civic engagement, and for inclusion of all". If it was a nebulous crusade, it was also vast in its ambition.

"There are questions about whether we can make a global community that works for everyone," Zuckerberg writes, "and whether the path ahead is to connect more or reverse course." He also confesses misgivings about Facebook's role in the news. "Giving everyone a voice has historically been a very positive force for public discourse because it increases the diversity of ideas shared," he writes. "But the past year has also shown it may fragment our shared sense of reality."

At the time, the manifesto was still only a draft. When I suggested that it might be perceived as an attack on Trump, he looked dismayed. A few weeks earlier, there was media speculation, fuelled by a post-election tour of America by Zuckerberg and his wife, that he was laying the groundwork to run against Trump in 2020, and he took pains to shoot down the rumors. If the company pursues the aims outlined in "Building Global Community", the changes will echo across media and politics, and some are bound to be considered partisan. The risks are especially clear for changes aimed at adding layers of journalistic ethics across News Feed, which could transform the public's perception of Facebook, not to mention shake the foundations of its business.

The solution to the broader misinformation dilemma – the pervasive climate of rumor, propaganda and conspiracy theories that Facebook has inadvertently incubated – may require something that Facebook has never done: ignoring the likes and dislikes of its users. Facebook's entire project, when it comes to news, rests on the assumption that people's individual preferences ultimately coincide with the public good, and that, if it doesn't appear that way at first, you're not delving deeply enough into the data. By contrast, decades of social-science research shows that most of us simply prefer stuff that feels true to our world view even if it isn't true at all and that the mining of all those preference signals is likely to lead us deeper into bubbles rather than out of them.

After the election, Margaret Sullivan, a Washington Post columnist and a former public editor of the Times, called on Facebook to hire an executive editor who would monitor News Feed with an eye to fact-checking, balance and editorial integrity. Jonah Peretti, the founder of BuzzFeed, told me that he wanted Facebook to use its data to create a kind of reputational score for online news.

Late last year, Facebook outlined a modest effort to curb misinformation. News Feed would now carry warning labels: If a friend shares a viral story that has been shot down by one of Facebook's fact-checking partners (including Snopes and PolitiFact), you'll be cautioned that the piece has been "disputed". But even that slight change has been met with fury on the right, with Breitbart and The Daily Caller fuming that Facebook had teamed up with liberal hacks motivated by partisanship. If Facebook were to take more significant action, such as hiring human editors or paying journalists, the company would instantly become something it has long resisted: a media company rather than a neutral tech platform.

In many ways, the worry over how Facebook changes the news is really a manifestation of a grander problem with News Feed, which is simply dominance itself. News Feed's aggressive personalisation wouldn't be much of an issue if it weren't crowding out every other source. By my second meeting with Zuckerberg, Facebook had announced plans for the Facebook Journalism Project, in which the company would collaborate with news companies on new products. Facebook also created a project to promote "news literacy" among its users, and it hired the former CNN news anchor Campbell Brown to manage the partnership between it and news companies. Zuckerberg's tone towards critics of Facebook's approach to the news had grown far more conciliatory.

"I think it's really important to get to the core of the actual problem," he said. "I also really think that the core social thing that needs to happen is that a common understanding needs to exist. And misinformation I view as one of the things that can possibly erode common understanding. But sensationalism and polarization and other things, I actually think, are probably even stronger and more prolific effects. And we have to work on all these things. I think we need to listen to all the feedback on this."

Still, Zuckerberg remained preoccupied with the kind of problems that could be solved by the kind of hyperconnectivity he believed in, not the ones caused by it. "There's a social infrastructure that needs to get built for modern problems in order for humanity to get to the next level," he said. "Having more people oriented not just towards short-term things but towards building the long-term social infrastructure that needs to get built across all these things in order to enable people to come together is going to be a really important thing over the next decades." Zuckerberg continued, "We're getting to a point where the biggest opportunities I think in the world … problems like preventing pandemics from spreading or ending terrorism, all these things, they require a level of co-ordination and connection that I don't think can only be solved by the current systems that we have." What's needed is some global superstructure to advance humanity.

Zuckerberg is arguing for a kind of digital-era version of the global institution-building that the Western world engaged in after World War II. But because he is a chief executive and not an elected president, there is something frightening about his project. He is positioning Facebook – and, considering that he commands absolute voting control of the company, himself – as a critical enabler of the next generation of human society. His mission drips with megalomania, albeit of a particularly sincere sort. Building new "social infrastructure" usually involves tearing older infrastructure down. If you manage the demolition poorly, you might undermine what comes next. In the case of the shattering media landscape, Zuckerberg may yet come up with fixes for it. But in the meantime, Facebook rushes headlong into murky new areas, uncovering new dystopian possibilities at every turn.

A few months after we spoke, Facebook held its annual developer conference in San Jose, California. At last year's show, Zuckerberg introduced an expanded version of Facebook's live streaming service, which he promised would revolutionize how we communicate. Live had generated iconic scenes of protest but had also been used to broadcast a terrorist attack in Munich and at least one suicide. Hours before Zuckerberg's appearance, a Cleveland man who had killed a stranger and posted a video of it on Facebook shot himself after a manhunt.

But as he took the stage in San Jose, Zuckerberg was ebullient. For a brief moment, there was a shift in tone: Statesman Zuck. "In all seriousness, this is an important time to work on building community," he said. He offered Facebook's condolences to the victim's family in Cleveland; the incident, he said, reminded Facebook that "we have a lot more to do". Zuckerberg then pivoted to Facebook's next marvel, a system for digitally augmenting your pictures and videos. The technical term for this is "augmented reality". The name bursts with dystopian possibilities – fake news on video rather than just text – but Zuckerberg never mentioned them. The statesman had left the stage; before us stood an engineer.

Chuck Reynolds
Contributor

Alan Zibluk – Markethive Founding Member


Types of negative SEO to watch out for

Is negative SEO real?
Should you be worried?
Is there anything you can do to stay safe?
This post is an attempt to answer these questions.

The threat of negative SEO is remote but daunting. How easy is it for a competitor to ruin your rankings, and how do you protect your site? But before we start, let's make sure we're clear on what negative SEO is, and what it definitely isn't. Negative SEO is a set of activities aimed at lowering a competitor's rankings in search results. These activities are more often off-page (e.g., building unnatural links to the site or scraping and reposting its content), but in some cases they may also involve hacking the site and modifying its content. Negative SEO isn't the most likely explanation for a sudden ranking drop. Before you decide someone may be deliberately hurting your rankings, factor out the more common reasons for ranking drops. You'll find a comprehensive list here.

Negative off-page SEO

This kind of negative SEO targets the site without internally interfering with it. Here are the most common shapes negative off-page SEO can take.

Link farms

One or two spammy links likely won’t hurt a site’s rankings. That’s why negative SEO attacks usually involve building links from a group of interconnected sites, or link farms. Typically, most of these links use the same anchor text. These exact-match anchors may be completely unrelated to the site under attack; or they might include a niche keyword to make the site’s link profile look like the owner is manipulating it.

A while ago, this happened to WP Bacon, a WordPress podcast site. Over a short period of time, the site acquired thousands of links with the anchor text "porn movie." Within 10 days, WP Bacon fell 50+ spots in Google for the majority of keywords it ranked for. This story has a happy ending though: the webmaster disavowed the spammy domains, and eventually, WP Bacon recovered most of its rankings.

How to stay safe: 

Preventing a negative SEO attack isn't within your power, but spotting the attempt early enough to reverse the damage is. To do that, you need to regularly monitor link profile growth. SEO SpyGlass, for example, gives you progress graphs for both the number of links in your profile, and the number of referring domains. An unusual spike in either of those graphs is reason enough to look into the links you suddenly acquired.

To actually see the links that made up the spike, go to the Linking Domains (or Backlinks) dashboard in SEO SpyGlass and sort the links by Last Found Date by clicking on the header of the column twice. Look for the links that were found around the same time when the spike on the graph appeared. If you've no idea where the links are coming from, it's useful to look at their Penalty Risk. Switch to the Link penalty risk tab, select those suspicious backlinks you just discovered, and click Update Link Penalty Risk.

In a few minutes, the column should be populated with values on a scale from 0 to 100. It's a pretty accurate metric to tell if the links are coming from link farms, as, among other things, it looks at the number of linking domains that come from the same IP address or C block. Lastly, once you've identified the spammy links, you can create a disavow file right in SEO SpyGlass. To do that, right-click the backlink/linking domain and select Disavow (make sure to select Entire domain under Disavow mode). Do the same for all unnatural links you spotted. Finally, go to Preferences > Disavow/Blacklist backlinks, review your disavow file, and export it once you're happy with it.
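The spike-spotting step doesn't depend on any particular tool: given a daily count of newly discovered backlinks (exported from SEO SpyGlass or any backlink monitor), flag the days that deviate sharply from the baseline. A minimal illustration in Python — the three-standard-deviation threshold is an arbitrary assumption for the sketch, not a product default:

```python
from statistics import mean, stdev

def find_spikes(daily_new_links, threshold=3.0):
    """Indices of days whose new-backlink count exceeds the mean by
    `threshold` standard deviations -- a crude anomaly detector."""
    avg, sd = mean(daily_new_links), stdev(daily_new_links)
    if sd == 0:
        return []
    return [i for i, n in enumerate(daily_new_links)
            if n > avg + threshold * sd]

# 30 quiet days, then a burst of 900 new links on day 30
history = [12, 9, 14, 11, 10, 13, 8, 12, 11, 10] * 3 + [900]
print(find_spikes(history))  # [30]
```

Any day this flags is a cue to open the backlink dashboard and inspect exactly which links arrived that day.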

Scraping


Scraping your content and copying it across other sites is another way a competitor can ruin your rankings. When Google finds content that is duplicated across multiple sites, it will usually pick only one version to rank. In most cases, Google is clever enough to identify the original piece… unless they find the “stolen” version first. That’s why scrapers often automatically copy new content and repost it straightaway.

How to stay safe:

Copyscape is an essential tool if you’re determined to find instances of content duplication. If you do find scraped copies of your content, it’s a good idea to first contact the webmaster asking them to remove the piece. If that’s not effective, you may want to report the scraper using Google’s copyright infringement report.
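If you'd rather run a first-pass duplicate check yourself before reaching for Copyscape, comparing overlapping word "shingles" between your page and a suspect copy is a common technique. A rough Python sketch — the 5-word shingle size is an assumption; tune it to taste:

```python
def shingles(text, k=5):
    """All overlapping k-word 'shingles' of a text, lowercased."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=5):
    """Jaccard similarity (0..1) of two documents' shingle sets."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

A score near 1.0 means the suspect page is essentially a verbatim copy of yours; a score near 0.0 means the texts are unrelated.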

Forceful crawling

There are examples of desperate site owners trying to crash a competitor's site by forcefully crawling it and causing heavy server load. If Googlebot can't access your site a few times in a row… you guessed it — you might get de-ranked.

How to stay safe: 

If you notice that your site has become slow, or, worse, unavailable, a wise thing to do is contact your hosting company or webmaster — they should be able to tell you where the load is coming from. If you know a thing or two about server logs, here are some detailed instructions on finding the villain crawlers and blocking them with robots.txt and .htaccess.
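The "find the villain crawler" step boils down to counting requests per client in the server logs. A hypothetical Python sketch for the common Apache/nginx combined log format — the 100-request limit is an arbitrary example threshold, and real log analysis would also window by time:

```python
import re
from collections import Counter

# In the "combined" log format the first field is the client IP and
# the last quoted string is the user agent.
LOG_LINE = re.compile(r'^(\S+) .*"([^"]*)"$')

def heavy_clients(log_lines, limit=100):
    """Count requests per (IP, user-agent) pair and return the pairs
    exceeding `limit` requests, busiest first."""
    hits = Counter()
    for line in log_lines:
        match = LOG_LINE.match(line)
        if match:
            hits[(match.group(1), match.group(2))] += 1
    return [(client, n) for client, n in hits.most_common() if n > limit]

bot = '203.0.113.7 - - [01/May/2017:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "BadBot/1.0"'
human = '198.51.100.2 - - [01/May/2017:10:00:01 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0"'
print(heavy_clients([bot] * 150 + [human] * 5))
# [(('203.0.113.7', 'BadBot/1.0'), 150)]
```

Once an offender is identified this way, its IP or user agent can be blocked in .htaccess, as the linked instructions describe.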

Negative on-page SEO

Negative on-page SEO attacks are way more difficult to implement. These involve hacking into your site and changing things around. Here are the main SEO threats a hacker attack can pose.

Modifying your content

You’d think you’d notice if someone changed your content, but this tactic can also be very subtle and difficult to spot. As the attacker adds spammy content (usually links) to a site, they often hide it (e.g.: under “display:none” in the HTML), so you won’t see it unless you look in the code.

Another possible negative SEO scenario is someone modifying your pages to redirect to theirs. This isn’t a threat for most small businesses, but if your site enjoys high authority and link popularity, it could be someone’s sneaky way to increase their own site’s PageRank, or to simply redirect visitors to their site when they try to access yours. For the site under attack, such redirects aren’t just a temporary inconvenience. If Google finds out about the redirect before you do, they can penalize the site for “redirecting to a malicious website.”

How to stay safe:

 Regular site audits with a tool like WebSite Auditor are the best way to spot such subtle attacks. To start your first audit, just launch WebSite Auditor and create a project for your site. Whenever you need to re-run the audit, use the Rebuild Project button. As long as you do this regularly, you should be able to spot changes that could otherwise go unnoticed, such as the number of outgoing links on the site or pages with redirects. To look into those links or redirects in detail, switch to the All Resources dashboard and go through the External Resources section. If you see an unexpected increase in the count of these, look through the list on the right to see where those links point to, and the lower part of the screen for the pages they were found on.
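The hidden-content check in particular lends itself to a quick script: walk the HTML and collect any links nested inside an element inlined with display:none. A simplified Python sketch using only the standard library — real pages would also need checks for CSS classes and external stylesheets, which this ignores:

```python
from html.parser import HTMLParser

class HiddenLinkFinder(HTMLParser):
    """Collect <a href> targets that sit inside an element inlined
    with display:none -- a common way to hide injected spam links.
    Assumes well-formed HTML with matching end tags."""

    def __init__(self):
        super().__init__()
        self.stack = []          # one "is hidden" flag per open tag
        self.hidden_links = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        style = (attrs.get("style") or "").replace(" ", "")
        hidden = any(self.stack) or "display:none" in style
        self.stack.append(hidden)
        if tag == "a" and hidden and attrs.get("href"):
            self.hidden_links.append(attrs["href"])

    def handle_endtag(self, tag):
        if self.stack:
            self.stack.pop()

page = ('<p>Visible copy.</p>'
        '<div style="display: none"><a href="http://spam.example/">x</a></div>')
finder = HiddenLinkFinder()
finder.feed(page)
print(finder.hidden_links)  # ['http://spam.example/']
```

Running something like this over your own pages after each audit makes injected links visible even when the rendered page looks untouched.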

Getting the site de-indexed

A small change in robots.txt is one alteration that could wreak havoc on your entire SEO strategy. A disallow rule is all it takes to tell Google to completely ignore your website. There are multiple examples of this online, including this story. A client fired an SEO agency he wasn’t happy with, and their revenge was adding a Disallow: / rule to the client’s robots.txt.

How to stay safe:

Regular ranking checks will help you be the first to know should your site get de-indexed. With Rank Tracker, you can schedule automatic checks to occur daily or weekly. If your site suddenly drops from search engines' results, you'll see a Dropped note in the Difference column. When this happens across a big number of keywords, it usually implies a penalty or de-indexation. If you suspect the latter, check the crawl stats in your Google Search Console account and take a look at your robots.txt.
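The robots.txt check itself is easy to automate with Python's standard library: parse the file's contents and ask whether Googlebot may fetch your pages. For example:

```python
from urllib.robotparser import RobotFileParser

def googlebot_blocked(robots_txt, url="/"):
    """True if this robots.txt body would stop Googlebot from
    fetching `url` -- the symptom of a malicious Disallow rule."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return not parser.can_fetch("Googlebot", url)

print(googlebot_blocked("User-agent: *\nDisallow: /"))          # True
print(googlebot_blocked("User-agent: *\nDisallow: /private/"))  # False
```

Scheduling a check like this alongside your rank tracking would catch a rogue Disallow: / rule long before the rankings graph does.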

Hacking the site itself

Even if the hacker has no negative SEO in mind, the attack per se can hurt your SEO. Google wants to protect its users, which is why, if they suspect a site has been hacked, they may de-rank it, or at the very least add a "this site may be hacked" line to your search listings. Would you click on a result like that? Negative SEO aside, stepping up your site's security should be high on your list of priorities for obvious reasons.


Machine learning & neural networks: The real future of SEO


Columnist Benjamin Spiegel takes a look at how machine learning and technologies like TensorFlow will change the way we think about SEO.


The good old days of Search Engine Optimization (SEO)

I grew up in the good old days of Search Engine Optimization (SEO), when the keyword tag still meant something and you could make it to the top of SERPs by using the same keyword over and over in the website's title or keyword tag. That was back when exact match results were the only ones returned for consumers' searches, and search engines like Google spent 99 percent of their time crawling and indexing instead of cataloging and evaluating the content's actual quality and relevance. Those days are (thankfully) over.

Those good old days were followed (for good or ill) by years of SEO practitioners focused on chasing the constantly evolving algorithms, which, in my personal opinion, we have been doing all wrong. Most SEOs start out with something like Search Engine Land's periodic table of SEO ranking factors or similar guides. We use tools like DeepCrawl and Screaming Frog to help chase down broken links and then ask tools like Moz and others to tell us where to place keywords in the title relative to the overall title length, or how our meta descriptions need to stay under X pixels and our body copy has to have X outgoing links…

WE SERIOUSLY NEED TO STOP IT!

Although these are all great practices to follow, they are really corrective actions taken after something has already been done wrong, and none of these tactics will give brands position No. 1 for any high-value keywords. Instead, these should be considered from day one of the website build, implemented during site development, and not be an afterthought once the site is launched. We recently did a test against 150,000 different SERPS, and based on a simple scoring model, the majority of the top three results didn’t follow even half of the best practice rules commonly found in the ranking feature lists.

In this test, we extracted 83 features from each of the SERPS (page speed, content length, link scoring, content density, social signals and so on) and used different models in an attempt to reverse-engineer the algorithm. Even with 83 features, we did not get any meaningful results; we found that websites in the top of the SERPs were just as poorly optimized as those on page 2. This clearly shows us that while all those tactics are important for many reasons, even if you follow them exactly, it will not move your rank from 10 to 1.

So here comes the disappointing part of this article: I also have no idea how to get you a guaranteed position 1 — NOBODY does. But what I can tell you is that there is no simple way to recreate the algorithm, no easy script you can run, no simple linear regressions that can solve for it. We have seen No. 1 rankings that literally did everything wrong, and position 60 rankings that did everything right! Here’s the good news: About two years ago, we got a look under the hood and learned why it has become so much harder to “manipulate” rankings, and why no matter how large the test or sample, it is impossible to re-create the actual algorithm.

It was November 9, 2015, the day Google publicly released TensorFlow. TensorFlow is a (now) open-source software library for machine intelligence. It is, in fact, the library that powers most of Google’s technology like Gmail, Photos, Voice and RankBrain. TensorFlow originally was released as an evolution of Google’s internal neural network training framework “DistBelief” by the Google Brain team. On the simplest level, TensorFlow enables the large-scale and parallel manipulation of “Tensors,” multi-dimensional arrays that carry vectorized data. The latest releases of TensorFlow have improved the scalability with new features that add APIs and deployments onto all types of devices.
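To make "multi-dimensional arrays that carry vectorized data" concrete, here is a tiny illustration using NumPy rather than TensorFlow itself; the pages-by-queries-by-features interpretation is invented purely for the example and is not how RankBrain is actually structured:

```python
import numpy as np

# A rank-3 tensor: say, 2 pages x 3 queries x 4 ranking features
scores = np.arange(24, dtype=float).reshape(2, 3, 4)

# "Vectorized" means one expression touches all 24 values at once,
# with no explicit Python loops
normalised = scores / scores.max()

# Contracting the feature axis with a weight vector yields one
# relevance number per (page, query) pair
weights = np.array([0.4, 0.3, 0.2, 0.1])
relevance = scores @ weights
print(relevance.shape)  # (2, 3)
```

TensorFlow generalizes exactly this kind of operation: the same tensor expressions, but compiled to run in parallel across many machines and accelerators.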

TensorFlow & SEO

So what do machine learning and TensorFlow have to do with SEO, algorithms and reaching that coveted No. 1 spot in the SERPs? As Google’s RankBrain gets smarter in understanding users and their intent, it’s also learning to better understand content, information and if that content will provide the right answer, not only to the query but also to the individual user. With the algorithm now truly understanding the query intent on a linguistic level, it can deliver new kinds of results that are correlated and weighted in a way that a human brain can’t even begin to predict. This dramatically changes two aspects of SEO: technical SEO and content SEO.

Technical SEO

As many have said before, technical SEO in the narrow sense of fixing links, optimizing title tags and ensuring correct markup is no longer a standalone role, meaning no brand should be hiring a practitioner just for that purpose. This nuts-and-bolts work should be done from the beginning of the website build and audited by the web dev team on an ongoing basis.

Instead, the true technical SEOs of the future need to understand more than just HTML and XML; they need to understand how machine learning works, how TensorFlow handles data and weighs inputs, and how to understand and train models. There will always be crawling and discovery, but the main focus is now more analytical — truly data-driven, with the SEO practitioner more a mathematician and software developer than a web designer.

Content SEO

The last few years have seen a convergence of SEO content and content marketing. We know we must now create contextually relevant content that is authoritative, not just keyword-stuffed. Now it’s time to look at more than minimum/maximum character counts and keyword density. We have to start using machine learning models and linguistic analysis to weigh and score our content to ensure it truly answers the consumer question, instead of just telling a brand story.
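As a toy illustration of scoring content against a question rather than counting keywords, one classic building block is cosine similarity between term-frequency vectors. A deliberately simple Python sketch — real linguistic analysis would use embeddings and far better preprocessing than whitespace splitting:

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two term-frequency Counters."""
    dot = sum(a[t] * b[t] for t in set(a) | set(b))
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def relevance(query, page_text):
    """Score (0..1) how directly a page's vocabulary matches a query."""
    return cosine(Counter(query.lower().split()),
                  Counter(page_text.lower().split()))

query = "how to remove red wine stains"
answer_page = "to remove red wine stains blot the stain then apply salt"
brand_page = "our proud brand heritage of quality and innovation"
print(relevance(query, answer_page) > relevance(query, brand_page))  # True
```

Even this crude measure prefers the page that answers the consumer's question over the one that only tells a brand story, which is the shift in content scoring the column is describing.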

Personally, I am excited about the new frontiers of SEO and the evolution of the field, and I invite anyone out there who’s in doubt to just look at the growth of voice and conversational search. It’s all powered by machine learning and technologies like TensorFlow. The time is now.


Why e-commerce SEO is NOT a one-shot wonder


Columnist Trond Lyngbø explains that SEO is an ongoing process that requires continuous investment — not a one-time task to be checked off a list.


A “one-shot wonder” is someone who gets drunk after one shot — an extreme lightweight. In business parlance, the term refers to an organization that shines brightly for a short while… before fading away to nothing. As an e-commerce web shop deploying strategic SEO for your business growth, the last thing you want is to be known as a one-shot wonder. In fact, because SEO picks up steam gradually, then becomes increasingly powerful and effective at drawing in new prospects and closing more sales, you should ensure that everything you do is sustainable and long-lived.

Unethical cut-and-run SEO providers try to lure in new clients with the promise of “quick and easy” top rankings on Google. They’ll encourage the use of tactics that work immediately to drive a quick burst of traffic, but have little if any lasting impact. The key difference that sets apart winning e-commerce sites from their competitors is that they select a quality SEO service provider, and then allocate a budget to provide follow-up action after the initial job is done.

Old habits die hard

If a client doesn’t have funds available for a retainer contract for ongoing consulting, or a plan and support system to follow up once a project is done, then they will almost certainly revert to their old habits — and gradually slide back to where they were before. It’s hard to overestimate the importance of continuously working with SEO. In another article, I highlighted the principles of successful web shop SEO:

  • Use search and analytics data to guide strategy.
  • Have a customer-focused website structure and information architecture.
  • Base content creation upon search data.
  • Solve critical problems early in the process.
  • Follow SEO best practices for e-commerce webshops.

What I didn’t mention, though, is that this is a continuous and ongoing process… which can unravel if you stop working on it.

How your data-driven SEO strategy will evolve

Let’s imagine that you’ve implemented all these principles in your e-commerce SEO and are seeing an improvement in website traffic. If you stop working on your SEO, then you might overlook new opportunities to grow or miss a chance to fix what’s broken. When you continue to review your search and analytics data and use this to inform your ongoing SEO strategy, you’ll reap many benefits:

  • You’ll quickly find out whether the search terms you picked appeal to your audience and attract them to your web shop.
  • You’ll identify the search terms that are more effective at bringing in buyers, and even stumble upon new ones that you can optimize your site for.
  • You’ll know how effectively your website content addresses customer queries, needs, and problems.
  • You can tell if your content and website architecture work to guide visitors to buy from your stores.
  • You can pick up problem areas early and promptly take corrective action before you lose sales or hurt your brand.

Based on an analysis of your business performance on key metrics, you can then make some tweaks and additions to your SEO strategy to further boost your rankings, traffic, and sales.
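The kind of data-driven review described above can be sketched in a few lines. Assuming a hypothetical analytics export of visits and orders per search term, this ranks terms by conversion rate so the terms that actually bring in buyers rise to the top.

```python
def rank_terms(analytics):
    """Rank search terms by conversion rate (orders / visits)."""
    return sorted(
        ((term, d["orders"] / d["visits"]) for term, d in analytics.items()),
        key=lambda pair: pair[1],
        reverse=True,
    )

# Hypothetical export from your analytics tool
analytics = {
    "washing machine": {"visits": 1200, "orders": 18},
    "top-loading washing machine": {"visits": 300, "orders": 12},
    "cheap washer": {"visits": 800, "orders": 4},
}

for term, rate in rank_terms(analytics):
    print(f"{term}: {rate:.1%}")
```

Note how the lower-traffic term can still be the best performer — exactly the kind of opportunity that goes unnoticed if you stop reviewing the data.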

Monitoring helps improve your strategy

As the effects of these changes kick in, you’ll have more visitors coming to your e-commerce website — which gives you a fresh opportunity to study their behavior and learn more about your audience. For example, if you notice that many people search for a “top-loading washing machine,” it probably means that they don’t have a lot of space. In this situation, the product pages or informational content you show them should communicate how a top-loading washing machine will save space without any compromise on cleaning efficiency. You can answer questions they may have about the product from the unique perspective of their needs because you took the time to analyze user intent based on their choice of search terms.

People may wonder about a lot of stuff before they decide to buy. If you can identify these problem areas, research their behavior and analyze the specific issues that bother them, you will increase your chances of closing a sale.

SEO helps integrate teams

If you’re an e-commerce manager or director in a company that also has local shops, you will have operations handled by both internal and external teams working across many disciplines. All these activities need to be integrated for the best results. But that involves plenty of interaction and discussion between teams. In most organizations, such communication doesn’t happen. That’s why it is so difficult to sustain early results. One of the benefits of having an SEO consultant handle follow-up activities is that you’ll make sure these teams actually work in concert.

Rome wasn’t built in a day! And likewise, a successful SEO strategy to turn around your e-commerce website cannot be implemented in one sitting, or in a short stretch. The best outcomes are seen when your SEO tactics are consistent, strategic and sustained. So be ready to budget for an SEO retainer contract that will safeguard your initial investment — and multiply it many times over.


Current Cryptocurrency Capital Influx Hints at Positive Bitcoin Price Evolution


No one wants to miss out on what cryptocurrency may be capable of achieving in the future.

  

People often wonder if it is too late to invest in Bitcoin. Every year, the answer is no. The Bitcoin price continues to rise, especially for people who hold cryptocurrency for the long term. The year 2017 will be no different in this regard. Some experts feel a second wave of investments will come soon. If that is the case, the Bitcoin price may soar to US$2,000 or higher.

Bitcoin Price Will Go Higher in the Long-term

Investing in Bitcoin is always a smart decision. Hardly anyone will argue that point at this rate. However, there are some things to take into account. One should always look at the long-term Bitcoin price. Day trading can still be profitable, yet it is losing its appeal, especially with so much money pouring into cryptocurrency right now. To be specific, the cryptocurrency market cap has grown to about US$45bn, quite a jump from less than a year ago. Not all of this money is going to Bitcoin directly, however; a lot of altcoins are booming in value right now. An increasing Bitcoin price is bringing more attention to altcoins. This is no longer about “bitcoin vs alts”. It is about cryptocurrency as a whole.

Rest assured a lot of people are looking to invest in cryptocurrency. It is a domino effect, so to speak. People learn about Bitcoin, see the price rocket, and feel bad for not investing sooner. The smart ones will understand the Bitcoin price has not even begun to peak. There is still a lot of room for gains across the cryptocurrency market. The BTC price will appreciate, and so will the values of a lot of altcoins.

An Excellent Time To Invest in Cryptocurrency

Do not be mistaken in thinking things will explode overnight, either. We have seen massive value gains for over a week now, with no sign of things slowing down. All of this has been achieved without a convenient way for institutional investors to pour money into cryptocurrency. An ETF could shake things up quite a bit, but it is not a necessity by any means.

All it takes is a handful of high-net-worth investors venturing into cryptocurrency. For all we know, this has already happened over the past week or so. No one wants to miss out on what cryptocurrency may be capable of achieving in the future. Regardless of how things play out, investing in cryptocurrency is a smart decision. The Bitcoin price, as well as the value of altcoins with proper use cases, will undoubtedly increase further. However, never put all your eggs in one basket.


Cryptocurrency Market Cap Soars Above $40 Billion, Boosted By Hefty Trading


Bitcoin may be getting most of the headlines, but cryptocurrency as a whole is on a roll. Statistics from Coinmarketcap.com reveal that 82 out of the top 100 cryptocurrencies posted gains in a recent 24-hour period. Whether all cryptocurrencies are riding bitcoin’s coattails or investors are suddenly discovering altcoins is anybody’s guess.

The total cryptocurrency market capitalization (price per coin times the number of coins in circulation) stands at $42.6 billion. That marks a gain of more than $10 billion in 10 days.
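The market-capitalization formula in parentheses is simple arithmetic; here is a quick sketch using illustrative figures close to the bitcoin numbers cited in this article (the circulating-supply count is an assumption for the example).

```python
def market_cap(price_usd, coins_in_circulation):
    """Market capitalization = price per coin x coins in circulation."""
    return price_usd * coins_in_circulation

# Illustrative: ~16.3M BTC in circulation at the article's $1,567 price
btc_cap = market_cap(1567, 16_300_000)
print(f"${btc_cap / 1e9:.1f}B")  # about $25.5B, matching the figure below
```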

Ripple Leads In Growth Rate

Among the currencies with a market capitalization in excess of $1 billion, Ripple has posted the top growth rate of 33.6% in a 24-hour period, yielding a $2.831 billion market cap. Litecoin comes in second with a 22.34% growth rate and $1.132 billion market cap. Ripple’s gain has been credited to a strategic partnership initiative, teaming with Asian and Australian banks in conjunction with its stated goal of acting as a PayPal-like mechanism for large interbank transfers. Litecoin, for its part, has benefited from Coinbase’s decision to support it, allowing users to buy, sell and store Litecoin using its platform and wallet. It became the third cryptocurrency, after bitcoin and Ethereum, to gain Coinbase’s full support.

What Drives Bitcoin?

Bitcoin, far and away the largest with a market cap in excess of $25 billion, posted a 5.81% 24-hour jump. Bitcoin’s price reached a new all-time high once again, at $1,567. Brian Kelly, a financial analyst at CNBC, has attributed the recent surge in bitcoin’s price to the rise in institutional investors within the bitcoin market. Other factors include the bitcoin community’s consensus not to support Bitcoin Unlimited, and an overall increase in global trading. Some analysts have attributed part of bitcoin’s growth to that of the altcoins; altcoins are usually bought and sold with bitcoin, requiring traders to buy bitcoin first.

Ethereum Has Its Own Story

Ethereum, which has the second-highest market cap at just over $8 billion, jumped 12.12% in the 24-hour period. Its price rise is due to a number of factors. Google searches for Ethereum have spiked to an all-time high, nearly doubling in just one week. Some countries appear to be using ETH as a hedge against national currencies. Switzerland, where the Ethereum Foundation is based, showed the strongest interest, followed by Venezuela, which is suffering triple-digit inflation.

South Korea seems to have fallen in love with the currency. Its three largest exchanges handle twice the ETH/fiat volume of Coinbase’s GDAX and Kraken combined. South Korea is also big into fantasy sports, an area where ETH’s smart contracts can be used to make the game more transparent and reduce cheating.

Don’t Forget Dash

Dash, number 5 with a $683.3 million market cap, jumped 6.77% in the 24-hour period. Featuring exceptional transaction speed, Dash continues to become more accessible to investors and consumers.

The cryptocurrency exchange Kraken recently announced the integration of Dash to its trading platform. BitCart, an Ireland-based discount gift card platform, recently allowed users up to a 20% discount for using Dash on Amazon purchases. Crypto-Woo, a payments plug-in, has integrated Dash, allowing users to pay for online purchases with Dash. CryptoBuyer, a Venezuela-based crypto exchange, has begun selling Dash, allowing consumers in the economically ravaged country to have another alternative to its imploding national currency.

Ethereum Classic, number six at $664.4 million, rose 8.97%.
NEM, number seven at $521.7 million, jumped 9.5%.
Monero, number eight at $371 million, rose 8.76%.

The top 14 cryptocurrencies all posted gains in the 24-hour period. PIVX, which at $84.1 million has the 15th-highest market cap, posted a 3.16% drop.


Cryptocurrency: Top 5 Things You Should Know About Digital Cash


If you’ve had your ear to the fintech streets over the last few years, you’ve probably heard the term Bitcoin tossed around as cash’s digital counterpart. What you may not know is how Bitcoin’s emergence in 2009 has spawned a race across the globe to be part of the emerging trend. What exactly is Bitcoin? Will it replace cash? What does it mean for your small business? Here’s a quick rundown to get you up to speed.

What is it?

Bitcoin is a type of cryptocurrency, or digital currency, that uses encryption techniques to create units and secure transactions. What’s unique about this invention is that it decentralizes currency away from traditional banks, meaning people can complete financial transactions without any bank involvement or regulation. Bitcoin was the first cryptocurrency invented, and it is still by far the largest in the market.

How do you use it?

To simplify it further, it’s basically a peer-to-peer sharing network. Members can initiate transactions through the network; however, no actual currency is created or transacted until both parties agree on the amount. Here’s how it works:

1. Someone requests a transaction.
2. The request is broadcast to the P2P network composed of computers, or “nodes.”
3. The network initiates a validation process to verify both users and the transaction amount.
4. Once the transaction is validated, the cryptocurrency is created in the amount that was agreed upon in the validation process. If the amounts or the network credentials don’t add up, the transaction request is denied.
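The four steps above can be sketched as a toy script. A real network relies on cryptographic signatures and consensus algorithms; in this hypothetical version, every node simply checks the sender's balance before the shared ledger changes.

```python
class Node:
    """One participant in the toy P2P network."""
    def __init__(self, ledger):
        self.ledger = ledger  # shared view of all balances

    def validate(self, sender, receiver, amount):
        # Step 3: verify the sender can actually cover the amount
        return amount > 0 and self.ledger.get(sender, 0) >= amount

def broadcast_transaction(nodes, ledger, sender, receiver, amount):
    # Steps 1-2: the request is broadcast to every node
    if all(node.validate(sender, receiver, amount) for node in nodes):
        # Step 4: only a validated transaction changes the ledger
        ledger[sender] -= amount
        ledger[receiver] = ledger.get(receiver, 0) + amount
        return True
    return False  # credentials or amounts don't add up: denied

ledger = {"alice": 5.0, "bob": 1.0}
nodes = [Node(ledger) for _ in range(3)]

print(broadcast_transaction(nodes, ledger, "alice", "bob", 2.0))   # True
print(broadcast_transaction(nodes, ledger, "alice", "bob", 10.0))  # False
print(ledger)  # {'alice': 3.0, 'bob': 3.0}
```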

The cryptocurrency has no physical form and only exists within the network. Value is only assigned once the agreed terms are validated. Holders can then withdraw the value from a cryptocurrency ATM in exchange for the currency they’d like to use.

Is it legal?

The legality of cryptocurrency varies by country. Some countries have explicitly allowed it for trade, while others have banned it outright. In the United States, the Internal Revenue Service (IRS) has ruled that bitcoin will be treated as property for tax purposes, as opposed to currency. So it’s legal to own and use for trade internationally; however, it will be subject to capital gains tax.

Are US shoppers using it?

Sure. Knowledge about Bitcoin has increased so significantly since 2014 that there are now 554 Bitcoin ATMs in the U.S. These are stations that Bitcoin owners can use to exchange bitcoin for U.S. currency.

How will this impact my business?

While Bitcoin is gaining steam in the US and across the globe, it will likely be a few years before this impacts the small business sector. Since the IRS hasn’t identified cryptocurrency as legal tender, it likely won’t surface as a mainstream payment option for another decade or so.

However, cryptocurrency has the legs to gain popularity within contract-based subsectors. If adopted at full scale, organizations like banks and insurance companies could be replaced. Access, validation, and other major functions can be performed by the technology itself, so bank and insurance underwriting would no longer be a limitation for people who are typically denied credit. Rules, contracts, and processes can be programmed within the peer-to-peer network and thereby transformed into automated processes.

For example, insurance policies for flight delays could pay out immediately if an airline’s flight data reports a delayed plane, so people would no longer have to waste time claiming compensation. Musicians’ royalties could be paid automatically via the blockchain when people listen to their songs, without a record company being involved. The amount of self-generated solar power could be calculated without checks by a utility company and credited to the user’s account.
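The flight-delay scenario can be sketched as plain code to show what "rules transformed into automated processes" means: the payout rule is a function triggered by a data feed rather than a manual claim. The policy fields, feed format, and delay threshold here are all hypothetical.

```python
def flight_delay_contract(policy, flight_data, threshold_minutes=120):
    """Pay out automatically when the reported delay crosses the
    threshold -- no claim form, no underwriter in the loop."""
    delay = flight_data.get(policy["flight"], 0)
    if delay >= threshold_minutes:
        return policy["payout"]
    return 0.0

policy = {"flight": "XY123", "payout": 250.0}
feed = {"XY123": 180}  # airline feed reports a 180-minute delay

print(flight_delay_contract(policy, feed))  # 250.0
```

On a real smart-contract platform this logic would run on the network itself, so neither party could skip or alter the payout after the fact.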
