Leadership is the one competency that cannot be learnt in management school. A manager is trained to do things right; a leader does the right things. It is not a matter of training and preparation, but one of instinct and conscience.

The Accidental Apprentice by Vikas Swarup

Blippex – Crowdsourcing Search


Competition in the search engine field has been stagnant ever since Google took over the scene. Prototypes have cropped up here and there, some even built by ex-Googlers, but most faded into oblivion as quickly as they appeared. Competing with Google is hard. Very hard. No other general search engine can match the relevance and speed of the big G at an international level. Nor can they so neatly mash up different types of content and data depending on the query. Bing, Blekko and DuckDuckGo are some of the runners-up still in the game, especially for the English-speaking world. (Google is still the undisputed international superstar.)

Blekko does some interesting result clustering into topics, although it comes at the cost of speed. DDG offers results frequently on par with Google, while providing a less cluttered and more private experience. It is dependent on other search engines and sources, but it does a great job of mashing up information from these different sources, providing quick answers to some questions. While Google develops everything internally or straight-up buys other companies’ technology, DDG relies on partnerships with other interesting services such as Wolfram|Alpha to become a little smarter. My only gripe with DDG is that their stance is pretty much anti-Google but not anti-Bing, which doesn’t make much sense. Otherwise I love what their data partnerships and quick answer box can do:

CSS color lookup in DDG: #ccc is… a shade of gray!

What none of these search engines did was to rewrite the whole concept of web search. They all pretty much play by the book. The only truly innovative engines after Google were the visual ones. As of 2013 there appears to be only one worthwhile visual search engine: Oolone. I feel tempted to give it a go. But besides a different way of presenting results, few people have been successful at coming up with an alternative way to retrieve and rank the results. In other words, PageRank is still king.

Oolone: visual search

Today, a new kind of web search came to my attention. The guys at Blippex had the idea of getting results solely based on which webpages people visit and how long their visits last. They don’t care about crawling the web through links. Naturally, it’s an idea with a big obstacle: getting traction is very, very hard. In the beginning, Blippex didn’t have a single site in their index! Blippex grows along with its user base: the more people use it, the more webpages it will get to know and rank. To this end, their browser extension helps them gather webpages and dwell times. Fortunately, they’re also very privacy-aware, making sure they don’t use cookies or store any other data.
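Blippex hasn’t published its ranking algorithm, but the core idea — score pages by how often people visit them and how long they stay — is easy to sketch. The scoring formula below (dwell time summed per URL, capped per visit) is my own guess for illustration, not Blippex’s actual method:

```python
from collections import defaultdict

def rank_pages(visits):
    """Rank URLs by aggregate attention: dwell time summed across visits.

    `visits` is a list of (url, dwell_seconds) tuples, the kind of data
    a browser extension might report. The formula is illustrative only.
    """
    scores = defaultdict(float)
    for url, dwell in visits:
        # Cap each visit's dwell time so one forgotten tab can't dominate.
        scores[url] += min(dwell, 600)
    # Highest aggregate dwell time first.
    return sorted(scores, key=scores.get, reverse=True)

visits = [
    ("https://example.com/deep-article", 240),
    ("https://example.com/deep-article", 300),
    ("https://example.com/clickbait", 5),
    ("https://example.com/clickbait", 8),
    ("https://example.com/clickbait", 3),
]
print(rank_pages(visits))
```

Note how the long-read article outranks the clickbait page even though the latter gets more visits — exactly the behavioral signal a link-based ranker like PageRank can’t see directly.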

Blippex: the new generation of search?

Presently, Blippex is still far from a usable state, missing several sites, but it’s only been online for a few months. Regardless of its ability to grow, I’m very fond of the idea, and for the first time impressed with a new search engine. I wonder what Blippex could accomplish if they strive long enough and perhaps adopt an equally innovative presentation. For that wonder alone, I will contribute to Blippex by using their browser extension. Realistically, though, there’s little chance that Blippex will manage to gather critical mass, and even less that it will understand its users’ intentions as well as Google does. For there’s much more to Google than simply PageRank.

Google will still have the upper hand:

  • They’re years ahead in research and development.
  • They have a huge and talented workforce.
  • They have advanced technology by experts in every field of computing.
  • They have humongous amounts of data to draw from.
  • They can probably also estimate the time spent on websites, either with their Toolbar or at least by checking whether the same user returns to Google after their first click on a result.
  • They have even reasonably survived manipulation attempts (webspam).

Still, if you’re intrigued by the idea, check out this article at Quartz for the interesting background of Blippex.

The Opportunity of Keywords Not Provided

music neon letters
Photo Credit: [phil h]

The world of Search Engine Optimization is being rocked again by Google. With most searches now being done securely over the HTTPS protocol, valuable referral data is being lost to traffic analysts. Now there is no way to know whether a visit to a certain page, coming from Google, resulted from a search about X or about Y or Z. We can still see visits to certain pages, and we can estimate how our pages are ranking, but not exactly what searches our visitors performed.
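What changed is mechanical: analytics packages used to pull the search terms out of the `q` parameter of the HTTP referrer. With secure search, Google strips the query from the referrer, so there is nothing left to parse — which is roughly what analytics tools mean when they report “(not provided)”. A minimal sketch of that extraction:

```python
from urllib.parse import urlparse, parse_qs

def search_terms(referrer):
    """Extract the `q` parameter from a Google referrer URL, if present."""
    query = parse_qs(urlparse(referrer).query)
    return query.get("q", ["(not provided)"])[0]

# Old-style HTTP referrers still carried the search terms:
print(search_terms("http://www.google.com/search?q=blue+widgets"))  # blue widgets
# Secure searches strip the query, so analytics only sees the origin:
print(search_terms("https://www.google.com/"))  # (not provided)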

There was a lot of commotion about this, but there’s no point in resenting Google, nor in eagerly gathering keywords some other way. The truth is that referral keywords are overrated, because they lead to optimizing for search visibility instead of goal completion (user conversion). Why? Because you only get data for the stuff that is already ranking.

It’s time to look ahead and realize that the fixation on keyword data is actually restrictive and blocking the evolution of Search Engine Optimization. Professional SEOs need to adapt in order to remain in business, but not on a wild goose chase of keyword data, wasting valuable time trying to micro-manage Web Analytics. Instead, the same time could be applied on actually improving the site, its popularity and (consequently) its backlinks.

Then what should be done on the Analytics side?

  • Predictive research is of the utmost importance: do keyword research thoroughly beforehand, investigating all plausible searches users make that aren’t too competitive to rank for.
  • When unsure about focusing a page on keyword X or Y, we can always make small changes later and test their impact. Instead of working passively, based on what we think our Analytics software told us to do, we work actively and measure the traffic impact afterwards.
  • Optionally use AdWords with broad queries and specific keywords, in order to test which specific keywords drive more goal completions. Some say we’re falling prey to the conspiracy theory that Google is driving us to spend more in AdWords, but the truth is… it’s a great method to choose from a few specific keyword variations that we identify in predictive research. It can be done much quicker and probably with less chance of skewing than using Analytics for this purpose.

But what really, really matters?

SEOs should have been broadening their perspectives and strategies even before today, and now more so. We should all view Search Engine Optimization, Search Engine Marketing, general Online Marketing and Social Media Management as one holistic field that can be explored for the benefit of clients. The goal of any of these marketing sub-fields is to make your clients richer and more successful, not simply to make them visible to search engines and people. For this, an online strategy should define multiple goals for the business it applies to, along with performance indicators to evaluate its effectiveness.

Now, with keywords not provided, the experience of good SEOs becomes invaluable because they know what works — and this at a time when it’s more difficult to prove what works. So the future is in the hands of those who know what they’re doing and are confident in their ability to drive targeted traffic. More than ever, reputation and experience will be key assets, and so will the ability to convince clients of the added value of SEO.

Basic Tips for Privacy on the Web

In light of recent news about N$A practices, you may wonder how to take a little more control of your Web presence and experience. Here are some steps to consider if you value privacy…

Email:
  • Make sure the connection to your email provider/server is secure. Webmail providers (those where you check your email in the browser) usually are. Others (server-based) should be double-checked to confirm they use an SSL/TLS connection.
  • If you’re in Europe, consider using a European mail provider, such as eclipso.eu. If you have a website hosted on European servers, you can set up your own domain for email.
  • For an additional security layer, consider encrypting your messages. Thunderbird users can use the Enigmail add-on. For Webmail, there are some browser extensions for encryption. Click here for more guidance on this.

Encrypted Searches:

  • Use encrypted.google.com as your default search engine, thus preventing eavesdropping by random people when browsing on insecure connections. You can also try DuckDuckGo — a search engine that doesn’t focus on personalized results. DDG’s results aren’t always perfect, so personally I stick with Google for searches (logged off).
  • Note: this won’t prevent your Internet Service Provider (a.k.a. ISP / your Internet access company) from knowing the sites you visit and the terms of your searches. Google might also keep track unless you turn off their web history.

Proxies/VPN [advanced]:

Enable HTTPS browsing:

  • HTTPS Everywhere
  • Enable HTTPS / secure browsing in Facebook’s privacy settings. Double-check your other settings there, in case Facebook sneaked in another “feature” with dubious purposes. Better yet, avoid Facebook altogether.

Hold your cookies:

  • Control the “cookies” stored by webpages (and their ads) on your computer. For example, you can configure your browser to keep cookies only until you close the browser. I suggest doing a complete cleanup of all cookies once before you configure this. Be ready to remember the passwords you have used in the past, because…
  • You will need to login again to any site requiring login on your next browsing session. You can counter this by letting your browser save passwords. Personally I prefer that to having all that cookie data on my computer, as I trust browser developers more than advertisers.
  • The privacy options of web browsers usually provide a Do Not Track setting, which in theory can help prevent advertising tracking. Google Chrome also provides prediction and spell-checking services which you might not really need.
  • Update: If you have Flash installed, check its settings to prevent Flash from storing any data on your computer. Unfortunately, web companies have adopted Flash as a cookie-like data-storage mechanism.

Blocking ad trackers, social plugins (and any scripts):

  • As a complement to the point above, or especially if you have issues controlling your cookies so tightly, you can also use browser extensions like Privacy Badger. They can identify and block trackers and social plugins that appear in many sites. If you block social plugins you actually stop seeing those annoying “X people like this page on Facebook” boxes.

Personal mentions and profiles:

  • If you’re being mentioned on the web and would like to disappear, SafeShepherd can help with that.

Online Storage:

  • Avoid storing all your personal files in the cloud, at least with companies from countries with snoopy governments.

Instant Messaging:

  • Now this one is a pickle. Microsoft is arguably making Skype less resistant to government snooping; Google has just removed the ability to disable chat history by default; and Facebook, well, is simply not trustworthy regarding how much of your data they keep and access freely.
  • If you want to be fairly confident about the eternal privacy of your chats, you might need to use something like ChatSecure or Pidgin’s encryption plugin. This is very hard in practice, because you need every other person to use the same tools.
  • In the end, you’re probably better off not caring much about it and keeping sensitive talks offline.

Did I forget anything?

Third-party Comment Systems Gone Wild

2013 must be the year of third-party commenting systems. Facebook’s comments were already popular and integrated into several sites. This year I’ve seen Disqus take over the comment sections of some websites, apparently increasing their lead over Livefyre, while Google is already deploying their Google+ comment integration in Blogspot.

These solutions might be interesting for the business owner / novice webmaster who wants to save some time on implementing comments on a plain, non-CMS site. On the other hand, I don’t really understand why many CMS-based sites are dropping their native CMS option in favor of a solution with so many drawbacks. This is what’s happening…

  1. Third-party commenting systems own the comments. They feed them into webpages through scripts, so the comments never become part of the page source — or, in other words, visible to search engines. It gets worse with Google+, where many “comments” are actually Google+ shares of the webpage; sometimes you can’t even follow a proper line of discussion on the original site.
  2. External systems can fail independently of your site being up and running. I’ve personally experienced a case where externally hosted comments just wouldn’t load. It also happened that I lost my comment for failing to realise that I needed to log in.
  3. Third-party comment systems track a user’s comments across all the sites using the same system. Just the kind of centralization that your favorite government intelligence agency loves.
  4. These systems also force the user to either create another account (adding complexity to the user’s own account/password management process) or to give them access to some of your social profile data.
  5. They may simply not work on mobile devices. With smartphone and tablet use on the rise, you’re losing valuable interactions with your site.
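Point 1 is easy to verify yourself: a crawler that doesn’t execute JavaScript only sees the initial HTML, so comment text injected by a script is simply absent. The check below uses made-up markup (the native snippet and the embed URL are illustrative; `disqus_thread` is the conventional Disqus container id):

```python
def comments_visible(html, sample_text):
    """Check whether a comment's text appears in the raw HTML a crawler fetches."""
    return sample_text in html

# Native comments are part of the page source the server sends:
native = "<article>Post</article><div class='comments'><p>Great read!</p></div>"
# Third-party systems ship only an empty container plus a loader script:
thirdparty = ("<article>Post</article><div id='disqus_thread'></div>"
              "<script src='//example.disqus.com/embed.js'></script>")

print(comments_visible(native, "Great read!"))      # True
print(comments_visible(thirdparty, "Great read!"))  # False
```

The comment exists on both pages from a visitor’s point of view, but only the native version survives a plain HTTP fetch — which is all most search engine crawlers did at the time.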

I wonder if there’s any study out there that measured user engagement before and after a switch from native to third-party comment systems… So far I’ve only found similar opinions. [1,2,3]