I will start this post with a simple question: how much of your daily time on the Web is spent on Google products or on Facebook? A very large proportion, I presume, and this should not be very surprising, since the Internet favours a winner-takes-most scheme, i.e. a market in which only an extremely small number of players can survive economic competition, the winning players taking most of the market share.

Nowadays, the two big ‘survivors’, Facebook and Google, are engaged in a bloody battle in which the weapons are ways to gather more information/data about their users. More than ever, this summer and early fall have made this information war and the ‘weapons’ used obvious. Think of Google+ [1], Google Wallet [2], Google Offers [3], Facebook Timeline [4], Facebook Open Graph [5]…

Because the main casualties of this war will be, in the end, us, their users, and the Internet in general, I feel the need to share my viewpoint on the direction or vision those two companies want to impose on the Internet and the dangers they represent for our (e)Society.

In what follows, I will focus on Facebook, more precisely on post-F8 Facebook. Why this choice? 1) Articles about F8 are trendy :) 2) F8 captures well the dangers of the info-war.

In case you just came back to Earth and missed Facebook's F8 conference, held yesterday, the 22nd of September, here is a summary of the main changes the social platform has already put in place or is going to implement in the next few days/weeks:

  1. A new Profile, the “Facebook Timeline”: https://www.facebook.com/about/timeline
  2. A better and “smarter” Open Graph: https://developers.facebook.com/docs/beta/

To be concise, these two changes have the potential to deeply transform how nearly 1 billion individuals [6] interact online, by making the amount of information about people more open/visible, more precise and more substantial, and by making our interactions more integrated into our (digital) lives.

On paper, this sounds exciting, and those new features are surely impressive and well thought out, both on the front-end and the back-end. Nevertheless, they raise, in my opinion, several important questions. Since I want to keep this post short, I will only highlight two of them:

  1. Do we build eTrust by making interactions unconscious?
  2. How do we ensure eTrust while leaving an increasing amount of data in the hands of a single private company, without any satisfactory external oversight body?

eTrust

Why do we need trust or why does a society (offline or online) need trust?

In his analysis of trust, Luhmann [8] certainly gives the most appropriate and concise answer to this question by stating that “trust is an effective form of complexity reduction”. Put otherwise, the need for trust rests on the fact that trust is a starting point for deriving rules of ‘good’ conduct, or ways of acting ‘successfully’, by decreasing complexity, risk and uncertainty in a given (social) system. It now remains to understand what (e)Trust is.

What is eTrust, or equivalently online trust?

It is simply the driving force of any digital interaction. More formally, trust is often understood as a relation between an agent (e.g. a Facebook user) and another agent or object (e.g. Facebook). The relation is supposed to be grounded in the truster’s beliefs about the trustee’s capabilities (e.g. its reliability) and about the context in which the relation occurs. This is a generalisation of the definition of trust provided by Gambetta [9].

Trust therefore implies a truster who is conscious of interacting with the trustee.

The Interaction Awareness Issue

The new Open Graph [5] presented at F8 has been developed with the aim of increasing the amount of data exchanged with and stored by Facebook. However, Open Graph’s new features have, in my opinion, a very perverse side-effect: they make people unaware that they are interacting online. Why?

To be concise, the Open Graph protocol allows third-party websites to use the Facebook platform to engage with their users on Facebook and possibly increase their audience and set of interactions.

The new features of this protocol rest on a “user → action → object” model. This means that whenever a Facebook user installs a third-party application, every interaction he or she has with that third-party platform is automatically stored by Facebook and attached to his or her profile/identity. For example, let’s say I have the Spotify application installed: everything I listen to on Spotify will potentially be available to my Facebook friends, and of course to Facebook itself. The same reasoning applies to what I read, what I eat… You see the problem?
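
To make the “user → action → object” model concrete, here is a minimal sketch, in Python, of what recording such a triple on a user's behalf might look like. The app namespace (myplayer), the action (listen), the object type (song) and the URLs are hypothetical placeholders, and the endpoint shape is an assumption based on the Open Graph beta documentation [5]; this is an illustration, not production code.

```python
# Hypothetical sketch: a third-party app recording a "user -> action -> object"
# triple via the Graph API, without any explicit confirmation from the user.
# The namespace, action, object type and URLs below are illustrative only.
import requests

GRAPH_API = "https://graph.facebook.com"
ACCESS_TOKEN = "token-granted-when-the-user-installed-the-app"  # placeholder


def publish_action(namespace: str, action: str, object_type: str, object_url: str) -> dict:
    """Attach an action to the user's profile, e.g. myplayer:listen on a song."""
    response = requests.post(
        f"{GRAPH_API}/me/{namespace}:{action}",
        params={object_type: object_url, "access_token": ACCESS_TOKEN},
    )
    response.raise_for_status()
    return response.json()  # the id of the stored action


# The app can call this silently every time the user plays a song;
# the triple then becomes part of the user's timeline/graph.
publish_action("myplayer", "listen", "song", "https://example.com/songs/42")
```

Once the user has installed the application, nothing in this flow asks for his or her attention again: the sharing happens as a side-effect of simply using the app, which is exactly the awareness problem discussed below.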

This two-step mechanism blurs and even erases the awareness of interacting in a specific context, by exploiting the large number of digital interactions we make every day. How, then, can we develop a trustworthy Web when people are not even aware that they are interacting?

The Data Collection Issue

As a Facebook application developer, I had the opportunity to get my hands on the new “timeline profile” on the 22nd of September. If you are curious, you can do the same by following a step-by-step tutorial written by Greg Kumparak (TechCrunch) [7].

Once my “timeline profile” was set up, my very first impression was this: “OMG! The UI (User Interface) design is great. WTF! How can they think that I’ll be willing to post a photo of me as a baby, the name of my dog, something about a lost loved one or about the time I overcame an illness…”

The fact is that by building an intuitive UI, by giving their users the possibility to publish even more information about their lives, and by playing on our narcissism and on their immense community, the people at Facebook have found a great way to gather (even) more information about their users with relative ease.

Add to this the nasty consequences of the new Open Graph mentioned previously and you have an almost perfect data-collection weapon. The result is simple and not new: an extremely large amount of information about us, from basic identity details to very private ones, with precise and detailed information about our preferences, interests and habits.

What is new, though, is the ability to reconstruct our (digital) DNA, i.e. what we are, very precisely, or at least the convergence towards a desire to reconstruct that DNA automatically.

Now, our approach to privacy must change in the digital age, and it has already changed thanks to Facebook and Google, among others. I think this change was necessary for the Internet to keep developing, so my point here is not to say, as some do, that we should “boycott online platforms that gather data about ourselves”; clearly not. What I think is important is, again, to be conscious of how much data we make available on the Web and to whom we make it available.

In this respect, the new Facebook represents a potential danger: by automating the process of interacting online and sharing data, it makes individuals no longer accountable for their actions; this is what I call the Homo Facebookiens.

It is a bit paradoxical to observe our current democratic societies reacting strongly (sometimes with reason) whenever a government puts in place a way to gather more data about its citizens, while the same citizens share (overall) even more data with companies such as Facebook or Google, that is, entities that by definition do not follow a democratic structure and cannot be controlled by their users.

This leads to a simple and straightforward question:

How can we ensure a trustworthy and sane development of the Internet in particular, and of our society in general, when our ‘DNA’ is in the hands of a very few powerful companies?

Such a question will need to be addressed in the near future in order to ensure that those great companies/ideas that are Google and Facebook (and the like) continue to expand our common knowledge and make Homo sapiens a better human being, and not the reverse!

As always, your reactions are welcome and encouraged below.

References:

  1. https://plus.google.com/
  2. http://www.google.com/wallet/
  3. https://www.google.com/offers/
  4. https://www.facebook.com/about/timeline
  5. https://developers.facebook.com/docs/beta/
  6. http://www.facebook.com/press/info.php?statistics
  7. http://techcrunch.com/2011/09/22/how-to-enable-facebook-timeline/
  8. Luhmann, N. (1979), Trust and Power. Chichester: John Wiley.
  9. Gambetta, D. (1988), Can We Trust Trust?. In D. Gambetta (Ed.), Trust: Making and Breaking Cooperative Relations (213-238). Oxford: Basil Blackwell.

