As if it were real future-telling, the author, already back in 2011, prepares the reader to understand the perils of web personalisation and its potential consequences. Now, in 2017, those consequences have materialised.
Let's remember that an interesting part of Information Security is Personal Data Privacy (if that still exists!).
As always, a little disclaimer: this collection of learning points does not replace reading the book, and it constitutes a very personal list of items. Let's start:
- The arrival of personalised Internet search by Google in 2009 contributed to making the user of that search a real product rather than a customer.
- The delivery of personalised search results creates, for each of us, a personal bubble in which we live. This is great in terms of confirming our interests, but not so great in that it isolates each of us within our own bubble and system of beliefs.
- A different point, but also worth highlighting: asymmetry in email. The cost of sending an email is orders of magnitude lower than the cost of receiving and reading it (in terms of the human time devoted to it). This is the main reason why email spam exists.
- Facebook focuses on relationships among people and Google on relationships in data.
- Facebook focuses on what you share, Google on what you click.
- Both aim at the same final objective: user (i.e. product) lock-in.
- The author also talks about user behaviour as a commodity and how some companies monetise it, e.g. Acxiom.
- Interesting fact: Google News originated as a way to aggregate and curate news coverage after 9/11.
- A fact: more voices mean less trust in any given voice.
- In the US in 2011, people watched TV for an average of 36 hours per week.
- Definition of TV: Unobjectionable entertainment.
- The key to keeping audiences happy: creating content in response to their likes.
- Personalised filters affect the way we think and learn.
- We tend to treat papers packed with lots and lots of data as "likely to be true".
- Information itself wants to be reduced to a simple statement.
- The more of an expert you are in a topic, the more biased your view of reality and the less successfully you will predict.
- Consuming information that conforms to our ideas is super easy. That is why we do it.
- The filter bubble shows us some things, but it also hides others from us, and we are not compelled to learn about new things if we do not even know they exist.
- It is important to be able to do what you would like to do, but also to know what it is possible to do.
- For the time being, Internet personalisation does not capture the difference between your work self and your play self.
- There is a difference between what we watch and what we should watch.
- Profiling gives companies the ability to circumvent your rational decision making.
- Personalisation still does not distinguish signal from noise.
- If our best moments are often the most unpredictable ones, what will happen to us if our bubble is fully predictable?
- The bottom line: in the book, the author says that we do not know the effects of this filter bubble. However, six years after its publication, we can see its real consequences in terms of fake news and isolation.
- The existence of the cloud: personal data stored in the cloud, outside your computer, is much easier to search than information kept on your own machine.
- Statement extracted from the book (published in 2011): "Personalised outreach gives better bang for the political buck".
- In the post-materialist era, we buy things to express our identity, not because we need the item we buy.
- The personalised bubble makes it more difficult for people in a community to make good collective decisions.
- Peter Thiel, the American entrepreneur and PayPal co-founder, states that "freedom and democracy are no longer compatible".
- Engineers resist the idea that their work has moral or political consequences.
- Small pieces of advice: delete your browser history every now and then, and, if you dare, your cookies ;-) Use the incognito tab in your browser.
- Be aware of the power of defaults, e.g. by default, when you open the browser, you do not land on an incognito tab.
- The author states that this technology could also improve things if companies were transparent about how their filters work and how they use our data.
- Corporate responsibility is required, and probably also some kind of oversight.
- Personal data should be considered personal property.
Too much to think about in only one post!
Happy reading!