’Tis the season to be algorithmically presented

The end of the year presents us with the opportunity to look back and reflect on the past year. Fortunately, if you happen to be on Facebook, they provide the review for you in their "yearly review" app. The app gathers your status updates from the year, puts them together, and presents your year back to you. Then you can share it. To be extra kind, Facebook has also pre-filled the text of the status update.

In many ways, one could consider this a nice gesture and a nice piece of software from Facebook. Now we don't have to do the hard work of reflecting on our year. No need to think about and decide the important points. In true Silicon Valley solutionism style, the problem is solved. Of course, I know I do not have to share the presentation if I don't want to, but it keeps popping up in my feed time and time again, and every time it sends shivers down my spine.

Why?

Well, first, it reminds me of how much this big corporation knows about me, or thinks it knows. And the information served to us in the form of the yearly review is just a fraction of the data Facebook collects about us (and sells to their advertisers). This yearly presentation shows us only a little glimpse of the algorithms working in the background, churning away quietly, patiently collecting all the little bits and pieces we pour into them.

It is not that algorithms are bad in some way; they are essentially just pieces of code, instructions to do a predefined task. But the ways and the cases in which we use them trouble me. Why aren't there more open algorithms aimed at enhancing the common good and wellbeing? This video by Harlo Holmes, for example, is just a tiny peek at what we could achieve with algorithms when they are put to good use. Why is it that the most sophisticated algorithms are used to gather information about us, categorising us like a herd and then selling this information to advertisers? You can be single, married, divorced, male, female, white, gay, in your mid-twenties, and so on. You might also be profiled to like a certain genre of clothing, music, or movies. You are most active with these friends, and then your friends' data is compared to yours, and so on. All this to build a profile of the way you act and what makes you happy: how to deliver an advertisement that speaks to you. For your benefit, naturally. Another technological solution to a problem that doesn't need solving: how to deliver ads so that they are effective and not annoying. (IMHO: all ads are annoying by their nature.)
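To make it concrete what such profiling might look like in its crudest form, here is a minimal, purely illustrative sketch. The attributes, segment names and rules are invented for this example; they are not anything any real platform actually uses.

```python
# Purely illustrative: a crude ad-targeting "profile" built from invented attributes.
# Nothing here reflects any real platform's actual algorithm or data model.

from dataclasses import dataclass, field

@dataclass
class UserProfile:
    age: int
    relationship: str                       # e.g. "single", "married", "divorced"
    liked_pages: list = field(default_factory=list)
    active_friends: list = field(default_factory=list)

def ad_segment(user: UserProfile) -> str:
    """Bucket a user into a hypothetical advertiser segment."""
    if "indie music" in user.liked_pages and 20 <= user.age < 30:
        return "urban-millennial-music"
    if user.relationship == "married" and user.age >= 30:
        return "family-household"
    return "general-audience"

me = UserProfile(age=26, relationship="single",
                 liked_pages=["indie music", "vintage clothing"])
print(ad_segment(me))  # -> "urban-millennial-music"
```

The real systems are vastly more sophisticated, of course, but the principle is the same: reduce a person to a handful of attributes, then sell the resulting label.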

There are naturally many more uses for algorithms; Christopher Steiner gives a nice overview of them in his book Automate This: How Algorithms Came to Rule Our World.

He writes: Algorithms normally behave as they’re designed, quietly trading stocks or, in the case of Amazon, pricing books according to supply and demand. But left unsupervised, algorithms can and will do strange things. As we put more and more of our world under the control of algorithms, we can lose track of who— or what— is pulling the strings. This is a fact that had sneaked up on the world until the Flash Crash shook us awake.


Second, they force on me a solution I don't want or need. They present me with a template for and of my life. Everything in it comes from within the bounds of Facebook; that's where everything happens and where we are our true selves, naturally. They want to engage me more deeply in their ecosystem. (On a side note, have you noticed how difficult it is to share anything you find on Facebook outside of Facebook, especially on mobile devices?) It might be nice to look back at your status updates yourself and see what you have done; this is similar to ThinkUp, which can offer some meaningful insight into your use of social media. But it is worth asking why share it with everyone, or why Facebook wants us to share it with everyone. In a way, this reminds me of how Facebook slowly and quietly invades our lives more and more. Who remembers the Beacon episode? After that, Facebook has quietly launched more and more ways to gather knowledge about us. Of course, sometimes something spills out.

Why is all this a problem? Maybe it isn't; it depends on the angle you look at it from. Personally, I find that all of this limits my freedom as a "user". And this is true of most social media sites and beyond. (Yes, I'm looking at you, Google, the behemoth.) These companies keep their algorithms and source code so secret that you might suspect some sinister magic is behind them. There probably is, too. But the more down-to-earth point is this: if we want to live in a digital world and interact with each other in it, wouldn't it make sense to do it either in a public, open way, on an open platform, or in your own style? I don't recall ever sending my friends letters that were pre-filled. Why do it in the digital world? Because it is easy? Really?

Third, the yearly review does not have any thought in it, if we don't count the countless hours some developers poured into it. The working product is just algorithms, code. The process is thoughtless and emotionless. This can lead to inadvertent algorithmic cruelty, as in Eric Meyer's case. And this inadvertent cruelty may be even more common, according to Dale Carrico in his post for the World Future Society.

Eric Meyer writes: Algorithms are essentially thoughtless. They model certain decision flows, but once you run them, no more thought occurs. To call a person "thoughtless" is usually considered a slight, or an outright insult; and yet, we unleash so many literally thoughtless processes on our users, on our lives, on ourselves.
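To see how little "thought" such a decision flow contains, here is a deliberately simplified sketch of a highlight picker. The selection rule is invented; it is not Facebook's actual algorithm, only an illustration of a rule that runs without any awareness of context.

```python
# Illustrative only: a "year in review" highlight picker reduced to a bare decision flow.
# The rule is invented for this sketch; it is not any real product's algorithm.

def pick_highlight(posts):
    """Return the post with the most reactions, with no notion of context.

    Nothing here distinguishes a joyful post from a grief-filled one:
    once the rule runs, no further 'thought' occurs.
    """
    return max(posts, key=lambda post: post["reactions"])

posts = [
    {"text": "New job!", "reactions": 40},
    {"text": "In loving memory...", "reactions": 215},  # grief draws engagement too
]
print(pick_highlight(posts)["text"])  # -> "In loving memory..."
```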

After all this lecturing and preaching, I must say that I am not opposed to seeing people share their yearly reviews (though I must admit, I do not look at them). I am just hoping that if you want to share your yearly review, it would be genuine and hand-made, not a feel-good ego boost done in a double click. What the digital world does not need is for us to automate more of our emotions and empathy; what it does need is handcrafted, individual work made by humans, something that can be felt.

First and second generation of the internet

I recently stumbled across this tweet:

And although I can see its intention, I somewhat disagree with it. I also understand that concepts like the 1st and 2nd generation of tech are naturally ambiguous, but as I understand it, it means the generations of digital technologies and the internet. It could just as well mean the generations of old and contemporary cars; it would not make that huge a difference.

If we look at the internet in its 1st generation, I would say that it was a real time of personalization and humanization of the internet. Learning to script your way through to get your first web page showing, or starting a blog on LiveJournal or Movable Type, felt like, and was, personalization of the internet at its best. It was before the internet was monetized. (And maybe I am just an old fool who thinks everything was better back then.) True, it did take effort to get something onto the web, but that is part of the human experience. If I just stick a sticker on a piece of paper and call it my painting, I rarely have a feeling of accomplishment. (Unless it's an artistic statement on fine art, of course.)

Now the second generation of the internet offers us "web 2.0", social media and the like. It is so easy to write and share. And we even have stickers now on Facebook!

But making something easy is not humanizing or personalizing it.

Underneath all the machinery that makes this easy are large corporations collecting and mining our personal data, probably to be sold to marketing firms, or, in the case of Google or Facebook, to be offered as value to the marketers they draw onto their platforms.

One good example is the sudden fame of tilde.club

And if we talk not about the internet but about tech at large, then the 1st generation wins there too. With a little knowledge, or with a book from the library, we could, if we wanted, fix our cars, clocks, radios, even our phones. Now it's almost impossible.

If we think about how new tech, with smartwatches and smartphones, brings us personalisation and humanization, we have to realize that at the same time it brings the opposite. Algorithms that suggest new music, new exercises, or new restaurants to you are just bunches of code. All that may be happening is that we are left in a filter bubble instead of hearing something a little (or a lot) out of our comfort zone, and maybe ending up liking it.
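As a purely illustrative sketch of how such a filter bubble can form, here is a naive recommender. The catalogue and the "same genre" rule are invented for the example; real recommenders are far more elaborate, but the effect can be the same: it only ever suggests items close to what you already like, so nothing outside your comfort zone ever surfaces.

```python
# Illustrative only: a naive recommender that reinforces a filter bubble.
# The catalogue and the "same genre" rule are invented for this sketch.

listening_history = ["indie rock", "indie rock", "indie folk"]

catalogue = {
    "indie rock": ["Band A", "Band B"],
    "indie folk": ["Band C"],
    "jazz": ["Band D"],
    "classical": ["Band E"],
}

def recommend(history, catalogue):
    """Suggest artists only from genres the listener already plays."""
    favourite_genres = set(history)
    return [artist
            for genre, artists in catalogue.items()
            if genre in favourite_genres
            for artist in artists]

# Jazz and classical never appear: the listener is never nudged out of the bubble.
print(recommend(listening_history, catalogue))
```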

I would agree that the fight we are having right now is to bring humanization back to tech. But I wouldn't say it's a fight against the first generation; it's a fight against corporations and against the whole attitude of the tech industry. What is needed is not new solutions from engineers, but civic engagement and realization. As Sherry Turkle said in her book "Alone Together", we must realize that the internet and digital technologies are not done and ready; they are in their youth. We need to take a step back and think about what we want and need from technology.

It also may very well be that I have misunderstood the tweet completely, which is so common in this era of trying to say something in as little space and as quickly as possible. But I got to write this nevertheless.

Big Data, Small Politics

Evgeny Morozov recently gave a talk on the relationship between digital technologies and societal and political systems at the Collegezalencomplex, Radboud University, Nijmegen. It's a long talk, but worth a watch. Even the first half hour sheds light on the complex problems that can arise when internet solutionism and data surveillance are married to governments that outsource more and more of their services to the private sector.

Big Data, Small Politics: lecture by Evgeny Morozov, Thursday, October 16, 2014, 19.30–21.30 hrs, Collegezalencomplex, Radboud University, Nijmegen. Organised by the Soeterbeeck Programme.