Reflecting on my social media activity
Over the years, using social media like Facebook has built in me a sense of vulnerability.
Many years ago, when I was happier and more innocent, I liked to share thoughts, links and pictures, and to comment on my friends' posts. Then some political changes in my home country divided people, including some of my good friends, and it became awkward to comment at all because, by doing so, I was going to be labeled and classified.
Some personal changes also encouraged me to stop sharing my life. I felt I had no protection when I most needed a hug; I was completely exposed, even to strangers, without my consent.
Nowadays, I share strictly what I can vouch for, what I can fight for, what I have the words or the energy for. And I only do it when I have a moment. I learnt not to feel the pressure of being "updated" at all times, not to need to know "everything" about nothing, and not to be led around the internet like a puppy on a leash. I use it only when I need it. I don't feel like a slave to the internet anymore.
I use the internet to work, to study, to research, to connect with my clients, to connect with my family overseas and to trade. I used to have a website to sell printed products I made myself, but since I started my new career I decided to close it; I will build a new one when I'm ready.
Every day I use a laptop that I consider my "right hand". I don't think I can live without a computer anymore! When I'm out, I use my phone mainly to browse the internet. I just wish it were as big as my laptop... I'm losing my eyesight with age, and it's very uncomfortable. I can't afford an expensive tablet just to take out, and this issue changes the way I interact with the digital world. If I had a bigger screen, I would probably be more focused on what I'm looking for, rather than worried about not being able to read properly.
The digital world definitely shapes us, and its absence changes the way we create, think, share and live!
POWER THROUGH THE ALGORITHM?
PARTICIPATORY WEB CULTURES AND THE TECHNOLOGICAL UNCONSCIOUS
This text explains how software algorithms have the power to make decisions, using information that shapes lifestyles and environments. This is done through relational databases and location-aware hardware that generate inferences.
These decisions are part of an invisible process in which data generates custom categories and tags customers in ways that companies use to maximize efficiency and profits.
Where our lives were once mediated by software and information, according to our ways of understanding the world, they are now constituted by them.
Lash (2007a) describes the emergence of new forms of complex power that he calls "post-hegemonic power". Hegemony (social and cultural structures) now resides in the everyday, and power operates from inside rather than from above.
Bauman (2007) also writes that content on the internet is user-generated ("power operating from inside", in Lash's terms). Ratings, reviews, blogs, posts and other content generate inferences through software algorithms that constitute the new post-hegemonic power.
Where power traditionally operated from outside, it now works from inside digital networks. The system has the power to organize itself using user-generated metadata tags. According to Lash (2006), information finds us, and domination occurs through communication.
Algorithms shape social and cultural formations and impact individual lives. Information about us is "harvested" to inform and predict individuals' tastes. Capitalism, then, has a new power: the power of algorithms to shape cultural experiences.
Distinctively, we are shaped by algorithms that sort and filter data, and these processes do not meet resistance. I believe they have an incredible, invisible, scary power.



Eli Pariser, in "The Filter Bubble", analyses the internet as a commercially controlled and extremely personalized space that is no longer independent or free. It controls our lives through filter bubbles that constantly learn from our behaviors, our search history, the type of computer we use and our location. Those bubbles repeatedly reproduce for us what we have chosen to see, and the ideology we have shown we are committed to.
"Personal intelligent agents lie under the surface of every Web site we go to. Everyday they are getting smarter and more powerful, accumulating more information about who we are and what we are interesting in" (Parisier, 2011:25)
The quest that has shaped the internet we know today started in 1995 with the race to provide personal relevance. For example, Amazon.com became an artificial intelligence company powered by algorithms capable of matching customers and books instantly. And because books don't need to be tried on, and a physical bookstore could never stock the millions of titles available, online stores could offer not only unlimited titles but a whole new intimate and personal experience for the shopper. It was already a system able to tune itself based on feedback. "Amazon proved that relevance could lead to industry dominance" (Pariser, 2011:30).
A few years later, the Google founders were thinking of using algorithms to sort through sites on the Web. They discovered that "The key to relevance, the solution to sorting through the mass of data on the Web was... more data" (Pariser, 2011:32).
Pariser's book is interesting to read, full of stories about the birth of Google and Facebook and their founders. It describes in detail how "the user is the content" of the internet, as filter bubbles are created from "click signals". "Whereas once you had to buy the whole paper to get the sports section, now you can go to a sports-only Web site with enough new content each day to fill up ten papers" (Pariser, 2011:51). This is just one of the many examples the book offers.
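Pariser does not describe any actual code, but the basic idea of a feed reshaped by "click signals" can be sketched in a few lines of Python (all names and data below are invented for illustration): items on the topics a user has clicked most often rise to the top, so the feed drifts toward what the reader already engages with.

```python
from collections import Counter

def rank_feed(items, click_history):
    """Rank feed items by how often the user clicked each topic.

    items: list of (title, topic) pairs.
    click_history: list of topics the user has clicked before.
    Topics with more past clicks float to the top; topics never
    clicked sink to the bottom and slowly vanish from view.
    """
    clicks = Counter(click_history)
    return sorted(items, key=lambda item: clicks[item[1]], reverse=True)

feed = [
    ("Election results", "politics"),
    ("Match highlights", "sports"),
    ("New phone review", "tech"),
]
history = ["sports", "sports", "tech"]  # the user mostly clicks sports

print(rank_feed(feed, history))  # sports story ranks first, politics last
```

Real recommender systems are of course far more elaborate, but even this toy version shows the feedback loop Pariser worries about: what you clicked yesterday decides what you are shown today.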
I wish I had had time to read it completely before this assessment. Even so, I can highly recommend it, because it gives enough signals to show us how we have been systematically manipulated by powerful corporations that seem to know too much about us.
At the end of the book, Pariser analyses our future in relation to the internet. He focuses on how to make technology and media serve democracy, and encourages people to understand that "There are literally hundreds of millions of us across all demographics - political, ethnic, socioeconomic, and generational - who have a personal stake in the outcome. And there are plenty of smaller online enterprises that have every interest in ensuring a democratic, public-spirited Web. [...] But in the end, a small group of American companies may unilaterally dictate how billions of people work, play, communicate, and understand the world. Protecting the early vision of radical connectedness and user control should be an urgent priority for all of us" (Pariser, 2011:243).

THE MYTH OF THE ONLINE ECHO CHAMBER
We call "echo chambers" those filter bubbles that constantly show us the same ideas, beliefs, views and information through news feeds and social media.
Robson (2018) explains that "reading habits shape our political opinions", and he recommends reading different sources (something social media can assist with) to counter political polarization.
He believes that the influence of echo chambers has been over-estimated, despite the well-known manipulation of social media. However, he thinks it is essential to teach KEY SKILLS at school for navigating the internet: basic critical thinking to identify bias in an argument.

JOINING THE DEBATE
Topic 2:  The creators of a technology are those most responsible for its ultimate uses.
It's hard to agree with this statement completely, because when a design is born with good intentions and someone else then uses it in the worst way possible, I think it's unfair to blame the creators. It is also really difficult to foresee the full ramifications of a creation. What had a certain meaning in one historical period might change completely as soon as the surrounding circumstances change; circumstances that were probably completely unexpected.
Having said that, I agree that as designers we must study the potential impacts of our creations and research their potential users deeply. Conducting interviews, testing the product, visiting the locations where the products are going to be used, understanding the rights and wrongs of our design, and being able to change it if we detect potential harm should be our way to proceed in order to be responsible creators.

