The twentieth century belonged to “Popular Media” (apart, of course, from rock gods and World Wars). Building on the nineteenth-century telegraph, it saw the rise of radio and then the phenomenon that was television. This hunger for information gave rise to giants like Reuters, BBC and CNN, their power growing exponentially with their reach. Towards the end of the century, then, these media groups were probably the most powerful force in the world, representing the voice of humanity and capable of bringing even the mighty to their knees.
This natural progression, and the innate curiosity of mankind, led us to the culture of 24×7 news – which is where the troubles began. The insatiable urge of these media houses to bombard us with information (whether useful or not) diluted the very concept of journalism. It moved on to the dark lanes of tabloid journalism and then to the more unfortunate episodes of fabricated reportage. This, mixed with the ever-increasing power of the media, distorted the shaping of public opinion, and there is a prevalent feeling that something needs to be done.
To the rescue came – or so it seems – Web 2.0. The internet was already well established by the turn of the century, but the early 2000s saw the explosion of “Social Media.” The World Wide Web thereby moved on from being a vast box of information to a living web of a million connected and interacting individuals.
The explosion of social networking sites, from the ubiquitous “Facebook” to the much lesser-known “Couchsurfing”, meant that one could connect to like-minded individuals across the world, no matter how bizarre the field of interest. Much less glamorous was the rise of “Blogs”, which nevertheless seem to have taken over the creative and cognitive landscape of the web. As with the networking sites, you can find a blog to cater to the funniest, craziest, dumbest, and whatever-you-might-like interest. And I’d better not get started about Twitter here.
What makes this new concept so addictive and powerful is 1) the immediacy with which it spreads information, and 2) the intimate access to our lives that we grant it. In short, a tweet about my lost dog reaches all my contacts instantly, right on their handheld devices. This in turn is broadcast by them, and in a span of minutes the lost dog is a concern for more people than it needs to be. Good for the dog, then!
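The fan-out described above can be sketched as a toy breadth-first broadcast over a follower graph (the names and data here are entirely hypothetical, just for illustration):

```python
from collections import deque

def broadcast(graph, author):
    """Toy fan-out: a post spreads hop by hop through followers.

    graph maps each user to the set of people who follow them;
    everyone who sees the post re-broadcasts it to their own followers.
    Returns everyone the post eventually reaches.
    """
    seen = {author}
    queue = deque([author])
    while queue:
        user = queue.popleft()
        for follower in graph.get(user, set()):
            if follower not in seen:
                seen.add(follower)      # follower's device shows the post
                queue.append(follower)  # and they pass it on
    return seen - {author}

# A tweet about a lost dog reaches friends, then friends-of-friends:
followers = {
    "me": {"alice", "bob"},
    "alice": {"carol"},
    "carol": {"dave"},
}
print(broadcast(followers, "me"))  # reaches alice, bob, carol and dave
```

Note that "bob" never re-broadcasts to anyone, yet "dave" – two hops away – still hears about the dog: reach grows with each layer of re-sharing, which is exactly the immediacy the paragraph above describes.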
Humans are social animals, and when you have the fodder that is Web 2.0, tectonic changes are bound to follow. This living sphere of digital ones and zeroes evolved into the largest open forum on the planet, with views being exchanged, ideas floated and opinions offered to an ever-interested audience. To cite just one example, social media is widely credited as integral to the Arab revolutions of 2011. As one activist put it, “We use Facebook to schedule the protests, Twitter to coordinate, and YouTube to tell the world.”
But where does this freedom to use and broadcast information stop? The fiercest debate on this was raised by Julian Assange, the maverick founder of Wikileaks. While most netizens hailed him as a visionary, governments across the world were not amused. What followed is history, and the dust hasn’t settled for sure, but the matter might be crucial in deciding the future path of online freedom. Does an individual’s right of expression accord him/her the liberty to disclose secrets without knowing the full extent of their impact? And does the world have the right to know every fact, or does good old censorship still hold true?
As Uncle Ben said to Peter Parker, “with great power comes great responsibility”, and naturally the same applies to this nascent technocultural bubble. Unlike the popular media, there is scant editing or censorship in social media, and anyone with access to a mobile device is capable of shooting off “expert opinion” as he/she fancies. This is not only potentially hazardous, but also undermines the credibility of social media as a legitimate tool in any future conflict.
But the reality is far more tangled than it appears. Despite being hailed as the saviour of public information, Web 2.0 is anything but. Did you know that Google tracks 57 signals about each user before returning results for a search? And this even when you are not logged in! We thus live in what Eli Pariser calls the “Filter Bubble.” He describes it as “the personal universe of information that you live in online — unique and constructed just for you by the array of personalized filters that now power the web.”
In his book of the same name (a must-read, according to me), he argues that personalization is a sort of privacy turned inside out. Today’s net not only allows you to control what the world can see about you, but conversely also decides what you get to see of the outside. The worrying part, though, is that most of this happens passively, unknown to and uncontrolled by us.
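Pariser’s point can be made concrete with a toy personalization filter (the stories, topics and scoring rule below are invented for illustration and bear no relation to any real engine’s signals): rank stories by overlap with a user’s click history, and anything outside their existing interests quietly drops off the page.

```python
def personalized_feed(stories, click_history, top_n=2):
    """Toy filter bubble: score each story by how many of its topics
    the user has already clicked on, then keep only the top scorers."""
    def score(story):
        return len(set(story["topics"]) & click_history)
    ranked = sorted(stories, key=score, reverse=True)
    return [s["title"] for s in ranked[:top_n]]

stories = [
    {"title": "Cute puppy rescued", "topics": {"pets", "local"}},
    {"title": "Election results in", "topics": {"politics"}},
    {"title": "New phone released", "topics": {"tech", "gadgets"}},
]

# A user who only ever clicks pet and gadget stories...
history = {"pets", "gadgets"}
print(personalized_feed(stories, history))
# ...never sees the politics story: the filter drops it, and the
# user never learns what they were not shown.
```

The unsettling part, as the paragraph above notes, is the passivity: the user never chose to exclude politics; the filter inferred it from past behaviour, and the exclusion itself is invisible.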
The twenty-first century seems to belong to social media, but if we have learnt anything from the past, it is that it is up to us to prevent it from turning from a powerful source of focussed opinion into an uncontrolled fire-breathing monster. The internet is the most adaptable and dynamic invention of ours, and is in a way the first form of the self-conscious machine long feared by sci-fi writers. But the beauty of it all is that this self-consciousness has come to be defined by, and in turn defines, the cultural landscape of the world today. Where it will lead remains to be seen, but what is sure is that we are driving the bus, and therefore hold the sole responsibility for the path it takes.
P.S.: If you want to know more about Eli Pariser and his Filter Bubble, watch the video below.