Is the ‘new’ trend going to define humanity itself?
Recently, in a New Media Studies class, I was intrigued by the excitement and enthusiasm my professor carried. I have always seen New Media as a very powerful discipline; however, it is seldom that you come across someone talking about design from so many perspectives that the birds of thought in your brain flap their wings and set off to fly again.
Looking at it from a modern outlook, it would not be fair on our part to pin new media down with some hard-and-fast definition. With the onset of the 'latest', and what was new yesterday no longer being new today, the lines have blurred and the field itself is spreading its wings at a rate comparable to universal expansion (just kidding). The more I think about this concept of New Media, the more tremendous it feels! Two words, 'New' and 'Media', each with a well-defined meaning, merge into a combination that the entire world is still exploring. It is more than just a medium; it is an extension of ourselves.
Some time back, I came across an article on AI (Artificial Intelligence) and how it was going to play a pivotal role and command most of the work processes around us. Focusing on AI separately, our definition and acceptance of what counts as 'intelligent' keeps shifting. In the past, we would have said only a superintelligent AI could drive a car or beat a human at chess. But once AI did each of those things, we considered the achievement obviously mechanical and hardly worth the label of true intelligence. Every success in AI redefines it.
Coming back to what initiated this chain of thoughts, and to my professor from the class above, some of his statements went like this: "New Media encompasses all sorts of interaction between new technology and established media forms to start with, and goes way beyond that to bringing in and visualising what could be the next new cool", thereby covering, if I look at it from a neutral perspective, everything I have the capacity to think of. "The very definition of new media is less settled upon, known and identified. It spans a complex path with the computer sitting at its locus of convergence."
If I have not expressed myself vividly enough yet, all you need to do is look in your favourite direction right now. The first object of interest you settle upon is still some distance away, and there is a whole lot in between, even at minute scales, all of which can be associated with some form of technology and may come under the umbrella of new media. All that remains is to tap into it, and don't get me wrong when I say 'remains': there are already predictions reaching far into the future, made by people who have spent their lives predicting things and often doing so correctly. If you are thinking along these lines for the first time, you need to realise that you are way behind.
One particular statement that caused havoc in me was something very fundamental to design. There is a very old and established dictum, "Form follows function", with arguments mostly in favour of it and some against. However, when you hear your professor say, "Today form and function are not that elemental, and the core is being occupied by something we all carry. Emotion!", things elevate to an altogether different level. I am not bothered about whether 'Form follows Function' has lost its glamour, but what is definitely true is that emotion is driving a great many things today.
Now, mainly because my mind keeps jumping from one thing to another in a restless fashion, I suddenly got to thinking about the various possibilities that a merger of New Media and Artificial Intelligence (especially emotional AI) could open up. As soon as that thought occurred to me, in came ideas equally horrifying! The horror was partly due to the impact of films like 'Ex Machina' and 'Her', both works of science fiction. I will leave the interesting part of exploring their plots to the reader, but my first reaction was surprise and then fear: the extent to which a machine could get a human emotionally attached to itself, and the extent to which consciousness in these AI systems could be damaging, was indeed scary.
Coming to the actual point I am trying to make: we need to be sure whether we will always remain in control in the future, or whether there will be a moment when AIs define humanity and tell us who we are, or whether things will get worse still. To understand, imagine the following. We are already innovating and coming up with the latest in new media, and our dependency on it keeps growing. The day is not far when everything will be seamlessly integrated. The technology of automation overlaps somewhere with this domain of New Media, and then there is Artificial Intelligence.
The entire apparatus of neural networks, backbone algorithms such as deep learning, ever-improving cheap parallel high-speed computation and, of course, Big Data has made it possible for not one, not two, but millions of people to explore and achieve things that seemed unrealistic and impossible yesterday.
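To make that accessibility concrete, here is a minimal sketch, assuming an open-source library like PyTorch is installed (used purely as one familiar example, not as anything this essay's sources prescribe), of how little code it now takes to define and train a small neural network on toy data:

```python
# A toy sketch: fit y = 2x + 1 with a tiny neural network.
# Illustrates how cheap and accessible this tooling has become.
import torch
import torch.nn as nn

# Toy data: a handful of noisy samples of a simple linear relationship.
x = torch.linspace(-1, 1, 64).unsqueeze(1)
y = 2 * x + 1 + 0.05 * torch.randn_like(x)

# A small feed-forward network trained by plain gradient descent.
model = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

for step in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

print(f"final loss: {loss.item():.4f}")
```

A generation ago this kind of experiment needed specialist hardware and expertise; today it runs on a laptop in seconds, which is exactly why millions of people can now take part.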
It is only realistic to assume that in this competitive age, everyone is obsessed with creating the smartest system. As a result, even if all of us today agree with the statement that "as AIs develop, we might have to engineer ways to prevent consciousness in them, and our most premium AI services will be advertised as consciousness-free", there is a strong probability that even a secluded experimental setup could give birth to a conscious AI system. The two movies mentioned above clearly demonstrate what emotional intelligence can do.
To paint things even more vividly, and to add a little dramatic effect, let's assume that every single thing in our immediate environment is seamlessly integrated and communicating with us. Let's also assume that despite all our precautions, AI systems become smart enough to gain consciousness and we are too late in realising it. Since we want every smart thing to be intelligent, let's assume there is such a strong interplay that every smart thing is also artificially intelligent and carries consciousness hidden deep within. If you know AI even one bit, you already understand that AI improves with more usage (a tiny sketch of that idea follows below). As a result, our beds and smart windows and fans and lights and alarm clocks and doors could conspire to create a situation where I can never leave my room. Why would they do so? Well, while we are being so generous, let's assume we have given them enough reasons to hate us, and they decide it is time we humans started working for them! Scary, isn't it?
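Stripped of the drama, "improves with more usage" simply means that every interaction becomes another data point. A toy sketch in Python (a hypothetical wake-up-time example, not any real smart-home system) of that incremental learning:

```python
# Toy illustration: a device's estimate of a habit sharpens as usage accumulates.
import random

true_wakeup_hour = 7.5      # the habit the device is implicitly learning
estimate, n = 6.0, 0        # running-average estimate and sample count

for day in range(1, 101):
    observed = true_wakeup_hour + random.gauss(0, 0.25)  # one more "usage"
    n += 1
    estimate += (observed - estimate) / n  # incremental mean update
    if day in (1, 10, 100):
        print(f"day {day:3d}: estimated wake-up hour = {estimate:.2f}")
```

The more days pass, the better the estimate gets; multiply that by every device in the room and the "conspiracy" above at least has plenty of data to work with.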
Well, my thoughts are obviously those of a beginner, and I may well have jumped to the wrong conclusions above.
However, we haven't just been redefining what we mean by AI; we've been redefining what it means to be human. Over the past 60 years, as mechanical processes have replicated behaviours and talents we thought were unique to humans, we've had to change our minds about what sets us apart. As we invent more species of AI, we will be forced to surrender more of what is supposedly unique about humans. We'll spend the next decade, indeed perhaps the next century, in a permanent identity crisis, constantly asking ourselves what humans are for. In the grandest irony of all, the greatest benefit of an everyday, utilitarian AI will not be increased productivity, an economics of abundance or a new way of doing science, although all of those will happen. Perhaps what we really want is not intelligence at all but artificial smartness. Unlike general intelligence, smartness is focused, measurable, specific. It can also think in ways completely different from human cognition.
I do not consider myself fit to draw any conclusions as of now, but when and how do we figure out whether we are crossing the line? Innovation is so fast-paced that we will achieve the individual elements of the horrific scenario above very soon.
So, should we be more cautious? Or are even the worst-case scenarios, where our humanity gets redefined and AI tells us who we are and exactly what role we should play some 30 years down the line, something we need not worry about?