
EMOTIONAL ARTIFICIAL INTELLIGENCE

Amruta Kadam,

The Indian Learning Magazine

Indian Society of Artificial Intelligence And Law

_________________________________________________________________________________

One of the defining characteristics of technology is its dynamic nature. One product of that dynamism is artificial intelligence, and this report discusses a subset of it called Emotion Artificial Intelligence (Emotion AI). True to its name, Emotion AI is still in its developing phase but holds enormous potential to make human life easier. Put simply, Emotion AI is a way to measure, understand and simulate human emotions. It is also known as ‘Affective Computing’ or ‘Human-Centric Artificial Intelligence’. The study takes us back to a detailed article titled “Affective Computing” published by MIT Media Lab professor Rosalind Picard. Expanding on the term emotion, she argues that emotions can no longer be considered a luxury when studying essential cognitive processes; instead, recent neurological evidence indicates they are necessary not only for human creativity and intelligence but also for rational human thinking and decision-making. She further suggests that if computers are ever to interact naturally and intelligently with humans, they need the ability to at least recognise and express affect. Affective computing is a new area of research, with recent results primarily in the recognition and synthesis of voice inflexion. Its study involves both practical and theoretical ways of relating to human emotions as closely as possible. The motive of such an approach is to increase the utility of AI in human life.

With the growing popularity of AI, technology has been developed that can track human facial expressions, even those lasting only a fraction of a second. Algorithms then identify patterns in these expressions, trace their causal factors and use them to infer behavioural traits. Companies that can analyse emotion with artificial intelligence are the new trend in the analytics business. Some of the prominent AI startups in this space are –

1. Entropik Technologies - Bengaluru, India

2. Emotibot - Taiwan

3. Affectiva - Boston, United States

4. Beyond Verbal - Israel

5. NVISO - Lausanne, Switzerland

6. Promobot + Neurodata Lab
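The pipeline described above (tracking fleeting expressions, turning them into patterns, and inferring traits) can be sketched in simplified form. The following toy Python example is purely illustrative: the feature names, thresholds and emotion labels are hypothetical assumptions for explanation, not any vendor's actual model.

```python
# Illustrative sketch only: a toy emotion-inference pipeline.
# All feature names and thresholds are hypothetical, not a real product's API.

from dataclasses import dataclass

@dataclass
class FrameFeatures:
    """Simplified facial cues extracted from a single video frame."""
    smile: float        # 0.0-1.0, intensity of lip-corner pull
    brow_raise: float   # 0.0-1.0, intensity of inner-brow raise
    brow_furrow: float  # 0.0-1.0, intensity of brow lowering

def classify_emotion(f: FrameFeatures) -> str:
    """Map feature intensities to a coarse emotion label."""
    if f.smile > 0.6:
        return "joy"
    if f.brow_furrow > 0.6:
        return "anger"
    if f.brow_raise > 0.6:
        return "surprise"
    return "neutral"

def dominant_emotion(frames: list[FrameFeatures]) -> str:
    """Aggregate per-frame labels into one overall reading, mirroring the
    idea of building patterns across many brief expressions rather than
    judging a single snapshot."""
    counts: dict[str, int] = {}
    for f in frames:
        label = classify_emotion(f)
        counts[label] = counts.get(label, 0) + 1
    return max(counts, key=counts.get)

# A short sequence of frames, e.g. a viewer reacting to an advertisement.
frames = [
    FrameFeatures(smile=0.8, brow_raise=0.1, brow_furrow=0.0),
    FrameFeatures(smile=0.7, brow_raise=0.2, brow_furrow=0.1),
    FrameFeatures(smile=0.1, brow_raise=0.7, brow_furrow=0.0),
]
print(dominant_emotion(frames))  # prints "joy"
```

Real systems replace the hand-written thresholds with trained machine-learning models over far richer features, but the overall shape (per-frame classification followed by aggregation into a trait) is the same.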

For many years the idea of combining emotions and robots seemed impossible, but today this uncommon combination is impressing various sectors of society and giving birth to a new era of technology. One instance where it has outdone existing approaches is therapy for people with autism spectrum disorder (ASD). In the words of Maja Pantić, Professor of Affective and Behavioural Computing at Imperial College London, the human face can display some 10,000 different facial expressions, of which around 7,000 are expressed on a daily basis by a developing human. Most people group those expressions into perhaps 15-20 categories, but autistic children find this generalisation difficult. Emotion AI can play a very important role in addressing such problems.

Besides autism spectrum disorder, other mental health conditions such as depression are also being addressed with the help of AI systems. Depression is a common illness worldwide, with more than 264 million people affected. It is different from usual mood fluctuations and short-lived emotional responses to the challenges of everyday life; especially when long-lasting and of moderate or severe intensity, depression may become a serious health condition. As per the World Health Organisation, approximately 800,000 people take their own life every year[1]. Therapists are exploring approaches that involve emotion AI systems, as these aid the process of determining patients' emotions from their facial expressions. Mental health care lacks diagnostic tools such as X-rays that make treatment more efficient, and it therefore depends heavily on qualitative data. This can result in therapy inefficiencies and make the treatment response hard to gauge. Hence one might assert that AI has great potential to bring quantifiable metrics to this space.

Meanwhile, India’s New Education Policy 2020, encouraging the use of AI in education, states: “New technologies involving artificial intelligence, machine learning, blockchains, smart boards, handheld computing devices, adaptive computer testing for student development, and other forms of educational software and hardware will not just change what students learn in the classroom but how they learn, and thus these areas and beyond will require extensive research both on the technological as well as educational fronts.” In this way, ed-tech gets a boost and lets students understand every concept in a more illustrative way. Students’ expressions are analysed by the system to curate the required knowledge for them and provide extra exposure to the world. The technology is also useful in talent hiring, analysing people’s preferences, personality-trait analysis in sports, event planning, dating[2] apps and much more.

As mentioned above, there are many other ways in which emotional artificial intelligence is helping humans make their lives easier and fulfil their needs more comfortably. The AI system keeps proving its benefits at diverse levels, but if left unregulated it is capable of causing hazards. In technical terms, unrestricted use could give rise to a form of ‘Emotiveillance’.

Media and tech giants and the advertising industry, for instance, have AI systems deeply embedded in them. Consumer behaviour analysed with the help of these systems helps million-dollar online industries understand their market. As buyers surfing various websites, we do not realise how constantly we are under the surveillance of these marketing agencies. When we negligently accept website ‘cookies’, we do not understand what we have agreed to, and that is where these agencies capitalise on our interests and lure us into buying things. The system is set up in such a way that it knows the consumer’s choices, priorities and needs intimately. This looks fascinating from a consumer’s perspective, but slight negligence in its operation, or on the buyer’s part, can cause a breach of privacy.

Besides, emotion AI setups around us also influence the market’s demand and supply cycle. For instance, cameras set up covertly in stores capture our facial expressions towards the commodities on display, and that can automatically manipulate a commodity’s price. Affectiva, a company founded by Rosalind Picard and Rana el Kaliouby, recently gained unicorn status using emotion AI for market analytics, and other companies are following the trend. It would be too harsh to brand these start-ups ‘unethical’; however, they are definitely profiting from our emotions. And the advertising industry is not known for its strong moral compass: it already manipulates consumer emotions, just not yet in a closed loop.
Emotionally intelligent ads, the mind readers, can be aimed at vulnerable buyers, be it through Amazon’s Alexa, Facebook or any other social site. The data involved, whether stored, analysed or transferred, is sensitive, and this raises an important question of privacy. In fact, during the Covid period we saw a range of events that drew our attention to serious lacunae in data privacy legislation. As far as India is concerned, we still lack legislation pertaining to data protection, thereby opening the gates to privacy breaches of individuals.

The lack of privacy-protection legislation on one side and the mandatory installation of applications like Aarogya Setu[3] on the other make it necessary for us to think about instances where the State, while using emotion artificial intelligence, might breach people’s privacy. Issues such as how far the State should be liable, and who should be held accountable for relief to compensate for the damage caused, seem to be untouched. Therefore, however fortunate one may feel about the growing comfort emotion AI brings, one cannot neglect the potential threats that come with it.

As of now there is no established statute protecting individuals’ data, except the Personal Data Protection Bill, 2019, which aims at the concept of data minimisation. The term privacy is a concomitant of the right of the individual to exercise control over his or her personality. It finds its origin in the notion that there are certain rights which are natural to or inherent in a human being. The Hon’ble Court has spelt out an individual’s Right to Privacy from Article 21[4]. This is in consonance with Article 12 of the Universal Declaration of Human Rights, Article 17 of the International Covenant on Civil and Political Rights, 1966, as well as the European Convention on Human Rights, which states: “No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour & reputation. Everyone has the right to the protection of the law against such interference or attacks”. The aforesaid provision seeks to create a legal regime and cast an obligation upon the State to protect physical, personal, health and sensitive data privacy. The ratio held in the Aadhaar case[5] lays down three requirements to justify the actions of the State in case of a breach of privacy.

These three requirements apply to all restraints on privacy (not just informational privacy) and are as follows:

· The first requirement, that there must be a law in existence to justify an encroachment on privacy, is an express requirement of Article 21[6].

· Second, the requirement of a need, in terms of a legitimate State aim, ensures that the nature and content of the law which imposes the restriction falls within the zone of reasonableness mandated by Article 14, which is a guarantee against arbitrary state action.

· The third requirement ensures that the means which are adopted by the legislature are proportional to the object and needs sought to be fulfilled by the law.

Therefore, if we aim to expand our Emotion Artificial Intelligence horizon, we need to bring in strict legislation that protects data and does not give authorities unchecked power to exploit captured data at their whims. And with the history of Cambridge Analytica and the spread of China’s social credit programme still fresh, it is evident that AI’s predictive abilities are the area where the law is fragile and demands positive change.

Data privacy statutes also need to be drafted in the light of a foreign concept, the ‘Right to be Forgotten’, which was recognised by the Karnataka High Court in Sri Vasunathan v. The Registrar General[7], where the court held: “This would be in line with the trend in western countries of the ‘right to be forgotten’ in sensitive cases involving women in general and highly sensitive cases involving rape or affecting the modesty and reputation of the person concerned”. Justice Sanjay Kishan Kaul, delivering his opinion on the right to be forgotten, stated, “The right of an individual to exercise control over his personal data and to be able to control his/her own life would also encompass his right to control his existence on the internet”. In 2017, in Justice K.S. Puttaswamy’s case[8], the right to be forgotten was unequivocally recognised by Justice Kaul: when a person’s data is no longer required, or a person expects that his/her personal data will no longer be stored or processed, he/she should be able to remove it from the system where the information is no longer necessary, relevant, correct or legitimate.

To conclude, we human beings are not just our physical appearances; we are also composed of our emotions. We hold the power to express them, and they therefore form an integral part of our lives. But Emotion AI subjects us all to the same set of algorithms and allows them to discriminate between us based on our emotional composure; one might, for instance, be held ineligible for a position because of a biased algorithm. Since a complete ban on such an invention would only make us lose in the global race of technology, we must instead ensure that being subjected to the same algorithms does not erase our differences. Emotion AI creates a nexus with our rainbow of expressions and should be used in a positive manner.

______________________________________________________

[1] World Health Organisation website.

[2] https://www.forbes.com/sites/bernardmarr/2017/09/28/the-ai-that-predicts-your-sexual-orientation-simply-by-looking-at-your-face/#b78f6d934565

[3] Aarogya Setu is a digital service, primarily a mobile application, developed by the Government of India and aimed at protecting citizens during COVID-19.

[4] District Registrar and Collector, Hyderabad and another v. Canara Bank and Anr., 1997 (4) ALT 118; People’s Union for Civil Liberties v. Union of India, AIR 1997 SC 568; Sharda v. Dharampal, A.I.R. 2003 SC 3450.

[5] Justice K.S. Puttaswamy v. Union of India, (2017) 10 SCC 1.

[6] Article 21: “No person shall be deprived of his life or personal liberty except according to procedure established by law.”

[7] Sri Vasunathan v. The Registrar General, 2017 SCC OnLine Kar 424.

[8] Justice K.S. Puttaswamy v. Union of India, (2017) 10 SCC 1.

The Indian Society of Artificial Intelligence and Law is a technology law think tank founded by Abhivardhan in 2018. Our mission as a non-profit industry body for the analytics & AI industry in India is to promote responsible development of artificial intelligence and its standardisation in India.


Since 2022, the research operations of the Society have been subsumed under VLiGTA® by Indic Pacific Legal Research.

ISAIL has supported two independent journals, namely - the Indic Journal of International Law and the Indian Journal of Artificial Intelligence and Law. It also supports an independent media and podcast initiative - The Bharat Pacific.
