Course_Presentation
‘online gaming intermediary’ means any intermediary that enables the users of
its computer resource to access one or more online games;
Intermediary Liability and Safe Harbour
Immunity
Intermediary Liability and Global Perspective
Models of Intermediary Liability
● Broad Immunity Model
● Strict Liability Model
● Safe-Harbour Model
  ○ Vertical Model
  ○ Horizontal Model
World Intermediary Liability Map
The WILMap documents laws around the world that govern Internet
intermediaries and shape users’ digital rights. It provides both basic and
advanced tools to search for and visualize how legislation, decisions and public
policies are evolving globally.
Intermediary Liability Regime in India
• India follows the safe-harbour model.
• India’s safe harbour regime under the IT Act has changed substantially since it
was first adopted in the year 2000. Evolution of the immunity under Section 79
and its cognate regulations in four stages:
● Stage 1: IT Act 2000; 2008 Amendment; 2011 IT Rules
● Stage 2: Shreya Singhal and its aftermath
● Stage 3: 2021 IT Rules; 2022 IT Rules Amendments
● Stage 4: Challenges to the 2021 IT Rules
Intermediary Liability Regime in India- Interpretation
of ‘actual knowledge’
• Section 79 of the IT Act: Intermediaries can claim immunity under Section 79 of the IT Act when
the function of an intermediary is limited to providing access to a communication system over
which information made available by third parties is transmitted or temporarily stored or hosted; or
the intermediary does not initiate the transmission, select the receiver of the transmission, and
select or modify the information contained in the transmission.
• Section 79(3): This immunity will not apply when the intermediary:
■ has conspired or abetted or aided or induced, whether by threats or promise or otherwise in
the commission of the unlawful act;
■ upon receiving actual knowledge, or on being notified by the appropriate Government or its
agency that any information, data or communication link residing in or connected to a
computer resource, controlled by the intermediary is being used to commit the unlawful act,
the intermediary fails to expeditiously remove or disable access to that material on that
resource without vitiating the evidence in any manner.
• Failure to comply with the due diligence obligations envisaged under 2021 IT Rules can also lead
to an intermediary losing its safe harbour immunity under Section 79.
Relevant Judgments regarding intermediary liability
• Actual Knowledge
■ Pre 2015: Intermediaries were deemed to have actual knowledge as soon as an affected
person had communicated their grievance to the intermediary.
■ Post 2015: In the landmark judgment of Shreya Singhal v Union Of India (2015) 5 SCC
1, the Court held that an intermediary is obligated to take down content only based on a
valid court order or direction from an appropriate government agency. Intermediaries were
no longer deemed to have actual knowledge only on the basis of user complaints.
■ In Amazon Seller Services Pvt. Ltd. v. Amway India Enterprises Pvt. Ltd. & Ors., 2020
SCC Online Del 454, the Delhi High Court explained that Section 79 of the IT Act provides
a “safe harbor” to intermediaries, under which plaintiffs “have to first show that there ha[s]
been a violation of any of their rights due to the Defendants’ activities before the
‘affirmative defence’ of Section 79 could be sought to be invoked.” (Paras 123-124).
Relying on the Shreya Singhal decision, the Delhi High Court further held that the
“obligation of an intermediary to remove content under Section 79(3)(b) of the IT Act arises
only if there is a Court order or a notification from a government agency on the grounds
mentioned under Article 19(2) of the Constitution.” (Para 128).
Relevant Judgments regarding intermediary liability
• Determination of whether an entity is an intermediary:
■ Courts have often ruled that the question of whether an entity is an intermediary under Section
2(1)(w) of the IT Act should be answered at trial (as opposed to an interim or preliminary stage)
(Google India Pvt. Ltd. v. Vishakha Industries 2020 (4) SCC 163, Para 153). For example,
where a party alleged that Amazon was acting beyond the scope of an intermediary through its
active involvement in selling products, the Delhi High Court held that “[g]iven the disputed
questions of facts that emerge from the pleadings in the suit, it is obvious that the issue of
whether an entity is an intermediary or not can be decided only after a trial” (Amazon Seller
Services Pvt. Ltd., Para 141)
■ In Myspace Inc. v. Super Cassettes Industries Ltd 2016 SCC Online Del 6382, the Delhi High Court held
that a ‘share’ button on a platform does not amount to the intermediary initiating a transmission or
selecting a receiver, as the decision to click the button rests with the user.
■ The Court also found that an automated editorial system which inserted advertisements into
infringing content did not amount to modifying the "content" of the transmission, prima facie
satisfying the threshold of Section 79(2)(b) (Para 64).
■ In Amazon Seller Services Pvt. Ltd. the Delhi High Court held that in the case of e-commerce
platforms, it was the customer who initiated the transmission and the e-commerce platforms did
not modify the information contained in the transmission (e.g. choice of product and number of
units) when they transmitted this information to sellers (Para 139-144).
Relevant Judgments regarding intermediary liability
• Intermediary cannot be tasked with judging which material on their platforms is lawful
or unlawful.
■ In Shreya Singhal, the Supreme Court expressly declined to impose a proactive
monitoring obligation on intermediaries, concluding that intermediaries cannot be tasked
with judging which material on their platforms is lawful or unlawful.
■ Instead, the Supreme Court clarified that intermediaries only have a duty to remove or
disable access to unlawful content upon receiving “actual knowledge” through a court
order sufficiently identifying the content that is unlawful.
■ “Section 79(3)(b) has to be read down to mean that the intermediary upon receiving
actual knowledge that a court order has been passed asking it to expeditiously remove or
disable access to certain material must then fail to expeditiously remove or disable access
to that material. This is for the reason that otherwise it would be very difficult for
intermediaries like Google, Facebook etc. to act when millions of requests are made and
the intermediary is then to judge as to which of such requests are legitimate and which are
not.” (Para 122, Para 124.3).
■ The Supreme Court reiterated this reasoning in Google India: “Shreya Singhal makes it clear
that an intermediary’s liability will not arise unless it failed to take down material upon there
being actual knowledge by court order or government communication. This safeguard has
been put in place to avoid the chilling effect on free speech. The intermediaries would, if a
contrary view is taken, stand elevated to the status of super censors and denude the internet
of its unique feature of a democratic medium for all to publish, access and read any and all
kinds of information.” (Para 54).
■ The Delhi High Court in Kent RO Systems Ltd. v. Amit Kotak, (2017) 240 DLT 3 held that the
IT Act does not require an intermediary to “screen all information being hosted on its portal for
infringement of the rights of all those persons who have at any point of time complained to the
intermediary.” (Para 31).
■ The judgment in Kent RO Systems relied on Myspace Inc., which held that an intermediary
can only be denied immunity if it receives “actual knowledge,” and that “general awareness” is
not sufficient. (Paras 38 and 77(b)). The Delhi High Court found that: “In case of Internet
Intermediaries, interim relief has to be specific and must point to the actual content, which is
being infringed.” (Para 77(b)). The Court also recognised that tasking an intermediary with
determining which content infringes and which does not infringe would have a chilling effect on
the fundamental right to free speech (Para 71).
Regulating
Emerging
Technologies
Augmented Reality (AR)
Augmented reality is an enhanced, interactive version of a real-world
environment achieved through digital visual elements, sounds, and other
sensory stimuli via holographic technology. AR incorporates three features: a
combination of digital and physical worlds, interactions made in real time, and
accurate 3D identification of virtual and real objects.
Examples:
● Apple Vision Pro
● Snapchat AR filters
● Pokemon Go
Virtual Reality (VR)
Virtual reality is fully immersive: it tricks your senses into thinking you’re in a different
environment or world, apart from the real one. Using a head-mounted display
(HMD) or headset, you’ll experience a computer-generated world of imagery and
sounds in which you can manipulate objects and move around using haptic
controllers while tethered to a console or PC.
Examples:
● Metaverse
Offences in AR/VR/MR
● Assault
● Theft
● Privacy violation
● Real-world injuries
● Identity thefts
● Financial crimes
● Others (such as IPR violations, hacking)
Elements of MUIEs that can impact content policies
Web 3.0
A new version of the web, built
on blockchains, that would (in
theory) be de-centralised,
democratic, and peer-to-peer.
Cryptocurrencies, NFTs, and
DAOs (Decentralised
autonomous organisation, a
headless corporation where
decisions are voted on by
members, and executed by
encoded rules on blockchain) are
all part of Web3 and enable a
read/write/own internet.
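The tamper-evidence that makes blockchains attractive for censorship-resistant publishing can be shown with a minimal hash chain. This is an illustrative sketch only (it is not any real blockchain protocol, and all names are hypothetical): each block's hash commits to its own data and to the previous block's hash, so editing any recorded entry breaks verification.

```python
import hashlib
import json

def make_block(data, prev_hash):
    """Create a block whose hash commits to its data and its predecessor."""
    body = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
    return {"data": data, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def verify_chain(chain):
    """The chain is valid only if every block's hash still matches its
    contents and every block points at its predecessor's hash."""
    for i, block in enumerate(chain):
        body = json.dumps({"data": block["data"], "prev": block["prev"]},
                          sort_keys=True)
        if hashlib.sha256(body.encode()).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

# Build a three-block chain, then tamper with the middle block.
genesis = make_block("post: hello", "0" * 64)
b1 = make_block("post: free speech", genesis["hash"])
b2 = make_block("post: third entry", b1["hash"])
chain = [genesis, b1, b2]
assert verify_chain(chain)

chain[1]["data"] = "post: censored"   # any retroactive edit is detectable
assert not verify_chain(chain)
```

Note that tamper-evidence alone does not answer the slide's question: whether detectability of censorship amounts to enabling free speech is a legal and policy question, not a cryptographic one.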
Can blockchain enable free speech?
Key Questions:
● Does current Indian law sufficiently cover these emerging
technologies?
● How will liability be ascertained in a decentralised web?
● What are the gaps in law?
Module II
Privacy in Machine-to-Machine
Communication
Indian Privacy Law: Current
and Prospective
K.S. Puttaswamy v. Union of India (2017) 10 SCC 1
para 248
“248. Privacy has distinct connotations including (i) spatial control; (ii)
decisional autonomy; and (iii) informational control. [ Bhairav Acharya, “The
Four Parts of Privacy in India”, Economic & Political Weekly (2015), Vol. 50
Issue 22, at p. 32.] Spatial control denotes the creation of private spaces.
Decisional autonomy comprehends intimate personal choices such as those
governing reproduction as well as choices expressed in public such as faith or
modes of dress. Informational control empowers the individual to use privacy as
a shield to retain personal control over information pertaining to the person.”
“250. The above diagrammatical representation presents two primary axes : a horizontal axis consisting
of four zones of privacy and a vertical axis which emphasises two aspects of freedom : the freedom to
be let alone and the freedom for self-development. The nine primary types of privacy are, according to
the above depiction:
(i) bodily privacy which reflects the privacy of the physical body. Implicit in this is the negative freedom
of being able to prevent others from violating one's body or from restraining the freedom of bodily
movement;
(ii) spatial privacy which is reflected in the privacy of a private space through which access of others
can be restricted to the space; intimate relations and family life are an apt illustration of spatial privacy;
(iii) communicational privacy which is reflected in enabling an individual to restrict access to
communications or control the use of information which is communicated to third parties;
(iv) proprietary privacy which is reflected by the interest of a person in utilising property as a means to
shield facts, things or information from others;
(v) intellectual privacy which is reflected as an individual interest in the privacy of thought and mind
and the development of opinions and beliefs;
(vi) decisional privacy reflected by an ability to make intimate decisions primarily consisting one's
sexual or procreative nature and decisions in respect of intimate relations;”
“(vii) associational privacy which is reflected in the ability of the individual to choose who she wishes
to interact with;
(viii) behavioural privacy which recognises the privacy interests of a person even while conducting
publicly visible activities. Behavioural privacy postulates that even when access is granted to others, the
individual is entitled to control the extent of access and preserve to herself a measure of freedom from
unwanted intrusion; and
(ix) informational privacy which reflects an interest in preventing information about the self from being
disseminated and controlling the extent of access to information.”
Privacy Regulation: Global Perspective
United States of America:
● United States privacy law is a complex patchwork of national, state and local privacy laws and
regulations. There is no comprehensive national privacy law in the United States.
● However, the US does have a number of largely sector-specific privacy and data security laws at
the federal level, as well as many more privacy laws at the state (and local) level.
● In recent years, beginning with California, states have begun to introduce their own
comprehensive privacy laws, and other states are expected to follow and enact their own
comprehensive state privacy laws.
○ California alone has more than 25 state privacy and data security laws, including the
California Consumer Privacy Act (CCPA) and its regulations as recently amended by the
California Privacy Rights Act (CPRA), collectively referred to as the CCPA.
○ Beyond California, Connecticut, Colorado, Utah, and Virginia have enacted new
comprehensive state privacy laws that take effect in 2023.
● Consumer protection laws, which prohibit unfair and deceptive business practices, provide
another avenue for enforcement against businesses for their privacy and security practices.
General Data Protection Regulation
● The European Union General Data Protection Regulation (“GDPR”), effective from May 25 2018, is
the data protection law that regulates how organizations protect the personal data of individuals
residing in the European Union (“EU”). The GDPR covers all EU member states and additionally
covers the European Economic Area countries i.e. Iceland, Liechtenstein and Norway.
● The GDPR applies to: (i) a controller or processor established in the EU; or (ii) a controller or
processor not in the EU where the processing activities relate to the offering of goods/services (paid
or for free) to data subjects in the EU or the monitoring of the behavior of individuals in the EU.
● Article 5 of the GDPR sets out its seven key principles which include: (i) lawfulness, fairness and
transparency; (ii) purpose limitation; (iii) data minimisation; (iv) accuracy; (v) storage limitation; (vi)
integrity and confidentiality; and (vii) accountability.
Information Technology (Reasonable security practices and procedures and sensitive
personal data or information) Rules, 2011
• Rule 3 defines sensitive personal data or information (“SPDI”) as “personal information relating
to passwords, financial information such as bank account or credit card or debit card or other
payment instrument detail, physical, physiological and mental health condition, sexual
orientation, medical records and history, and biometric information.”
• Rule 4 provides that the body corporate shall provide a privacy policy for handling of or dealing
in personal information including SPDI, and ensure that the policy is available for view by such
providers of information who have provided such information under lawful contract. Such policy
shall be published on the website of the body corporate or any person on its behalf and shall
provide for—
○ clear and easily accessible statements of its practices and policies;
○ type of personal or sensitive personal data or information collected under Rule 3;
○ purpose of collection and usage of such information;
○ disclosure of information including sensitive personal data or information as provided in
Rule 6;
○ reasonable security practices and procedures as provided under Rule 8.
• Other obligations of body corporate:
○ Consent: Obtain written consent prior to the collection of SPDI (Rule 5(1)).
Users should also have an option to not provide the data or information sought, and
withdraw consent. Upon withdrawal of consent, the body corporate shall have the
option not to provide goods or services for which the said information was sought (Rule
5(7)).
○ Purpose of data collection: Collection of SPDI must only be for a lawful purpose, and
collection of SPDI is considered necessary for that purpose (Rule 5(2)).
Information collected shall be used for the purpose for which it has been collected
(Rule 5(5)).
○ Retention: SPDI can be retained for only as long as necessary for the lawful purpose (Rule
5(4)).
○ Review and rectify: Provide users the right to review and correct any personal information or
SPDI (Rule 5(6)).
The Draft Digital Personal Data Protection Bill, 2022
● Under the DPDP Bill, ‘personal data’ refers to data about an individual who is identifiable by or in
relation to such data. Unlike previous iterations of draft data protection frameworks, the DPDP Bill
does not create tiered categorisations of personal data (such as sensitive or critical personal
data).
● ‘Automated’ refers to any digital process capable of operating automatically in response to
instructions given or otherwise for the purpose of processing data.
● A ‘data fiduciary’ refers to any person who alone or in conjunction with other persons determines
the purpose and means of processing of personal data.
● A ‘data processor’ refers to any person who processes personal data on behalf of a data
fiduciary.
● A ‘data principal’ refers to the individual to whom the personal data relates and where such
individual is a child includes the parents or lawful guardian of such a child.
● Grounds for processing digital personal data - Personal data may be processed only in
accordance with the provisions of the DPDP Bill and rules, for a lawful purpose for which the data
principal has given or is deemed to have given consent.
● Notice - Data fiduciaries under the DPDP Bill must provide “an itemised notice in clear and plain
language containing a description of personal data sought to be collected” and “the purpose of
processing of such personal data” on or before requesting a data principal’s consent.
○ Data fiduciaries will also be required to give itemised notice to data principals, as soon as
reasonably practicable, in respect of consent obtained before the commencement of the Bill.
○ All forms of notice are to be made available in English or languages specified in the Eighth
Schedule to the Constitution of India.
● Consent - Consent of the data principal refers to any “freely given, specific, informed and
unambiguous indication of the data principal’s wishes”, by clear affirmative action, signifying
agreement to the processing of their personal data for the specified purpose.
○ The data principal shall have the right to withdraw their consent at any time. If consent is
withdrawn, the data fiduciary shall within a reasonable time cease and cause its data processors
to cease processing of personal data.
○ The performance of a contract between a data fiduciary and a data principal shall not be made
conditional on the consent to processing of personal data not necessary for that purpose.
DPDP v. GDPR - How Do They Compare

● Concerned Parties
○ DPDP Bill: uses the term "Data Principal" for the natural person whose data is being
processed, and "Data Fiduciary" for the entity that determines the purposes and means of
processing.
○ GDPR: uses the term "Data Subject" for the natural person whose data is being processed;
the "Data Controller" is the entity that collects the data of the data subjects and decides the
purposes and means of processing personal data.

● Legal Justification for Processing Personal Data
○ DPDP Bill: consent remains the primary basis for data processing, with certain exceptions.
○ GDPR: six legal bases for processing personal data: consent, performance of a contract,
legitimate interest, vital interest, legal requirement, and public interest.

● Protection of Children’s Rights & Age of Majority
○ DPDP Bill: data fiduciaries must confirm the age of a child (someone under 18 years old)
and obtain permission from a parent or guardian before processing any personal information
relating to the child. The primary responsibility when processing personal data is to ensure
that the rights of children are safeguarded and that decisions made are in the child's best
interests.
○ GDPR: additional requirements apply when obtaining consent from children who are either
under the age of 16 or the age specified by their respective EU member state. When
information society services are offered directly to a child, processing their personal data is
only lawful if the child is at least 16 years old; below this age, processing is lawful only if the
parent/guardian has given consent or authorised it.
Regulating
Emerging
Technologies
M2M Communication
M2M communications refer to automated applications which involve machines or devices
communicating through a network without human intervention. Sensors and
communication modules are embedded within M2M devices, enabling data to be transmitted
from one device to another device through wired and wireless communications networks.
Common examples of machine-to-machine technology are controlling electrical devices like fans
and bulbs using a smartphone’s Bluetooth. Here, the smartphone and the electrical devices are
the two interacting devices. Another example is the smart meter, which can track electric
consumption in real-time. M2M technology is widely used in applications such as tracking and
tracing, automation, metering, healthcare, etc.
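The smart-meter example above can be sketched in a few lines of code. This is a minimal illustration, not a real metering protocol: the meter side emits consumption readings with no human intervention, and an application side turns the raw readings into meaningful information. All function names and the usage threshold are hypothetical.

```python
import random
import statistics

def read_meter():
    """Hypothetical smart meter: one consumption reading (kWh) per interval,
    produced automatically with no human intervention."""
    return round(random.uniform(0.2, 1.5), 3)

def summarise(readings, threshold_kwh=20.0):
    """Hypothetical application side: translate raw readings into meaningful
    information (a daily total, an average, and a high-usage flag)."""
    total = sum(readings)
    return {"total_kwh": round(total, 3),
            "mean_kwh": round(statistics.mean(readings), 3),
            "high_usage": total > threshold_kwh}

readings = [read_meter() for _ in range(24)]   # one reading per hour
report = summarise(readings)
```

In a real deployment the readings would travel over a wired or wireless network to the utility's application rather than living in one process; the point here is only the division of roles between device and application.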
TRAI’s Consultation Paper on Embedded SIM for M2M
Communications
● The M2M Ecosystem broadly consists of the following entities:
○ Device Manufacturer/Provider: The device provider is responsible for devices providing raw data to
the network provider and application provider according to the business model. This category will
encompass the M2M chip-set manufacturer, the M2M module manufacturer and the end device
manufacturer (e.g., a Car or an Air Conditioning manufacturer who integrates the M2M module in his
device).
○ Connectivity/ Network Provider: The network provider/ operators are the connectivity providers
who own the underlying network to provide connectivity and related services for M2M Service
providers
○ M2M Service Provider (MSP): M2M SP provides M2M services to third parties using telecom
resources. DoT has issued guidelines on 8th February 2022 for the ‘Registration process of M2M
Service Providers (M2MSP) & WPAN/WLAN Connectivity Providers for M2M Services’.
○ M2M Application Provider: It is an entity that realizes the service logic of an M2M Application and
utilizes capabilities/resources provided by the network provider, device provider and M2M service
provider, to provide M2M applications to end users.
○ End user: Individual or company who uses an M2M solution
M2M Applications and Examples
M2M Communication Technologies:
● M2M communications refers to the technologies that allow wired/wireless systems to
communicate with devices of the same ability. M2M uses a device (sensor, meter etc.) to capture
an event (motion, meter-reading, temperature etc.) which is relayed through a network (wireless,
wired or hybrid) to an application (software program), that translates the captured event into
meaningful information.
● Based on communication networks, the communication technologies used for M2M
Communications can broadly be classified into:
○ Fixed & Short-Range Technologies (RFID, Bluetooth, Zigbee & WiFi)
○ Long Range Technologies:
● Non-3GPP Standards (LPWAN): LoRaWAN, Sigfox etc.
● 3GPP Standards: LTE-M, NB-IoT, 5G.
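The capture → relay → translate chain described above can be made concrete with a small sketch. This is purely illustrative (the "network" is modelled as an in-process queue, and all names and thresholds are hypothetical): a device captures raw events, relays them over the network, and an application translates each event into meaningful information.

```python
from queue import Queue

# The "network" between device and application, modelled as a FIFO queue.
network = Queue()

def capture(sensor_id, kind, value):
    """Device side: package a raw event and relay it onto the network."""
    network.put({"sensor": sensor_id, "kind": kind, "value": value})

def translate(event):
    """Application side: turn a raw captured event into meaningful information."""
    if event["kind"] == "temperature" and event["value"] > 40:
        return f"{event['sensor']}: OVERHEAT at {event['value']}°C"
    if event["kind"] == "motion" and event["value"]:
        return f"{event['sensor']}: motion detected"
    return f"{event['sensor']}: normal"

capture("meter-1", "temperature", 45)
capture("door-7", "motion", True)
capture("meter-2", "temperature", 22)

alerts = [translate(network.get()) for _ in range(3)]
# alerts == ["meter-1: OVERHEAT at 45°C", "door-7: motion detected", "meter-2: normal"]
```

Swapping the queue for Bluetooth, Zigbee, LoRaWAN or NB-IoT changes the transport, not the shape of the pipeline.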
Internet of Things (IoT)
IoT or the Internet of Things is a concept where network infrastructure such as devices,
applications, sensors, and actuators are connected via the internet. IoT is the
interconnection of distinctively identifiable embedded computing machines within the existing
internet infrastructure. This interconnection allows them to exchange data with each other and
with other devices. This enables them to be remotely monitored and controlled.
IoT is the successor of M2M technology. In other words, M2M serves as the foundation for IoT.
IoT takes the basic concepts of M2M and expands them by creating large cloud networks of
devices that communicate with one another on cloud networking platforms. The cloud architecture
provides infrastructure, software, and platform for all IoT devices. It allows users to create fast,
flexible, and high-performance networks that connect a wide range of devices.
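The cloud-platform role described above is essentially publish/subscribe fan-out: many devices publish readings to the cloud, and other devices and applications subscribe to them. A toy sketch, not any real IoT platform API (the class and topic names are hypothetical):

```python
from collections import defaultdict

class CloudHub:
    """Hypothetical cloud hub: routes each published reading to every
    subscriber of that topic, so devices never talk to each other directly."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, payload):
        for callback in self.subscribers[topic]:
            callback(payload)

hub = CloudHub()
log = []
# A thermostat and a monitoring app both subscribe to the same sensor topic.
hub.subscribe("home/temperature", lambda t: log.append(("thermostat", t)))
hub.subscribe("home/temperature", lambda t: log.append(("app", t)))

# One sensor reading fans out to every subscriber via the cloud.
hub.publish("home/temperature", 23.5)
```

Real IoT platforms add authentication, persistence and scale on top, but this decoupling of publishers from subscribers is what lets a "wide range of devices" interconnect without point-to-point links.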
Internet of Everything (IOE)
IoE expands on the concept of the “Internet of Things” in that it connects not just
physical devices but quite literally everything by getting them all on the network. It moves
beyond being a major buzzword and technology trend by connecting devices to one another and
the Internet, and offers higher computing power.
Examples: Wearable devices - Different wearable devices such as fitness bands, smart watches,
smart clothing, shoes, etc., can offer IoE benefits to people using their products. For example, in
2019, self-lacing shoes were introduced by Nike. These shoes had sensors that could sense the
pressure exerted by the wearer’s foot in real time and loosen or tighten the laces on their own,
based on the detected pressure.
What data do users share?
● Mobile phones
● Applications and services
○ Social media, e-commerce, financial services
● Smart devices
○ Amazon Alexa, Google Home, smart home devices, smart cars
● Data collected by the government
○ Social security services, medical records, financial records
Module II
Surveillance Technologies
and Indian Law
Committee of Experts under the
Chairmanship of Justice B.N. Srikrishna
“Surveillance should not be carried out without a degree of transparency that can pass the
muster of the Puttaswamy test of necessity, proportionality and due process. This can take
various forms, including information provided to the public, legislative oversight, executive and
administrative oversight and judicial oversight. This would ensure scrutiny over the working of such
agencies and infuse public accountability.” (Page 125)
“The surveillance architecture should also embed systematic risk management techniques
within itself. This would lead to the prioritisation and narrowing of its activities, by devoting
resources to credible risks, whether reputational or organisational. For example, an assessment
of whether a particular measure is the least intrusive measure to achieve a stated aim may be required.
Not only will this reduce costs incurred by the State, it will also be consistent with civil rights protection.”
(Page 128-129)
K.S. Puttaswamy v. Union of India, (2019) 1 SCC 1
(Puttaswamy II)
“186. After going through the Aadhaar structure, as demonstrated by the respondents in the powerpoint
presentation from the provisions of the Aadhaar Act and the machinery which the Authority has created
for data protection, we are of the view that it is very difficult to create profile of a person simply on the
basis of biometric and demographic information stored in CIDR….Therefore, the Act has
endeavoured to provide safeguards [ We may also take on record responsible statements of the
learned Attorney General and Mr Dwivedi who appeared for Uidai that no State would be interested in
any mass surveillance of 1.2 billion people of the country or even the overwhelming majority of officers
and employees or professionals. The very idea of mass surveillance by State which pursues what an
Aadhaar number-holder (ANH) does all the time and based on Aadhaar is an absurdity and an
impossibility. According to them, the petitioners' submission is based on too many imaginary
possibilities viz.:(i) Aadhaar makes it possible for the State to obtain identity information of all ANH. It is
possible that Uidai would share identity information/authentication records in CIDR notwithstanding
statutory prohibition and punitive injunctions in the Act. It is possible that the State would unleash its
investigators to surveil a sizeable section of the ANH, if not all based on the authentication
records. It is submitted that given the architecture of the Aadhaar Act, there are no such
possibilities and in any event, submission based on imaginary possibility do not provide any
basis for questioning the validity of the Aadhaar Act.”
In summary:
(a) The right to privacy is recognised by the Nine Judge Bench as an inherent
fundamental right having protection as an intrinsic part of the right to life and
personal liberty under Article 21 and as a part of the freedom guaranteed by Part
III of the Constitution, which is subject to specified restrictions;
(b) Any infringement of the right to privacy by State Authorities will have to meet
the following four tests based on the “Principle of proportionality and legitimacy”:
1. The action must be sanctioned by law;
2. The proposed action must be necessary in a democratic society for a
legitimate aim;
3. The extent of such interference must be proportionate to the need for
such interference;
4. There must be procedural guarantees against abuse of such
interference.
Other relevant judgments:
● People’s Union for Civil Liberties v. Union of India: The Supreme Court in this decision held that
phone-tapping without appropriate safeguards, and without following legal process, was a violation of
individuals’ fundamental right to privacy. The Court cited international instruments as well as Indian
and international jurisprudence to affirm the right to privacy and noted that it could not be violated
except by a procedure established by law. They laid down specific situations in which phone tapping
could be conducted, but noted that procedural safeguards for the fair and reasonable exercise of
substantive power were missing. Accordingly, the Court did not strike down Section 5(2) of the
Telegraph Act, but laid down detailed guidelines for the exercise of surveillance powers by the
executive in order to put a check on the misuse of these powers and to safeguard the right to privacy.
● Indian Hotel & Restaurant Assn. (AHAR) v. State of Maharashtra: Dealt with the constitutional
validity of the impugned legislation introduced by the Maharashtra Government with the intent to curb
obscene behaviour and prostitution crimes at bars, hotels, and other such public amusement
establishments in the State. The Supreme Court allowed the curfew to stay and directed that cameras
be installed at the entrance of the bars, but not inside.
● Central Public Information Officer, Supreme Court of India vs. Subhash Chandra Agarwal: The
Court conducted a proportionality test, balancing the right to privacy against the public interest in
disclosure, to find that the requested information regarding the functioning of the Supreme Court and
judicial assets should be released in the name of transparency and accountability, but that information
related to third-parties needed to be re-examined.
Legal provisions:
The (proposed) European AI Act defines AI as: “‘artificial intelligence system’ (AI system) means
software that is developed with one or more of the techniques and approaches listed in Annex I and
can, for a given set of human-defined objectives, generate outputs such as content, predictions,
recommendations, or decisions influencing the environments they interact with”
Facial Recognition Technology
○ "Biometrics" means the technologies that measure and analyse human body characteristics, such
as 'fingerprints', 'eye retinas and irises', 'voice patterns', 'facial patterns', 'hand measurements' and
'DNA' for authentication purposes. (Rule 2(b), Privacy Rules 2011)
○ “biometric information” means photograph, finger print, Iris scan, or such other biological
attributes of an individual as may be specified by regulations. (Section 2(g), Aadhaar Act, 2016)
○ "biometric data" means facial images, fingerprints, iris scans, or any other similar personal data
resulting from measurements or technical processing operations carried out on physical,
physiological, or behavioural characteristics of a data principal, which allow or confirm the unique
identification of that natural person. (PDP, 2019)
Niti Aayog’s “Responsible AI for All: Adopting the Framework – A
use case approach on Facial Recognition Technology”
● The Niti Aayog has published the third paper in its Responsible AI series which focuses on
AI-based facial recognition technology. The paper puts forth recommendations for applications of
facial recognition technology within India and contains a case study of the Ministry of Civil
Aviation's DigiYatra Programme.
● The first section underlines the need for “Responsible AI”, as the increasing use of AI and
algorithmic functions in both the public and the private sectors necessitates a discussion on the
ethical risks emanating from these use cases. The second section then provides insight into how
FRT operates and states that FRT “primarily seeks to accomplish three functions: facial detection,
feature extraction, and facial recognition”.
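The three functions named by the paper can be sketched as a simple pipeline. The sketch below is purely illustrative, not the paper's or any real system's implementation: the face detector and feature extractor are hypothetical stubs, and the "gallery" is a toy database of known embeddings.

```python
import math

# Hypothetical stand-ins for the three FRT stages described in the paper.

def detect_faces(image):
    """Facial detection: locate face regions in an image (stubbed here)."""
    return image.get("faces", [])

def extract_features(face):
    """Feature extraction: reduce a face to a numeric embedding (stubbed)."""
    return face["embedding"]

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def recognise(face, gallery, threshold=0.9):
    """Facial recognition: compare the embedding against a known gallery."""
    probe = extract_features(face)
    best_id, best_score = None, 0.0
    for identity, reference in gallery.items():
        score = cosine_similarity(probe, reference)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id if best_score >= threshold else None

gallery = {"person_a": [0.9, 0.1, 0.2], "person_b": [0.1, 0.95, 0.3]}
image = {"faces": [{"embedding": [0.88, 0.12, 0.21]}]}
for face in detect_faces(image):
    print(recognise(face, gallery))  # matches "person_a"
```

The threshold step is where many of the accuracy risks the paper later identifies arise: set it too low and the system misidentifies strangers; set it too high and it fails to match genuine enrollees.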
● It identifies two formats in which FRT is used, a) verification of identity and b) identification.
Recognising the growing use of FRT throughout the world, especially in India, the paper lists the
generation of vast amounts of facial images and video data in general, advancements in image
recognition technology, and the ubiquitous presence of closed-circuit television (CCTV) cameras
as some of the probable causes for this uptick.
● The paper then goes on to split FRT use into two categories: non-security and security.
○ Examples of non-security use include verification and authentication of the identity of an individual,
providing greater ease of access to certain services (contactless onboarding at airports), or to ease
usability (unlock smartphone).
○ Examples of security use include general law and order considerations like investigation,
identification of missing persons, monitoring of crowds, and screening of public spaces for finding
violations of masking protocols given the COVID-19 pandemic. Within this category, there are
further sub-categories:
■ Automated FRT (identification of persons involved in offences by matching against witness
sketches or an existing set of suspects)
■ Live FRT (monitoring for crowd control).
● The paper identifies risks associated with the use of FRT systems and categorises them into
design-based risks and rights-based risks:
○ Design-based risks: inaccuracy due to technical factors, inaccuracy due to bias, inaccuracy due
to lack of training of human operators, glitches or perturbations, data breaches and unauthorised
access, accountability, legal liability and grievance redressal, and the opaque nature of FRT systems.
○ Rights-based risks: informational autonomy, impact on non-participants in the deployment of FRT
systems, legal thresholds applicable to FRT systems, and anonymity as a facet of privacy.
● The second part of the report gives an overview of the DigiYatra scheme, which is a biometric
boarding system involving the authentication and creation of a digital identity of a passenger, and
the subsequent verification of this identity at different checkpoints in an airport. It states that the
scheme is “purely voluntary” and contains alternatives at all stages.
● Further, it lists the potential benefits of the scheme, such as lower congestion at airports; a
seamless, paperless, and contactless passenger experience; lower operational costs; and
enhanced civil aviation capabilities.
● Lastly, it discusses certain legal aspects of the scheme, including how the data privacy of
users will be ensured, how Aadhaar-based authentication will be carried out under the scheme,
and how information security will be maintained.
● Recommendations and (civil society’s) concerns:
○ Establishing a data protection regime: At present, India does not have a comprehensive
data protection law in force, and the DPDP allows blanket exemptions for selected
government agencies.
○ Setting rigorous standards for data processing: What are these standards and who
will set them? Practical issues with enforcement and training. Purpose creep /
surveillance creep
○ Omission of fundamental technical aspects of FRT and Implementation issues with
the DigiYatra Scheme
YOTI’s Age Estimation Technology
● According to Yoti, its facial age estimation technology estimates a person’s age based on a selfie
without sharing their name or ID document. All images are instantly deleted once someone receives
their estimated age; nothing is ever viewed by a human. It cannot link a name to a face or identify
anyone. This is the difference between facial analysis and facial recognition.
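The analysis/recognition distinction can be made concrete with a toy contrast. This is a hypothetical sketch, not Yoti's system: the "age model" is a made-up linear function, and the identity database is a stand-in. The key point is structural: analysis derives an attribute from the face alone, while recognition requires a database of known identities.

```python
# Hypothetical contrast between facial ANALYSIS (estimating an attribute
# such as age) and facial RECOGNITION (matching a face to an identity).

def analyse_age(embedding):
    """Facial analysis: derive an attribute (age) from a face embedding.
    No identity database is consulted; nothing links the face to a person."""
    weights = [30.0, 10.0, 5.0]  # stand-in model, not a real estimator
    return round(sum(w * x for w, x in zip(weights, embedding)))

def recognise_identity(embedding, identity_db):
    """Facial recognition: look the face up in a database of known people.
    The database is exactly what pure age estimation does not need."""
    return max(identity_db, key=lambda name: sum(
        a * b for a, b in zip(embedding, identity_db[name])))

face = [0.8, 0.4, 0.2]
print(analyse_age(face))  # an age estimate; no identity is involved
```

This is why the ICO could conclude that the same biometric input may or may not be "special category data": the analysis path never processes the face for the purpose of uniquely identifying anyone.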
● The technology uses facial analysis to estimate a person’s age without identifying or recognising
any individual. This has been acknowledged by multiple regulators and has prompted the ICO to
update their definition of biometrics and agree that Yoti’s age estimation tool will not result in the
processing of special category data. “Having reconsidered our guidance in the context of our
engagement with Yoti, we have concluded that the above guidance needs to be updated. This is
because Yoti’s age estimation tool has demonstrated that it is, in some contexts, possible to use
biometrics to make a decision about an individual or treat them differently without using that
biometric data for the purpose of uniquely identifying that person.”
○ Special category data is defined by Article 9(1) UK GDPR: “Processing of personal data
revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade
union membership, and the processing of genetic data, biometric data for the purpose of
uniquely identifying a natural person, data concerning health or data concerning a
natural person’s sex life or sexual orientation shall be prohibited”.
Key Questions:
● What is the legality of surveillance technologies?
● What recourse is available to users / citizens against technologies
implemented by the government and / or private companies?
● Are there different standards of protection for different categories of data?
Module III
Disruptive Technologies and
Indian Law
Case Study: Satish N vs State Of Karnataka
● In 2016, the Karnataka High Court (Satish N. and Ors vs State of Karnataka, 2016 SCC
OnLine Kar 6542; 2017(2) KarLJ 6) dealt with a case which challenged the constitutionality of
the Karnataka on-Demand Transportation Technology Aggregator Rules, 2016 (“Karnataka Agg.
Rules”). Upholding the validity of the Karnataka Agg. Rules, the Court shed light on the
interpretation of the term ‘canvasser’ as used in the Motor Vehicles Act, 1988 (“MV Act”).
● Since this case was decided before the 2019 Amendment, the term ‘aggregator’ was not included
within the MV Act’s framework. Accordingly, the petitioner contended that such licensing
requirements could not be imposed on aggregators as they were beyond the scope of Section 93.
However, the Court held that an aggregator in the nature of the petitioner (Uber) amounts to being
a canvasser, as it acts on behalf of the permit holders (drivers) and solicits business from
customers for the drivers.
● Among other things, the Court considered (i) the agreement between the aggregator and the
driver to hold that the aggregator is in fact acting on behalf of the driver; and (ii) to determine if the
aggregator was soliciting customers, the Court noted that the aggregator was advertising to the
customers in the form of various catchy offers and discounts, and in some cases personal emails to
customers who had not taken a ride in a long time.
The Motor Vehicles (Amendment) Act, 2019: The 2019 Amendment, which was pending at the time
of the proceedings before Karnataka High Court, brought aggregators within the regulatory framework
by providing licensing requirements under Section 93 and providing the definition of an ‘aggregator’
within the MV Act. Some of the other relevant provisions added by 2019 Amendment in relation to
aggregators are as follows:
● In Section 93 of the MV Act, another proviso has been inserted which prescribes that every
aggregator shall comply with the provisions of the IT Act.
● The punishment for engaging as an aggregator in contravention of Section 93 (without obtaining
a license or adhering to the provisions of the IT Act) has been prescribed as a minimum fine of
twenty-five thousand rupees, which may extend to one lakh rupees. [Section 193 of the MV Act,
amended via the 2019 Amendment]
● Further, in case an aggregator contravenes a license condition under Section 93(1) of the MV Act,
which is not designated as a material condition by the State government, the fine shall be five
thousand rupees. [Section 193 of the MV Act, amended via 2019 Amendment]
Thank You!