AI and the Violation of Privacy Rights in the Healthcare Sector

Hema Gariya

BML Munjal University, Haryana

This article is written by Hema Gariya, a fourth-year law student at BML Munjal University, Haryana.

ABSTRACT:

This paper examines whether AI healthcare apps violate users' data privacy rights and whether current data protection laws adequately address these concerns. It traces the evolution of AI healthcare apps, highlighting their potential benefits and privacy risks, and uses a case study of Practo to illustrate data collection practices and privacy threats. The paper then evaluates Indian data protection laws, including the Digital Personal Data Protection Act (DPDP) 2023, and assesses their effectiveness. Finally, mitigation strategies such as granular consent and explainable AI are proposed to address privacy risks. Sector-specific regulations, collaboration among stakeholders, and user education are crucial for promoting responsible data handling. By addressing these challenges, AI healthcare apps can uphold ethical standards and protect user privacy.

I-INTRODUCTION:

The rise of AI-driven healthcare apps has brought both outstanding achievements and difficult disputes over data privacy rights to the rapidly developing field of healthcare technology. By using artificial intelligence algorithms, these applications provide compelling solutions ranging from remote patient monitoring to personalized diagnostics. However, because they collect and exploit sensitive health data, concerns have been raised about the degree to which users' privacy may be violated. Consumers rely more and more on mobile apps for easy access to healthcare services, yet questions remain about how well these technologies handle data privacy and comply with current privacy laws.

There has been a significant shift in AI healthcare apps towards proactive and personalized healthcare management. To deliver personalized recommendations and insights, apps like Practo, Medlife, and Joylo use machine learning algorithms to evaluate enormous datasets: user input such as location, contact lists, and personal photos, as well as data from mobile devices and electronic health records. These apps offer a broad spectrum of functionality that enables people to take control of their health and well-being, from medication adherence and lifestyle advice to symptom assessment and disease prediction.

This technological development has had downsides, however, particularly for data privacy. Because these apps require constant access to private health information, they raise issues of data security, consent, and potential misuse.

To address these concerns, several jurisdictions have enacted data protection laws to safeguard the right to privacy in healthcare data. Laws such as the Digital Personal Data Protection Act (DPDP) and other data privacy laws in India, and the General Data Protection Regulation (GDPR) in the European Union, impose strict requirements on how medical institutions and third parties collect, use, and store personal health data. These rules seek to ensure transparency, accountability, and individual control over data, while also promoting innovation in healthcare technology. However, the effectiveness of these data protection laws in addressing privacy violations by AI healthcare apps remains a subject of ongoing debate. Despite regulatory efforts, challenges persist in enforcing compliance, particularly in the context of rapidly evolving technology and global data flows.

II-Data Collection and Processing in AI Healthcare Apps:

According to available estimates, healthcare apps in India alone have between roughly 197 million and 245 million users. The following example illustrates how these apps acquire detailed medical history information from their users:

Suppose, for instance, that a user turns to a symptom-checker app for an ordinary headache. On installation, the app prompts the user to "complete your medical profile" and log in to receive a more "accurate diagnosis." It then asks the user to fill out a profile with a complete medical history, including past operations, allergies, prescriptions, and family medical history. All of this is far more detail than diagnosing a headache requires.
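For contrast, here is a minimal sketch, using hypothetical feature and field names, of what a data-minimizing design could look like: each feature declares the fields it genuinely needs, and everything else the user submits is dropped before storage. (Data minimization is revisited under the mitigation strategies below.)

```python
# A minimal sketch of data minimization, using hypothetical feature and field
# names: each feature declares the fields it genuinely needs, and anything
# else the user submits is dropped before storage or upload.

REQUIRED_FIELDS = {
    "headache_symptom_checker": {"symptoms", "age"},           # a headache check needs little
    "appointment_reminder": {"appointment_time", "location"},  # location is justified here
}

def minimize(feature: str, submitted: dict) -> dict:
    """Keep only the fields the named feature is allowed to collect."""
    allowed = REQUIRED_FIELDS.get(feature, set())
    return {k: v for k, v in submitted.items() if k in allowed}

submitted = {
    "symptoms": ["headache"],
    "age": 24,
    "family_history": "extensive",  # over-collection: irrelevant to a headache check
    "contact_list": ["friend-1"],   # over-collection: never needed
}
print(minimize("headache_symptom_checker", submitted))
# prints: {'symptoms': ['headache'], 'age': 24}
```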

III-Potential Threats to Data Privacy in AI Healthcare Apps

In many cases, the privacy policy, if the app has one, is buried deep in the settings and difficult to find. It probably does not explain why such an extensive medical record is required or how the app plans to use the information. Patients may feel so compelled to get a proper diagnosis for their headache that they skim the privacy policy and grant every permission requested, unintentionally jeopardizing their privacy.

Once sensitive health data has been obtained in large quantities, the app becomes an appealing target for hackers. For example, in September 2023, a vulnerability in third-party file-sharing software led to a healthcare data breach at Nuance that affected the data of over 1.2 million patients.

Furthermore, certain medical apps engage in unwarranted location tracking, requesting permission to access location data as soon as they are installed. For a prescription reminder, location access seems unnecessary, though it might be relevant for appointment reminders. The problem is that the app can silently collect location data even when the user is not booking appointments.1

Users rarely know what happens to this data. "Anonymized" location data may be shared with third parties to provide "targeted advertising," and location data combined with health information from the app may be used to build comprehensive user profiles for targeted advertisements, compromising the user's privacy.2

In the worst-case scenario, insurance companies could combine the anonymized location data with health information from the app to identify consumers with particular health conditions who frequently visit particular sites, such as pharmacies that specialize in chronic illnesses. Insurers might then resort to discriminatory measures, such as raising premiums or denying coverage.
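To make this re-identification risk concrete, here is a minimal sketch using entirely hypothetical records: joining "anonymized" visit data with auxiliary information an insurer might hold, on a shared quasi-identifier such as the hour of a visit to a specialty pharmacy, re-attaches a name and a health condition to an anonymous user ID.

```python
# A minimal sketch of a linkage attack, using entirely hypothetical records:
# "anonymized" visit data is joined with auxiliary data an insurer might hold
# on a shared quasi-identifier (the hour of a visit to a specialty pharmacy),
# re-attaching a name and health condition to an anonymous user ID.

anonymized_visits = [
    {"user_id": "u-101", "place": "chronic-care pharmacy", "visit_hour": 9},
    {"user_id": "u-102", "place": "grocery store", "visit_hour": 18},
]

# Auxiliary data: an insurer's own appointment and claims records (hypothetical).
insurer_records = [
    {"name": "A. Kumar", "condition": "diabetes", "pharmacy_visit_hour": 9},
]

for visit in anonymized_visits:
    if visit["place"] != "chronic-care pharmacy":
        continue  # only specialty-pharmacy visits are revealing here
    for record in insurer_records:
        if visit["visit_hour"] == record["pharmacy_visit_hour"]:
            # The quasi-identifier match de-anonymizes the user ID.
            print(f'{visit["user_id"]} plausibly matches {record["name"]} ({record["condition"]})')
```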

IV-CASE STUDY OF PRACTO: A Leading Healthcare App in India

Practo, a leading Indian telemedicine app with around 20 million monthly active users, collects a wide range of user data: contact information (phone number and email address), demographic information (gender, date of birth, and PIN code), information about how the services are used, the user's appointment history with practitioners, insurance information (carrier and plan), and more. Users can also link their electronic medical records (EMRs) to the app, giving it access to thorough medical histories. The app additionally uses AI-powered symptom checkers to collect data on reported symptoms to help assess the patient's health.

These data collection practices carry privacy risks. In the event of a data breach, private user data such as medical records could be exposed, potentially leading to identity theft or discrimination based on a person's health.3 Storing user data on servers with inadequate security safeguards likewise exposes it to hacking and unauthorized access. These risks are worsened by the lack of openness and clear consent procedures in Practo's privacy policy, which may leave users in the dark about the scope of the company's data collection and sharing practices.

A reading of Practo's privacy policy reveals the following possible privacy threats:

1. Collection of Sensitive Personal Data:

The sensitive personal information Practo gathers includes financial data, health information, medical records, and biometric data. Collecting and retaining this information is hazardous if it is not adequately protected.

2. Consent and Opt-Out:

Users may inadvertently agree to terms they would find uncomfortable, or may not fully understand the implications of consenting to the collection and use of their personal information. Some users may also be unaware of their ability to opt out of specific communications or services.

3. Data Security:

Even with security measures in place, there is always a risk of data breaches, unauthorized access, and identity theft. Practo states that it rejects all liability for damages resulting from unauthorized access to user devices, which could leave harmed users without recourse.

4. Third-Party Sharing:

Practo may supply third parties with aggregated data for a variety of uses. Even when the data is anonymized, it may be re-identified or used in ways that violate user privacy.

5. Cookies and Tracking:

Practo uses cookies for several functions, including advertising and analytics that track user behavior. Users are often unaware of how their online activities are being monitored and used.

6. Limited Control Over Linked Sites:

When third-party websites are linked through Practo's services, Practo disclaims all liability for their privacy practices. By accessing these linked sites, users risk additional, inadvertent privacy breaches.

7. Data Retention:

Practo retains user data for as long as necessary to deliver its services, but the policy sets no explicit data retention periods. Longer retention raises the risk of misuse of, or unauthorized access to, personal data.

8. No-Spam Policy:

Even though Practo claims it does not engage in spam, there is the possibility that user email addresses could be stolen or sold to unauthorized parties.

9. Legal Compliance:

To safeguard its legal rights or comply with legal requirements, Practo reserves the right to disclose personal information. The policy says little about the conditions that might trigger such disclosures, leaving room for interpretation and possible abuse.

10. Collection of Personal Information:

During the registration and application processes, Practo gathers sensitive personal data, which could be misused or accessed by unauthorized persons.

11. Data Usage and Commercial Purposes:

Practo reserves the right to use practitioner data for commercial purposes, which could lead to misuse or privacy violations if the data is sold or otherwise transferred to unauthorized parties.

V-RIGHT TO PRIVACY: A FUNDAMENTAL RIGHT UNDER ARTICLE 21 OF THE INDIAN CONSTITUTION.

India's evolution in data privacy has been shaped by both landmark court rulings and legislative developments. In Kharak Singh v. State of U.P.,4 the Supreme Court first considered whether the right to privacy formed part of the right to life and personal liberty guaranteed by Article 21 of the Constitution.5

Subsequently, the Information Technology Act, 2000 introduced (through the 2008 amendment) significant provisions, such as Section 43A,6 which makes body corporates liable to pay compensation for failing to protect sensitive personal data, and Section 72A,7 which penalizes the disclosure of personal information in breach of a lawful contract.

The 2012 writ petition in Justice K.S. Puttaswamy & Ors. v. Union of India & Ors.8 raised privacy challenges, particularly concerning mandatory Aadhaar linking. Earlier, the Sensitive Personal Data or Information (SPDI) Rules of 2011 had addressed electronic data protection but revealed limitations in scope.

The 2017 Aadhaar data breach demonstrated the shortcomings of the enforcement procedures then in place. But in Justice K.S. Puttaswamy (Retd.) v. Union of India & Ors.,9 the Supreme Court recognized privacy as a fundamental right under Article 21 of the Constitution, a significant turning point in the legal system. This landmark decision marked a paradigm shift and opened the door for India to build a stronger legislative framework protecting data privacy.10

VI-Current Laws for Data Protection in India:

India's healthcare app market is growing rapidly, which promises to improve patient access to care, simplify diagnostic processes, and personalize health management. For this digital revolution to succeed, however, robust privacy regulations are essential to protect sensitive user data.

The Information Technology Act, 2000 addresses data protection through Section 43A, which obliges entities that handle sensitive personal data, including developers of healthcare applications, to maintain reasonable security practices and makes them liable to compensate individuals harmed by their negligence.

Section 72A penalizes the unauthorized disclosure of personal information in breach of a lawful contract, yet the Act lacks comprehensive rules on the collection, storage, and use of data within healthcare apps. Sensitive personal data, including health data collected by healthcare apps, is expressly addressed by the Sensitive Personal Data or Information (SPDI) Rules of 2011,11 enacted under the IT Act. The Rules' consent framework,12 however, is problematic: pre-checked consent boxes and a lack of transparency can keep consumers from understanding and controlling their data. Likewise, while the security safeguards13 prescribe basic measures for protecting sensitive data, their effectiveness depends on proper implementation by app developers, and there is no robust enforcement mechanism. The Rules' constraints on data storage location14 may also impede international collaboration in healthcare.
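To illustrate the pre-checked-box problem, here is a minimal sketch with an illustrative form-field name (not any real app's code): when a consent checkbox defaults to checked, "consent" is recorded even if the user never interacts with it, whereas an opt-in default records consent only on a deliberate choice.

```python
# A minimal sketch of why pre-checked consent boxes undermine informed consent,
# using an illustrative form-field name: a checkbox the user never interacted
# with simply reports whatever the default was.

def record_consent(form: dict, default: bool) -> bool:
    # If the checkbox was never touched, the stored value is just the default.
    return form.get("share_health_data", default)

untouched_form: dict = {}  # the user submitted without touching the checkbox

print(record_consent(untouched_form, default=True))   # True: "consent" without any choice
print(record_consent(untouched_form, default=False))  # False: sharing requires a real opt-in
```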

Apps like Practo illustrate the problem: they can collect large amounts of user data without being transparent about how it is used. This is particularly problematic for AI-driven features like symptom checkers, where users have little control over whether their data is shared with third parties.

Inadequate enforcement mechanisms, consent problems, cross-border data transfer limitations, and a lack of comprehensive data protection laws are just a few of the legal gaps that make it difficult to guarantee adequate data privacy for healthcare apps in India.

VII-The Digital Personal Data Protection Act (DPDP) 2023:

The DPDP Act acknowledges the increasing concerns surrounding data privacy by establishing a comprehensive legal framework for processing personal data. It addresses several key issues:

Section 11:15 Access and control. The Act gives individuals the right to access and control the data held by data fiduciaries, such as developers of healthcare apps. Users can learn what data is collected and how it is used, and can request modifications or deletions as needed.

Section 12:16 Enhanced transparency. Healthcare apps are expected to give users clear and simple information about their data collection practices, the purposes for which data is used, and their data retention policies. This transparency lets users make informed choices about sharing their data while using the app.

Section 18:17 Accountability of data fiduciaries. The Act introduces the concept of the data fiduciary and places the burden of ensuring lawful data processing on healthcare app developers. This includes putting in place appropriate organizational and technical safeguards against unauthorized access and data breaches.

VIII-Addressing Data Privacy Issues in Healthcare Apps:

Although the Act acknowledges data privacy concerns and offers potential benefits, lingering questions remain about its applicability to healthcare data:

The Act addresses only "personal data," not data that has been anonymized. This creates uncertainty about the use of anonymized medical data for research and statistical analysis, both of which are vital to improvements in healthcare delivery. Additional clarity on the Act's treatment of anonymized data, and on how it affects medical research, is essential.

Although the DPDP Act provides a broad framework, specialized standards tailored to the complexities of healthcare data privacy would be beneficial. These could address issues such as the need for fine-grained user control over data sharing within healthcare apps and over AI-powered data collection techniques.

IX-Uncertainties and Potential Challenges:

The DPDP Act protects informed consent, but it may not go far enough to deal with the complexity of healthcare apps that use AI to collect data. Users may struggle to fully comprehend how their data is used for purposes beyond the app's immediate operation. Both the Act and app developers must address the need for more intricate, multi-layered consent processes.

The DPDP Act contains no specific rules on cross-border data sharing for healthcare purposes. Collaboration with international specialists or research institutes may require data transfers, so specific guidance on transfer procedures and data security standards is necessary.

Unquestionably, AI has the potential to transform healthcare by offering personalized care, improved diagnoses, and more efficient processes. But this advancement raises an important question: are the rules now in place sufficient to protect data privacy, or can AI healthcare apps still violate those rights?

AI health applications collect an enormous amount of private information, including medical records, prescription details, and possibly even genetic data. This raises concerns about transparency and informed consent: given the complexity of AI algorithms, users may find it hard to understand how their data is used. Can consent be genuinely informed if the decision-making process remains opaque?

X-Mitigation Strategies

To address privacy concerns, users should be able to authorize the use of individual data points in AI features. Layered consent options could also give them finer control over how their data feeds into the AI model, as the sketch below illustrates.
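A minimal sketch of such granular, layered consent, using illustrative field names rather than any real app's API: every data point carries its own opt-in flag, defaulting to off, and the AI feature receives only the fields the user has explicitly authorized.

```python
# A minimal sketch of granular, layered consent, with illustrative field names
# (not any real app's API): every data point carries its own opt-in flag, and
# the AI feature receives only fields the user has explicitly authorized.

from dataclasses import dataclass, field

@dataclass
class ConsentProfile:
    # Each flag defaults to False: nothing is shared without an explicit opt-in.
    share_symptoms: bool = False
    share_medical_history: bool = False
    share_location: bool = False

@dataclass
class UserRecord:
    symptoms: list
    medical_history: list
    location: str
    consent: ConsentProfile = field(default_factory=ConsentProfile)

def data_for_ai_feature(user: UserRecord) -> dict:
    """Return only the fields the user has consented to feed into the AI model."""
    shared = {}
    if user.consent.share_symptoms:
        shared["symptoms"] = user.symptoms
    if user.consent.share_medical_history:
        shared["medical_history"] = user.medical_history
    if user.consent.share_location:
        shared["location"] = user.location
    return shared

# The user opts in to symptom sharing only; history and location stay on-device.
user = UserRecord(symptoms=["headache"], medical_history=["appendectomy"], location="Gurugram")
user.consent.share_symptoms = True
print(data_for_ai_feature(user))  # {'symptoms': ['headache']}
```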

It is essential to develop "explainable AI" models that patients and medical practitioners can understand. This transparency makes it easier to reach well-informed decisions based on both human knowledge and AI insights.

Health apps should also emphasize data minimization, gathering only the information required to function. Doing so decreases risk exposure and better protects user privacy.
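To make the explainable-AI point above concrete, here is a minimal sketch with hypothetical weights and toy symptom features (not a real diagnostic model): alongside a risk score, the app reports how much each input contributed, so a patient or clinician can see why the model reached its conclusion.

```python
# A minimal sketch of the "explainable AI" idea, with hypothetical weights and
# toy symptom features (not a real diagnostic model): alongside a risk score,
# the app reports each input's contribution to that score.

WEIGHTS = {"fever": 0.6, "headache_days": 0.3, "age_over_60": 0.9}

def score_with_explanation(features: dict) -> tuple:
    """Return the total risk score and the per-feature contributions behind it."""
    contributions = {name: WEIGHTS[name] * value for name, value in features.items()}
    return sum(contributions.values()), contributions

risk, why = score_with_explanation({"fever": 1, "headache_days": 4, "age_over_60": 0})
print(f"risk score: {risk:.1f}")
for name, contribution in sorted(why.items(), key=lambda kv: -abs(kv[1])):
    print(f"  {name}: {contribution:+.1f}")  # how much each input pushed the score
```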

Sector-specific regulations should be enacted in addition to those already discussed. Current data protection laws provide the foundation, but they may not adequately handle the complications of AI healthcare data. Industry-specific rules with stricter user controls and data security protocols are essential.

Addressing these issues requires collaboration among legislators, app developers, and healthcare practitioners. Such collaboration supports the creation of ethical AI solutions that enhance healthcare while giving users' privacy top priority.

XI-Conclusion:

While AI healthcare apps offer many advantages, data privacy remains a concern. Consent and transparency procedures are insufficient, and the data protection regulations in place may fail to adequately address these problems. Explainable AI and granular consent are two crucial mitigation techniques. Responsible data handling requires sector-specific legislation, collaboration among stakeholders, and user education. To achieve their full potential while safeguarding user privacy, AI healthcare apps must adhere to ethical norms.

REFERENCES:

1 Trix Mulder, ‘Health Apps, their Privacy Policies and the GDPR’ (2019) European Journal of Law and Technology; University of Groningen Faculty of Law Research Paper No 15/2020 <https://ssrn.com/abstract=3506805> accessed 11 May 2024.

2 A Papageorgiou, M Strigkos, E Politou, E Alepis, A Solanas and C Patsakis, ‘Security and Privacy Analysis of Mobile Health Applications: The Alarming State of Practice’ (2018) 6 IEEE Access <https://www.researchgate.net/publication/322779860_Security_and_Privacy_Analysis_of_Mobile_Health_Applications_The_Alarming_State_of_Practice> accessed 10 May 2024.

3 Julie K Taitsman, Christi Macrina Grimm and Shantanu Agrawal, ‘Protecting Patient Privacy and Data Security’ (2013) 368(11) New England Journal of Medicine <https://www.nejm.org/doi/full/10.1056/NEJMp1215258>.


4 1964 SCR (1) 332

5 The Constitution of India, art 21.

6 The Information Technology Act 2000, s 43A.

7 The Information Technology Act 2000, s 72A.

8 Writ Petition (Civil) No. 494 of 2012

9 AIR 2017 SC 4161

10 Kumar Rahul, ‘Jurisprudence of Right to Privacy in India’ (2020) <https://ssrn.com/abstract=3664257>.

11 Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011.

12 Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011, Rule 4.

13 Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011, Rule 8.

14 Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011, Rule 5.


15 Digital Personal Data Protection Act 2023, s.11.

16 Digital Personal Data Protection Act 2023, s.12.

17 Digital Personal Data Protection Act 2023, s.18.