Data Privacy and Security Research


Data Privacy and Security - Importance to Companies using AI and Robotics

Data privacy and security are an even bigger issue for companies using AI and robotics because hackers can turn AI security systems against the company, data breaches are costly, customer trust is at risk, and stock prices can be impacted.

Hackers Can Turn AI Security Systems Against The Company

  • Although AI and robotic systems can bring notable benefits to the company, cybercriminals can hack them to "weaponize malware and attacks to counter the advancements made in cybersecurity solutions".
  • Hackers understand how automated AI systems acquire knowledge and adapt accordingly. They can therefore launch a few probing attacks to learn a system's vulnerabilities before creating an effective malware attack that finally accesses the company's data. They can also use AI to build polymorphic malware, which changes its code frequently to become almost undetectable.
  • For instance, TaskRabbit was attacked by cybercriminals in a breach that affected 3.75 million users, yet professionals and experts were unable to trace the source of the attack.
  • As AI and robotic systems grow, hackers can learn the same techniques to compromise companies' data. Isaac Ben-Israel, director of the ICRC and chairman of Cyber Week, believes it is only a matter of time before hackers succeed in breaching companies' data.
  • Richard Stanes, Chief Security Strategist NEU at Capgemini, agreed that many cybercriminals are using machine learning and AI, especially for phishing, because these technologies can produce well-scripted attacks and automate them.

The High Cost of Data Breaches

  • Along with the rise of advanced technologies such as AI, the need for consumer data grows. However, cybercriminals may steal the data collected by AI-integrated companies and commercialize it for marketing opportunities, purchasing recommendations, or other services without the consumers' consent.
  • Based on a study by the Ponemon Institute, the average US data breach cost $7.91 million in 2018; in 2019, the figure rose to $8.19 million. In some cases, large-scale breaches may reach $350 million.
  • The US is also the most expensive country in which to experience a data breach.
  • According to experts from the Ponemon Institute, companies need to improve their ability to identify and resolve incidents in their AI systems to reduce data breach costs.
  • Jon Oltsik, senior principal analyst at the IT research firm Enterprise Strategy Group, agreed that the cybersecurity professionals working at a company must stay current with the latest research and threat intelligence.

The Risk of Losing Customer Trust

  • AI-integrated companies are likely to need customers' data to personalize services and improve the customer experience. However, given the complex definition of privacy and lengthy user agreements, customers tend to click "accept" without realizing which privacy rights they are granting to the companies.
  • Cybercriminals may also breach customers' information due to companies' weak data security. For instance, as of 2019 there were 3,800 publicly disclosed breach cases affecting millions of people in the US, a 54% increase in reported breaches compared with the first six months of 2018.
  • Privacy breaches may cause consumers to lose trust in the companies they do business with. According to a study by the Ponemon Institute, the average US data breach costs $4.2 million in lost customer revenue.
  • The healthcare, financial, and pharmaceutical industries have the most difficulty retaining customers after a data breach. The healthcare industry's abnormal customer turnover is about 7.0%, compared with roughly 3.9% in other industries.
  • Although consumer trust in AI has been declining due to repeated data breaches, Jim Hare, research vice president at Gartner, remains positive about using AI in companies to cut corporate brand and reputation risk. He advises companies that use AI to "build a culture of responsible use, trust and transparency" to improve users' trust.

The Potential of Impacting Stock Prices

  • Many companies use AI systems to improve the customer experience by collecting users' data. However, privacy breaches at companies that use AI and robotics can lower stock prices, and breaches involving highly sensitive information, such as credit card and Social Security numbers, are likely to inflict greater damage on a company's stock price.
  • Cybercriminals can easily collect sensitive information from AI-integrated companies through phishing campaigns, malware, or the exploitation of technical vulnerabilities.
  • Based on a study by Comparitech, 28 companies that experienced data breaches tended to suffer depressed stock prices for almost three weeks; on average, their stock value dropped by 7.27%.
  • Rich Campagna, chief marketing officer of Bitglass, believes the breaches of the past three years have inflicted "massive and irreparable damage to large companies and their stakeholders."

Research Strategy

To determine why data privacy and security are becoming an even bigger issue than ever before for companies using AI and robotics, we first reviewed expert blogs and reports released by IT professionals from trusted sources. Using the insights from that strategy, we then browsed news articles and press publications from reputable sources, such as Forbes and CNBC, to collect additional information about the selected reasons. We selected insights covering the latest topics widely discussed by experts and professionals to ensure that our findings reflect the current data privacy and security concerns of companies that use AI and robotics.

Data Privacy and Security - Data Privacy and/or Security Regulations: 2019

Some of the biggest data privacy and security regulations enacted in 2019 that impact AI and robotics companies in the United States include the California Consumer Privacy Act ("CCPA"), Nevada's "Opt-Out" Privacy Law and the Future of Data Protection, and the Artificial Intelligence Video Interview Act.

The Regulations

California Consumer Privacy Act ("CCPA")
  • California has passed several bills intended to curb negative impacts of specific AI-enabled technologies such as facial recognition systems.
  • The Body Camera Accountability Act seeks to prohibit the use of facial recognition in police body cameras, while another would require businesses to publicly disclose their use of facial recognition technology.
  • Separately, cities such as San Francisco and Oakland have banned the use of facial recognition technology by city agencies and law enforcement.
Nevada’s “Opt-Out” Privacy Law and the Future of Data Protection
  • The law grants consumers data privacy rights related to how businesses can collect, use and sell their personal data.
  • The law applies to all companies that collect data, such as artificial intelligence companies using or developing facial recognition software.
  • The law not only impacts organizations conducting certain types of business in Nevada, but is also a precursor to similar provisions set forth in the California Consumer Privacy Act, as amended (CCPA) and other data protection bills being drafted and debated across U.S. state legislatures.
Artificial Intelligence Video Interview Act
  • Under the Act, employers that record video interviews and use AI technology to analyze applicants' suitability for employment must inform applicants that AI may be used to evaluate their interviews, provide a written explanation of the technology's mechanics, including the traits and characteristics the AI program evaluates, and acquire applicants' prior consent to be assessed by AI.
  • The law was enacted in Illinois in 2019 and goes into effect on January 1, 2020.
Deep Fake Legislation- H.R. 4355
  • The Deep Fake legislation is a new bill to combat manipulated media content known as "deepfakes"; it has passed the U.S. House of Representatives Committee on Science, Space and Technology. It is not yet law, but it would impact AI and robotics companies if enacted.
  • It supports research on the outputs that may be generated by generative adversarial networks, otherwise known as deepfakes, and other comparable techniques that may be developed in the future, and for other purposes.
  • The legislation would also support research in artificial intelligence through computer and information science and engineering.

Data Privacy and Security - US Tech Companies Adjusting to GDPR

Policies implemented by US tech companies to comply with the GDPR include updating privacy policies, creating privacy portals, and providing personal information removal request forms.



  • The GDPR gives EU citizens control over their data: companies that collect user data must provide EU citizens with access to that data.
  • Apple established the Apple ID Data & Privacy website, where EU citizens can view all the data that Apple has about them, including sign-in history, contacts, calendar data, documents, and photos.



Research Strategy

To determine how US tech companies are adjusting to the General Data Protection Regulation (GDPR), we reviewed news sources such as TechCrunch and the Inquirer. Importantly, we restricted our research to firms headquartered in the United States. Our research focused on how the GDPR has led to changes in company policy and how those changes affect EU citizens, the intended beneficiaries of the regulation.

Data Privacy and Security - US Tech Companies Leading the Charge: Data Security and Protection

Amazon, Microsoft and Google are all working on the cutting edge of cybersecurity in response to recent data breaches, fines and class-action lawsuits.

1) Amazon

Cautionary Tales

  • In general, Amazon Web Services (AWS) has had few security breaches in comparison to other large companies in the US. In November 2018, the company stated a "technical error" on the Amazon website had exposed an undisclosed number of customers' names and email addresses.
  • Some companies using AWS, like Accenture, Uber, and Time Warner, have had significant security breaches.
  • A more significant concern is the worry that Echo home speakers are recording and transmitting information without their owners' knowledge.

Resulting Security Initiatives


  • AWS is a division of Amazon that provides cloud computing, database storage, and other IT functions to some of the largest organizations in the world, such as Netflix, Expedia, and NASA.
  • The breaches at AWS customers resulted from incorrect configuration of the companies' security settings; misconfigured settings cause most cloud security breaches.
  • In response to this issue, Amazon is "no longer leaving data security entirely up to its clients." AWS has improved its user interface to lessen the chance of user error.
  • It has also implemented a threat detection service called GuardDuty to protect Amazon accounts.
  • It is also launching Amazon Macie, a machine learning-powered solution focused on security.
  • Amazon also has an application for a patent called "management of encrypted data storage" that describes new encryption services for AWS users.


  • To address user concerns, Amazon has provided detailed documentation describing when users are being recorded, how the data is used, and how to delete recorded data.
  • Amazon has also implemented strict control standards for its apps, instituted a more comprehensive review process, and, most notably, begun encrypting the data that travels between Echo devices and Amazon servers.

2) Microsoft

Cautionary Tales

  • Microsoft's most prominent recent cautionary tale is the hack of the Democratic National Committee's (DNC) Exchange email server.

Resulting Security Initiatives

  • Currently, Microsoft's largest security initiative is aimed at election security.
  • In response to the hack of the DNC Exchange server, Microsoft developed AccountGuard, which offers threat detection and security guidance across "both email systems run by organizations and the personal accounts of these organizations' leaders and staff who opt-in."
  • In addition to AccountGuard, Microsoft provides the Defending Democracy Program, a solution designed to protect "organizations that underpin democracy" from hacking and disinformation campaigns. These programs have been rolled out across the US and Europe.
  • According to TechRepublic, Microsoft spends over $1 billion each year on security, defending against roughly seven trillion cyberthreats per day.
  • Microsoft is also developing multiple patents in the area of homomorphic encryption (HE), which will allow operations to be performed on encrypted data without decrypting it first. Other stakeholders like IBM, Intel, and SAP are working with Microsoft to define industry standards for HE.
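
The homomorphic property mentioned above can be illustrated with a toy example. The sketch below uses textbook (unpadded) RSA, which is multiplicatively homomorphic: multiplying two ciphertexts yields a valid ciphertext of the product of the plaintexts. This is only a conceptual illustration; the parameters are insecure, and the schemes Microsoft and its partners are standardizing are different, lattice-based constructions.

```python
# Toy demonstration of a homomorphic property using textbook (unpadded)
# RSA: E(a) * E(b) mod n is a valid encryption of a * b.
# Illustration only; unpadded RSA with tiny primes is insecure.

p, q = 61, 53              # small demo primes
n = p * q                  # RSA modulus (3233)
phi = (p - 1) * (q - 1)    # Euler's totient
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # private exponent (Python 3.8+ modular inverse)

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

a, b = 7, 11
c_product = (encrypt(a) * encrypt(b)) % n  # compute on ciphertexts only
assert decrypt(c_product) == a * b         # decrypts to the product, 77
```

The point of the demo is that the multiplication happens entirely on encrypted values; the private key is only needed at the end to read the result.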

3) Google

Cautionary Tales

Resulting Security Initiatives

  • Google recently "upgraded and improved users' control of account settings and permissions for Google accounts. These changes included new limitations for apps looking to access tools like Gmail and diminishing the ability for apps to access call logs and SMS history on Android mobile devices."
  • In the 2018 midterm elections, an estimated 65% of candidates used Gmail accounts. Consequently, Google created its Advanced Protection Program for high-risk users, including "journalists, activists, business leaders, and political campaign teams." It provides 2-step verification for sign-in and restricts third-party apps' access to the account.
  • Google has also applied for a patent for "access control for user-related data." This system is designed to "detect potential large scale leaks while deploying encryption technology that can help maintain the privacy of the data."


Data Privacy and Security - 2020 Privacy and Security Regulation Awareness

Data privacy and security regulations that US tech companies have to be aware of in 2020, as they pertain to certain jurisdictions, include the California Consumer Privacy Act (CCPA), the Maine Act to Protect the Privacy of Online Consumer Information, and the New York Stop Hacks and Improve Electronic Data Security (SHIELD) Act.

California’s Consumer Privacy Act (CCPA)

  • The CCPA is a law designed to protect the data privacy rights of California residents.
  • It forces companies to provide more information to consumers about what’s being done with their data and gives them more control over the sharing of their data.
  • The law goes into effect on January 1, 2020.
  • Businesses must comply with the CCPA if they annually buy, receive, sell, or share the personal information of 50,000 or more consumers; have annual gross revenue of over $25 million; or derive 50% or more of their annual revenue from selling consumer personal information.
  • Provisions in the law include customers' right to request deletion of personal information and granting consumers the right to control selling their information to third parties.
  • The maximum penalty under the CCPA is $7,500 per intentional violation.
  • Given that many tech companies mainly deal with customer data, they need to be aware of the CCPA and successfully comply with it.

Maine Act to Protect the Privacy of Online Consumer Information

  • This regulation prohibits broadband Internet access service providers from using, selling, distributing or permitting access to customer personal information for purposes other than providing services, unless the customer expressly consents.
  • It applies only to broadband providers operating within Maine and becomes effective on July 1, 2020.
  • To comply, providers must provide notice and seek express opt-in consent before collecting personal information, and they must protect personal information.
  • It also prohibits ISPs from refusing to serve customers who withhold consent and bans ISPs from offering financial or other incentives in exchange for customers' opt-in consent.
  • Sanctions for violation of the regulation can be found in chapter 15 of Maine’s title 35-A on Public Utilities.
  • Implementation of the regulation may threaten the innovation of Internet-based technologies and services.

New York Stop Hacks and Improve Electronic Data Security (SHIELD) Act

  • The SHIELD Act aims to enhance consumer data security by expanding the definition of protected data and enhancing breach notification requirements.
  • It goes into effect on March 21, 2020.
  • Penalties for failing to fulfill data security requirements are capped at $5,000 per violation.
  • It can apply to any business that holds private information of New York residents, regardless of whether it conducts business in New York.
  • Businesses, including tech companies, can adopt and follow a formal plan that includes a breach notification policy and data security measures to protect private information to limit their exposure to penalties for noncompliance.

Data Privacy and Security - Best Practices

Giving consumers the option to opt-out of the sale of their personal information, updating privacy policies, and deleting or anonymizing an individual’s data upon request are some of the best practices U.S. tech companies can use to stay in compliance with new privacy legislation/regulations. Below is an overview of the best practices.

New State Privacy and Security Laws in the U.S.

  • Understanding what each of the top new privacy regulations entails is vital to understanding the necessary practices that should be taken by U.S. tech companies to stay in compliance with these new regulations. Three important new privacy and security laws in the U.S. include the California Consumer Privacy Act (CCPA), the Nevada Senate Bill 220 Online Privacy Law, and The Maine Act to Protect the Privacy of Online Consumer Information.
  • In 2018, California passed the California Consumer Privacy Act (CCPA), a data privacy act based on the European Union’s General Data Protection Regulation (GDPR). The Consumer Privacy Act requires compliance by all businesses that collect data on California residents.
  • The Nevada Senate Bill 220 Online Privacy Law was signed by Governor Steve Sisolak in 2019 and went into effect on October 1, 2019. It amends "Nevada's existing privacy law by requiring businesses to offer consumers an opt-out regarding the sale of their personal information, with some exceptions."
  • The Maine Act to Protect the Privacy of Online Consumers was enacted to protect the privacy of customers of broadband internet access services and requires service providers to obtain opt-in consent before using, disclosing, or selling their customers’ personal information.

Best Practices for Compliance With New Privacy Legislation and/or Regulations

1. Providing Consumers With the Option to Opt-Out of the Sale of Their Information


  • The three privacy laws mentioned above require that service providers give consumers the option to opt out of the sale of their personal information. This gives consumers the right and ability to control the sale of their information to third parties. Service providers can enable opting out by placing a "Do Not Sell My Personal Information" link in their privacy policies or by setting up a designated request address that consumers can use to opt out of the sale of their information.


  • This practice has broad coverage and is listed as a recommended strategy to remain in compliance with the CCPA, the Nevada Senate Bill 220 Online Privacy Law, and the Maine Act to Protect the Privacy of Online Consumer Information.
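
As a minimal sketch of the opt-out mechanism described above, the snippet below keeps an in-memory registry of opt-out requests and gates any sale of covered information on it. All names here (`OptOutRegistry`, `can_sell`) are hypothetical; a production system would need durable storage, identity verification, and prompt propagation of requests to downstream data buyers.

```python
# Hypothetical sketch: record "Do Not Sell" requests and check them
# before any sale of covered personal information.

from datetime import datetime, timezone

class OptOutRegistry:
    def __init__(self):
        self._opted_out = {}  # consumer_id -> timestamp of the request

    def record_opt_out(self, consumer_id: str) -> None:
        """Record a consumer's request; selling must stop promptly."""
        self._opted_out[consumer_id] = datetime.now(timezone.utc)

    def can_sell(self, consumer_id: str) -> bool:
        """Check before any sale of covered information to third parties."""
        return consumer_id not in self._opted_out

registry = OptOutRegistry()
registry.record_opt_out("user-123")
assert registry.can_sell("user-123") is False  # opted out: no sale
assert registry.can_sell("user-456") is True   # no request on file
```

The timestamp is kept so the provider can demonstrate when the request was received and that selling stopped promptly afterward.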

Leaders Endorsing It

  • In 2018, Senator Ron Wyden proposed The Consumer Data Protection Act of 2018 Discussion Draft that recommended service providers "give consumers a way to review what personal information a company has about them, learn with whom it has been shared or sold, and to challenge inaccuracies in it."

2. Updating Privacy Policies


  • This practice recommends service providers inform their users about how to exercise their rights under the new data privacy regulations.


  • A privacy policy should detail aspects such as the categories of personal information collected, the process by which a consumer can review and request changes to their personal information, how consumers will be informed of any changes to the privacy policy, whether tracking technology will be used, and the effective date of the notice. This ensures that consumers are well aware of the service provider's privacy terms and conditions before they proceed to use the service.

Leaders Endorsing It

  • Representative Frank Pallone (Democrat representing New Jersey) called on Congress to pass legislation on Internet privacy during Facebook CEO Mark Zuckerberg's testimony. This was after the discovery of the Facebook and Cambridge Analytica data scandal.

3. Deleting or Anonymizing an Individual’s Data Upon Request


  • This practice entails that service providers know where they store personal information and make sure that they can delete or anonymize an individual’s data upon request.
  • This practice goes hand in hand with ensuring that they can respond to the consumer within 60 days when such a request is made.


  • This practice requires that the service provider carry out duties of care, loyalty, and confidentiality, and ensure that it does not disclose or share individually identifying data with any other person except as provided in the agreement between the provider and the consumer.
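
A minimal sketch of the deletion/anonymization practice follows, assuming a flat record of fields; the field list and the `anonymize` policy are hypothetical. It also computes the 60-day response deadline mentioned above.

```python
# Hypothetical sketch: anonymize a consumer record on request and
# compute the 60-day response deadline. Real systems must locate every
# store that holds the individual's data, not a single record.

from datetime import date, timedelta

RESPONSE_DEADLINE = timedelta(days=60)  # respond within 60 days

def anonymize(record: dict) -> dict:
    """Strip direct identifiers; keep non-identifying fields."""
    identifying = {"name", "email", "phone", "address"}
    return {k: ("[redacted]" if k in identifying else v)
            for k, v in record.items()}

def due_date(received: date) -> date:
    """Latest date by which the provider must respond."""
    return received + RESPONSE_DEADLINE

record = {"name": "Jane Doe", "email": "jane@example.com", "plan": "pro"}
assert anonymize(record) == {"name": "[redacted]",
                             "email": "[redacted]",
                             "plan": "pro"}
assert due_date(date(2020, 1, 1)) == date(2020, 3, 1)
```

Redaction here keeps the record's shape so downstream analytics that rely on non-identifying fields keep working; full deletion would instead remove the record entirely.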

Leaders Endorsing It

  • Senator Brian Schatz, joined by 15 Democrats, released a draft Data Care Act that would establish duties of "care, loyalty, and confidentiality" for online providers that handle personal data.

Research Strategy

To identify best practices, we leveraged leading publications and industry databases. We first sought readily available lists of best practices to stay in compliance with new privacy legislation, but as these were not available (most best practices focused on individual regulations and not all privacy laws collectively) we identified three new privacy regulations and searched for recommendations to stay in compliance with these regulations. We identified those recommendations that were common for all three identified regulations and presented these above.


From Part 02
  • "The standards would address underlying scientific principles and methods, an assessment of disparate impact on the basis of demographic features such as race or gender, requirements for testing and validating the software and for publicly available documentation, and requirements for reports that are provided to defendants by the prosecution documenting the use and results of computational forensic software in individual cases."
  • "Facial recognition and other biometric surveillance technology pose unique and significant threats to the civil rights and civil liberties of residents and visitors."
  • "Nevada joins California in providing consumers with substantial new privacy rights, marking a significant expansion of the state data privacy landscape and signaling new frontiers in the patchwork of state data protection laws"
  • "Who has to comply with SB 220? Operators, or in the words of SB 220, anyone who: a) Owns or operates an Internet website or online service for commercial purposes;"
  • "Under the AI Video Interview Act, employers that record video interviews and use AI technology to analyze applicants’ suitability for employment must: inform applicants that AI technology may be used to evaluate their interviews; provide applicants with a written explanation of the technology’s mechanics, including the traits that will be reviewed and analyzed by AI and the characteristics the AI program uses to evaluate applicants; and acquire applicants’ prior consent to be assessed by AI technology."
  • "The National Science Foundation has identified the “10 Big Ideas for NSF Future Investment” including “Harnessing the Data Revolution” and the “Future of Work at the Human-Technology Frontier”, in with artificial intelligence is a critical component."
  • "Anthony Gonzalez (R-OH) introduced the “Identifying Outputs of Generative Adversarial Networks Act” (H.R. 4355), which would direct both the National Science Foundation and NIST to support research on deepfakes to accelerate the development of technologies that could help improve their detection,"
From Part 06
  • "The most comprehensive piece of state-level legislation across these often-intertwined categories that has been enacted over the past two years is the sweeping California Consumer Privacy Act (CCPA)"
  • "California Consumer Privacy Act (CCPA)"
  • "Nevada Senate Bill 220 Online Privacy Law"
  • "Maine Act to Protect the Privacy of Online Consumer Information"
  • "Include a clearly marked “Do Not Sell My Personal Information” link on websites and apps"
  • "In addition to knowing where you store personal information, make sure that you can delete or anonymize an individual’s data upon request"
  • "Update your privacy policies each year"
  • "Set up a designated request address that a consumer can use to opt out of the sale of their covered information"
  • "Update your Privacy Policy to refer to this process"
  • "Ensure you can promptly stop selling a consumer’s covered information on receipt of an opt-out request"
  • "Ensure you can respond to the consumer within 60 days"
  • "Obtain express, affirmative consent from customers before using, disclosing, or selling their personal information and allow customers to revoke consent at any time"
  • "Allow customers to opt-out of the use, disclosure, or sale of their non-personal information via written request"
  • "Provide clear, conspicuous, and non-deceptive notice to customers informing them of their rights and the provider’s obligations both at the point of sale and on the provider’s website"
  • "Democrat Frank Pallone of New Jersey, endorsed “comprehensive legislation"
  • "Oregon Democrat Ron Wyden got an early jump with a draft Consumer Data Protection Act that caught attention with high-level corporate disclosure requirements "
  • "Senator Schatz (joined by 15 Democrats) released a draft Data Care Act that would establish duties of “care, loyalty, and confidentiality” for online providers that handle personal data"
  • "Create a national Do Not Track system that lets consumers stop third-party companies from tracking them on the web by sharing data, selling data, or targeting advertisements based on their personal information"
  • "Give consumers a way to review what personal information a company has about them,learn with whom it has been shared or sold, and to challenge inaccuracies in it"
  • "To establish duties for online service providers with respect to end user data that such providers collect and use"
  • "During his opening statement, Rep. Frank Pallone (D-N.J.) called on Congress to pass legislation on Internet privacy, saying he fears nothing will happen after Mark Zuckerberg's testimony"