Articles | Oct 18, 2023

AI Comes With Evolving Data Security and Privacy Considerations for Businesses

Rochester Business Journal


AI should not be implemented before all relevant considerations have been addressed

Artificial intelligence (AI) has many appealing features that can assist businesses in day-to-day activities, but its implementation should not take place before businesses consider the laws, regulations, rules and risks that may apply. For instance, businesses may use AI to make decisions directed toward consumers or employment decisions.

What should businesses be aware of if they decide to implement AI?

The regulation of AI is in its early stages but beginning to take shape, and the EU and the U.S. will play key roles in AI regulation and governance. While AI has not yet been directly regulated to a significant degree, several existing laws cross over into the AI space. The following are some uses of AI and the related risks to consider, particularly employment law and data privacy risks.

Implementing AI into business decisions directed toward consumers

The Federal Trade Commission (FTC) announced in an article from May 2023 that the use of generative AI to persuade and influence consumers is currently a large area of its focus.1 This includes features such as AI-generated chatbots and customized advertisements targeting specific people or groups.2 One of the FTC’s main concerns is businesses using these features “in ways that, deliberately or not, steer people unfairly or deceptively into harmful decisions,” specifically in “areas such as finances, health, education, housing, and employment.”3 Furthermore, the FTC pays particular attention to “design elements that trick people into making harmful choices” in its investigations regarding potentially deceptive or unfair practices.4

The use of AI may implicate different data privacy laws, including the Health Insurance Portability and Accountability Act, the Fair Credit Reporting Act, the Uniform Trade Secrets Act and the Defend Trade Secrets Act, New York State’s SHIELD Act, the California Consumer Privacy Act (CCPA, as amended by the CPRA), the General Data Protection Regulation (GDPR), as well as biometric laws. Businesses should determine if these laws apply and consider them when developing compliance programs. Furthermore, certain laws may require, among other things, notice to the consumer before disclosing data to a third-party AI service provider, affirmative consent before collecting, processing or sharing data, and consumers’ rights to data deletion.5

Implementing AI into employment decisions

AI can now create job descriptions, screen resumes, conduct pre-employment assessments and even recognize facial expressions and nonverbal cues of candidates during video interviews. While these features may be helpful to employers, AI technology is still susceptible to biases and flaws, just like humans. Notably, under the Equal Employment Opportunity Commission’s (EEOC) guidelines and Title VII of the Civil Rights Act of 1964, employers may be responsible for “algorithmic decision-making tools even if the tools are designed or administered by another entity, such as a software vendor”6 if they lead to disparate impact discrimination, or discrimination based on race, sex, age or other protected categories.

Employers should consider conducting internal audits and, if a shift in interviewee demographics is found, investigating the cause. In some cases, employers are required to conduct such inquiries. For example, New York City’s Local Law 144 requires employers located in the city, or with candidates or employees who reside there, to carry out audits before using AI that substantially assists or replaces discretionary decision making in hiring or other employment decisions.7

The SEC’s recent breach notification rule

On July 26, 2023, the U.S. Securities and Exchange Commission (SEC) finalized its rule on Cybersecurity Risk Management, Strategy, Governance, and Incident Disclosure by Public Companies.8 The rule became effective on September 5, 2023, and aims to “standardize disclosures regarding cybersecurity risk management, strategy, governance, and material cybersecurity incidents.”9 The rule applies to public companies that are subject to the reporting requirements of the Securities Exchange Act of 1934.10 It sets forth disclosure requirements regarding a) a cybersecurity incident that is deemed to be material; b) the material aspects of the nature, scope and timing of the incident; and c) the material impact or reasonably likely material impact of the incident on the company.11

If the company finds an incident to be material, then the disclosure generally has to be filed within four business days of the determination.12 Registrants also have to disclose the processes for assessing, identifying and managing material risks from current and previous cybersecurity threats and incidents.13 The board of directors’ oversight and management’s role in the assessment of risks created by cybersecurity threats should be included in the disclosure.

Personal liability imposed for cybersecurity incidents

Individuals have increasingly been held personally responsible for their employers’ cybersecurity incidents. In 2023, Joseph Sullivan, the former chief security officer of Uber, was sentenced to three years of probation and fined $50,000 for his role in covering up a data breach at the company that involved 57 million users. Uber failed to disclose that incident to the FTC, which was then investigating the company in connection with a prior cyber incident.14

Earlier this year, the FTC finalized an order holding the chief executive officer of Drizly, an online alcoholic beverage delivery platform, personally liable for the company’s failure to implement appropriate information security practices15 and to “use reasonable information security practices to protect consumers’ personal information.”16 As a result, a malicious hacker was able “to access Drizly’s database and steal information relating to 2.5 million consumers.”17

In a time of developing AI and increased cybersecurity risks, businesses have to be diligent and proactive to ensure that all reasonable measures are taken to protect the business and individual personnel from liability.

Anna Mercado Clark, CIPP/E, CIPP/US, CIPM, FIP, is a partner at Phillips Lytle LLP and leader of the firm’s Data Privacy and Cybersecurity Practice Team. She can be reached at (716) 847-8400 ext. 6466.

Michael R. Staszkiw, CIPP/US, is an attorney at Phillips Lytle LLP and a member of the firm’s Data Privacy and Cybersecurity and Business Litigation Practice Teams. He can be reached at (585) 238-2044.

Adelyn G. Burns is an attorney at Phillips Lytle LLP and a member of the firm’s Business Litigation Practice Team. She can be reached at (716) 847-5425.

1  Michael Atleson, The Luring Test: AI and the engineering of consumer trust, Fed. Trade Comm’n (May 1, 2023),

2   Id.

3   Id.

4   Id.

5   Local Law No. 144 (2021) of City of New York § 5-300; Cal. Civ. Code § 1798.105 (Westlaw through ch. 1 of 2023-24 1st Ex. Sess. & urgency Legis. through ch. 785 of 2023 Reg. Sess.); 15 U.S.C. § 1601.

6   U.S. Equal Emp. Opportunity Comm’n, Select Issues: Assessing Adverse Impact in Software, Algorithms, and Artificial Intelligence Used in Employment Selection Procedures Under Title VII of the Civil Rights Act of 1964, (last visited Oct. 9, 2023).

7    Local Law No. 144 (2021) of City of New York § 5-300; Administrative Code of the City of New York §§ 20-870 – 20-874 (L. 2021/144 eff. Jan. 1, 2023).

8   Press Release, U.S. Sec. & Exch. Comm’n, SEC Adopts Rules on Cybersecurity Risk Management, Strategy, Governance, and Incident Disclosure by Public Companies (July 26, 2023)

9   U.S. Sec. & Exch. Comm’n, Fact Sheet Public Company Cybersecurity Disclosures; Final Rules, (last visited Oct. 9, 2023)

10   Id.

11   Id.

12   Id.

13   Id.

14   Press Release, U.S. Atty’s Off., N. Dist. of Cal., Former Chief Security Officer Of Uber Sentenced To Three Years’ Probation For Covering Up Data Breach Involving Millions Of Uber User Records (May 5, 2023),

15    In re Drizly, LLC, No. C-4780 (F.T.C. Jan. 9, 2023),

16   Id. at 3.

17   Id. at 1.
