AI‑enabled transcription tools have quickly become a fixture in modern business meetings. Properly vetted and deployed, they offer efficiency and accessibility advantages. Yet, while many companies are still developing governance frameworks for authorized AI tools, an emerging risk has quietly surfaced: employees using unauthorized transcription tools without the company’s or participants’ consent.
This practice, often referred to as shadow AI, creates new layers of compliance, privacy, and legal exposure. For companies seeking to adopt AI responsibly, it is no longer enough to decide which tools to authorize; they must also manage the risk of employees using tools outside approved systems.
In a recent survey conducted by the National Cybersecurity Alliance, 43 percent of AI users admitted to sharing sensitive company information with AI tools without their employer’s knowledge.1 This statistic underscores that shadow AI use is not theoretical; it is already happening, often outside company systems and platforms and, therefore, out of view of legal and compliance teams.
Understanding the Risk Landscape
Once employees begin using transcription tools that the company has not vetted or authorized, problems can emerge quickly. Key risk areas include state law recording‑consent requirements, privilege and confidentiality obligations, governance controls, and record retention practices, all of which can lead to serious downstream consequences if not addressed through clear policy and oversight.
In “two‑party” or “all‑party” consent jurisdictions, every participant in a conversation must agree before the meeting is recorded or transcribed.2 Employees who enable transcription functions without consent may unknowingly violate state law. Some states impose criminal penalties; others allow civil lawsuits. In jurisdictions imposing civil liability, and where company policy does not expressly forbid unauthorized transcription, employers may face vicarious liability if employees act within the scope of their duties. In jurisdictions that recognize corporate criminal liability, an employer may be exposed if, for example, an employee surreptitiously records and transcribes a meeting with an executive of another company.
Unauthorized tools can also jeopardize confidentiality and privilege. When a company contracts directly with a vendor, it can negotiate data‑security and retention terms, deletion rights, confidentiality protections and control over how information is processed. Consumer or free transcription tools do not typically offer these safeguards. Meeting content may be uploaded into large, unbounded language models that learn from user data or store information indefinitely, creating potential waiver of attorney-client privilege, loss of trade‑secret protection, and violation of data‑privacy obligations. Once data enters such systems, the company cannot control its dissemination, access, or downstream use.
From a governance perspective, unauthorized transcription denies the organization the ability to decide strategically when and how meetings are recorded. Recording decisions should be made at the organizational level, not by individual employees, because the decisions determine what becomes part of the corporate record. Without that oversight, the company has no real visibility into what employees are recording or how the resulting transcripts are being managed. When employees record meetings on their own, the company loses the opportunity to review transcripts for accuracy, ensure proper context, and reconcile them with official notes or minutes.
These risks increase in litigation and regulatory matters. Data stored outside official retention and discovery channels can lead to production gaps or spoliation concerns, civil discovery sanctions, or even potential obstruction-of-justice charges if the production is to a government entity and the data was not properly maintained.3 Most consumer platforms lack defined retention periods or deletion controls, making it difficult for companies to ensure that unauthorized records comply with established data‑management policies. Investigators or opposing counsel may also compel transcripts directly from employees, bypassing corporate legal oversight and creating conflicting accounts that can undermine privilege and raise credibility questions.
In short, shadow AI undermines a core principle of good governance — the company must maintain control over what is recorded, how it is stored, and how long it is retained. Reestablishing that control is essential to protecting privilege, confidentiality, and compliance.4
Taking Control of Shadow AI
Effective management of shadow AI begins with a threshold question: should employees be permitted to use transcription tools at all, and if so, under what circumstances? That decision should be made as part of the company’s broader AI governance framework, led by legal, compliance, and IT‑security functions.
If the company concludes that transcription tools can provide value when properly managed, legal counsel should bring AI usage into the open. Begin by identifying what tools employees already rely on and why. In many cases, employees turn to shadow AI for efficiency, not to evade policy, and that insight should help shape a practical response. Engage directly with business teams to understand where current tools or support fall short and adjust approved solutions to meet those needs.
Once the landscape is understood, select secure, enterprise‑grade transcription tools that satisfy confidentiality, privilege, and record‑keeping requirements and align with the company’s operational needs and regulatory environment. Authorized vendors should offer clear data‑ownership terms, defined retention and deletion rights, and secure environments that prevent company information from being used to train AI models.
Company policies should specify when recordings are permitted and who approves them. Recording decisions should rest with designated personnel, not individual employees, and employees must understand that each recording is a corporate document subject to consent obligations and document‑hold requirements.
Employee engagement and training are essential. Counsel and compliance teams should make sure employees understand:
- The legal and reputational risks of unauthorized recording or transcription;
- State consent requirements and the potential penalties for non‑compliance;
- How privilege and confidentiality can be lost through unapproved tools; and
- When recording is permitted, who must authorize it, and how to manage resulting data.
Equally important is giving employees functional, approved tools that actually meet their needs. When authorized solutions are efficient and transparent, shadow AI use naturally declines.
If the company decides transcription should not be used, that position must be communicated and reinforced through training. Employees should understand the reasons: unauthorized recordings can violate the law, compromise confidentiality, and waive privilege. Technical controls can support enforcement by spotting or blocking prohibited applications, but consistent communication and visible support from leadership are usually more effective at sustaining compliance.
Looking ahead, companies should assume that some shadow AI activity exists. Strong governance depends on visibility and accountability — identifying unauthorized tools, limiting their use, and ensuring data from approved channels is properly managed. Integrating AI oversight into existing compliance and information‑governance programs helps organizations stay in control as technology and business practices continue to evolve.
- National Cybersecurity Alliance & CybSafe, Oh, Behave! The Annual Cybersecurity Attitudes and Behaviors Report 2025-2026, 93 (2026). ↩︎
- States requiring all‑party (or two‑party) consent before recording a conversation include California, Connecticut, Florida, Illinois, Massachusetts, Montana (knowledge rather than consent), New Hampshire, Oregon (in‑person only), Pennsylvania, and Washington. ↩︎
- See, e.g., In re Google Play Store Antitrust Litig., 664 F. Supp. 3d 981, 991-94 (N.D. Cal. 2023) (civil discovery sanctions to be imposed under FRCP 37(e)); 18 U.S.C. § 1519 (federal obstruction statute). ↩︎
- The DOJ and other federal agencies are increasingly evaluating companies’ use of AI and ability to maintain ephemeral data as part of their compliance assessments. See, e.g., Evaluation of Corporate Compliance Programs in Criminal Antitrust Investigations, U.S. DOJ Antitrust Division (Nov. 2024), https://www.justice.gov/d9/2024-11/DOJ%20Antitrust%20Division%20ECCP%20-%20November%202024%20Updates%20-%20FINAL.pdf. ↩︎