The nightmare scenario corporate counsel has been warning about finally happened: a Fortune 500 company is now facing a Securities and Exchange Commission investigation after an AI transcription bot recorded a confidential board meeting containing material non-public information (MNPI).
According to reporting by The Wall Street Journal, the incident occurred when a newly hired executive assistant added a cloud-based AI meeting bot to a scheduled board call without understanding the sensitivity of the discussion. The bot—operated by a popular transcription service—recorded, transcribed, and stored conversations about an upcoming merger, pre-announcement earnings figures, and strategic restructuring plans.
Within 48 hours, the SEC's Division of Enforcement opened an inquiry into potential Regulation Fair Disclosure (Reg FD) violations and inadequate information security controls.
What Happened: A Timeline of the Breach
Here's how a routine administrative task turned into a compliance crisis:
Day 1: The Recording
- 9:00 AM: Executive assistant schedules quarterly board meeting via Zoom
- 9:15 AM: Assistant adds AI bot to "help with note-taking" (company policy unclear)
- 2:00 PM: Board meeting begins; bot announces "This meeting is being recorded"
- 2:02 PM: Participants assume recording is for internal compliance only
- 2:05-4:00 PM: Board discusses:
  - Pending $8.2 billion acquisition (not yet announced)
  - Q4 earnings coming in 23% above analyst estimates
  - Plans to divest underperforming division
  - CEO succession timeline
- 4:15 PM: Transcript automatically generated and stored on vendor's cloud servers
Day 2: The Discovery
- 10:00 AM: During a routine post-meeting review, General Counsel discovers the bot was present
- 10:30 AM: Legal team reviews vendor's privacy policy and terms of service
- 11:00 AM: Realizes recording contains MNPI, stored on vendor servers, accessible to vendor employees
- 11:45 AM: Emergency legal conference convened
- 2:00 PM: Company contacts vendor to request immediate deletion
- 4:30 PM: Vendor confirms deletion but cannot guarantee information wasn't accessed
Day 3: The Disclosure
- 9:00 AM: Board meeting to discuss disclosure obligations
- 2:00 PM: Company files 8-K with SEC disclosing potential information security incident
- 4:00 PM: SEC opens informal inquiry
Week 2: The Investigation
- SEC upgrades to formal investigation
- Subpoenas issued for:
  - All communications with transcription vendor
  - Information security policies and procedures
  - Trading records of employees with vendor access
  - Board meeting recordings and transcripts
- Company stock drops 8% on investigation news
The Legal Framework: Why This Is a Serious Violation
The SEC takes information security around MNPI extremely seriously. Multiple regulatory frameworks were potentially violated:
1. Regulation Fair Disclosure (Reg FD)
Under Regulation FD, companies must disclose material information to all investors simultaneously—not selectively to certain parties. When MNPI is recorded by a third-party vendor, that vendor and its employees become recipients of selective disclosure.
The regulation's carve-out covers only recipients who owe the company a duty of trust or confidence—such as its attorneys and accountants. A commercial transcription vendor typically owes no such duty, so disclosure to it can trigger the obligation to make prompt public disclosure.
2. Insider Trading Prohibitions
Anyone with access to MNPI who trades on that information—or tips others who trade—violates insider trading laws. When a cloud AI service records MNPI, every employee at that vendor with system access becomes a potential insider trading risk.
The SEC doesn't just prosecute the traders; it also pursues companies whose inadequate controls allowed the information to leak in the first place.
3. Information Security Obligations
The SEC's 2018 guidance on cybersecurity requires companies to maintain "disclosure controls and procedures" to ensure material information is properly protected. Using unsecured third-party cloud services for board communications can constitute a control failure.
4. Sarbanes-Oxley Internal Controls
Section 404 of Sarbanes-Oxley requires companies to maintain effective internal controls over financial reporting. Allowing MNPI about earnings to be recorded by unvetted third parties demonstrates control weaknesses.
What the Vendor's Privacy Policy Actually Says
When legal counsel reviewed the transcription vendor's terms of service, they discovered several concerning provisions buried in the fine print:
Data Retention
"We retain your content for as long as your account is active and for a reasonable period thereafter in case you decide to reactivate service. We may also retain certain information as required for legal or business purposes."
Translation: Your board meeting transcript sits on their servers indefinitely. Even after "deletion," backups may persist for undefined "business purposes."
Employee Access
"Our employees may access your content to provide customer support, improve our services, train our AI models, and ensure platform security."
Translation: Customer support representatives, engineers, data scientists, and security personnel can all access your MNPI. The company has no idea how many people actually viewed the transcript.
AI Training Rights
"By using our service, you grant us a worldwide, royalty-free license to use your content to develop, train, and improve our artificial intelligence and machine learning models."
Translation: Your confidential merger discussions are now training data for their AI. That information could theoretically be surfaced in responses to other users.
Third-Party Sharing
"We may share information with service providers, business partners, and affiliates who assist us in operating our platform."
Translation: Your MNPI isn't just with one vendor—it's with their entire ecosystem of subcontractors and partners.
As Wired reported in their investigation of AI meeting tools, most users never read these policies and have no idea how broadly their information is shared.
The Regulatory Consequences
The company now faces multiple regulatory exposures:
SEC Enforcement Actions
- Civil penalties: Up to $1 million per violation for corporate entities
- Cease-and-desist orders: Formal SEC order requiring remediation
- Disgorgement: If any trading occurred, profits must be returned
- Officer and director bars: Individuals could be barred from serving as executives
Shareholder Litigation
- Securities fraud class actions claiming inadequate disclosure controls
- Derivative suits against board members for breach of fiduciary duty
- Claims that share price declined due to disclosure of investigation
Reputational Damage
- Public disclosure of information security failures
- Loss of investor confidence in management
- Difficulty recruiting board members concerned about liability
- Increased scrutiny on all future SEC filings
Operational Costs
- Legal fees for SEC investigation response (easily $5-10 million)
- Forensic investigation costs to determine full extent of exposure
- Implementation of new information security controls
- Board and management time diverted to compliance
Why This Keeps Happening: The Systemic Problem
This incident isn't isolated. It's symptomatic of a broader failure in how companies approach AI tools:
1. Decentralized Adoption
AI meeting bots are typically adopted at the individual or team level—not through formal IT procurement. An executive assistant, operations manager, or team lead simply adds a bot to meetings without legal or compliance review.
Unlike enterprise software deployments that go through security review, these tools proliferate invisibly across the organization.
2. Inadequate Policies
Most companies lack clear policies about AI transcription tools. Employees don't know:
- Which tools are approved for which types of meetings
- What information can and cannot be recorded
- Who is responsible for data security
- What consent is required from participants
3. False Sense of Security
Because these tools come from well-known vendors with professional websites and marketing materials, users assume they're secure and compliant. In reality, most prioritize features and ease of use over rigorous data protection.
4. Disconnect Between IT and Legal
IT departments focus on technical security (encryption in transit, access controls). Legal departments focus on regulatory compliance (MNPI protection, attorney-client privilege). Neither fully owns the AI tool problem, so it falls through the cracks.
The Only Safe Solution: On-Device Processing
This entire crisis could have been avoided with on-device AI transcription. Here's why:
Zero Third-Party Access
When AI processing happens entirely on the user's device—as it does with Basil AI—no third party ever touches the content. The transcription vendor doesn't have servers storing your MNPI, because there are no servers involved.
No vendor access means no Reg FD violation, no insider trading risk, and no third-party data breach exposure.
Complete Control
With on-device processing, the company retains 100% control over the information:
- Recordings never leave the device
- Transcripts are stored only where the user chooses (local files, corporate iCloud)
- Deletion is immediate and permanent—no vendor backups persist
- No terms of service grant rights to the content
Compliance by Design
On-device AI makes compliance simple:
- Reg FD: No selective disclosure because no third party receives information
- Insider Trading: Only authorized insiders have access (as intended)
- SOX 404: Strong internal controls because information stays internal
- Information Security: No cloud attack surface to secure
How Basil AI Protects Board Meetings
Basil AI was designed specifically for sensitive conversations:
- 100% On-Device Processing: Uses Apple's Speech framework, which runs entirely on the device using the Neural Engine
- No Cloud Upload: Audio and transcripts never leave your iPhone, iPad, or Mac
- No Account Required: No vendor has a database with your information
- Apple Notes Integration: Transcripts save to your corporate iCloud (under your organization's control) or stay local
- Instant Deletion: When you delete a recording, it's gone—no vendor backups to worry about
- 8-Hour Recording: Captures full board meetings, strategy sessions, or all-day workshops
- Offline Capable: Works without internet connection, perfect for secure facilities
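For readers who want to verify the underlying mechanism: Apple's public Speech framework lets an app require that recognition happen on-device, so audio is never routed to a server. The sketch below is illustrative only—it is not Basil AI's actual source code—and simply shows the API that enforces this guarantee.

```swift
import Speech

// Illustrative sketch (not Basil AI's implementation): build a recognition
// request that is guaranteed to run on-device, or refuse to run at all.
func makeOnDeviceRequest(for audioFile: URL) -> SFSpeechURLRecognitionRequest? {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.supportsOnDeviceRecognition else {
        // On-device models unavailable on this device/locale.
        // A privacy-first app refuses here rather than falling back to the cloud.
        return nil
    }
    let request = SFSpeechURLRecognitionRequest(url: audioFile)
    // The key line: with this flag set, recognition fails outright
    // instead of ever sending audio to Apple's servers.
    request.requiresOnDeviceRecognition = true
    return request
}
```

The design point is the guard clause: an app built this way has no code path that uploads audio, which is what makes the "no servers involved" claim verifiable rather than a vendor promise.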
For a detailed explanation of how on-device AI works, see our article on protecting intellectual property with local processing.
What Corporate Counsel Should Do Immediately
If your company uses cloud-based AI transcription services for sensitive meetings, take these steps now:
1. Conduct an Audit
- Survey which AI meeting tools are being used across the organization
- Identify who has added bots to meetings containing MNPI or confidential information
- Review vendor terms of service and privacy policies
- Assess potential regulatory exposure
2. Implement Clear Policies
- Prohibit cloud AI tools for board meetings, executive sessions, and MNPI discussions
- Require legal/compliance approval for any AI transcription tool adoption
- Mandate on-device solutions for sensitive conversations
- Train employees on information security obligations
3. Deploy Compliant Alternatives
- Provide on-device AI tools like Basil AI for approved use cases
- Ensure solutions meet your security and compliance requirements
- Make compliant tools as easy to use as consumer options
4. Update Disclosure Controls
- Revise disclosure controls and procedures to address AI tools
- Include AI transcription in information security risk assessments
- Brief board on information security implications of recording technologies
The Broader Implications for Corporate Governance
This case represents a turning point for corporate governance in the AI era. As the SEC has increasingly emphasized, companies must adapt their controls to new technologies—not assume that traditional security measures suffice.
Board members should be asking management:
- What AI tools are being used to record or transcribe sensitive meetings?
- Who has access to information captured by these tools?
- What controls exist to prevent unauthorized disclosure of MNPI?
- Are we compliant with SEC guidance on information security?
- What alternatives exist that don't create third-party exposure?
Directors who fail to ask these questions may find themselves personally liable when the inevitable breach occurs.
Conclusion: Prevention Is Cheaper Than Investigation
The company in this case will spend millions in legal fees, face potential penalties, endure shareholder litigation, and suffer reputational damage—all because someone added an AI bot to a meeting without understanding the consequences.
The irony is that preventing this crisis would have cost virtually nothing. On-device AI solutions like Basil AI are:
- More secure than cloud alternatives (no third-party access)
- More compliant with securities regulations (no selective disclosure)
- Less expensive than even one day of SEC investigation response
- Easier to deploy than explaining to the board why you didn't use them
The choice is simple: invest a few dollars per user in on-device AI, or risk millions in regulatory penalties, litigation costs, and reputational damage.
For companies serious about protecting MNPI while maintaining productivity, the path forward is clear.
Protect Your Board Meetings with On-Device AI
Basil AI provides the transcription capabilities your team needs with the security your legal counsel requires. 100% on-device processing. Zero third-party access. Complete regulatory compliance.
Download Basil AI - Free
iPhone, iPad, Mac, and Apple Watch • No account required • Works offline