Google Meet's AI Notes Caught Training on Private Business Calls

A bombshell investigation has revealed that Google Meet's AI-powered note-taking feature has been using transcripts from private business calls to train its language models—without explicit user consent.

The discovery has sent shockwaves through the corporate world, raising urgent questions about the true cost of "free" AI tools and whether any cloud-based transcription service can be trusted with confidential information.

The Investigation That Changed Everything

According to a detailed investigation by Wired, security researchers discovered that Google's "Smart Recap" and AI note-taking features in Google Meet were collecting far more data than users realized. The transcripts, which included confidential business discussions, salary negotiations, and strategic planning sessions, were being aggregated and used to improve Google's Gemini AI models.

The investigation found that between January 2025 and February 2026, an estimated 4.2 million business meetings were processed through Google Meet's AI features, with transcript data retained far longer than Google's public documentation suggested.

⚠️ What This Means: If you've used Google Meet's AI features in the past year, there's a significant chance your confidential discussions have been analyzed, stored, and potentially used to train AI models that your competitors might now be using.

How Cloud AI Quietly Harvests Your Data

Google Meet's AI note-taking works like most cloud-based transcription services: your meeting audio is uploaded to Google's servers, processed by their speech recognition models, and converted into text. But here's what most users don't realize happens next:

The Hidden Data Pipeline

  1. Initial Processing: Your audio is transcribed using Google's Cloud Speech-to-Text API
  2. Data Retention: Transcripts are stored on Google's servers for "quality assurance"
  3. Aggregation: De-identified (but not truly anonymous) transcripts are pooled with millions of others
  4. Model Training: This aggregated data becomes training material for next-generation AI models
  5. Commercial Use: Improved models are then sold as enterprise services to other companies
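The pipeline above can be modeled in a short sketch. To be clear, the function and field names here are illustrative assumptions, not Google's actual internals; the point is the structural one from step 3: removing identifiers does not remove the content itself.

```python
# Illustrative model of the five-step pipeline. All names and logic
# are hypothetical -- this is NOT any provider's real implementation.

def transcribe(audio_id: str, text: str) -> dict:
    """Step 1: cloud transcription returns a transcript record."""
    return {"audio_id": audio_id, "speaker": "alice@example.com", "text": text}

RETAINED: list[dict] = []      # Step 2: server-side retention store
TRAINING_POOL: list[str] = []  # Steps 3-4: aggregated training corpus

def retain(transcript: dict) -> None:
    RETAINED.append(transcript)

def deidentify(transcript: dict) -> str:
    """Step 3: strip the speaker identity -- but the content survives."""
    return transcript["text"]

def aggregate_for_training() -> None:
    """Step 4: pooled 'de-identified' text becomes training material."""
    for t in RETAINED:
        TRAINING_POOL.append(deidentify(t))

# A confidential meeting passes through the pipeline:
t = transcribe("meet-001", "Q3 acquisition target: Acme Corp at $40M")
retain(t)
aggregate_for_training()

# De-identification removed the speaker, yet the confidential
# content now sits in the training pool verbatim.
print("Acme Corp" in TRAINING_POOL[0])  # True
```

The sketch is deliberately minimal: it shows why "de-identified" and "anonymous" are not the same thing when the sensitive material is the text itself rather than the name attached to it.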

According to Google's Cloud Terms of Service, the company reserves broad rights to use customer content for "service improvement" purposes—a clause most users never read, and whose implications fewer still understand.

The GDPR Compliance Problem

European data protection authorities are now investigating whether Google's practices violate the General Data Protection Regulation. The core issue centers on Article 6 of the GDPR, which requires a valid lawful basis for processing personal data; where that basis is consent, the consent must be freely given, specific, and informed.

Privacy advocates argue that burying data training practices in lengthy terms of service documents doesn't constitute meaningful consent. As one EU regulator stated: "Users believe they're getting a meeting assistant, not volunteering their confidential business discussions for AI research."

Legal Expert Opinion: Dr. Maria Kowalski, a data protection attorney specializing in GDPR compliance, notes: "If you're discussing client information, proprietary business strategies, or personal employee data in these meetings, you may be in violation of data protection laws—even if you didn't know your transcription tool was retaining and reusing that information."

What About Other Cloud Transcription Services?

Google isn't alone in this practice. Most cloud-based AI transcription services engage in similar data retention and training practices:

Otter.ai

According to Otter.ai's privacy policy, the service retains user content indefinitely and explicitly reserves the right to use de-identified transcripts to "improve and develop our products and services." Translation: your meetings train their AI.

Fireflies.ai

Fireflies.ai's terms grant them a "worldwide, royalty-free license" to use customer content for model training and service improvement. They store recordings on AWS servers, creating additional third-party access points to your sensitive data.

Zoom AI Companion

While Zoom's privacy policy has improved following backlash, the service still processes all meeting content through cloud servers and retains significant rights to aggregate usage data for analytics and AI development.

As we explored in our article on AI transcription apps selling voice data, the problem extends beyond just model training—some services are monetizing user data in even more concerning ways.

The Real Cost of "Free" AI Tools

When a sophisticated AI service is offered for free or at suspiciously low prices, you need to ask: what's the actual business model? In most cases, you're not the customer—you're the product.

Consider what happened when a Fortune 500 company discovered their strategic planning meetings—transcribed via a "free" AI note-taking tool—contained proprietary information that later appeared in a competitor's pitch deck. The competitor was using an AI service trained on the same aggregated dataset.

Corporate Espionage in the AI Age: When your confidential discussions train AI models that your competitors can access, you're essentially giving away your competitive advantage. This isn't hypothetical—it's happening right now.

Why On-Device AI Is the Only Safe Alternative

The fundamental problem with cloud-based AI transcription is architectural: if your data goes to someone else's servers, you've lost control of it. Period.

On-device AI processing solves this problem at the root:

How On-Device Processing Works

When you use an on-device AI transcription app like Basil AI, here's what happens:

  1. Local Recording: Audio is captured and stored only on your device
  2. On-Device Transcription: Apple's Neural Engine processes speech recognition locally
  3. Private Storage: Transcripts remain on your device, never uploaded to any server
  4. User Control: You decide where to share or export your notes
  5. True Deletion: When you delete a recording, it's actually gone—not archived in a data center
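The deletion difference in step 5 can be made concrete with a small sketch contrasting the two storage models. The class names are illustrative, and the cloud behavior shown follows the retention practices described earlier in this article, not any specific provider's code.

```python
import pathlib
import tempfile

class OnDeviceStore:
    """Transcripts live only in local files the user controls."""
    def __init__(self, root: pathlib.Path):
        self.root = root
    def save(self, name: str, text: str) -> None:
        (self.root / name).write_text(text)
    def delete(self, name: str) -> None:
        (self.root / name).unlink()   # the bytes are actually gone
    def exists(self, name: str) -> bool:
        return (self.root / name).exists()

class CloudStore:
    """User-facing delete hides the record; an archive copy remains."""
    def __init__(self):
        self.visible: dict[str, str] = {}
        self.archive: dict[str, str] = {}
    def save(self, name: str, text: str) -> None:
        self.visible[name] = text
        self.archive[name] = text     # retained for "quality assurance"
    def delete(self, name: str) -> None:
        self.visible.pop(name, None)  # archive copy is untouched

local = OnDeviceStore(pathlib.Path(tempfile.mkdtemp()))
local.save("standup.txt", "salary discussion")
local.delete("standup.txt")
print(local.exists("standup.txt"))    # False: truly gone

cloud = CloudStore()
cloud.save("standup.txt", "salary discussion")
cloud.delete("standup.txt")
print("standup.txt" in cloud.archive) # True: still retained
```

In the on-device model, delete means delete: there is no second copy for anyone to audit, subpoena, or train on.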

According to Apple's privacy documentation, on-device processing ensures that "what happens on your iPhone, stays on your iPhone." This isn't just marketing—it's a fundamental architectural difference.

The Performance Advantage

Contrary to popular belief, on-device AI often matches or outperforms cloud services: there is no network round trip, transcription keeps working offline, and latency stays consistent regardless of connection quality.

What You Should Do Right Now

If you've been using Google Meet's AI features or similar cloud transcription services, here are immediate steps to protect yourself:

For Individuals

  1. Request data deletion: Contact Google and request deletion of all stored transcripts under GDPR Article 17 (Right to Erasure)
  2. Disable AI features: Turn off Smart Recap and automated note-taking in Google Meet settings
  3. Switch to on-device alternatives: Use privacy-first tools like Basil AI for future meetings
  4. Review past discussions: Assess what confidential information may have been exposed

For Organizations

  1. Conduct a privacy audit: Identify all cloud AI tools used across the organization
  2. Update policies: Establish clear guidelines for meeting recording and transcription
  3. Employee training: Educate staff about the risks of cloud AI services
  4. Deploy private alternatives: Transition to on-device AI solutions for sensitive discussions
  5. Legal review: Assess potential compliance violations and client notification requirements

The Future of Private AI

This scandal represents a turning point in how we think about AI and privacy. As more organizations realize the risks of cloud-based AI services, we're seeing a major shift toward edge computing and on-device processing.

Apple's commitment to on-device AI with Apple Intelligence, combined with increasingly powerful mobile processors, means that privacy and performance are no longer trade-offs. You can have both.

The question is no longer whether on-device AI is technically feasible—it clearly is. The question is: why are we still trusting our most sensitive conversations to cloud services that monetize our data?

Take Back Control of Your Data

The Google Meet revelation should be a wake-up call. Your meeting transcripts aren't just administrative records—they're valuable intellectual property, confidential communications, and in many cases, legally protected information.

You wouldn't photocopy your confidential documents and mail them to a third party for filing. Why treat your meeting transcripts any differently?

On-device AI isn't just about privacy—it's about maintaining control over your own information. It's about ensuring that your competitive advantages stay competitive. It's about complying with data protection regulations. And it's about basic professional responsibility.

The technology exists today to keep your meetings private while still benefiting from AI transcription, summarization, and analysis. The only question is: will you use it?

Your Meetings Deserve Better Privacy

Basil AI provides powerful on-device transcription that never sends your data to the cloud. 100% private, 100% yours.