Perplexity AI Sued for Sharing User Data with Meta and Google: What It Means for CRE Investors

What is the Perplexity AI data sharing lawsuit, and why should CRE investors care? The Perplexity AI data sharing controversy refers to a class-action lawsuit filed on April 1, 2026, in federal court in San Francisco alleging that Perplexity AI secretly shared users' private search conversations with Meta and Google through hidden tracking software. For commercial real estate professionals who routinely input sensitive deal data, financial projections, and tenant information into AI search tools, this lawsuit is a direct warning about the risks of trusting AI platforms with confidential business intelligence. For a comprehensive overview of AI platforms available to real estate investors, see our complete guide on AI tools for real estate investors.

Key Takeaways

  • A federal class-action lawsuit accuses Perplexity AI of secretly sharing user conversations with Meta and Google via hidden trackers embedded at login.
  • CRE professionals who input deal financials, cap rates, and tenant data into AI search tools face potential exposure to third-party advertising networks.
  • Even Perplexity's "Incognito" mode allegedly failed to protect user privacy, transmitting data through undetectable tracking software.
  • Real estate investors should audit every AI tool in their workflow for data sharing policies, tracker presence, and SOC 2 compliance before sharing sensitive deal information.
  • This lawsuit could reshape how AI companies handle enterprise data, potentially accelerating adoption of self-hosted and on-premise AI solutions across CRE.

The Perplexity AI Lawsuit: What Happened

On April 1, 2026, a Utah man filed a proposed class-action complaint (Doe v. Perplexity AI Inc., 3:26-cv-02803) in the U.S. District Court for the Northern District of California. According to the complaint, as soon as users log into Perplexity's home page, hidden trackers download onto their devices. These trackers allegedly give Meta and Google full access to conversations between users and Perplexity's AI search engine, including the ability to "exploit this sensitive data for their own benefit, including targeting individuals with advertising and reselling their sensitive data to additional third parties."

The lead plaintiff shared family financial information, tax obligations, investment portfolios, and financial strategies with Perplexity's chatbot. That is exactly the type of confidential data CRE professionals input daily. The lawsuit names Perplexity, Meta, and Google as defendants, accusing all three of violating federal and state computer privacy and fraud laws. As first reported by Bloomberg, the case seeks class certification for all affected users.

Why CRE Investors Should Pay Attention

Commercial real estate professionals are among the heaviest users of AI search tools. A typical CRE workflow involves querying AI platforms with highly sensitive information, including property addresses, asking prices, NOI figures, cap rate assumptions, DSCR calculations, tenant rent rolls, and acquisition pro formas. If an AI platform is silently forwarding that data to advertising networks, the competitive implications are severe.

Consider these scenarios that could directly impact CRE investors:

  • Deal intelligence leakage: If you search Perplexity for "cap rate trends in Dallas industrial 2026" alongside specific property addresses, that query data could theoretically be used by competitors or advertising platforms to identify your acquisition targets.
  • Financial exposure: Inputting NOI projections, IRR assumptions, or debt terms into an AI tool that shares data with third parties could expose your underwriting methodology to anyone with access to the advertising data.
  • Tenant privacy violations: Property managers who use AI search tools to research tenant financials, lease terms, or eviction procedures could inadvertently expose tenant personal information to Meta and Google's advertising ecosystems.
  • Regulatory risk: With California, Colorado, and multiple states enforcing stricter data privacy laws in 2026, CRE firms that allow employees to use AI tools with hidden data sharing could face their own compliance violations.

For a deeper look at how AI platforms handle your data, see our guide on AI model security and data privacy for CRE investors.

The "Incognito" Mode Problem

Perhaps the most alarming allegation in the lawsuit is that Perplexity's "Incognito" mode, which users reasonably expect to provide enhanced privacy, allegedly did not prevent data sharing. The complaint states that users' personal data was shared "even when they sign up for Perplexity's Incognito mode," with "undetectable" tracking software embedded in the search engine's code automatically transmitting conversations to Meta, Google, and other third parties.

This matters for CRE professionals because many investors assume that using privacy modes or enterprise tiers provides adequate protection. The lawsuit suggests that surface-level privacy features may not address the underlying tracking architecture. CRE firms using any AI tool in their deal pipeline should verify privacy protections at the technical level, not just the marketing level.
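One practical way to verify at the technical level is to export a HAR (HTTP Archive) capture from your browser's developer tools while using an AI tool, then scan it for requests to known advertising and analytics domains. The sketch below is illustrative, not a definitive audit: the tracker domain list is a small assumed sample you would extend, and `sample_har` stands in for a real browser export.

```python
import json
from urllib.parse import urlparse

# Illustrative sample of common ad/analytics domains; extend with your own blocklist.
TRACKER_DOMAINS = {
    "facebook.com",
    "google-analytics.com",
    "doubleclick.net",
}

def find_tracker_requests(har: dict) -> list[str]:
    """Return URLs in a HAR capture whose host matches a known tracker domain."""
    hits = []
    for entry in har.get("log", {}).get("entries", []):
        url = entry.get("request", {}).get("url", "")
        host = urlparse(url).hostname or ""
        # Match the domain itself or any subdomain (e.g. www.google-analytics.com).
        if any(host == d or host.endswith("." + d) for d in TRACKER_DOMAINS):
            hits.append(url)
    return hits

# Minimal inline HAR fragment standing in for a real devtools export.
sample_har = {
    "log": {
        "entries": [
            {"request": {"url": "https://www.google-analytics.com/collect?v=1"}},
            {"request": {"url": "https://example.com/api/search"}},
        ]
    }
}

print(find_tracker_requests(sample_har))  # flags only the analytics request
```

In practice you would load the exported file with `json.load` and run the same scan; any hit while a "privacy" mode is active is a signal to escalate to your IT or legal team.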

How to Audit Your CRE AI Workflow for Data Privacy

In light of this lawsuit, every CRE investor and property manager should conduct an immediate audit of the AI tools in their workflow. Here is a practical checklist:

  • Review data sharing policies: Read the privacy policy and terms of service for every AI tool you use. Look specifically for language about "analytics partners," "advertising," or "third-party data sharing." If the policy allows sharing with ad networks, treat the tool as non-confidential.
  • Check for SOC 2 Type II compliance: Enterprise-grade AI tools like Claude, ChatGPT Enterprise, and Perplexity Enterprise Pro offer SOC 2 certifications. Verify that your subscription tier includes these protections, as free and consumer tiers typically do not.
  • Use network monitoring: IT teams can use browser developer tools or network monitoring software to detect outbound tracker requests to domains like facebook.com, google-analytics.com, or doubleclick.net when using AI tools.
  • Segment sensitive workflows: Never input specific deal financials, property addresses, tenant names, or acquisition targets into AI tools that lack enterprise-grade data isolation. Use a tiered approach: general research on consumer AI, sensitive analysis on enterprise-grade or self-hosted platforms only.
  • Consider on-premise alternatives: Open-source models like Llama 4, DeepSeek V4, and Mistral can be deployed on local hardware, ensuring that no data ever leaves your network. The cost of a local inference setup has dropped below $10,000 for capable hardware in 2026.
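The "segment sensitive workflows" step above can be partially automated with a pre-send filter that flags deal-sensitive content before a prompt leaves for a consumer-tier tool. This is a minimal sketch under stated assumptions: the regex patterns are illustrative examples a firm would tune to its own data, not a complete data-loss-prevention solution.

```python
import re

# Illustrative patterns for deal-sensitive content; tune for your firm's data.
SENSITIVE_PATTERNS = {
    "dollar amount": re.compile(r"\$\s?\d[\d,]*(?:\.\d+)?\s?(?:million|M|k)?", re.I),
    "cap rate": re.compile(r"\b\d{1,2}(?:\.\d+)?\s?%\s?cap\b|\bcap rate\b", re.I),
    "street address": re.compile(
        r"\b\d{2,5}\s+\w+(?:\s\w+)*\s(?:St|Ave|Blvd|Rd|Dr|Ln)\b", re.I
    ),
}

def flag_sensitive(prompt: str) -> list[str]:
    """Return the categories of sensitive content detected in an outgoing prompt."""
    return [name for name, pat in SENSITIVE_PATTERNS.items() if pat.search(prompt)]

# A query mixing an address, a dollar figure, and cap rate language trips all three.
print(flag_sensitive(
    "What cap rate should I expect on 4512 Main St at $2.3 million NOI?"
))
```

A filter like this can sit in a browser extension or internal chat proxy: prompts with no flags pass to consumer tools, while flagged prompts are routed to an enterprise-grade or self-hosted platform.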

For more on managing uncontrolled AI tools in your organization, see our article on shadow AI agents flooding enterprises.

The Broader AI Data Privacy Landscape in 2026

This lawsuit arrives at a critical moment for AI data privacy. Governor Gavin Newsom signed Executive Order N-5-26 on March 30, 2026, requiring AI companies to certify safeguards against bias and misuse to win California state contracts. The EU AI Act's high-risk obligations are entering enforcement. And the Perplexity lawsuit joins a pattern of AI security incidents, including the LiteLLM supply chain attack that compromised 97 million downloads in March 2026.

According to Deloitte's State of AI in the Enterprise 2026 report, 78% of organizations now use AI, up from 55% in 2023. Yet only 5% report achieving most of their AI program goals, and data privacy remains the top barrier to enterprise AI adoption. For CRE firms, the calculus is simple: the productivity gains from AI search tools must be weighed against the risk of exposing deal-critical intelligence to third parties.

If you are evaluating which AI platforms provide the strongest data protections for your real estate business, The AI Consulting Network specializes in exactly this type of assessment. CRE investors looking for hands-on AI implementation support can reach out to Avi Hacker, J.D. at The AI Consulting Network for a personalized AI workflow audit.

What Comes Next: Legal and Market Implications

The Doe v. Perplexity case is in its earliest stages. Perplexity has stated it has "not been served any lawsuit that matches this description." Meta pointed to policies prohibiting advertisers from sending sensitive information. Google did not immediately respond. However, the case joins Perplexity's growing legal challenges, including Amazon's lawsuit over unauthorized account access, where a federal judge has already temporarily blocked Perplexity from accessing Amazon via its Comet browser.

For CRE investors, the key question is not whether Perplexity is found liable, but whether your AI workflow is resilient to the possibility that any AI tool could be sharing data in ways its marketing materials do not disclose. The practical response is to assume that consumer-tier AI tools may share data and to build your sensitive CRE workflows exclusively on enterprise-grade platforms with contractual data isolation guarantees.

For personalized guidance on building a privacy-first AI strategy for your CRE portfolio, connect with The AI Consulting Network.

Frequently Asked Questions

Q: What exactly is Perplexity AI accused of doing with user data?

A: The class-action lawsuit alleges that Perplexity embedded hidden tracking software that silently shared users' search conversations with Meta and Google, enabling those companies to use the data for advertising targeting and resale to third parties, even when users activated Perplexity's Incognito mode.

Q: Should CRE investors stop using Perplexity AI immediately?

A: Rather than abandoning the tool entirely, CRE investors should immediately stop inputting sensitive deal data (property addresses, NOI figures, acquisition terms) into any consumer-tier AI tool until the allegations are resolved. Continue using Perplexity for general market research, but move all confidential analysis to enterprise-grade platforms with SOC 2 Type II compliance and contractual data isolation.

Q: Which AI tools are safest for sensitive CRE data in 2026?

A: Enterprise tiers of Claude (Anthropic), ChatGPT Enterprise (OpenAI), and Gemini Advanced (Google) all offer SOC 2 Type II compliance and contractual commitments not to train on user data. Self-hosted open-source models like Llama 4 and DeepSeek V4 provide the highest level of data isolation since no data leaves your network.

Q: How does this lawsuit affect CRE firms that already use Perplexity for research?

A: CRE firms should conduct an immediate review of what data employees have shared with Perplexity, assess whether any sensitive deal information may have been exposed, and implement clear policies about which AI tools are approved for which types of queries. The cost of this audit is minimal compared to the competitive risk of deal intelligence leakage.

Q: Could this lawsuit lead to broader AI data privacy regulations affecting CRE?

A: Yes. Combined with California's Executive Order N-5-26 and the EU AI Act, this lawsuit adds momentum to a regulatory trend that will likely require all AI vendors to disclose data sharing practices transparently. CRE firms that build compliant AI workflows now will have a competitive advantage as these regulations take effect.