Amid the Web3 revolution and the growing movement toward sovereign, privacy-respecting AI, our Perplexity Privacy Review 2025 delivers a thorough, independent assessment of how Perplexity approaches data protection, transparency, and user control. Drawing on an exclusive evaluation framework and an in-depth audit of publicly available information, this review examines everything from prompt storage and AI training practices to hosting sovereignty and Big Tech dependencies. It reflects our commitment to a future where digital privacy is treated not as a feature, but as a fundamental right.
The scoring system follows a comprehensive guide created specifically for this project, accessible here, and is designed to adapt dynamically as innovations emerge and feedback comes from the decentralized community.
Our mission is clear: to enlighten and inform, without filter or influence, so that together we can build a fairer, more transparent AI ecosystem.
Updated: 2025-08-11
Key Insights from the Perplexity Privacy Review 2025
Model
Perplexity Engine (core AI), Perplexity Pro (paid subscription, with Enterprise tiers that offer stricter data-retention controls), and Comet Browser (a standalone, privacy-focused browser that nonetheless relies on ad-driven data collection and has its own privacy notice).
Data Collection
Prompts stored: Indefinite storage of prompts and interactions without immediate deletion guarantees undermines user control. Perplexity should implement a clear 90-day retention limit for non-essential data and offer instant deletion options via a user dashboard. C
Use for training: Prompts are used for AI training by default (opt-out only), with no guaranteed anonymization, which puts user privacy at risk. Perplexity must default to opt-in for training use and publish a detailed anonymization protocol to ensure data is untraceable. C
Account required: Full functionality requires a Perplexity account, which collects personal information such as email address and name, limiting accessibility for privacy-conscious users. Allow full Comet Browser functionality without an account, or reduce mandatory data to an email address alone. B
Data retention duration: Consumer data is retained as long as “reasonably necessary” for business purposes, with no strict short-term cap. Deletion is possible upon request for most users, but not automatic. Enterprise files are deleted after 7 days by default. C
User Control
Deletion possible: Data deletion within 30 days is too slow and requires manual requests. Perplexity should offer a one-click deletion tool in the user interface with a 7-day processing guarantee. B
Export possible: Manual data export processes are cumbersome and user-unfriendly. Develop an automated export tool in the account settings to provide downloadable data in standard formats like CSV or JSON. C
Granularity control: Limited control over specific data types, like browsing history, restricts user autonomy. Introduce detailed settings to let users toggle data collection for individual features, such as search history or ad tracking. B
Explicit user consent: Relying on “legitimate interest” for non-sensitive data processing sidesteps user choice. Require explicit, granular consent for all data processing activities, aligning with GDPR-like standards. C
Transparency
Clear policy: The detailed privacy policy risks being inaccessible to non-technical users due to legal jargon. Simplify the policy with plain language summaries and multilingual translations to reach a broader audience. B
Change notification: Relying on users to check for minor policy updates is inconvenient and opaque. Implement proactive notifications via email and in-app alerts for all policy changes, with a 30-day advance notice for material updates. B
Model documentation: No public technical documentation on AI models erodes trust. Publish high-level model cards outlining training data sources and architecture without compromising proprietary details. C
Privacy by Design
Encryption (core & advanced): Claiming standard encryption without advanced privacy tech like homomorphic encryption or zero-knowledge proofs is insufficient. Adopt and document at least one advanced privacy-enhancing technology, such as secure multi-party computation, within 12 months. C
Privacy-Enhancing Technologies: Perplexity relies on standard encryption without advanced privacy-enhancing technologies like zero-knowledge proofs, weakening its privacy framework. Adopt a PET, such as homomorphic encryption, within 12 months and publish a clear explanation of its implementation to enhance user trust. C
Auditability & Certification: Lack of third-party audits undermines credibility. Perplexity should commit to annual SOC 2 or ISO 27001 audits and publish summary reports to validate privacy practices. D
Transparency & Technical Documentation: Policy transparency exists, but technical details are absent. Share a public technical overview of data flows and encryption standards to demonstrate privacy commitment. C
User-Configurable Privacy Features: Moderate controls for cookies and tracking are a start, but gaps remain for other data types. Add comprehensive privacy settings, like disabling specific data collection categories (e.g., location, search queries), in a centralized dashboard. B
Hosting & Sovereignty
Sovereignty: Exclusive US hosting with cross-border transfers conflicts with global privacy laws like GDPR. Offer EU and APAC data residency options by 2026 to comply with regional regulations and enhance user trust. D
Legal jurisdiction: California-based jurisdiction and Data Privacy Framework oversight may concern international users. Clarify DPF protections and offer contractual safeguards for non-US users to address jurisdictional risks. D
Local option: No offline or local deployment options limit privacy-focused use cases. Develop an offline mode for Comet Browser or local processing for enterprise users within 18 months. D
Big Tech dependency: Reliance on OpenAI and Anthropic introduces third-party privacy risks. Publish a risk mitigation plan, including data isolation guarantees, and explore in-house model development to reduce dependency. C
Open Source
Publicly available model: The fully proprietary model blocks transparency and community trust. Release a limited, non-commercial version of the model or its weights to encourage research and align with open AI trends. D
Clear open source license: Perplexity’s fully proprietary platform lacks any open-source licensing, limiting transparency and developer engagement. To improve, Perplexity could provide a public API or sample code to support privacy-focused development while maintaining proprietary control. D
Inference code available: No public inference code stifles innovation and trust. Provide sample inference code or an open API sandbox for developers to test privacy-preserving features. D
Remarks
Perplexity Privacy Review 2025: Overall Score
30.4/100
- Data Collection: 5 + 5 + 15 + 5 = 30
- User Control: 15 + 5 + 15 + 5 = 40
- Transparency: 15 + 15 + 5 = 35
- Privacy by Design: 5 + 5 + 0 + 5 + 15 = 30
- Hosting & Sovereignty: 0 + 0 + 0 + 5 = 5
- Open Source: 0 + 0 + 0 = 0
Total: 30 + 40 + 35 + 30 + 5 + 0 = 140
Maximum possible score: 23 criteria × 20 points = 460
Overall: 140 / 460 × 100 = 30.43, rounded to 30.4/100
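For readers who want to verify the arithmetic, the sketch below (plain Python with illustrative names of our own; it is not part of the scoring guide's tooling) reproduces the roll-up: per-criterion scores are summed by category, totaled, and normalized against the 460-point maximum.

```python
# Minimal sketch of the score roll-up shown above (hypothetical names,
# not production tooling). Each of the 23 criteria is scored out of
# 20 points; category sums are added and normalized against the
# 23 x 20 = 460-point maximum.
category_scores = {
    "Data Collection":       [5, 5, 15, 5],
    "User Control":          [15, 5, 15, 5],
    "Transparency":          [15, 15, 5],
    "Privacy by Design":     [5, 5, 0, 5, 15],
    "Hosting & Sovereignty": [0, 0, 0, 5],
    "Open Source":           [0, 0, 0],
}

MAX_PER_CRITERION = 20
total = sum(sum(scores) for scores in category_scores.values())      # 140
criteria = sum(len(scores) for scores in category_scores.values())   # 23
overall = total / (criteria * MAX_PER_CRITERION) * 100               # 30.43...

print(f"{total} / {criteria * MAX_PER_CRITERION} -> {overall:.1f}/100")
```

Running it prints 140 / 460 -> 30.4/100, matching the breakdown above.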
This Perplexity Privacy Review is provided for informational purposes only and reflects a subjective analysis based on publicly available data at the time of publication. We do not guarantee absolute accuracy and disclaim all liability for errors or misinterpretations. Any disputes must be submitted in writing to futurofintenet@proton.me
For full methodology, see our complete scoring guide here: LLM Privacy Rating Guide