The Commonplace

Treating privacy as a commodity produces distinctive harms that prices can’t fix: markets for personal data enable exploitation, erode autonomy and concentrate platform power, so policymakers should restrict or redesign data markets rather than simply 'fill' a missing market.

Data and privacy: Putting markets in (their) place
Reetika Khera · March 07, 2026 · Indian Journal of Medical Ethics
openalex · theoretical · evidence: n/a · relevance: 8/10 · DOI · Source · PDF
Commodifying personal data treats privacy as a tradeable right and creates distinctive moral and social harms (exploitation, autonomy corrosion, concentrated platform rents) that standard market fixes cannot adequately address, so data markets should be limited or redesigned.

Should privacy be a tradeable right? This is an issue for urgent consideration, given how much personal data collated from different sources can reveal about our personal lives. The rise of digital technologies and of the digital economy on the one hand, and of data mining capabilities on the other, presents economic opportunities that are being harnessed, often at the cost of our privacy. Some see this as a case of “missing markets”, where appropriate markets with adequate rules and regulations should be put in place. In this paper, I argue that creating a market for personal data amounts to making the right to privacy a tradeable right. Further, a market for personal data/privacy has all the characteristics of what Debra Satz characterises as “noxious markets”. Other economists, notably Bowles, Hausman and MacPherson, and Sandel, have sought to delineate the moral limits to markets in cases of child labour, the organ trade, etc. I argue that the market for personal data should be treated similarly.

Summary

Main Finding

Reetika Khera argues that creating markets for personal data effectively makes the right to privacy a tradeable commodity and that such markets are “noxious” — they undermine individual dignity, produce severe distributional harms, enable manipulation, and corrode civic and democratic norms. Accordingly, privacy should be treated as a social right (not merely a marketable asset), and policy should resist commodification and the uncritical creation of data markets.

Key Points

  • Marketization of personal data equates to trading the right to privacy. Even voluntary sales of personal data entail surrendering privacy and expose people to harms.
  • The current consent-based model is deeply flawed: consent is often uninformed, coerced (via essential services or welfare conditionality), or bypassed (data shadows from contacts/referrals), producing weak agency and information asymmetry.
  • Personal data markets meet the four criteria of “noxious markets” (Satz):
    • Vulnerability: cash‑strapped or otherwise constrained people accept poor deals (e.g., coupons, welfare beneficiaries forced to consent).
    • Weak agency: lack of meaningful information or indirect decision-making (consent-of-others creates data shadows).
    • Extremely harmful outcomes for individuals: identity fraud, financial loss, stigmatization (e.g., mental-health and reproductive-health app data risks).
    • Extremely harmful outcomes for society: erosion of equality, political manipulation, surveillance, and undermining of democratic participation.
  • Efficiency arguments for data markets are overstated: algorithmic personalization can extract consumer surplus, manipulate behaviour, create addiction, and increase platform market power rather than delivering broad welfare gains (a toy numerical sketch of the surplus-extraction point follows this list).
  • Governments are both custodians and potential sellers/monetizers of large administrative datasets. Policies and official rhetoric (e.g., Economic Survey chapters, data-sharing/monetization proposals) risk blurring the line between fiduciary duties and commercial incentives.
  • De‑identification/anonymization is an inadequate safeguard given demonstrated ease of re‑identification.
  • Commodifying privacy risks crowding out civic and moral norms (Sandel’s corruption argument) and can institutionalize unequal bargaining power and degradation of rights.
  • Policy moves that frame data rights primarily as property (tradeable assets) are ethically and politically problematic and conflict with the treatment of privacy as a fundamental right (referencing India’s 2017 Supreme Court judgment).
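
To make the surplus-extraction point concrete, here is a minimal numerical sketch; the willingness-to-pay figures and the zero-marginal-cost assumption are illustrative, not taken from the paper. Under the seller's best uniform price some surplus stays with buyers; under perfectly personalized pricing the same buyers are served but the surplus shifts to the platform.

    # Three hypothetical buyers with willingness to pay 10, 7, 4; zero marginal cost.
    wtp = [10, 7, 4]

    def uniform_outcome(price):
        """Revenue and consumer surplus when every buyer faces the same posted price."""
        buyers = [v for v in wtp if v >= price]
        return price * len(buyers), sum(v - price for v in buyers)

    # Seller's best single posted price, searching over candidate valuations.
    revenue_uniform, cs_uniform = max(
        (uniform_outcome(p) for p in wtp), key=lambda outcome: outcome[0]
    )  # price 7 -> revenue 14, consumer surplus 3

    # Perfect data-driven personalization: each buyer is charged exactly their valuation.
    revenue_personalized, cs_personalized = sum(wtp), 0  # revenue 21, consumer surplus 0

    print(revenue_uniform, cs_uniform, revenue_personalized, cs_personalized)

In this toy case total trade actually expands (the low-valuation buyer is now served), yet consumer surplus falls to zero: "efficiency" framings can be arithmetically correct while saying nothing about who captures the gains, which is the distributional worry flagged above.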

Data & Methods

  • Genre: normative policy commentary and ethical argumentation, synthesizing legal decisions, economic and ethical theory, empirical examples, and public policy developments.
  • Sources and evidence used:
    • Legal context: India’s 2017 Supreme Court judgment recognizing privacy as a fundamental right.
    • Scholarly literature: economics of privacy (Acquisti, Taylor & Wagman), critiques of markets (Satz, Sandel, Hausman & MacPherson), surveillance capitalism (Zuboff), and re‑identification risks.
    • Empirical and illustrative examples: Aadhaar debates; government data‑sharing/monetization episodes (e.g., motor vehicle registration sales and subsequent policy reversal); corporate cases (Flo app data sharing, Byju’s conduct during COVID-19, mental‑health apps, Crisis Text Line investigation); identity‑theft and cybercrime statistics in India.
    • Policy texts: drafts and acts on data protection (e.g., Data Protection Bill drafts, Digital Personal Data Protection Act 2023), Karnataka’s e‑Sahmati consent initiative, Economic Survey commentary.
  • No original quantitative analysis or new empirical dataset is presented; the paper uses case examples, secondary statistics, and conceptual analysis to build its argument.

Implications for AI Economics

  • Data markets as inputs to AI: Treating personal data as tradeable lowers the ethical bar for assembling large training corpora, increasing risks that AI models are trained on data obtained through coercion, weak consent, or illicit resale. This raises questions about model legitimacy and social license.
  • Market power and concentration: Monetization of personal data entrenches platform advantages (data network effects), enabling AI‑intensive firms to capture consumer surplus, raise barriers to entry, and reduce competition — all central concerns for industrial and competition economics.
  • Externalities and measurement: Data sharing creates negative externalities (privacy harms, stigmatization, political manipulation) that are hard to price. AI economists must incorporate these externalities into welfare analyses and not rely solely on market prices to assess social value (a sketch of one such welfare criterion appears after this list).
  • Distributional harms and vulnerability: AI systems trained on monetized personal data can amplify inequities — e.g., credit, insurance, employment screening — disproportionately harming marginalized or cash‑constrained groups who “sold” data under pressure. Analyses of AI impacts need to include bargaining power and consent asymmetries.
  • Manipulation and behavioural impacts: Personal data enables fine‑grained personalization and nudging. Economists should model how AI‑driven targeting changes consumer decision‑making, autonomy, and market outcomes (addiction, increased consumption, preference shaping).
  • De‑identification limits and downstream risks: Re‑identification risks imply that “anonymized” datasets used for AI carry latent privacy harms and liabilities; economic models of data markets should account for probabilistic re‑identification and legal/regulatory risk.
  • Policy and governance responses relevant for AI:
    • Treat privacy as a non‑tradeable right for core civic functions (health, education, welfare), limiting data commodification for those domains.
    • Design data governance regimes that go beyond property metaphors: data fiduciaries, purpose limitation, restrictions on resale, strong constraints on government monetization, and collective data stewardship (data trusts).
    • Technical mitigations: promote privacy‑preserving AI (differential privacy, federated learning, synthetic data) as part of market design to reduce harms while allowing legitimate innovation (see the differential-privacy sketch after this list).
    • Regulatory interventions: competition policy attentive to data hoarding, mandatory transparency on training data sources, provenance, and consent; standards for meaningful consent; liability rules for misuse and re‑identification.
    • Incorporate non‑market values: economists studying AI should integrate ethical frameworks (noxious markets, corruption and crowding out of nonmarket norms) into welfare assessments and policy prescriptions.
  • Overall recommendation for AI economics: avoid uncritical valuation of data by market price alone; account for weak agency, distributional asymmetries, political and social externalities, and the normative status of privacy when evaluating data‑driven AI systems and policies.
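
One minimal way to fold the externality and re-identification points into a welfare criterion is sketched below; the notation is an illustrative assumption, not the paper's own formalism.

    W = CS + PS - \sum_i p_i^{\mathrm{reid}} \, h_i - X

Here CS and PS are consumer and producer surplus, p_i^reid is the probability that individual i is re-identified from nominally anonymized data, h_i is the harm to i if that happens, and X collects diffuse social harms (manipulation, chilling effects, erosion of trust) that market prices do not register. Valuing data by price alone amounts to setting the last two terms to zero.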
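
On the technical-mitigations point, the sketch below shows the simplest differential-privacy building block, the Laplace mechanism applied to a counting query. The records, predicate, and epsilon value are hypothetical, and this is a generic illustration of the technique rather than anything specified in the paper.

    import numpy as np

    def dp_count(records, predicate, epsilon, rng=None):
        """Release a counting query with epsilon-differential privacy.

        A count has L1 sensitivity 1 (adding or removing one person changes it
        by at most 1), so Laplace noise with scale 1/epsilon is sufficient.
        """
        rng = np.random.default_rng() if rng is None else rng
        true_count = sum(1 for r in records if predicate(r))
        return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

    # Hypothetical records: count how many users have a sensitive flag set.
    records = [{"sensitive": True}, {"sensitive": False}, {"sensitive": True}]
    print(dp_count(records, lambda r: r["sensitive"], epsilon=0.5))

Smaller epsilon means more noise and stronger protection; the relevance for market design is that the privacy-utility trade-off becomes an explicit, auditable parameter rather than something buried in consent boilerplate.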

Assessment

Paper Type: theoretical
Evidence Strength: n/a — The paper is normative and conceptual rather than empirical: it advances ethical and policy arguments drawing on prior literature and descriptive facts but does not provide causal estimates or empirical tests.
Methods Rigor: medium — Arguments are grounded in established moral-philosophical frameworks (Satz, Sandel, etc.) and a cross-disciplinary literature review; however, the paper lacks formal models, empirical validation, or systematic empirical evidence to test its claims.
Sample: No empirical sample; the paper uses conceptual analysis, thought experiments, illustrative examples, and a literature review spanning economics, political philosophy, and ethics, together with descriptive facts about digital data practices and platform business models.
Themes: governance, innovation, inequality
Generalizability:
  • Normative, conceptual claims are not empirically validated, so their practical effects are uncertain.
  • Focused on personal, identifiable data; less directly applicable to aggregated, anonymized, or synthetic datasets.
  • Relevance depends on institutional and legal context (varies across jurisdictions and regulatory regimes).
  • Does not quantify trade-offs (e.g., impacts on innovation, firm profits, or aggregate welfare), limiting policy calibration.
  • Assumes enforceable regulatory capacity and feasible design of alternatives (data trusts, fiduciary duties), which may not hold everywhere.

Claims (14)

Claim · Direction · Confidence · Outcome · Details

  • Creating a market for personal data is equivalent to making the right to privacy a tradeable right, and such a market should be treated as a 'noxious market' in the sense articulated by Debra Satz.
    Governance And Regulation · negative · medium · Outcome: Normative classification of personal-data markets (noxious vs non-noxious); status of privacy (right vs commodity) · 0.01
  • Commodifying personal information poses distinctive harms to individuals and social practices, including exploitation, corruption of personal autonomy, distributional injustice, and information asymmetries.
    Ai Safety And Ethics · negative · medium · Outcome: Types and presence of moral/social harms (exploitation, autonomy corruption, distributional injustice, information asymmetries) · 0.01
  • Consent in data markets is frequently weak, uninformed, or coerced (due to information asymmetries, complexity, and behavioral biases), undermining the ethical legitimacy of transactions.
    Ai Safety And Ethics · negative · medium · Outcome: Validity/ethical legitimacy of consent in personal-data transactions · 0.01
  • Personal data are nonrivalrous and highly replicable, so selling data does not follow ordinary scarcity logic.
    Market Structure · null_result · high · Outcome: Economic property of personal data (rivalry/scarcity) · 0.02
  • Aggregation and linkage across data sources can reveal intimate, predictive traits that were not foreseeable to the data subject at the time of sale.
    Ai Safety And Ethics · negative · medium-high · Outcome: Extent to which data aggregation yields unforeseen sensitive inferences about individuals · 0.0
  • Harms from data commodification are often externalized, diffuse, and long-term (e.g., profiling, algorithmic discrimination, chilling effects on behavior).
    Ai Safety And Ethics · negative · medium · Outcome: Presence and characteristics of harms (externalization, diffusion, temporality) resulting from data use · 0.01
  • Standard market-failure fixes (better information, pricing, contracting) are insufficient to address the moral and social-structural harms of commodifying privacy.
    Governance And Regulation · negative · medium · Outcome: Adequacy of standard market remedies to eliminate ethical harms of data markets · 0.01
  • Data markets tend to concentrate benefits and rents in large platforms while externalizing harms onto individuals and society.
    Market Structure · negative · medium · Outcome: Distribution of economic benefits and harms across firms (platforms) and individuals · 0.01
  • Compensation-based frameworks for personal data may advantage those better able to monetize data, potentially worsening inequality.
    Inequality · negative · medium · Outcome: Distributional impact (inequality) resulting from compensation-based data exchange regimes · 0.01
  • Treating privacy as non-tradeable (or tightly constrained trade) will change incentives for firms that monetize personal data, affecting the supply of training data for AI and the trajectory of AI development.
    Innovation Output · mixed · medium · Outcome: Firm incentives, supply of training data for AI, and subsequent effects on AI development trajectories · 0.01
  • Economists modeling AI markets should incorporate non-pecuniary harms, externalities, and moral constraints when assessing welfare, innovation trade-offs, and optimal policy.
    Governance And Regulation · positive · medium · Outcome: Scope of factors (non-pecuniary harms, externalities, moral constraints) included in economic models of AI markets · 0.01
  • Policy tools such as bans on sale of certain sensitive data, fiduciary duties for data holders, privacy-by-default, and collective data governance (data trusts, regulated commons) are appropriate levers to limit harms from data commodification.
    Governance And Regulation · positive · speculative · Outcome: Effectiveness of specific policy levers in limiting harms from data commodification · 0.0
  • Investing in privacy-preserving AI methods (differential privacy, federated learning, synthetic data) and governance institutions is warranted as an alternative to atomized data markets.
    Adoption Rate · positive · medium · Outcome: Appropriateness and potential uptake of privacy-preserving technologies and governance institutions as substitutes for data markets · 0.01
  • Cost–benefit analyses in AI economics should internalize long-term, hard-to-quantify harms (autonomy loss, social trust erosion) rather than rely solely on market price signals.
    Governance And Regulation · positive · medium · Outcome: Scope and content of variables included in cost–benefit analyses for AI policy · 0.01

Notes