

Data and privacy: Putting markets in (their) place
Reetika Khera · March 07, 2026 · Indian Journal of Medical Ethics
Commodifying personal data treats privacy as a tradeable right and creates distinctive moral and social harms (exploitation, autonomy corrosion, concentrated platform rents) that standard market fixes cannot adequately address, so data markets should be limited or redesigned.

Should privacy be a tradeable right? This is an issue for urgent consideration, given how much personal data collated from different sources can reveal about our personal lives. The rise of digital technologies and of the digital economy on the one hand, and of data mining capabilities on the other, presents economic opportunities that are being harnessed, often at the cost of our privacy. Some see this as a case of "missing markets", where appropriate markets with adequate rules and regulations should be put in place. In this paper, I argue that creating a market for personal data amounts to making the right to privacy a tradeable right. Further, a market for personal data/privacy has all the characteristics of what Debra Satz characterises as "noxious markets". Other economists, notably Bowles, Hausman and MacPherson, and Sandel, have sought to delineate the moral limits of markets in cases such as child labour and the organ trade. I argue that the market for personal data should be treated similarly.

Summary

Main Finding

Creating a market for personal data is equivalent to making the right to privacy a tradeable right, and such a market should be treated as a "noxious market" in the sense articulated by Debra Satz. Because commodifying personal information poses distinctive harms to individuals and social practices (exploitation, corruption of personal autonomy, distributional injustice, information asymmetries), policy must limit or redesign data markets rather than simply fill a "missing market" with standard market mechanisms.

Key Points

  • Core claim: Allowing personal data to be bought and sold transforms privacy from a right into a commodity, with normative consequences that markets are ill-suited to handle.
  • Noxious-market framing: The market for personal data displays key features Satz uses to identify morally problematic markets:
    • It tends to exploit unequal bargaining power and vulnerable populations (people selling data out of necessity).
    • It can corrupt or degrade important social practices and relationships (surveillance, erosion of trust and autonomy).
    • It produces concentrated benefits for firms while externalizing harms onto individuals and society (discrimination, chilling effects).
  • Parallel literature: Similar moral limits have been argued for other domains (child labor, the organ trade) by Bowles, by Hausman and MacPherson, and by Sandel; the article argues that personal data markets belong in the same category.
  • Consent problems: In practice, consent in data markets is often weak, uninformed, or coerced (information asymmetries, complexity, behavioral biases), undermining the ethical legitimacy of transactions.
  • Unique characteristics of data:
    • Nonrivalrous and highly replicable, so selling data does not follow ordinary scarcity logic.
    • Aggregation and linkage across sources can reveal intimate, predictive traits not foreseeable at time of sale.
    • Harm is often externalized, diffuse, and long-term (profiling, algorithmic discrimination, chilling of behavior).
  • Market solutions are insufficient: Standard economics fixes (better information, pricing, contracting) do not fully address the moral and social-structural harms of commodifying privacy.
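The aggregation-and-linkage point above can be made concrete with a toy re-identification sketch: two datasets that look harmless in isolation, joined on shared quasi-identifiers, expose a sensitive trait that neither reveals alone. All records, names, and fields below are fictional and purely illustrative.

```python
# Illustrative only: a toy linkage attack. Joining two "harmless"
# datasets on quasi-identifiers reveals a sensitive trait that
# neither dataset exposes on its own. All records are fictional.

# Dataset A: a "de-identified" health record release (no names).
health = [
    {"zip": "110016", "birth_year": 1985, "diagnosis": "diabetes"},
    {"zip": "110001", "birth_year": 1990, "diagnosis": "asthma"},
]

# Dataset B: a public voter roll (names, but no health data).
voters = [
    {"name": "A. Kumar", "zip": "110016", "birth_year": 1985},
    {"name": "B. Singh", "zip": "110001", "bir_year".replace("birth", "birth"): 1990, "birth_year": 1990},
]

def link(a_rows, b_rows, keys=("zip", "birth_year")):
    """Join records that agree on every quasi-identifier in `keys`."""
    out = []
    for a in a_rows:
        for b in b_rows:
            if all(a[k] == b[k] for k in keys):
                out.append({**b, "diagnosis": a["diagnosis"]})
    return out

reidentified = link(health, voters)
print(reidentified[0])  # the named voter is now linked to a diagnosis
```

The mechanism is the same one behind real-world re-identification results: each dataset's seller could honestly claim the release was anonymous, yet the harm arises only from the combination, which no individual "sale" priced in.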

Data & Methods

  • Methodological approach: normative, conceptual and argumentative rather than empirical.
    • Engages moral-philosophical frameworks (Satz's noxious markets criteria; arguments from Sandel, Bowles, Hausman & MacPherson).
    • Uses conceptual analysis of privacy as a right vs. commodity, and of markets' effects on social practices.
    • Likely draws on illustrative examples and thought experiments rather than econometric or experimental data.
  • Evidence base: literature review across economics, political philosophy, and ethics; descriptive facts about digital-era data practices and business models are used to ground the argument.
  • No formal market modeling or empirical estimation is central to the paper; the contribution is primarily theoretical and policy-oriented.

Implications for AI Economics

  • Limits on commodification alter data supply and valuation:
    • Treating privacy as non-tradeable (or tightly constrained trade) changes incentives for firms that monetize personal data, affecting the supply of training data for AI and hence the trajectory of AI development.
    • Standard price-based mechanisms for valuing personal data become inadequate because they ignore non-market harms and rights-based constraints.
  • Market design and regulation matter for welfare analysis:
    • Economists modeling AI markets must incorporate non-pecuniary harms, externalities, and moral constraints when assessing welfare, innovation trade-offs, and optimal policy.
    • Policies (bans, prohibitions on sale, fiduciary duties, data-use restrictions, privacy-by-default) will reshape firm behavior, competition, and investment in privacy-preserving technologies.
  • Distributional and competition consequences:
    • Data markets tend to concentrate power and rents in large platforms; limiting trade in personal data can be an instrument for countering platform power and market concentration.
    • Compensation-based frameworks may advantage those better able to monetize data, worsening inequality; policy design must account for this.
  • Research & technology directions:
    • Stronger case for investment in privacy-preserving AI methods (differential privacy, federated learning, synthetic data) and for research on data governance institutions (data trusts, collective bargaining, regulated commons).
    • Algorithmic audits, transparency requirements, and liability rules should be incorporated into economic models of AI deployment and firm strategy.
  • Policy evaluation criteria:
    • Cost–benefit analyses in AI economics need to internalize long-term, hard-to-quantify harms (autonomy loss, social trust erosion) and not rely solely on market price signals.
    • Normative constraints (rights-based or dignity-based considerations) should be explicit in policy prescriptions concerning data markets.
  • Practical policy levers to consider:
    • Prohibitions or strict limits on sale of certain sensitive data types.
    • Default privacy protections and opt-out rules that are robust to behavioral biases.
    • Legal duties (fiduciary or stewardship roles) for data holders.
    • Collective or public data governance (data trusts, regulated data commons) as alternatives to atomized markets.
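As a toy illustration of the cost-benefit point above: the sketch below compares a welfare verdict based on price signals alone with one that internalizes diffuse, non-pecuniary harms. Every number is invented; the structural point is that the policy verdict can flip once hard-to-quantify harms enter the calculation.

```python
# Illustrative only: a toy cost-benefit comparison for a data market.
# All numbers are invented; the point is that the welfare verdict can
# flip once diffuse, non-pecuniary harms are internalized.

def net_benefit(firm_revenue, payments_to_users, harms=0.0):
    """Net social benefit = market surplus minus (optional) non-market harms."""
    return firm_revenue + payments_to_users - harms

# Hypothetical data market: platforms earn 100, users are paid 10.
market_only = net_benefit(firm_revenue=100.0, payments_to_users=10.0)

# Same market, now counting diffuse harms (profiling, chilling effects,
# autonomy loss) valued, purely for illustration, at 150.
with_harms = net_benefit(firm_revenue=100.0, payments_to_users=10.0,
                         harms=150.0)

print(market_only)  # 110.0 -> looks welfare-improving on prices alone
print(with_harms)   # -40.0 -> welfare-reducing once harms are counted
```

The hard part in practice, which this sketch deliberately dodges, is estimating the `harms` term at all; the paper's argument is precisely that price signals will not supply it.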

Overall, the paper suggests that economists working on AI and data-driven markets should treat data commodification not as a neutral efficiency problem but as a domain in which moral limits, rights, and social practices must shape market design and regulation.
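As a concrete instance of the privacy-preserving methods listed among the research directions, here is a minimal sketch of the Laplace mechanism, a basic differential-privacy primitive. The dataset, the query, and the epsilon value are all invented for demonstration; production systems would use a vetted library rather than hand-rolled noise.

```python
# Illustrative only: the Laplace mechanism for differential privacy.
# It releases a noisy statistic instead of raw personal records.

import math
import random

def laplace_noise(scale):
    """Sample from Laplace(0, scale) via the inverse-CDF transform."""
    u = 0.0
    while u == 0.0:          # avoid log(0) on a boundary draw
        u = random.random()
    u -= 0.5                 # now uniform on (-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(values, predicate, epsilon=1.0):
    """Release a counting query with epsilon-differential privacy.

    A count has sensitivity 1, so the noise scale is 1/epsilon:
    smaller epsilon means stronger privacy and noisier answers.
    """
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical records: how many individuals are 40 or older?
ages = [23, 35, 41, 58, 62, 29, 47]
noisy_count = dp_count(ages, lambda a: a >= 40, epsilon=0.5)
print(noisy_count)  # close to the true count (4), but randomized
```

The design choice this illustrates is exactly the trade the paper's policy levers gesture at: the analyst gets a useful aggregate, but no transaction over any individual's raw record ever takes place.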

Assessment

Paper Type: theoretical
Evidence Strength: n/a. The paper is normative and conceptual rather than empirical: it advances ethical and policy arguments drawing on prior literature and descriptive facts but does not provide causal estimates or empirical tests.
Methods Rigor: medium. Arguments are grounded in established moral-philosophical frameworks (Satz, Sandel, etc.) and a cross-disciplinary literature review; however, the paper lacks formal models, empirical validation, or systematic empirical evidence to test its claims.
Sample: No empirical sample; the paper uses conceptual analysis, thought experiments, illustrative examples, and a literature review spanning economics, political philosophy, and ethics, together with descriptive facts about digital data practices and platform business models.
Themes: governance · innovation · inequality
Generalizability:
  • Normative, conceptual claims are not empirically validated, so their practical effects are uncertain.
  • Focused on personal, identifiable data; less directly applicable to aggregated, anonymized, or synthetic datasets.
  • Relevance depends on institutional and legal context (varies across jurisdictions and regulatory regimes).
  • Does not quantify trade-offs (e.g., impacts on innovation, firm profits, or aggregate welfare), limiting policy calibration.
  • Assumes enforceable regulatory capacity and feasible design of alternatives (data trusts, fiduciary duties), which may not hold everywhere.

Claims (14)

  • Creating a market for personal data is equivalent to making the right to privacy a tradeable right, and such a market should be treated as a 'noxious market' in the sense articulated by Debra Satz.
    Topic: Governance And Regulation · Direction: negative · Confidence: medium · Outcome: Normative classification of personal-data markets (noxious vs non-noxious); status of privacy (right vs commodity)
  • Commodifying personal information poses distinctive harms to individuals and social practices, including exploitation, corruption of personal autonomy, distributional injustice, and information asymmetries.
    Topic: AI Safety And Ethics · Direction: negative · Confidence: medium · Outcome: Types and presence of moral/social harms (exploitation, autonomy corruption, distributional injustice, information asymmetries)
  • Consent in data markets is frequently weak, uninformed, or coerced (due to information asymmetries, complexity, and behavioral biases), undermining the ethical legitimacy of transactions.
    Topic: AI Safety And Ethics · Direction: negative · Confidence: medium · Outcome: Validity/ethical legitimacy of consent in personal-data transactions
  • Personal data are nonrivalrous and highly replicable, so selling data does not follow ordinary scarcity logic.
    Topic: Market Structure · Direction: null_result · Confidence: high · Outcome: Economic property of personal data (rivalry/scarcity)
  • Aggregation and linkage across data sources can reveal intimate, predictive traits that were not foreseeable to the data subject at the time of sale.
    Topic: AI Safety And Ethics · Direction: negative · Confidence: medium-high · Outcome: Extent to which data aggregation yields unforeseen sensitive inferences about individuals
  • Harms from data commodification are often externalized, diffuse, and long-term (e.g., profiling, algorithmic discrimination, chilling effects on behavior).
    Topic: AI Safety And Ethics · Direction: negative · Confidence: medium · Outcome: Presence and characteristics of harms (externalization, diffusion, temporality) resulting from data use
  • Standard market-failure fixes (better information, pricing, contracting) are insufficient to address the moral and social-structural harms of commodifying privacy.
    Topic: Governance And Regulation · Direction: negative · Confidence: medium · Outcome: Adequacy of standard market remedies to eliminate ethical harms of data markets
  • Data markets tend to concentrate benefits and rents in large platforms while externalizing harms onto individuals and society.
    Topic: Market Structure · Direction: negative · Confidence: medium · Outcome: Distribution of economic benefits and harms across firms (platforms) and individuals
  • Compensation-based frameworks for personal data may advantage those better able to monetize data, potentially worsening inequality.
    Topic: Inequality · Direction: negative · Confidence: medium · Outcome: Distributional impact (inequality) resulting from compensation-based data exchange regimes
  • Treating privacy as non-tradeable (or tightly constrained trade) will change incentives for firms that monetize personal data, affecting the supply of training data for AI and the trajectory of AI development.
    Topic: Innovation Output · Direction: mixed · Confidence: medium · Outcome: Firm incentives, supply of training data for AI, and subsequent effects on AI development trajectories
  • Economists modeling AI markets should incorporate non-pecuniary harms, externalities, and moral constraints when assessing welfare, innovation trade-offs, and optimal policy.
    Topic: Governance And Regulation · Direction: positive · Confidence: medium · Outcome: Scope of factors (non-pecuniary harms, externalities, moral constraints) included in economic models of AI markets
  • Policy tools such as bans on sale of certain sensitive data, fiduciary duties for data holders, privacy-by-default, and collective data governance (data trusts, regulated commons) are appropriate levers to limit harms from data commodification.
    Topic: Governance And Regulation · Direction: positive · Confidence: speculative · Outcome: Effectiveness of specific policy levers in limiting harms from data commodification
  • Investing in privacy-preserving AI methods (differential privacy, federated learning, synthetic data) and governance institutions is warranted as an alternative to atomized data markets.
    Topic: Adoption Rate · Direction: positive · Confidence: medium · Outcome: Appropriateness and potential uptake of privacy-preserving technologies and governance institutions as substitutes for data markets
  • Cost–benefit analyses in AI economics should internalize long-term, hard-to-quantify harms (autonomy loss, social trust erosion) rather than rely solely on market price signals.
    Topic: Governance And Regulation · Direction: positive · Confidence: medium · Outcome: Scope and content of variables included in cost–benefit analyses for AI policy
