Generative-AI algorithms are quietly binding gig couriers to platforms, eroding the employee/contractor divide; courts should test for substantive algorithmic control, legislatures should create a 'quasi-employee' status with tiered protections, and platforms must bear greater algorithmic accountability.
The rise of generative artificial intelligence (AIGC) is injecting new momentum into the gig economy, reshaping the rights and obligations of platforms and workers, and continually challenging traditional standards for determining whether a labor relationship exists. Taking food delivery riders as its research object, this study analyzes the dilemmas of labor-relations determination under AIGC and explores paths for legal regulation. From the perspectives of "personal subordination" and "economic subordination," it argues that AIGC exerts deep but implicit control over the labor process through mechanisms such as dynamic route planning, blurring the boundaries of the determination. A multi-level solution is proposed: in the judiciary, a substantive, modernized interpretation of the subordination standard that examines the actual control exercised by algorithms; in legislation, breaking through the binary model by creating a "quasi-employee" category with tiered protections; and in collaborative governance, strengthening platforms' algorithmic accountability and building collective bargaining mechanisms. Together these offer a theoretical reference for a new paradigm of labor-relations determination and for protecting workers' legitimate rights and interests.
Summary
Main Finding
AIGC (generative AI) technologies materially deepen and mask platform control over gig workers (illustrated by food delivery riders), undermining traditional legal tests for employment (personal and economic subordination). The paper argues that algorithmic path‑planning, dynamic pricing and opaque performance models create a new, hybrid form of dependence that makes binary employee/contractor categories inadequate. It recommends judicial modernization of the subordination test, legislative creation of a "quasi‑employee" tier with targeted protections, and collaborative governance (algorithmic responsibility, transparency, human review, and collective bargaining).
Key Points
- AIGC changes the locus and visibility of control:
- Algorithmic route optimization and real‑time guidance replace visible human direction, converting "instructions" into enforced algorithmic plans.
- Opaque, personalized AIGC performance evaluations create continuous, black‑box supervision that shapes workers' behavior.
- Economic dependence is reconfigured:
- Real‑time algorithmic pricing and demand forecasting make income volatile and platform‑dependent.
- Platforms retain data and algorithmic control, transferring operational risk to individual riders and concentrating bargaining power.
- Legal and policy gaps:
- Traditional subordination tests (direct human supervision; payroll-based wage payment) struggle to capture algorithmic control and data‑based economic dependence.
- Binary legal categories (employee vs. independent contractor) produce gaps in protection for many gig workers.
- Proposed multi‑level remedies:
- Judiciary: interpret subordination substantively to include algorithmic control; weigh algorithmic constraints as evidence of personal/economic subordination.
- Legislation: create an intermediate "quasi‑employee" or "dependent self‑employed" status and build tiered protections (minimum income guarantees; inclusion in key social insurance, e.g. occupational injury coverage; and rights to algorithmic transparency and human appeal).
- Governance: mandate algorithmic responsibility (transparency balanced with trade secrets), human oversight, and strengthen/support collective bargaining or rider organizations.
Data & Methods
- Research design: conceptual and doctrinal analysis focused on how AIGC alters factual patterns relevant to labor law.
- Case focus: food delivery riders as a representative gig‑economy occupation exposed to route planning, dispatching, pricing, and evaluation algorithms.
- Sources and approach:
- Literature review of domestic and international scholarship on gig work, algorithmic management, and labor law.
- Legal analysis of existing standards for subordination and judicial practice, identifying gaps when applied to algorithmic control.
- Normative policy design drawing on comparative ideas (tiered protection, algorithmic transparency, collective negotiation).
- Empirical limitations: the paper is primarily theoretical/legal; it does not present original quantitative data or causal identification of AIGC effects.
Implications for AI Economics
- Labor market power and monopsony:
- AIGC can strengthen platforms' effective market power by using data and algorithms to centralize allocation, reduce worker bargaining power, and shift income risk to workers — implying stronger monopsonistic effects than in pre‑AIGC platforms.
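The monopsony point above can be illustrated with a toy model (all numbers and functional forms are invented for illustration, not taken from the paper): a platform facing upward-sloping rider-hour supply picks the profit-maximizing piece rate, which lands below the competitive benchmark of wage equal to revenue per hour.

```python
# Toy monopsony sketch (illustrative numbers): the platform is the sole
# wage-setter, so raising the piece rate to attract more rider-hours also
# raises pay on all hours, and the chosen wage sits below the competitive one.

def rider_supply(wage: float) -> float:
    """Hypothetical linear supply of rider-hours at a given piece rate."""
    return max(0.0, 10.0 * wage - 20.0)

def platform_profit(wage: float, revenue_per_hour: float = 10.0) -> float:
    """Margin per rider-hour times the hours supplied at that wage."""
    return (revenue_per_hour - wage) * rider_supply(wage)

# Grid-search the wage on [0, 10] in steps of 0.01.
wages = [w / 100 for w in range(0, 1001)]
w_star = max(wages, key=platform_profit)
print(f"monopsony wage: {w_star:.2f} vs. competitive wage: 10.00")  # 6.00 < 10.00
```

The gap between the chosen wage and the competitive benchmark is the wedge that algorithmic centralization of dispatch and pricing can widen.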
- Income volatility and welfare:
- Dynamic algorithmic pricing increases earnings variance and consumption/savings uncertainty for workers; welfare analyses must account for stochastic, algorithm‑driven pay processes.
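One simple way to make the volatility point measurable is the coefficient of variation of daily earnings. The sketch below compares a flat per-delivery rate against a hypothetical surge multiplier; all rates and ranges are invented for illustration.

```python
import random
import statistics

def earnings_cv(daily_pay: list[float]) -> float:
    """Coefficient of variation: stdev of daily pay divided by its mean."""
    return statistics.stdev(daily_pay) / statistics.mean(daily_pay)

random.seed(0)
deliveries = [random.randint(15, 25) for _ in range(30)]           # deliveries per day
flat   = [n * 5.0 for n in deliveries]                             # fixed 5.0 per delivery
surged = [n * 5.0 * random.uniform(0.6, 1.6) for n in deliveries]  # dynamic multiplier

print(f"flat-rate CV:    {earnings_cv(flat):.3f}")
print(f"dynamic-rate CV: {earnings_cv(surged):.3f}")
```

Under the surge multiplier, pay variance comes from the algorithm as well as from demand, so the same delivery count yields a noisier income stream.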
- Measurement and identification challenges:
- Standard labor economics models relying on observables (hours, wages, contracts) may understate control because algorithmic directives are latent and proprietary. New measures (logs of dispatch rules, ex‑post route deviations, inference from earnings trajectories) are needed.
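One such measure, sketched below with invented stop identifiers, scores how closely a rider's actual stop sequence follows the dispatched order: a greedy in-order match where 1.0 means full adherence to the algorithm's plan.

```python
def route_adherence(dispatched: list[str], actual: list[str]) -> float:
    """Share of dispatched stops visited in the dispatched order
    (greedy in-order match of the actual trace against the plan)."""
    if not dispatched:
        return 1.0
    matched = 0
    for stop in actual:
        if matched < len(dispatched) and stop == dispatched[matched]:
            matched += 1
    return matched / len(dispatched)

# Hypothetical logs: the platform's dispatched sequence vs. the rider's trace.
dispatched = ["restaurant_A", "customer_1", "restaurant_B", "customer_2"]
actual     = ["restaurant_A", "restaurant_B", "customer_1", "customer_2"]
print(route_adherence(dispatched, actual))  # 0.5 — the rider reordered two stops
```

Averaged over many trips, low deviation from dispatched routes is itself evidence of binding algorithmic direction, even when no human instruction is observable.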
- Policy and regulatory experiments:
- Evaluating tiered protections (quasi‑employee status, minimum guarantees, mandatory transparency) requires randomized/policy‑change evaluations to measure labor supply responses, platform pricing adjustments, and firm entry/innovation effects.
- Research agenda:
- Microdata needs: access to platform algorithm logs, dispatch/pricing rules, and matched worker outcomes to estimate causal impacts of algorithmic control.
- Structural models: incorporate algorithmic matching, dynamic pricing, and information asymmetries to quantify welfare transfers between platforms and workers.
- Field interventions: test human‑in‑the‑loop appeals, transparency interventions, and collective bargaining structures to measure effects on earnings stability, service quality, and platform efficiency.
- Broader takeaways for AI economics:
- AIGC intensifies tradeoffs between efficiency gains (better routing, matching) and distributional/agency problems (hidden control, risk transfer). Policy design must target algorithmic governance and labor protections to balance innovation with equitable outcomes.
Assessment
Claims (8)
| Claim | Direction | Confidence | Outcome | Details |
|---|---|---|---|---|
| The rise of generative artificial intelligence (AIGC) technology is injecting new momentum into the gig economy. (Adoption Rate) | positive | high | adoption_rate | 0.03 |
| AIGC is reshaping the rights and obligations of platforms and workers. (Governance And Regulation) | mixed | high | rights and obligations (legal status) | 0.03 |
| AIGC constantly challenges traditional standards for determining labor relations. (Employment) | negative | high | employment (classification/determination of labor relations) | 0.03 |
| This study takes food delivery riders as the research object and analyzes the dilemma of labor relations determination under AIGC. (Other) | null_result | high | research scope / sample (food delivery riders) | 0.1 |
| From the perspectives of 'personal subordination' and 'economic subordination', AIGC deeply and implicitly controls the labor process through mechanisms such as dynamic path planning, blurring the boundaries of determination. (Task Allocation) | negative | high | task_allocation / algorithmic control of tasks | 0.06 |
| In the judiciary, the substantive and modern interpretation of the subordination standard should be developed, examining the substantive control of algorithms. (Governance And Regulation) | positive | high | governance / judicial interpretation | 0.01 |
| In legislation, the binary model should be broken through by creating a 'quasi-employee' subject and implementing tiered protection. (Social Protection) | positive | high | social protection / legal status | 0.01 |
| Collaborative governance should strengthen the responsibility of platform algorithms and promote the construction of collective bargaining mechanisms. (Governance And Regulation) | positive | high | collective bargaining capacity / algorithmic accountability | 0.01 |