The Commonplace

Algorithmic oversight in public transport—from constant GPS tracking to automated deactivations—correlates with higher anxiety, burnout and chronic precarity among drivers; authors call for transparency, mandatory mental-health risk audits, human review of sanctions and wage protections to curb psychological harm.

Algorithmic Control and Psychological Risk in Digitally Managed Public Transport Systems: Implications for Occupational Mental Health
Johnson O. Oloyele, Adedoyin Ibosiola, Goodness Olaleye, Damilola Elizabeth Bakare · March 25, 2026 · Annals of Geographical Studies
OpenAlex · review_meta · medium evidence · 7/10 relevance
A PRISMA-guided integrative review of 48 studies finds that algorithmic management mechanisms—GPS surveillance, rating systems, dynamic pricing, and automated sanctions—are consistently associated with elevated psychological risks (anxiety, burnout, depressive symptoms, and chronic precarity) among public transport and platform drivers via surveillance, income volatility, and procedural opacity.

Public transport systems increasingly adopt algorithmic management through GPS surveillance, rating systems, and automated sanctions, yet the psychological mechanisms linking these technologies to workers' mental health remain poorly understood and theoretically fragmented. The objective was to develop an integrated multilevel theoretical framework synthesizing pathways from algorithmic control mechanisms to psychological risk in public transport drivers.

The authors conducted a PRISMA-guided systematic integrative review of 48 peer-reviewed studies (2016–2025) from 4,812 initial records sourced from Scopus, Web of Science, and PubMed. Structured data extraction captured control mechanisms, psychological outcomes, and mediating pathways. Thematic synthesis integrated the Job Demand-Control Model, Conservation of Resources Theory, and Algorithmic Management Theory.

Four control mechanisms emerged: GPS tracking (panoptic surveillance), rating systems (emotional labour demands), dynamic pricing (income volatility), and automated sanctions (deactivation fear). Platform workers experience 59.6% higher digital speed determination and receive 36.3% more third-party ratings than traditional workers. The three-level framework (technological → organizational → psychological) yielded six propositions: surveillance intensity → hyper-vigilance (β = -4.213); algorithmic opacity → procedural anxiety; income volatility → depressive symptoms (23–41% prevalence); rating pressure → emotional exhaustion (41–67% high burnout); task fragmentation → reduced sense of accomplishment; and deactivation fear → chronic precarity (78% report chronic fear).

Algorithmic management operates as psychological governance, eroding worker mental health through surveillance, opacity, and precarity. Human-in-command regulation requires algorithmic transparency mandates, mandatory mental-health risk audits, participatory co-design, human review of deactivations, and minimum-wage protections aligned with ILO principles.

Summary

Main Finding

Algorithmic management in public transport and logistics acts as a form of psychological governance: automated direction, evaluation and sanctioning (GPS/telematics, ratings, dynamic pricing, deactivation) reshape work design and raise measurable occupational mental-health risks (burnout, anxiety, depressive symptoms, hyper-vigilance, chronic precarity). The authors synthesize evidence into a three‑level (technological → organizational → psychological) framework and propose specific pathways linking technical features to mental-health outcomes.

Key Points

  • Scope and synthesis: PRISMA-guided systematic integrative review of 48 empirical studies (2016–2025) drawn from 4,812 initial records across Scopus, Web of Science, PubMed and institutional reports.
  • Core algorithmic mechanisms identified: GPS tracking (continuous surveillance), user rating systems (emotional labour and social evaluation), dynamic pricing (income volatility), automated sanctions/deactivation (fear and insecurity).
  • Quantitative patterns from included studies:
    • 59.6% of platform workers face “digital speed determination” (higher time pressure).
    • 36.3% more platform workers are subject to third‑party ratings versus traditional workers.
    • High prevalence ranges reported: depressive symptoms 23–41%; high burnout (emotional exhaustion) 41–67%; 78% report chronic fear of deactivation in some samples.
    • Reported effect sizes/propositions include surveillance intensity → reduced wellbeing/hyper‑vigilance (β = -4.213) and algorithmic completeness negatively predicting autonomy (r ≈ -0.42).
  • Thematic propositions developed: surveillance → hyper‑vigilance; opacity → procedural anxiety/“algorithmic paranoia”; income volatility → depressive symptoms; rating pressure → emotional exhaustion; task fragmentation → reduced sense of accomplishment; deactivation fear → chronic precarity.
  • Policy recommendations emphasized by authors: algorithmic transparency mandates, mandatory mental‑health risk audits, participatory co‑design of systems, human review of deactivations, and minimum‑wage protections in line with ILO principles.
  • Limitations in the literature: many studies are cross‑sectional, geographically clustered, short‑term, and heterogeneous in measures—causal inference and long‑run health impacts remain underexplored.

Data & Methods

  • Review design: systematic integrative review following PRISMA 2020 to combine qualitative, quantitative and mixed‑methods evidence and integrate psychological theory with technical features.
  • Information sources: Scopus, Web of Science, PubMed; additional institutional reports (ILO, EU‑OSHA); backward citation searching.
  • Screening/results: 4,812 records identified → 3,120 after de‑duplication/screening → 481 full texts assessed → 48 studies included.
  • Inclusion criteria: peer‑reviewed empirical studies (qualitative, quantitative, mixed) addressing automated managerial functions (assignment, evaluation, discipline) in transport/logistics and reporting occupational mental‑health outcomes (burnout, anxiety, stress, depressive symptoms).
  • Data extraction: structured template capturing study metadata (author, year, country), sector, sample size/design, algorithmic mechanism(s), psychological outcomes, mediators/moderators (perceived fairness, autonomy, algorithmic literacy).
  • Synthesis: six‑phase thematic analysis using Braun & Clarke approach, organized at three levels — Technological (data collection, ML models), Organizational (work allocation, evaluation, discipline), Psychological (rumination, ego depletion, detachment).
  • Trustworthiness: triangulation across methods and settings; peer debriefing; emphasis on multilevel propositions grounded in extracted evidence.
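The screening flow above is simple funnel arithmetic. A minimal sketch that reproduces it — the stage counts come from the digest, while the retention-share framing is this sketch's own addition:

```python
# Sketch of the review's PRISMA screening funnel using the counts
# reported above; retention shares are computed here for illustration.
stages = [
    ("Records identified", 4812),
    ("After de-duplication/screening", 3120),
    ("Full texts assessed", 481),
    ("Studies included", 48),
]

def funnel_report(stages):
    """Return one line per stage, with the share of the previous stage retained."""
    lines, prev = [], None
    for label, n in stages:
        if prev is None:
            lines.append(f"{label}: {n}")
        else:
            lines.append(f"{label}: {n} ({n / prev:.1%} of previous stage)")
        prev = n
    return lines

for line in funnel_report(stages):
    print(line)
```

Only about 10% of assessed full texts (48 of 481) survived eligibility screening, which is worth keeping in mind when weighing how selective the evidence base is.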

Implications for AI Economics

  • Labor supply & participation: algorithmic governance alters job quality (loss of autonomy, higher instability). This can lower retention, change reservation wages, and alter occupational choice—potentially reducing labor supply in affected segments unless compensated by higher nominal pay or flexibility value.
  • Wage dynamics & income risk: dynamic pricing and automated task allocation increase income volatility and precarity. Economically, this raises the effective risk premium workers require and may increase demand for wage smoothing mechanisms (minimum guarantees, wage floors, or insurance).
  • Bargaining power & market structure: algorithmic opacity and automated deactivation weaken worker bargaining and increase monopsony‑like power for platforms that control assignment, evaluation, and sanctioning. This may depress equilibrium wages and increase asymmetric information rents captured by platforms.
  • Productivity vs. externalized costs: while algorithms can increase operational efficiency (matching, routing, utilization), a portion of productivity gains may be externalized onto workers via higher intensity, stress, and health costs. Ignoring these externalities risks overestimating net social gains from platformization.
  • Human capital and long‑run supply: chronic psychological strain (burnout, reduced accomplishment) can degrade human capital accumulation (skill depreciation, early exit), affecting long‑run labor quality and raising replacement/training costs.
  • Policy and regulation implications for economists/policymakers:
    • Transparency and auditability: regulations requiring explainability, reporting of algorithmic effects on work design, and independent audits could reduce information asymmetries and correct market power distortions.
    • Income guarantees and social insurance: subsidies, minimum pay rules, or mandated earnings floors can mitigate income volatility externalities and support labor supply.
    • Enforcement of human oversight: mandatory human review for deactivations and appeal mechanisms protect against automated errors and reduce psychological harm, which otherwise translates into productivity and turnover costs.
    • Measurement & evaluation: incorporate psychosocial risk indicators into cost‑benefit analyses of platform deployment; include mental‑health externalities when estimating social returns to algorithmic adoption.
  • Research gaps of economic relevance: need for longitudinal causal studies to quantify impacts on wages, labor supply elasticity, turnover costs, productivity net of health externalities, and distributional effects across Global North/South contexts. Better measurement will inform optimal regulatory design (tax/subsidy or direct regulation) and the appropriate scope of labor protections.
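The income-risk point above can be made concrete with a toy certainty-equivalent calculation. This is a hedged sketch, not anything from the paper: it assumes CRRA utility, a risk aversion of γ = 2, and a hypothetical mean-preserving weekly earnings lottery.

```python
# Toy illustration of the risk premium a volatile income stream imposes.
# All numbers (weekly wages, gamma) are hypothetical, not from the paper.

def crra_utility(c, gamma):
    """CRRA utility with relative risk aversion gamma (gamma != 1)."""
    return c ** (1 - gamma) / (1 - gamma)

def certainty_equivalent(outcomes, probs, gamma):
    """Sure income yielding the same expected utility as the lottery."""
    eu = sum(p * crra_utility(c, gamma) for c, p in zip(outcomes, probs))
    return (eu * (1 - gamma)) ** (1 / (1 - gamma))

# Volatile platform earnings: 600 or 1400 per week with equal odds (mean 1000).
mean_income = 1000.0
ce = certainty_equivalent([600.0, 1400.0], [0.5, 0.5], gamma=2.0)
risk_premium = mean_income - ce  # extra mean pay needed to match a steady wage
print(f"certainty equivalent: {ce:.0f}, risk premium: {risk_premium:.0f}")
```

Under these assumptions the worker values the volatile stream like a steady wage of about 840, so the gap (roughly 16% of the mean) is the sense in which volatility raises the pay a platform must offer relative to a steady-wage competitor, before any health externalities are counted.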


Assessment

Paper Type: review_meta
Evidence Strength: medium — Systematic PRISMA-guided synthesis of 48 peer-reviewed studies provides convergent evidence on associations between algorithmic management features and adverse psychological outcomes, but included primary studies are mostly observational (cross-sectional and qualitative), rely heavily on self-reports, and do not establish causal effects; heterogeneity and potential publication/selection biases limit causal inference.
Methods Rigor: high — The study follows PRISMA guidance, searches three major databases (Scopus, Web of Science, PubMed), screens 4,812 records to yield 48 eligible studies, uses structured data extraction and thematic synthesis, and integrates existing theoretical models; however, it does not report a formal meta-analysis or risk-of-bias quantification for individual studies, and is constrained by heterogeneity in measures and designs of included studies.
Sample: Systematic integrative review of 48 peer-reviewed studies published 2016–2025 identified from 4,812 records in Scopus, Web of Science, and PubMed; included mixed-methods evidence (quantitative prevalence estimates, cross-sectional surveys, qualitative interviews) primarily on platform (ride-hailing, delivery) and public transport drivers across multiple countries, with outcomes including anxiety, burnout, depressive symptoms, and fear of deactivation; primary study sample sizes and country coverage varied widely and are heterogeneous.
Themes: human_ai_collab, labor_markets
Generalizability:
  • Primary studies are heterogeneous across countries, platforms, and transport modes, limiting transferability to any single geography or regulatory regime.
  • Over-representation of platform workers may reduce applicability to traditional public transit systems.
  • Predominance of cross-sectional and self-reported measures limits inference about long-term/causal effects.
  • The search was limited to three major databases and peer-reviewed literature, which may exclude relevant grey literature, reports, and non-English publications.
  • Rapid evolution of algorithmic systems since 2016 means findings may not fully apply to newer management algorithms or adaptive AI systems.

Claims (13)

  • PRISMA-guided systematic integrative review of 48 peer-reviewed studies (2016–2025) sourced from 4,812 initial records (Scopus, Web of Science, PubMed). — Other · null_result · high confidence · Outcome: number of studies and records screened/included · n=48 · Score: 0.4
  • Thematic synthesis integrated the Job Demand-Control Model, Conservation of Resources Theory, and Algorithmic Management Theory to develop an integrated multilevel theoretical framework. — Other · null_result · high confidence · Outcome: theoretical integration · n=48 · Score: 0.4
  • Four control mechanisms emerged from the review: GPS tracking (panoptic surveillance), rating systems (emotional labour demands), dynamic pricing (income volatility), and automated sanctions (deactivation fear). — Task Allocation · null_result · high confidence · Outcome: presence/identification of algorithmic control mechanisms · n=48 · Score: 0.24
  • Platform workers experience 59.6% higher digital speed determination than traditional workers. — Task Allocation · negative · medium confidence · Outcome: digital speed determination · 59.6% higher · Score: 0.14
  • Platform workers receive 36.3% more third-party ratings than traditional workers. — Task Allocation · negative · medium confidence · Outcome: number of third-party ratings received · 36.3% more · Score: 0.14
  • Surveillance intensity is associated with hyper-vigilance (reported β = -4.213). — Worker Satisfaction · negative · medium confidence · Outcome: hyper-vigilance (psychological arousal/state) · β = -4.213 · Score: 0.02
  • Algorithmic opacity is linked to procedural anxiety. — Worker Satisfaction · negative · high confidence · Outcome: procedural anxiety · Score: 0.24
  • Income volatility from dynamic pricing is associated with depressive symptoms (reported prevalence range 23–41%). — Worker Satisfaction · negative · high confidence · Outcome: prevalence of depressive symptoms · 23–41% prevalence · Score: 0.24
  • Rating pressure is associated with emotional exhaustion, with 41–67% reporting high burnout. — Worker Satisfaction · negative · high confidence · Outcome: emotional exhaustion / high burnout prevalence · 41–67% high burnout · Score: 0.24
  • Task fragmentation (fragmenting tasks via platform algorithms) leads to a reduced sense of accomplishment among drivers. — Worker Satisfaction · negative · high confidence · Outcome: reduced sense of accomplishment · Score: 0.24
  • Fear of deactivation (automated sanctions) creates chronic precarity; 78% report chronic fear. — Job Displacement · negative · high confidence · Outcome: self-reported chronic fear of deactivation · 78% report chronic fear · Score: 0.24
  • Algorithmic management functions as 'psychological governance' that erodes worker mental health through surveillance, opacity, and precarity. — Worker Satisfaction · negative · high confidence · Outcome: worker mental health (general deterioration) · n=48 · Score: 0.24
  • Recommended regulatory responses include algorithmic transparency mandates, mandatory mental health risk audits, participatory co-design, human review of deactivations, and minimum wage protections aligned with ILO principles. — Governance And Regulation · positive · high confidence · Outcome: policy/regulatory interventions recommended · Score: 0.04
