The Construction of Normality: Biopolitics and Control
The Standard Human: Statistical Fictions and Biopolitical Control
Thesis: The standard human is a constructed ideal shaped by processes of statistical classification and biopolitical control. Using concepts like l’homme moyen and biopolitics, we can see how bureaucracies and infrastructures defined “normality,” excluded those who didn’t fit, and upheld systems of power that continue to regulate bodies today.
1. Statistical Norms and the Average Man (L’Homme Moyen)
L’homme moyen (Adolphe Quetelet’s “average man”) is the origin of the statistical norms used to model the “standard human.” Through classification and standardization (e.g., IQ, height, weight, behavior), this statistical average came to function as an ideal.
Political and Bureaucratic Context:
- Used in eugenics to justify ideas of fitness versus unfitness.
- Applied to IQ testing, military sorting (e.g., Army Alpha), and social policy.
Consequences of Statistical Classification:
- Human variation became defined as deviance.
- “Statistical individuals” emerged—people defined not by their identity but by probability and data points.
2. Biopolitics and the Management of Populations
Biopolitics (Michel Foucault) is the management of populations through health, reproduction, and control over life itself. Categorization (fit/unfit, healthy/sick) sorts who is supported versus who is policed or excluded.
Examples of Biopolitical Control:
- Eugenics and forced sterilization.
- Pass laws in apartheid regimes.
- Credit scores as tools of behavioral control.
- Data doubles and data aggregates used in predictive policing or targeted advertising.
Consequences of Biopolitical Sorting:
- Surveillance of marginalized populations.
- Reduction of people to behavioral patterns.
- Pseudoscientific justifications for systemic inequality.
3. Infrastructure and the Enforcement of Standards
Infrastructure enforces the “standard” invisibly (Paul Edwards). Tools like biometrics, ledgers, and census categories become anonymous power—regulating without visible force.
Examples of Infrastructural Bias:
- Facial recognition disproportionately misidentifying Black faces.
- Databases controlling access to credit, housing, or health.
Conclusion on Norms:
The “standard” human becomes racialized, gendered, and economically situated—yet is treated as neutral.
Conclusion: Challenging Constructed Norms
The “standard human” is a product of classification and biopolitical control, created through statistical fictions and upheld by institutional systems. True justice means recognizing that there is no neutral body—only constructed norms—and embracing multiplicity in how we design, measure, and govern human life.
Technology Defines Users: Embedded Values and Behavioral Control
Thesis: Technologies like facial recognition, IQ tests, and casino interfaces are not passive tools; rather, they are embedded with political, economic, and cultural values that define users according to standardized roles. Drawing on concepts such as reward schedules, algorithms, and classification, this essay examines how systems shape behavior, who is excluded or harmed, and how individuals resist or repurpose these roles.
1. Reward Schedules and Engineering Experience
Casino and digital systems define users as “players,” valued for how long they stay engaged rather than for who they are. Such systems aim to induce “the zone” (Schüll), a state in which autonomy and awareness fade.
Motives and Design:
- Economic Motive: Extract as much attention and money as possible.
- Cultural/Political Motive: Design for compulsion, not informed consent.
Resistance:
Resistance is often limited as design overpowers self-regulation. However, growing public awareness, gambling regulation, and research (e.g., Schüll’s Addicted by Design) expose this manipulation.
2. IQ, Classification, and Fixed Social Positions
Classification standards, such as IQ tests, define users as statistically ranked individuals—assigned a fixed position in society (smart/slow, gifted/unfit).
Political and Cultural Values:
- Used in eugenics, school tracking, immigration limits, and military hierarchy.
- Intelligence is often seen as innate, measurable, and predictive of success.
Resistance and Critique:
- Binet himself resisted the idea that his test measured fixed, innate intelligence.
- Communities push back on tracking systems.
- Critiques highlight bias in testing (race, language, class).
- Rise of alternative learning models.
3. Biometrics, Facial Recognition, and Coded Exposure
Face recognition and biometrics classify people as legible or illegible, safe or suspicious, based on training data that often excludes or misrepresents nonwhite bodies.
Motives and Consequences:
- Political Values: Surveillance and control (used in predictive policing, protester identification, and immigration systems).
- Economic Motives: Profit from surveillance capitalism.
- Cultural Values: Assumption of objectivity and efficiency.
Resistance:
- Activist efforts to ban facial recognition in cities.
- Public scandals (e.g., Robert Williams’ false arrest).
- Scholars like Ruha Benjamin call for refusal and redesign rooted in justice.
Conclusion: Value-Centered Design
Technologies define users according to embedded values, often invisibly—but these definitions can be challenged. True justice in design requires moving beyond user-centeredness to value-centered design—considering who we imagine users to be, and who we let them become.
Key Concepts in Classification, Biopolitics, and Technology
1. Making Up People
- Static Nominalism
- L’homme moyen (Adolphe Quetelet)
- Dynamic Nominalism (Ian Hacking)
- Identity
- Looping effect
- Kinds of persons
- Example: Shell shock transitioned to PTSD
- Example: Homosexuality/Heterosexuality
2. Classifications and Biopolitics
- Classification Systems
- Example: Aristotelian classification
- Effect: UNESCO Statement on Race (1951)
- Example: Linnaean system
- Anthropometry
- The Bell Curve (Adolphe Quetelet) – Nurture
- L’Homme Moyen (Adolphe Quetelet)
- Measurable intelligence (Charles Spearman)
- First intelligence test (Alfred Binet)
- Example: Intelligence Testing (Lewis Terman)
- Example: Army Alpha
- Miscegenation
- Example: Census categories
- Effect: Elizabeth Warren controversy
- Example: Pass laws
- Example: Inheritance charts
- Hereditary Genius (Francis Galton) – Nature
- Eminence
- Biopolitics and Control
- Eugenics
- Fit vs. Unfit: “Menace of the feebleminded”
- Sterilization
- Buck v. Bell (1927 Supreme Court case)
- Individual vs. Population
- Positive Eugenics
- Example: Margaret Sanger
- Example: Better Baby contests
3. Law of Amplification and Infrastructure
- Standards and Infrastructure
- Example: Wistar rat (standardized biological model)
- Example: Grain elevators (standardized measurement)
- Infrastructure
- Anonymous power
- Example: Computers in the classroom
- Example: ARPANET
- Example: Oosterschelde Storm Surge Barrier
- Market Capitalization and Data
- Example: Facebook
- Biometrics
- Example: Big Data
- Ledger
- Example: Card file vs. computer databases
- “Data doubles” transitioning to “Data Aggregates” & “Personalized Aggregations”
- “Data exhaust” leading to the Statistical Individual
- Example: Credit scores
- Example: Project Green Light (surveillance)
- Engineering experience
4. Complex Personhood
- Hybrid Identities
- Example: Spirit ambulance
5. Regimes of Perceptibility and Attention Economy
- International Style and Scientific Management
- International Style (Le Corbusier)
- Example: Ribbon window and Modular units
- Scientific Management (Frederick Winslow Taylor)
- Efficiency
- Example: “Machine for living”
- Example: Brasília (urban planning)
- Regime of Perceptibility
- Popular epidemiology vs. Toxicology
- Dose-response curve
- Example: Sick building syndrome
- Attention as a Commodity (Attention Economy)
- Engineering experience
- Example: Reward schedules
- Computational specificity
- Affect/emotion manipulation
- Example: “The zone” in casino gaming (Schüll)
- Individual vs. Aggregate (Zeynep Tufekci)
- Example: Player tracking algorithms & eye-tracking technology
6. Data, Algorithms, and Uncertainty
- Data Infrastructures and Surveillance
- Example: Cold War defense projects
- Anonymous power
- Casino gaming
- Reward schedules
- Player tracking
7. Science and Liberal Democracy
- Merchants of Doubt and Agnotology
- Merchants of Doubt (Naomi Oreskes & Erik Conway)
- Agnotology (the study of culturally induced ignorance)
- Example: Western Fuels Association
- Anthropocene
- Example: Climate models
- Human-nature hybrid
- Example: Great acceleration → CO2 → Greenhouse Effect → Climate Change
- Effect: Earth Day
- Manufacturing Doubt
- Example: Cigarettes and health risks
- The Public and Technologies of Humility
- Pure science (Naomi Oreskes)
- Technologies of Humility (Sheila Jasanoff)
- Vs. Technologies of Hubris
- Example: Chernobyl disaster
- Example: Sellafield and Cumbrian sheep farmers (radioactive contamination)
- Risk Society (Ulrich Beck)
- Framing, Vulnerability, Distribution, Learning
