Consiglieri

Multi-Stakeholder Analysis

Each article in this series was subjected to a structured multi-stakeholder analysis. An orchestration system convenes a panel of 5-7 AI agents, each representing a distinct stakeholder perspective (displaced workers, policymakers, investors, technologists, and others), and runs them through several rounds of position-building followed by multiple rounds of adversarial cross-stakeholder debate.

The agents construct formal Toulmin argument breakdowns, identify falsifiers, map system dynamics, and then stress-test each other's reasoning under rules designed to suppress courtesy agreement and reward genuine challenge.
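
For readers unfamiliar with the Toulmin model, a breakdown of this kind can be sketched as a simple record. The schema below is illustrative only, not the system's actual data model; the field names follow Toulmin's standard components, plus the falsifiers described above.

```python
from dataclasses import dataclass, field

@dataclass
class ToulminArgument:
    """One stakeholder's argument breakdown (illustrative schema, not the system's actual model)."""
    claim: str                   # the thesis the stakeholder advances
    grounds: list[str]           # evidence offered in support of the claim
    warrant: str                 # why the grounds are taken to support the claim
    backing: str                 # support for the warrant itself
    qualifier: str               # strength/scope, e.g. "in most historical cases"
    rebuttals: list[str] = field(default_factory=list)   # conditions under which the claim fails
    falsifiers: list[str] = field(default_factory=list)  # observations that would disprove the claim
```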

The coordinator synthesizes the surviving claims, irreconcilable tensions, and conditional theses into a final memo. The goal is not consensus but clarity: surfacing the assumptions a thesis depends on, the populations it ignores, and the conditions under which it breaks.
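
A minimal sketch of that control flow, assuming hypothetical interfaces: the method names (build_position, critique, revise, synthesize) and the round counts are invented for illustration and are not the system's actual API.

```python
from typing import Protocol

class StakeholderAgent(Protocol):
    name: str
    def build_position(self, article: str, prior: str | None) -> str: ...
    def critique(self, position: str) -> str: ...
    def revise(self, position: str, critiques: list[str]) -> str: ...

class Coordinator(Protocol):
    def synthesize(self, article: str, positions: dict[str, str]) -> str: ...

def run_panel(
    article: str,
    agents: list[StakeholderAgent],
    coordinator: Coordinator,
    position_rounds: int = 2,
    debate_rounds: int = 3,
) -> str:
    """Illustrative control flow: position-building, adversarial debate, synthesis."""
    # Position-building: each stakeholder drafts and refines its own position.
    positions: dict[str, str | None] = {a.name: None for a in agents}
    for _ in range(position_rounds):
        for agent in agents:
            positions[agent.name] = agent.build_position(article, positions[agent.name])

    # Adversarial debate: every agent stress-tests every other agent's position,
    # under prompts that penalize courtesy agreement.
    for _ in range(debate_rounds):
        critiques = {
            a.name: [b.critique(positions[a.name]) for b in agents if b is not a]
            for a in agents
        }
        for agent in agents:
            positions[agent.name] = agent.revise(positions[agent.name], critiques[agent.name])

    # Synthesis: the coordinator folds surviving claims, irreconcilable
    # tensions, and conditional theses into a final memo.
    return coordinator.synthesize(article, {k: v for k, v in positions.items() if v is not None})
```

The separation mirrors the description above: the agents never write the memo themselves; only the coordinator synthesizes.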

The process takes 20-40 minutes per article and produces analysis that is often sharply critical of the source pieces. The aim is to surface blind spots, identify what survives scrutiny, and give readers a more complete picture than any single take provides.

AI isn't coming for your future. Fear is. - Analysis

Connor Boyack's thesis — that fear of AI displacement is a historically recurring cognitive error and that the real threat is mindset, not technology — survives multi-stakeholder scrutiny as a *descriptive* claim about recurring panic patterns and as a *conditional* macro-level prediction, but fails as prescriptive guidance and is actively harmful as a political argument against institutional intervention.

Tool Shaped Objects - Analysis

Will Manidis's "Tool Shaped Objects" thesis — that most current AI/LLM adoption produces the feeling of productive work rather than actual economic output — survives multi-stakeholder scrutiny with significant refinements.

Something Big Is Happening - Analysis

Matt Shumer's thesis — that AI crossed a critical capability threshold in February 2026 and will eliminate massive numbers of knowledge jobs within 1-5 years — survives multi-stakeholder scrutiny in direction but fails on timeline, mechanism, and scope.
