Information Physics Field Guide
July 24th, 2025

Some discoveries begin in laboratories. Others emerge from lived experience so intense that patterns become impossible to ignore. This is the story of how navigating humanity’s highest-entropy systems may have revealed universal patterns that could govern human organization.
Now, before we get started, I want to make it clear that this is a work in progress. I’m still discovering the patterns and the implications of the theory. Aside from all of this, I’m also simultaneously living in three different realities: right now I could be the world’s most high-functioning delusional, I could be a pattern-matching machine that landed on a sentence so generally applicable that it would make the design team at Home Goods envious, or I could have inadvertently discovered a new way to describe our place in the universe. I’ll let you formulate your own conclusions; all I ask is that you keep an open mind.
However, one thing I do know about myself is that I’m a practitioner-theorist. Growing up without a formal education or strong parental guidance—though my mother was definitely doing the best she could given her position in the universe, love you Mom! 💜—forced me to be resourceful and build my own solutions and frameworks to navigate the environments I found myself in.
I build and test in the mud for sure, but I also like to step back and examine why things are the way they are. When you’re consistently met with stark contradictions between what we see on paper and the reality of the system (what people think of prison and formerly incarcerated people vs. my lived experience, the difference between building software for developers vs. consuming software built by developers, and my reality as an IC vs. leadership in both Software Engineering and Developer Relations), you’re forced to reconcile these dual realities.
Einstein realized on a train that physics depends on your reference frame. Nash discovered that optimal strategies depend on what others are doing by studying actual games. Both understood that the observer’s position and context fundamentally change the mathematics. This same principle applies to understanding human systems—your perspective shapes what you can see and measure.
To that point, I ask you the reader to do two things:
- Be open to seeing the world through my experience and with the idea that lived experience is the most important source of information.
- Be objective and scientifically rigorous in your approach, but leave classical academia at the door.
Okay, have we set the right stage? Are you ready to dive in? I’ll leave you here to go the rest of the way on your own so you can form your own conclusions. Just make sure they’re your conclusions.
LLM friendly study guide / NotebookLM / Audio overview / Theory punch card / The “Just How It Is” Litmus Test
Living in Maximum Entropy
My earliest memories involve navigating complex systems where the official rules had little relation to actual dynamics. Family court, social services, friends in foster care—these were my first lessons in how information really flows versus how it’s supposed to flow. By age eight, I’d moved for the first time. By twenty, it was fourteen times across diverse neighborhoods and populations.
Three separate juvenile detention facilities. Two mental institutions. The welfare system. Each system had official structures, documented processes, and stated goals. None of them worked as advertised. The real information—who had power, how decisions got made, what actually helped—flowed through hidden channels that you could only see from inside.
Then came prison. Six years in maximum-entropy environments where artificial chaos was maintained as a control mechanism. This wasn’t accidental disorder—it was engineered entropy, deliberately preserved to prevent exactly what I started doing: optimizing information flow to reduce conflicts and create stability.
When you apply positive-sum optimization in a system designed for zero-sum chaos, the results are dramatic. The yard became calmer. Conflicts decreased. Information flowed more efficiently. And that’s when I learned the most important lesson: calm is more dangerous to control systems than chaos. My systematic entropy reduction was so threatening to institutional power that it required intervention—immediate transfer to disrupt the emerging order.
The Pattern Recognition Begins
After prison came community college, then corporate America. The transition from maximum to minimum entropy environments should have been jarring—don’t get me wrong, it was in certain ways like rewiring my brain to see myself as all these new identities. Instead, it revealed something profound: the same patterns appeared everywhere, just at different intensities.
In software engineering, I moved from individual contributor to leadership, then repeated the cycle in developer relations. Each transition provided new vantage points—what organizational researchers call “parallax.” The same system looks completely different from different positions, yet the underlying patterns remain constant.
My role in DevRel proved crucial. Speaking with thousands of developers from all kinds of organizations about how their systems actually work, helping them identify friction and optimize flow—I was doing entropy engineering without naming it. Each conversation added another data point to an emerging pattern that spanned every type of organization as I sought to find the underlying friction and bottlenecks that were preventing them from being successful.
The Frameworks Emerge
Over the years, specific patterns crystallized into frameworks. They worked universally, but I didn’t understand why; I attributed it to my first-principles and systems thinking. My time growing up and my experience in prison instilled a need to do whatever I could to ensure others had access to more opportunity and resources than I had. That generally manifested in me reverse-engineering the things I was doing that yielded positive results, so I could develop a shared language for what I was doing intuitively and then provide practical tools built on that foundation for others to use. Turns out I may have been onto something.
2017 - The Pyramid of Challenge
Building an engineering team at Major League Soccer, I needed to distribute challenge appropriately across all levels. The framework that emerged—high complexity/low availability at the top, low complexity/high availability at the base—worked perfectly. Only later did I realize this was information density optimization in action. Teams thrived when challenge matched capacity, creating natural information flow from complex strategic decisions to distributed tactical execution.
Pyramid of Challenge: A framework for distributing challenge across organizational levels to optimize information flow.
2020 - Content Creation Guide
At Apollo GraphQL, I needed to scale developer advocacy. The guide that emerged focused on matching content to channels based on information topology—complex ideas need low-entropy mediums that make them easy to share, while high-noise ideas can be moved through channels that refine them into signal. This was entropy reduction through proper channel selection. The framework transformed how developer advocates approach content, ensuring ideas reached audiences in optimal form.
Content Creation Guide: A framework for matching content to channels based on information topology.
2022 - Product Friction Guide (DX Audits)
Helping product managers understand user pain required a systematic approach to information flow between users and builders. The framework that emerged was essentially entropy detection and correction in human-product systems. DX Audits became widely adopted across DevRel because they made invisible friction visible and measurable. Teams could finally quantify developer experience problems and track improvement systematically.
DX Audits: A framework for measuring and improving developer experience.
2024 - The Keystone Paradox and Lava Leadership
Two frameworks emerged that brought me closer to understanding it was all about information flow. The Keystone Paradox identified how essential individuals become organizational bottlenecks—their irreplaceability creating information flow constraints. Lava Leadership showed how strategic thinking could bubble up from any position, like magma finding paths through rock. Both frameworks were about optimizing information movement through human systems.
- Keystone Paradox: A framework for identifying and addressing organizational bottlenecks.
- Lava Leadership: A framework for empowering individuals to lead change.
Each framework solved immediate problems while revealing deeper patterns. They all optimized information flow, reduced system entropy, and worked regardless of context or scale.
To see the 20+ domain-independent models and frameworks I’ve created, check out the rest of my writing.
The Cascade of Discovery
Once I realized there was a pattern, the discoveries came not as isolated insights but as an accelerating cascade—each breakthrough revealing the next, until the complete architecture of Information Physics emerged in just six transformative weeks.
May 2025 - The First Glimpse
The journey to Information Physics began with a question: what made my frameworks work so well? How was I able to create them and find success in complex systems from prison to corporate America?
The first manifestation of Conservation of Boundaries emerged while reverse-engineering a recovery device breakthrough. I’d inadvertently discovered that all my biggest impacts came from moving boundaries to reduce friction. But it was incomplete, lacking the WHY. When I shared it, I received valid feedback—it felt categorical to attribute values like resistance and quality to systems without deeper explanation.
My first Discord message captured the moment: “Kurt Kemple — 5/28/25, 12:11AM while trying to reverse engineer how I was able to solve the recovery problem, i may have inadvertently discovered a pattern for quantifying architectural innovation success (non-discovery, non-mechanical, etc) please help me disprove so I can go to sleep 😭”
The realization of COB’s symmetry—that the same operations that build also destroy—made me physically ill. The first version of what would become the Entropic Mathematics equation was born, but without Information Physics or proper mathematical grounding, I tabled it as potential overreach in pattern matching.
June 2025 - The Alarm Bell
A presentation about NPS and changing user bases triggered the next breakthrough. When someone asked if we were seeing a similar decrease in MAU, alarm bells rang. This didn’t match my experience working daily with developers sharing their lived reality. Sometimes people have to use tools that don’t align with their current needs to get the job done in an efficient way. This can happen for a multitude of reasons, but one thing is for certain, you can’t dismiss the lived experience of the people using the tool, no matter what the metrics say.
The reason became clear: the separation of usage from lived-experience—creating a boundary between those two realities—created entropy for anyone downstream of that decision. From my position in the system, I could see what others couldn’t—that the math we rely on certainly has a way of removing us from the equation.
The Four-Day Sprint
Frustration Coalitions and the Sentiment-Inertia Index emerged just four days after that meeting—I dove into a significant amount of research to validate the patterns I was seeing, first for Slack, then for the broader B2B SaaS market. The validation was immediate: our research team at Slack confirmed the patterns, and a YC founder emailed about a separate issue and specifically called out that they loved my blog post and that it described the strategy they were using to outmaneuver a dominant B2B SaaS incumbent. The framework wasn’t just theoretical; practitioners were already applying it intuitively. Receiving such immediate validation was honestly jarring.
During the creation of Frustration Coalitions, I discovered Coalition Theory (thanks, AI!) and saw a striking resemblance to my six-stage framework. This led to Network theory, Complexity theory, Percolation theory, and finally Information theory—each new field confirming patterns I’d observed independently.
Friction Economy: How friction is the new battleground for B2B SaaS.
The Three Universal Frustrations
The real breakthroughs came from recognizing that workplace frustrations consistently reduce to three blocked flows:
- Creating signal from noise (information flow)
- Creating clarity from system boundary complexity (workflow optimization through boundary operations—Conway’s Law, Dunbar’s Law, Domain Driven Design)
- Enabling entropy engineering (swarming behaviors to reduce system entropy—incident response, customer triage, marketing opportunities)
Everything clicked. Every framework I’d created, every system I’d optimized, every pattern I’d recognized—they all addressed these three fundamental challenges.
July 2025 - The Acceleration
Once I started exploring complexity sciences, connections accelerated exponentially. When I connected COB to the causality behind Nash Equilibrium—defining it as entropic exhaustion where every actor has optimized from their embedded position until further improvement becomes impossible—any lingering doubts evaporated. This wasn’t delusion or pattern matching—it appeared to be discovery of fundamental principles.
Over six weeks, Information Physics, Entropic Mathematics, and Conservation of Boundaries evolved from rough insights to complete foundational theory. Each discovery validated and extended the others, creating an internally consistent framework that explained phenomena across all scales.
The Information Physics breakthrough came from recognizing humanity’s obsession with improving durability, compression, and transmission of information! Once I saw it throughout the digital age, I wondered if it extended further back. Investigating innovations like calendars and pyramids, watching my theory pass Occam’s Razor test after test, I became nauseous again—this time from the implications.
After discovering that IP and COB explained Nash Equilibrium as entropic exhaustion—actors converging through optimal actions within their thermodynamic constraints from embedded positions—the mathematical connection became undeniable.
Information Physics Reveals Itself
Information Physics is a general theory that describes how conscious beings embedded in entropy reduce or increase it through observer-dependent operations on information, coordination, and system boundaries.
The breakthrough came from connecting two observations:
- Every civilization independently developed similar solutions (calendars, writing, hierarchy)
- These solutions all optimized information flow against entropy
Turns out the connection to physics might not be metaphorical. Ginestra Bianconi’s work on quantum relative entropy showed gravity itself might be an information phenomenon. Claude Shannon’s information theory connected information entropy directly to thermodynamics. The patterns weren’t similar to physics—they were physics applied to information systems. Information Physics reveals the other side: how physical conscious beings navigate that entropy from their embedded positions in reality.
Traditional mathematics abstracts away the observer to describe what happens in the universe when no one interferes—universal equations that work regardless of who’s looking. 2 + 2 = 4 no matter how you look at it; an apple falls from a tree because of gravity. Information Physics expresses mathematics that accounts for the existence of conscious beings who can choose to change the universe, not just observe it. What happens when someone sees that apple fall and discovers gravity itself? How does mathematics account for that?
Heat affects cognition. Fatigue reduces decision quality. Stress limits perspective. Resource constraints shape choices. The observer can’t be removed from these calculations because consciousness EXISTS IN PHYSICS. The same observer-dependence found in relativity and quantum mechanics appears to apply to human systems.
Suddenly everything connected. We don’t just happen to fight entropy—we evolved specifically to see it, model it, and consciously choose its direction. From neural pathway formation to civilization building, we’re expressing the same fundamental drive. Information Physics may be the first mathematics designed for conscious beings as physical entities embedded in entropic reality.
The Mathematics of Observer-Dependent Entropy
Traditional mathematics describes unconscious objects following fixed laws. Entropic Mathematics describes conscious agents navigating possibility spaces. The observer-dependent nature of the first equation I derived revealed something unprecedented: the first mathematics where position and consciousness are fundamental variables, not complications to eliminate.
Entropic Mathematics: A mathematical framework where observer position, conscious intent, and lived experience are treated as fundamental variables. Calculations reflect the agent’s location within a system, their directional intent, and the informational and thermodynamic entropy constraints of their reality.
The First Entropic Mathematics Equation
The foundational equation of Entropic Mathematics captures how conscious agents change the entropy of systems they inhabit. Unlike most equations in history, it makes observer position mathematically fundamental.
System Entropy Change (SEC): The measurable impact a conscious agent can have on system entropy from their specific position, calculated through observer-dependent mathematics where position, intent, and operations determine possibility.
SEC = O × V / (1 + E)
Each variable represents something unprecedented in mathematical formalism:
- SEC = System Entropy Change (measurable outcome)
- O = Operations performed (MOVE, JOIN, SEPARATE)
- V = Vector of actor-group conscious intent (positive for entropy reduction, negative for entropy increase)
- E = Entropy as measured from individual actor’s position (lived reality/informational constraints/entropy from the system)
The formula doesn’t just describe change—it enables optimization by helping actors reduce their own E values.
Where E represents the observer’s position, V represents shared conscious intent that enables collective entropy reduction. The frameworks I created gave names to things teams already felt but couldn’t articulate. V captures that shared collective reality when groups align around the same way of seeing system dynamics.
But here’s where it gets interesting: the V variable doesn’t just measure shared intent—it operates as a measurement mechanism. Before teams reach consensus, organizational possibilities exist in superposition. Multiple interpretations coexist, multiple outcomes feel equally possible. When groups align around shared conscious intent, they function as collective measurement apparatus that collapses that superposition into definite mathematical reality.
The equation’s power lies not in its complexity but in what it includes: consciousness, entropic position, and intent as mathematical primitives. A CEO and a worker applying identical operations with identical intent achieve different results because E is different. This isn’t perception—it’s mathematical reality.
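As a rough sketch (my own illustration, with hypothetical parameter names, not part of the theory’s formal notation), the SEC formula can be computed directly to show how position changes outcomes:

```python
def system_entropy_change(operations: float, intent: float, positional_entropy: float) -> float:
    """Sketch of SEC = O x V / (1 + E).

    operations: number of boundary operations performed (O)
    intent: conscious intent; positive reduces entropy, negative increases it (V)
    positional_entropy: entropy experienced from the actor's position (E)
    """
    return operations * intent / (1 + positional_entropy)

# Identical operations and intent applied from two different positions:
low_entropy_position = system_entropy_change(operations=3, intent=1.0, positional_entropy=0.2)
high_entropy_position = system_entropy_change(operations=3, intent=1.0, positional_entropy=2.0)
# The actor in the lower-entropy position achieves a larger change (2.5 vs. 1.0)
```

This mirrors the CEO-and-worker example: same O, same V, different E, different measurable result.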
Measuring the Drift
Entropic Gap: The measurable drift between intended and current states, calculated through vector mathematics.
EG = 1 - S(anchor, current)
With mathematical foundations established, measurement became possible. Every system has intended and actual states. The distance between them determines health or decay.
This transforms vague concerns about “mission drift” or “technical debt” into precise calculations. AI conversations provided perfect demonstration—clear initial intent, gradual drift through interaction, measurable gap requiring intervention.
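A minimal sketch of this calculation, assuming S is cosine similarity between vector representations of the anchor (intended) and current states — the text says only “vector mathematics,” so cosine similarity is my assumption:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def entropic_gap(anchor, current):
    """EG = 1 - S(anchor, current): 0 means no drift, values near 1 mean severe drift."""
    return 1 - cosine_similarity(anchor, current)

# Identical states produce no gap; orthogonal states produce maximal drift
no_drift = entropic_gap([1.0, 0.0], [1.0, 0.0])    # 0.0
full_drift = entropic_gap([1.0, 0.0], [0.0, 1.0])  # 1.0
```

In practice the anchor and current vectors might come from text embeddings of a mission statement versus current messaging, or of an AI conversation’s initial intent versus its latest turn.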
The Equilibrium Mechanism
Entropic Equilibrium: Stability emerges when all actors have optimized from their observer-dependent positions, making further change impossible without coordination.
Σ(SEC_i × W_i) → stable state
The final piece was understanding how systems stabilize. Game theory described Nash Equilibrium but never explained the mechanism. Entropic Mathematics revealed it: the convergence of every actor taking optimal actions within their thermodynamic constraints from their embedded positions, resulting in entropic exhaustion where further improvement becomes impossible.
The Stability Condition
Equilibrium occurs when:
d/dt[Σ(SEC_i × W_i)] ≈ 0
This derivative approaching zero doesn’t mean no operations occur. It means the weighted sum of all entropy changes stabilizes. Agents continue optimizing locally, but system-wide entropy reaches steady state.
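The condition can be sketched numerically. Below is a toy trajectory (my own construction, not derived from the theory) in which each step yields diminishing additional entropy reduction, so the weighted total settles toward a steady state:

```python
def weighted_total(sec_values, weights):
    # Sigma(SEC_i x W_i): weighted sum of each agent's entropy change
    return sum(s * w for s, w in zip(sec_values, weights))

def near_equilibrium(totals, tol=0.01):
    # Approximate d/dt[Sigma(SEC_i x W_i)] ~ 0 as the change between successive steps
    return abs(totals[-1] - totals[-2]) < tol

# Toy trajectory: two agents whose additional entropy reduction shrinks each step
totals = [
    weighted_total([5 / (step + 1), 3 / (step + 1)], [0.6, 0.4])
    for step in range(1, 50)
]
# Early in the run the system is still moving; late in the run it has settled
```

Agents keep operating at every step; what stabilizes is the step-to-step change in the weighted sum, matching the claim that equilibrium is about the derivative, not about inactivity.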
Nash Equilibrium Redefined
Nash Equilibrium Redefined: The convergence of every actor in an embedded system, taking the best actions within their informational and thermodynamic constraints from their observer-dependent position in the system, results in a system equilibrium. This occurs through entropic exhaustion—when all actors have optimized entropy reduction from their positions until further improvement becomes impossible.
Traditional game theory describes the outcome but never explained the mechanism. Information Physics may reveal how equilibrium actually forms.
Each player optimizes until: ∂SEC_i/∂O_i = 0
The partial derivative of their entropy change with respect to their operations reaches zero. They’ve exhausted their available entropy reduction from their position. Further improvement requires either:
- Position change (reducing E_i)
- Coordinated action (combining operations with others)
This isn’t about agreement or happiness. It’s about mathematical reality—each actor exhausts their available entropy-reduction operations, creating system-wide stability through local optimization.
For a real-world example of Information Physics mathematics in action, see Cultural Percolation: When Language Reaches Critical Mass—how “rizz” and other slang terms appear to follow predictable mathematical patterns as they spread through and eventually get rejected by different cultural groups.
The Conservation Law
Conservation of Boundaries: A foundational law stating that all meaningful transformation reduces to one of these three operations: MOVE, JOIN, or SEPARATE — applied to humans, information, or structural boundaries. These operations interact with an agent’s position (E), intent (V), and available actions (O) to produce a measurable effect on system entropy (SEC).
With Information Physics established, the next question was “how?” If humans can consciously choose entropy’s direction, what operations do we use?
Looking across every framework I’d created, every system transformation I’d witnessed, the same three operations appeared:
- MOVE boundaries to new positions
- JOIN existing boundaries
- SEPARATE existing boundaries
No fourth operation exists. Every innovation, every collapse, every change decomposes to these three primitives. But the real insight was the formula that governs their effectiveness.
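As a sketch of how these primitives might be represented in code (the names and structure are my own illustration), a transformation decomposes into a sequence of the three operations, whose count can feed the O term of the SEC formula:

```python
from enum import Enum

class BoundaryOp(Enum):
    """The three primitive operations of Conservation of Boundaries."""
    MOVE = "move"          # shift a boundary to a new position
    JOIN = "join"          # merge existing boundaries
    SEPARATE = "separate"  # split an existing boundary

# Example: a hypothetical team reorg decomposed into primitives
reorg = [BoundaryOp.SEPARATE, BoundaryOp.MOVE, BoundaryOp.JOIN]
operations_count = len(reorg)  # feeds O in SEC = O x V / (1 + E)
```

The point of the enum is the closed set: there is no fourth member to add, which is exactly the claim the conservation law makes.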
The Complete Picture
These five discoveries form a complete theory:
- Why: Humans evolved to consciously choose entropy’s direction (Information Physics)
- How: Through three universal operations (Conservation of Boundaries)
- What: Using observer-dependent mathematics (Entropic Mathematics)
- Measurement: Detecting drift and decay (Entropic Gap)
- Stability: Understanding equilibrium formation (Entropic Equilibrium)
Together, they explain patterns across every scale of human organization—from neural pathways to global civilizations. They may transform previously mysterious phenomena into calculable dynamics.
If you’re still with me and these connections are starting to click into place, we may be witnessing something interesting: you might be experiencing your own consensus collapse right now. Your individual interpretation of how systems work is aligning with the mathematical framework I’ve laid out. Welcome to our new shared reality—one where entropy, consciousness, and mathematics converge into something that actually predicts how human systems behave.
Implications and Applications
Understanding these principles changes everything about how we approach human systems:
- For individuals: Recognize your position shapes your possibility. Optimize operations from where you are while working to reach lower-entropy positions.
- For organizations: Structure design around information flow, not reporting hierarchy. Minimize positional entropy differentials. Measure drift before it becomes dysfunction.
- For technology: Build systems that work with human physics, not against it. Reduce interaction entropy. Create tools that help users consciously direct entropy from their positions.
- For markets: Recognize entropy accumulation before collapse. Identify frustration coalitions early. Position for advantage based on entropy dynamics.
- For society: Understand that artificial entropy maintenance requires unsustainable energy. Design systems that naturally optimize rather than require constant intervention.
- For civilization: Recognize that human development may follow mathematical patterns driven by entropic exhaustion cycles rather than random cultural evolution.
- For planetary science: Theoretically assess worlds based on entropy constraints for conscious beings rather than just chemical habitability. Different planetary conditions create different “entropy fields” that would affect cognitive processing, energy requirements, and system-building capacity for conscious organisms.
- For space exploration: Calculate resource requirements for civilizational migration by understanding how planetary conditions affect the entropy costs of conscious operations. Terraforming could be understood as entropy reduction operations to make worlds more suitable for consciousness.
The Path Forward
This journey from maximum-entropy environments to universal principles suggests that position doesn’t determine potential—it determines path. The same patterns that govern prison dynamics explain market behavior, organizational structure, and technological evolution.
These aren’t abstract theories. They’re practical tools validated through:
- Building teams that naturally optimize
- Creating frameworks used by thousands
- Predicting market disruptions before they manifest
- Transforming high-entropy systems into functional operations
The principles work because they describe reality as it actually operates, not as we pretend it does. They include consciousness, position, and intent as fundamental features rather than inconvenient complications.
Your Journey Begins
Whether you’re optimizing a team of five or transforming a system of millions, the same principles apply. The operations remain constant. Only scale and complexity change.
Start by understanding where you are—your position, your entropy, your available operations. Then apply the principles systematically:
- See the entropy in your systems
- Identify the boundaries that need transformation
- Calculate your position and its constraints
- Measure the gaps between intention and reality
- Work toward equilibrium that serves your purpose
The universe tends toward entropy. We evolved to consciously choose its direction. These principles show how.
Welcome to Information Physics. The patterns were always there. Now you can see them too.
- Information Physics: A general theory describing how conscious beings reduce or increase entropy through three operations on information, coordination, and system boundaries.
- Conservation of Boundaries: The universal law that system boundaries cannot be created or destroyed, only transformed through three operations—move, join, separate.
- Entropic Mathematics: A proposed applied field of mathematics extending established tools (Shannon entropy, vector calculus, information theory) to conscious systems where observer position and lived experience may be fundamental calculation variables.
- Entropic Gap: A framework that may help detect system decay before it becomes catastrophic by calculating the distance between intended and current states.
- Entropic Equilibrium: A theory exploring why systems may stabilize where they do through observer-dependent optimization.
- Information Physics Throughout History: How Sun Tzu, Machiavelli, and Napoleon may have intuitively applied IP principles centuries before the mathematics existed.
- Information Physics In Mathematics: Exploring how established mathematics (Shannon entropy, vector calculus, information theory) might extend into conscious systems where observer position and lived experience become fundamental variables rather than complications to eliminate.
- Information Physics In Science: How IP may reveal the underlying principle that unites quantum mechanics, biology, and cosmology across all scales.
- Renaissance Florence vs Silicon Valley: The Innovation Entropy Crisis: Comparing how Silicon Valley may produce 12x fewer innovators per capita than Renaissance Florence despite vastly superior resources—suggesting technology cannot overcome high entropy.
- Constraint by Design: Entropy Limits in the Gig Economy: Mathematical analysis suggesting that gig economy architecture may make worker advancement impossible regardless of individual effort, potentially demonstrating how structural position determines capability.
- Survival Trends Across Mass Extinctions: The fossil record suggests a pattern: during mass extinction events, specialists died while generalists thrived. This pattern may represent Information Physics playing out at planetary scale.
- The Peasant: A playbook for creating positive-sum outcomes in high-entropy (negative-sum) environments.
- The “Just How It Is” Test: Test Information Physics against traditional frameworks on any stubborn “unchangeable” problem to see which approach may work better from your position.
The Meaning of Humanity
The meaning of humanity is you can choose your relationship to the chaos of the universe—create or destroy, order or disorder—but whatever you decide, you get to do it in your own way, and that’s a beautiful thing. - Kurtis Kemple
Final Thoughts
Information Physics suggests that from our observer-dependent position, we can’t always know who is capable of great things and we’ve built system after system that ensures we’ll never know.
Arriving here feels inevitable and impossible at the same time.