The argument about whether practitioners or academics make better teachers is the wrong argument. It assumes the two groups are competing for the same job — that a practitioner could replace an academic in the same role and either do it better or worse. The actual situation is that practitioners and academics are equipped to teach fundamentally different things. When a practitioner is deployed to do what an academic does well, they fail in predictable ways. When an academic is deployed to do what a practitioner does well, they fail in equally predictable ways. The question worth asking is not which is better, but what each can teach, and how to design learning experiences that use each for what they are actually equipped for.
I teach governance and systems design at the PCU Graduate School alongside running a portfolio of ventures that include a cooperative agricultural platform and a property technology company. The combination is useful precisely because it creates friction. The theory I teach is tested against the systems I build. The patterns I see in the field surface questions the academic literature has not addressed clearly. Neither role is complete without the other — and recognizing the difference between what each produces has made me a better teacher and a more careful builder.
What Practitioners Teach Well
A practitioner's primary contribution to learning is the transmission of heuristics — the judgment rules that experienced people use to make decisions quickly in conditions where full analysis is not possible. These are not the same as principles, which are abstract and general. Heuristics are specific, contextualized, and often carry implicit caveats that the practitioner has developed through repeated exposure to failure.
"Scope is always larger than you estimate, so add 30% to your initial assessment" is a principle. "In Philippine SME technology projects, scope tends to expand most at the integration layer — especially when legacy accounting systems are involved — so add 40% to integration estimates specifically" is a heuristic. The difference is resolution. The heuristic is more useful in context and more likely to produce a correct decision, but it is also more fragile outside its context.
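The difference in resolution can be sketched as two rules of different specificity. A minimal illustration in Python, using the figures from the text; the function names and the `layer` parameter are invented for the example:

```python
def principle_buffer(estimate: float) -> float:
    """The general principle: pad any scope estimate by 30%."""
    return estimate * 1.30


def heuristic_buffer(estimate: float, layer: str) -> float:
    """The contextual heuristic: pad integration estimates by 40%,
    everything else by the general 30%. Sharper in its context,
    more fragile outside it."""
    return estimate * (1.40 if layer == "integration" else 1.30)
```

The heuristic encodes an extra conditional that the principle lacks, and that conditional is exactly what makes it both more useful in context and more likely to mislead when the context shifts.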
Practitioners also teach pattern recognition — the ability to look at a situation and quickly identify which class of problem it belongs to. This is not the same as theoretical categorization, which groups problems by underlying structure. Pattern recognition groups problems by what the experienced response should be. A practitioner who has seen forty failed technology rollouts in organizations with similar characteristics does not need to reason from first principles about why the rollout is failing. They recognize the pattern and move directly to the relevant interventions.
What practitioners teach through case narration that no textbook can replicate is failure mode exposure. The cases that appear in academic curricula are almost always cases of success — or cases of failure where the lesson has been so thoroughly processed that it feels neat and legible. Practitioners can describe failures in their full messiness: the moment when the team knew something was wrong but did not escalate, the political constraint that prevented the obvious fix, the decision that looked right at the time given the information available. This unprocessed failure narration is where learners develop the capacity to recognize early warning signals — and it is genuinely irreplaceable.
Practitioners also teach judgment about tradeoffs that cannot be resolved through analysis alone. In my governance teaching, the recurring lesson is that most hard problems in organizational design have two or three reasonable solutions that trade off different values — and choosing between them requires making a value judgment, not finding the right answer. Practitioners teach this not by lecturing about it but by narrating the actual choices they faced, why they chose what they chose, and what they would do differently. That is a fundamentally different kind of learning than theory can provide.
What Academics Teach Well
The practitioner's strength — heuristic and pattern-based knowledge — is also the practitioner's primary limitation. Heuristics are efficient but brittle. They work until the context changes in ways that make the heuristic misleading. Pattern recognition is powerful until the pattern is wrong — until the situation looks familiar but is actually different in ways that matter. Without the underlying theory, a practitioner cannot tell when their heuristic is failing them.
Academics teach the underlying mechanisms. Not "add 30% to scope estimates" but "scope estimates exhibit systematic underestimation because of planning fallacy, reference class neglect, and anchoring on best-case scenarios — here is the research on each, here is what happens at each stage of a project when these effects compound, and here is how you can calibrate your estimates to correct for them." That is a different kind of knowledge — slower to apply, requiring more cognitive work in the moment, but more robust across contexts.
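One standard way to operationalize that calibration is reference class forecasting: scale a raw estimate by the overrun observed in comparable past projects rather than trusting the anchored inside view. A minimal sketch; the ratios below are invented for illustration, and in practice they would come from an organization's own completed projects:

```python
import statistics

# actual cost / estimated cost for comparable completed projects (illustrative)
past_overrun_ratios = [1.4, 1.1, 1.8, 1.3, 1.6]


def calibrated_estimate(raw_estimate: float, ratios: list[float]) -> float:
    """Correct a raw estimate using the median historical overrun
    of a reference class, rather than the best-case anchor."""
    return raw_estimate * statistics.median(ratios)
```

Using the median of an external reference class is deliberately robust to the practitioner's own small, non-random sample of memorable projects, which is the point of the academic framing.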
Academics also teach research methods — how to generate knowledge rather than just consume it. A practitioner learns from their own failures and the failures they observe. That sample is necessarily small, non-random, and shaped by the contexts they happen to work in. Research methods teach how to construct knowledge systematically: how to ask a question that can actually be answered, how to gather evidence that is representative, how to interpret findings that contradict your prior beliefs. Organizations that have people who can think this way are better at learning from their own experience — because they can distinguish between a lesson that is genuine and a lesson that is an artifact of a limited sample.
Academics teach edge cases. Practitioners' knowledge is shaped by the common case — the situations they encounter frequently enough to have developed good heuristics. But edge cases matter. The unusual situation is often where the most expensive failures happen, precisely because practitioners do not recognize it as belonging to a familiar category. Academic teaching that develops the underlying theory rather than the heuristic prepares learners to reason about edge cases because they understand the mechanism, not just the pattern.
Finally, academics provide a vocabulary that allows learners to communicate clearly about complex concepts with people they have never met. When I teach governance theory, I am teaching a shared vocabulary that allows two practitioners from different organizations to describe their problems to each other and be understood. Without that vocabulary, organizations reinvent wheels in isolation, fail to benefit from others' experience, and struggle to hire people with relevant expertise because the expertise cannot be described precisely.
Failure Modes of Each Type in the Wrong Context
Deploying a practitioner in an academic role produces characteristic failures. The most common is the practitioner who can describe solutions but not explain why those solutions work. Students in a practitioner-taught course often leave knowing what to do in situations that closely resemble the ones the practitioner described — and being helpless in situations that are somewhat different. The heuristic was transmitted without the underlying mechanism, so there is no basis for adapting when the context changes.
A second failure mode is survivorship bias in the cases narrated. Practitioners tend to narrate the projects that succeeded — the ones where the pattern held and the solution worked. When failures are described, the telling is often subtly sanitized, because the practitioner's relationship to their own failures is more protective than their relationship to their successes. Academic case analysis subjects cases to a kind of scrutiny that a practitioner's self-narration rarely achieves.
Deploying an academic in a practitioner role produces different failures. The most common is the instructor who can explain why something fails without being able to show what to do instead. Students learn to diagnose problems with increasing sophistication without developing the corresponding capacity to act. This produces a particular kind of graduate who is excellent at identifying what is wrong with other people's work and much less effective at producing their own.
A second failure mode is over-reliance on clean models. Academic models are designed to isolate variables — to hold other things equal and examine the relationship between specific factors. Real organizational contexts do not hold other things equal. The academic trained primarily on clean models may struggle to make decisions in the presence of the confounding variables that define actual practice. The messiness that a practitioner navigates intuitively is genuinely disorienting for people trained primarily in controlled conditions.
Combining Both in a Curriculum Without Producing Paralysis
The goal of curriculum design is not to have practitioner content and academic content sitting next to each other in a course schedule. It is to create an integrated learning experience where each type of knowledge activates and deepens the other.
The pattern that works is: theory first to provide the mechanism, practitioner narration to show the mechanism operating in real conditions, structured case analysis to develop the judgment about when the mechanism applies, and applied exercises where learners produce their own analysis of novel situations and receive feedback on their reasoning.
This sequence is important. Introducing practitioner cases before the theoretical foundation is established tends to produce learners who understand the specific case but cannot generalize. They remember the story but not the principle. Introducing theory without practitioner case narration tends to produce learners who understand the principle but have no intuition about how it manifests — they can define planning fallacy but cannot recognize it in a project meeting.
The applied exercises at the end are where judgment is actually built, and they require genuine ambiguity. Cases with clear answers teach content. Cases with genuinely contestable answers teach judgment. The distinction is in whether reasonable, well-prepared people examining the same evidence would reach different conclusions. If they would — if the right answer genuinely depends on a value judgment or a context-specific assessment — the case is building judgment. If any well-prepared person would reach the same conclusion, the case is testing content retention.
At the graduate level, where students often have significant professional experience, the combination works differently. Practitioners in a classroom bring their own pattern libraries — which means the academic content lands differently when it connects to a pattern they already have. Teaching planning fallacy to a group of experienced project managers who have been burned by scope underestimation is a different experience from teaching it to undergraduates who have never managed a project. The academic content gives them a vocabulary and mechanism for something they have already experienced. The practitioner narration confirms the theory from a different angle. The result is deeper retention and broader applicability than either type of content would produce alone.
Lessons From Teaching Governance and Systems Design at Graduate Level
In the governance and systems design courses I teach, the most consistent finding is that experienced practitioners learn differently from students entering the domain without experience — and the curriculum has to accommodate both without patronizing either.
The practitioners learn fastest when the academic content resolves a confusion they have been carrying. When planning fallacy explains why their scope estimates are always wrong, they do not learn planning fallacy as an abstract concept — they learn it as an explanation for something they have been experiencing without a clear frame. The academic content accelerates their learning because it gives structure to already-accumulated experience.
Students without practitioner experience learn fastest when the academic content is immediately followed by applied exercises that produce something — a plan, an analysis, a governance structure — that receives genuine feedback. Without the practitioner's accumulated experience to activate, the theory remains abstract until it is used. The exercise creates a synthetic version of the practitioner's experience: a specific, consequential situation where the theory has to be applied and the result is evaluated.
The classroom dynamic becomes interesting when practitioners and students without experience are in the same room. The practitioners are the most valuable resources in the room for a certain kind of learning — the practitioner narration, the heuristic transmission, the failure mode exposure. But they can also dominate the conversation in ways that crowd out the academic content that would actually strengthen their practice. Managing that dynamic is itself a practitioner skill: knowing when to draw out the experienced voices and when to redirect to the mechanism that makes their experience legible to everyone.
The synthesis that works is not balance for its own sake. It is clarity about what each type of knowledge can do and deliberate design of the learning sequence to deploy each at the right moment. Practitioners in teaching roles should be explicit about what they are there to do: "I am here to show you how this mechanism operates in real conditions, not to replace the theory." Academics in teaching roles should be explicit about the limits of their models: "This framework holds in controlled conditions — the practitioner's experience tells you how to adapt it when those conditions don't hold." That mutual acknowledgment of what each can and cannot do is what allows a curriculum to actually build the judgment that organizations need.
