There is a version of culture that gets talked about in boardrooms and hiring decks. It involves perks and values statements, away days and Glassdoor scores. It is the version that gets a budget line and a head of people ops and a Slack channel called #culture. It is also, almost entirely, a distraction.
Real culture does not live in the artefacts. It lives in the layer beneath them — in the unspoken rules that govern what is actually rewarded, what is quietly tolerated, and what is punished by omission. It runs silently, in the background, executing the same logic on every decision, every interaction, every moment of ambiguity your team encounters. It is not a benefit. It is an operating system. And like any operating system, it is shaping the output of your organisation whether you designed it or not.
This essay is about what culture actually is, how it gets written, what happens when it is corrupted, and what it takes to deliberately rewrite it. The argument is not that culture matters — most leaders will concede that much. The argument is that culture is the primary determinant of organisational performance, that most leaders are running an OS they didn't choose, and that the work of building a great organisation begins with understanding exactly what code you are executing.
Part One: What the Operating System Actually Is
Below the Surface
Edgar Schein spent decades studying organisations with a deceptively simple question: why do groups behave the way they do, even when that behaviour is costly, self-defeating, or at odds with what the organisation officially claims to value? His answer, set out in Organizational Culture and Leadership, was that culture operates at three distinct levels — and that most attempts to manage culture are aimed at entirely the wrong one.
The first level is artefacts: the visible, tangible expressions of an organisation's culture. Office layout. Meeting cadence. What goes on the walls. How people dress. These are real signals, but they are easily faked and frequently misread. A company can install a meditation room and still run on fear.
The second level is espoused values: the things the organisation says it stands for. The mission statement. The company principles. The manager who tells you in the onboarding session that 'we really value candour here.' Espoused values are not irrelevant, but they are only as meaningful as the degree to which they match what actually happens when they are tested. When there is a gap between the espoused value and the observed behaviour — and in most organisations there is — the observed behaviour always wins.
The third level is basic underlying assumptions: the beliefs so deeply embedded in the organisation that they are no longer examined, and in many cases no longer conscious. Assumptions about whether people can be trusted, whether disagreement is dangerous, whether the purpose of a meeting is to make decisions or to ratify decisions already made elsewhere, whether failure is data or evidence of personal inadequacy. These assumptions are the operating system. They are the code that runs beneath every policy, every initiative, every town hall. They are what Schein called the essence of culture, and they are almost never discussed — because they are taken for granted as simply the way things are.
The artefacts are the UI. The espoused values are the documentation. The underlying assumptions are the code. Most culture programmes are rewriting the documentation while the code runs unchanged.
The Invisible Conditional
Karl Weick's concept of sensemaking sharpens the picture further. Weick argued that organisations do not simply respond to reality — they actively construct it. When something ambiguous happens in an organisation, people reach for a shared interpretive framework to decide what it means. Is this a problem or an opportunity? Is this my responsibility or someone else's? Is it safe to say what I actually think? The answers to these questions are not determined by the facts of the situation. They are determined by the cultural framework through which those facts are interpreted.
This is precisely what an operating system does. It receives inputs and, before passing them to any application, interprets them: classifies them, routes them, filters some out entirely. Culture operates the same way. The same piece of information — a missed deadline, a dissatisfied client, a product failure — will be processed entirely differently by two organisations running different cultural code. In one, it surfaces immediately, is treated as data, and drives a constructive post-mortem. In the other, it is buried, attributed to an individual, and never examined at the systemic level. Same input. Different OS. Entirely different output.
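To make the metaphor concrete, the dispatch logic can be caricatured in a few lines. This is purely an illustration of the argument above, not a model of any real organisation, and every name in it is invented:

```python
# Illustrative sketch only: two "cultural operating systems" receiving
# the same input and producing entirely different outputs.
# All function and variable names here are invented for the metaphor.

def learning_culture(event: str) -> str:
    # Failure is classified as data: surface it and examine the system.
    return f"post-mortem scheduled: what does '{event}' tell us about the system?"

def punitive_culture(event: str) -> str:
    # Failure is classified as personal inadequacy: bury it, find a culprit.
    return f"'{event}' attributed to an individual; no systemic review"

event = "missed deadline"
print(learning_culture(event))   # same input...
print(punitive_culture(event))   # ...different OS, different output
```

The point of the sketch is that the branching happens before anyone consciously 'decides' anything: the classification of the event is itself the cultural act.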
Daniel Kahneman's work on fast and slow thinking offers a complementary lens. Most workplace behaviour is what Kahneman calls System 1: fast, automatic, pattern-matched, operating below the level of deliberate thought. People do not, in general, consciously decide how to behave in a meeting, whether to speak up about a problem, or how much effort to put into work that no one will notice. They act on trained intuitions — intuitions shaped, over hundreds of interactions, by the signals the culture has sent about what happens to people who do certain things. Culture, in this sense, is the training data for System 1. It is the accumulated pattern-matching that tells your team, without a word being spoken, what the rules actually are.
Part Two: How the Code Gets Written
Action, Not Aspiration
Ben Horowitz, in What You Do Is Who You Are, makes an argument that should make every leader uncomfortable: culture is not what you say, it is what you do — and more specifically, what you do when it is costly to do the right thing. In one of his examples, the samurai code was not written in manuals; it was enacted in ten thousand daily behaviours that accumulated, over generations, into something that required no enforcement because it was simply who you were. I, meanwhile, point to what I was taught as a military officer: 'the standard you walk past is the standard you accept.'
The founder equivalent is less dramatic but operates on the same mechanism. The leader who says they value candour but reacts with visible displeasure when challenged in a meeting has written a line of code: dissent carries a cost. The leader who says they value work-life balance but sends emails at midnight and expects replies has written another: availability is what is actually rewarded. The leader who says they are committed to learning from failure but, when a project fails, focuses the post-mortem on identifying who was responsible, has written a third: failure is dangerous. From then on, the rational response from the team is nearly always concealment.
Robert Cialdini's research on social proof adds the propagation mechanism. In conditions of ambiguity — which describes most of working life — people look to the behaviour of others, especially those with status and credibility, to determine what is appropriate. The leader's behaviour is the most powerful cultural signal, not because it is mandated, but because it is observed, interpreted, and replicated. Culture spreads laterally through imitation before it is ever articulated as policy. By the time someone thinks to write a values statement, the culture has already been running for months.
Origin Stories and Critical Incidents
Schein identified two mechanisms through which cultural assumptions become self-reinforcing over time: the way organisations handle critical incidents, and the stories they tell about their own past. Both deserve attention from any leader trying to understand — or redesign — the OS they are running.
Critical incidents are the moments when the culture is tested: when a values statement meets a real decision with real consequences. How does the organisation respond when a top performer behaves badly? When a client asks for something ethically questionable? When speaking truth to a senior leader carries personal risk? These moments are disproportionately formative. They are watched carefully by everyone in the organisation, and the signals they send about what the culture actually prioritises — over its stated values — are retained and transmitted far more durably than any all-hands presentation.
The stories organisations tell about themselves serve a similar function. The anecdotes shared in onboarding, repeated at team dinners, cited in all-hands meetings — these are not incidental colour. They are cultural instruction. They communicate, through narrative rather than decree, what kinds of people succeed here, what kinds of behaviour are celebrated, and what happens to those who cross important lines. A culture that tells stories of scrappy resourcefulness and individual heroism is writing a different OS to one that tells stories of collective problem-solving and disciplined process. Neither is inherently superior. Both are actively shaping how people interpret their role.
Part Three: What Happens When the OS Is Corrupted
The Propagation of Permission
Robert Sutton's research on workplace incivility — developed in The No Asshole Rule — documents a phenomenon that every experienced leader will recognise: the degree to which a single tolerated behaviour can corrupt a cultural environment far beyond its immediate source. Sutton's focus was on the specific damage caused by people who behave with consistent cruelty or contempt toward their colleagues. But the mechanism he identified is general. When a behaviour — any behaviour — is observed without consequence, it is interpreted by everyone watching as a signal that the behaviour is permitted. Permission, once extended, spreads.
This is the corrupted file problem. An operating system does not evaluate the intent behind each instruction — it executes the code it has been given. A culture does not evaluate whether a given behaviour represents the organisation's true values — it registers what is tolerated and updates its norms accordingly. The senior leader who interrupts junior colleagues sends a signal. The manager who takes credit for their team's work sends a signal. The executive who is visibly impatient with questions sends a signal. Each signal, individually, may seem trivial. Cumulatively, they write an OS that tells people, with great precision, what kind of place this is and what kind of behaviour it rewards.
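The cumulative effect described above can be sketched as a toy model: each incident observed without consequence nudges the perceived permission upward, and each visible sanction pushes it back. The model and its numbers are entirely illustrative, invented for this essay, and carry no empirical weight:

```python
# Toy model of the permission-propagation mechanism: tolerated incidents
# accumulate; sanctions reassert the norm. Purely illustrative numbers.

def update_norm(perceived_permission: float, tolerated: bool) -> float:
    if tolerated:
        # Behaviour observed without consequence: permission spreads.
        return min(1.0, perceived_permission + 0.1)
    # Behaviour visibly sanctioned: the norm is partially reasserted.
    return max(0.0, perceived_permission - 0.3)

norm = 0.0
for incident_tolerated in [True, True, True, True, False]:
    norm = update_norm(norm, incident_tolerated)

# Four tolerated incidents outweigh one sanction: 0.4 - 0.3 = 0.1
print(round(norm, 1))  # prints 0.1
```

Even in this caricature, the asymmetry is the point: permission accrues quietly and incrementally, while reasserting a norm requires a visible, costly act — which is why 'trivial' tolerated signals compound the way Sutton describes.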
The Failure Filter
Amy Edmondson's research on psychological safety provides the most consequential illustration of what a corrupted OS costs in practice. Across hospitals, manufacturing plants, financial services firms, and technology companies, Edmondson found that the single strongest predictor of a team's ability to perform and improve was psychological safety: the shared belief that the team environment is safe for interpersonal risk-taking — for speaking up, asking questions, flagging errors, offering dissenting views without fear of humiliation or punishment.
The critical insight from Edmondson's work is that psychological safety is not a personality trait distributed unevenly across individuals. Psychological safety is a cultural norm, produced by leadership behaviour and team dynamics, that varies dramatically across teams even within the same organisation. And its absence is not merely uncomfortable — it is operationally catastrophic. A team that does not feel safe raising concerns will not raise them. The errors will still be made; the near-misses will still happen; the wrong assumptions will still drive the product roadmap. They simply will not surface until they become too large to ignore. An organisation with a punitive culture around failure has, in systems terms, disabled its own error-correction mechanism. It is running an OS that filters out the feedback it most needs.
Google's Project Aristotle, the multi-year study of team effectiveness that examined 180 teams across the company, found that psychological safety was by far the most important dynamic distinguishing high-performing teams from low-performing ones. Not skills. Not composition. Not individual talent. The cultural norm governing whether it was safe to be honest. James Heskett's quantitative work on culture and performance provides the organisational-level equivalent, estimating that culture-related factors account for somewhere between twenty and thirty percent of the differential in performance between otherwise comparable organisations. Culture is not a soft topic. It is among the most consequential strategic variables a leader controls — or fails to.
Part Four: Rewriting the Code
The Gerstner Principle
When Lou Gerstner arrived at IBM in 1993, the company was in the middle of what many analysts believed was a terminal decline. His assessment, delivered with characteristic bluntness in Who Says Elephants Can't Dance?, was that IBM's problems were not primarily strategic or structural. They were cultural. The company had developed an OS characterised by internal politics, bureaucratic deference, and a profound incapacity to respond to external reality when that reality conflicted with internal conviction. Gerstner's observation — that culture is not just one aspect of the game, it is the game — was not a philosophical statement. It was a diagnosis of why every previous attempt at strategic change had failed. The strategy had been updated repeatedly. The code had not.
The approach Gerstner took at IBM illustrates the first principle of cultural redesign: the change has to be modelled at the top, visibly and consistently, before it can be expected anywhere else. He showed up. He engaged directly with customers and with frontline employees in ways that IBM's senior culture had not normalised. He refused to accept that certain conversations were above a certain pay grade. He made it clear, through behaviour rather than announcement, that the norms were changing. This is not soft leadership. It is the precise, deliberate use of the leader's disproportionate cultural leverage.
Debugging Deliberately
Schein's later work, particularly Humble Inquiry, points to a mechanism that is both simple and systematically underused: the quality of questions a leader asks is among the most powerful cultural signals available to them. Not the speeches they give, not the values they post on the wall — the questions. When a leader asks 'whose fault was this?' after a failure, they are writing one kind of code. When they ask 'what did we learn from this, and what do we need to change?', they are writing another. When they ask for opinions before offering their own, they are signalling that other people's thinking is genuinely valued. When they answer their own questions before anyone else can respond, they are signalling the opposite, regardless of what the values statement says.
Horowitz makes a related point about the specific mechanism of cultural change in established organisations. The most common mistake leaders make when trying to change a culture, he argues, is to announce the change rather than enact it. Values statements, culture decks, rebranded principles — these are artefact-level changes. They update the documentation. The code runs unchanged until the underlying behaviour changes, consistently and visibly, in situations where the old code would have produced a different result. This means the change has to show up in decisions, not declarations. It has to show up when it is costly — when the old behaviour would have been easier, more comfortable, or more immediately rewarding.
Culture cannot be announced into existence. It can only be enacted — decision by decision, response by response, in the moments when it would have been easier to revert to the old code.
The Write Cycle
What does deliberate cultural redesign actually require? Drawing across the evidence, three things stand out.
The first is diagnosis before prescription. Schein is insistent on this point, and he is right. Leaders who move directly to culture-change initiatives without first understanding the current OS at the assumptions level — not the artefacts level, not the espoused values level, but the actual underlying beliefs that are governing behaviour — will design interventions that address symptoms. The question is not 'what culture do we want?' but 'what does our current culture actually believe, and how did it come to believe it?' The answer is usually found not in surveys but in stories, in the handling of critical incidents, and in what happens in the organisation when no one senior is watching.
The second is to identify the highest-leverage cultural interventions and pursue them with disproportionate consistency. Not everything can be changed at once, and attempting to change everything at once signals that the change is cosmetic. The organisations that successfully shift their cultures tend to identify two or three specific behavioural norms that, if genuinely changed, would have downstream effects on everything else. In a culture defined by fear of failure, the highest-leverage change might be how the senior team publicly responds to its own mistakes. In a culture defined by internal competition, it might be how collaboration is — or is not — rewarded structurally. The specific intervention matters less than the consistency with which it is held to.
The third is patience calibrated to realism. Schein's estimate is that meaningful cultural change in an established organisation takes somewhere between five and fifteen years. That is not an argument for fatalism — significant shifts in specific norms can happen much faster, particularly in smaller organisations or when the leadership change is decisive and visible. But it is an argument against the culture initiative that runs for a quarter and is then replaced by the next initiative. Cultural assumptions are built through accumulated experience. They are only rebuilt the same way.
Conclusion: You Already Have a Culture
Every organisation has a culture. The question is never whether culture exists — it is whether the one you are running was designed or inherited, examined or assumed, serving your strategy or quietly undermining it.
The ping-pong table is not the problem. The problem is the leader who invests in the artefacts because they are visible and measurable and communicable, while the underlying assumptions layer runs on code written years ago by decisions that were never labelled as cultural ones. The tolerance for a certain kind of behaviour. The story told, again and again, about what happened to someone who took a risk. The question that was never asked in the room. These are the lines of code that are actually governing your organisation's output, and they will continue to do so until someone reads them clearly and decides, deliberately, to write something different.
Culture is not a benefit you offer your team. It is the medium through which your team thinks, decides, and acts. Getting it right is not an HR priority. It is the primary work of leadership — and it never stops.
Key Sources
- Schein, E. H. Organizational Culture and Leadership (4th ed.). Jossey-Bass, 2010
- Schein, E. H. Humble Inquiry: The Gentle Art of Asking Instead of Telling. Berrett-Koehler, 2013
- Deal, T. E. & Kennedy, A. A. Corporate Cultures: The Rites and Rituals of Corporate Life. Addison-Wesley, 1982
- Weick, K. E. Sensemaking in Organizations. SAGE Publications, 1995
- Kahneman, D. Thinking, Fast and Slow. Farrar, Straus and Giroux, 2011
- Horowitz, B. What You Do Is Who You Are: How to Create Your Business Culture. HarperBusiness, 2019
- Cialdini, R. B. Influence: The Psychology of Persuasion. HarperCollins, 1984
- Edmondson, A. C. The Fearless Organization: Creating Psychological Safety in the Workplace for Learning, Innovation, and Growth. Wiley, 2018
- Sutton, R. I. The No Asshole Rule: Building a Civilized Workplace and Surviving One That Isn't. Business Plus, 2007
- Heskett, J. L. The Culture Cycle: How to Shape the Unseen Force That Transforms Performance. FT Press, 2011
- Kotter, J. P. & Heskett, J. L. Corporate Culture and Performance. Free Press, 1992
- Gerstner, L. V. Who Says Elephants Can't Dance? Inside IBM's Historic Turnaround. HarperBusiness, 2002
- Google re:Work. Project Aristotle: Understanding Team Effectiveness. 2016