I have been a college teacher for some of the happiest years of my life. When I tell people what I do for a living, what I really do, I say I teach people to think for themselves. It’s still a wonderful way to make a living, but over time I have begun wondering whether I have been fooling myself. I could just be teaching them to think like me, or how to package the conventional wisdoms that they have scraped off the Internet. I find myself wondering, therefore, what it really means to think for yourself, what it means for teachers, for students, and for a society that tells itself it is free.

Thinking for yourself has never been easy, but the question of whether it is still possible at all is of some moment. The key ideals of liberal democracy — moral independence and intellectual autonomy — depend on it, and my students will not have much experience of either if they end up living in a culture where all of their political and cultural opinions must express tribal allegiance to one of two partisan alternatives; where they live in communities so segregated by education, class, and race that they never encounter a challenge to their tribe’s received ideas; or in a society where the wells of information are so polluted that pretty well everything they read is “fake news.” Thinking for yourself need not require every thought to be yours and yours alone. Originality is not the goal, but the autonomous and authentic choice of your deepest convictions certainly is, and you are unlikely to make authentic choices of belief unless you can learn to wrestle free from the call of the tribe and the peddlers of disinformation and reach that moment of stillness when you can actually ascertain what you think.

The contradiction that teachers face in their classrooms — are we teaching them how to think or how to think like us? — plays out across our whole society. From grade school through graduate school, in our corporate training centers, in our government departments, our enormous educational apparatus is engaged in training us in the ways of thinking appropriate to academic disciplines, corporate cultures, and administrative systems, but with the ultimate objective, so it proclaims, of independent thought.

This contradiction — the simultaneous pursuit of conformity and independence — is also at the core of our anxiety about innovation. Our societies say that they prize “thinking outside the box,” whatever the box may be. Our political authorities tell us that the solution to every problem we face — secular stagnation, climate change, geostrategic chaos — depends on “innovation,” which in turn depends, finally, on somebody somewhere thinking beyond the clichés and conventions that inundate us daily and keep us locked in a state of busy mental stagnation. Our culture tells us we either innovate or we die, but economists such as Robert Gordon point out that the capitalist economy of the twenty-first century has nothing on the horizon to rival the stochastic lurch of innovation in the Edison and Ford eras at the beginning of the twentieth. The energy transition, when it finally comes, may yet unlock another such lurch, but so far progress is slow. Despite these warning signs of stagnation — even decadence, as some conservatives argue — our culture clings to the signs of innovation we can identify, even to minor upgrades to the software on our phones, because such hope as we have for our future depends on faith that the onward and upward path of new ideas continues.
But this hope soon collides with the unprecedented pressures for conformity generated by those same institutions of innovation. The Pharaonic size of the corporations that control the digital economy raises not just a problem of monopoly or distributive justice, but also an epistemological quandary: are Google, Microsoft, and Facebook, these all-devouring machines, actually places where people can think for themselves? I refer not only to their customers but also to their employees. Is it likely that their well-compensated and disciplined workers can think against the grain of the corporate logic they serve every day? If not, we may all be on the long way down to secular decline. An institution of innovation may be a contradiction in terms.

The same question haunts the universities in which I have spent some of my life teaching. Every one of these institutions is devoted to what it calls “academic freedom,” but how much freedom do these institutions, their disciplines, their promotion and recruitment processes, their incentives, their intellectual fashions, actually make possible for the individuals inside them? Are they hives of orthodoxy or heterodoxy? Does the competitive race to publish, to earn grants, to make an impression on the public create a genuine climate of intellectual liberty, or does it just enclose everyone, willingly or not, in an organized, pious, censorious, progressive conformity? That is not a trick question.

The academic fields in which I work — history and international relations — demand new thinking because the world that they study, and for which they train aspiring practitioners of diplomacy and finance and politics, is changing fast, yet it is unclear that any of us have the measure of the moment. To use the metaphor that everybody is using, we are observing a shift in the tectonic plates, a fracturing of what used to be called “the liberal world order.” Not a day passes without someone offering a new grand narrative to replace the stories we told ourselves to make sense of the end of the Cold War. But each new claimant fades away, defeated by the complexity of the task; and in some cases, it may be the affirmation of the old wisdom that is a sign of intellectual independence. We are living in a dark cloud of crises — climate change, war, global inequality, economic turbulence, social intolerance — and nothing matters more than that