When Will We Start Taking Business Seriously?

Slouching Toward a Theory of Post-Businessism

Christopher Locke – Monday, April 16, 2001 – clocke@panix.com

Business is beleaguered today from all sides. The corporation is not only expected to financially reward the faith placed in it by investors, it must also value diversity among its employees, be sensitive to environmental concerns, and play a positive role in the development of the community, whether municipality or nation. In short, business is expected to behave in a socially responsible manner. Which expectation, as we know, is often honored in the breach. Some business leaders argue that this is no surprise, given that social responsibility is only an expectation, not an intrinsic part of the corporate charter, which is: to maximize shareholder profit. But such argumentative retorts tend to be few, or are delivered only sotto voce. “The public be damned” has not been a popular stance since the days of J.P. Morgan. Not that it was all that popular then. Instead, we get press releases full of earnest genuflections to diversity, environment, community. Anything else would be tantamount to dissing Mom’s apple pie.

Also no surprise to anyone, the authenticity of intent behind such statements is generally taken to be problematic, at best. Both speaker and audience understand that these noises are required, but usually mean little. They are rituals performed in honor of fondly held illusions – holy rites in the widespread religion of businessism. This is a useful, if purposely ugly, term for the panoply of tacit beliefs and attitudes that have accreted around the actual doing of business. The single most important quality of businessism is that its very existence is invisible, not only to its practitioners, but also to its critics. How is intelligent criticism even possible under such circumstances? Good question.

Language often continues to carry the baggage from journeys long since forgotten. When we find ourselves perplexed by current events – “how did we ever come to such a pass?” – unpacking those bags is often instructive. Take the word “culture,” for instance, surely as hotly contested a term as you’ll find anywhere these latter days. Forget what it means; look at how it’s used. We continue to speak of certain persons as cultured or highly cultivated, without consciously realizing that cultivation and culture are rooted in another, more earthy sense of culture: agriculture. In the 21st century, the connection seems odd, but it wouldn’t have seemed so to Thomas Jefferson, a gentleman farmer of the old school. The old school being the British House of Lords with its Dukes and Barons and God only knows what else. Becoming a member of that august body required being born into a patrician family of the landed gentry. Land was everything, and from it all wealth derived. Business was basically a pastoral affair, conducted at a stately and hereditary pace, over a decent sherry, quite possibly, at the club. The first Old Boys were basically farmers. So: agriculture, culture, cult. Another word for cult is class.

Hold that thought as we fast-forward to a party in progress on New York’s Upper West Side. It’s a classy party, all would agree, though you’d get less agreement and more disapproving looks if you began asking around whether it was upper-classy or lower-classy. In America, we have prohibited such concepts. Not wanting to rock the boat – you’re far too nice a person – you resist this urge and instead ask the people you meet what they do. “Oh, I’m an art critic,” says one, offhand, fully expecting you to be impressed. And, being a nice person, you are. Then in rapid succession, you meet a literary critic, a music critic, a media critic and a culture critic – the last one laughing, “Cult crit, you know?” and you laugh along, agreeing ha-ha that’s funny, but wondering precisely why. Then you meet me. “I’m a business critic,” I say. And you say: “Excuse me? A what?”

Do a search on AltaVista or Google for “business critic” and you will find, as I did, that nearly every hit refers to someone who thinks business sucks. This is unusual if you look at other areas of what we call criticism. An art critic may deeply dislike a particular artist, but few are likely to suggest that art itself, well… bites. Similarly for music and literature, media and culture. Anthropologists may argue about what “culture” means, but to my knowledge, no one is saying that, whatever it is, it has to go. On the other hand, “business critics” tend to be saying exactly that: “we’re mad as hell and we’re not gonna take it anymore.”[1]

The fact is – and it’s a very large fact if you see it in the right light – there is no such animal as business criticism. It’s a hole in the world you could drive a truck through. What we do have, of course, is data reporting on the financial markets, economic treatises, and business journalism of the who-what-when-where-why variety. But all of this is what the Swiss linguist Ferdinand de Saussure would have called synchronic – a cut through current practice with the rearview mirror removed. In an article that appeared just yesterday in The Industry Standard, the authors write “Today's MBA students don't want to pay for ancient history lessons.”[2] Naturally, the B-schools have responded to this market pressure – thus keeping the folly of businessism safely under wraps.

But it is folly, and – especially since the advent of the Internet – trying to keep it hidden is anything but safe. The tacit assumptions of businessism cry out for deconstruction, not to show how ethically wrong their results are – such arguments having the persuasive force of asking a dog politely please not to do it on the lawn – but rather how mismatched those results are with the expectations of networked markets. However, we risk getting ahead of ourselves here, thanks to a built-in chicken-and-egg impasse. To understand what businessism consists of, we need a viable business criticism, which, uh, er, um… unfortunately does not exist. The first question to ask, then, is why.

But before we can examine why there is no such animal, we need some idea of what the animal might look like if there were one. Art criticism, music criticism, literary criticism and so on, all situate their subjects within an historical context. Before Surrealism there was Dada. Before Elvis there was Mozart. Before Dave Barry there was Herman Melville. That sort of thing. And before the Internet there was… last week. Business journalism doesn’t take history into account. Maybe it goes back a few years, or less often, a few decades, but for the most part, it’s a synchronic cut through current events.

A form of business criticism does exist, but only at the margins of the mainstream with which business concerns itself. Such critics are called intellectual historians, sociologists, philosophers – or often simply: eggheads. Lacking the active engagement of bona fide business practitioners, this is criticism in a vacuum. However, the reasons that mainstream and margin are delimited as they are – and thus, where the coin of attention gets spent – are intrinsic to our larger story, so we’ll pass over this category for the present. We’ll return to it, though, with some thoughts on such formal categorization. And on how the Internet is radically shifting time-honored conceptual boundaries.

There is also a historically rooted form of critique that is applied to business, but it tends to be deeply ideological. This isn’t to criticize these critics. Ideology is real and legitimate. It just doesn’t constitute or encourage dialogue. Art criticism presumably joins practitioners and audience in a conversation. Shouting “mount the barricades” does not. And whether said barricades are mounted from the left or the right makes little difference. To be of any real use, a critical community must join a larger community in exploring matters of mutual concern. It can’t be an exclusive club – like say, our New York literati at their classy party. No matter how noble their initial intent, critics tend to form themselves into elites. If there were such a thing as human nature, one might be tempted to use this fact to call attention to its perversity.

The reference to elites is not gratuitous. Think about art, music, literature. These have histories dating back thousands of years. You could say the same about business, but only if you were talking about very general notions of trade and commerce. What we call business today is barely 150 years old. And when it emerged in the middle of the 19th century, it had no pedigree whatsoever. Why is there no business criticism today? Simple: because we swept business under the rug. Because, despite its unquestionable impact on the world we live in, we don’t take business seriously.

Notions of filthy lucre go back a long way. As do Biblical warnings about how hard it is for rich men to get into heaven. But we needn’t go back that far to find the broom and dustpan. The period after Reconstruction was a fertile time for American business. It saw the establishment of the form of legal incorporation we recognize today, and – not unrelated – the rise of our lovingly labeled Robber Barons: Andrew Carnegie, John D. Rockefeller, Cornelius Vanderbilt, Jay Gould. The list could go on, but you get the drift. Along with their rise, a new expression came into the English language: nouveau riche. It wasn’t a term of endearment. The new rich had more money than Croesus, but they didn’t have culture. They lacked, shall we say, a certain cultivation. Commodore Vanderbilt, who never finished high school, was known to take 12 lumps of sugar in his tea. How gauche!

Having completed high school myself, though just barely, I can relate to the ostracism this first crop of modern business leaders must have been subjected to. And it wasn’t just because they were unschooled. It was because they were unlanded. The Gilded Age of the robber barons represents the head-on collision of a European agrarian past with a thoroughly American industrial future. On first contact, America was a pastoral, primordial wilderness, an Edenic echo of the Promised Land to settlers just arrived from Europe. Though this nostalgia still lingers, and is often invoked today, it is hard for anyone now living in the First World to imagine the spiritual impact of that first flash of paradise regained. Then along come these uncouth parvenu yahoos, ripping it up! Laying rails, pumping oil, smelting iron, making money hand over fist. My God, what was wrong with these people?

And they weren’t just rejected by the landed patricians of the upper class – for having no class – they were also rejected by the landed lower class: the farmers. The populist movement of the late 19th century was largely driven by agrarian interests calling for the blood of the new fat-cat industrialists. The history of antitrust agitation begins with farmers pushed off their land. Caught in a pincer action from above and below, the robber barons were not just ostracized, they were evicted from society.

Core to Freud’s theory of psychoanalysis is the concept of repression. Here’s how it works for individuals. The mind, faced with incommensurable opposites, stuffs the experience so far down into the unconscious that the juxtaposition seems never to have taken place. So it seems, but there are traces. Freud called them symptoms, and they usually aren’t the sort of thing you want to display in polite company. Repressed psychological contents become autonomous, untouchable by the usual vectors of socialization and civilization. From society’s perspective, they become demonic – like feral children operating only on instinct. The primary instinct, Freud thought, was the pleasure principle. If it feels good, do it. For example, maximize shareholder profit. And society – the public – be damned.

Business was not only rejected by society, it made the rejection mutual. Not happy about the reception it had received, business gathered up its considerable marbles and went home. But the sting went deep and business began to develop a long-term strategy for eventual revenge. While this was (probably) not hatched as some backroom plot, it won wide acceptance across the new class of nouveau riche. The executive summary was a one-liner: We’ll show you! Unlike the powerful titans of industry portrayed in those old civics classes we all tried to ditch, the first capitalists of the modern era were embittered, embattled, defensive and paranoid. All with good reason.

If you have buckets of cash but find yourself lacking in status, what do you do? You buy some, of course. But first you have to determine what’s hot. What was hot in the 19th century was science, which had grown by leaps and bounds since the European Enlightenment had overturned ecclesiastical authority and made Man (with a capital M, no women need apply) the center of a suddenly knowable universe. It was more complex than that, of course, but hey, this is HBR not Modern Language Journal. Long story short, thanks to dudes like Voltaire and Kant and Locke (the other one), ideas like reason and rationality became a big deal. No longer were the Laws of Nature hidden deep in the Mind of God. They could be sussed out and expressed as abstract principles – which in those days were called principia, the Latin adding to the overall impression. Isaac Newton came up with a set of principia that still has people impressed. Wow. It was real science! Newton pretty much nailed the physical laws – at least as far as physics went at that point. He didn’t know about quarks, of course, but he got equal and opposite reactions right. We all remember this one: when getting out of a rowboat, better hang onto the pier.

And the cool thing about science was that, when you applied these abstract principles, you could do nifty stuff: survey rail lines, pump oil, smelt ore, charge interest. While it’s unlikely that the robber barons spent much time reading the works of the Enlightenment’s philosophes, they did happen to live inside one of its major results: the United States. The American Revolution and U.S. Constitution were directly based on Enlightenment ideas, and on one in particular that was thought up by the philosopher business does read: Adam Smith. His big idea was an open-ended future, not cyclical, not predestined. When you applied rational thought to the world, he said, you got knowledge, and when you applied that scientific knowledge of abstract principles back to the world, you got progress.[3] Ta-da! As in General Electric’s slogan in years then still to come: “Progress is our most important product.”

This was hot stuff to be sure, and like business, eminently practical. Not like that high-hat liberal education hogwash favored by the old agrarian elite. Here’s Andrew Carnegie on the subject…

“While the college student has been learning a little about the barbarous and petty squabbles of a far-distant past, or trying to master languages which are dead, such knowledge as seems adapted for life upon another planet than this as far as business affairs are concerned, the future captain of industry is hotly engaged in the school of experience, obtaining the very knowledge required for his future triumphs… College education as it exists is fatal to success in that domain.”[4]

As you probably already guessed, I did not find this quote on Carnegie Mellon University’s web site. Clearly, this sentiment was expressed before Andrew Carnegie and Cornelius Vanderbilt and Leland Stanford figured out that if you can’t lick ‘em, join ‘em – start your own B-school! And, as the Rockefeller and Ford Foundations later proved: you can collect the whole set! Beginning with Wharton in 1881, business schools grew and prospered through the largesse of wealthy industrialists who wanted to pass along their hard-won knowledge to future generations – and by the way, prove they were every bit as good as any highfalutin lazy ne’er-do-well patrician landowner with a degree from Oxford. The current allure of the MBA, once a humble technical certificate, has proven their point. But at incalculable cost.

That may sound like a setup for my patented brand of business bashing, but it’s not. At the turn of the last century, business was only following in the footsteps of the “high culture” that had ostracized it. The power of science lay in abstraction. The trick was not to describe everything in the world, but to ignore most of it. If you can’t see the forest for the trees, then hell, clear cut! As this applied to things like the motions of the planets and what happens if you get out of a rowboat without hanging onto the pier, the approach entailed hypothesis, observation and repeatable experiment to establish facts, then employed mathematics and logic to arrive at first principles. Something like that. And what about anomalies that didn’t prove what you set out to prove? You swept them under the rug. This approach was named the Scientific Method.[5]

Because it worked so well, physics became all the rage, and every other discipline came down with a severe case of physics envy. Science was hot, and mathematically expressed abstractions were the way to show you were in on the new wave – sort of like having a dot.com address a few years ago. Any area of study that had to do with human beings was folded into an overarching category called social science. However, the social sciences were considered “soft,” as normal people were still able to read these sorts of books. This situation was remedied by the science of economics, which later evolved into econometrics. Today, only a handful of policy wonks inside the Washington beltway even pretend to understand this stuff. It is, however – and this is the important bit – a science. That’s progress for you.

This tack clearly trumped the un-quantifiable blather of economists like Thorstein Veblen, who wrote about “conspicuous consumption” and railed against the degradation of university education at the hands of business administrators, or Max Weber, who warned of the “iron cage” inevitably created by corporate bureaucracy. For these and other sins of softness, they have since been reclassified as sociologists.

Far better was the method of Frederick W. Taylor, whose principles of “scientific management” gave business something to measure: workers. Taylor had obviously done his branding homework: not only do we get principles, we get scientific principles. The fact that there wasn’t anything remotely scientific about it didn’t matter. There were clipboards, there were stopwatches, there were graphs and charts! What more could a manager ask for? Or a business school? Taylor’s time and motion studies reduced complex operations to just two abstract dimensions, which were, for the record, time and motion (duh!). To fully leverage the power such abstraction provided, it was, of course, necessary to ignore certain potential sources of noise, such as levels of skill, and employees’ names.

While it was noticed early on that other factors might be involved in productivity – these came to be called “human factors” in honor of everything the newly re-branded “industrial engineering” had decided to ignore – this insight was itself successfully ignored for many decades until the panic attack brought on by global competition. Since then, as you know, everybody’s been empowered and we’re finally all living in a workers’ paradise.

But let’s recap. Science introduced a new level of mathematical abstraction, and this kind of abstraction was powerful for business because it supported equations, formulae from which it was possible to construct standard procedures. For instance, to maximize profit, take current inventory minus cost of sales, multiply by annualized revenues, count the legs, divide by four, and Bob’s your uncle. I don’t know. I never went to B-school. All the intractable uncountable stuff about workers and customers – the human factors – get factored out. Business becomes a paint-by-numbers puzzle-solving exercise, operations experts and bean-counters come into the corporate ascendant, and a mountain of stuff gets mass produced and mass marketed. Science, it’s a bloody miracle! The history given here is a bit informal, but basically that’s what happened. This form of applied scientific abstraction worked. And it worked like a charm – as long as you could keep the system closed – hermetically sealed from certain human factors, sometimes referred to as “employees,” “competitors,” “prospects” and “customers.”

However, two unsuspected threats broke the magic seal: a globalized economy and global networks. “Who left the damn door open?” business lamented. But by then it was academic.

Now how, you are likely asking yourself, does this not constitute business bashing? In retrospect, it sounds pretty stupid for business to have isolated itself from the rest of the world like that, operating strictly by its own lights for its own ends. Here’s why: because this attitude wasn’t unique to business. “High culture” – the cultured and cultivated cadre who delighted in looking down on business – was doing exactly the same thing at the same time. This was called modernism, and its slogan was “Art for art’s sake.” Modernism wasn’t a movement. People didn’t ever really say, “No kidding? Why, I’m a modernist too!” It was more a second-order reaction to the so-called Enlightenment project that aimed for “the disenchantment of the world.” Rationality and progress filtered through the experience of early industrialism with its “dark Satanic mills” led to Romanticism, which after Darwin and Freud and the karmic back-pressure from the looming First World War, had artists rather down in the mouth, you could say, about the whole reason for reason, and trying to insert a little more magic and mystery into the tour. Disenchantment, re-enchantment, then everyone got tired. Or confused. Again, the history is informal, but I’m trying to be balanced with respect to whatever two cultures may still be left standing. God having been declared dead, Reason was supposed to fill in. But it wasn’t looking too good. Aha, someone said – perhaps T.S. Eliot or Virginia Woolf (of whom I am not one bit afraid) – perhaps it’s aesthetics then! But not the old aesthetics. Heavens no. We need a new formalist aesthetics, by which works of art are judged only by their use of color, form, composition, balance – anything that’s inside the frame. This was the flower of “high modernism,” which coincided with and was reinforced by the New Criticism.

If this seems entirely silly, that’s because it was. If you agree, congratulations: you’ve just become postmodern!

But notice too what was outside the frame: politics, society, culture – as more broadly construed than a genteel landed gentry or an elitist literati. These earth-bound human concerns were sacrificed to an abstract conception of art that dealt with a world of pure form. How? If you have to ask, you must not be one of us. It was a club. A cult of culture.

And though it was silly to think that way, the prescription against social context was no less strange than the attitude business adopted toward abstraction at roughly the same historical moment – and largely continues to operate under today. Through various forms of “scientific management” (Taylor supplied only the basic idea), business could ignore those messy “human factors.” Translating, this simply means people. In the hands of business, abstraction became infinitely more powerful than it did in the world of art. It enabled repeatable procedures, command that really commanded and control that had real clout – “Don’t want to play along? Fine then, Jesperson, you’re fired.” – and led up the food chain from principle to equation to formula to the big daddy of them all: algorithm. Handed off to a digital computer, there was no telling what that puppy couldn’t do.[6]

Abstraction is freedom from context. Linguistics developed context-free grammars. These enabled well-formed Chomskian expressions of the form: “Colorless green ideas sleep furiously,” thus liberating syntax from what so many still love to call “mere semantics.” The same idea was borrowed by computer science (note that, after a few years mucking about with data processing, it declared itself a science) to create context-free programming constructs. Given the right set of values populated into a relational database, and given the right algorithm to operate across them, you could prove… why, anything you wanted. Because the “right” data and the “right” algorithm were the ones that proved whatever you wanted to believe about the world. Q.E.D. And the beauty part was that you didn’t need to go ask the world in advance. If it didn’t already, you could make it behave that way. Stay inside the corporate fortress. Never get your hands dirty. Progress. Powerful stuff.
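The context-free idea is concrete enough to sketch. Here is an illustrative toy grammar in Python – the rules and word lists are my own inventions, not Chomsky’s – showing how a context-free grammar happily derives “colorless green ideas sleep furiously”: each rule expands in blissful ignorance of everything around it, so the output is perfectly well-formed and perfectly meaningless.

```python
import random

# A toy context-free grammar in the spirit of Chomsky's example.
# Each left-hand symbol maps to a list of possible expansions.
# (Grammar and vocabulary are illustrative, not from any real parser.)
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["Adj", "Adj", "N"]],
    "VP":  [["V", "Adv"]],
    "Adj": [["colorless"], ["green"], ["four-legged"]],
    "N":   [["ideas"], ["algorithms"]],
    "V":   [["sleep"], ["multiply"]],
    "Adv": [["furiously"], ["quarterly"]],
}

def generate(symbol="S", rng=random):
    """Expand a symbol by picking productions at random.
    Context-free: each expansion ignores everything around it."""
    if symbol not in GRAMMAR:          # a terminal word
        return [symbol]
    production = rng.choice(GRAMMAR[symbol])
    words = []
    for sym in production:
        words.extend(generate(sym, rng))
    return words

if __name__ == "__main__":
    # Syntax without semantics, e.g. "green colorless algorithms multiply furiously"
    print(" ".join(generate()))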

If you drank enough of this Kool-Aid, you could even get yourself to believe that artificial intelligence was just around the corner. Just a few more MIPS or mega-FLOPS. All a matter of Moore’s Law, really. Stop me if you’ve heard this one already. And I know you have. A million times. Me too. I used to work inside some of the biggest AI projects in the world. I breathed the rarefied air of (what Eisenhower called, remember) the military-industrial complex – until I came to see that artificial intelligence was nothing more than high-tech Taylorism. Intellectual work orders executed with perfect obedience – i.e., without people. I believed it until one day I asked, “So about these ‘lights-out’ robotic factories… If they catch on, who’s going to buy their output?” The response I got should have been predictable: not our area of specialization.

No less weirdly than modernist art, business convinced itself it could ignore everything outside the frame – which in the case of business consisted of Axiom One: maximize profit. If business was logically consistent, it wouldn’t give press-release lip-service to concerns about the environment or social responsibility. It would simply say: not in our charter. It has, in fact, learned to do this quite well with respect to “downsizing.” The official response to any question on that front has become: “Social contract? What social contract?” Sink or swim. There just aren’t enough life jackets to go around. Sorry, Charlie.

You want to know what burst the AI bubble? Easy: the proliferation of the personal computer. Suddenly, people couldn’t be snowed any longer by Big Science. As long as “computer” meant an exorbitantly priced mainframe, people were willing to believe we’d have flying cars any day now, that we’d be walking around in glass-domed cities wearing togas and having exciting psychotherapy sessions – maybe even sex! – with some glowing, all-knowing machine intelligence. But after the PC… are you kidding? This piece of junk? The mightiest most well-heeled corporation on the planet can’t keep it from crashing. With the Blue Screen of Death, our eyes were suddenly opened to the (ahem, cough, ahem) programming challenges that still confront the industry as a whole. Yeah, sure. However, and notwithstanding this rude awakening – such is the indomitability of the human spirit – we did figure out how to have sex with them.

Another reason AI crashed and burned was natural language. “Natural,” as in the way real people talk. If you can dig it. Not some collection of context-free formalisms jammed into a fixed-field database operated on by however many production rules. I’ve talked to these things. Very funny. To succeed, artificial intelligence – king of the magically non-algorithmic algorithms – desperately needed to understand natural language. But it couldn’t. End of story. Game over. And what proved this beyond a doubt was that the net suddenly flooded the world with natural language. Such language didn’t fit the broken old fixed fields, and the fields couldn’t be fixed fast enough to accommodate it. Companies were used to telling suppliers and customers what was important. Now suppliers were telling them, and – can you believe it? – not even using the correct form! What’s worse, they were laughing. They were telling jokes. To paraphrase W.B. Yeats, mere anarchy was loosed upon the world. Only Yeats was not noted, as I am, for adding: groovy!

Human beings are not context-free. You tell them colorless green ideas are sleeping furiously, they want to know where. They want the URL. You tell them you can personalize your Nikes, they want to have “sweatshop” embroidered on them. You tell them no, you can’t do that – maintaining command, preserving control – and before you know it, 50 million people are rolling on the floor laughing at your pathetic attempt to follow procedure, to hew to the abstract algorithms of businessism.

Once upon a time, isolation worked. Gentlemen, select your categories – start your rule-based knowledge engines! It doesn’t anymore. There is no World of Art, no World of Business – nor of Science, Politics, Religion, Philosophy, Music, Literature. These “worlds” never existed. They are manners of speech, insubstantial abstractions. In partial evidence of this, the field Max Weber started many years ago – economic sociology – is enjoying a remarkable resurgence. Trying to understand human beings as strictly economic non-social entities doesn’t work any better than trying to understand them in strictly social non-economic terms. Many such old categories, once useful for apportioning power within (still) medieval university systems, have long since outlived their usefulness.

Donning the hat of The World of Business (or any “World of” hat) may hold the inevitable moment of existential dread at arm’s length for a while. But ultimately, it affords little solace or protection. Telling someone, “I’m a Company Man” today is a whole lot worse than admitting you’re an idiot. Which is not far off from what people are doing online: “Hey, what do you think it’s all about?” – where “it” may represent anything from the Q3 sales quota to the meaning of life. Understanding that no one is in control, we tend to be less reticent about appearing stupid. No longer pretending to have all the answers, we tend to ask each other.

Toward the end of his life, Freud posited another instinct. He called it Thanatos: counterbalancing the pleasure principle, the death instinct. This was in a book tellingly titled Civilization and Its Discontents. When business says the health of the environment is outside its charter to maximize shareholder profits, it needs to be reminded by someone or someones who have successfully resisted the Kool-Aid of Abstraction – call them the designated drivers of the human race – that without an earth, there won’t be anyplace to put the corporate headquarters. It needs to be reminded that, without the goodwill of the people they’ve kept at bay for so long, there won’t be a business left to busy itself with. But it’s a moot point. The fact is, these people are no longer at bay. They’re out baying at the moon on the web. Or mooning your company from some webified e-commerce loading bay. Whichever shoe fits better. The logic of business has been air-tight until fairly recently. However, it was the logic of businessism – not just making and selling stuff to people, but a dysfunctional complex of primitive defense mechanisms and neurotic behaviors that have become as toxic as anything regulated by the EPA. The bad news is that refusing to let go of this mentality will surely kill your business. The good news is simpler: holding on is no longer necessary. And anyway, it’s too late.

It’s easy to see how business got itself into this morass – after all, what are business critics for? But getting out will require more than a surface understanding. How’s that? You expect me to tell you how to do it? What, you think information wants to be free? Sorry, Charlie. Buy the book. Or read the gonzo-model hints in HBR (“Smart Customers, Dumb Companies,” November-December, 2000). Action figures sold separately.


Christopher Locke (clocke@panix.com) is co-author of The Cluetrain Manifesto and author of Gonzo Marketing: Winning Through Worst Practices (Perseus Books, October 2001).

[1] From the movie Network.

[2] Des Dearlove and Stuart Crainer, “A Whole New B-School,” The Industry Standard, April 16, 2001.

[3] Christopher Lasch, The True and Only Heaven: Progress and Its Critics, W.W. Norton, 1991.

[4] Quoted in Laurence R. Veysey, The Emergence of the American University, University of Chicago Press, 1965, p. 14.

[5] Thomas Kuhn, The Structure of Scientific Revolutions (second edition), University of Chicago Press, 1970. See also Paul Feyerabend, Against Method (third edition), Verso, 1993.

[6] David Berlinski, The Advent of the Algorithm: The Idea that Rules the World, Harcourt Brace, 2000.