Unbelievably, some employees took him to task for being greedy and not allocating more of his 80 percent share in the firm to his workers. At least one former executive, cut from the cloth of Ayn Rand herself, disagreed. “We live in a world where everyone says, ‘It’s unfair—somebody got more than me.’”38 Employees negotiated their stock options up front and were fortunate that the company even survived through lean years on the strength of Jobs’s checkbook. “Can you really blame Steve if he didn’t feel like giving them more stock than they had agreed on when they were hired?”39 Like Roark, Jobs neither gives nor asks for charity. As Roark puts it, “I am not an altruist. I do not contribute gifts of this nature.”
With future films A Bug’s Life, Toy Story 2 and 3, Monsters, Inc., and many others, Pixar would rack up billions in earnings and claim the title of the most successful movie studio of all time. More important, Jobs had transformed an entire industry through his audacity and stalwart belief—both in the technologies he thought were cool and in himself. Computers would enter the mainstream of visual entertainment as a vehicle to tell enduring stories. And Jobs wasn’t done yet.
Back on Top
By 1995, Apple was on the ropes and struggling to stay standing. Customers were flocking to the latest generation of improved Microsoft Windows software, and it looked like Apple might become a footnote in the annals of computer history. Sculley had been forced out in 1993 after Apple’s market share shriveled from 20 percent to a measly 8 percent on his watch. Turnaround expert Gil Amelio was installed to right the ship. He recognized that cost cutting would go only so far and began to push for a new operating system to first defend and then rebuild Apple’s market share. Increasingly convinced that the foundering in-house team was incapable of developing a solution in time, he cast about for a third-party alternative.
After analyzing the field of players, including several conversations with Bill Gates at Microsoft, Amelio decided that NeXTSTEP might be his salvation and began negotiating with a surprised but amenable Jobs to buy his company outright. Apple eventually paid $325 million in cash to NeXT’s investors, plus 1.5 million shares of Apple stock that went to Steve Jobs.40 Steve was also retained as a strategic adviser to Apple. NeXTSTEP would become the basis for the Mac OS X operating system and the company’s path back to profitability. But it was too little, too late for Amelio.
By mid-1997, Apple’s market share had fallen to 3 percent and the company reported a quarterly loss of $708 million. Amelio was ousted and Jobs was installed as interim president and CEO. He would work for the princely salary of $1. According to long-gone CEO Sculley, “I’m actually convinced that if Steve hadn’t come back when he did—if they had waited another six months—Apple would have been history. It would have been gone, absolutely gone.”41
Apple’s stock price would seem to agree with Sculley—as we show in Figure 1.1.
Figure 1.1 Apple (AAPL) Stock Price
Back at Apple, Jobs quickly focused on revitalizing the business. He killed the foundering Apple Newton, a clunky handheld device that was widely lampooned, including in a hilarious sequence in the popular comic strip Doonesbury.
In August 1998, Jobs introduced the iMac, an all-in-one unit encased in a translucent turquoise plastic shell harkening back to the days of the original Macintosh. Critics called it technically unimpressive and predicted it would be hampered by its overreliance on the universal serial bus (USB) for connectivity to peripherals. Once again, the traditionalist critics were wrong. While a nascent technology at the time, USB would become truly universal, allowing standardized connectivity of keyboards, mice, printers, and portable memory across PCs and Macs alike. The look of the iMac itself would become a design icon of the late 1990s.
At $1,299 a pop, the iMac received over 150,000 preorders and went on to sell 278,000 units in the following six weeks.42 Strong sales were reported both to first-time computer buyers and to those switching from Windows-based PCs. In October, Jobs reported the first profitable fiscal year since 1995. As one Wall Street analyst remarked, “Steve pulled the rabbit out of the hat over the past year. Apple was in disarray. It was at the gate of extinction. Now we have a company turnaround.” The results would catapult Apple back into the mainstream computer market after it had nearly perished by the roadside as an also-ran.
With the company stabilized and on the road to recovery, some CEOs might rest on their laurels and collect a fat bonus. But not Steve Jobs. He sees himself not so much as a business executive as an artist who is always taking new creative risks. In a Fortune interview he explained, “If you look at the artists, if they get really good, it always occurs to them at some point that they can do this one thing for the rest of their lives, and they can be really successful to the outside world but not really be successful to themselves. That’s the moment that an artist really decides who he or she is. If they keep on risking failure, they’re still artists. Dylan and Picasso were always risking failure.”43
By the end of the decade, music companies were struggling to confront a changing technological landscape. Clinging to old-school models of physical distribution channels, they were helpless in the face of a burgeoning network of Internet connectivity. In the past, music buyers might dub a copy or two of their favorite songs to give to friends. Now the same music buyers could “rip” a CD into a digital file and share it with a worldwide network of millions with the click of a mouse. Why buy a CD when you could get the music for free through file-sharing services like Napster?
For Jobs, music held a special place in his heart, along with respect for intellectual property. He could see the problems emerging in the music industry and was appalled at the spastic response by the record companies. On one hand, they attempted to crack down on criminal pirates—often a kid in a dorm room who was just enthusiastic about music and was listening to emerging artists. On the other hand, they offered restrictive subscription services on a pay-by-the-month model. Jobs saw a middle path and set out to change the landscape.
As he explained to Rolling Stone, he set up meetings with record executives. First, he made it clear that he respected the primacy of intellectual property rights—what individualist wouldn’t? “If copyright dies, if patents die, if the protection of intellectual property is eroded, then people will stop investing. That hurts everyone. People need to have the incentive so that if they invest and succeed, they can make a fair profit. But on another level entirely, it’s just wrong to steal. Or let’s put it this way: It is corrosive to one’s character to steal. We want to provide a legal alternative.”44
Next, he demolished their digital business model. “We told them the music subscription services they were pushing were going to fail. MusicNet was gonna fail, Pressplay was gonna fail,” Jobs would say. “Here’s why: People don’t want to buy their music as a subscription. They bought 45s, then they bought LPs, they bought cassettes, they bought 8-tracks, then they bought CDs. They’re going to want to buy downloads. The subscription model of buying music is bankrupt. I think you could make available the Second Coming in a subscription model, and it might not be successful.”45
Finally, Jobs described the middle path. He would offer an Apple music store. It would be safe from viruses; it would be fast; and it would be high-quality, inexpensive, flexible, and, best of all, legal. In a way only Jobs’s mind could synthesize, he struck an elegant balance between artists’ rights and customer usability. Once you bought a song, you owned it. You could burn it onto a CD; you could play it directly from your computer or portable device. You could even share it with a few friends. But the embedded technology would prevent mass distribution and wide-scale pirating. At $0.99 per song, it was affordable—an impulse item—yet artists were compensated for their work. It was brilliant.
And in some ways it took a figure as big and trusted as Jobs to move an industry seized in paralysis as it faced the technological future. Only he had the clout, the appreciation, and the respect to pull an entire industry toward a visionary future. By the end of the decade, Apple iTunes would be selling over a quarter of all music in the United States. Jobs again had rescued and transformed a moribund industry—just because it was cool.
To Infinity and Beyond
What does the future hold for Steve Jobs? His problems with his health are well-known, but as of this writing he’s been able to cheat death as brilliantly as he’s been able to overcome technology and business challenges throughout his life.
Someday death will come to him, as it must to all of us. What he’s built for the world will make him an immortal figure in the history of technology and business. But he’s immortal in another sense, in the way that all self-motivated and self-consistent people are—that they don’t die a little bit every day by compromising themselves, that during their lifetimes they truly live.
What a fellow artist said of Howard Roark in The Fountainhead might have been said of Steve Jobs: “I often think he’s the only one of us to achieve immortality. I don’t mean in the sense of fame, and I don’t mean he won’t die someday. But he’s living it. I think he is what the conception really means.”
Chapter 2
The Mad Collectivist
Paul Krugman as Ellsworth Toohey, the man who preaches socialism from the pages of America’s newspaper of record
“We’ve fixed the coin. Heads—collectivism, and tails—collectivism. . . . Give up your soul to a council—or give it up to a leader. But give it up, give it up, give it up. My technique . . . don’t forget the only purpose you have to accomplish. Kill the individual. Kill man’s soul. The rest will happen automatically. Observe the state of the world at the present moment. Do you still think I’m crazy . . . ?”
—The Fountainhead
Who is Ellsworth Toohey?
In The Fountainhead, villain Ellsworth Toohey symbolizes the collectivist, in contrast to the hero, Howard Roark, who symbolizes the individualist.
Toohey is a brilliant and articulate—but sickly and puny—child, raised in an impoverished household by a weak father and an overprotective mother. Envious of his wealthier and stronger classmates, he uses his sharp mind and even sharper tongue to undermine them.
After going through a brief religious phase as a teenager, Toohey becomes an avowed socialist. In adulthood, he drops any overt association with socialist politics, but dedicates his life to promoting collectivism gradually, through his growing influence as a public intellectual.
He writes a book on architecture throughout history, and improbably it becomes a best seller. Toohey parlays that into a regular column in New York’s leading newspaper, the New York Banner—ostensibly for architectural criticism, but quickly evolving into a personal soapbox from which he promotes all manner of collectivist causes.
Beyond obvious advocacy, Toohey’s strategy with the column is to promote architects and other artists—authors, composers, playwrights—of no ability, to enshrine mediocrities as superstars. His goal is to corrupt the culture—to advance collectivism by default, by eliminating from the culture any great individuals who could have offered an alternative.
Toohey singles out the brilliant architect Howard Roark as his most dangerous opponent—an exemplar of the greatness of the individual, not the collective. To defeat Roark, he masterminds a series of complicated plots aimed at discrediting Roark and economically ruining him, at one point causing him to abandon architecture and work as a laborer in a quarry to survive.
Throughout, Roark never lifts a finger to either fight Toohey or defend himself against him. Toohey is simply beneath Roark’s notice, demonstrating Rand’s belief that evil is small and impotent, and best simply ignored.
Christiane Amanpour’s eyes darted back and forth in fear and her mouth twisted in disgust, because she could see where this was going. A guest on her Sunday morning political talk show, ABC’s This Week, was getting dangerously overexcited, and something very regrettable was about to happen.
She could see that he was winding himself up as he talked about how a recent deficit-reduction panel hadn’t been “brave enough”—because it failed to endorse the idea of expert panels that would determine what medical services government-funded care wouldn’t pay for.1 When ObamaCare was still being debated in Congress, conservative spokeswoman Sarah Palin had created a media sensation by calling such panels “death panels,” causing most liberals who supported ObamaCare to quickly distance themselves from any idea of rationing care as tantamount to murder.
Cut to Amanpour’s horrified face. Cut back to the guest. Then it happened.
The guest said, “Some years down the pike, we’re going to get the real solution, which is going to be a combination of death panels and sales taxes.”
It was all the more horrifying because the guest was not a conservative, not an opponent of ObamaCare. This guest was an avid liberal, a partisan Democrat, and an enthusiastic supporter of government-run health care. He was endorsing death panels, not warning about them. He was saying death panels are a good thing.
Apparently it didn’t bother him that his choice of words, “real solution,” had historical parallels that are very disturbing in any conversation about government control over who will live and who will die.
And it was even more horrifying because of who this guest was. This was no fringe lefty wearing a tinfoil hat churning out underground newspapers in his parents’ basement. This was an economics professor at Princeton, one of the country’s most prestigious universities. This was the winner of the Nobel Prize in economics, the highest honor the profession can bestow. This was a columnist for the New York Times, the most influential newspaper in the world.
This was Paul Krugman, live, on national television, endorsing government control over life and death. And while we’re at it, let’s raise taxes on those who are permitted to live.
The Abysmal Pseudo-Scientist
Who does Paul Krugman think he is to think such things, never mind say them on television?
He’d like to think he’s John Maynard Keynes, the venerated British economist who created the intellectual framework for modern government intervention in the economy. Keynes is something of a cult figure for modern liberal economists like Krugman, who read his texts with all the exegetical fervor with which Scientologists read the pulp fiction of L. Ron Hubbard. But Krugman will never live up to Keynes. However politicized his economic theories, Keynes’s predictions were so astute that he made himself wealthy as a speculator. Economics is called “the dismal science,” but as we’ll see, Krugman’s predictions are so laughably bad his economics should be called the abysmal pseudo-science.
If Krugman is not Keynes, maybe he’s John Nash, the mathematician portrayed in the film A Beautiful Mind. They’re both Princeton economists. They’ve both won the Nobel Prize in economics. And they’re both bonkers. In Krugman’s own words, “My economic theories have no doubt been influenced by my relationship with my cats.”2 But so far there’s been no movie about Krugman, just a cameo appearance as himself in the lowbrow comedy Get Him to the Greek in which his lines consist of “Yeah,” “Thank you,” and “Oh boy.”
As a boy, Krugman says his “secret fantasy” was to be Hari Seldon, the “psychohistorian” from science fiction author Isaac Asimov’s Foundation trilogy, who used what we would now call econometrics to secretly control the progress of human civilization.3 This inspiration is what drew Krugman to study economics, as he has revealed more than once, as though he were proud of it.4
Maybe in practice he’s more like Dr. Strangelove, the dark Hari Seldon, the cold war madman of Stanley Kubrick’s film masterpiece. They both have near-genocidal notions of how government should determine who lives and who dies—especially what kind of experts they should consult in the decision. Krugman is nostalgic for the cold war era when “the U.S. government employed experts in game theory to analyze strategies of nuclear deterrence. Men with Ph.D.s in economics, like Daniel Ellsberg.”5 Maybe you thought real men don’t have Ph.D.s in economics. But Krugman does.
But Paul Krugman isn’t Keynes, Nash, Seldon, or Strangelove—as much as he’d like to be. The indisputable truth is that he is the living embodiment of Ellsworth Toohey, the villain from Ayn Rand’s first great novel, The Fountainhead.
Krugman mocks people who have been inspired by Rand,6 but he himself is living Rand with every breath he takes. Truly, the parallels between Krugman and Toohey are downright eerie.
Both are acclaimed scholars who have written books for the masses on complex topics—architecture for Toohey, and economics for Krugman.
For both, their erudite texts were the springboard for becoming public intellectuals with high-profile opinion columns in a major New York newspaper—the New York Banner for Toohey and the New York Times for Krugman.
Both are bleeding-heart socialists. Krugman confesses he is “an unabashed defender of the welfare state, which I regard as the most decent social arrangement yet devised.”7 He advocates “a state that offers everyone who’s underpaid an additional income.”8
Both Toohey and Krugman hate the rich. Krugman questions their very right to live, pronouncing that they must be “defenders of the downtrodden” to have any hope of “justifying their existence.”9
Both exalt the incompetent at the expense of the competent. Krugman sniffs, “The official ideology of America’s elite remains one of meritocracy. . . . But that won’t last. Soon enough, our society will rediscover . . . the vulgarity of talented upstarts.”10
Both tell lies. Toohey warns, “I shall be forced to state a few things which are not quite true” in the service of his collectivist mission. Daniel Okrent, the Public Editor of the New York Times, rendered the judgment in the pages of the Times itself that “Paul Krugman has the disturbing habit of shaping, slicing and selectively citing numbers in a fashion that pleases his acolytes but leaves him open to substantive assaults.”11 But we shall see subsequently that when it comes to lies, damn lies, and statistics, Krugman chooses all of the above—including going on national television and falsely accusing his most trenchant critic of stalking him.