1-Page Summary

Steve Jobs, as co-founder and CEO of Apple, was a guiding force in the electronics industry, from the development of the first home computers in the 1970s to the world-shaking impact of smartphones and tablets. The decisions Jobs made and the passions he followed had a direct impact on the way we use computers and share information. To understand Steve Jobs’s life is to understand why the modern information landscape functions the way it does.

While much has been written about Jobs’s life, this biography was commissioned by Jobs himself. He wanted his own point of view on the record, but he didn’t want a “puff piece” that would glorify his achievements while glossing over his negative side. Instead, he tapped well-known biographer Walter Isaacson to write a book that would balance his accomplishments with an unflinching look at his personal demons. In an uncharacteristic move for the usually domineering Jobs, he allowed Isaacson full access to his life with no oversight or control of what was written. Indeed, Jobs passed away without ever having read it.

(Shortform note: Walter Isaacson first met Steve Jobs while working as a journalist for Time. When Jobs reached out about writing his biography, Isaacson had already written books on Henry Kissinger, Benjamin Franklin, and Albert Einstein. This biography was released in 2011 shortly after Steve Jobs’s death. It was a critical and commercial success, though several of Jobs’s family and friends were unhappy with Isaacson’s portrayal. The book was adapted into a 2015 film by Danny Boyle starring Michael Fassbender as Jobs.)

This guide will give an overview of Jobs’s life, from his years growing up in Palo Alto, California, to his ups and downs at Apple and his battle with cancer. We will then dive deeper into the specific themes that Isaacson drew from the narrative of Jobs’s life—what made him unique as a technological innovator; the abusive, controlling dark side of his personality; and the values and vision that set Jobs apart as a leader in the corporate world.

This guide will also explore Jobs’s legacy as an industry giant, the wider context that shaped both his life and the Information Age, and other writers’ views on Jobs’s impact. Because Isaacson’s biography was published shortly after Jobs’s death, we’ll examine how Jobs’s legacy has lived on in the years that have followed its release.

Part 1: Steve Jobs’s Life

Steve Jobs was as much a product of his time as he was one of its chief architects. His rise to fame and tumultuous life mirrored the whirlwind of the digital revolution. Jobs came of age on the cusp of the 1970s’ computer boom, and he guided many of the 1980s’ advances. After falling out of the limelight in the ’90s, he surged back to prominence at the turn of the millennium with innovations that still impact the digital landscape to this day.

Formative Years (1955-1974)

Jobs would not have become who he was without the unique combination of circumstances that defined his early life. His upbringing shaped the path he would follow, from his childhood in Silicon Valley to his rebellious years in college and his fascination with electronics and design. His interest in computers led to a particular meeting of minds that would shape the future of the digital world.

On February 24, 1955, Steven Paul Jobs was born to Joanne Schieble and her boyfriend, Abdulfattah Jandali. Schieble’s parents objected to her dating a Muslim and insisted she give her child up for adoption. Steve was taken in by Paul and Clara Jobs, a couple in Palo Alto, California. Isaacson makes it clear that even though Jobs always knew he was adopted, he considered his adoptive parents to be his real parents. Because they’d chosen to give him a home while his birth parents gave him away, Jobs grew up feeling both abandoned and special.

(Shortform note: Research shows that even children adopted as infants experience separation trauma from their biological parents, which can manifest as feelings of abandonment and shame and result in behavioral problems. Studies have revealed that children heal from the separation better when their parents tell the story of their adoption. While Steve Jobs’s adoption was closed—he had no contact with or information about his biological parents—Isaacson says that Paul and Clara Jobs emphasized to him that they chose to be his parents.)

Jobs grew up in Silicon Valley, surrounded by people who worked for Hewlett-Packard, Intel, and other pioneering companies in the field of computer technology. The young Steve Jobs was fascinated by the history of the place, and early on he developed into an electronics tinkerer. Isaacson recounts that while still a high school freshman, Jobs needed help with an electrical project, so he looked up the CEO of Hewlett-Packard in the phone book and called him at home to ask for parts.

(Shortform note: The San Francisco Bay area has been associated with technological progress since the early years of the 20th century, when it became home to Hewlett-Packard, Bell Labs, and the Ames Research Center. The ’60s and ’70s saw an explosion of computer companies in the region, such as Intel, Atari, and Xerox PARC, while the nearby Stanford Research Institute became part of ARPANET, the Internet’s precursor. The name “Silicon Valley” was coined in 1971 by journalist Don Hoefler of Electronic News.)

However bright a student he was, Jobs had a rebellious streak. In school, he was a prankster, but his parents always defended him, believing that his teachers didn’t challenge him enough. He also began experimenting with marijuana, LSD, and the counterculture movement, which would inform many of his attitudes in life. (Shortform note: The counterculture of the ’60s and ’70s sprang up in the midst of the Vietnam War, the sexual revolution, the civil rights movement, and the spread of Eastern philosophies. Characterized as “hippies,” members of the movement were predominantly anti-war, anti-capitalist, and against societal norms in general.)

In 1971, Jobs met Steve Wozniak, who was five years older but still a child at heart. Jobs and Wozniak bonded over electronics, as well as their love of pulling pranks. After reading about a way to hack the telephone network, Wozniak designed a circuit that he and Jobs used to make long-distance prank calls. Jobs hit on the idea of selling copies of the circuit to students at Stanford. Isaacson points out that this division of labor would later become the model for Apple, with Wozniak designing the system and Jobs responsible for packaging and marketing.

Steve Wozniak

Wozniak, like Jobs, was a California native and electronics prodigy who initially dropped out of college, though he did eventually earn a degree from UC Berkeley. Wozniak designed the circuit board that would become the core of the first Apple computer. He went on to design the Apple II computer, over which he and Jobs clashed. Wozniak wanted its design to be open, with multiple expansion slots for hobbyists and tinkerers. Jobs favored a more closed design, but Wozniak’s wishes won out.

After a plane crash in 1981, Wozniak suffered a temporary coma and anterograde amnesia. He returned to Apple for a short while in the ’80s but left to pursue his own design projects, such as inventing the first programmable universal remote and early wireless GPS tracking technology.

After graduating from high school in 1972, Jobs enrolled in Portland, Oregon’s Reed College. (Shortform note: Reed College’s ties to the counterculture movement stretch all the way back to the 1950s, when it was a launching point for poets of the Beat Movement such as Gary Snyder, Philip Whalen, and Allen Ginsberg. Reed College also had a reputation for lax policies regarding drug use on campus.)

While diving head-first into the early ’70s counterculture, Jobs developed an interest in Zen Buddhism, with its stark, minimalist values that he would keep over the years. He would also develop a fixation on extremely limited vegan diets and extended periods of fasting. (Shortform note: Single-food diets, while promoting short-term weight loss, have been shown to cause nutritional deficiencies and encourage unhealthy eating. The single-food diets Steve Jobs preferred were fruitarian, a subset of veganism restricted to fruits, nuts, and seeds.)

Jobs enjoyed his time at Reed, but he detested taking required classes. He eventually dropped out but persuaded the school to let him audit classes he was interested in for free. After leaving school for good, Jobs traveled to India, where he hoped to continue his spiritual pursuits, diving into self-deprivation and asceticism. (Shortform note: In Becoming Steve Jobs, Brent Schlender and Rick Tetzeli explain that after college, Jobs felt tempted to lead the lifestyle of a monk, though he was far too ambitious. He was attracted to the spiritual aspects of asceticism found in Hindu beliefs and was also drawn to Buddhism’s quest for perfection.)

The Birth of Apple (1975-1977)

After coming home from India, Jobs reunited with Wozniak at a time when a larger network of computer hobbyists was forming. This wider community gave their partnership a springboard from which they would make a lasting mark on the world. Jobs and Wozniak went from unknowns to industry-shaking pioneers with astonishing speed.

(Shortform note: A crucial meeting place for California’s growing electronics community was the Homebrew Computer Club, founded in Menlo Park by political activist and hobbyist Fred Moore. In two years, its members grew from 32 attendees to a mailing list of over 1,500.)

At a meeting of the Homebrew Computer Club, Wozniak watched a demonstration of the new Altair 8800 computer, the first home computing system to include a microprocessor chip. Seeing the Altair gave Wozniak the idea of putting a microprocessor into a terminal he was currently designing. It worked, and Wozniak became the first person ever to type on a home computer keyboard and see text appear on a screen.

The Revolutionary Altair

Prior to the invention of the microprocessor, computers were large, expensive mainframes owned by universities, corporations, or governments. Instructions would be fed to the mainframe using punch card systems or teletype terminals. A computer that could fit in a home, or even on a table, was unthinkable.

The Altair 8800 was a do-it-yourself computer kit developed by a team of Air Force veterans. It included an internal 8-bit microprocessor, and its interface was a panel of switches and lights. In 1975, Popular Electronics hailed it as “the most powerful computer ever presented.”

Wozniak wanted to share his idea with the other Homebrew members, but Jobs convinced him that they could sell pre-printed circuit boards instead of giving the design away for free. The two scraped together their savings to fund their new partnership, which Jobs dubbed “Apple Computer” after an apple orchard commune he’d worked at in college.

(Shortform note: One of the keys to Apple’s success is the concept of emotional branding. Instead of merely advertising the quality of its products, such branding promotes good feelings in customers. According to Isaacson, this goes all the way back to Jobs’s choice of the company’s name. The juxtaposition of “Apple” and “Computer” conveys a feeling of playfulness and fun.)

Wozniak and Jobs introduced the Apple I at the Homebrew Computer Club, where it caught the attention of a local computer dealer. The garage of Jobs’s parents became their first factory, with friends and family serving as the assembly line to manufacture the Apple I circuits.

Already thinking ahead, Jobs decided their next model should be a fully integrated system that required no assembly by the user. His intent was to take the personal computer from a hobbyist’s toy to a product for the general public. He and Wozniak went into business with Mike Markkula, formerly at Intel, to fund Apple’s expansion and production costs. (Shortform note: Mike Markkula, Apple’s unsung third founder, was often thought of as “the adult in the room” to mediate between Jobs and Wozniak, though he was also a programmer and engineer.)

The Apple II debuted in 1977, kicking off the personal computer industry. (Shortform note: While Apple may have launched the personal computer into the public eye, it was not the only company in the market. Tandy’s TRS-80 was released a month after the Apple II, with the Commodore PET following in December. All three ran versions of the programming language BASIC, and are collectively known as the 1977 Trinity.)

Introducing the Mac (1978-1984)

Even with Apple’s initial success, new computer systems were spreading like wildfire, and Jobs watched closely for the industry’s next move. Not only would Jobs lead the next revolution in computer design, but he would also take a stand in what would become the defining contest of the 1980s computer world.

In the late ’70s, Xerox was developing a graphical user interface (GUI) that allowed users to operate a computer by activating icons on a “desktop” with a mouse. Xerox’s venture division wanted a stake in Apple before its public offering, so in 1979 Jobs was given a demonstration of what Xerox’s engineers were working on. According to Isaacson, Jobs was able to bully his way into seeing more of Xerox’s projects in development than its researchers wanted to show him.

Under Jobs’s guidance, Apple’s engineers improved the desktop concept by enabling the mouse to drag and manipulate icons, files, and folders. Many accuse Jobs of stealing the GUI concept, but Isaacson points out that Xerox shares the blame for not making better use of its own technology. (Shortform note: Xerox released its Star 8010 computer in 1981, three years before the Macintosh. It introduced many features that are commonplace today, most notably its graphical interface and mouse. However, at $16,000 each, the Xerox Star was marketed solely to large businesses.)

Meanwhile, an industry-defining battle was brewing. Apple’s rival in business computing, IBM, allowed independent companies to make clones of its products and develop their own software to run on its systems. Jobs refused to let Apple do the same. Isaacson says that Jobs believed hardware and software worked better together when developed in a closed, tightly controlled system—a “whole widget” approach that would define Apple’s products. While less idealistic, IBM’s open system allowed its designs to grab a larger share of the market.

The IBM Compatible

IBM entered the PC market in 1981 with the IBM 5150, built around an Intel 8088 microprocessor. It was faster and had more memory than most other home computers, and it featured an open architecture that let users swap out parts and expand its capabilities. Bill Gates and Microsoft developed the computer’s command-prompt operating system, MS-DOS, which Microsoft licensed to IBM rather than selling it outright.

Because of IBM’s open design, other companies were able to manufacture fully compatible copies that would use the same parts and run the same software. And since IBM didn’t have exclusive rights to MS-DOS, its competitors were able to license that as well. Three former employees of Texas Instruments were the first to do so, releasing the Compaq Portable in 1983. Hewlett-Packard and Dell soon followed suit, and by 1985, IBM Compatibles swept into both the home and business markets, where the cheaply made clones outsold IBM’s originals.

With IBM already claiming victory on the PC battlefield, Jobs prepared a dramatic unveiling of Apple’s new graphics-based computer, the Macintosh. Kicking off the campaign was a cinematic Super Bowl commercial that depicted IBM as a dystopian Big Brother and Apple as a daring rebel. (Shortform note: Apple’s 1984 Super Bowl commercial, directed by Ridley Scott of Alien and Blade Runner fame, was actually opposed by Apple’s board of directors. The commercial was so striking that it was replayed on news programs, won several awards, and began the trend of companies using Super Bowl time slots to feature cinematic, attention-grabbing ads.)

The commercial was followed by a torrent of publicity leading up to the computer’s public unveiling. At the climax of the event, the Macintosh introduced itself through a voice-simulator program. (Shortform note: As an example of the press that the Macintosh received, Popular Science featured Apple’s new creation in its March 1984 issue, praising its versatility and ease of use, as well as its capabilities in word processing and graphics.) The Mac’s graphical interface was elegant and intuitive, compared to command-based systems such as MS-DOS and BASIC. The rest of the computer industry now had to play catch-up.

Apple vs. Windows

Microsoft had already developed software for the Mac, such as versions of Microsoft Word and Excel. Bill Gates was such a fan of the Mac’s GUI that he petitioned Apple to license its OS to other computer systems. Jobs demurred, but Apple’s contract with Microsoft licensed Gates to incorporate Apple design elements into Microsoft’s future products. Microsoft came out with its own graphical interface in 1985, though it wouldn’t become a viable competitor to the Macintosh OS until the release of Windows 3.1 in 1992.

Nevertheless, so many of Windows’ features mimicked those of the Macintosh that Apple filed an infringement lawsuit in 1988. Microsoft had interpreted its agreement with Apple broadly, and since their contract was not specific in its limitations, the courts ruled heavily in Microsoft’s favor.

From Pitfalls to Pixar (1985-1995)

After the triumph of the Mac’s introduction, Jobs would face a series of defeats. While his visionary insights had brought him success, his unrelenting perfectionism and need for control undermined his ability to lead. When he lost control of Apple, his subsequent ventures proved unsuccessful until he found a way to merge his passion for design with his love of the arts.

After the initial surge of Macintosh purchases, sales began to fall off. Despite the computer’s intuitive operating system, its processor was slow and it lacked an internal hard drive. In addition, Isaacson says that Jobs started making poor business decisions, such as trying to oust Apple’s then-president. Engineers on the Mac team began to resign, and Apple’s board rearranged the company’s structure so that Jobs was stripped of any real power. Jobs left the company to found another of his own, taking several high-ranking colleagues with him.

NeXT, his new business, was intended to create high-end workstations for use in academia. But according to Isaacson, running his own business with no supervision let Jobs indulge all his worst impulses. His perfectionism went to such extremes that the NeXT was released two years behind schedule and was priced three times what he’d promised. Even worse, it wasn’t compatible with any other computer on the market. As could be predicted, it flopped. (Shortform note: In academia and business, NeXT was eclipsed by Sun Microsystems workstations, which were IBM compatible, ran the UNIX operating system, and enabled networked file sharing.)

Meanwhile, George Lucas needed to sell off his computer graphics division. Isaacson says that Jobs wanted to buy it outright but instead became its principal investor. Renamed “Pixar,” the new company integrated software and hardware to make art, which touched on all of Jobs’s passions. (Shortform note: Pixar began life as the Graphics Group, part of Lucasfilm’s computer division. When its founders spun it off into its own corporation, the technology needed to make a fully computer-animated film was still several years away. Instead, their main product was the Pixar Image Computer, which was marketed for government, scientific, and medical use.)

Pixar’s graphics systems were too costly for the mass market. To show off what their computers could do, Pixar’s head of animation, John Lasseter, created a digital cartoon, Luxo Jr. The film was only meant to be shown as a demo at a conference, but it took the computer graphics world by storm. (Shortform note: John Lasseter pioneered computer animation when he created the first fully CGI character in the 1985 movie Young Sherlock Holmes. His film Luxo Jr. began as a rendering of the lamp on his desk, which he infused with personality and humor. Luxo Jr. was nominated for Best Animated Short in the 1987 Academy Awards.)

Disney was Pixar’s biggest customer for hardware and software, so Jobs proposed that Pixar and Disney make a film together. That movie, Toy Story, went on to such enormous success that it redefined the field of animation itself. (Shortform note: In addition to putting Pixar on the map, Toy Story was a game-changer for the film industry. It began a trend away from hand-drawn animation while elevating animated features to stand beside traditional live-action films. In Creativity, Inc., Pixar cofounder Ed Catmull recounts that Toy Story set the bar for Pixar’s next film very high.)

Jobs’s Family Life

As Jobs’s career went through its ups and downs, so did his personal life. Just as Jobs himself had grown up feeling both abandoned and loved, so he too would latch on to some of those close to him while treating others with coldness and dispassion. Isaacson describes how Jobs would end up abandoning one family, reconnecting with another, and finally making a family of his own.

Outside of his loving relationship with his adoptive parents, Jobs’s family life was complex. In 1977 he had a daughter, Lisa, with his girlfriend Chrisann Brennan. Isaacson writes that even after a court-mandated paternity test, Jobs remained in denial about being Lisa’s father, abandoning her in much the same way that his own birth parents had abandoned him. (Shortform note: The 1970s saw a sharp increase in the number of children growing up without a father. Though Jobs grew up with two parents present, it’s not uncommon for people who experience abandonment in childhood to sabotage their own relationships as adults.)

In 1980, Jobs began the search for his biological mother, now Joanne Simpson, though he didn’t reach out to her until after his adoptive mother died in 1986. When they finally met, Simpson explained the details of Jobs’s birth and adoption, while apologizing for giving him up. Jobs also discovered that he had a sister, the novelist Mona Simpson, who eventually grew to become his close friend. When Mona tracked down their biological father, Jobs was stunned to learn that he’d met him in passing at a restaurant his father had owned. Nevertheless, Isaacson says, Jobs didn’t want to meet him again as his son.

(Shortform note: Jobs’s biological sister, Mona Simpson, is a professor of English at UCLA and the author of several novels. Her award-winning debut, Anywhere but Here, is a fictionalized account of her relationship with her mother, while her later novel, A Regular Guy, is a thinly veiled depiction of Steve Jobs himself.)

Jobs met his future wife, Laurene Powell, at a lecture he gave at Stanford University in 1989. According to Isaacson, she was a perfect match for his traits, balancing his negative aspects and providing an anchor for his personality. They married in March 1991, and together had three children—Reed, Erin, and Eve. (Shortform note: Laurene Powell Jobs earned an MBA from Stanford University after working for Merrill Lynch and Goldman Sachs. She has founded several philanthropic organizations, such as the XQ Institute and the Emerson Collective, which support education and social entrepreneurship.)

A Triumphant Return (1996-2010)

In Jobs’s absence, Apple’s market share and profitability plummeted. Jobs made a deal to come back in an advisory capacity, along with NeXT’s key engineers and programmers. Soon it became clear that Jobs was running Apple from behind the scenes. Jobs’s priority was to cull Apple’s bloated product line so the company could focus on essentials.

The first new project he developed was the iMac, an intuitive, all-in-one computer that took the company back to the values of the Macintosh. In 1998, the iMac became the fastest-selling Apple product so far. (Shortform note: The original iMac is fondly remembered for its eye-catching design, available in an array of bright colors in contrast to the generic beige towers of other PCs. It included a built-in ethernet port for easy internet access while doing away with the floppy disk drive. It also replaced older legacy connectors with the now-standard USB ports for easy plug-and-play access.)

Isaacson claims that rather than selling iMacs in the big-box stores of the day, Jobs wanted to control the experience of buying one. He pushed for the creation of the Apple Store, which would be specifically designed to communicate the innovative values of the company. When they first opened in 2001, the stores were predicted to fail. Instead, they drew 20 times as many customers as expected and made over $1 billion within the first year of operation.

(Shortform note: Apple's rival Gateway was the leading computer retail chain during the internet boom of the ’90s. One key difference between Apple and Gateway was that Apple’s stores stocked finished products, while Gateway’s did not. Gateway instead offered customers a range of customizable options, from which they could order PCs that were tailored to their personal specifications. While Apple Stores flourished, Gateway shut down its retail outlets in 2004.)

Next on Jobs’s list of new products was a portable MP3 player. The iPod propelled sales of iMacs and gave the Apple brand a more youthful feel. What’s more, says Isaacson, the iPod repositioned Apple from being a mere computer company into a wider market. (Shortform note: The iPod also helped usher in the age of randomized personal playlists and broke apart the album as the primary model for listening to music. By introducing the iTunes store as a simple, seamless way to download songs, Jobs helped stem the tide of music piracy that flourished through file-sharing apps such as Napster.)

Seeing the possibilities of merging the iPod with cellular phones, Jobs launched the iPhone as a combination mobile phone, widescreen iPod, and portable internet device, replacing the clunky keypads of earlier smartphones with an elegant touchscreen interface. By the end of 2010, over 90 million iPhones had been sold, capturing more than half of the global mobile-phone market’s profits. (Shortform note: The iPhone was not the first touchscreen phone—that was the LG Prada, released one month earlier. The iPhone’s screen was larger, with higher resolution, and introduced multi-touch features such as “pinching” to zoom in and out.)

Battle With Cancer (2003-2011)

By his mid-40s, Jobs had already revolutionized the field of computing several times over. Isaacson says that Jobs feared he’d die young and felt driven to accomplish as much as he could within the time he was given. In this, he was prescient.

In 2003, when Jobs was 48, a tumor was found on his pancreas. Though it was discovered soon enough for treatment, Jobs declined to have surgery and instead insisted on fighting the cancer with alternative medical treatments. (Shortform note: The mass his doctors detected was a pancreatic neuroendocrine tumor, which is rare but more easily treated than other pancreatic cancers. If discovered before they spread, such tumors can be completely removed, although recovery can be long and arduous.) It took nine months for those closest to Jobs to convince him to have the tumor taken out, but in the intervening months the cancer had spread.

Jobs kept his ongoing cancer a secret, especially from Apple’s employees and shareholders. By 2008, it had spread to his liver, and his illness could not be kept hidden any longer. Apple’s stock fell, and speculation about his health ran wild. (Shortform note: According to Reuters, rumors of Jobs’s failing health began after his emaciated appearance at an iPhone demo in June 2008. Jobs continued to brush off the topic of his health at events in September and October. When he announced that he wouldn’t appear at the next Macworld conference, Time speculated about Apple’s viability without the presence of its charismatic leader.)

At first, Jobs was resistant to having a liver transplant, but when one became available in March 2009, he flew to Memphis for the surgery. The transplant was a success, but the doctors found signs that the cancer had spread even further. Jobs pressed forward on several new projects, such as designing a new corporate campus for Apple. He admitted to Isaacson that if he didn’t have something to work on, it would seem as if he were giving up on life.

Jobs’s final downturn began in late 2010. Always on the cutting edge, Jobs was one of the first people to have their cancer cells genetically sequenced, which allowed doctors to tailor his treatment instead of relying on broad-range chemotherapy. (Shortform note: In the years since Jobs’s diagnosis, gene-sequencing has continued to advance cancer research, helping identify the genetic mutations that lead to the growth of tumors. Making use of these advances, doctors can use molecularly targeted therapies to treat a variety of cancers while reducing the dangerous side effects of other approaches.)

By July 2011, the cancer had reached his bones. His last act at Apple was to arrange a smooth transition of power to those who would carry on in his footsteps. Steve Jobs passed away at the age of 56, surrounded by his family, on October 5, 2011.

Part 2: Steve Jobs’s Legacy

Steven Paul Jobs was a man of contradictions. He was an idealist, a leader, and a visionary pioneer who earned fame and admiration for his many innovations. At the same time, he was often needlessly cruel—a demanding bully with a black-and-white worldview who insisted that everyone and everything live up to his high expectations. Isaacson argues that Jobs’s positive and negative aspects were inextricably linked, and all contributed to shaping a man who was adored and reviled, perhaps in equal measure. To make his case, Isaacson explores six themes that define how Jobs is remembered—his knack for innovation, his black-and-white thinking, his drive to merge technology with design, his dictatorial leadership style, his emphasis on product over profit, and his denial of any unwelcome reality.

Radical Innovation

Isaacson claims that what set Jobs apart as a technological pioneer wasn’t any skill as an engineer or programmer but his ability to imagine the future before it arrived. His mindset was fostered by the anarchistic counterculture of the 1960s and ’70s, in which rules and norms were routinely circumvented. Jobs embraced the counterculture’s emphasis on spiritual growth, which led to his belief that intuition is more powerful than intellect. His confidence in the strength of his own intuition let him see possibilities that others would miss.

(Shortform note: Jobs was not alone among radical thinkers in valuing the power of intuition. Albert Einstein had much to say on the subject, including his famous quote, “Imagination is more important than knowledge.” In Blink, Malcolm Gladwell describes how “snap judgments” can be made more effective, and he argues that intuitive decision-making can often be more powerful than a drawn-out, conscious thought process.)

Jobs’s forward-looking imagination was evident in his first encounter with Xerox’s graphical user interface: He didn’t just see it as a clever new system. Rather, says Isaacson, Jobs immediately intuited what all computers of the future would look like, and at Apple he developed the GUI concept far beyond what Xerox had created, into the point-and-click interface we all use today.

Isaacson makes it clear that Jobs wasn’t a perfect oracle, but when he misjudged a trend in computing, his response was to take the next leap ahead. For instance, Jobs failed to anticipate the popularity of burning music to CDs in the early 2000s—the iMac included a CD-ROM drive, but not one that could write music CDs. Instead of redesigning the iMac, Jobs moved forward to the next stage of music, the MP3 player. The launch of the iPod quickly made user-created CDs obsolete.

(Shortform note: In Blue Ocean Strategy, business professors W. Chan Kim and Renée Mauborgne argue that to stay ahead of current trends in business, it’s necessary to imagine what the market will look like if trends are taken to their logical conclusions, as Jobs did while the music industry was changing. Kim and Mauborgne also advise leaning into both functionality and emotion, which Jobs made the cornerstones of all of his product designs.)

Isaacson shows that instead of resting on the iPod’s success, Jobs kept looking to the future, particularly toward the cell phone market. Cell phones already had built-in cameras, and Jobs knew that as soon as they incorporated music players, they would kill the market for the iPod. Therefore, he chose to get there first with the iPhone. Coupled with the iPad tablet, which came out shortly after, Jobs helped steer the digital age away from PCs as the primary computing devices of choice.

(Shortform note: While desktop computers are still omnipresent, the years since Jobs’s death have shown a steady decline in PC purchases, while the sales of tablets and smartphones have risen. Portability seems to be the key. The intervening years have also seen the growth of hybrid laptops with detachable keyboards that blend the features of PCs and tablets.)

But Jobs didn’t see these as separate devices. According to Isaacson, Jobs envisioned an information landscape in which a desktop computer would serve as a “hub” for all of a user’s devices. This allowed for seamless integration from one device to another—an album or ebook bought on your iMac would instantly transfer to your tablet or phone. Toward the end of his life, Jobs believed that the hub for a person’s devices would no longer be their PC at all, but would move into the cloud.

(Shortform note: The basic concept of cloud computing, the online hosting of software and data, goes back to the development of the internet itself. Amazon pioneered the modern form of cloud computing with Amazon Web Services, whose core storage and computing services launched in 2006, while the popular cloud storage app Dropbox launched in 2007. Apple introduced its iCloud service in 2011 as a means for users to access both their files and applications remotely, preceding Google Drive by a year.)

While it can’t be said that Jobs created these trends, Isaacson illustrates Jobs’s uncanny ability to intuit future shifts in computing and position his company ahead of its competitors.

A Black-and-White View

Despite the brilliance of Jobs’s leaps of insight, his faith in his own intuition led him into the trap of binary “either/or” thinking. Isaacson argues that this temperament colored not only Jobs’s vision of the future but the way he treated the people around him. In Jobs’s eyes, every person he met was either a genius or an idiot. Every idea was either brilliant or rubbish. Every product was either the best or it was garbage.

(Shortform note: Binary thinking has a certain value in “winner-take-all” scenarios, but it blinds you to situations where solutions require trade-offs. In A New Way to Think, management expert Roger L. Martin argues that truly successful leaders are able to consider opposing ideas at the same time and synthesize new ones with the best qualities of both.)

Jobs was vocal about his opinions and would use them to build people up or tear them down, sometimes on the very same day. This type of behavior was particularly manipulative—Jobs would berate a hapless colleague, then later put them on a pedestal. This made people eager to please him and terrified of failing to do so. (Shortform note: The idea of tearing someone down in order to build them up may have some, albeit controversial, value in strengthening a person’s self-esteem. However, when it takes place in interpersonal relationships, it’s more often a sign of narcissistic manipulation.)

Those who worked with Jobs the longest learned to reinterpret his extremist opinions. For example, if Jobs said an idea was stupid, they decided that he was really saying, “Explain your idea. Why is it good?” In one incident that Isaacson recounts, Jobs verbally accosted an engineer on the Mac team about a design he wasn’t happy with. The engineer then had to go on at length about why he’d developed the design the way he had. In the process of spelling out his thinking, the engineer discovered an even better solution than the one he’d originally shown Jobs.

However productive they sometimes were, Jobs’s angry tirades were needlessly hurtful to staff and morale. His “tantrums,” as Isaacson calls them, could extend to anyone he might interact with—waiters, hotel staff, and even business associates. His colleagues found that the only way to deal with him at times was to push back against him just as hard as he did.

(Shortform note: In Mindset, psychologist Carol S. Dweck claims that people who exhibit bullying behavior assert their superiority by enforcing a fixed mindset that their targets are inferior. Such attacks can be countered by responding assertively and setting personal boundaries. It should be noted, as Isaacson points out, that Jobs himself was often bullied in school, which has been shown to feed a cycle of bullying behavior.)

Technology Meets Design

In addition to his passion and insight, Isaacson argues that the intellectual gift Jobs brought to the table was his obsession with the aesthetics of design. The design principles Jobs instilled in all his products were craftsmanship, simplicity, and emotional expression. As a person who’d always found himself with one foot in the “tech nerd” camp and the other in the arts, Jobs was uniquely positioned to be the bridge between the two.

The Arts and Technology in History

There is a long and storied relationship between the evolution of technology and the development of the arts. Leonardo da Vinci, the subject of another Isaacson biography, designed the first sketches of flying machines based on his study of birds and applied his understanding of visual perspective to some of his most famous paintings. Galileo Galilei used his skill as a draftsman to bring to life the astronomical discoveries he made using one of the earliest telescopes.

Perhaps the most important figure in the marriage of engineering and the humanities was Johannes Gutenberg, whose invention of the movable-type printing press around 1440 ushered in a communications revolution and sparked a wave of literacy that reshaped the world.

Isaacson states that Jobs’s father, a machinist and auto mechanic, instilled in him an appreciation for well-thought-out design. Included in this was the idea that when building something, whether a car or a cabinet, even the parts that would never be seen should be as finely crafted as the parts that were visible. When translating this idea to computer design, Jobs insisted that a computer’s interior, down to the layout of the circuit board itself, should be as aesthetically pleasing as it was functional. (Shortform note: This sentiment is echoed by famed industrial designer Dieter Rams, who stated in his Ten Principles of Good Design that “Good design is thorough down to the last detail.” Other Rams maxims that Jobs may have agreed with were that “Good design is aesthetic” and “Good design makes a product understandable.”)

Jobs was heavily influenced by the Bauhaus architecture school, which melded modernist principles of art with functionality. Key to this were the values of simplicity and elegance. One often-repeated story is that Jobs’s inspiration for Apple’s early designs came from a sleek food processor he’d seen at Macy’s. Isaacson argues that in Jobs’s mind, engineering and design went hand in hand and couldn’t be separated from each other. (Shortform note: The Bauhaus school of architecture was founded by German architect Walter Gropius in 1919. Its core philosophy was that “form follows function,” and that physical objects should be aesthetically pleasing as well as useful. The Bauhaus school began a trend in modernist design that incorporated simple geometric shapes and neutral colors.)

According to Isaacson, Jobs’s merging of form and function didn’t stop at the physical design of his products but extended to the user interface. From the Macintosh forward, Jobs ensured that his products felt natural and intuitive to anyone who picked them up. When the iPhone and iPad were introduced, Jobs insisted that nothing in their design distract from the screen, even though this introduced engineering challenges that could have been avoided with a larger, clunkier outer casing.

In addition to being easy to use, Jobs wanted his products to look friendly and express a purity of purpose. Isaacson says that in order to show the “friendliness” of Apple computers, the Macintosh was designed to look like a human face. When the iPod was released, in order to signify its simplicity and purity, Jobs declared that every part of it, from the earbuds to the power cable, would be white.

(Shortform note: Psychological research has shown that humans assign emotional value to different shapes and objects. Product manufacturers have long made use of this principle, from designing feminine contours on soda bottles to blocky “tough” edges on power tools. Such designs engage users’ unconscious minds and can heavily influence purchasing decisions.)

For Jobs, creating the products themselves was not the end of his expression of design. He wanted to design the user experience itself, from start to finish. Apple’s original backer, Mike Markkula, had taught him that every product and interaction with the customer should communicate the value of the company. Isaacson claims that this led Jobs to take a controlling hand in product launches, advertising, sales, and even packaging.

The creation of the Apple Store was Jobs’s single most powerful expression of his company. He designed the stores to have a minimalist layout that would highlight a small number of products and leave lots of empty space for customers to explore. Isaacson says the message Jobs wanted to express was that Apple products were creative, easy to use, and playful.

(Shortform note: Through the Apple Store, Jobs exemplified several ideas later codified by marketing professor Byron Sharp in How Brands Grow, particularly by placing the stores in high-traffic areas to find new customers from as wide a base as possible. By designing the stores around how people used products, rather than around the products themselves, Jobs created a narrative about the relationship between the company and its customers, a technique described in detail by marketer Donald Miller in Building a StoryBrand.)

A Tyrannical Taskmaster

Jobs was driven by a singularity of vision when it came to anything he had a hand in designing. Once he knew what it was he wanted, whether it was a photo for an ad campaign, the layout for an Apple Store, or a particular shade of blue for a computer, he wouldn’t stop until he’d got it just right. (Shortform note: A demanding, no-holds-barred leadership style can certainly produce positive results, such as when Pfizer CEO Albert Bourla drove his company to produce its Covid vaccine. However, it can also negatively impact productivity and morale. In The 21 Irrefutable Laws of Leadership, John Maxwell identifies serving and empowering the people you work with as vital characteristics of good leaders—traits where demanding CEOs like Jobs fall short.)

The intensity of Jobs’s creative vision, coupled with his lifelong desire for perfection, caused him to lead others with a very heavy hand, inadvertently creating many of the pitfalls that plagued his first tenure at Apple and at NeXT. Isaacson says that Jobs’s laser focus on minute details would often result in delays, cost increases, total redesigns, and overworked employees. Taken to extremes, his perfectionism led to ridiculous demands, such as that the NeXT computer be a perfect cube, regardless of its engineering needs.

(Shortform note: Experts in the fields of psychology and leadership identify perfectionism as a trait to be avoided. In Big Magic, author Elizabeth Gilbert calls perfectionism an obstacle to creativity that masks itself as “high standards.” Brené Brown asserts that perfectionism is inherently self-destructive and rooted in feelings of shame. In her book Dare to Lead, she argues that leaders should resist their own perfectionism and that of their teams as well.)

Isaacson points out that Jobs’s need for control is what defined Apple’s stance in the debate between open and closed computer systems—a stance that ultimately ceded the larger market share to Microsoft. This same conflict resurfaced in the 2000s when Google released its open Android system to compete against the iPhone’s closed OS.

The Cost of Jobs’s Need for Control

In the contest between open and closed systems, Jobs received pushback from more than just his competitors. In his essay “Why I Won’t Buy an iPad,” author Cory Doctorow criticizes Apple for infantilizing its users and acting as a technological gatekeeper.

In The Future of the Internet and How to Stop It, law professor Jonathan Zittrain warns against closed-system internet products, including those of Apple, but also Xboxes, TiVos, and GPS systems that can’t be modified or fully controlled by their owners. These devices, he argues, represent a danger to the creative opportunities offered by an open internet.

Jobs’s philosophy of closed, controlled systems has exposed Apple to legal issues as well, such as an antitrust lawsuit filed in 2022 against its Apple Pay feature.

While Jobs’s vision often led him to act as a dictator, his colleagues learned that they could challenge him if they had better ideas than his. (In fact, the team that designed the Macintosh instituted an annual award for the employee who pushed back against Jobs the best.) Many of the people who worked with Jobs grew into tougher, more visionary people themselves.

(Shortform note: In Grit, psychologist Angela Duckworth argues that the most successful people embody traits of passion and perseverance, which she defines as resilience in the face of setbacks. While Jobs intentionally encouraged these traits in his employees, the mean-spirited way in which he did so wasn’t necessarily the best model to emulate.)

Despite Jobs’s often tyrannical behavior, he had a skill for fostering group collaboration. Many companies in the corporate world are fractured, split into divisions that compete against each other. Jobs would not let that happen, either at Pixar or at Apple. When Jobs designed a new building for Pixar, Isaacson says that he deliberately set it up so that teams working on different film projects weren’t segregated; they’d be forced to interact, even if by accident. At Apple, instead of having engineering, production, and marketing work on their facets of any project separate from each other, Jobs would insist that all divisions work in tandem, allowing for a cross-pollination of ideas.

(Shortform note: A divisional organizational structure does have its advantages, such as creating accountability and allowing for greater expansion. However, it comes with drawbacks as well, such as interdivisional rivalries and an inefficient duplication of resources.)

(Shortform note: The benefits of collaboration go beyond mere productivity. In Extreme Ownership, former Navy SEALs Jocko Willink and Leif Babin point out that collaboration encourages members of a team to prioritize the group’s success. In Creativity, Inc., Pixar founder Ed Catmull, who worked with Jobs, identifies collaboration as an essential part of innovation. To facilitate this at Pixar, Catmull instituted “Braintrust Meetings” in which team members from various projects would gather to give candid feedback to each other.)

Product Over Profit

While other business leaders were surely easier to work with, Isaacson claims that Jobs’s motivation went beyond the company’s bottom line. The goal that shaped Jobs’s vision for his company wasn’t maximizing profits, but delivering the best product possible. He pursued this by focusing on Apple’s core values of craftsmanship, simplicity, and quality.

(Shortform note: In Rework, entrepreneurial experts Jason Fried and David Heinemeier Hansson affirm the importance of simplicity and core values when allocating a company’s time and resources. However, they also give advice Jobs would have dismissed, such as ignoring tiny details in the planning process and considering products finished before they’re perfect.)

For Jobs, “keep it simple” was a way of life. Focusing on priorities and eliminating inessentials defined everything he did, from what he ate to how he furnished his house to how he structured his companies and products. In particular, it was his push toward basic, fundamental principles that, according to Isaacson, halted Apple’s downward spiral once Jobs returned to the company. After his failures with NeXT, Jobs had learned to focus only on those projects that were most essential. That meant directing all of Apple’s resources to no more than two or three projects at a time.

(Shortform note: At Google, product designer Jake Knapp instituted a similar strategy on a smaller scale. During his “design sprints,” a team working on a specific project would cancel all other meetings and distractions in order to focus on their goal. This strategy was so successful that he expanded on it in his books Sprint and Make Time.)

Jobs infused his passion for delivering great products into all of the development teams he brought together. He wanted everything his companies created to be more than merely useful; each product had to be a work of art. Isaacson recounts that Jobs directed his teams as if they were artists, whether they were working on operating systems, circuit boards, or apps for phones and tablets. As artists, he told them not to compromise, not to be afraid to break rules if they had to, and not to stop working until their projects were done.

(Shortform note: As a motivator, Jobs embodied several of the principles of outstanding leadership identified by James Kouzes and Barry Posner in their book The Leadership Challenge. He established and modeled the values he wanted for his company, he inspired others with his vision of the future, and he looked for opportunities to upend the status quo.)

Isaacson writes that Jobs’s appreciation of artists and musicians was such that the usually unyielding CEO would actually listen to criticism on how Apple’s products served artistic creators. When users complained that the original iPad didn’t have the audio and video editing features of the iMac, he resolved to correct that problem with the release of the iPad 2.

Producing the best product possible was paramount. Jobs often lashed out at competitors, and even other people at Apple, when he felt they’d sacrificed quality in the name of profit. (Shortform note: In Basic Economics, Thomas Sowell points out that business profits are the price for efficiency, and that prioritizing profits is what incentivizes companies to produce the goods and services we need at the lowest cost. However, from a marketing perspective, Seth Godin argues in his book Purple Cow that it takes a truly remarkable product to capture the attention of modern consumers.)

Jobs vs. Reality

Despite Jobs’s intensity and idealism, he was also adept at denial. Jobs’s single-minded passion, his binary thinking, and his need to control everything around him contributed to his willingness to ignore any situation that didn’t conform to his predetermined views. Though sometimes Jobs’s willful blindness worked to his advantage, Isaacson makes the case that it often didn’t and actively harmed the people around him—such as when his arbitrary deadlines forced employees to work around the clock or when he continually denied his paternity of his first daughter, Lisa Brennan.

(Shortform note: In Poor Charlie’s Almanack, investor Charlie Munger points out that even in the business world, psychological denial is an anti-pain response that nevertheless leads to poor decision-making. Taken to extremes, some people use denial as a means to keep reality itself at a distance.)

Early on, those who worked closely with Jobs dubbed this his “reality distortion field.” Jobs would insist on deadlines, redesigns, or product features that his teams working on a project would deem impossible. Jobs had an uncanny knack for convincing those around him to forge ahead and accomplish what he told them. Isaacson says that more often than not, this strategy worked, forcing his colleagues to innovate in ways they hadn’t imagined.

(Shortform note: In The Obstacle Is the Way, media strategist Ryan Holiday points out that most people assume they can accomplish less than they actually can, and that setting ambitious goals empowers them to achieve their true potential. Likewise, in The 10X Rule, sales coach Grant Cardone argues that goals should be big to truly inspire you and that people fail when they don’t reach high enough. However, both of these authors refer to self-determined goals, not those imposed from on high by an overly demanding boss.)

Because Jobs’s strategy of redefining reality worked most of the time, Isaacson argues that Jobs never learned to deal with reality on its own terms—until he met a reality that he couldn’t bully to his liking.

When Jobs was first diagnosed with cancer, he denied it, and instead of following his doctors’ advice, he tried alternative methods to will the cancer away. Isaacson claims it’s possible that if Jobs had yielded to his cancer’s reality, it could have been dealt with promptly before it spread through the rest of his body. (Shortform note: Denial is actually a common reaction to cancer diagnoses and other serious illnesses, one that physicians confront on a regular basis. In order to help patients move on to acceptance, doctors have found it more effective to engage with the root causes of denial instead of dismissing it outright. At the time of Jobs’s cancer diagnosis, the science was unclear as to whether conventional treatments would have been effective. The delay caused by his seeking alternative therapies may not have made as much of a difference as Isaacson suggests.)

Life After Jobs

Steve Jobs succeeded in leaving an impact that would last beyond the end of his life. In 2019, Simon Sinek characterized Apple as a company that plays The Infinite Game—an enduring business that looks to the long-term, ever-changing future. Under the guidance of Tim Cook, Jobs’s successor, Apple became the first company valued at over $2 trillion, while introducing innovations such as wireless AirPods and the Apple Watch.

Jobs created a company that successfully avoided what economics professor Clayton Christensen called The Innovator’s Dilemma. Instead of floundering when disruptive technologies changed the landscape of the computer industry, Apple was more often than not the cause of those very disruptions. Steve Jobs practiced what W. Chan Kim and Renée Mauborgne describe as the Blue Ocean Strategy: keeping an eye on technological trends and leaping forward into markets where there wasn’t any competition.

Other biographers in addition to Isaacson have explored Jobs’s business and personal lives. In How to Think Like Steve Jobs, Daniel Smith examines Jobs’s ability to intuit what users would want before they knew it themselves, comparing him to figures such as Thomas Edison and Sony’s Akio Morita. In The Steve Jobs Way, Apple vice president Jay Elliot offers insight on Jobs’s managerial style, while in Steve Jobs: The Man Who Thought Different, Karen Blumenthal focuses on Jobs’s personal qualities.

In iCon, an unauthorized biography published in 2005, Jeffrey Young and William Simon are critical of Jobs’s role as an innovator, arguing that he was simply in the right place at the right time. In The Bite in the Apple, Jobs’s girlfriend Chrisann Brennan portrays him as a charismatic figure to the outside world while distant and cruel to those closest to him. In Small Fry, a memoir by Lisa Brennan-Jobs, she describes him as a father who was both thoughtful and cold, but with whom she was able to reconcile in the end.

Exercise: Reflect on Steve Jobs’s Impact on Technology

Many of Jobs’s innovations and design features have now become things we take for granted. Consider the ways that Jobs’s philosophy of design still guides the information tools you use today.

Exercise: What Can We Learn About Innovation?

One of Jobs’s key strengths was his ability to imagine the needs of the future instead of simply following the trends of today. Even if you’re not an inventor or designer, there may be ways to apply that philosophy in your daily life.