Video Of The Week: The Lost Interview

Chris Dixon suggested I watch this interview with Steve Jobs from 1995. It’s about 70 minutes long and I’m not embedding the whole thing here, just a trailer. You can rent the whole thing on iTunes. I would strongly suggest you do that unless you’ve already seen it. It’s amazing.


Comments (Archived):

  1. kirklove

    It’s free on Amazon if you have Prime. :) I love the irony of that.

    1. fredwilson

      did you watch it?

      1. kirklove

        I’ve seen bits before on YouTube and was put off by his arrogance. Will watch again and put that aside and listen to the message.

        1. fredwilson

          yeah. i’ve always struggled with Jobs. but maybe the fact that he’s gone makes it easier to swallow now. he was brilliant. the part about rocks in a tumblr which is in the back half of the interview is a must see for anyone leading a team building a product.

          1. JimHirshfield

            There’s no other way to spell tumbler now, is there? 😉

          2. fredwilson

            Oh. That was a Freudian slip

          3. JimHirshfield

            Fredian, come on!

          4. jason wright

            you’ve sold it. stop it 🙂

          5. Twain Twain

            Business Insider has an article, ‘Eric Schmidt on arrogance’: http://www.businessinsider…. Reading the comments section under that article is interesting.

            At the recent Vanity Fair conference, Jony Ive said he once asked Jobs to be kinder in his critiques of Ive’s team. Jobs asked Ive why he should be kinder. Ive said, “Because I care about the team.” Jobs said, “No Jony, you’re just really vain.” Ive was stunned when he heard that.

            Jobs continued: “No, you just want people to like you. And I’m surprised at you because I thought you really held the work up as the most important, not how you believed you were perceived by other people.”

            Ive says he was “really cross” when he heard that because he knew Jobs was right. He called this a “brutally brilliant insight.”

          6. Twain Twain

            My reading is that Jobs was an extreme aesthete (culture snob) rather than that he was arrogant. That’s what comes across when he critiques how “Microsoft just had NO TASTE!”

            What I also picked up on was his obsessive control of R&D burn. He’s so dismissive of Sculley’s time, when they invested billions but “What products did they have to show for it?!”

        2. LE

          “was put off by his arrogance”

          I think it’s reasonable to say that there were people who were actually drawn to that arrogance. They were raised in a way that made them want to make daddy happy. The arrogance and anger is a big part of making that happen. I saw that happen at SuperMac when Steve Blank was VP of marketing in the early 90’s.

          1. Richard

            One common covariate of great companies is a cult-like following among their employees.

      2. LE

        I’ve avoided watching that a million times, even though I really liked Jobs, I am a big Apple user, and I was actually sad the day that he died. And Netflix keeps telling me that it’s a 5 for me. I just hate those types of things. I find that they are motivational porn. Maybe part of it is that you can’t interact and ask questions. Maybe it’s because the luck factor, while sometimes discussed, is not taken as being a key part of why a particular decision that someone made was right (or wrong).

        Not saying that something can’t be gained from it. I just don’t like the pedestalizing; something about it is a big turnoff and dystonic to me.

    2. JimHirshfield

      Does that mean the Bezos biopic is free on iTunes? ;-)

    3. William Mougayar

      It’s free on YouTube if you have Internet 🙂

      1. Mario Cantin

        Good one, well played.

    4. Jon Atherton

      Not sure what you’d call the $29 to buy and $6.99 to rent in Australia, compared to $9.99/$2.99 in the US store… Another word springs to mind!

  2. JimHirshfield

    Unmissable <— that’s a word you don’t hear very often. Does it mean I wouldn’t notice if it was missing? Or, you can’t miss it? ;-) #ambiguityisambiguous

    1. Salt Shaker

      Reminds me of Roger Clemens’ infamous “misremembered” quote.

      1. JimHirshfield

        Irregardless <—- another example of a non-word.

  3. Rob Underwood

    Wait – does this mean Microsoft Outlook is not the ultimate expression of great taste?

    Just turned the interview on via Amazon Prime. H/T @kirklove:disqus.

    1. sigmaalgebra

      Apparently “not available” in the US.

      1. William Mougayar

        ouch… sorry. Try Hola? This is the first time I’ve seen something “not available in the US”. Usually it’s the other way around!

    2. Twain Twain

      Thanks for sharing here!

  4. William Mougayar

    If a visionary is not misunderstood and critiqued when they are being visionary, they are not a visionary.

    1. awaldstein

      Really… I think it is true when you make the distinction that they are misunderstood by the critics, but honestly not true within the niche communities they live and work in.

      From Charles Mingus to Jackson Pollock to even The Ramones and Jobs, new mass markets grew out of ideas that small groups and communities percolated. This is certainly true of the early Apple experience.

      1. William Mougayar

        Good point that they are embraced by a minority first, and it goes from there.

        1. awaldstein

          You are right of course, except nothing happens by some lone thinker in a cave. Action happens by groups or teams, no matter how small.

          That’s why in my own few early stage investments, I care not at all about traction; I care more about whether there is a community of support, and its makeup.

          1. William Mougayar

            There are a number of people who were criticized, or not famous until after they died. “The death effect”. E.g. Galileo, van Gogh, El Greco, Anne Frank, Khalil Gibran, etc.

          2. awaldstein

            True, but I think almost entirely a vestige of past times.

          3. LE

            What was Anne Frank criticized about?

          4. William Mougayar

            I said “not famous” or “criticized”. In her case, she just became famous after she died.

      2. LE

        I think what’s more relevant for everyday people who aren’t going to be the next Steve Jobs (because let’s face it, they aren’t) is to realize that people aren’t going to understand, pat you on the back, and approve of most of the thoughts and ideas that you have. In fact they will think many of them are truly wacky and laugh out loud. (So if you’re running what I call a popularity contest, you will have a big problem if you share your ideas with those people.) Either because they don’t think like you, or they don’t understand enough, or they have different priorities, or dozens of other reasons. Lack of creativity for sure. That’s a big one.

        Creativity. Some of my very best ideas have been really wacky but creative, in the sense of “nobody would make this type of thing up”. Things that just pop into my mind. If I shared them in advance they would definitely be ridiculed. (Not claiming, btw, that they are world class or anything, which is the point I am making in the first paragraph.)

        Here’s a rainy day story. A while back I had a large amount of cash that I needed to keep at my office to pay someone. Even though I have a few light duty safes, I didn’t want to put the money in the safe (it’s an obvious target, right?). And while the office is a mess, I couldn’t really figure out where to hide it so that if someone broke in they wouldn’t find it. I decided the solution was to hide it in a junk mail envelope right on my desk. I mean, what’s the chance of a burglar thinking “maybe there’s a large amount of cash in that junk mail envelope sitting there with the utility bills”? Then I put out some decoy money that would be easily found. (This is a variation of leaving money in a cash register so the thief doesn’t trash your place.)

      3. SubstrateUndertow

        True, the birds of a feather thing is always at play.

    2. Twain Twain

      There are visionaries in the Bitcoin sector who’d agree with that.

      A lot of it is to do with how those visionaries communicate and educate other people about why they’re making what they’re making. Steve Jobs was a great communicator.

  5. William Mougayar

    Ok you’ve got an iPhone….now you’re pushing iTunes on us too 🙂 ?

    1. fredwilson

      sorry. that’s how i watched it on the plane last night, where i wrote the blog post

      1. William Mougayar

        was kidding of course…remember, i’m in the android camp for now 🙂

      2. JJ Donovan

        Are you going anywhere good? Or coming home from anywhere good? Hope it was productive!

        1. fredwilson

          I’m in LA for the weekend taking care of some real estate stuff

          1. JJ Donovan

            You are my hero!

          2. awaldstein

            I can assume then that your visit to the green market in, I bet, Venice/SM is a lot less rainy than mine to Union Square this morning.

      3. William Mougayar

        I saved it to Pocket, so it can be watched on PocketTV too.

    2. John Revay

      I had a similar thought, Fred is now an Apple fanboy 🙂

  6. Salt Shaker

    Brilliant, arrogant and lucky, whose image–like many “iconic” figures before him–has only grown posthumously, particularly w/ the aid of Isaacson’s brilliantly written bio.

    1. Jim Peterson

      In his biography it was noted how the iPhone was put together using all existing elements (though I think there had just been a huge innovation in glass for the screen). He/Apple just arranged the elements differently and in a pleasing way.

      That’s what all of us here are aiming to do, right?

  7. Rob Underwood

    The case for computer science as a liberal art at 21 minutes or so is really interesting.

    1. fredwilson

      Rob – I will forward you an email I wrote the moment I saw that

      1. John Revay

        Wrote to who – one of your children?

        1. fredwilson

          No. The CSNYC team

  8. sigmaalgebra

    So, we got microprocessors and, then, a case of “ontogeny recapitulates phylogeny”.

    So, we got some operating systems that, as intended in the Intel processor architecture, borrowed heavily from Multics. Now we also have virtual machines. Not a lot really new here: Multics was from Project MAC at MIT in about 1969, and virtual machine was from CP67 from the IBM Cambridge Scientific Center in about 1967.

    Then we got graphical user interfaces (GUIs), heavily from the Xerox PARC work.

    Uses were for spreadsheets, some number crunching applications, simple text word whacking, on-line interactions via CompuServe, Prodigy, AOL, etc., fora, e-mail, and some more.

    We got image software, for stills and then video; dot matrix printers, daisy wheel printers, laser printers, ink jet printers, color laser and ink jet printers, and, finally, less interest in output on paper! Along the way we got more advanced word whacking, e.g., Knuth’s TeX (which I still use for all my higher quality and/or mathematical word whacking), and more from Apple, Microsoft, Adobe, etc.

    Then we got the Internet, HTTP, HTML, Web sites/browsers, CSS, JavaScript, much more use of fora (e.g., blogs), social, local, sharing, mobile, etc.

    All along, what we had in computing before microprocessors continued to be brought over from mainframes, e.g., software for data base, applied math, physical science, engineering, statistics, etc.

    And microprocessors have found uses as controllers in printers, robots, cars, etc.

    The future? We already know what we want in the famous one word answer “more”. How much more?
    Take a couple, both working for $10 an hour, 80 hours a week each, for $80,000 a year, with three kids, and we are sure they can spend the $80,000.

    With a factor of 10 in productivity, one parent can stay home, the other cut back to 40 hours a week, still make $200,000 a year, and we are sure that they can spend it.

    With another factor of 10, they can make $2 million a year, have a nice house, pay for private schools and college for the kids, have a 40 foot boat, and take some vacations.

    With another factor of 10, they can make $20 million a year, have a 100 foot boat and a nice vacation house, retire after 20 years, and pursue interests in art, science, family, friends, politics, etc. E.g., they might contribute to a crowd funded project to put a vacation spot on the moon or Mars, telescopes at Lagrangian points, etc.

    So we are up to three factors of 10 in productivity, that is, a factor of 1000, so that one can do in two hours what takes a year at 40 hours a week now. That’s a lot of increase in productivity.

    How to do that? Sure: automation. How to do that? Sure, computers, or, via a simple analogy, have people managing computers, managing computers, …, several levels deep, managing computers, doing the work.

    The work is for (1) information gathering and/or creation and/or (2) material manipulation.

    My project is an example of information gathering. Since my software writing is coming to an end for now, for initial data for my project I need to do some information gathering. It was easy enough to automate getting HTML files via HTTP with some simple TCP/IP socket software, but for HTML files via HTTPS more is needed. So, I got cURL running and used it. It worked right away! Nice.

    So, let’s get on with it, more in automation for three factors of 10 in productivity!
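The factor-of-10 arithmetic above can be sketched directly. This is a minimal illustration only: the $10/hour base rate, the hours, and the factor-of-10 steps come from the comment, while the 50-week work year and the `household_income` helper are assumptions introduced here to make the numbers reproducible.

```python
# Sketch of the productivity arithmetic in the comment above.
# Assumption: a 50-week work year (not stated in the original).
HOURLY_BASE = 10  # dollars per hour today
WEEKS = 50

def household_income(rate, hours_per_week, workers):
    """Annual household income: rate x hours x weeks x number of earners."""
    return rate * hours_per_week * WEEKS * workers

# Today: both parents at $10/hour, 80 hours/week each.
today = household_income(HOURLY_BASE, 80, 2)              # 80,000

# 10x productivity: one parent stays home, the other cuts to 40 hours/week.
ten_x = household_income(HOURLY_BASE * 10, 40, 1)         # 200,000

# Each further factor of 10 scales the effective hourly rate again.
hundred_x = household_income(HOURLY_BASE * 100, 40, 1)    # 2,000,000
thousand_x = household_income(HOURLY_BASE * 1000, 40, 1)  # 20,000,000

# Three factors of 10 compress a 2000-hour work year into 2 hours.
hours_needed = 40 * WEEKS / 1000                          # 2.0

print(today, ten_x, hundred_x, thousand_x, hours_needed)
```

The last line checks the comment's closing claim: at a 1000x productivity gain, a year of 40-hour weeks (2000 hours) collapses to two hours.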

    1. SubstrateUndertow

      A single cell transitioning up into a human persona is largely just massive layers of automated, cellular-subsystem-based productivity amplification. Still, at many points along that journey of massively layered quantitative cellular recombination, there have repeatedly emerged massively qualitative epiphenomenal transitions that change everything.

      Are we not more than simply large collections of cells executing massive survival productivity?

      Or, on the other hand, did I misinterpret your whole point?

      1. sigmaalgebra

        On my “ontogeny recapitulates phylogeny”: in biology, say, developmental anatomy, phylogeny is the development, i.e., evolution, of what the biologists call a phylum, that is, a collection of species.

        Then ontogeny is the development, from egg to adult, of some one instance of some one species in the phylum.

        The claim of the quote is that what happens from an egg to an adult is similar to, repeats, recapitulates, what happened in the development of the species or even the whole phylum.

        E.g., for animals with backbones, somewhere back there fish got gills. Well, supposedly even for humans, at one point in the womb, gill slits are still visible.

        Or, if you suspect you have a lot of junk you will never use again on your hard disk, in your file drawers, on your bookcases, in packing boxes, that’s nothing compared with all the junk in the human DNA!

        This remark “ontogeny recapitulates phylogeny” does have a cute, even surprising idea, but it is also something of a joke in, say, a course in developmental anatomy. Not that I would have ever taken such a course, but my brilliant wife did, and I got the quote from her, along with an explanation.

        I applied the quote to computing: the phylogeny was that of computer architecture, say, from the 1940s in WWII through Multics in 1969 and even to the last of the water heater mainframes of the mid 1990s or so.

        Then the ontogeny was for one species in computer architecture, the microprocessors, say, from the Intel 4040 or 8080 to the 8086, 80286, 80386, 80x, to the present with 64 bit addressing, etc.

        Then that development did parallel the development of computer architecture, that is, the phylogeny, previously and more generally.

        So, some of the steps in the development of both the phylogeny and the ontogeny were the architectural features of, say, protected memory, floating point arithmetic, microcode, interleaved memory, cache memory, virtual memory, virtual machine, embedded operating systems (each address space thinks it has its own copy of the full operating system, which is embedded in that address space), multiple processors (or processor cores), cache memory with cache coherency, pipelining, multiple execution units (say, separately for fixed point and floating point arithmetic), branch prediction, speculative execution, out of order execution, etc.

        Or, these architectural features came forward in computing, in steps, over decades, once, say, for mainframes, and then again for microprocessors, and came forward this second time not all at once but also in steps over decades.

        Or, maybe, for many years, to be a microprocessor computer architect, just pick up all the technical documentation you can find on, say, IBM’s mainframes and repeat as fast as the microelectronics permits.

        There are likely good descriptions of each of these architectural features on Wikipedia. Read up on all of these, and your graduation music will be…

        1. SubstrateUndertow

          “ontogeny recapitulates phylogeny”

          I got that part, and although, as I understand it, that is debatable within the scientific community, it seems like a very appealing, aesthetically coherent theory.

          “that’s nothing compared with all the junk in the human DNA!”

          Maybe that’s not junk but fallback insurance DNA?

          Thanks for the effortful reply, much appreciated!

          I think the same overarching narrative could be applied to our presently evolving social-organization memes. Our new organic network-based social organizational memes/ontogeny now recapitulates the analogous phylogeny of biological organizational evolution. Our social evolution can be framed as a social instantiation of a more primordial, statistically-driven, self-extending, self-selecting, fractal biological self-organizing dynamic?

          1. sigmaalgebra

            “self-organizing dynamic?”

            Or, for another guess, starting with a paraphrase of E. Fromm: for humans, the fundamental problem in life is doing something effective about feeling alone. Only four solutions have been found: love of spouse, love of God, membership in a group, and a fourth not recommended. So, humans don’t want to feel alone. Or, long ago, humans wanted to be in a group, say, a tribe or village.

            People still don’t want to feel alone, so they might use the Internet, maybe Facebook, to “be in a group” or some such.

            Or, very simply, humans are social creatures. So, wonder of wonders, now we have social media.

          2. SubstrateUndertow

            You always deliver great quotes. Cheers!

  9. John Revay

    Saw that this was posted under “Entrepreneurship”.

    Just started listening to “How Google Works” by Eric Schmidt, Jonathan Rosenberg, and Alan Eagle using the Audible app. Great stories and insight as to how Google works.

  10. kenberger

    This was just one of many incredible installments of Cringely’s “Triumph of the Nerds”… i watched them all religiously starting in 1996, just after arriving in Silicon Valley. I wonder if the series is out there somewhere…

  11. William Mougayar

    That was an amazing interview. (I watched the unabridged version.)

    He doesn’t mince words re: Microsoft and John Sculley.

    Good to hear about how Bill Hewlett & HP influenced him early on, something that doesn’t get remembered too much.

    Surprised that he blamed the pursuit of profits back then for the low Macintosh market share, whereas he later managed to achieve spectacular profits and a very decent market share with the iPhone & iPad.

  12. sebastianconcept

    You joking? I watch the entire thing once every 6 months at least!

    I work with Smalltalk, and it is the thing he and his team saw on their visit to Xerox PARC that eventually inspired the Macintosh. Did you see his shining eyes when he describes that moment of inspiration during this interview? Cool, now you understand a bit of why I’m into Smalltalk.

  13. Terry J Leach

    I’ve watched it a few times over the last few months on Netflix. Yes it is amazing because it shows vision and that he was always learning.

  14. jason wright

    the ‘lost’ genre. has digital ended it?

  15. pointsnfigures

    Just watched it with my wife. First impressions are about how Jobs talks about process: “People focus on process and not on content.” I thought a little more deeply about that. The reason may be that if they focus deeply on process, they think they are absolving or diversifying themselves away from risk. They get so focused on risk avoidance by designing great process that the product becomes plain vanilla. Vanilla products work, but they don’t disrupt or change the world.

    My wife commented that a lot of people will find this boring. It will be lost on a lot of people. But anyone that is designing a product, or thinking of an idea (innovating), should watch it.