Pay Attention To The Package

Tech investing is a lot about big trends and timing them.

We knew mobile was going to be a game changer as far back as the mid 90s, but it didn’t really take off until the iPhone came along in 2007.

We knew personal computing was going to be a big deal in the late 70s, but computers didn’t become truly personal until operating systems got graphical user interfaces in the mid 80s.

The internet was super interesting in the late 80s and early 90s but it didn’t go mainstream until we had web browsers in the mid 90s.

Artificial intelligence has been around as a computer science effort for sixty years, but it didn’t start impacting our everyday experiences until it was packaged up (and effectively made to disappear) in web and mobile apps and, increasingly, cars and voice-activated devices.

My point is that technologies present themselves as interesting investment opportunities long before they go mainstream, and figuring out when they will go mainstream is largely about looking for the right packaging.

Virtual and augmented reality has been an interesting and investable technology for the last six or seven years. But it hasn’t gone mainstream yet because the packaging of the technology remains problematic. At some point, some company will figure out how to package it up correctly and it will go mainstream. Until that happens, it is a difficult place to make money, even though a few entrepreneurs and investors have been able to do that.

Blockchain and crypto is in a similar state. Today, other than buying and selling crypto tokens, blockchain applications are clunky and hard to use. Centralized applications are way better than their decentralized cousins. When entrepreneurs figure out how to package up blockchain applications so that they are fun and easy to use, I think we will see them take off. My guess is that it will happen first in gaming and collectibles.

My point is that it is one thing to develop a technology that is superior to the current offerings, but entirely another thing to make it usable by most people. The first part is, in some ways, the more important thing (like Satoshi’s white paper) but the second thing is often where the investment leverage happens.

#VC & Technology

Comments (Archived):

  1. Pranay Srinivasan

    When the tech launches, the packaging is usually in the narrative. When the business use cases arrive, the packaging is usually in the utility. Conflating the two yields the current ICO crypto market.

  2. Richard

    Agree! An application truly native to the blockchain will be the first true breakout. With each of the previous technologies, we tried to fit the legacy model into it first. That’s suboptimal.

  3. William Mougayar

    The Web packaged the Internet well. Maybe the Blockchain is waiting for its package.

    1. Pranay Srinivasan

      I am pretty sure tradeable tokens or currencies are not that package even though they look like it right now.

    2. sigmaalgebra

      Of first and necessary importance is the significant utility that is a “must have”. Packaging can come later as a “nice to have”. When there are lots of competitors providing the basic utility, then packaging can be a big advantage.

      “Maybe the Blockchain is waiting for its” as a unique source of must-have utility.

  4. Vendita Auto

    My technical knowledge is way behind many of you; however, my take on this is: Fiat financial institutions are holding back the rivers, but climate change is inevitable. The two most interesting companies in the markets, IMO, are the well established/embedded Coinbase, whose current position reminds me of the “parable of the Wedding Feast”, and Overstock [https://www.crowdfundinside…], where the team has created solid foundations for the near offshore future.

    Crypto and CyberAnquan security markets are dovetailed and will be at the forefront of the near-future blockchain / D-Wave systems that are close to or currently in use now. D-Wave-generated algorithms versus algorithms run without mortal intervention; China has a commanding lead in long-distance cryptography that, twinned with D-Wave’s warp speed, will open blockchain/crypto security to any modern-day Claude Shannon. IoT & crypto systems will depend on international cyber agreements. I have no personal crypto holdings, nor am I affiliated with any exchanges.

  5. Korf

    My sister (a leading toy designer), her early teen daughter (an art, science and computer enthusiast), and I (a crypto observer) all love the idea and execution of cryptokitties despite “cat” calls from crypto skeptics. Who else is doing interesting work in the area of crypto collectibles? It would be great to see a crypto kickstarter in this space where the best ideas get the funding and the community of supporters/participants can be a part of long term distributed value creation.

    1. falicon

      Check out https://www.stateofthedapps… – good list of projects. Most are still crap, but there are a few gems in there…

      1. Korf

        Thanks @falicon – I had a neat discussion with @TimOreilly and others about long term value the other day … What’s cool about the blockchain is that we might record the contributions of those who are helping build a hypothetical pyramid when the first stone is laid; then years or even decades later they could continue to reap reward when the last stone is laid and the work of the collective has created exponential value for all involved.

        Applied to complicated problems like climate change, rain forest preservation, and educating the bottom of the pyramid, to name a few where immediate reward is elusive, the blockchain provides an immense opportunity to change the world for the better.

        Building on my previous question, who is doing important (seismic level) work with the blockchain to change the world for the better? Civic ( https://www.civic.com/ ) is one such example.

        1. falicon

          Civic looks interesting. USV is involved in Blockstack, which has an identity play as well. They also recently invested in https://www.algorand.com which seems pretty interesting.

          The other project that is making great strides in adoption and use right now is https://metamask.io

  6. Vasudev Ram

    This post reminds me of the book Crossing the Chasm by Geoffrey Moore. Had read it some years ago. Parts of it were really interesting.

    https://www.google.co.in/se

    https://en.wikipedia.org/wi…

    Excerpt from Wikipedia page:

    Crossing the Chasm: Marketing and Selling High-Tech Products to Mainstream Customers or simply Crossing the Chasm (1991, revised 1999 and 2014), is a marketing book by Geoffrey A. Moore [1] that focuses on the specifics of marketing high tech products during the early start up period. Moore’s exploration and expansion of the diffusions of innovations model has had a significant and lasting impact on high tech entrepreneurship. In 2006, Tom Byers, director of the Stanford Technology Ventures Program, described it as “still the bible for entrepreneurial marketing 15 years later”.[2] The book’s success has led to a series of follow-up books and a consulting company, The Chasm Group.[3]

    1. JamesHRH

      Parts of it? It is a B2B marketing bible.

      1. Vasudev Ram

        Good point. But what I meant was, some of the other parts were somewhat obvious (once you have read them, anyway). By “interesting parts”, I was referring to the parts I might never have guessed on my own, even with some general knowledge of marketing, etc.

        1. JamesHRH

          I’ve put more time into this than is healthy.

          Positioning by Trout and Rise is the consumer bible. Crossing the Chasm – which references positioning, I used to be able to quote the page – is the B2B bible.

          Everything after, to borrow a phrase, is young’uns thinking they discovered sex. Which, to be honest, is a good way to make a buck, as every generation wants the eternal wisdom packaged up in their own wrapper.

          A certain someone pops up in this community who I am flat out jealous of – he nailed this move for techies.

          1. Vasudev Ram

            >Positioning by Trout and Rise is the consumer bible.

            Ries. I had read their book(s), either the one you mention – Positioning – or The 22 Immutable Laws of Marketing, or both. IIRC I found at least one of them to be not that good or interesting. But I am a relative newb at marketing, so may be talking crap.

            There was either a book or a point about differentiation, maybe in one of their books, that I thought made good sense. I quoted the point to a consulting client of mine once, and they jumped on it, but might have gone about it in the wrong way (not on the assignment I was doing for them). An example of how even a good concept or technique can be misinterpreted or misused.

  7. LE

    We knew personal computing was going to be a big deal in the late 70s, but computers didn’t become truly personal until operating systems got graphical user interfaces in the mid 80s.

    I (as others here certainly did) went through that in the 80’s. Part of what drove that adoption [1] was that the Mac was fun and a toy. That is what brought people to it generally. You could play with it. I remember my Dad even got one, and when I went over to his house and my Mom saw me using it she said ‘why are you playing’. And I was not playing (I have never ever played computer games, period), but whatever she saw looked like a game to her. Playing to me is writing a short program that automates or helps me with something I need to do.

    Anyway, my point is the thing that drove Mac and graphical interface adoption was fun. Sure, we also used it [1] for business, but most people (other than in graphic arts) used MS-DOS in business. Obviously. Not a computer with a graphical interface.

    Now me, I loved ‘playing’ with computers from the teletype days. And the days of CRTs in the Wharton Computer Center. I liked everything about that. I liked the sound and the hum, I liked the blinking cursor. I liked the spinning tape drives. I liked the amped up HVAC. I liked slaving for 8 hours trying to solve something that you can just google an answer to today. [2] Later I did note, however, people who only took to computers when they became ‘fun’ in a GUI way. (You know, like people who like drones but would never do RC helis because it actually takes work and perseverance and overcoming adversity to do that.)

    [1] And we are talking about the Mac because Windows didn’t have it until much later, and even when it did (and to this day) it sucked. And the Mac was a toy, although as I have mentioned we used it in 1985 to drive a Linotronic imagesetter to create type for ads.

    [2] Not that I am not glad that it is easier, but you get a certain kind of mental reward, even if nobody else cares or knows, when you do something like that.

  8. LE

    Virtual and augmented reality has been an interesting and investable technology for the last six or seven years. But it hasn’t gone mainstream yet because the packaging of the technology remains problematic. At some point, some company will figure out how to package it up correctly and it will go mainstream.

    I don’t think it will ever go ‘mainstream’. Why?

    This dovetails with what I said in my other comment about ‘fun’ and adoption of GUIs. However, honestly I am not seeing it happening for virtual/augmented reality, at least in a mass market way. Because other than specialized niche uses (medicine, let’s say) there is no everyday killer reason everyday people want or need this. As opposed to a personal computer like a Mac in the 80’s, where you could actually have fun but also use it for a purpose with a killer app like a spreadsheet, or to write a letter and print it out on a dot matrix or laser printer.

    Also, going through the phase prior to GUIs and using computers, it was obvious if you hung around small business that they would buy and use computers if the price and ease of use came down. Anyone spending time would see this. Not just me, who was only a kid. I figured that out when I was in middle school in the 70’s, working for my dad typing invoices for his wholesale business on an IBM Selectric. The only thing preventing my dad from using a computer back then was that a DEC was way too expensive, so we used typewriters and a manual process. But there was a clear path forward, no question (I tried to get him to buy a computer; it might have been something like $80k in 70’s dollars, way way too much for his small company).

  9. LE

    but entirely another thing to make it usable by most people.

    In particular, computer and engineering types are really bad at this. Just look at Disqus and their right corner icons as an example. Classic mistake of not knowing what a normal user sees (with no experience or insight) vs. the developer. This also plagued other areas; even signage at airports sucked until someone studied how normal people drive into an airport and get around. Why did Walmart become so big? Because Sam Walton understood how his normal customers would think and act (all w/o reading any books by others or going to business school, btw).

  10. Girish Mehta

    The packaging and the Killer App/JTBD/Utility aren’t the same thing. Missing a step there when you say that – “when entrepreneurs figure out how to package blockchain applications so that they are fun and easy to use we will see them take off”.

    In the case of the Internet, the killer applications were E-mail and the World Wide Web itself. (By Internet here I am referring to the Net before Sir Tim served up the first Web page on the Internet toward the end of 1990.)

    Accepting your characterization of the web browser as the packaging…the browser made the Internet mainstream, but the packaging (Browser) required the application — which was the World Wide Web itself (and Email).

    Similarly with the Personal Computer, the first killer application was VisiCalc (or more broadly, the spreadsheet), and on the Mac it was desktop publishing/printing and the education-school kids segment (K-12). The Mac had the GUI well before Wintel PCs…the GUI certainly helped make personal computing mainstream, but what would you do with the GUI without the utility?

    I don’t think packaging makes public blockchain applications mainstream. It has to be the Killer Applications/Utility first, and that still is not there (now into the 10th year since Satoshi’s paper).

    Once you have the killer application(s), the packaging comes in and makes it mainstream.

    The packaging is not the application.

    1. JLM

      .

      We have a tendency to forget the significance of VisiCalc. It made the personal computer a business tool overnight.

      VisiCalc — at 32K — was the killer app which drove the sale of the Apple IIe. An Apple cost $2000 and VisiCalc cost $100.

      I owned the first Apple IIe in Austin, Texas and bought VisiCalc and an impact printer.

      I remember making a financial projection to calculate the interest during lease-up for a high rise office building using a 55% AOLB at a 7% cost of funds.

      We sat there looking at that green screen waiting for the spreadsheet to update. In less than 45 seconds it did.

      After email, I think spreadsheets are the killer app for computing.

      JLM

      www.themusingsofthebigredca…

      1. Girish Mehta

        Absolutely, VisiCalc was the killer app for the Apple II. And later Lotus 123 launched on the IBM PC. Huge success and it launched on DOS.

        The packaging is critical to drive up mainstream adoption, but the thing is – the killer application needs to be there, and its user proposition becomes quickly evident regardless of the packaging. One exposure to VisiCalc or Lotus 123 and you knew why it would be needed.

        1. Vasudev Ram

          Right. I had used Lotus 123 a bit and loved the slash commands convention. Keyboard input is far faster and more productive than mouse, for text, number and formula manipulation, at least (can be different for graphics and more visual stuff, but even for that there are keyboard and command-line based languages such as Unix’s pic and many more). (See The Art of Unix Programming and other comments I have made over time about Unix on this blog.)

          And I saw this a while ago – try this google search (about the internal code name of Excel :):

          https://www.google.co.in/se

      2. sigmaalgebra

        There is an extension of spreadsheets whose time may have arrived.

        So, do a financial plan for the next, say, three years. To keep it simple, do the planning by months. So, we have, say, month 0 to start, 36 months of the plan, and month 37 for the results. So, our spreadsheet has 38 columns.

        And we have some variables, initial values, values for each month, and final values, for the usual suspects: revenue, cost of raw materials, labor, etc.

        During the three years we want to bring some new products to market and have the plan tell us how fast we can go from the raw idea to the prototype, alpha test, beta test, publicity, roll out, build up, etc. We have to consider hiring, training, plant, equipment, ….

        So, in the spreadsheet we put some data into some of the row-column, variable-month cells. And in some of the cells we put algebraic expressions in terms of cells in earlier columns. Maybe in the bottom cell of month 37 we have the value of the work.

        So, as we know, we type in all this stuff, maybe have some graphs of some of the data, and ask for a “recalc”.

        Now, that sets the stage for the rest we can do:

        (1) We look at the spreadsheet and some of the data we typed in, and, to do some “what if?” work, we type in some different data and do another recalc. We can lose sleep for days changing the data and doing recalcs.

        (2) So, we’d like to automate the work in (1): For those cells we kept changing, we will leave them blank, call them variables, and ask what values would maximize that final cell. There is even some relatively general purpose software for that based on L. Lasdon’s generalized reduced gradient version 2, GRG2.

        (3) But we are not sure about some of the data. That is, we are willing to consider that some of the data is random variables. So, we can take the solution from (2), fix it, describe the random variables, do 1000 or so recalcs sampling the random variables, and get graphs of probability distributions for every cell of interest.

        (4) In football, we don’t call all four plays on first and ten and stick with those. Instead, we call the first play and, based on the results, call the second play, etc. Well, in our business planning over the three years there are something like 37 plays instead of just 4 in football, and we should not decide what to do in month 2 before we have seen the results, say, from the random variables, in month 1.

        Okay, but, then, what is the best thing to do in month 1? Indeed, throughout, what are the best things to do?

        First good news: By (3) we have essentially a well formulated problem.

        Second good news: Given that problem, for here in (4), there is some math for a solution.

        The math was due likely first to R. Bellman. Then lots of other good mathematicians got involved — R. Rockafellar, E. Dynkin, R. Wets, D. Bertsekas, S. Shreve, W. Fleming, G. Nemhauser.

        Deterministic optimal control: Landing an F-18 on a carrier in perfect weather.

        Broadly the field of Bellman, etc., is stochastic optimal control, e.g., landing an F-18 on a carrier at night in a hurricane.

        First bad news: The computing needed for the work on a big spreadsheet is a bit much. As the problems get more realistic, a high end desktop computer could get swamped quickly.

        Second bad news: So far, only in some simple cases in practice will we have good descriptions of all the random variables or even all the algebraic expressions. E.g., what fraction of the people we start in our training program will come out as useful employees? In some simple cases, we may have some good data. In a lot of complicated cases, we won’t.

        A grown up case of Moneyball, e.g., what the offense might do inning by inning, might be an example!

        So, partly, stochastic optimal control is some really nice math that will tell us just what to do, down by down or month by month, if we had a lot of data we don’t have. So, we’d have to work on the issue of the data. E.g., we’d want some models of interest rates, commodity prices, exchange rates, etc. So, users might go to a Model Store and get some models to plug in.

        Some users would like the work just as a high end version of sensitivity analysis.

        Third good news: The current high end graphical processing units (GPUs) can do a lot of arithmetic, much more than general purpose processors. So, some GPUs might make some high end workstations start to be able to do some real problems.

        Fourth good news: The usual approaches to the computations are enormously parallel. So, e.g., we could take over all of Amazon’s AWS server farm for a day or so and run everything in parallel; that is, it’s a supercomputer problem. So, maybe to heck with workstations and, instead, just do the computing in the cloud.

        Fifth good news: There are lots of particular cases where all this would be quite doable. One example was my Ph.D. dissertation in stochastic optimal control: For that I did some work on algorithms and wrote special purpose software and got the computing reasonable. But with the much faster computing available now, some general purpose software could do well and be nicely fast.

        Sixth good news: The real world is awash in problems where we do need best decision making over time under uncertainty. Sure, in principle they can be viewed as spreadsheets as in (3), but they are not usually thought of as spreadsheet problems. So, the GANTT/PERT work used by Rickover or by NASA, improved to handle the random stuff, would be an example. Broadly, supply chain problems are big examples, done now by whatever means and rarely done very well. So there are a lot of candidate problems.

        Yes, at least once, our last Fed Chair said that maybe she would use some control theory!

        Yes, stochastic optimal control has been seriously proposed as the core of national economic policy — I doubt if much has been done.

        Somewhere there is a far out but relatively serious Web site that at least once claimed that stochastic optimal control is necessarily the ultimate version of intelligence possible in the universe. But for this, likely we would have to assume that all along we would be running big experiments getting the data on the random variables.

        So, some such will come eventually! Maybe now the time is right for some of it!

        Bottom Line: There are likely easier ways to make money!
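        A minimal sketch of steps (1) through (3) above, in Python, with made-up numbers: a toy 36-month plan where monthly revenue growth is the one random variable, “recalculated” many times to get a distribution for the final cash position. Every name and figure here is hypothetical; a real plan would have many more variables, and step (2) would use a proper optimizer (something in the spirit of GRG2) rather than hand-tuning.

        ```python
        import random
        import statistics

        # Toy 36-month plan: one random variable (monthly revenue growth),
        # fixed monthly costs, and the final cash position as the cell we care about.
        # All numbers are invented for illustration.
        MONTHS = 36
        START_CASH = 100_000.0
        START_REVENUE = 10_000.0
        MONTHLY_COST = 18_000.0

        def recalc(growth_rates):
            """One 'recalc' of the spreadsheet: roll the plan forward month by month."""
            cash, revenue = START_CASH, START_REVENUE
            for g in growth_rates:
                revenue *= 1.0 + g
                cash += revenue - MONTHLY_COST
            return cash  # the "bottom cell of month 37"

        def simulate(n_trials=1000):
            """Step (3): sample the random variables and do n_trials recalcs."""
            finals = []
            for _ in range(n_trials):
                # Growth each month is uncertain: roughly 5% with some noise.
                growth = [random.gauss(0.05, 0.03) for _ in range(MONTHS)]
                finals.append(recalc(growth))
            return finals

        results = simulate()
        print(f"mean final cash:   {statistics.mean(results):12,.0f}")
        print(f"stdev final cash:  {statistics.stdev(results):12,.0f}")
        print(f"P(final cash < 0): {sum(r < 0 for r in results) / len(results):.2%}")
        ```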

    2. cavepainting

      Loosely speaking, there is a huge transition from Phase 1 (raw technology) to Phase 2 (how these can be consumed at scale by users to get specific jobs done).

      I am guessing what Fred means by packaging is this quantum jump of taking something raw, arcane, and technical and making it easy, usable, and fun.

      There are multiple layers to this “packaging”, but in the IT industry, the heart of this transformation has often been developer and power user empowerment (even if the app developer has sometimes been the same company that owned the raw tech):

      - Make it easy for any developer to create a web site.
      - Make it easy to create email apps atop SMTP.
      - Make it easy for developers to provision storage, computing, or bandwidth on demand.
      - Make it easy for desktop publishers to create high quality graphics-enabled content.
      - Make it easy for developers to create spreadsheets and word processors.

      When a large developer community is empowered to access the new technology and create new solutions, it creates a powerful virtuous feedback loop that also iterates the underlying technology to solve end-user problems better. Killer apps come out as a consequence of these interactions.

      Phase 1 morphing into Phase 2 consists of continuous feedback loops and is not linear or predictable. External conditions and trends can accelerate or hamper these loops.

      For blockchain, the developer empowerment has already started to occur (maybe too soon), but the underlying technology is still very immature. The hope here is that the hundreds of decentralized app experiments will trigger feedback loops that fix the core underlying tech (better scaling, simpler and more secure custody, easier access, etc.) while also increasing the core capabilities of developer tool sets.

    3. sigmaalgebra

      > Similarly with the Personal Computer, the first killer application was VisiCalc

      Likely that was the second. The first was software to kill off the typewriters. A lot of that was done with just a text editor and a text formatting program in the RUNOFF family. Then there was WordStar, etc. VisiCalc and Lotus came later.

      Also a lot of the PC success was from people being able to write some nice applications in Basic.

      1. Vasudev Ram

        >> Similarly with the Personal Computer, the first killer application was VisiCalc

        >Likely that was the second. The first was software to kill off the typewriters

        Likely not. Typewriter software, aka word processors, may have been a killer app (and they were), but likely not quite on the scale of spreadsheets [1] (e.g. VisiCalc), although within an order of magnitude or so, is my guess. I’ve read (in US computer mags) that it was VisiCalc that really kick-started the personal computer revolution (for business), which soon took PC sales into the millions, and eventually into the hundreds of millions and more, over some years, without which the personal computer might have been restricted to at most a few tens of thousands of hobbyists.

        [1] In one of those articles I had read, it said that the term spreadsheet itself came from physical “spreadsheets” used by accountants before PCs. They used to type or print out many separate pages of a financial report and then lay the pages out on a table in a grid and paste them together to form the full final report, hence the term spreadsheet. That metaphor was then replicated in software, but with much more flexibility, because it was a single sheet with no multiple pages, and you could rapidly move around to any cell or range of the software spreadsheet, just with some keystrokes. That, combined with the instant press-button recalculation when spreadsheet values or formulae changed, is what made them one of the killer apps that made PCs ubiquitous (in business, which was a much bigger market, not just hobbyist-land).

        And as a generalization of that point, I’ve noticed (and read) that a lot of computer terms (both hardware and software) are derived from non-computer real world and human life terms. Take the case of even simple terms like file, folder, directory, read, write, etc. And there are tons more.

        1. sigmaalgebra

          I just looked up VisiCalc and WordStar. So, apparently the history went: first the Apple II. Then in about 1979, VisiCalc for the Apple. Then from the VisiCalc usage, that is, in businesses, IBM’s area, IBM rushed out the IBM PC. Then there was a rush to put software on the IBM PC. Likely VisiCalc was ported to the PC. WordStar was shipped with a lot of success. Lotus did 1-2-3 and quickly destroyed VisiCalc, bought the company, and closed it down.

          Okay, VisiCalc made a big splash, first on the Apple. The splash got IBM to do something, ….

          But: Let A be the set of all people at work, in school, etc. where writing is important, and let set B be the set of all people working with spreadsheets. Well, clearly enough, nearly everyone working with spreadsheets is also somewhere writing is important. So set B is essentially a subset of set A. And, really, set B is a small subset of set A: When you have a computer that can do spreadsheets, you essentially necessarily also have a computer that can do word processing and, thus, replace the typewriters.

          Bottom Line: For essentially each computer, from the middle 1960s to the present, for all the typewriters near that computer, the computer was more active replacing the typewriters than replacing paper spreadsheets.

          That’s why I conclude that, in particular, in the early days of the PC, replacing the typewriters was more important than replacing paper spreadsheets.

          I just did some Google searching and found: The Apple II Parallel Printer Interface Card was released in 1977 and sold for $180. Wozniak wrote the firmware ROM, and managed to make it fit entirely in only 256 bytes.

          And, in 1977, why have a printer? Sure, to replace a typewriter.

          Well, VisiCalc was out in 1979. And when you have done good work on VisiCalc, guess what: you will want to print it out. To send it to the CFO, you will want to write a report and a memo. So, you will use the computer for that.

          So, people were using the Apple computer to replace typewriters two years before they were using the Apple for spreadsheets. And essentially all the spreadsheet users were using their Apple more for word processing than spreadsheets.

          A few years later on the IBM PC there was WordStar, which made a big splash. Soon Lotus 1-2-3 came along and killed off VisiCalc; Lotus bought VisiCalc and ended it.

          1. Vasudev Ram

            Good research, and some valid points, but I think you may not have got my main point. Maybe I did not word it clearly enough.

            My point (per what I read, not per my own knowledge or experience, although I have used some of those tools) was that the really explosive growth of PCs (Personal Computers) – which really happened in businesses (according to my sources) – happened mainly due to spreadsheets (like the pioneering VisiCalc and later Lotus 1-2-3).

            (Word processors (like WordStar and later WordPerfect and others) and presentation and graphics and print software (like PrintShop, PrintMaster, Harvard Graphics, etc.) also played a big role, for sure.)

            Your point that word processor users were/are a superset of spreadsheet users is likely somewhat correct, if not fully correct (may be fully too), but my point is that the *initial* explosive growth – over and above the home / hobbyist users, and into the market of business users – was propelled (mainly) by spreadsheets. Again, this is per my sources – read a while ago, not my own direct knowledge. Those sources could be wrong, of course. I don’t have them handy now since I read this some years ago.

            Also, very soon after that initial explosive growth, the quartet of spreadsheet / word processor / some sort of graphics + presentation + print software / some sort of desktop database (PC-File, dBASE, Foxpro, etc.) became the four most common business software apps on PCs. (I’m considering utilities and games as separate categories for the purpose of this discussion.)

            Edit: Also see JLM’s and Girish’s comments about VisiCalc above, which support what I said.

            Anyway, good discussion.

          2. sigmaalgebra

            What you remember about what the technology media said is correct. I’m sure that from 1979 to, say, 1985, there were essentially no headlines about the Apple II, the PC, and the Mac killing off the typewriters but lots of headlines about VisiCalc and, then, breathless speculation about the latest from Lotus.

            But from about 1979 to 1985, the typewriters died off faster than the dinosaurs. In number of microprocessor cycles used, those driving printers replacing typewriters had to be much greater than those recalculating spreadsheets.

            War Story: A new prof at the OSU B-school had, back in grad school, not at OSU, just written his Ph.D. dissertation on a Prime and printed it on a Xerox/Diablo daisy wheel printer. For the math subscripts, he used the fact that the Diablo would honor some simple control codes to roll the platen half a line. So, at OSU, before the Prime was there, he wanted to see if he could get a Diablo and drive it with the OSU academic IBM-based mainframe. E.g., the mainframe, yes, had the IBM word whacking program Script. So, might it be possible to format math papers and class notes with Script, put in some control codes for the subscripts for a Diablo, and print on a Diablo? Well, sure, internally the IBM box used IBM’s EBCDIC 8 bit character set, and the Diablo used ASCII. So, right, the mainframe had a communications box to convert between the two. So, could the mainframe be talked into sending control codes? Hmm ….

            Well, also connected to the mainframe were many DEC DECwriter dot matrix terminals. So, there was a control code that would ring the bell on the DECwriter. So, as a first test of getting the mainframe to send control codes, could the mainframe ring the bell on a DECwriter? Well, there are only 256 bytes. And the mainframe had a PL/I compiler. And a few lines of PL/I were sufficient to send all 256 bytes. So, do that and see if the bell rings? It didn’t. So, call up the mainframe computer center and see what byte conversion tables they are using between EBCDIC and ASCII.

            That phone call was the beginning of the end for the longest sitting academic CIO (career is over) in US higher education!

            The phone call question went immediately to the CIO, and that evening at a faculty cocktail party the CIO told the B-school Dean that he had a prof trying to use the mainframe to ring the bell on his terminal! So, soon the prof explained to his Dean.

            Soon it became clear that there was no way to use that mainframe setup to drive a Diablo very well. So, that prof started discussing getting a Prime like he’d used in grad school. Soon the Dean asked the prof for a report and recommendation. The Dean assigned a full prof of econ to the new prof as “adult supervision”. The new prof looked at DEC, Data General, and Prime and recommended Data General — the computer in the book Soul of a New Machine. But the Data General (DG) office wanted big bucks and wouldn’t deal. Prime was much cheaper and easier to use than DG or DEC. So the recommendation was for a Prime.

            Then suddenly the new prof got a call at his office asking if he could come to the Dean’s office. No topics were mentioned. So the prof grabbed a cubic foot or so of documents related to the recommendation and went to see the Dean. There with the Dean was the campus CIO, right, the guy who had complained about the bell ringing effort.

            The discussion went to computer room A/C, and the CIO claimed that the disk drives for the Prime needed high quality computer room A/C from, right, likely Columbus, OH Liebert company, with tight control over both temperature and humidity. The prof claimed that likely initially no A/C at all would be needed, and if the area got a little warm, e.g., for people, then just hang an evaporator off the ceiling, have it drive some plenums, and feed it with a compressor on a pad just outside. Such A/C could be done well for $5000 by any of several HVAC companies there in Columbus. The OSU internal price was $15,000. The Liebert version price was about $80,000. Lots of feathers were going to get bent the wrong way no matter what the decision was!

            So the prof explained: “Prime sells this disk drive, but they don’t make it.” The CIO nodded in agreement. “The drive is from Control Data, and they call it a Storage Module Drive. It’s popular in the industry. We used one when I was in grad school. We had the Prime in an internal office with two doors. By summer the room was getting warm, so we put a fan in one of the doors. That helped a little but not enough. So we had an A/C evaporator hung off the ceiling with the compressor on the roof. Worked great!”

            The CIO still claimed that such A/C wouldn’t work with that disk drive.

            The prof said, “I happen to have with me the official engineering specifications for that drive directly from Control Data. They are for temperature between (IIRC) 40 F and 105 F with humidity between 10% and 90% non-condensing.”

            The CIO claimed that the disk drives would not tolerate such.

            The prof went on: “Prime’s specifications are a little narrower than those of Control Data, but Prime’s specifications should also be easy enough to meet. Control Data does have one specification that Prime omitted — the rate of temperature change should be no greater than 15 F per hour. That specification should also be easy enough to meet.”

            The CIO continued to claim that Prime was wrong, that the CDC drives wouldn’t work.

            The Dean sided with the prof, and soon a Prime was running and did what was promised and much more, especially in word whacking. Soon it was the world site for D. Knuth’s TeX on Prime and had good e-mail and more. Via the alumni office alone it made big bucks.

            It was true that the Prime ate a lot of the carpet and drapery money in the B-school. So, some department offices failed to get some interior decorating they wanted and resented the prof!!!!

            Soon the CIO retired, and the prof was on the committee to select a new CIO for the university. And the prof was appointed Chair of the college computing committee. The prof ignored the idea of the committee, handled everything informally, and 99% left everything about the Prime to a really good little group in the basement, including two former students who did big things quickly. E.g., the large number of terminals and Diablo printers in the school was greater than the number of ports on the Prime. So, there was a “port mapping” box. The computer group did that on their own with no role for the prof and chair of the computing group.

            But heavily, it was all about word whacking.

            The prof never had any desire to be a prof and was there in the B-school to let his wife, in her long illness, be closer to her family farm home in Indiana. On the prof pay, even doubling that via applied math consulting, there was no way even to hope to buy a house and support a family. So, the prof went to IBM’s Watson lab in an AI project.

            So, I’m rating word whacking, killing off the typewriters, as the first big effect, “application”, of PCs. For the timeline, again, the Apple printer adapter card was in 1977, two years before the first VisiCalc (via Google searches).

            And for people using electronic spreadsheets, there’s my superset argument — the spreadsheet users were still likely spending more computer cycles on word whacking and printing than recalculating spreadsheets.

            Spreadsheets are fairly specialized things; word whacking is crucial for essentially any activity in business offices, education, or, for that matter, accounting.

            Indeed, one of the ways computers beat the typewriters was just reapplying the software tools computers needed and had for writing software. That is, whatever else writing software is, it’s a lot of word whacking.

            Indeed it remains: my most important software tool is my favorite text editor KEdit — nearly all my typing for anything goes into KEdit. Well, KEdit is a PC version of XEDIT, written on his own time by an IBM guy in Paris for IBM mainframes. The thing is gorgeous, and “went viral” in all of IBM in weeks. For a macro language, he picked the scripting language, still my scripting language, Rexx, by Mike Cowlishaw in England — elegant and powerful beyond belief, and it was long the main software for running the main administrative computing for IBM — VNET, about 4000 IBM mainframes around the world connected via bisync lines. It was very much like the Internet, e.g., with on-line fora, phone books, organization charts, documentation, except the packet routing function was handled directly by the mainframes. Well, nearly all the little applications were in Rexx. What we do now with Web pages was then done with XEDIT and a lot of Rexx macros. Well, sure, one of the main pieces of applications software was IBM’s Script, which IBM used to write their computer documentation — that was and remains a big deal, one of the biggest cases of word whacking, so big that for some years IBM was the world’s largest printer. Well, then, sure, Script got used for word whacking of wide variety. Part of the reason was that IBM was early and big in laser printers, the ones about as heavy as a car, that could print just gorgeous looking bank statements for 5 million people SLAM BAM. So, around IBM and likely at some large IBM customers, XEDIT, Script, and the big laser printers were being used also for general purpose word whacking. Now Microsoft might be the world’s largest printer, except mostly people just read the many thousands of MSDN Web pages via HTTP, HTML, and Web browsers.

            Point: Computing was necessarily long really good at word whacking, just for software development and then ordinary administrative stuff.

            E.g., when the Ohio State B-school got a Prime super-mini, sure, one of the reasons was to have more in applied statistics, simulation, and optimization for research. But one of the points that really sealed the deal was the fact that the school was close to screaming in the halls about the difficulty of getting the word whacking done for course notes, course tests, journal writing, etc. Then someone mentioned that, sure, get a daisy wheel printer and, then, just use the software tools on the Prime that were intended for writing software and that Prime also used for writing their manuals.

            So, again, the path into word whacking on computers was first from the fact that computers needed a lot of word whacking for software development and documentation writing. The Dean’s secretary wanted a CRT that was especially tall so that she could see a full page of formatted text at once; she got one!

            So, that daisy wheel printer remark sealed the deal, and soon every department in the B-school had all the secretaries on dumb terminals doing word whacking on the Prime and driving one or more daisy wheel printers in the department offices. The secretaries took to that new equipment like ducklings to water — no hesitation, delays, tears, etc.

            Spreadsheets? The Prime had a really nice spreadsheet program, but it was lightly used.

            When PCs came along, the computer based word whacking continued, and the typewriters died off like the dinosaurs.

            Yup, VisiCalc and Lotus made the headlines. But no doubt everyone in the then quite large typewriter industry saw the big effects.

          3. Vasudev Ram

            Long story. I guessed you were involved in it before the part where it became obvious.

            Haven’t tried out Rexx yet. I know there are some open source or free versions of it around. Object Rexx and NetRexx, I think. Do not know which is better, if any.

            I had done some interfacing of computers with printers (via software I wrote, C programs and shell scripts, on Unix) when I worked as a system engineer. Fun stuff.

            Speaking of IBM, reminds me of the book I am re-reading, Lou Gerstner’s Who Says Elephants Can’t Dance?

            >so big that for some years IBM was the world’s largest printer

            According to Gerstner’s book, for some time, IBM was also the world’s largest software company.

          4. sigmaalgebra

            I use Open Object Rexx on Windows. I long used it on PCs, OS/2, Windows 2000, Windows XP SP3, and now Windows 10. For Windows 10 I had to download a new version that somehow always has some service process running all the time.

            For KEdit, Mansfield Software, CT, wrote their own version of Rexx, which they call Kexx, as the macro language.

            On Windows, PowerShell is likely more powerful, but I haven’t converted to it yet.

          5. Vasudev Ram

            Brief was supposed to be a pretty good editor too, in earlier days. I only used it briefly (heh) in one project, but had read about it. It was advertised often in PC mags like BYTE. It was reported to have used many optimization techniques for speed, including maybe being written in assembly language, and using BIOS calls to adaptively change the speed of cursor movement depending on how long the user pressed an arrow key (on the assumption that if an arrow key was pressed for longer, it meant the user wanted to scroll through larger portions of the file, so Brief would speed the movement up – maybe by increasing the key repeat rate or such).

            When on a visit to the US (Boston), I was introduced by a colleague to the creator of Brief, Norm Miles.

          6. Vasudev Ram

            >the main administrative computing for IBM — VNET, about 4000 IBM mainframes around the world connected via bisync lines. It was very much like the Internet,

            Yes. I had read about the IBM VNET. It had a bi-directional gateway to the Internet too. I remember that when I first got onto the Internet years ago, I tried out all sorts of things related to it, and remember receiving some emails from IBM VNET, in connection with something I was doing, maybe writing to someone on it, or subscribing to a mailing list or something. In fact I had email access (only) via UUCP (at work) before I had Internet access, and remember using methods to access the Internet (services like Archie and Gopher) purely via email. There is even an Internet FAQ on how to do that, called something like “Accessing the Internet via email.” It may still be there on faqs.org.

            Might still be useful for people who don’t have proper Internet but do have email access. So using that, I got my education about the Internet without the Internet, so to speak 🙂

    4. JamesHRH

      All of these examples have obvious jobs to do. AR & The Chain do not.

      Tech is hitting the wall – the innovations are amazing but there is no job for them to do.

      ‘Look, I can mix your real life with animation. It’s like you can live in Roger Rabbit.’ Who wants that?

      ‘Look, I can eliminate the middle man in online transactions so you don’t know who you are dealing with.’ Who wants that?

      Consumer tech has jumped the shark.

      1. sigmaalgebra

        With some new tools, we have to expect a lot of junk applications, a lot of chaff for a little wheat, a lot of hype for a little substance.

        But computing and the Internet remain perhaps the most important technology in all of civilization — beats steel, steam, electric motors, etc.

        But that point alone is not sufficient for startup, business, or investment success. Instead, we need to find some problems where the first good or a much better solution is close to a “must have” for enough people and revenue per person to make a good business. Sure, the solutions have to be doable and cheap enough; and we want good barriers to entry.

        Well, the computing will do essentially anything we tell it to do. So, we are still left with the question, what do we want the computing and Internet to do? What are those problems with the “must have” solutions?

        That’s where we have to get into a quiet room, calm down, sit in a comfortable chair, lean back, pop open a cold can of, say, Diet Pepsi, and think about pairs, problems and solutions, until we find a good pair.

        Okay, old example: Typewriters. The industrialized world was spending huge big bucks on labor punching those things, using miles of correction tape, rivers of correction fluid, just to revise a little and type it all again. Wasteful beyond belief. Super big bucks wasteful.

        The little 8088 microprocessor? With a simple, primitive operating system, that little thingy could be the basis of good word whacking to destroy the gigantically wasteful typewriters. Net, it worked great. Quickly made Bill Gates worth $300 million and let Microsoft continue on to the present with some of the best computing so far, computing that makes the mainframes of 1980 look like kids’ toys in all of the hardware, system software, communications, …, and applications.

        Okay, what now? What pairs of problems and solutions?

        Again, the computers will do nearly anything we tell them to do. Now, what should we tell them to do next?

        1. JamesHRH

          That’s exactly the problem.

          What people want them to do is make them better humans, not a better reality. Make me healthier, make me more disciplined, hell, make me happier.

          IT is nearly over; it’s a dinosaur looking up at the blazing flashes in the sky.

          BIO is beginning; it’s crawling out of the ocean onto the beach.

          1. sigmaalgebra

            A point is the AMD FX-8350 I’m playing with: It has 8 cores and a 4.0 GHz clock. You may have noticed that 4.0 GHz is one of the fastest clocks in the history of computing, at least desktop computing, at least for usual personal or commercial work. And now for such computing, clock speeds are commonly back to 2.0 GHz. So, we’re not getting faster clocks.

            Yes, we can put many more cores on a chip, but the usual view is that the heat issue forces the clock speeds to go still slower, maybe even slower than 2.0 GHz.

            And for desktop computing, there is some question about what to do with a processor that could run 1000 or 100 threads.

            So, in a sense, usual commercial computing is, for at least a while, as fast as it’s going to get. Sure, we hope that as line widths get smaller than 14 nm the heat problem will be relaxed and we can get back to faster clock speeds.

            Well, the first PC had a clock of, what, 4.4 MHz? IIRC the last IBM water heater mainframe had a clock of 153 MHz? So, 4.0 GHz is a big step up from that old history, ballpark a factor of 1000. So, until we start to do something different, another factor of 1000 is not so easy to see.

            So, in a sense, this amazing line of progress is seriously slowing.

            I believe that there is much more to do, and that some good applied math will be, in some high level sense, like another factor of 1000, but not many people agree with me.

            So, hmm, biology?

            But if you want to be “happy”, the usual, sure-fire, single solution to making people happy is, and the candidates are …, and may I have the envelope please …, and the winner is “other people”! Ah, and accepting the award is Zuck himself!!!

      2. falicon

        I think the whole point is that none of them really had obvious jobs to do at the time that they were introduced and considered ‘interesting’…or at the very least, the ‘obvious job’ at the start didn’t turn out to be ‘the thing’ later on (e.g. mobile was obviously about making calls from anywhere…and yet, ‘the thing’ actually turned out to be ‘computing anywhere’).AR can (eventually) allow you to do and know things in-real time that just aren’t possible right now. That could be really amazing (it’s not yet).Blockchain can redefine what ‘trust’ means, how the world collaborates and self governs, and redefine what ‘economy’ means. That could be really amazing (it’s not yet).They could also both be nothing.It depends on how they end up evolving, get packaged, and ultimately seamlessly delivered…that’s the whole point of the post IMHO.

        1. JamesHRH

          The argument is this:

          - you say AR / The Chain redefine
          - you say the web redefined mail, publishing, administrative workflow
          - I say the web extended & expanded mail, publishing & administrative workflow

          I think there is a big difference. The Chain doesn’t extend or expand trust, it – as you say – reinvents it. Same for AR.

          Chances of those reinventions being adopted aren’t VC-acceptable minuscule (1:1M), they are nanoscopic (1:1T).

          These techs go against human nature; the web went with it.

          1. falicon

            If nothing else, I like and appreciate that you dig in and stick to your nay-saying.

            I’m not following the AR space that closely, but my mental framework for it is that it’s really about tying “imagination” with computing power…so all those things in real life where you are left to “imagine the possibilities”…AR will allow you to manifest that “imagination” into a temp. reality…and that helps you to make better/faster decisions and have a more emotional connection to things.

            Simple example: Shopping for clothes? You can use AR to see how it really looks on you in various settings/situations and with an endless amount of accessories…and very quickly.

            About to climb a rock? AR can give you insight into every possible line you might take, and help safely guide you through the whole climb (with adjustable difficulty levels and help as you request, of course).

            Again – these things might not happen at all…but it’s positioned to be possible…and that’s exciting.

            For blockchain – if you want to go with “extend” vs. “redefine”, I’m not sure why you wouldn’t argue that it has the potential to extend democratic trust to the global stage.

            Right now – without blockchain or cryptocurrency, I can communicate with almost anyone in the world via the internet…but I can’t really transact with a vast majority of them…especially on a personal/small scale (i.e. if we are in person, I can buy you coffee…but if we are not in the same physical space, I can’t really do the equivalent).

            The closest thing I can do is subscribe and hope the “49 cents a day helps a kid in Africa eat for a week” commercials are real (though we all know they take a massive cut of the pennies I send).

            With cryptocurrencies, I can get whatever amount of “money” I want directly to whatever person I want…just like if we were in person and I handed them the cash from my wallet.

            Again, just one small example – but potentially a really big thing in terms of having a true “global economy” on a personal level.

            …anyway, sure, we’ll continue to debate and buy into our own sides of these things more and more over time…until we know for sure which way it’s all going to go (and then we’ll both officially claim victory and that we were saying “that” all along) 😉

          2. JamesHRH

            In all the years I have been around this, I have learned three things:

            - simple fundamental attitudes are important
            - simple fundamental assumptions are important
            - simple fundamental product attribute decisions are important

            I never understood why every investor talked about how good the team had to be. I knew they were saying something important, but the way they were saying it was off.

            Why did Fred know Twitter would go? He had blogged as tweets for the first 9 months he posted. His experience shaped an attitude that few other investors held.

            Why did Zuck know that social status was important? Because he was living the social stratification nightmare at Hahvahd. It shaped his attitude and he knew everyone on the planet would be on The Facebook.

            So, there is no complex argument here.

            I really like your AR examples; they sound ultra-Nice to Have to me. If my attitude on that is wrong, I am wrong.

            I really don’t like your examples for BlockChain, as the fundamental product attributes that people pitch me on The Chain are anonymity and direct transfers. If your answer is ‘I give people cash all the time and people have done so for millennia’, I would like to be able to give people cash online…… I don’t see a business model for that.

            Fundamentals.

            Good news – one of us is really, really right 😉

        2. Matt A. Myers

          Trust happens best through people giving their attention to understanding and evolving themselves. You then say “I trust what this person or service says, and they currently trust this person or service – and these are the reasons they give” – so then you too decide to trust them.

          I haven’t seen a good explanation as to why or how blockchain has the potential to amplify this. I’ve heard it said many times that the potential exists, but it’s not clear how this can be done better than using a traditional DB; not to mention all of the issues of how the blockchain ecosystem is proving to not actually be decentralized, and to not be safe from manipulation of various kinds, including the processing power utilized by one party vs. that of another.

          The legal world seems to be one place where blockchain may take hold; however, if you use a standard DB that everyone by law is required to use, and there is a hack or data is corrupted or manipulated after-the-fact, then we can investigate and stop trusting whomever was the cause. The issue here really is that we need whistle-blower laws in place to avoid issues like Equifax, etc.

          All of the blockchain hype really has been because of the gambling aspect — it aligned all of the marketing ‘gurus’ worldwide who look to make a quick buck, and yeah.

          1. falicon

            In the traditional flow, you have to trust a third party (the db owner)…and you trust them to trust the people through them.

            The general idea is that instead, the blockchain network in the middle is the trusted middle…that isn’t controlled by one person or entity…so there is no “big brother” that needs to be inserted into the system.

            On a high level, Ethereum contracts allow for you and me to agree to terms on an exchange of value…and it will take care of the details of making sure we both live up to our end of the deal, then do the exchange.

            But yes – long way to go for all of this stuff. Potential is exciting, but still just potential.
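            To make the escrow idea above concrete, here is a toy, off-chain Python sketch of the flow falicon describes: the buyer deposits, both sides confirm, and only then is the payment released; otherwise the deposit can be refunded. This only illustrates the logic of such an agreement. Real Ethereum contracts are typically written in Solidity and enforced by the network rather than by any single party, and the class and method names below are invented for the example.

            ```python
            class EscrowSketch:
                """Toy model of an escrow deal between a buyer and a seller.

                On a blockchain this logic would live in a smart contract enforced
                by the network; here a plain object stands in for it, just to show
                the flow.
                """

                def __init__(self, buyer, seller, price):
                    self.buyer, self.seller, self.price = buyer, seller, price
                    self.deposited = False
                    self.confirmed = {buyer: False, seller: False}
                    self.released = False

                def deposit(self, party, amount):
                    # The buyer locks the payment "in the contract".
                    if party != self.buyer or amount != self.price:
                        raise ValueError("only the buyer can deposit the agreed price")
                    self.deposited = True

                def confirm(self, party):
                    # Each side signals that its end of the deal was met.
                    if party not in self.confirmed:
                        raise ValueError("unknown party")
                    self.confirmed[party] = True
                    if self.deposited and all(self.confirmed.values()):
                        self.released = True  # payment goes to the seller

                def refund(self):
                    # If the deal never completes, the buyer can take the deposit back.
                    if self.deposited and not self.released:
                        self.deposited = False
                        return self.price
                    return 0

            # Usage: a deal that completes.
            deal = EscrowSketch("alice", "bob", price=25)
            deal.deposit("alice", 25)
            deal.confirm("alice")
            deal.confirm("bob")
            print("released to seller:", deal.released)  # True
            ```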

      3. Vasudev Ram

        I agree with you more than you agreed with me in another sub-thread in this post :)

        It’s not consumer tech that has jumped the shark, it’s the proponents of it (for gain at any cost and without value to users) who have.

      4. nick

        I want that, if it means I pay a 1% fee to send an international transfer instead of 5-10%.

    5. Matt A. Myers

      Well, if you’re into gambling and that’s your goal then the Pyramid-Ponzi scheme is a real killer application for the blockchain.

  11. Tom Labus

    Something will come out of left field. Easy to use and with a good reason to use it too.

    1. Twain Twain

      This and maybe something that disrupts both AI and blockchain.

    2. Vasudev Ram

      Ha, reminds me of the word exeunt used in literature like Shakespeare etc. https://en.wiktionary.org/w…
      > will come out of left field
      I know it’s a sports term.

  12. jason wright

    I can imagine it will be the gamification of a mundane reality of modern human life. Decentralisation is about dispersal. I would be looking at an intersection of the two.

    1. brooksjordan

      Loved revisiting your piece on the wrapper, Seth.

    2. SethandFred-2PeasInAPod

      Fred incorrectly compares blockchain and crypto with virtual and augmented reality, and look who shows up: Fred’s old buddy, a remarkably successful snake oil salesman promoting his latest rehashing of marketing 101 as if it were somehow insightful and important. Seth Godin’s “marketing wisdom” is as stale as a bag of potato chips that has been left open for months. Hasn’t he persuaded enough gullible folks to buy his books that he can go retire to Florida?

      1. Russell

        Why be anonymous if you feel so strongly about it? Haters going to hate

  13. JaredMermey

    How do you weigh your responsibility to keep investing pre-packaging, to progress the field to the point where packaging happens, against your responsibility to LPs to make the best investment each time, with the highest probability of a positive outcome?

  14. OurielOhayon

    I guess it would depend on the territory for crypto. Clearly payments may come before gaming in countries where the banking system is broken or non-existent? I think for once we should try not to see a tech revolution through a “1st world” filter.

  15. Emil Sotirov

    A few days ago, I had a thought about what could be the “package” driving migration to web 3… https://twitter.com/sotirov

  16. Dan G

    “mobile .. didn’t really take off until the iPhone came along in 2007” – mobile really took off with cheap, ubiquitous Androids.

    1. Andu @ Widgetic.com

      Nope.

  17. Bill Gurleyman

    So in summary, Fred: VR/AR, blockchain… stay away. You missed one: autonomy. Instead of writing a bullshit article that is self-evident, why don’t you call bullshit on the things that have no clear market for at least 10 to 15 years?

    1. RichardF

      Probably because it’s not bullshit.

      1. Bill Gurleyman

        Really? Point to dead Fred’s VR and autonomy investments? He would disagree.

  18. Joe Lazarus

    What are some hypothetical examples of how blockchain / crypto might be used in gaming?

  19. Daniel Olshansky

    I know this isn’t the best forum or audience for sharing TV show recommendations, but I hope someone finds it interesting given the context. “Silicon Valley” is a really popular, modern show that a lot of techies can relate to. “Halt and Catch Fire” is another fantastic show that captures the same subject matter in a less comedic way, but set in the mid 80s rather than the 2010s.

  20. Pedro Almeida

    Why gaming? Can anyone point to good examples? Thanks

    1. falicon

      https://www.stateofthedapps… – a large percentage of dApps are games or ‘collectibles’ right now (CryptoKitties prob. being the most famous at the moment). Games are generally something most devs like to build, a great way to learn and try things out, and something proven to pull users in (though it’s very much a hit-driven market to be in if you are looking for profit). So it’s not an uncommon starting point in new platforms/networks/systems…

  21. ThinkClearlyFred

    Shake yourself, Fred, so that you start thinking clearly. Virtual and augmented reality are going to be like the lever, the screw, and the wheel: powerful technologies that endure. Blockchain and crypto are like Charles Ponzi’s fraudulent scheme. Within a decade, blockchain and crypto will be tossed into the dustbin of history.

  22. sigmaalgebra

    I totally disagree!!!
    For investing, to heck with the “trends” or “themes”.
    PCs (personal computers), the Internet, mobile, AI (artificial intelligence), ML (machine learning), VR (virtual reality), AR (augmented reality), etc., are all too broad to be interesting as, or relevant to, investing in any sense. They are not much closer to making money than Newton’s F = ma (force is mass times acceleration).
    E.g., drones are deeply dependent on F = ma, but that is no reason to invest in drones.
    For investing, startups, and business, F = ma, microprocessors, PCs, the Internet, …, are big trends that we just take for granted when they arrive. That these exist, have arrived, have taken off, all are irrelevant to investing. E.g., I just plugged a $115 (from Amazon) AMD FX-8350 eight core, 64 bit addressing, 4.0 GHz clock microprocessor with 1.2 billion transistors into a $60 Asus M5 … motherboard, all astounding, the pinnacle of a technology giant leap for civilization, but that fact alone and a dime won’t cover even a 10 cent cup of coffee.
    Net, even a theme as astounding as that 1.2 billion transistor processor for $115 is by itself useless for startups, business, or investing.
    Instead, what is just crucial is how the heck one can make money. Here a “theme” is useless. Instead of a theme, you need to get quite specific — what problems, what people, users, customers, what solution, etc. There F = ma, an algebraic group of measure preserving transformations, a $115 1.2 billion transistor microprocessor, the Internet at 50 Mbps, hard disks with 2 TB (trillion bytes) of space for $100, etc. can all be crucial ingredients. But onions, carrots, celery, garlic, parsley, and shin of beef were all necessary ingredients in Escoffier’s cooking, and what was really crucial was what Escoffier and César Ritz did with those ingredients. Else might as well leave the carrots to the bunny rabbits in the garden.
    So, to heck with the broad themes: if you see a big problem and a great solution, cheap and fast to start, with a good barrier to entry, etc., then maybe you have a startup, business, or investment opportunity.
    Also the “packaging” is mostly just nonsense: the PC, right away, and well before the graphical user interface, was just terrific as technology, with lots of really good startup, business, and investment opportunities, but with essentially nothing in packaging.
    Why?
    (1) Get rid of typewriters. Flush the ocean of correction fluid. The typewriters and correction fluid, etc. were gigantic wastes for our economy. A PC with command lines, a text editor, and a daisy wheel printer were fantastically better for the word whacking and made big bucks right away.
    Once there was a technical firm with 300 people and lots of secretaries at IBM Selectric typewriters using miles of correction tape and gallons of correction fluid. The work product, for the US Navy, was quite technical and typed on paper. Once the Navy saw the layers of correction fluid and returned the paper!!! Soon, to get MUCH more computing for less money, one group of 20 people got a Prime super-mini computer. They also plugged in a daisy wheel printer. Then in that group of 20, the Prime, some dumb terminals, a simple text editor, and a daisy wheel printer junked all the typewriters and correction fluid in just a few weeks. Each secretary saw the future like a locomotive at 80 MPH just ahead. No training: each secretary begged for a dumb terminal. If they begged nicely, then some of the guys got a spool of Belden general purpose 5 conductor signal cable and pulled a line from the Prime, over the ceiling tiles, hanging down at the secretary’s desk. The secretaries taught themselves, typically in just a few days, and then, presto, bingo, there was a lot more typing output, much higher quality writing (easy to do lots of revisions), and much better looking work product. Soon that solution was copied over the whole company of 300 people. “JUNK the typewriters!!!!”
    Big bucks? In 1980, even before PCs, Prime Computer – a super mini computer, a bit-slice processor, an operating system borrowed from Multics – made the greatest return of any stock on the NYSE. Heavily the reason was just getting rid of the typewriters. So, just use, say, a dumb terminal
    https://hackadaycom.files.w…
    and get a Xerox/Diablo daisy wheel printer
    https://upload.wikimedia.or…
    and destroy typewriters like a tsunami wipes out grass shacks on a Pacific atoll.
    So, with that Prime (along with DEC, Data General, and a few others) there were good investment opportunities in printers, ink cartridges, modems, terminals, software, hard disk drives, tape drives, etc. One could even get a business making sound deadening enclosures for the printers. Maybe Belden and AMP made good money just from the signal wiring. And there were serial port switches. Yes, most of these were doomed to be small.
    But as an investment, Prime Computer did really well quickly. But, sure, about the time of the Intel 386 and the Motorola 68000, Prime needed to reinvent themselves. Well, they got a CEO from IBM and, …, sure, they were slowly beaten by PCs and workstations built on the 386 and 68000. Prime had a great opportunity to port their very nice software and some very nice high end applications to, say, the 386 (with a lot of deep architectural similarities — the 386 was heavily borrowed from Prime, which heavily borrowed from Multics), but what would we expect from an IBM CEO?
    E.g., right away the Ohio State University business school alumni office used a Prime, programmed and run with a dumb terminal, a little software, a text editor, a simple text formatting program, and a daisy wheel printer, to carpet bomb the alumni list with beautifully worded, personalized, and typed letters and raise big bucks. The alumni office had their printer going 40+ hours a week and learned how to get the ink cartridges cheap. They got a sound deadening enclosure. CDC made money on the disk drives. The university was so impressed the central data processing group rushed to roll out a clone for all the alumni offices on campus.
    How did the Ohio State B-school get a Prime? Well, part of it was a new prof beat IBM’s super salesman Buck Rodgers and a very fired up IBM Columbus, OH branch office.
    (2) That Prime and its competitors were not just for word whacking and, instead, were heavily used for scientific, engineering, and financial calculations of a wide variety. IIRC, Bloomberg started with Prime computers. There was a really nice spreadsheet program with linear programming optimization built in. Soon, via the Ohio State Prime, D. Knuth’s mathematical typesetting software TeX was running on Prime computers around the world. A lot of technical computing shops junked their IBM mainframe TSO accounts, got a Prime, and did many times more computing.
    Then, with PCs, that trend continued — for a lot of the work, the graphical user interfaces (GUIs) were not necessary or very useful.
    Microsoft was already a very successful company before the first version of Windows with GUIs had much popularity.
    IMHO, GUIs are mostly just a pain in the back side. E.g., GUIs tend to have icons, and I deeply, profoundly, bitterly hate and despise icons and their theme, intentions, implementations, etc.
    Why was Microsoft so successful before GUIs? Because there were some HUGE problems, especially getting rid of the typewriters and doing scientific, engineering, and financial computations. For all of those, there were plenty of big bucks ready to buy right away. GUIs not necessary.
    To heck with the big themes. Instead, see the particular big problem — enormously wasteful typewriters. Then see the suddenly available means of a much better solution — some simple computing, MS/DOS, a PC, WordStar, and letter-quality daisy wheel printers. Done deal.
    Might mobile and the Internet enable some solutions, the first good or much better ones for a big problem? Yup. Invest in “mobile”? Nope. Pay much attention to the “theme” of mobile? Nope. See a particular solution that solves a particular problem and uses mobile to do an important part of that? Okay, maybe you have something.
    For startups, business, and investing, specific solutions for specific problems can matter a lot. Such a solution might make use of some new technology. But invest in, or based just on, the technology? Nope.
    So now invest in anything making crucial use of AI, ML, VR, AR, self driving vehicles, internet of things (IoT)? Nope.
    If there is a good solution for an important problem, …, etc., then maybe you have something, even if it uses GUIs and mobile and claims to use AI and ML.
    VC Web sites have long been just awash in claims that the partners have “deep domain knowledge” and go to conferences and stay up on all the latest themes. Send them a business plan deeply in those themes? You will never hear back!
    “Packaging”? If the packaging is really important, then maybe the problem is not really pressing and the solution not very important. E.g., Bloomberg made really big bucks because customers could get a boiler room and some Bloomberg terminals, one trader per terminal, trade on the Bloomberg hot news, and make big bucks right away. What mattered was the ASAP Bloomberg information, a “must have”. The packaging with lots of GUI graphs was a nice to have and came later.

  23. Aashay Mody

    Similar thoughts to Chris Dixon’s recent essay “Why Decentralization Matters”: https://medium.com/@cdixon/

  24. Walter Palma

    Success and timing in early stage investing is not so much about understanding the technology as it is about understanding when it no longer needs to be understood by the public at large.

  25. Pete Griffiths

    The Innovator’s Dilemma: When New Technologies Cause Great Firms to Fail, generally referred to as The Innovator’s Dilemma and first published in 1997, is the most well-known work of the Harvard professor and businessman Clayton Christensen. It’s all in there.

  26. Marco Dominguez Bonini

    Totally agree. You can create a lot of value, but if you are not able to deliver it and capture your share, somebody else will.

  27. Pointsandfigures

    A lot of the blockchain stuff I have seen seems forced. It’s not embracing the true benefits of blockchain. When you see a company that does, though, it seems to make sense and I think it will work. Filecoin comes to mind.

  28. Guy Lepage

    Delivery and distribution are such an important part of new technologies. That is what gets them in the door. As a blockchain UX veteran, I can tell you we are still a ways out from mass adoption.
    One thing that will assist immensely with the UX of blockchain applications is less, not more, regulation. It is still extremely cumbersome and time-consuming to purchase crypto of any kind. In my opinion, Coinbase has done a great job so far, but they, and others, are so heavily regulated that it’s slowing down adoption and innovation. Most decentralized applications will require users at some point to interact with crypto-currencies. This still needs to be solved before any mass adoption can happen.
    One solution might be to defer the user’s interaction with crypto. This is accomplished with 2nd and 3rd layer utility tokens that do not require users to create a wallet. Instead, users just use their apps and their tokens accumulate in their profile until they reach a certain milestone. Once the milestone has been reached, the user can “cash out” and exchange their utility tokens for the currency of their choosing. This incentivizes users to jump through the regulatory hurdles of creating a full crypto account.
    I feel it’s solutions like these that will allow developers to focus on a more traditional on-boarding and user experience.
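
    As a rough illustration of the deferred on-boarding Guy describes, here is a small Python sketch: usage accrues an app-level utility token in an ordinary profile/database, and the user only touches a wallet once a cash-out milestone is reached. UserProfile, CASHOUT_MILESTONE, and cash_out are hypothetical names for illustration, not any real product’s API, and the on-chain transfer itself is stubbed out.

```python
# Sketch of "earn now, touch crypto later": tokens accumulate off-chain in the
# app's own DB; a wallet is only needed at cash-out time.

CASHOUT_MILESTONE = 500  # app-defined threshold before any on-chain interaction

class UserProfile:
    def __init__(self, user_id):
        self.user_id = user_id
        self.pending_tokens = 0  # accrued off-chain; no wallet required yet

    def earn(self, amount):
        # Ordinary app usage just increments a balance in the app's database.
        self.pending_tokens += amount

    def can_cash_out(self):
        return self.pending_tokens >= CASHOUT_MILESTONE

    def cash_out(self, to_wallet):
        # Only here does the user create a wallet / clear regulatory hurdles;
        # the actual on-chain transfer is stubbed out as a returned record.
        if not self.can_cash_out():
            return None
        amount, self.pending_tokens = self.pending_tokens, 0
        return {"wallet": to_wallet, "tokens": amount}


user = UserProfile("u123")
for _ in range(10):
    user.earn(60)                      # earned through normal app usage
if user.can_cash_out():
    print(user.cash_out("0xABC..."))   # wallet address supplied only now
```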

  29. Pascal Aschwanden

    >> At some point, some company will figure out how to package it up correctly and it will go mainstream.
    It already has! Pokemon Go.
    I could see a few more AR apps coming along and getting mainstream traction. But, on your phone, I just don’t think it’s going to be as transformative as the mobile revolution. As far as your phone is concerned, AR will complement many apps in the future, but there will always be a large class of apps that don’t benefit from AR.
    In order for AR to really take off, it would need to be something like Google Glass, or a brain implant that projects things into your thoughts.

  30. Michael J Lambie

    CameraIQ is really making an effort to bring AR to the mainstream. It’s a pretty interesting play.

  31. curtissumpter

    This is an amazing post.