Where Protocols Come From
There’s an interesting discussion on usv.com this week called Where Protocols Come From. Here’s the anchor to the discussion:
Protocols play a vital role in computing, as well as a vast array of our online interactions. The device you’re reading on now has a USB connection; without it, your device couldn’t interoperate with other devices. You’ve probably sent an email to someone in the past hour; without the standard IMAP/SMTP protocol, you wouldn’t be able to send email to people who aren’t on Gmail.
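To make the email example concrete, here is a minimal sketch in Python of handing a message to a mail server over SMTP. The server address, port, addresses, and credentials below are placeholders, not real endpoints. The point is that because SMTP is a shared protocol, any standards-compliant server can accept the same hand-off, regardless of which provider the recipient uses.

```python
# Minimal sketch: SMTP is a shared protocol, so the same code can hand a
# message to any standards-compliant mail server, not just one vendor's.
# Host, port, addresses, and credentials are placeholders for illustration.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "alice@example.com"
msg["To"] = "bob@example.org"
msg["Subject"] = "Interoperability via SMTP"
msg.set_content("Any SMTP server that speaks the protocol can relay this.")

# Any provider's SMTP endpoint works here; swap in your own server details.
with smtplib.SMTP("smtp.example.com", 587) as server:
    server.starttls()                                   # upgrade to TLS if supported
    server.login("alice@example.com", "app-password")   # placeholder credentials
    server.send_message(msg)
```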
While protocols make interoperability possible, and in fact many are governed by standards bodies, history shows that standards are often imposed by one dominant player. For example, Apple may have quietly invented the new standard for USB. JVC played a large role in the invention of the VHS.
On the software side, the history is a little murkier. Among file formats, Adobe invented the PDF and Apple is largely responsible for the proliferation of MP4. HTTP was invented by a computer scientist and widely adopted without any single industry player dominating the process. Attempts to establish social networking protocols, such as Tent.io, have largely failed. We are, however, beginning to see an uptick in protocols proffered by companies, such as our portfolio company Onename.
This week we’re asking:
- Why have hardware protocols been driven by dominant players but not software?
- What might it take for a software company to establish a protocol?
- What conditions must be met to establish an internet protocol?
The discussion is here. We are collecting both comments and posts in the discussion, which is how we do every topic of the week at usv.com.
Comments (Archived):
Love this topic. Here's what I wrote in response.

Q. Why have hardware protocols been driven by dominant players but not software?
A. This is a bit of a tricky question, because even these hardware protocols have software inside of them, but that software is tightly coupled with a hardware function. So it looks like hardware, of course, but the software is the lever for the hardware. That coupling is an important part of market adoption, because you avoid being a solution (protocol) looking for a problem (use case). They come together.

Q. What might it take for a software company to establish a protocol?
A. Surround the protocol with an open platform and a strong ecosystem.

Q. What conditions must be met to establish an internet protocol?
A. Why do we need to think of "Internet" protocols only? I think there are special-purpose protocols that can exist on top of existing Internet protocols. Bitcoin is one of them, for example. And there could be vertical protocols (Onename). I would slightly turn the question to: how different are these overlay protocols from other protocols?
You know, in my heart, I agree. But open is not always the right solution, as you infer. It isn't for Twitter (aka the Meerkat imbroglio). And the hardware/software paradigm breaks down for me. iOS developers are certainly in a closed world, and from a market perspective, it is a closed but lucrative place to be. De facto is about market viability more than anything else, I think. I want to say that open is always the better market solution and the standard maker, but I don't see it always playing out that way.
yes, market adoption is key. that creates new opportunities for sure.
yup, nothing matters except the market. truest line i know: Companies are either trying to discover their markets or working not to lose them. Everything we do is on that continuum.
May be a bit naive as I've never been responsible for writing or regulating broad protocols, but it seems like part of the point from Fred's post yesterday is that a lightweight framework / constraints for early development of a new space have to exist. It seems like there is a fundamental difference between the service and application layers (if you're a utilities provider, for example, expecting that you will provide a fair and innovation-stimulating framework for everyone when you are also developing products/applications seems impossible). But to your comment, that may depend on how you define "open platform."
some standardization actually allows more innovation to happen. in business school cases, they talk about Betamax vs VHS. Betamax was actually a better product but lost the war. Fax Machines had a similar protocol war. If you want an industry to grow with network effects, there has to be some industry standard to build network effects around. That standard isn’t always the “best in class” technology but it’s good enough to get the job done.
There is very often a disconnect between what knowledgeable people think the end user will like, and what the end user is willing to accept, which is quite often a much degraded product.

I saw it back in the '80s when end users thought that 300dpi from a laser printer was "good enough" to use for camera-ready copy. In the end the market got much larger and the laser printers obviously got much better. I learned an important lesson back then with that one event: quality isn't an across-the-board lock-in to a sale.

Ditto for VHS vs. Betamax. Sony, who had made a fortune with Trinitron, made the assumption (I am assuming; I didn't read the case study, so this is from memory) that consumers would buy Betamax because the quality was superior to VHS. But that didn't happen.

Another example: I remember my brother-in-law, who is a certified gemologist and a jeweler (family in the business since the '30s), absolutely laughing at the idea that people would buy diamonds or diamond rings over the internet. All he knew was how people came into the showroom and had to see and touch the merchandise. So he simply couldn't believe that there would be any way that someone would buy a diamond sight unseen.
Car salesmen must have gone to the same school as the diamond seller guys.
Cars are more of an emotional purchase than diamonds are. A diamond engagement ring is a necessary purchase and typically paid in cash, not financed.

A car salesman exists in order to take advantage of and convert a buyer by using manipulative techniques if necessary. That's one of the reasons that fixed pricing hasn't taken widespread hold. The dealer is able to adjust pricing to a particular buyer, having some pay more and some pay less. The dealer can also apply pressure to get someone to make a decision "today" as opposed to "whenever you want to."

With cars in limited supply, having "buy now" incentives is not as necessary, since the idea is that the car in the color you want might not be there if you don't buy it today. (Take Mini, for example; they are fixed pricing with occasional incentives by the MFG.) But in mega dealers where there are dozens of cars (of the same model) to choose from, you need something to push someone to make a purchase. And that push is typically haggling over the price.

People complain about car salesmen all the time. But the truth is, if you deal with consumers (or even businesses), you will find that people have no problem at all showrooming and wasting the time of a salesman and then buying elsewhere, even knowing that they will do that. Consumers (or purchasing agents) are no angels.
reminds me, car deals are the ones where both parties think they tricked the other in the negotiation. no other sales transaction leaves this kind of an impact on the sale
modems as well I think.
One of my favorite standards wars I studied briefly in school was Ethernet vs. Token Ring: http://www.quora.com/Why-di…
I remember that well. I was just starting out in VC at the time and reading all the computer trade publications. It was a big topic of discussion
As somebody that physically wired all of them, I can tell you wiring a token ring network would have your hands numb in less than an hour.

The old 10BASE2 Ethernet was so much easier, but if somebody knocked out a cable, everybody would be searching for the break point. 10BASE5 was better, but that thick cable and the vampire taps were bad. Finally we got 10BASE-T.

To your point about getting commoditized: I remember buying a 48-port 10Mb hub for nearly $42k (three 16-port units linked, at $14k apiece). A decade later I bought a 48-port switch for $300 that was 5,000 times faster, yes 500,000% faster, and got a Palm Pilot thrown in.
What is also interesting is the social protocol layer that resides with the apps that run on top of the HW & how similar it is: there is an agreed-upon 'handshake' (friend, colleague, follower, etc.) & then you are connected. break the handshake…
of the many protocols that are promulgated, the tech market still ends up following just a very few of them. For instance, of the prescribed 7-layered network protocols, it appears the internet flourished using just 2 of those. it would be riveting to see when all 7 will come into force. same with web architecture
Another classic case of an "easy to use and initially inferior technology" that evolved rapidly and inexorably winning against a "superior but p.i.t.a. technology". Bob Metcalfe is an academic EIR at U. Texas these days. You should come by Austin and hear it from the horse's mouth. @JLM:disqus will most likely also take you to Matt's El Rancho 😉
I would love to hear Metcalfe speak in Austin followed by an evening filled with tacos, bourbon, and 12-string bluegrass. That's my heaven.
software is a highly fragmented meritocracy. barriers to entry are low to non-existent.
Open source mindset
Permissionless innovation
It's easy for a thousand flowers to bloom.
**Survival of the Fittest.**
Not so with hardware.
Oligopolistic
(Historical) huge barriers to entry
Proprietary mindset
**Survival of the Richest**
How can a software company establish a protocol? Easy:
1/ Prioritise nailing the use-case over personal profit/glory
2/ See 1
"Prioritise nailing the use-case over personal profit/glory": wouldn't that be a contradiction of "Survival of the Richest"?
BitTorrent the protocol was established before BitTorrent the company.BitTorrent the company gets a ton of bad press for bad things third parties do with BitTorrent the protocol.
Another example of that is the use of the phrase "cybersquatter" to describe anybody who got there first and holds a domain name that you want. It extends well beyond anyone's reasonable trademark rights. It came to attention as a result of a clear case where someone registered a well-known trademarked name:

In December 1995, Panavision attempted to register a web site on the Internet with the domain name <panavision.com>. It could not do that, however, because Toeppen had already established a web site using Panavision's trademark as his domain name. Toeppen's web page for this site displayed photographs of the City of Pana, Illinois.

http://techlawjournal.com/c…

Also MTV vs. Curry: http://itlaw.wikia.com/wiki…
I remember falling in love with Gopher in ’92 and ’93 right before the Cambrian explosion of HTTP and Mosaic (which, in fairness, was actually already starting to happen). Classic example (Gopher) of a protocol that basically did and assumed too much (i.e., restricted too much) at too many levels of the OSI model.
"Mommy, where do protocols come from?"
"Well, Freddy, it starts with many attractive API endpoints flaunting their wares on the platform. Then a little embracing in some sectors. Which can lead to market penetration. In a fertile environment, and after an ensuing incubation period, the prevailing one emerges and is adopted."
.
Do you actually have a real job?
Funny.
JLM
www.themusingsofthebigredca…
Define “real”.
.
"Your Honor, at this time the plaintiff rests his case."
JLM
www.themusingsofthebigredca…
To quote one of my highly regarded former Gartner colleagues, Christine Hughes: "The great thing about open standards is there are so many of them." She was referring to the proliferation of COMPETING (software/protocol) standards being pressed hard during the early period of so-called "open systems" (late '80s / early '90s). Top-down, industry-driven efforts produced little lasting fruit back then, and the ones that have had the most success since are those developed well prior to a technology's market adoption. To challenge your premise, I would argue that even in software, over the last 30 years the "standards" driven by a dominant market-share leader are almost always the ones that last.

The most interesting exceptions to me are the bottoms-up, grassroots efforts that have emerged since that period, open source (and its offshoots) being perhaps the most impressive.
My understanding of this is a bit murky, but a usability issue I would like to address at Driver Stables is single sign-on for on-demand drivers who contract with several providers. You sign in once and then you are signed on to Lyft, Uber, Sidecar, etc., and then signed off once you accept a job from a particular provider. Is that the issue being addressed?
I'm also interested in Where Protocols Are Going. To what extent does knowing where they come from help us better understand where they are going?
it came from a supernova and is going to a black hole up above. we earthlings are the medium down here
.
Hardware protocols require the production of hardware, which means something has to be designed, tested, tooled, manufactured, sold, shipped, received, paid for, installed and used. To modify it requires the same process. There is a lot of inertia invested. Nonetheless, there are very subtle winds of change at work here.

As an example, look at the industrial design and footprint of phones and their size implications. It is a constant game of chess both externally and internally.

Software modification requires a guy in his boxers — or a girl in her PJs — and a laptop. The modification or customization or assimilation is instantaneous and therefore the benefits are also immediate. Very short product cycle.

There is an entire industry of folks doing nothing but customizing big software platforms (SAP) and assimilating plug-and-play modules.

JLM
www.themusingsofthebigredca…
nicely put. your knowledge of everything has no protocols
Jeff Bezos talks about web services as primitives. I like to think about protocols the same way. (drew this a year ago, but seemed relevant)
Protocols for email management software, please! Can’t your friends at ReturnPath just take over the world and simplify all this? Helping companies purchase their email software and related marketing tools has become such a headache — I can’t even count all the options anymore. They say the cost of running a business has gone down, but I’m not sure. It’s not cheap to purchase RelateIQ or Yesware at $50/month/person, Zapier at $50, Buffer at $50, AgoraPulse or SproutSocial or Hootsuite at $50, Clara Labs for meeting scheduling at $300, Mattermark for customer research at $400. Hubspot is $200 per month and I cannot understand what it is they offer even with my fricking honorary avc.com.PhD in inbound marketing… At least you can blog on Medium for free. Long Twitter because lightweight and classy!
This is the kind of thing that is not fun anymore: “Yes! Sniply integrates with 30+ popular platforms and apps, including Hootsuite, Buffer, Mailchimp, Zapier, and more.”
Rant over. Time for a long cool glass of water. Sun is out, it’s been a long winter since that giant snowfall over Thanksgiving.
Data standards are really where knowledge of the universe takes off, and we get away from a lot of the nonsensical noise. If we had open-source access to non-personalized demographic data, kept in the same format, then social science, nature vs. nurture, personalized medicine, and medical care would become precise science rather than good art. I'd say parallel attempts to sharpen our spoken and written words would help, though a partially standardized tagging system would almost have to be part of such a system.
Using William's format, I'd say:

Q. Why have hardware protocols been driven by dominant players but not software?
A. Lower barriers to entry (I wouldn't say software is a no-entry-fee space; education, marketing, time, and team have to be at a certain level). I'd say Android and iPhone, being platforms, are dominant software players, even if one of them is proprietary and bundled to hardware.

Q. What might it take for a software company to establish a protocol?
A. Cutting-edge functions building off of that protocol, tight integration, and easy transition from similar predecessors, plus a hell of an introduction.
Whether we're talking protocols or de facto standards, CGM was 'going to be the way' to describe graphics and ATM was going to kill TCP/IP. CGM turned out to be limited, and ATM was just too complicated and the costs prohibitive. Protocols are a definition; standards are the definitions that people decide they are going to use. Which protocols become accepted standards depends on a lot of things, including market clout. E.g., Google og: meta tags are standard, because if you don't implement them properly, more sites than just Google penalize you. I think that Liad got it right off the top: "Prioritize nailing the use-case over personal profit/glory." If it works, it works.
Well, I've been waiting for this discussion for a LONG time. Please have some patience 🙂 as I intend to address the three questions.

First, some background. Over a very long time, there was a huge change in the way technically oriented people contribute ideas on the Internet. At the beginning, the Internet was characterized by what I call the "Age of Protocols". Programmers and engineers would propose new ideas for infrastructure and applications by means of RFCs, often accompanied by reference code which implemented the ideas in the RFC.

Over time, the Internet became what is now a big money-making machine. I'm quite neutral in this respect; there's nothing wrong with it, it's just the way it is. But it changed the dynamics with regard to protocols. Instead of writing RFCs (which are mostly open, unless you're a big company to start with) and publishing open source code, more and more clever folks began implementing their ideas as apps, be it web apps, mobile apps, or whatever.

As an example, there's a stark difference between, let's say, Twitter versus email. Twitter is a web app, but 20 years ago it would probably have been a protocol, with a reference implementation that may or may not have been the dominant one, and several compatible and interoperable implementations. Some think the world would be a better place if that were still the case (I'm mostly neutral on that, perhaps a little bit on the side of protocols).

(BTW, and that's a side note, Jon Postel is for me one of the most underrated heroes of the Internet, and I truly hope his memory gets the recognition it deserves in the future.)

Now for the questions:

1) Why have hardware protocols been driven by dominant players but not software?

There are many reasons. First of all, hardware is easier to protect, and harder to document. Mass production lowers the cost of the implementation of hardware, which makes it easier for dominant players to enter the market. Hardware also requires more investment, and dominant players have a better chance of executing it correctly. Patent protection is easier to enforce. Hardware also requires way more investment to be tested correctly, to simulate all situations, and to trap and interpret signals. Software is way easier in this respect.

On the other hand, software protocols are fully documented in the RFC+code bundle, which allows compatible implementations to be written, tested, and evolved later. It also allows for healthy competition; good ideas received comments (that's what an RFC was for anyway) and were improved. The open source ethos prevailed and reference implementations accrued ideas from the community. That's how some of the more influential protocols grew to what they are today (but that's not the case anymore, in most situations).

2) What might it take for a software company to establish a protocol?

Protocols are intellectual property given away for free.
It only makes sense if the company has something to gain from the protocol while, at the same time, not being overly dependent on it. To put it another way, a protocol is something a company must be willing to give away for free; something that has more value the more people are using it, but that it does not absolutely need to control.

Big companies like Facebook, Apple and Google can work on protocols as side projects because they have more to gain while not giving away any of their core strategic advantages. Small companies have to balance what they gain by having more companies adopt their protocols, while making sure that their core business can survive even if tons of companies publish alternative implementations or extensions.

3) What conditions must be met to establish an Internet protocol?

First of all, a clear specification (if possible, something resembling the old RFC format, not the newer ones) and a reference implementation that can be easily compiled and installed anywhere.

Second, it must be something useful from the start. Like any MVP, it does not need to be complete, or high performance. It just has to be useful for something.

Third, it must come with a set of minimum guarantees: it must move forward (never remove functionality for strategic or legal reasons); it must be kept open; and it must be safe to use in the future by anyone interested. Incomplete specifications that withhold key functionality, or that leave open the risk that the sponsor may limit or revoke rights in the future, must be avoided at all costs.

Last, it must be something that allows an ecosystem to form where anyone can succeed. There's nothing wrong if the sponsor has something to gain from the RFC; however, it must be possible for anyone to gain from its adoption, to be part of the ecosystem, and to have palpable advantages from the adoption.

Conclusion

The key for protocols to succeed is their double nature: relevant for the sponsor, but still open and free enough for everyone to take advantage of. Protocols only thrive if there's an incentive for innovation around them. If your business needs to keep tight control of the protocol, then it's not a good idea to publish it anyway.
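As an illustration of the "clear specification plus reference implementation" point in the comment above, here is a hedged sketch of a hypothetical two-line protocol (call it PING/PONG, invented purely for this example, not drawn from the discussion) with a tiny reference server in Python. The idea is that a spec this small, plus code anyone can run, is already enough for a stranger to write an interoperable client.

```python
# Hypothetical "PING/PONG" protocol, invented for illustration only:
#   client sends:   PING <text>\n
#   server replies: PONG <text>\n
# Anyone could write an interoperable client or server from that two-line spec.
import socketserver

class PingPongHandler(socketserver.StreamRequestHandler):
    def handle(self):
        for raw in self.rfile:                      # one request per line
            line = raw.decode("utf-8").rstrip("\n")
            if line.startswith("PING "):
                reply = "PONG " + line[5:]
            else:
                reply = "ERR unknown command"
            self.wfile.write((reply + "\n").encode("utf-8"))

if __name__ == "__main__":
    # Reference implementation: trivial to run anywhere Python is installed.
    with socketserver.TCPServer(("127.0.0.1", 9099), PingPongHandler) as srv:
        srv.serve_forever()
```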
I’ve been chewing on this a bit in the healthcare space. I believe the big vendors could establish essentially a protocol or standard for health information and everyone would fall in line. Unfortunately, their current business model keeps them from doing so even though the opportunity is there.
Remember EDI (electronic data interchange) in the late 1980s 🙂
I am more familiar with the standards or protocols put in place by the W3C and other bodies around rendering in browsers. For a long time no progress was made in that area. The standard (which was not really a standard) was the dominant player, IE. Firefox came along, then Chrome, and allowed for many innovations (e.g., HTML5). Without getting into the details, you could say that Google, like Microsoft 10 years ago, is able to force the adoption of its standard, but broad adoption is key, and clearly open sourcing part of the software handling the mechanics of the protocol has been a key element of the success of both Firefox and Chrome.

In my view, here are the 3 important points when it comes to establishing a protocol (which can later become a standard):
- Innovation: to make people want to adopt the protocol
- Realistic: to be adopted, a protocol needs to provide a better solution for all parties
- Open sourced: to reduce the cost of adoption
There was a time when we defined our company's value proposition in terms of a universal protocol for expressing and aggregating people's opinions in any language. But nobody could understand what we were talking about. So we stopped.
My experience on this one is that software protocols are like the joke about industry "standards"… there are so many to choose from. As a result, what becomes the standard more often than not is the one that wins in the market.

Why is this? Because it's easier to ship the idea, fix and iterate, than to legislate the idea and wait for innovation to manifest. Speed and agility win hands down.

By contrast, hardware has historically required greater investment, planning and orchestration, making it more stack-dependent from the get-go. Moreover, the hardware guys have never liked speaking to the software guys, and vice versa, meaning that a negotiated protocol was the necessary bridge for each party to abstract its dependency on the other.

In other words, it's a different definition of the situation for hardware and software.