The Return Of The Command Line Interface
I learned to use computers in the era of the “command line interface”. It looked like this:
I started using computers in the era of mainframes and minicomputers, and my first desktop ran MS-DOS, which was a command-line-driven operating system. When Mac and Windows arrived, the command line more or less left my life, other than the occasional need to muck around in the internals of the computer and/or network.
But I feel the command line interface coming back, largely driven by text messaging and the increasing ability to leverage bots inside messengers. Check out this list of Telegram bots that have been written since the Telegram bot platform launched a few weeks ago. I’m preparing myself for the moment when I can order coffee this way:
@bluebottle /cortado /tostay
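For illustration, here is a minimal sketch of how a bot platform might parse a message like that. The `@bluebottle /cortado /tostay` example is from the post; the parsing convention itself is an assumption, not Telegram's actual bot API:

```python
def parse_bot_command(message: str) -> dict:
    """Parse a hypothetical '@botname /arg1 /arg2' style message into
    the bot's handle and its slash-prefixed arguments."""
    parts = message.split()
    if not parts or not parts[0].startswith("@"):
        raise ValueError("expected a message starting with @botname")
    bot = parts[0][1:]                                # strip the leading '@'
    args = [p[1:] for p in parts[1:] if p.startswith("/")]
    return {"bot": bot, "args": args}

order = parse_bot_command("@bluebottle /cortado /tostay")
# order == {"bot": "bluebottle", "args": ["cortado", "tostay"]}
```

A real messenger platform would route the parsed handle to a registered bot and hand it the arguments; the point is just how little syntax the "conversation" needs.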
You also see this in the 140-character constraint on Twitter. This tweet is closer to coding syntax than English:
.@jobsworth @rmchase #ResponsiveOS: The #OperatingModel That Is Eating The World /by @aarondignan https://t.co/vhAKp4RSEe #SDH #Platforms
— Alexander Ainslie (@AAinslie) September 20, 2015
And I used this string of code on duckduckgo just now to get the photo I put at the start of this post:
command line interface !gi
We are relearning how to code and send instructions in short bursts of information that are most certainly not conversational in nature. But now this sort of thing is done as much by teenagers weeks after getting their first smartphone as it is by engineers who live in command line interfaces all day long.
That’s a shift, maybe temporary while we wait for AI and speech recognition to improve, but an important one for now.
Comments (Archived):
Love the command line. People learn to use computers these days as program loaders, and the command line has them use computers as toolkits. Great to see the paradigm reborn. I try to model command line / OS as toolkit whenever I can. A couple of examples:
http://cestlaz.github.io/20…
and
http://cestlaz.github.io/20…
Hope everyone enjoys what looks to be a lovely Sunday.
What’s old is new and what’s new is old. Old man alert before my next comment: the biggest issue we see with abstractions (getting rid of the command line for another interface, getting rid of SQL code, etc.) is that they reduce the understanding of the underlying system behavior. Now this is great for the end user who doesn’t really want to know about that behavior. But for technologists, it really does matter at scale. I fully understand that more powerful computers/computing services mean that you don’t have to care, but eventually it does matter. That’s the biggest thing we have to teach young developers. That’s not to say they don’t teach us a ton as well. (Why diversity of all things is good.)
For technologists, what you say has still more importance. In simple terms, we want to build things we don’t already have; we build from raw materials, pieces, and tools that came before. So we need easy, effective ways to combine old tools and some new code to make new tools. So far, one of the better ways to do such things is command lines, and scripts that can drive both command lines and other scripts, while, yes, humans can drive scripts from command lines.
The Unix Pipe.
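The pipe idea, small tools composed into bigger ones, can be sketched in a few lines of Python. This is a toy illustration of the concept, not how shells actually implement pipes:

```python
# Each stage consumes a stream and yields a new one, the way
# `cat access.log | grep GET | wc -l` chains tools on the command line.
def lines(text):
    """Source stage: split a blob of text into a stream of lines."""
    yield from text.splitlines()

def grep(pattern, stream):
    """Filter stage: pass through only lines containing the pattern."""
    return (line for line in stream if pattern in line)

def count(stream):
    """Sink stage: consume the stream and report how many items flowed by."""
    return sum(1 for _ in stream)

log = "GET /home\nPOST /login\nGET /about\n"
hits = count(grep("GET", lines(log)))   # equivalent of: grep GET | wc -l
# hits == 2
```

Because each stage only knows about the stream it receives, any stage can be swapped or rearranged, which is exactly the toolkit quality the comment above describes.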
Getting rid of SQL code, etc., reduces the understanding of the underlying system behavior. Here is my “old man alert”: this is similar to what Amazon is doing with AWS. There is an entire generation of people being raised who understand the way Amazon does things and only know the naming that Amazon uses to do things. Someone wrote a great post about all of the Amazon services where they said things like “EC2 should have been called ‘Amazon VPS’” and so on for each and every Amazon service. This was obviously intentional on Amazon’s part (and a brilliant strategy, I must say). Heroku tried to do the same thing as well. To me this is a total (old man alert) “Danger, Will Robinson,” and I recognized the seductive nature of AWS right from the start. Once you go there, “now youse can’t leave.” Sure, you think you can leave, but you really can’t. Also, everybody assumes that Amazon will never raise prices; after all, they are always lowering prices. But once everyone is hooked, it’s entirely possible (even if unlikely) that they can either raise prices or tack on additional charges because of the obvious impossibility of customers (in certain price categories) reengineering their stack and using traditional methods.
We can talk offline about this. For transaction processing it is brutal. We run bare metal. For simple serving up web pages…..I don’t think so. Running your application…..another story.
If your competition is cutting corners, then you unfortunately need to cut corners. The end customer who doesn’t know about those corners is not worried about something that might happen. They are buying on price. I am not even sure that Volvo advertises safety anymore (all cars are safe enough). I was watching one of those great “Air Traffic Disasters” shows with my wife the other night. I said something like, “Some of this stuff is a result of deregulation and competition; if airlines made boatloads of money they wouldn’t have to shove jets out, work pilots so hard, and cut corners on maintenance.” Of course airline traffic is much safer than it was before de-reg. But anyone who thinks that the lack of profitability, with airlines losing money for years (they are finally making money now), didn’t result in some form of deferred maintenance or corner cutting is kidding themselves. I remember what the phone company trucks used to look like prior to de-reg in that industry. (The most perfect repair trucks in the world; they must have had departments dedicated to keeping trucks looking auto-show perfect.) I remember the tree crews of power companies and how they had a much more proactive program of tree pruning. Doesn’t happen now, at least in the state that I am in. Every storm, there is some kind of power burp because of trees that have hit a line.
There is a happy medium. When it comes to airlines, no; but as an instrument pilot who sits in the back 200k miles a year, and flew Air Malaysia three times last year, they really are much better. We call it farming chickens versus raising kittens. That is why Google killed AltaVista. I remember seeing Google servers on cork boards and AltaVista’s beautiful EMC and Sun machines. If you have one kitten (a really expensive machine that you spend a ton of money to keep going), you give it a name and spend a ton of money on care and feeding, and another kitten costs you tons of money. If you farm chickens and you need more, who cares? One gets sick? Throw it away. I know I have stolen this from somebody but I don’t remember who.
Just heard a Volvo radio ad yesterday, don’t remember the model but remember “The safest car we’ve ever made.”.
Getting rid of the command line for another interface, getting rid of SQL code, etc., reduces the understanding of the underlying system behavior. The other big advantage of understanding the building blocks is that it allows you to be creative, to think, and to be able to do things that you would never be able to do if you didn’t fully understand all or most of the parts. I am reminded of back in the day when a guy who worked for me wanted to know if a certain task or feature would be easy or hard to do when modifying our estimating or job management system (which I wrote, if you want to call it that). Based on his question I could automatically answer (because I had built it and understood all of the parts) “yeah, no problem to do that” or “nope, not easily possible,” where “easily” is a general feeling that you have because you know what is going on and what is involved. (Similar to construction, right?) So he couldn’t be as creative, simply because each and every time he had to run his ideas by me, his brain could not go free and iterate ideas and solutions. If I had an idea I could just do the idea. I could be fully creative. No restrictions. Difference between “small tweak” and “major rewrite.” Seat-of-the-pants feel. Instant reaction. We ran into this same thing when renovating the bathroom. Can you move the toilet? Can you move the shower? How hard is it relative to practical cost (cost not being a factor, obviously anything can be done)? Over a foot, yes; over to the closet, “need to see if we can tap into a drain…” and so on. Often, to create, you need to know enough to confine yourself to what is reasonably practical, and that is a great deal easier if you understand and are familiar with all of the tools and underlying parts. (This, by the way, also holds for deal making, strategy, and a host of other things, I have found.)
Very well put, totally agree.
“Never use a tool you couldn’t write yourself.” A line I stole years ago from a friend of mine. Maybe a little too strong, but the sentiment’s there. It’s something we really try to emphasize with our kids: they don’t have to write everything, but they should have some understanding of what’s under the hood. You’re so right about “at scale.”
Totally agree. The corollary is if somebody else has written it and you can legally use it or buy it, do not build it yourself.I always say if we are not spending thousands of dollars a month on software we are wasting money.
Right; younger developers (or, more likely, a company trying to sell you a framework) will say, “Just use the framework! You don’t have to worry about any low-level details.” Until the framework doesn’t work, and then you have to tediously trace through 15 layers of abstraction, inversion of control, and generated code to find out what the computer is actually doing. “You can solve every problem with another level of indirection, except for the problem of too many levels of indirection.” (Someone smart; hard to tell who said it first.) On the other hand, no one is actually going to get anything done writing in pure assembly, so some level of hand-waving “someone else figured this part out” is necessary to ship usable applications. It’s a balance. One thing that helps for debugging is strong typing. The more the code can be analyzed at compile time rather than run time, the better.
Mike – Any need at CSNYC for old vintage SGI Irix Indy Computers and monitors?
I remember being blown away by a flight simulator demo on an SGI box back in the day. I’m not part of CSNYC so I can’t speak for them. I love old stuff but have no space at Stuy, and there are probably more needy schools.
Oh, that is a lovely kitten.
That’s how blockchain programming started, and where it still is in many cases. The example below is in the style of the command interface of Ethereum’s Go client.
@yourwallet/$3.50/#charged
Joel Monegro wrote almost this exact bot on Telegram, but it’s Bitcoin to prepaid calling card top-off: http://telegram.me/BitMinut…
To this day, the terminal is the most powerful way to control a Mac. Bring back the vi editor too. 1969 technology, yet still my favorite power tool because I learned it at such a formative time. Like the ladies at the Chinese markets who can use an abacus faster than others can rock a calculator.
Vi? Too advanced! Why not just “ed”? Edit: with “ed”, no need to even set the terminal type…
just letting you know, there continues to be a bug – can’t see/make comments when viewing AVC using Chrome. using latest version of chrome – Version 45.0.2454.93 (64-bit)
on what operating system?
Latest MacOS
Hmm. I use that exact same configuration on multiple machines and have not had this issue. I will double check and try to replicate the error
I’m on that exact configuration always. No issues here.
all set. see above. thanks for helping.
OSX Yosemite. Version 10.10.5. Older iMac circa mid 2010…
same here. it works. check your Java update? re-load the latest. sounds like your java might be corrupted.
figured it out – see above. thanks for helping
a ha. the issue is resolved – i use an EFF (Electronic Frontier Foundation) browser extension/plugin called “privacy badger” which blocked disqus and disquscdn. i disabled privacy badger for avc.com and disqus comments reappeared. i dont know how popular privacy badger is, but fred, you might let disqus know they should reach out to the EFF and get on their whitelist.
Clear cookies, clear cache, relogin to disqus.
“A future OS will interact with me through a repetitive cycle of request/response similar to twitter/SMS/IM interactions that can be managed in terms of priority, visibility, etc. and that can also be channelled through different media (SMS, twitter, IM, whatever). It will be the return of the command line, also a current trend.”
8 years ago: http://mvalente.eu/2007/10/…
you got that right
One of many…
Serverside Javascript – Hope and Opportunity: http://www.slideshare.net/m…
GTUG – JS will save us all: http://www.slideshare.net/m…
I remember DOS in high school computer class. You certainly have a point. To the chagrin of the language-arts literati, I suspect we are moving towards a more codified written language altogether, and modern syntax/semantics will go the way of Shakespeare.
“The chicken is only an egg’s way of making another egg” – Dawkins. Evolution is a beautiful thing.
I am a CLN (command line native) from the Commodore 64. Interesting CLI-type thing I ran into yesterday: I was trying to send a tweet that started with a capital M only (i.e., “M” followed by a space). You can’t. As it turns out, that’s a shortcut way to tell the Twitter API you are writing a DM, not a public tweet.
Agree and disagree with this: “Command line interface coming back, largely driven by text messaging and the increasing ability to leverage bots inside messengers.” To me, the command line is useful for three things: (1.) installing library packages with tools like Homebrew; (2.) Git-related commands for changing files, merging, and updating; (3.) controlling IoT devices. Compare how dealing with AWS on the command line contrasts with the more visual dashboard approach, and we can see why visual dashboards are much, much more developer-friendly. Compare also code vs. the visual window in Xcode.
The difference may very well be whether you are doing a particular activity one time or the same activity many times over. If it’s something you are doing many times over, from my perspective the command line is much quicker than a GUI. Once you know and are familiar with the answers to the questions, it is typically easier to go command line.
I agree. Repetitious coding is best done at the command line. My earliest forays into coding as a kid were MS-DOS and Pascal. Nowadays, I have to say I prefer a GUI, although periodically I FORCE myself to code “old school” (SQL instead of AWS, C++ instead of Swift, etc.) because “old school” trains us to be better at hunting down those syntax bugs and figuring out the mathematical functions that may have gone awry (hence the program crashes/doesn’t run).
Reading the post whilst waiting for the guy to make my cortado. Read about cortado.Was perfect timing.
Karma
Hm. serendipity?
There’s at least a 5-year window before the AI is broadly good enough, and the command line is very easy to understand and learn. Slack is another channel that will do a lot to introduce the command line to a larger audience, and it looks like they see it as pretty core/foundational for their product going forward. https://mobile.twitter.com/… Part of what’s great about it is that when using it, it’s a subconscious reminder that you are NOT dealing with a human, which helps us cognitively manage the uncanny valley. This is critical until conversational AI truly becomes good enough.
Worth noting that Jerry Neumann was very early to this trend. His site, which features a command line interface, is the best I’ve seen in a very long time:www.neuvc.com
Re: “That’s a shift, maybe temporary while we wait for AI and speech recognition to improve…”
For this to happen would require a SEISMIC INNOVATION, like Blockchain+Bitcoin, in Natural Language understanding. A few things to consider:
(1.) TC article, Aug 2015: “By studying speech patterns, facial expressions, body gestures and physiological reactions to specific stimuli, researchers hope to amass a database of emotions they can train computers to recognize and interact with. The key challenge here is to establish a standard for what is definitively ‘happy,’ ‘sad,’ ‘angry’ or another state, because right now, many apps and devices that claim to read emotions aren’t drawing from one definitive standard.”
* http://techcrunch.com/2015/…
(2.) Fei-Fei Li, Director of Stanford AI Lab, May 2015: “We are very, very far from an intelligent system, not only the sensory intelligence, but cognition, reasoning, emotion, compassion, empathy. That whole full spectrum, we’re nowhere near that.”
* http://www.pbs.org/newshour…
(3.) Geoff Hinton, Google’s “Father of Deep Learning”, Oct 2014: “IF the computers could understand what we’re saying… We need a far more sophisticated language understanding model that understands what the sentence means. And we’re still a very long way from having that.”
* https://www.youtube.com/wat…
(4.) Artificial Stupidity:
* https://www.newscientist.co…
* https://nplusonemag.com/iss…
(5.) Art + Science of Building an AI:
* https://www.startupgrind.co…
Arguably many humans who claim to read emotions aren’t doing so from one definitive standard – a stereotypical Russian will likely express happiness rather differently from a stereotypical Pirahã, say, though hopefully both actually feel it from time to time.So, having a unified standard for emotions for computers might not require they get it right 100% of the time – we certainly don’t. It may come down to (one of) the human cognition method(s) of distance-from-examples: “how similar is this face to the stereotypical happy face I have in memory?”
Oh, there’s already methodology from 1970s academic theories on:
(1.) Visual recognition of emotions, by Dr. Paul Ekman, which is now applied in AI.
* https://youtu.be/R6galodflT…
(2.) Behavioral heuristics, by Daniel Kahneman & Amos Tversky, which are applied in A/B testing.
Of course there are cultural differences. What a handful of US academics think is the answer may not be what the hundreds of millions of users in their own culture think is the answer, much less people belonging to another culture altogether.
Hence why some social networks that work in the US don’t necessarily translate to Europe, India, China, Brazil, etc.
The fractal/multidimensional nature of culture (and subculture) demands a fair amount of flexibility, agreed – and its diversity can’t be explained by geography alone. Especially now that the internet is involved. http://slatestarcodex.com/2… is a good (albeit tangential) discussion. The US might have some claim to a leg up on understanding this diversity – we do have rather a lot of immigrants from a lot of places, after all… but just because there’s a lot of detail in one part of the picture doesn’t mean the rest shouldn’t be explored as well.
Thanks, certainly is as complex as fractals and then some.Haha.
The best place to dive into is the parameters of audio regarding pitch and strength measured via volume. I’m going to work on that line when I get caught up. Remember, a blind person is able to tell emotions, just as the deaf can.
I think I’ve shared this link before:
* https://medium.com/s-c-a-l-…
There are all sorts of homophones in Chinese which don’t exist in English, so the Anglo-Saxon approach to speech recognition wrt pitch, strength + volume doesn’t map over and apply. We’re not talking about the Latin accents of aigu, grave, circonflexe, tréma and cédille either. Léger, très, tête, Noël and français all have clearly different spellings using different letters of the alphabet. Not so in Chinese. “Mother”, “paternal grandmother”, “horse”, “hemp”, “road”, “measles”, “troubled confusion”, “sesame”, “a rhetorical indicator” and other words ARE ALL SPELT AS “MA” but pronounced according to a tonal scale for meaning and understanding. Chinese AI researchers have a VERY different frame of reference for Natural Language understanding from Western AI researchers because of their respective language heritages.
Both my daughters speak Chinese, one really well. I was shocked when I first found out what you are talking about.
One of the jokes I’ve told since I was a child is how precise the Chinese baby has to be or we end up calling our mothers “horses”.If Natural Language in machine intelligence is to be solved, it’s likely it’ll be by engineers who know English, Chinese and the Latin languages well alongside the arts, economics and sciences.Noam Chomsky’s generative syntax and Marvin Minsky’s symbolic structures are grounded in English and have been the basis of Natural Language Processing since the 1950s.It’s entirely possible there’ll be a new system in due course which replaces those legacy models.
That sounds almost like a ‘help wanted’ ad – you hiring? ;) (Caveat: as previously stated, I’m one of the monoglots here, so you’d have to look elsewhere for the intuitive rather than the intellectual language experience.)
“Necessity is the MOTHER of invention,” the saying goes. The engineer-inventor(s) are likely to be female because of a natural advantage in intuition, which incidentally has never been factored into the command line. Ah, all the functional languages that’ll need to be rewritten… Haha.
Feminist Bias !Everything of matter in this world has been so far invented by men. Not with lack of intuition either.
Firstly, “Hi” and thanks for commenting. On yesterday’s post, regular commenters at the AVC bar were just saying how the opinions of folks who’re here but haven’t commented much yet would be most welcome (@fredwilson:disqus).
Secondly, I’m not a feminist; I’m a humanist. My personal avatar is a Yin Yang for a reason. It represents head+heart+soul, art+science, East+West, male+female, differentiation+integration.
Thirdly, re. your comment, “Everything of matter in this world has so far been invented by men”, computing history is this:
1830s: world’s first computer = Ada Lovelace + Charles Babbage
1941: Wi-Fi, Bluetooth + CDMA = Hedy Lamarr + George Antheil
1944: Harvard Mark I computer = Admiral Grace Murray Hopper + Howard Aiken
1959: COBOL language = Admiral Grace Murray Hopper + UNIVAC team
1969: Apollo 11 moon landing = Margaret Hamilton + NASA team
1971: Computer telephony switching = Dr. Erna Schneider Hoover + Bell Labs team
1972: Smalltalk language (later Squeak) = Adele Goldberg + Alan Kay + Xerox PARC team
1974: CLU language = Barbara Liskov + MIT team
1985: Spanning Tree Protocol (STP) for the Internet = Radia Perlman + Digital Equipment Corp team
1990s: ATMs & distributed transaction processing = Dr. Mandy Chessell + IBM team
1995: Affective Computing = Rosalind Picard + MIT team
2015: Virtual Reality = Mary Lou Jepsen + Facebook/Oculus team
Beyond computing, here are some other notable inventions by women:
* http://mentalfloss.com/arti…
If we look at the discovery of the Americas, it happened under the patronage of Queen Isabella, whilst the Industrial Revolution happened under the financial patronage of Queen Victoria.
In the sciences, DNA’s structure was separately discovered by Rosalind Franklin and Crick & Watson:
* http://www.theguardian.com/…
Academics are also debating whether Einstein’s Theory of Relativity is also the work of his mathematician and physicist wife, Mileva Einstein-Marić:
* http://www.technologyreview…
* http://www.nytimes.com/1990…
Specifically in relation to command lines and where AI and speech recognition need to go, it’s actually about whether Mathematics as a language is capable of reflecting and expressing emotions and nuances in the same way as Natural Language. “The Y has a mere 100 or so genes, and there is no evidence that any of them are linked to cognition. This contrasts sharply with the 1,200-odd genes on the X chromosome. There is mounting evidence that at least 150 of these genes are linked to intelligence, and there is definite evidence that verbal IQ is X-linked.” (https://www.psychologytoday…)
Since AI is increasingly reliant on Neuroscience and DNA research to inform how systems are modeled (we borrowed Neural Nets and parallel processing principles from those fields), the research on how the X chromosome affects intelligence, speech recognition and natural language will, in due course, inform the next wave of technology innovation. We can all look forward to systems that are more X+Y and intelligent.
Greetings! What you want to do is cherry-pick falsehoods to prove your set outcome; that is belief, not science. You are a feminist, not a humanist. I am a humanist and I enjoy the differences between men & women. I don’t try to make them equal or say that one side has something the other does not (metaphysically speaking).
1936: First Computer – Konrad Zuse
http://inventors.about.com/…
http://people.idsia.ch/~jue…
I am not going to even try to go down the list, as they are all false lies. Self-serving pseudo-intellectuals such as yourself. Yin Yang is about positive & negative, light & dark, but you managed to distort that too, to match the meaning of whatever you want. http://dictionary.reference… Congratulations for being a believer in government (NAZA) and their FBI program of divide & rule, divide & conquer by gender. https://uploads.disquscdn.c…
It’s unfortunate you self-identify as a “humanist” (the definition is someone who values the intelligence, contributions and characteristics of men and women) yet you insist that everything that matters has been exclusively invented by men. In this position, you’re contradicting what you wrote yourself: “I don’t try to make them equal or say that one side has something the other does not (metaphysically speaking).” It’s even more unfortunate that, when presented with clear historical examples of the brilliance that happens when men and women work together, you call those achievements that advanced Humankind “falsehoods”. As for Yin and Yang, its associations with male and female energies being distinctively different, yet unified to work in a virtuous complete circle, have been known since the I Ching was published before 700 BC. Therefore, it’s clear I’m not a believer of anything you accuse me of or that you believe. Respectfully, you’re entitled to your opinions, but I won’t be engaging further with anyone who dismisses the male and female partnership examples I provided as “false lies” like you do. That is hugely disrespectful to those men and women and inconsiderately offensive towards me.
I don’t value hijackers, pseudo-intellectuals & self-righteous liars.
Yes, I remember now, and your comment below is correct, looking at the long-ball setup.
Thanks, Dave. Confucius said, “Study the past if you’d divine the future,” and to see the long ball set up we’d need to stand on the shoulders of giants and see what they see and beyond.
Who defines the syntax? This would seem to gain more adoption if each service didn’t require a unique way of expressing actions, or if one service just became the dominant executor of code-like commands.
The successful services adopt the current dominant syntax and then add to it. I immediately knew how to use Slack, for example. Or are you thinking primarily of services like Telegram, IFTTT?
I guess I’m kind of wondering if a new service could be developed that lets you connect a bunch of accounts, has a way to notify services or businesses that don’t have an authentication service, and adopts the syntax that Fred used: @{service} {action} {item} /{characteristic},{characteristic}
Like:
@amazon buy 6 foot extension cord /best rating under $10
@uber pickup uberx /240 Broadway NYC
@cafehimalaya deliver chili tofu /extra spicy,240 Broadway NYC
Kind of like a more syntactic Magic.
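A toy parser for that proposed syntax might look like the sketch below. The `@{service} {action} {item} /{characteristic},{characteristic}` grammar and the example commands come from the comment; the function and regex are my own assumption, not any real service’s API:

```python
import re

def parse_command(cmd: str) -> dict:
    """Parse the hypothetical '@{service} {action} {item} /{char},{char}'
    syntax: service and action are single tokens, the item runs up to an
    optional '/', and characteristics are comma-separated after it."""
    match = re.match(r"@(\S+)\s+(\S+)\s+([^/]*?)\s*(?:/(.*))?$", cmd)
    if not match:
        raise ValueError(f"unparseable command: {cmd!r}")
    service, action, item, chars = match.groups()
    return {
        "service": service,
        "action": action,
        "item": item,
        "characteristics": chars.split(",") if chars else [],
    }

req = parse_command("@uber pickup uberx /240 Broadway NYC")
# req["service"] == "uber", req["characteristics"] == ["240 Broadway NYC"]
```

A dispatcher answering the “who defines the syntax?” question above would then route `req["service"]` to the right backend, which is exactly the kind of shared grammar the comment is wishing for.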
A command line with speech AI, according to the team that brought us Siri. On the surface, it looks like it’ll improve AI and speech recognition. However, there are still missing keys and frameworks which go deeper into issues of semantic syntax and data classification itself. We may find we need to re-index every item of data since we first started documenting our experiences.
Definitely the semantic mismatches along the way are a huge hurdle. But it’s a nice vision.
Haha, “huge hurdle” is an understatement. Consider that the W3C would need to completely revise its semantic structures, as would EVERY SINGLE lexical database, from the ones IBM has been building over the last 40-50 years to get to IBM Watson, to the ones Google has built over the last 10 years to get to Google Brain.
Thanks for the memory, Fred. I started with Algol and Fortran, so MS-DOS, CP/M and the DEC command lines were computing for me for quite some time. You are right, though: it can be efficient and powerful, and it actually gets you closer to what is really happening. More reasons for people to learn to code.
The command line does have a few things going for it – a slash or other escape key at the beginning is hard to beat speedwise by any vocalized command phrase, and there’s little-to-no danger of accidentally switching into a command mid-sentence. For those of us whose memory is more visually inclined, it also gives us a reference to what we just ‘said’.So I don’t really see the command line ever vanishing entirely – just, perhaps, becoming a niche among many.
As someone who learned to stop worrying and love the command line this year, this sounds great to me. TIL about Telegram here @avc :)I agree with the evolution comment. For our species to move towards a more efficient way of communicating is a step forward. For those who worry about losing the art of language, I would say to think of things like Telegram as akin to the chime in the elevator that lets you know you’re at your floor. It will never replace Bowie.
I 100% agree that bots are the in-between step to AI. You can see where teams of people are going to build a canon of tasks that an AI can quickly reference. The laundry list of Telegram bots above seems like early websites or wiki pages that an AI will reference as they get better organized and defined by someone who builds a Yahoo/Google of bots. The steps for “fetch me cat pictures” or “fetch me coffee” are already defined, and the AI will know if Telegram, Slack, etc. is your preferred delivery mechanism. Ben Brown turned me on to these ideas and wrote a great post about it here: https://medium.com/why-not/…
I’d love for Bots to return to the scene. I remember in the early days of e-commerce we had shopping bots. The best one was Junglee, and it was so good that Amazon bought it and killed it.The closest I can think of today is IFTTT. Some of their recipes are pretty useful.
Yes. Their best recipes are the ones that make cute little databases right in your Google Sheets!
Ah, I don’t know that one. Have you heard of https://www.blockspring.com/, related to spreadsheet tricks?
Looks like the same great concept. IFTTT (or more precisely, their Do Button app) is pretty buggy lately on my phone, so I’ll try Blockspring.
When Mac and Windows arrived, the command line more or less left my life, other than the occasional need to muck around in the internals of the computer and/or network.
The command line never left my life (specifically Unix, since 1986 when I bought a multiuser, multi-terminal system for my business at the time), and I still use it every day (on both Macs and Linux systems) and find it the most efficient (and fun, for that matter) way to get many of the things I need to do done. With the command line (and shell programming) I can automate many tasks and create menus of commands. I can arrange, rearrange, and combine to my heart’s content. I can’t even begin to get into all of the ways this is helpful. It would be like describing why I like to be able to walk or use my hands. It’s an essential part of my life. I actually don’t like GUIs. I don’t like having to take my hands from the keyboard (I am a good typist) and use a mouse to select a box or tab to a field. Way quicker to do all of this solely by keyboard.
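The “menus of commands” idea in the comment above can be sketched in a few lines. The menu entries here are hypothetical examples I chose for illustration; a real menu would hold whatever one-liners you repeat:

```python
import subprocess

# Map short mnemonic names onto the shell one-liners you run all the time,
# so a repeated task becomes a couple of keystrokes. These entries are
# just example commands, not anything from the original comment.
MENU = {
    "disk": "df -h",                             # free disk space, human readable
    "hello": "echo hello from the command line", # trivial demo entry
}

def run(choice: str) -> int:
    """Look up a menu entry and hand it to the shell; return its exit code."""
    cmd = MENU[choice]
    return subprocess.call(cmd, shell=True)

status = run("hello")   # prints the greeting via the shell
```

Wrapping the dictionary in a small loop that prints the choices and reads one gives you exactly the kind of personal command menu the comment describes.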
I don’t know who is first with these thoughts here at AVC, but maybe the first should sue the second for plagiarism? :-)!
Or maybe JLM would say “your generation didn’t invent the command line”.
I agree with you more than you agree with your Unix system.
you need a thinkpad
ThinkPads went downhill after IBM sold the line to Lenovo.
But now this sort of thing is done as much by teenagers

Both of my stepkids use the command line when playing with Minecraft. And they obviously use the command line when starting out in programming Java, which was driven primarily by their interest in Minecraft (ages 13 and 11, iirc).
This tweet is closer to coding syntax than english language:

Close but no cigar. Coding syntax obviously needs to be precise or the program halts, crashes, and so on. A tweet is analog; coding is digital. Analog syntax can be imprecise because a human is interpreting it, filling in the blanks or correcting it. Code is interpreted literally. One of the fun things about programming (Falicon might confirm this) is hunting down the syntax that is causing the crash or the wrong result. I don’t golf, but my guess is that it is the equivalent of a golfer having to putt the ball into a hole a few feet away: something that, when you are experienced, you approach confidently, knowing you will get the job done but still getting satisfaction from getting it done. And you can never do it so many times that you tire of it.
Congrats, Fred: As you set aside the Alan Kay Xerox PARC graphical user interface (GUI) religion, Ben Obi-Wan Kenobi says: “That’s good. You have taken your first step into a larger world.” Of course, Alan Kay might rise up and smite thee with a terrible, swift light saber for being a heretical infidel to the GUI order!

But you are now leaning toward my remarks on command lines in http://avc.com/2015/09/what…

Next you will discover, and may I have the envelope, please, drum roll, thank you, right, and the winner is: scripts in an interpretive language to issue command lines! And such scripts can also do nice things with, right, the old but still useful environment variables!
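The pattern the comment is driving at, a script that issues command lines, with an environment variable as memory between steps, looks roughly like this in Bourne shell (directory and file names are made up for the sketch):

```shell
# Sketch: a script that issues a series of command lines, with an
# environment variable carrying state between steps. Names are made up.
WORKDIR="${WORKDIR:-/tmp/cli-demo}"   # environment variable as shared memory
export WORKDIR
mkdir -p "$WORKDIR"                   # each line below is itself a command line
for name in notes todo; do
    echo "draft $name" > "$WORKDIR/$name.txt"
done
ls "$WORKDIR"
```

The same variable is visible to every child process the script launches, which is exactly the "memory between operations" being described.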
Just opened a terminal / command prompt window on my computer today to get some quick information. I must admit it gave me a sense of pride that I at least once knew how to do things that way. #SecretCharmOfThings
I noticed I developed a similar kind of behavior/coding when I worked with a large, dispersed team in a completely email environment–no Slack channels at this company. Most of my emails were two-sentence transactions: “First sentence for you,” I muttered to myself 90 times a day, “Second sentence for me.”
And a CLI isn’t incompatible with a structured interface… as you’re typing /cortado…, it prompts you inline for small/large, or it simply inserts “large” because that’s what you always buy and sets a request time for 10 minutes from now because (a) it knows a time is required and (b) that’s what you typically ask for. The simplest example of this is auto-completing names on Facebook. But add a bit of semantics to the top 1,000 commands and you can do a ton.

You now have the basis for an “intent network”: your commands or requests are pushed out into the cloud in a structured way that can be subscribed to by counter-parties who can do almost anything (give you 50% off your next coffee at Ritual, right next door to Blue Bottle; pick your coffee up and deliver it for $1; etc.).

It’s like autocomplete for the world.
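The inline prompting described here is, at its kernel, command completion. A minimal, dependency-free sketch in shell, with an illustrative coffee vocabulary standing in for the "top 1,000 commands":

```shell
# Given a prefix, print the commands from a small vocabulary that
# match it: the kernel of autocomplete. The word list is illustrative.
complete_cmd() {
    prefix=$1
    for word in cortado latte tostay togo; do
        case "$word" in
            "$prefix"*) echo "$word" ;;
        esac
    done
}

complete_cmd to    # prints tostay and togo, one per line
```

Layer stored preferences on top (default size, usual pickup time) and you have the start of the structured-intent idea the comment sketches.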
This is also my fantasy for Twitter. Imagine Twitter as a universal pubsub engine based on machine-assisted structured tweet creation.
Dunno about coffee, but command-line pizza has been possible since 2004, via a Perl utility called pizza_party:
https://github.com/coryarca…

Major bug is that it orders from Dominos 🙂
I always thought SMS could be cloud based command line with its own protocol without requiring an app.
I wish coding were as easy as hashtagging. I guess Fred would also consider vBulletin markup to be coding.
Adjective-oriented programming (or hashtag-oriented programming, if you prefer) seems to make sense only for larger projects, where the filtering becomes necessary.

Not actually sure there are any coding languages that support adjective use like that yet. In the meantime I suppose Apps Hungarian notation can substitute, as per http://www.joelonsoftware.c…
I think the legacy of the command line is the “/” key, which is increasingly familiar to “normals” (thanks, Slack!).

The first time I started messing around in the command line I felt this thrilling sense of control, and I think that feeling persists as “/” shows up more and more in app interfaces.

Slash commands kind of feel like super powers; when harnessed properly, that’s a really awesome feeling to give your user.
One reason to be bullish on this interaction model is that keyboards are software now. One of the more brilliant parts of Telegram’s bot platform is that you can also set the keyboard to be a set of simple options that you tap as you would a button. This alleviates some of the unforgiving nature of command line input: get one character wrong and the system doesn’t understand.
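For reference, the tappable keyboard described here is the Bot API's ReplyKeyboardMarkup object, a piece of JSON sent alongside a message. A sketch follows; the bot token, chat id, and button labels are placeholders, and the curl call is shown commented out rather than executed:

```shell
# Build a Telegram ReplyKeyboardMarkup payload: the tappable keyboard
# of preset options the comment describes. Button labels are placeholders.
KEYBOARD='{"keyboard":[["Cortado","Latte"],["Drip","Cold brew"]],"resize_keyboard":true,"one_time_keyboard":true}'

# The payload would be sent with the sendMessage method, roughly:
# curl -s "https://api.telegram.org/bot$BOT_TOKEN/sendMessage" \
#      -d chat_id="$CHAT_ID" \
#      -d text="What can I get you?" \
#      --data-urlencode reply_markup="$KEYBOARD"

echo "$KEYBOARD"
```

Because the user can only tap one of the offered strings, the bot never sees the one-character-off input that makes raw command lines unforgiving.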
Hmmm, allow me to play devil’s advocate… I think this is great and it is important for a subset of technologists, but from a fundamental perspective the command line interface is an inefficient hack necessitated by limited ways of getting things accomplished.

There is always a tradeoff between universal utility and specialized economy of interaction. Deep knowledge of command line syntax and functionality gives a user a much greater ability to accomplish various tasks. Yet this ignores the opportunity cost of acquiring the knowledge necessary to become an expert in any interface rather than acquiring higher-level knowledge.

A basic proficiency in any underlying process is necessary to be an expert in any field. Yet more abstracted forms of interaction allow one to devote more time to building upon what has come before. There is always a need for experts at every level of knowledge to further refine every mode of interaction. Language interfaces will always be more easily learned than other forms of knowledge because of the natural ability of the human brain to process language.

The fetishization of any type of technology, or any aspect of it, is counterproductive to progress. I know what a transistor is, but a set of them as a logic gate is beyond my conception. I understand that any interface is built of more primitive forms. Machine code becomes assembly becomes languages that define rules and structures that interpret simple text. Each abstraction restricts universality for speed and simplicity. Each level of abstraction allows another level of development that further increases the ability to accomplish things. A computer or a car or an airplane or any complex technology is built upon millions and billions of years of learning, just counting the individual teams of experts required to accomplish each task.

The return of the command line is just a momentary stop along the progression of technology.
As previous forms of interaction are found insufficient to accomplish a task, the interaction model is simplified to the point of being sufficiently universal to accomplish a task previously impossible or very difficult.

As simplified modes of interaction make certain sub-tasks easier, some people (old people) will often be frustrated by the loss of functionality the new interaction imposes as a side effect of simplification. Those people have already invested the time in learning the technology in a way that makes it useful. What they miss is that the abstraction allows many more people to accomplish tasks previously beyond them: users who were excluded by the inability to devote the necessary time and effort to mastering the technology.

The command line interface will one day disappear as easier forms of interaction replace it. At least until some new goal, impossible to accomplish with other methods, necessitates reversion to a more primitive interface.

The command line revival is fundamentally about the current limitations of technology and the infrastructure behind that technology. Natural language processing and voice control are still in their infancy, and the text-based nature of communication makes command line type interfaces the most capable universal interaction model available to many new technologies. This is not some fundamental permanent state of the world but a temporary moment in the history of human development.
My wife recently dragged 5,000 images to her trash, but released the mouse button too soon and they ended up on her Desktop. This caused her session to hang even after a reboot. I sshed into her box and did a sudo rm of those images and in less than a second her computer freed up. Glad I never abandoned my Solaris environment and working on the command line.
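The rescue described above boils down to one command. Here is a reproducible sketch against a throwaway directory; the real incident used ssh and sudo, which are omitted, and the paths are stand-ins:

```shell
# Simulate a Desktop littered with stray images, then clear it with a
# single command. The directory is a stand-in for the real ~/Desktop.
DESK=/tmp/desktop-demo
mkdir -p "$DESK"
for i in 1 2 3 4 5; do : > "$DESK/image$i.jpg"; done

# One command, no GUI hang (-delete is supported by GNU and BSD find):
find "$DESK" -maxdepth 1 -name '*.jpg' -delete
```

The GUI has to enumerate, render, and animate thousands of icons; find just walks the directory and unlinks, which is why the shell finishes in a blink while the desktop hangs.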
Heroic ;). You can do some damage w/ the rm command too heh. Cool story.
Wow, Fred, slightly unexpected post from you (not sure why I think so). But a good one. I’m glad you wrote it. A lot more people could benefit from knowing more about the power, utility (pun intended) and productivity of command line tools. I’ll add a few cents to the discussion:

In the Beginning was the Command Line:
http://www.nealstephenson.c…
(post went viral)
https://en.wikipedia.org/wi…
https://en.wikipedia.org/wi…

For those interested in learning the principles of writing (Linux/Unix *) command line tools (in C) (instead of just using them), one guide is this article by me, written for IBM developerWorks: Developing a Linux command-line utility. Get it (as a PDF) via this post (follow relevant links):
http://jugad2.blogspot.in/2…

* Unix tools work on Mac OS X too, because it is a Unix (since ver. 10, i.e. ver. X).
This discussion reminded me of Mark Suster’s post: Design for the Novice, Configure for the Pro.

Actually, for most of the AVC audience I guess reading the title suffices. In other words, tl;dr: Design for the Novice, Configure for the Pro.
The rise of the CLI is a natural repercussion of the rise of API-first software development. It is trivial to write a CLI if you have already created an elegant API, yet the CLI is powerful. Once you have committed to making a great external and internal API, it’s a no-brainer to offer a CLI. Some of the powers of a CLI: unit testing, integration testing, and a sandbox for 3rd-party developers.

Long live the CLI!
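A toy illustration of that layering, with a pure function standing in for the API and a thin dispatcher as the CLI; every name here is invented:

```shell
# API-first layering in miniature: api_greet is "the API",
# cli is the thin command line wrapper over it. All names invented.
api_greet() {
    printf 'hello, %s\n' "$1"          # the core behavior
}

cli() {                                 # parse arguments, delegate
    case "$1" in
        greet) shift; api_greet "${1:-world}" ;;
        *) echo "usage: cli greet [name]" >&2; return 2 ;;
    esac
}

cli greet avc    # prints: hello, avc
```

Because the CLI is only a dispatcher, unit tests can exercise api_greet directly, while the cli entry point doubles as the integration-test surface and third-party sandbox the comment mentions.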
Pipem is an open platform for adding text/voice/email “command line” interfaces on your phone/watch to existing online services, and it’s launching for developer preview in a few weeks: http://pipem.io We’ll post a blog with more details in a few days, and this was our TW demo day pitch this summer: https://www.youtube.com/wat…
The command line interface is, in my opinion, the main reason for desktop Linux’s failure to win acceptance. The command line never left professionals, but the mainstream user will never accept its “return”. The GUI is the key to ease of use; the command line interface has no place in a modern OS for the masses.
CLI-based tools, and the flexible APIs behind them, for platforms like AWS, GitHub and Heroku are also great examples to examine. All of them have web/desktop interfaces as well, but without the API/CLI, would they be as well adopted by such a wide spectrum of users? I don’t think so.
What will a computer system look like in the 22nd century? Microsoft wrote MS-DOS in the 20th century; now we have chosen cloud systems. I hope someone writes a simple program that is easy to use: not Windows 10, iOS 9, Chrome, or Android, but a new system for all people.
Are we prepared for the voice command era? It seems like a step backwards, but take another perspective. In business intelligence we like to navigate with visual interfaces and visualizations, but the most powerful power users (accounting) prefer SQL or even raw CSV dumps over visualizations.

Now look at the Apple TV UI: it is built around a speech interface, and a speech interface is not far away from a command line.

In BI, the questions that need to be answered are: Show me the regions with the best performing shops in the last 90 days. Of that selection, show me the best performing products. Which of the best performing products was introduced in the last 30 days?

These are business questions, but who answers them with point-and-click? If you know the data structure (which is not that complicated in a well maintained DWH), you will find the answer faster with a good SQL statement than with a visual interface.

That means your “@bluebottle /cortado /tostay” is nothing else than a DSL.

I am very fascinated by text interfaces. Too often we think that an impressive user interface which drives the CPU/GPU to its limits is “best of breed”, but that is just not true; what counts is always the best customer experience, and that doesn’t necessarily mean the best user interface.

Why not a “@bluebottle /cortado /tostay /in5mm /tip+0.1”?
Voice works well when you know exactly what to say and can articulate it in a few seconds, like “Show me Products where ProductID is above 10”.

But for longer queries, voice becomes a problematic medium, because speech is like a live stream where you can’t really pause and edit. When typing a complex query you usually stop, pause, go back, and edit a few times before you run it.

Say you start writing a query: SELECT * FROM Products WHERE ProductId > 10 AND SaleDate >= '2015-01-01' AND SaleDate < '2015-03-01' ORDER BY ProductCost. Then you remember that it should be ProductId < 10 instead, so you go back and change it. Then you type some more filters. Then you edit again a few times.

I don’t see a scenario where that type of editing could work efficiently in voice commands. Quote: “Show me Products where ProductID is above 10 which were sold in Stores between January and February. Order by ProductCost. No wait, the ProductID should be *below* 10. And only for Stores in the NorthWest region. And I want the result to be Stores, not Products. No wait, the Region is called NorthEast.”

Even if voice technology could support it, it doesn’t feel like a good user experience.
If you know in advance what you are looking for, it is no problem, because you know the path. In the case of data discovery you don’t know in advance what you are looking for; you try a lot of different queries before you get the answer you were looking for.

The problem is that in most cases a database doesn’t have a “semantic layer”, which means you don’t know where the information you are looking for could be. In the end you always arrive at a kind of query language, and having a voice interface here could speed up the discovery process very much.

If you speak with data warehouse engineers about data navigation, the best way to communicate is to describe the result you are looking for in plain language, like “Show me all products in region X which performed well”. “Performed well” is of course difficult to define, but if you build a performance dashboard you have to define this anyway.
Hmm, how do you communicate with other humans? If I say something I don’t know much about and don’t articulate well, my fellow humans tend not to react very fault-tolerantly. Reactions like “what the hell”, “bullshit”, “nonsense”, etc. give me the impression that humans tend to react negatively, if not hostilely, to badly articulated sentences. From that perspective a computer, with its endless patience, is better than a human for badly structured sentences. Unless we start to teach computers impatience…
Humans like speaking to each other. Not to computers.

- If the computer doesn’t understand our voice immediately we tend to sigh and give up after a few attempts. It just feels dumb to repeat yourself to a machine.
- Today’s voice tech is barely good enough to understand straightforward 5-10 word commands. So it feels like there’s a long way to go until we’ll be able to effectively say things like “Show me X and Y. No wait, I mean Z”.
- People do not want to use voice commands in public. I don’t see that changing. (Not until we get to super high “Her”-style AI voice interaction.)

Sure, voice recognition tech will improve. But I just don’t see a world in the near future where people will feel comfortable talking to computers at length.
Here is a little on some of how to make uniquely good use of command lines, apparently not already in this thread:

Preface 1: One of the most important tools in my computing is a good hierarchical file system. The main reasons: I have 10 million files; that number is growing quickly; I need to organize the files so that I can easily find the files I want and put new files where I can easily find them; for that putting and finding, my main technique is a taxonomic hierarchy; and the hierarchy of the file system is semantically close to that taxonomic hierarchy and, thus, is what I use for the putting and finding. In particular, I’m now using the Windows High Performance File System (HPFS) and like it a lot.

Preface 2: By far the most common operation in my computing is the two steps of (1) making some file system directory (folder) the current directory and (2) running some software that manipulates files in that directory. Then by far the most common software I run is just my favorite general-purpose, programmable text editor, KEdit. The next most common is the one spell checker I use for nearly everything, ASpell, which came with a distribution of TeX, D. Knuth’s mathematical word processing software.

Preface 3: Clearly, from one operation to another, it would be useful to have some memory that one operation could write and, usually, the next operation could use. Very useful here is the facility of environment variables. Of course, another example of such memory is the system clipboard, and it is easy to implement more general versions of that.

Preface 4: For more efficiency, I want an important case of software reuse: a collection of little programs that (1) I can easily write and run and (2) can easily run other programs.

Preface 5: Of course, somehow I have to enter data.
For that, I find that touch typing on a keyboard is mostly much more efficient than using a mouse.

Preface 6: Of course Windows has what we can call text windows: command line sessions, console windows, etc. Such a window has text only (no graphics) and only monospaced fonts. Sure, in reading text, proportionally spaced fonts are nicer, but for working with data monospaced fonts are much easier. So, such a window is convenient for entering data via typing, right, on a command line, and for the two steps of (1) making a directory current and (2) running some software that manipulates files in that directory.

With the above points of preface, here are some of the little commands I find useful on command lines, especially for those two steps. Some of these commands might be regarded as good for tree walking.

Long file system tree names for directories can have a lot of mnemonic value but would be a pain to type. So, while I make heavy use of such tree names and jump around everywhere among my 10 million files with some of the commands below, I nearly never type such directory names. Create, read, and use them? Yes. Type them?
No.

In the context of the seven preface statements above, here is a list of some of my favorite command line commands; they are really short, easy, good, and fast to use:

====================
Command: R
Abbreviates: Root
Performs: Make the root directory of the current file system drive letter the current directory.
====================
Command: DN
Abbreviates: Down
Performs: In the current directory, display a sequentially numbered list of subdirectories with names that start with x, where x is the first argument; wait for input of one of the sequence numbers; and then make the corresponding directory current.
====================
Command: TD
Abbreviates: Text window in current directory
Performs: Create a new text window with the current directory of the present window as the current directory of the new text window.
====================
Command: CALENDAR
Abbreviates: Calendar
Performs: For first argument x, a year in the Gregorian calendar, display a full calendar for that year. The display is in a monospaced font, which makes the relevant columns line up vertically, which makes the calendar easy to read.

So, get a nice, fast calendar. E.g., since early in the movie Gone with the Wind there was a BBQ at Twelve Oaks on a weekend a little after the firing on Fort Sumter on April 12, 1861, what day was the firing, and when was the BBQ? Well, from the calendar from my little program, the shots were fired on a Friday, so maybe the BBQ was on Saturday the 13th or Sunday the 14th!
====================
Command: SD
Abbreviates: Subdirectories
Performs: Get a new file in the current directory with one line for each file or directory in the current directory and all its subdirectories: full file system tree name, size, etc., nicely formatted in columns.
For finding things in a subdirectory, good to have when all else fails!
====================
Command: MARK
Abbreviates: Mark
Performs: Assign to environment variable MARK.x the tree name of the current directory, where x is the first argument.
====================
Command: G
Abbreviates: Go to the directory specified by the first argument.
Performs: Make current the directory in the environment variable MARK.x, where x is the first argument.
Example: When I use command TD to create a new text window, that window already has file system directory names assigned to MARK.x for, currently, about 32 values of x. E.g., then command

G UN

makes current the directory name in environment variable MARK.UN, which is usually the directory I have for Union Square Ventures. So, I have a fast way to make current any of the 32 or so directories I use most often. E.g., guess what command

G TRUMP

does?
====================
Command: NEW
Abbreviates: Make new directory
Performs: In the current directory, make a new subdirectory using the first argument x as the directory name, make that new directory current, and run KEdit on file x.DOC (which I use for documenting the files in that directory).
====================
Command: MNDN
Abbreviates: Make new dated directory name
Performs: In the current directory, make a new subdirectory using the first argument as the directory name prefix, followed by the date, followed by a single letter.
Example: So, command

MNDN INC_

might create subdirectory

INC_20150920A

which I might use for the first (suffix A) incremental backup of September 20, 2015.
====================
Command: FROM
Abbreviates: Mark from
Performs: Assign to environment variable MARK.FROM the file system tree name of the current directory.
====================
Command: TO
Abbreviates: Mark to
Performs: Assign to environment variable MARK.TO the file system tree name of the current directory.
====================
Command: COPYFT
Abbreviates: Copy from to
Performs: For first argument x, uses Windows command COPY to copy file x
from the directory in environment variable MARK.FROM to the directory in environment variable MARK.TO.
====================
Command: COPYFT2
Abbreviates: Copy from to
Performs: For arguments x and y, uses Windows command COPY to copy file x from the directory in environment variable MARK.FROM to file y in the directory in environment variable MARK.TO.
====================
Command: XCOPYFT
Abbreviates: XCOPY from to
Performs: Uses Windows command XCOPY to copy the directory in environment variable MARK.FROM, and all its files and subdirectories, to the directory in environment variable MARK.TO. My main tool for disk-to-disk backup of files.
====================
Command: XCOPYIFT
Abbreviates: Incremental XCOPY from to
Performs: Uses Windows command XCOPY to do an incremental copy of the directory in environment variable MARK.FROM, and all its files and subdirectories, to the directory in environment variable MARK.TO. My main tool for disk-to-disk incremental backup of files.
====================
Command: TREDEL
Abbreviates: Subtree delete
Performs: For first argument x, in the current directory, delete subdirectory x and all its files and subdirectories.
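On Unix, the MARK/G pair described above can be approximated with two small shell functions. POSIX sh has no associative arrays, so the mark name is folded into a variable name; this is a sketch of the idea, not the commenter's actual code:

```shell
# mark NAME saves the current directory; g NAME jumps back to it.
# A Unix sketch of the Windows MARK/G commands described above.
mark() {
    eval "MARK_$1=\$PWD"       # e.g. mark UN stores $PWD in MARK_UN
}

g() {
    eval "cd \"\$MARK_$1\""    # e.g. g UN cd's to the saved path
}
```

Seeding a handful of marks from a shell startup file gives the same few-keystroke jumps among favorite directories.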
Missed this yesterday; just too unbelievable outside to be online, plus #Beakerhead in YYC.

@fredwilson:disqus messaging is eating the world; whether people want CLI shortcuts is debatable (typically not a mainstream thing, but cultures change). Bots + cloud services + messaging are going to radically change the world. The cats at Slack get full marks for being ahead of that trend.

As @twaintwain:disqus has rather voluminously proven on this thread, voice isn’t happening in our lifetime.
Do you use Google Now much, Fred? It just gets better all the time. I find myself using it more and more, and it copes with a British accent well.
Of course I may be biased, but I think the command line (or the Unix shell) never really went away. It’s always been there, running things for you.

I also think that it’s never going anywhere, and that it’s going to be complementary to AI. Just imagine yourself at the movies watching a great film, or at the theatre watching a play, and you want to know who’s this excellent actor giving a great performance. Talk very loudly to your AI agent to make yourself understood, or enter “who’s this?” on your strategically hidden device?

Extrapolate this to a shared startup office with a few dozen people talking loudly at their respective AI agents, or to the train ride home. Just not gonna happen. Long live new and more powerful shell interfaces.
Yes!!!

Computers host agents that operate on our behalf. We issue command-oriented statements for them to process. Overuse of graphical user interfaces has obscured this critical reality, to the point where they are increasingly less productive.

The power of RESTful interaction patterns is best explored via curl and an OS-provided command line, for instance. GET, POST, PUT, DELETE, etc. are verbs that connect an HTTP user with actions to be performed on an HTTP-accessible resource by a piece of software (the user agent we call the “browser”).

IRC bots are another exemplar, and you can see all of this regurgitated in #slack and similar tools.
format c: /y

Wow, you can speed up your PC using that simple command line 😉
@bluebottle /cortado /tostay

/tostay is an action. Why leave it to a human to translate? The next unicorn will be the platform that aggregates the world’s APIs and enables them to be executed from anywhere, i.e.

@clifthotel #do/pay/100 #do/rate/4

The above #do links transform into fully interactive cards:
http://hashdo.com/pay/snaps…
and
http://hashdo.com/xandgo/ra…
The ultimate command line is a command to Siri, Cortana or Alexa.
Teenagers are childhood zombie slaves to their electronics; they only know where the power button is.
Agree. I’m not getting why this is a step forward.
It’s two mints in one!

The common structure in both the material-reality stack and any given technology stack is their nested layer cake of “holons”: “although it is easy to identify sub-wholes or parts, wholes and parts in an absolute sense do not exist anywhere. Koestler proposed the word holon to describe the hybrid nature of sub-wholes and parts within in vivo systems. From this perspective, holons exist simultaneously as self-contained wholes in relation to their sub-ordinate parts, and (simultaneously as) dependent parts when considered from the inverse direction.”

Virtually everything in existence is a holon. It is a universally applicable/reusable construct! The utility of what level of holons one chooses to utilize as a primary tool is completely dependent on the task at hand and that task’s position in the evolutionary stack.

For some software developers the range of best-fit holons may well be down in the basement, working with “syntactical code”, “command lines”, “APIs” and “development frameworks”, but the rest of us need to stand on their shoulders with best-fit holons that operate further up the stack, like “apps” and “touch/voice interfaces”, whose more integrated functionalities accommodate accelerated recombinant creativity higher up the evolutionary (get on with it) stack.

As layered systems advance, most higher-level developers are forced, by expediency, to stand on the shoulders of the stabilized/standardized lower-level holons created by the lower-level developers. Programming nostalgia aside, most software developers will be forced to flee up the stack in the name of efficiency and perceptually ergonomic transparency.

Where have all the slide rules gone?

Organic process literacy. Meme us all up, Scotty!
For the real “end” in “end user”, it’s not “a step forward”. But stopping at that “end” is not very good; instead, we need to build more, and not have just an “end”, by reusing earlier tools, and one of the best ways is command lines, scripts, and scripts driving command lines and other scripts. The main point is that command lines are a great source of ‘glue’ that permits building new tools.

E.g., look at a desktop icon and its properties: often you will see just a command line. Having a script drive an icon is a PAIN: program a robot arm to reach for a mouse, hit the user in the head, …. But having a script drive the command line in an icon is easy.
Pretty cool… it does point to the idea that CLIs can be interactive. Rather than having to remember every option and switch, let me type a word and then have the software help me get to where I want to go.