Do An Hour Of Code
It is CS Education Week, which happens at this time every year to celebrate and energize the growing K12 CS Education movement.
The highlight of CS Education Week is the Hour Of Code, in which students, teachers, parents, and community members all do an hour of code during the school week.
I would like to encourage everyone in the technology business to find a school this week, maybe it is your child’s school, maybe it is the school building in your neighborhood, or maybe it is a school where a friend teaches, and volunteer to lead an Hour Of Code.
It is really quite easy to do this.
Here is a guide on how to help a local school.
Here are some activities you can do for your hour of code.
Here are some ways to volunteer.
I’m on my way now to a school in the Bronx.
Love to see this. I started coding a few years ago and it has greatly helped my logic and reasoning in other disciplines as well. Thanks for promoting this, Fred.
As a CS educator, I would recommend this tool: http://jupyter.org/ Might be a good Friday project donation, Fred! 🙂 (I’m not connected with them in any way.) It’s the gentlest Python environment I’ve seen and can do just about anything. Great for intro CS programming!
Per my reply to Fred, there are so many modern languages – what are students learning, and is there any “standard” curriculum around coding in K-12?
There has been A TON of work done around curriculum, standards, and frameworks in the last few years, much of it funded by the incredible Jan Cuny at NSF, who leads CS Ed for that org. Much of the research has been driven by groups like the CSforAll Consortium (which exists in part due to our host), CSTA, and Code.org, along with many of the universities at the forefront of K-12 CS Ed such as UTexas, Berkeley, Duke, Brown, Trinity, UCLA, U Oregon, etc. There are also crosswalks to Common Core as well. See https://k12cs.org/ and https://www.csteachers.org/…
CSS is an art. I gave up on keeping up with it long ago because unless you are working in it every day, you can’t keep track of what works each time a browser changes. I now hire CSS people as needed for projects.
Yes, my intern loved learning Python and it then gave her confidence in her ability to code and strength of spirit to continue learning other languages.
Are there signs “Don’t tease the girls!”? If not, then I will! “Girls”? Right, politically INcorrect version of “women”!! For the teasing, what is it about you girls and “confidence”, “spirit”, and “strength”? Or, to borrow from Yoda, “Always emotions, women.” Or “Always difficult to understand, women!” Uh, for a simple view, computers are very obedient and nearly always will do just what they are told to do! That’s a big help in writing software! Some men used to want such from girls, rarely got it, but those days are over! /teasing
Not to get into a religious war with you, but almost *everyone* I know currently vastly prefers GoLang for systems stuff over C these days… and for #4, Lisp is another great one to throw in there (because it forces you to think differently than most other languages). BTW – you could argue that Python is also a ‘popcorn’ language too (and also, I wouldn’t consider HTML or CSS actual languages, but that just might be the code snob in me).
Golang, Rust, etc. wannabes! 😉 Lisp is AI from yesterday 😉 It’s so pre-Haskell… Python is definitely a popcorn language (my term for Very High Level Languages that typically don’t require you to do memory management), but you need to call one out for novices and I think it’s the best!
Python is simultaneously both a “popcorn” language (sort of) and not one, because it has been around for many years (20+) and has evolved a lot (both the language and the libraries, standard and third-party) in that time. (It was also being used for serious work from early on, despite its ease of use.) Also, while it is somewhat easier to pick up the basics of Python as compared to many other languages, it is a deep language with some subtle features and interactions between them.

Also, since it’s a recent topic on this blog due to their acquisition by Zeta, I’ll mention that Disqus is (or was – not sure about the latest status) one of the biggest production sites in the world that uses/used Python (and Django, IIRC). YouTube and Google use Python a lot too, as do tons of other companies and organizations, big and small. https://www.python.org/abou…

And a relevant plug: I conduct online Python training via the Net. https://jugad2.blogspot.in/… Anyone interested in Python training can contact me via my web site: vasudevram.github.io/contac…

Python is also huge for data analytics and data science these days.
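On the “subtle features” point: one classic example (a minimal sketch of my own, not taken from the comment above) is Python’s mutable default argument, where the default object is created once when the function is defined rather than on each call:

```python
def append_item(item, bucket=[]):
    # The default list is created once, at function definition time,
    # so every call that omits `bucket` shares the same list object.
    bucket.append(item)
    return bucket

append_item(1)   # [1]
append_item(2)   # [1, 2], the same list, which often surprises newcomers
```

The idiomatic fix is a `bucket=None` default with `if bucket is None: bucket = []` inside the function body.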
Since I don’t like nasty things done to vulnerable children, I have to respond. In particular, pushing C/C++ to children is nasty.

First, HTML and CSS are, of course, computer languages, e.g., have some Backus-Naur syntax, but are not really computer programming languages because, no doubt, they are not Turing equivalent. HTML and CSS are just data markup languages. E.g., I argued with a guy for months that there was no way to translate D. Knuth’s TeX to HTML because TeX is a real programming language: it can read and write files, allocate and free storage, has loops and if-then-else, has variable names with both character and numeric data types, etc., and HTML and CSS — last time I looked, although I have not looked at HTML5, etc. — have none of those.

C and C++ are awful: we should still use them where we have to and otherwise banish them to the dustbin of history. Why? C was designed to be a minimal language on an old DEC computer with 5 KB of main memory. So, C syntax is sparse and “idiosyncratic” as in the fundamental, now classic

Brian W. Kernighan and Dennis M. Ritchie, The C Programming Language, Second Edition, ISBN 0-13-110362-8, Prentice-Hall, Englewood Cliffs, New Jersey, 1988.

In its original definition, C++ is just a pre-processor to C. Bell Labs liked pre-processors, e.g., Ratfor (rational Fortran), and C++ was another example. C++ tried to make software objects a religion and did so with enormous harm for decades.

There are some fundamental problems with C/C++: (1) memory management, (2) exceptional condition handling, (3) nearly missing scope-of-names functionality (need scope of names for some source code modularity and more), (4) crude and clumsy memory management, (5) absurdly clumsy use of pointers, and (6) especially (1) — (4) in the context of multi-threading. For more, the string and array handling, in a word, suck. I still have to look up and think through again dereferencing.
Bummer.

For exceptional condition handling, scope of names, memory management, data types, strings, arrays, data structures, and much more, PL/I, well before C, was much better. E.g., PL/I has pointers, and I used them heavily, but I never had to worry about dereferencing!

If you like object-oriented programming style, then that is easy enough to do in PL/I. Indeed, the key to object polymorphism is just entry variables as in PL/I and even Fortran. E.g., in Fortran, it was long common to write a numerical integration routine and pass to the routine the name of the function to be integrated. So, the integration routine was essentially polymorphic. E.g., in .NET, the polymorphic functionality is via interfaces, which are essentially the same as the Fortran idea.

It has long been recognized that in moderate to large software projects, at least the C++ (1) — (4) become debilitating, e.g., digging the Panama Canal with workers on their knees using teaspoons — ordinary teaspoons with C and tricky, hard to use teaspoons with C++.

C was a big step down from PL/I and Algol and was done just to fit in the old DEC 8 KB. Sad stuff. We need to move on. It’s long been the case that we didn’t have to use the C/C++ teaspoon tool even for operating systems.
E.g., Multics, 1969, a milestone in computing, with security rings, gate segments (still in the Intel architecture), hierarchical file systems, and authentication, capabilities, and access control lists (all now pillars of computer security from SQL to Web site certificates), and Primos, 1975, were written in PL/I. For a long time, IBM’s operating system code was written in their PL/X version of PL/I, far ahead of C/C++.

I’ve had occasion to write some C code: for some small, focused, narrow projects, it’s okay; like mixing up some sour cream and chives, use a teaspoon.

E.g., my current project on Windows uses the .NET facility platform invoke to call (as a small part of some of the numerical parts of some of the work) some now classic, rock solid, quite important, open source C code; okay, I’ll confess, Linpack. Sure: commonly the computing in a huge range of applied math boils down in part to some use of Linpack! Right, I took the official, open source, Fortran version of Linpack, used the open source Bell Labs C code program f2c (translate Fortran to C), compiled f2c on Windows, used that to translate Linpack from Fortran to C, compiled the resulting Linpack C code on Windows, and called the result with platform invoke.

E.g., once at IBM’s Watson lab, I took the C-callable TCP/IP API and wrote C code callable from PL/I to let PL/I use TCP/IP. E.g., once I wrote a really nice, cute, sweetheart matrix package for C and, with that, wrote a quite general linear system solver in C. Once I wrote in C a relatively capable program to sort files. For large file sorting it’s still what I use. So, I’ve used some C.

For C++, I had to conclude that not even

Bjarne Stroustrup, The C++ Programming Language, Second Edition, ISBN 0-201-53992-6, Addison-Wesley, Reading, Massachusetts, 1991.

clearly understood it, and I didn’t want to paw through the output of the C++ pre-processor, look at the generated C code, and, thus, make more clear just what the definition of C++ really was.
Since Stroustrup was not able to do a good job describing his language, I didn’t want to bother doing it for him. Besides, being just a pre-processor to C, fundamentally C++ is stuck-o with not being able to do anything a C programmer couldn’t do. Since fundamentally C is too limiting, so is C++.

I just looked at my collection of references on overviews of C++ and its version of objects: the references are now 5-15 years old, voluminous, and devastating. Net, C/C++ remain important for various reasons for some special purposes, but for significant projects we need better tools, and now for nearly all significant projects we have much better tools. Computing has moved on — not surprising since C was a big, cheap-o step backwards from PL/I and Algol. Gee, PL/I compiled and ran fine on a 64 KB non-virtual-memory small 360 computer — now squeezing down to a tool written for 8 KB is absurd.

Okay, okay, okay, okay, we can use C/C++ for writing code for routers for software-defined networks and for some cases of embedded code more generally. But, heck, apparently that code will soon be in application-specific integrated circuits (ASICs) via field programmable gate arrays (FPGAs) or some such anyway. If NASA, DARPA, the Department of Energy, Boeing, etc. want better languages for embedded code, let them continue their work, long in progress now. To heck with teaspoons as programming tools.

For better or worse, a big part of computing is Windows, and on Windows a big part of software development stands on the .NET Framework and the common language runtime (CLR), and the main path to .NET and the CLR is Microsoft’s C#. Well, C# borrows some of the deliberately idiosyncratic syntax of C, syntax I find difficult to teach, learn, read, write, and document and, thus, clumsy and error prone. Right, e.g., I don’t like the comment syntax borrowed from the source code stream syntax (to get around the 80 character limit of old punched cards, literally) of PL/I.
Yes, I have some nice editor macros that help a lot with that syntax, but I still don’t like it. But we don’t have to put up with the masochistic C syntax. Instead we can use the essentially equivalent but much sweeter and, really, classic flavor of syntactic sugar of the .NET version of Visual Basic (VB.NET). That’s what my project is using, now with 100,000 lines of typing of VB.NET.

Equivalent? IIRC there’s a translator to and/or from C# and VB.NET. So, the difference is mostly just syntactic sugar. Sure, IIRC C# has lambda expressions — if I need those, I’ll write some C# code and call it from VB.NET — it’s all the same CLR, guys! In particular, it was easy enough for Microsoft to document the .NET Framework all in the same Web pages for all the Microsoft CLR languages. Gee, they even have a compiled Python, IronPython!

The Microsoft Windows CLR and .NET Framework are milestones in computing and, no doubt, were not easy to do. E.g., IBM struggled for years on just the same important, even crucial, broad goals and achieved at most only a tiny fraction of what Microsoft did with the CLR and .NET.

Net, if you want to introduce young people to programming, go really slow on C, much slower on C++, and get on with the .NET Framework, the CLR languages, and C# and/or VB.NET. And we don’t have to push C/C++ to children and, thus, be nasty. That’s not as nasty as pushing love of borderless globalism, rushing to flood countries built on the glories of Western Civilization with people from brutal, pre-Enlightenment, pre-Medieval brain-dead cultures, or fears of human-caused climate change, but it’s still nasty.
I learned C/C++ as a child. Like skiing or riding a bike or any skill, that’s the best time to do it. Give em a little credit, they’re at least as smart as we were at that age.
Sure, they CAN learn it, but SHOULD they learn it? Let’s move on. Learning C/C++ is not the point. I was responding to your

2. C++: The only real application language left.
3. C: The original and still the best systems language

As I argued, on your 2., C++ sucks. E.g., for decades we all suffered from the “memory leaks” from the bad memory management. On Windows, the CLR languages, e.g., C# and VB.NET, are decades ahead. On your 3., that was false already in Multics of 1969. My point is that C/C++ for significant software development are like digging with teaspoons — possible, yes; good tools, no.
C++ doesn’t suck, you just don’t like it. 😉 It has two things that are absolutely necessary for developing any REAL application (i.e. one that actually requires performance): 1) the object model has every possible feature and 2) it compiles to a binary. These two things also make it hard, which is why you likely think it sucks 😉
I know probably two dozen computer languages well enough to be productive with them. This is due to the fact that I’ve been designing and building computer systems for all sorts of applications, from bare metal to your IT-level environment, for nearly 40 years, not to mention my Ph.D. in Computer Science from a top-10 institution. I’ll keep my own counsel about languages and choose the best one for whatever job I’m doing. Since it’s obvious you’re just argumentative, conversation over.
I have a college intern working for me right now and she started with Python. She says she loves how you can code something that works so quickly — it gave her a lot of confidence. Now she is learning some other languages.
“how you can code something that works so quickly”

Is she going to be writing production code for you? Or are you just saying she is fooling around with it?

The reason I ask (I have done this pro-sumerly since the mid 80’s) is that writing the code is only one small part of creating something usable. It’s like knowing how to fly a plane and/or give a needle injection and not being an actual pilot, physician, or nurse. Just writing code doesn’t mean the code will be secure and that there aren’t big gaping errors that will cause some large future problem. And as you know this is something that happens to the biggest and the brightest as it is. There are a ton of things to know just security-wise, and it changes all the time. It is almost a full-time job just to keep on top of that.

Honestly I get a kick out of hearing that people are learning to code on such an expedited basis. This reminds me of martial arts studios giving out black belts in a year or two vs. how long it used to take to make that achievement. In other words, how do you handle problem situations? What do you do in emergencies? Hard to replace actual experience over many years with a course, no matter how bright someone is or who the teacher or teaching method is.
She is not writing code for me. I am training her and one other intern (who happens to be my son) in the following:

- Social media marketing management
- Graphic design and illustration (just my son)
- Photo editing
- Website planning and interface wireframing and design
- Website content management on various CMSs (mainly Drupal and CMSMS, but also SquareSpace and Duda, since some of my smaller clients use these last two platforms)
My intern took Python in college.
In the seven pillars of wisdom in designing software, the first pillar is the KISS principle of engineering — keep it simple, stupid. The other six? I’m not sure!
See my list above…
With the disclaimer that I am a strategic advisor to them, I’d also recommend Codesters if Python is of interest. See http://www.codesters.com/hoc — Python based curriculum focused on being classroom ready for middle school.
I have no relationship with codesters but will chime in that I very much like what they’ve put together.
Fred — I watched Banking on Bitcoin last night. Great movie. I was surprised to see you in it. I was like “Is that Fred Wilson?” Can you talk about the implications of the Bitcoin Licensing Act and how it’s affected the ecosystem now? Thanks.
What “language” do they program in? I have followed these links and am still a bit lost (except for the “donate” part.) As a parent, you would be surprised how difficult it is to even determine what a modern school system is teaching – let alone identify gaps. I’ll keep plugging – but some examples would be really helpful.
There’s no single language, curriculum, process, toolset, etc. either in NY or across the country. At the high school level, the gorilla in the room is the College Board with APCS-A (Java) and APCS-Principles. APCS-P is something of a “course framework” where both the toolset and rigor vary tremendously. The tools range from drag-and-drop languages like Scratch to scripting-type languages such as Python, but you’ll also find Processing (Java), straight Java, Racket (Scheme), and more.
Exactly what I had feared. I had hoped that one of the first things a non-profit would do is create a standard for the “hello world” app.In my opinion, you cannot tell someone “go code for an hour” without some clear direction of exactly what to do. Especially in the school systems, which are woefully behind in technology adoption. If you make them figure out the language and the platform and the use cases – you will get very slow adoption.
A number of the “Hour of Code” lessons are very structured and directive, similar to what you describe. If anything they’ve been criticized for being too prescriptive.
That’s good to hear. In my opinion, the “movement” should standardize on a single on-boarding process. It sounds like you guys just developed your own. It doesn’t matter WHAT language is used, as long as it is widely accepted and teaches basic principles.
CONTRIBUTORS: Those over 50 remember the MS-DOS commands. Looping, etc. We never ventured past MS-DOS and entering the commands. Enjoyed reading Computer World. (1978)
Heck no: just last week I had to write some code with redirection = ‘< N >> test11.out 2>&1’. Right, straight out of old high-end DOS stuff, to automate running an old DOS command-line program. Ran the program about 500,000 times so wanted it automated!

Command lines remain great as an easy, general purpose API for automation via interpretive scripts, e.g., old Rexx. Actually, command lines remain, e.g., with the Windows icons and the command lines there. Mostly there remain just programs, EXE files, started with essentially just command lines.
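That kind of batch automation is easy to sketch in any scripting language. Here is a minimal, hypothetical Python version of the loop described above; the command, log file name, and run count are stand-ins, not the original job:

```python
import subprocess
import sys

def run_and_log(cmd, log_path, times):
    # Append each run's stdout AND stderr to one log file,
    # the scripting equivalent of `cmd >> log 2>&1` in a loop.
    with open(log_path, "a") as log:
        for _ in range(times):
            subprocess.run(cmd, stdout=log, stderr=subprocess.STDOUT)

# A tiny stand-in command, run 3 times here; the real job ran ~500,000 times.
run_and_log([sys.executable, "-c", "print('run')"], "test11.out", 3)
```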
I have always been bothered by use of ‘code’ vs. ‘programming’. ‘Programming’ is what I was raised with. So I decided to look into this and found this article, which is interesting on the topic and validated my feelings. https://www.huffingtonpost….

“For a long time, “programming” was the word most commonly associated with entering instructions into a machine to execute commands. There really was no debate. Programming was the formal act of writing code, and the term also encompassed the greater nuances of computer science.”

Believe what you read only if it confirms your point of view:

“In recent years (read: post-Hour of Code, circa 2013) the term “coding” has resurfaced as a much more playful and non-intimidating description of programming for beginners.”

I still refer to it as ‘programming’. And no, coding was definitely in use prior to 2013, obviously. To me it’s like ‘snowboarding’ vs. ‘skiing’. Two different sports, but the one that my brain thinks is more legit is the one that I was raised with…
Imagine if this happened in every elementary, middle and high school once a week, all year round? THAT would be impressive. It should be required.
The obstacle is not the teachers, schools, and parents — they are, by and large, on board. It’s rather 1) lack of adequate networks and equipment, 2) lack of PD for teachers (a problem our host is working on), and 3) finding time in the day, given the conflated problems of myriad state standards AND that in most states CS is still not a mandate itself. Ultimately it’s a choice of how we expend public resources and whether this is a priority or not.

There is also SO, SO, SO much more the technology community could be doing in terms of financial resources and volunteer time. Our host is the model — the tech community really could step up and do so much more.
“It should be required”

I just don’t agree with that level of enthusiasm for the benefits of this ‘once a week, all year round.’ What subjects would you remove, and what would happen to the teachers for those subjects as a result of less demand? That ‘hour per week’ has to come from somewhere. Right? And I don’t think this is something that close to everyone should be doing either.

As an example, I think in teaching we have to get away from this idea that ‘everyone should learn a foreign language because a small percentage of those who learn will end up using it in some valuable way’. There are too many things that can be learned, and not everyone is going to benefit from the same one-size-fits-all thing. And kids tend to gravitate toward what they like if exposed to it; they don’t need to be force-fed, at least for something like this.

I do agree that we have to get back to teaching more of the trades and de-stigmatize that for those who would not benefit from college.
CodeBrooklyn, the campaign co-founded by Brooklyn Borough President Eric L. Adams and myself, whose mission is to “be an advocate for Brooklyn school communities – teachers, students, parents, administrators, and their surrounding neighborhoods – in their school specific roadmaps to introduce, expand, and fortify computer science education”, could use more volunteers to help with Hour of Code and similar CS Ed week activities in Brooklyn public schools. You can volunteer here: http://bit.ly/volunteer-cse… (or email me at rob-at-ttmadvisors-dot-com). Please also spread the word. Thanks
Some photos of one of the early events in Brooklyn this week: https://twitter.com/HG45k32…
For Hour of Code for K-12, okay, I’ll make a contribution here. If I were to give a lecture to the students, it would have at least this material. As a student, I always wanted the content in well done text, not just a lecture, certainly not just foils or audio. So, the text here is about the best I’d give the students, anyway.

So, there are four steps.

Step 1:

Check out from the library

Donald E. Knuth, The Art of Computer Programming, Volume 3, Sorting and Searching.

and read, if only quickly, about (1) the heap data structure (just a clever use of an array to make a cute binary tree), (2) heap sort, and (3) the Gleason bound that shows that, with meager assumptions, heap sort is the fastest possible sort routine based on comparing pairs of keys.

Step 2:

See the code (in Microsoft’s Visual Basic .NET but with classic syntax easy enough to read) below for using the heap data structure for a priority queue. This code is real: the real problem is, look at 10 million or so numbers and end up with the 20 or so largest, and do that efficiently in both processor time and main memory space.

' Function obj_heap_insert
'
' Object heap insert to use the heap algorithm to
' maintain a 'priority queue'.
'
' So, suppose for positive integer n we have x(i) for
' i = 1, 2, ..., n and for positive integer m <= n we
' want the m largest of the x(i). So, we allocate an
' array y(j), j = 1, 2, ..., m.
'
' Suppose we regard y as an ascending heap, e.g., so
' that y(1) <= y(j), j = 2, ..., k <= m where the
' number of locations of y in use so far is integer k
' where 0 <= k <= m.
'
' Then for i = 1, 2, ..., n, we consider inserting
' x(i) into the heap.
'
' If k < m, then we set k = k + 1 and set y(k) = x(i)
' and 'sift' to 'promote' the value in y(k) until we
' have a heap again.
'
' If k = m, then we compare x(i) and y(1). If
' x(i) <= y(1), then we are done with x(i). Else,
' x(i) > y(1) and y(1) is not among the m largest
' and, to 'remove' value y(1), we set y(1) = x(i) and
' 'sift' the value in y(1) to create a heap again.
'
' After i = n, y contains the m largest of x(i),
' i = 1, 2, ..., n.
'
' The advantage of this routine is speed: when k = m,
' the effort to insert x(i) is proportional to log(m).
' When k < m, the effort to insert x(i) may be
' proportional just to k. The worst case would be when
' the array x was in ascending order since then x(i)
' would have to be inserted for each i. Similarly, in
' the case the array x is in descending order, after m
' inserts, no more inserts will be done. For the order
' of array x 'random', once a relatively large
' fraction of the m largest are in the heap,
' additional inserts become relatively rare.
'
' This routine is to be 'polymorphic', that is, to
' work for essentially any user defined class
' my_class1 where
'
'     Dim m, n As Int32
'     Dim x( n ) As my_class1
'     Dim y( m ) As my_class1
'
' This routine can build an ascending heap, as
' discussed above, or a descending heap. The only
' difference is how the comparisons are made in the
' class used for the interface.
'
' The interface IComparer is used. If the usual class
' Comparer is used for this interface, then this
' function builds an ascending heap, that is, where
' upon return
'
'     y( 1 ) <= y( j ),
'
' for j = 2, 3, ..., m.
Function obj_heap_insert( _
      x As Object, _
      y() As Object, _
      ByRef k As Int32, _
      m As Int32, _
      compare As IComparer ) As Int32

   Dim routine_name As String = "obj_heap_insert"
   Dim error_code As Int32
   Dim return_code As Int32 = 0
'  Dim result_code As Int32
   Dim message_tag As Int32
   Dim i_father As Int32
   Dim i_child0 As Int32
   Dim i_child1 As Int32
   Dim i_child2 As Int32
   Dim i_2 As Int32 = 2

   Try
      error_code = 1001
      message_tag = 1001
      If console_msg_level >= console_routine_messages2 Then _
         Console.WriteLine( routine_name & " " & message_tag & _
            ": Started ..." )
      error_code = 1002
      If m <= 0 Then
         return_code = 1001
         Goto out
      End If
      If k < 0 Then
         return_code = 1002
         Goto out
      End If
      If k > m Then
         return_code = 1003
         Goto out
      End If
      If y.GetUpperBound(0) < m Then
         return_code = 1004
         Goto out
      End If
      error_code = 1003
      If k < m Then
         i_child0 = k + 1
         k = i_child0
         error_code = 1004
         Do ' Sift value of x into correct position.
            If i_child0 < 2 Then Exit Do
            i_father = i_child0 \ i_2
            error_code = 1005
            If compare.Compare( x, y( i_father ) ) >= 0 Then Exit Do
            y( i_child0 ) = y( i_father )
            i_child0 = i_father
         Loop ' Sift value of x into correct position.
         error_code = 1006
         y( i_child0 ) = x
         Goto out
      End If
      error_code = 1007
      If compare.Compare( x, y( 1 ) ) <= 0 Then Goto out
      i_father = 1
      error_code = 1008
      Do ' Delete value of y( 1 ) and sift x to correct position
         ' starting at location y( 1 )
         If i_father > m \ i_2 Then Exit Do
         i_child1 = i_father + i_father
         If i_child1 < m Then
            i_child2 = i_child1 + 1
            error_code = 1009
            If compare.Compare( y( i_child1 ), y( i_child2 ) ) < 0 Then
               i_child0 = i_child1
            Else
               i_child0 = i_child2
            End If
         Else
            i_child0 = i_child1
         End If
         error_code = 1010
         If compare.Compare( x, y( i_child0 ) ) <= 0 Then Exit Do
         y( i_father ) = y( i_child0 )
         i_father = i_child0
      Loop ' Delete value of y( 1 ) and sift x to correct position
         ' starting at location y( 1 )
      y( i_father ) = x
   Catch
      Console.WriteLine( " " )
      message_tag = 1002
      Console.WriteLine( routine_name & " " & message_tag & _
         ": Error condition raised. " & vbCrLf & _
         ": error_code = " & error_code & vbCrLf & _
         ": err.number = " & err.number & vbCrLf & _
         ": err.source = " & err.source & vbCrLf & _
         ": err.description = " & err.description )
      message_tag = 1003
      Console.WriteLine( routine_name & " " & message_tag & _
         ": Raising error condition with error_code = " & error_code )
      err.raise( error_code, routine_name )
   End Try

out:
   message_tag = 1004
   If console_msg_level >= console_routine_messages2 Then _
      Console.WriteLine( routine_name & " " & message_tag & _
         ": Returning." )
   Return return_code

End Function ' Function obj_heap_insert

Right, the error handling is crude. A fix, a system-wide log server, is on the way!

Could slightly modify this code to find an approach good under some common circumstances to finding the unique lines in a large file, maybe a Web site log file. This was an interview question I once got from one of the more technical people at the A16Z venture firm.

Of course, could also use

Ronald Fagin, Jurg Nievergelt, Nicholas Pippenger, H.
Raymond Strong, “Extendible hashing — a fast access method for dynamic files”, ACM Transactions on Database Systems, ISSN 0362-5915, Volume 4, Issue 3, September 1979, Pages 315-344.

That is useful material; once we used it in an artificial intelligence project at IBM’s Watson lab. Of course, the conservative approach is just to sort the file, which, for each duplicated line, puts all the duplicates next to each other, and then read the sorted file. Of course, for a file with n lines, this sorting approach will take time, best case, average case, and worst case, proportional to n log(n). Under some common cases, the priority queue or hashing could be much faster.

Step 3:

This problem is also real. Suppose we are given an array A with its components sorted into ascending order. In Knuth, as above, read about the binary search algorithm for finding components in the array. Or, the algorithm is essentially the same as looking up a word in a dictionary by cutting the dictionary in half, seeing which half the word is in, cutting that in half, etc., until you find the page that should have the word if it is in the dictionary. Now you just learned the binary search algorithm and, thus, don’t have to look up the binary search algorithm or even bother doing homework with it!

But, suppose we are also given array B, and for each component of B we want to look it up in array A. One way, then, is just to use binary search on array A once for each component of array B. But can we think of a way that, in total for all the lookups of the components of B, might, in some cases, depending on the sizes and contents of the arrays A and B, be faster? Cook up such an algorithm. As an exercise, program and test it. I did that, and my project is using the code! So, this exercise is real!

Then, you will see some possible variations in the algorithm.
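One possible answer to the exercise (a sketch only, not necessarily the algorithm the commenter implemented): if B is sorted first, each binary search can resume where the previous one ended, so lookups later in B never re-scan the front of A. A small version using Python’s standard `bisect` module:

```python
from bisect import bisect_left

def lookup_all(A, B):
    # A is sorted ascending. Sort B, then look up each element of B in A,
    # restarting each binary search at the position where the previous
    # lookup ended instead of at the front of A.
    found = {}
    lo = 0
    for b in sorted(B):
        lo = bisect_left(A, b, lo)
        found[b] = lo < len(A) and A[lo] == b
    return found

lookup_all([1, 3, 5, 7, 9], [9, 2, 5])   # {2: False, 5: True, 9: True}
```

When B is large relative to A, the searches together degenerate into a single merge-style pass over both arrays, which can beat one full binary search of A per element of B.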
Do some algorithm applied math computational time complexity analysis, as is often done in Knuth above, to find, in some appropriate and realistic sense, the fastest possible such algorithm, maybe depending on the sizes and contents of the arrays A and B. Done well, that work might be publishable in a good peer-reviewed journal of original research. At some of the world’s best research universities, the main requirement for a Ph.D. is “an original contribution to knowledge worthy of publication”. Even if you don’t get a Ph.D. out of it, you might get some high end contacts in the research computer science community and a high end college admission and scholarship.

Step 4:

For artificial intelligence and machine learning in computer science now, a central issue is the polar decomposition in matrix theory. Below is a simple proof of the more general complex case but only for the non-singular case. An exercise is to invent and/or find a proof for the singular case. Note: consider a perturbation argument.

Then, sure, watch the MIT lecture on quantum mechanics at

https://www.youtube.com/wat…

and try to generalize this result to Hermitian (self-adjoint) and unitary (distance preserving) operators on Hilbert space as in quantum mechanics. If you need help, e.g., some simple explanations of quantum mechanics and/or Hilbert space, then ask your teachers and/or ask them to ask some professors, etc. Or just call up some math or physics profs at, say, Courant or Columbia.

Polar decomposition? Okay. Since apparently the standard HTML elements sup and sub don’t work at Disqus, here I borrow syntax from D. Knuth’s mathematical word processing software TeX.

Theorem (polar decomposition): For positive integer n and n x n complex non-singular matrix A, there exist n x n Hermitian H and n x n unitary U so that

A = HU

That is, somewhat intuitively, for a positive integer n and the set of complex numbers C, A is a function A: C^n --> C^n and, there, the work of A is just, first, to rotate and/or reflect keeping distances constant and, then, stretch or shrink along mutually perpendicular axes. So a circle gets rotated and/or reflected and then stretched or shrunk into an ellipse. So, that is all a square matrix, that is, a linear transformation on C^n, can do. Similarly for the real numbers R and R^n.

Proof:

Let A* denote the transpose of the complex conjugate of A. Let K = AA*. Then K is Hermitian, that is,

K* = (AA*)* = (A*)* A* = AA* = K.

Then there exist unitary Q and diagonal D = [d_ij] with d_ii > 0 such that

K = Q*DQ

Let

H = Q*D^(1/2)Q

Then H is Hermitian, i.e.,

H* = (Q*D^(1/2)Q)* = Q*D^(1/2)Q = H

Then

H^(-1) = (Q*D^(1/2)Q)^(-1) = Q*D^(-1/2)Q

and

(HH)^(-1) = H^(-2) = Q*D^(-1)Q = K^(-1)

Let

U = H^(-1)A

We show that U is unitary, i.e., that U*U = I.

U*U = (H^(-1)A)*(H^(-1)A)
    = A*(H^(-1))*(H^(-1))A
    = A*(H^(-1))(H^(-1))A
    = A*(H^(-2))A
    = A*(K^(-1))A
    = A*(AA*)^(-1)A
    = A*((A*)^(-1))(A^(-1))A
    = I I
    = I

so that U is unitary. Then we have A = HU for Hermitian H and unitary U. Done.

So, in this proof, all we did is a little matrix algebra using the standard properties of Hermitian and unitary matrices. That is, we didn’t have to bring in anything but routine matrix algebra. For such an important result, interesting.

The result also holds when A is singular, and one can construct a proof similar to the above but with some more details.
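The theorem is also easy to check numerically. Here is a small sketch (my own, assuming NumPy is available) that builds H and U from the singular value decomposition A = W S V*: then H = W S W* is Hermitian positive semidefinite, U = W V* is unitary, and A = HU.

```python
import numpy as np

def polar_left(A):
    # Left polar decomposition A = H U via the SVD A = W S V*:
    # H = W S W* is Hermitian positive semidefinite, U = W V* is unitary.
    W, s, Vh = np.linalg.svd(A)
    H = W @ np.diag(s) @ W.conj().T
    U = W @ Vh
    return H, U

A = np.array([[1.0, 2.0], [3.0, 4.0]])
H, U = polar_left(A)
# H @ U reproduces A; U @ U.conj().T is the identity; H equals H.conj().T
```

This also hints at the exercise about the singular case: the SVD exists for singular A too, and the same construction goes through, only H loses strict positive definiteness.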
@twaintwain I saw this article and thought of you: “Why Women Should Lead our A.I. Future” https://medium.com/intuitio…
@mlbar:disqus — Thanks. Carlos Perez of Intuition Machine follows me on LinkedIn, where I’m very vocal about how male-only frameworks and definitions have resulted in AI’s biased data and inability to understand Natural Language, our cultures, our values.

I’m now part of the Consciousness Prior AI project led by one of the three “Godfathers of Deep Learning,” Yoshua Bengio of Montreal. One of its key objectives is to find another way of tooling the machines to do Natural Language Understanding.