Category: Education

  • What E-mail teaches us about Bitcoin and Block Chain

    E-mail is widely used, and everyone knows what it is. Bitcoin is a hot new techno-bauble, and Bitcoin technologies like block chain are getting lots of attention and money. It turns out that e-mail has a great deal to teach us about Bitcoin and its technologies. Here’s the punch line: in spite of its ubiquity, practically no one understands how e-mail works, and this causes huge errors with practical consequences. By comparison, Bitcoin and its spawn are incredibly complicated; even most of the people who do understand e-mail have little chance of understanding Bitcoin. Think about the consequences of this, please.

    Do You Know How E-mail Works?

    E-mail is simple, right? You log in to your e-mail account, fill out the To and Subject fields, maybe add a couple of people in the CC field, write your e-mail, and press Send. Then some magic happens, and the e-mail shows up in the in-boxes of the people to whom you sent it. You can read your own e-mail by looking at the items in your in-box, and even go to your sent-mail folder and look at what you sent. It’s simple, wonderful and true! For the vast majority of the time, it’s fine to leave “then some magic happens” alone.

    The trouble comes when trouble comes, i.e., when there’s some special circumstance that requires knowing something about how that “magic” in the middle works. That’s when it comes out that almost no one has a clue about what’s going on, even in something as simple and ubiquitous as e-mail.

    The IRS e-mail case

    There are lots of examples, but the issues involving e-mail at the IRS which have been in the news off and on for the last couple of years are a good case in point. Here’s the lead paragraph from Wikipedia on the subject:

    [Screenshot: “IRS targeting controversy,” Wikipedia]

    Now, remember – I’m not talking about the merits of the issue on one side or the other. I’m solely talking about the knowledge exhibited of how e-mail works, and the practical consequences of that knowledge. Read this juicy lead from an AP story on the subject:

    [Screenshot: AP story, “IRS Head Says No Laws Broken In Loss Of Emails”]

    Here are the key points:

    • In June 2011, Lois Lerner’s computer crashed.
    • This resulted “in the loss of records.”
    • It was determined that the records on the hard drive, i.e., Lois Lerner's emails, were gone forever

    I am aghast. Agog. At a loss for words. I’d like to be shocked at the “depth” of misunderstanding, but I think it’s more appropriate to be shocked at the “shallowness” of misunderstanding exhibited in this quote, and in the heads of all the IRS employees, the FBI, Congressional staffers, the archivists, and all the journalists with their fancy degrees from fancy schools.

    Here is the core concept that everyone involved on every side seems to agree on:

    The e-mails Lois Lerner wrote are uniquely stored on the hard drive of her personal computer. If it is true that the hard drive is severely damaged, then the e-mails are “gone forever.”

    The simple thing

    Even from the simplistic view of how e-mail works, every e-mail is either a draft or is sent to someone. If it's been given an accurate address, it arrives: it's in the receiver's in-box, and perhaps eventually in their deleted-mail folder. Since the issue involved e-mails not only received by Ms. Lerner but also ones sent by her, presumably to other IRS employees, there is an obvious strategy: search the e-mail of every IRS employee to whom Ms. Lerner could have sent an e-mail, and see if she did. That's the magic of e-mail: the sender has a copy of what was sent, the recipient has a copy of what was received, and so there are at least two copies of every message.
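    The search strategy above can be sketched in a few lines. This is a toy model with made-up mailboxes, names and messages, not the IRS's actual systems; in real life you'd run the equivalent query against each employee's mail server.

```python
# Toy mailboxes for a handful of employees. Each inbox is a list of
# (from, subject) pairs. All addresses and subjects are hypothetical.
inboxes = {
    "lowe@irs.gov":  [("lerner@irs.gov", "Re: applications"),
                      ("smith@irs.gov",  "Lunch?")],
    "smith@irs.gov": [("lowe@irs.gov",   "Budget"),
                      ("lerner@irs.gov", "Schedule")],
    "jones@irs.gov": [("smith@irs.gov",  "Minutes")],
}

def mail_sent_by(sender, inboxes):
    """Recover a sender's outgoing mail from the recipients' copies."""
    found = []
    for owner, messages in inboxes.items():
        for frm, subject in messages:
            if frm == sender:
                found.append((owner, subject))
    return found

print(mail_sent_by("lerner@irs.gov", inboxes))
# [('lowe@irs.gov', 'Re: applications'), ('smith@irs.gov', 'Schedule')]
```

    Even if the sender's own copies vanish, the recipients' copies reconstruct what was sent.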

    Have you ever read that simple thought anywhere else? Neither have I.

    The "deep" thing, requiring understanding of how it works

    Now we get to the real point. An e-mail address has two main parts: the name, and the domain. The name is the part before the @ and the domain is the part after the @, for example Lois@IRS.gov. Similarly, all e-mail systems have two main pieces of software involved: a client and a server. Software by Microsoft is widely used in governments and corporations. Outlook is the client software, which runs on the computer on which you read and write e-mails. Exchange is the server software, which runs in a data center somewhere. Exchange is a program with a database holding the e-mails, address books and calendars for a whole bunch of users. A domain like IRS.gov is implemented with many Exchange servers, each with the e-mails of a particular collection of IRS workers, typically a couple for each physical location.

    When Ms. Lerner wrote an e-mail, she used her computer running an e-mail client such as Outlook. When she hit the Send button, the e-mail immediately went to her Exchange server, which filed it away. The server then found the Exchange server(s) of the recipient(s) and passed the e-mail along, and those servers in turn delivered it to the recipients' Outlook clients. Shortly after Ms. Lerner sent an e-mail to her colleague Mr. Lowe, it was stored in no fewer than four places, including a couple of servers. In addition, assuming the government had at least moderately responsible Exchange administration, the e-mails were further copied to replicas, on- and off-site, and periodically backed up to yet another medium and location.
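    To make the relay concrete, here is a toy Python simulation of the flow just described. It is a sketch, not real Outlook/Exchange code (real servers speak SMTP to one another), and all names are hypothetical; the point is simply that copies pile up on servers that have nothing to do with the sender's hard drive.

```python
# A toy model of the client -> server -> server -> client relay.
class Server:
    def __init__(self, domain):
        self.domain = domain
        self.mailboxes = {}  # user -> {"sent": [...], "inbox": [...]}

    def box(self, user):
        return self.mailboxes.setdefault(user, {"sent": [], "inbox": []})

def send(sender, recipient, body, servers):
    s_user, s_dom = sender.split("@")
    r_user, r_dom = recipient.split("@")
    msg = {"from": sender, "to": recipient, "body": body}
    servers[s_dom].box(s_user)["sent"].append(msg)   # copy on sender's server
    servers[r_dom].box(r_user)["inbox"].append(msg)  # copy on recipient's server
    return msg

servers = {"irs.gov": Server("irs.gov")}
send("lerner@irs.gov", "lowe@irs.gov", "Please review.", servers)

# The message now survives in two server-side places, regardless of
# what happens to the sender's own computer:
print(len(servers["irs.gov"].box("lerner")["sent"]))   # 1
print(len(servers["irs.gov"].box("lowe")["inbox"]))    # 1
```

    Add the two client-side copies (the sender's sent-mail folder and the recipient's in-box) and you get the four places mentioned above, before counting replicas and backups.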

    There are other e-mail clients and other e-mail servers. I have no information about what the IRS actually used. But this is how e-mail works! There are clients. There are servers, which serve a number of users/clients. When a human writes an e-mail, it goes from her client to her server to the recipient's server to the recipient's client. As a result, it should have made no difference whatsoever that Ms. Lerner's computer "crashed." It wouldn't matter if it suddenly grew wings and flew off to Tahiti to frolic in the waves. Any e-mails that Ms. Lerner wrote were securely stored on the e-mail server she shared with other users, in a data center, and on multiple replicas, backups and disaster-recovery sites.

    Ms. Lerner's computer crashed; people supposedly spent time attempting to recover e-mails from it, and when they failed, declared them "lost forever." Everyone else involved, including journalists and commentators and experts of all sorts, accepted that as the state of affairs ("well, if her hard disk crashed, what can you do, ya know?"). All of this demonstrates that none of them has a clue about how e-mail works. It's like not knowing that cars have engines. It's that bad.

    What e-mails have to do with Bitcoin and Block Chain

    Compared to many other computer technologies, e-mail is simple. Compared to many other computer technologies, Bitcoin is complex. Even worse, what's interesting about Bitcoin isn't Bitcoin the crypto-currency — it's the block chain technology on which it's implemented. Block chain is getting all sorts of attention from financial technology people and investors. I won't review it here, but a brief look at the action will convince you it's frothy.

    What if investors, financial industry executives and Bitcoin technology company leaders are as informed about block chain as everyone involved was/is about e-mail? What if they're making important decisions based on critical observations as sound as "well, the hard drive is kaput, so the e-mail is gone, and that's that"? If the important actors in the e-mail drama exhibit paper-thin understanding and wrong-headed conclusions, are we to assume that all the folks involved in Bitcoin and block chain are geniuses by comparison?

    Place your bets, people. I know what I'm betting on.

  • How much is a computer science degree worth?

    The median annual wage of a college grad with a computer, math or statistics degree is over $70,000. This is better than the vast majority of college majors, and compares really well with the median annual wage of high school grads, which is under $40,000. The conclusions are clear:

    • Go to college
    • Major in computers, math, statistics, architecture or engineering
    • Otherwise, you’re screwed.
    • Well, all right, majoring in education or psychology leads to crappy salaries, but at least it’s better than being just a high school grad.

    Here is the data: Wages of college grads

    This is a test!

    Trigger Warning! From here to the end of this post could trigger feelings of inadequacy among certain people. Others could feel anger towards the author, causing potentially dangerous heightening of the pulse rate. Others could feel that the author is hopelessly arrogant or elitist, resulting in generally uncomfortable feelings. So read on at your own risk.

    This post is a test of whether you’re qualified to be a top computer programmer, or an outstanding achiever in any technical/quantitative field. The thoughts in this post up to this point summarize what the article accompanying the chart intends you to conclude, and what most people will think on looking at the chart.

    The author of the article clearly failed the test.

    Did you?

    Understanding the data

    If you haven’t already, look at the chart again. Note the big, fat explanation at the top. The endpoints of the lines represent 25th and 75th percentiles. The 75th percentile for high school grads is about $50,000. This means that a quarter of high school grads have salaries above that. The 25th percentile for computer etc. grads is roughly $50,000, perhaps a little more. Which means that a quarter of the computer etc. grads make less than $50,000. In summary: a quarter of high school grads have salaries that are greater than a quarter of college grads with degrees in computers, math or statistics. Read that sentence again. Get it? Did you figure it out before reading this?
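    You can check the overlap logic with made-up numbers. The samples below are invented to mimic the chart's rough shape (they are not the chart's data); Python's statistics.quantiles computes the quartiles:

```python
from statistics import quantiles

# Illustrative salary samples in thousands of dollars, invented to
# resemble the chart: HS median under $40k, CS median over $70k.
hs = [22, 28, 33, 38, 41, 46, 52, 58]    # high school grads
cs = [44, 48, 54, 66, 78, 88, 98, 112]   # computer/math/stat grads

hs_q1, hs_med, hs_q3 = quantiles(hs, n=4)  # quartiles
cs_q1, cs_med, cs_q3 = quantiles(cs, n=4)

print(hs_q3)          # 50.5 -> a quarter of HS grads earn MORE than this
print(cs_q1)          # 49.5 -> a quarter of CS grads earn LESS than this
print(hs_q3 > cs_q1)  # True: the distributions overlap
```

    Even though the CS median (72 here) towers over the HS median (39.5), the top quarter of one group still out-earns the bottom quarter of the other.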

    Implications for Hiring Computer Programmers

    I hope you’ve just seen why, when I’ve hired people, I really haven’t given a %^* about their education or their degree – in fact, the higher the education and the fancier the degree, the more concerned I am to weed out the folks with bad attitudes, the ones who have been granted the knowledge and the certification to prove it, and want to spend their lives resting on and/or milking their degrees. Some of the best programmers I’ve met in decades of programming did not have college degrees. Most of the ones who are less than excellent and/or have “risen” in management are experts at glancing at things and reaching the wrong conclusions. Like most people do when looking at the salary chart above. FWIW, here are some good examples of drop-outs who did pretty well. Including the Wright Brothers — after all, how hard can inventing the airplane be?

    The people who are best in computing combine big-picture, visual/conceptual abilities with an utterly uncompromising attention to detail. Computer programs shouldn’t have even a single byte wrong, and the bytes should be selected and arranged according to a deep conceptual understanding of the problem at hand. Amateurs and pretenders don’t do well at either of these jobs, much less in combination.

    Conclusion

    If you care about attracting, selecting and retaining the very best software people, you would be well advised to alter your hiring practices as required to select the people who … get ready for it … can actually do the work! Really well! Having degrees or whatever is not nearly as correlated to that outcome as you might think.

  • Math and Computer Science vs. Software Development

    In a prior post, I demonstrated the close relationship between math and computer science in academia. Many posts in this blog have delved into the pervasive problems of software development. I suggest that there is a fundamental conflict between the perspectives of math and computer science on the one hand, and the needs of effective, high quality software development on the other hand. The more computer science you have, the worse your software is; the more you concentrate on building great software, the more distant you grow from computer science.

    If this is true, it explains a great deal of what we observe in reality. And if true, it defines and/or confirms some clear paths of action in developing software.

    A Math book helped me understand this

    I've always loved math, though math (at least at the higher levels) hasn't always loved me. So I keep poking at it. Recently, I've been going through a truly enjoyable book on math by Alex Bellos.

    [Image: cover of Alex Bellos's book]

    It's well worth reading for many reasons. But this is the passage that shed light on something I've been struggling with literally for decades.

    [Image: the quoted passage from Bellos's book]

    When we learn to count, we're learning math that's been around for thousands of years. It's the same stuff! Likewise when we learn to add and subtract. And multiply. When we get into geometry, which for most people is in high school, we're catching up to the Greeks of two thousand years ago.

    As Alex says, "Math is the history of math." Kids who are still studying math by the age of 18 have gotten all the way to the 1700's!

    These are not new facts for me. But somehow when he put together the fact that "math does not age" with the observation that in applied science "theories are undergoing continual refinement," it finally clicked for me.

    Computers Evolve faster than anything has ever evolved

    Computers evolve at a rate unlike anything else in human experience, a fact that I've harped on. I keep going back to it because we keep applying methods developed for things that evolve at normal rates (i.e., practically everything else) to software, and are surprised when things don't turn out well. The software methods that highly skilled software engineers use are frequently shockingly out of date, and the methods used for management (like project management) are simply inapplicable. Given this, it's surprising, and a tribute to human persistence and hard work, that software ever works.

    This is what I knew. It's clear, and seems inarguable to me. Even though I'm fully aware that the vast majority of computer professionals simply ignore the observation, it's still inarguable. The old "how fast do you have to run to avoid being eaten by the lion" joke applies to the situation. In the case of software development, all the developers just stroll blithely along, knowing that the lions are going to eat a fair number of them (i.e., their projects are going to fail), and so they concentrate on distracting management from reality, which usually isn't hard.

    What is now clear to me is the role played by math, computer science and the academic establishment in creating and sustaining this awful state of affairs, in which outright failure and crap software is accepted as the way things are. It's not a conspiracy — no one intends to bring about this result, so far as I know. It's just the inevitable consequence of having wrong concepts.

    Computer Science and Software Development

    There are some aspects of software development which are reasonably studied using methods that are math-like. The great Donald Knuth made a career out of this; it's valuable work, and I admire it. Not only do I support the approach when applicable, I take it myself in some cases, for example with Occamality.

    But in general, most of software development is NOT eternal. You do NOT spend your time learning things that were first developed in the 1950's, and then if you're good get all the way up to the 1970's, leaving more advanced software development from the 1980's and on to the really smart people with advanced degrees. It's not like that!

    Yes, there are things that were done in the 1950's that are still done, in principle. We still mostly use "von Neumann architecture" machines. We write code in a language and the machine executes it. There is input and output. No question. It's the stuff "above" that that evolves in order to keep up with the opportunities afforded by Moore's Law, the incredible increase of speed and power.

    In math, the old stuff remains relevant and true. You march through history in your quest to get near the present in math, to work on the unsolved problems and explore unexplored worlds.

    In software development, you get trapped by paradigms and systems that were invented to solve a problem that long since ceased being a problem. You think in terms and with concepts that are obsolete. In order to bring order to the chaos, you import methods that are proven in a variety of other disciplines, but which wreak havoc in software development.

    People from a computer science background tend to have this disease even worse than the average software developer. Their math-computer-science background taught them the "eternal truth" way of thinking about computers, rather than the "forget the past, what is the best thing to do NOW" way of thinking about computers. Guess which group focuses most on getting results? Guess which group would rather do things the "right" way than deliver high quality software quickly, whatever it takes?

    Computer Science vs. Software Development

    The math view of history, which is completely valid and appropriate for math, is that you're always building on the past, standing on the shoulders of giants.

    The software development view of history is that while some general things don't change (pay attention to detail, write clean code, there is code and data, inputs and outputs), many important things do change, and the best results are obtained by figuring out optimal approaches (code, technique, methods) for the current situation.

    When math-CS people pay attention to software, they naturally tend to focus on things that are independent of the details of particular computers. The Turing machine is a great example. It's an abstraction that has helped us understand whether something is "computable." Computability is something that is independent (as it should be) of any one computer. It doesn't change as computers get faster and less expensive. Like the math people, the most prestigious CS people like to "prove" things. Again, Donald Knuth is the poster child. His multi-volume work solidly falls in this tradition, and exemplifies the best that CS brings to software development.
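    As a concrete (if toy) illustration of that abstraction, here is a minimal Turing-machine simulator. The state names and transition table are mine, invented for the example; it computes the unary successor (append one more 1), which works the same on any computer, fast or slow, exactly as the text says.

```python
def run_tm(tape, transitions, state="scan", blank="_", max_steps=100):
    """Run a one-tape Turing machine until it reaches the 'halt' state."""
    cells = list(tape)
    pos = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        if pos == len(cells):          # extend the tape with a blank on demand
            cells.append(blank)
        sym = cells[pos]
        write, move, state = transitions[(state, sym)]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(cells).rstrip(blank)

# Transition table for the unary successor: scan right past the 1s,
# write a 1 on the first blank, then halt.
succ = {
    ("scan", "1"): ("1", "R", "scan"),
    ("scan", "_"): ("1", "R", "halt"),
}

print(run_tm("111", succ))  # 1111
```

    Nothing here depends on memory prices or clock speeds, which is exactly why the CS mind finds it so attractive.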

    The CS mind wants to prove stuff, wants to find things that are deeply and eternally true and teach others to apply them.

    The Software Development mind wants to leverage the CS stuff when it can help, but mostly concentrates on the techniques and methods that have been made possible by recent advances in computer capabilities. By concentrating on the newly-possible approaches, the leading-edge software person can beat everyone else using older tools and methods, delivering better software more quickly at lower cost.

    The CS mind tends to ignore ephemeral details like the cost of memory and how much is easily available, because things like that undergo constant change. If you do something that depends on rapidly shifting ground like that, it will soon be irrelevant. True!

    In contrast, the Software Development mind jumps on the new stuff, caring only that it is becoming widespread, and tries to be among the first to leverage the newly-available power.

    The CS mind sits in an ivory tower among like-minded people like math folks, sometimes reading reports from the frontiers, mostly discarding the information as not changing the fundamentals. The vast majority of Software Development people live in the comfortable cities surrounding the ivory towers doing things pretty much the way they always have ("proven techniques!"). Meanwhile, the advanced Software Development people are out there discovering new continents, gold and silver, and bringing back amazing things that are highly valued at home, though not always at first, and often at odds with establishment practices.

    Qualifications

    Yes, I'm exaggerating the contrast between CS and Software Development. Sometimes developers are crappy because they are clueless about simple concepts taught in CS intro classes. Sometimes great CS people are also great developers, and sometimes CS approaches are hugely helpful in understanding development. I'm guilty of this myself! For example, I think the fact that computers evolve with unprecedented speed is itself an "eternal" (at least for now) fact that needs to be understood and applied. I argue strongly that this fact, when applied, changes the way to optimally build software. In fact, that's the argument I'm making now!

    Nonetheless, the contrast between CS-mind and Development-mind exists. I see it in the tendency to stick to widely used, accepted practices that are no longer optimal, given the advances in computers. I see it in the background of developers' preferences, attitudes and general approaches.

    Conclusion

    The problem in essence is simple:

    Math people learn the history of math, get to the present, and stand on the shoulders of giants to advance it.

    Good software developers master the tools they've been given, but ignore and discard the detritus of the past, and invent software that exploits today's computer capabilities to solve today's problems.

    Most software developers plod ahead, trying to apply their obsolete tools and methods to problems that are new to them, ignoring the new capabilities that are available to them, all the while convinced that they're being good computer science and math wonks, standing on the shoulders of giants like you're supposed to do.

    The truly outstanding people may take computer science and math courses, but when they get into software development, figure out that a whole new approach is needed. They come to the new approach, and find that it works, it's fun, and they can just blow past everyone else using it. Naturally, these folks don't join big software bureaucracies and do what everyone else does. They somehow find like-minded people and kick butt. They take from computer science in the narrow areas (typically algorithms) where it's useful, but then take an approach that is totally different for the majority of their work.

  • Math and Computer Science in Academia

    Math and music are incredibly inter-related, as has been understood at least since Pythagoras. They are arguably more intimately bound than math and computer science. Yet math and music are never studied in a single academic department, while math and computer science frequently are. Hmmm….

    Math and Computer Science are joined at the hip in Academia

    Math and Computer Science are so intimately related in academia that they are frequently part of the same department. This is true at elite institutions like Cal Tech. [Screenshot: Caltech math/CS department page]

    Math and Computer Science are in the same department at private liberal arts schools, too, like Wesleyan. [Screenshot: Wesleyan math/CS department page]

    They're a single department at major state universities, like Rutgers. [Screenshot: Rutgers math/CS department page]

    Same thing at lesser state schools. Here's how it goes at Cal State East Bay. [Screenshot: Cal State East Bay math/CS department page]

    I make no argument that this is universal. Don't need to. If you search like I did, you'll find that putting math and computer science in a single department is a common practice.

    Why are Math and Computer Science so Academically Intimate?

    Most people seem to think that math and computer science are pretty much the same thing. Consider this:

    • Most "normal" people who try either of them don't get very far.
    • The people who are way into either of them are really nerdy.
    • If you're good at one of them, there's a good chance you'll do well at the other.
    • They are incredibly detail-oriented. They're full of symbols and strange languages.
    • What you do doesn't seem to be physical at all. What are you doing while programming or doing math? Mostly staring into space or scribbling strange symbols, it seems.
    • You can write programs that do math, and math applies broadly to computing.

    Meanwhile, there are other remarkably similar things that don't end up in the same department. Consider the "life sciences." They all have loads of things in common. Everything they study begins life, develops, lives for a while, maybe has offspring, and dies. DNA is intimately involved. Oxygen and carbon dioxide play crucial roles. But when have you ever seen a department of botany and zoology? Like never, right? In the humanities it's just as extreme. Ever hear of a department of French and German? Academics already fight enough among themselves without that…

    Academics clearly think that math and computer science aren't just similar or highly related; if they were, they'd treat them the way they do languages or life sciences. A broad spectrum of academics think they're so interwoven that there are compelling reasons for studying them together. Thus a single department that has them both.

    Math and Computer Science, a Marriage made in ????

    It's a common practice for math and computer science to be studied together. Obviously, most people have no trouble with the concept. Of all the things to question or worry about in the world, this seems pretty low on the list.

    I would like to change this. I'd like to cause trouble where there is none today — or rather, I'd like to EXPOSE the deep-seated, far-reaching, trouble-causing consequences of the fact that everyone thinks it's quite alright that math and computer science are thought of as pretty much two sides of the same coin. In fact, I will argue that the math-computer-science-marriage is just fine for math — but the root cause of a remarkable variety of intractable problems that plague software development.

    Note that I did a quick shift there. I have no problem with math and computer science being together. They kinda belong together. My problem is that everyone thinks that you study computer science in school so that you're qualified to do software development after graduating. And that software development shops require CS degrees, and pay more for advanced degrees in CS, on the theory that if some is good, more must be better.

    I will flesh this out and explain why it's the case in future posts. But I thought throwing down the gauntlet was worth doing. Or at least fun!

  • Internet Driver’s Licenses Needed for Users

    We give kids sex education. We give them driver education, and require a driver test and license before driving. But we let any fool onto the internet to wreak whatever havoc they can on themselves and others without a second thought. It's time for a change!

    Education for Meaningful Use

    Education on the basics of how the internet and associated technologies work, and how to control, respond to and interpret what you see, is totally neglected. There are no significant efforts that I know of to make people educated consumers of this important, ubiquitous service. But there is a more important issue…

    Education for Safety

    By far the most important subject for internet education is safety. Maintaining internet safety has some similarities to general safety, but is different in important ways.

    Internet "driving" safety

    The most important aspects of safety while driving are avoiding driving while impaired in any way and paying sharp attention to the road and other vehicles at all times. Driving while impaired by drugs or alcohol, or while texting or talking, is a well-recognized risk factor. [Image: distracted driving]

    So imagine how hazardous internet driving must be when people don't even know how to read the road signs (the URL's) and can't tell that they've wandered onto a road constructed by criminals specifically for the purpose of enabling them to steal your car, drive it to your bank and take out a big withdrawal! But that's exactly what it is! Here's an example of a more brazen attack (image from a good guy, Yoo Security), demanding that you send the money yourself: [Screenshot: ICE-themed ransomware demand]
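    Reading the "road signs" is mostly a matter of knowing which part of a URL actually identifies the site. Here's a small Python sketch using a hypothetical phishing-style address (the URL is invented for illustration):

```python
from urllib.parse import urlparse

# A link that *looks* like it goes to a bank:
url = "http://www.chase.com.account-verify.example.net/login"

host = urlparse(url).hostname
print(host)  # www.chase.com.account-verify.example.net

# What matters is the END of the hostname: the registered domain.
# (A real checker would consult the public-suffix list; taking the
# last two labels is a simplification that works for this example.)
registered = ".".join(host.split(".")[-2:])
print(registered)  # example.net -- not chase.com at all
```

    The familiar bank name at the front of the hostname is pure decoration; the browser will happily take you to example.net.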
    Unfortunately, there are criminals out there who have grown far beyond simple smash-and-grab operations. These sophisticated criminals with a long-term view trick you into "driving" onto their criminally-constructed "road" for the sole purpose of making your car an instrument for stealing from other people or organizations. They can make your computer into a zombie that participates in botnets. It can serve that purpose for minutes or years without your awareness. Is the problem big? You betcha. There are more computers that have been hi-jacked into botnets (maybe yours!) than most people are aware of:

    [Chart: estimated botnet sizes]
    Sometimes, of course, the criminals are stupid, greedy or malicious — I guess those are the drop-outs from the "criminals should be good citizens" certification program. So your hi-jacked device could slow to a crawl, do weird things, look over your shoulder as you type until they get the information needed to drain your bank account or max out your credit card, or even (just because it's fun!) wipe out your machine while leaving some cute "it was me! Have a nice life!" message on your screen.

    Internet E-mail fraud

    How often do you get a paper letter purporting to be from your bank, asking you to mail back your account number just so they can verify that everything's OK? If you got one, would you respond as requested? Probably not, and apparently you're not alone: criminals are the supreme capitalists, and they abandon unprofitable efforts before long.

    But how about letters on the internet, i.e., e-mail? Along with everyone I know, I get an amazing number of criminal solicitations every day, ranging from the laughable (at least to me) to the amazingly credible. Since criminals are data-driven capitalists, the only explanation for the persistence of these efforts is that enough of them work to cover the costs and trouble of running the schemes, and certainly to beat getting a legal job. I've seen fewer solicitations from Nigeria lately, but the slack has been taken up by Libya.

    Here's one of the new breed from Libya:

    [Screenshot: scam e-mail from Libya]

    Here is a somewhat more plausible one from a place that really could be your bank:

    [Screenshot: phishing e-mail posing as Chase]

    Conclusion

    Uneducated internet users cause billions of dollars of harm to themselves and others every year. You'd think this would result in an outcry for education from those users and the people who know them. You might think it would merit a bit of attention from the institutions that so assiduously and expensively educate, authorize, license and otherwise keep us on the straight and narrow. When I'm in Central Park in New York, there are rangers watching my every move; they set me straight when I ride my bike where I'm not supposed to, or walk in one of the ever-changing restricted areas. The conclusion is obvious: every move I make in the Park is more worthy of watchful restriction by people in uniforms than the millions of actions on the internet that seem, at least to me, far more destructive. I must be missing something.

  • Computer History

    In software, history is ignored and its lessons spurned. What little software history we are taught is often simply wrong. Everyone who writes or uses software pays for this, and pays big.

    But we know about history in software — there's Babbage, the ENIAC, etc.

    Yes, we've all heard about various people who are said to have invented modern computing. A shocking amount of what we are taught is WRONG.

    Babbage is a case in point. People just love to go on and on about him. There are problems, though. I'll just mention a couple.

    [Portrait of Charles Babbage, 1860]

    One problem is that his machines simply didn't work, even after decades of work and huge amounts of skilled help and money. He must have known they wouldn't; although he was personally wealthy, it was other people's money he spent on his famous dalliance.

    Another problem is that his best idea wasn't his. The idea of using punched cards to contain the program was invented in France and was a key aspect of the Jacquard Loom — a machine that pre-dated all of his work, and a machine that actually worked and was in widespread use.

    [Photograph of Jacquard loom punched cards]

    The ENIAC is another good example of what appears to be the typical pattern in computing: someone invents a good thing and makes it work, and then someone else steals it, takes credit for it, and tries to cover up the theft, often without delivering results as good as the original.

    [Photograph of the ENIAC]

    If you only read the standard literature, you would still be convinced that the ENIAC and its inventors were giants of the field. Once you read everything, you discover that reality is more interesting. It turns out that the inventors of the ENIAC were "inspired" by prior inventions, much like Babbage and the Jacquard Loom. In this case, the inspiration was the Atanasoff-Berry Computer.

    [Drawing of the Atanasoff-Berry Computer]

    Here is an excerpt from the ruling in the patent dispute that settled the issue:

    Judge Larson had ruled that John Vincent Atanasoff and Clifford Berry had constructed the first electronic digital computer at Iowa State College in the 1939-1942 period. He had also ruled that John Mauchly and J. Presper Eckert, who had for more than twenty-five years been feted, trumpeted, and honored as the co-inventors of the first electronic digital computer, were not entitled to the patent upon which that honor was based. Furthermore, Judge Larson had ruled that Mauchly had pirated Atanasoff's ideas, and for more than thirty years had palmed those ideas off on the world as the product of his own genius.

    Other fields don't need history — why should software?

    Not true. Other fields are saturated with history.

    Politicians study history in general and the last election in particular. Fiction writers frequently read fiction, current and historic. Generals study old battles for their lessons; even today at West Point, they read about the Civil War. Learning physics is like going through the history of physics, from Galileo and Newton through Planck and Einstein to the present. Even the terms used in physics remind you of its history: hertz, joules and Brownian motion.

    Software, by contrast, is almost completely ahistorical. Not only are most people involved uninterested in what happened ten years ago; even the last project is dismissed as unworthy of consideration – it's "history."

    Consequences of the lack of history

    War colleges study past wars for the highly pragmatic purpose of finding out how they were won or lost. What was it the winner did right? Was it better weapons? Better strategy? Better people? Some combination? And how exactly did the loser manage to lose? Was it a foregone conclusion, or was defeat snatched from the jaws of victory? People who conduct wars are serious about their history — they want to win!!

    In software, no one is interested in history. Everyone thinks they know the "right" way to build software, and thinks that the only possible source of loss is failing to do things the "right" way: the requirements weren't clear; the requirements were changed; I wasn't given enough time to do a proper design; there was no proper unit testing; the lab for testing was insufficiently realistic. The list of complaints and excuses is endless, and their net effect is always the same: crappy software and whining — I need more people, more time and more money. Because studying history is so rare, few are exposed to the software "wars" that were fought and won by teams that didn't follow those rules.

    There is only one conclusion to be drawn: software people would rather lose with lots of excuses than win by doing things the "wrong" way. Ignoring history is a great way to stay in that comfortable cocoon.

    When software history becomes as important a part of computer science education as physics history is of physics, we'll know it's approaching credibility. Until then, everything about computer science, education and practice will continue to be a cruel joke.
