Category: Computer history

  • Don’t Know Much about History

    Software people generally know very little about software history, and that's OK with them. That's too bad, because there's a lot to learn from software history, and it can help you now!

    Wonderful World

    In 1960, Sam Cooke released a single called "Wonderful World."

    [Image: cover of Sam Cooke's "Wonderful World" single]

    Here are some of the lyrics:

    [Image: an excerpt of the song's lyrics]

    I sure hope you can win that girl or boy you're after in spite of all that not-knowing!

    The Wonderful World of History

    Politicians study history in general and the last election in particular. Fiction writers frequently read fiction, current and historic. Generals study old battles for their lessons; even today at West Point, they read about the Civil War. Learning physics is like going through the history of physics, from Galileo and Newton through Planck and Einstein to the present. Even the terms used in physics remind you of its history: hertz, joules and Brownian motion. Math is the same way. Whatever you're learning was first established at some point in history, and remains as valid and applicable to the present as when first discovered.

    Software, by contrast, is almost completely ahistorical. Not only are most people involved uninterested in what happened ten years ago, even the last project is unworthy of consideration – it’s “history.”

    History isn't just for historians

    How did we learn about biological evolution? By observing species and trying to figure out their history. How did we learn about genes and DNA? By trying to figure out the mechanisms that make organisms work through time. Geology? Gee, I wonder how those mountains got there? And what happened so that I'm finding fossils of creatures that lived in the ocean up there?

    A good deal of science is historical in nature. We try to construct theories that explain how things got to be the way they are; and then we run tests or make lots of observations.

    Software History is for the Birds

    Or so it appears, from the way that the vast majority of software people act. We're about to embark on a new project. How did similar projects work out in the past? What are we doing differently? The uniform response to questions like these? Crickets.

    One thing I've realized is that our determined effort to ignore history in software is a completely understandable defense mechanism. Suppose you're starting an hours-long road trip. At the end is near-certain disaster. Would you like to know that at the beginning of the trip, so that every second is miserable, building to a crescendo of terror? Or would you rather blissfully cruise along, and then be blind-sided at the end, leading to a mercifully quick death? Apparently, pretty much everyone agrees that blissful ignorance is the way to go.

    A Wonderful World

    Here's what I think would be a wonderful world:

    1. They both love each other, AND
    2. They know lots of software history together, leading not only to A's in school, but also to great jobs and successful projects.
  • Math and Computer Science vs. Software Development

    In a prior post, I demonstrated the close relationship between math and computer science in academia. Many posts in this blog have delved into the pervasive problems of software development. I suggest that there is a fundamental conflict between the perspectives of math and computer science on the one hand, and the needs of effective, high-quality software development on the other. The more computer science you have, the worse your software is; the more you concentrate on building great software, the more distant you grow from computer science.

    If this is true, it explains a great deal of what we observe in reality. And if true, it defines and/or confirms some clear paths of action in developing software.

    A Math book helped me understand this

    I've always loved math, though math (at least at the higher levels) hasn't always loved me. So I keep poking at it. Recently, I've been going through a truly enjoyable book on math by Alex Bellos.

    [Image: cover of Alex Bellos's book]

    It's well worth reading for many reasons. But this is the passage that shed light on something I've been struggling with literally for decades.

    [Image: the quoted passage from Bellos]

    When we learn to count, we're learning math that's been around for thousands of years. It's the same stuff! Likewise when we learn to add and subtract. And multiply. When we get into geometry, which for most people is in high school, we're catching up to the Greeks of two thousand years ago.

    As Alex says, "Math is the history of math." As he says, kids who are still studying math by the age of 18 have gotten all the way to the 1700's!

    These are not new facts for me. But somehow when he put together the fact that "math does not age" with the observation that in applied science "theories are undergoing continual refinement," it finally clicked for me.

    Computers Evolve faster than anything has ever evolved

    Computers evolve at a rate unlike anything else in human experience, a fact that I've harped on. I keep going back to it because we keep applying methods developed for things that evolve at normal rates (i.e., practically everything else) to software, and are surprised when things don't turn out well. The software methods that highly skilled software engineers use are frequently shockingly out of date, and the methods used for management (like project management) are simply inapplicable. Given this, it's surprising, and a tribute to human persistence and hard work, that software ever works.

    This is what I knew. It's clear, and seems inarguable to me. Even though I'm fully aware that the vast majority of computer professionals simply ignore the observation, it's still inarguable. The old "how fast do you have to run to avoid being eaten by the lion" joke applies to the situation. In the case of software development, all the developers just stroll blithely along, knowing that the lions are going to eat a fair number of them (i.e., their projects are going to fail), and so they concentrate on distracting management from reality, which usually isn't hard.

    What is now clear to me is the role played by math, computer science and the academic establishment in creating and sustaining this awful state of affairs, in which outright failure and crap software is accepted as the way things are. It's not a conspiracy — no one intends to bring about this result, so far as I know. It's just the inevitable consequence of having wrong concepts.

    Computer Science and Software Development

    There are some aspects of software development which are reasonably studied using methods that are math-like. The great Donald Knuth made a career out of this; it's valuable work, and I admire it. Not only do I support the approach when applicable, I take it myself in some cases, for example with Occamality.

    But in general, most of software development is NOT eternal. You do NOT spend your time learning things that were first developed in the 1950's, and then if you're good get all the way up to the 1970's, leaving more advanced software development from the 1980's and on to the really smart people with advanced degrees. It's not like that!

    Yes, there are things that were done in the 1950's that are still done, in principle. We still mostly use "von Neumann architecture" machines. We write code in a language and the machine executes it. There is input and output. No question. It's the stuff "above" that that evolves in order to keep up with the opportunities afforded by Moore's Law, the incredible increase of speed and power.

    In math, the old stuff remains relevant and true. You march through history in your quest to get near the present in math, to work on the unsolved problems and explore unexplored worlds.

    In software development, you get trapped by paradigms and systems that were invented to solve a problem that long since ceased being a problem. You think in terms and with concepts that are obsolete. In order to bring order to the chaos, you import methods that are proven in a variety of other disciplines, but which wreak havoc in software development.

    People from a computer science background tend to have this disease even worse than the average software developer. Their math-computer-science background taught them the "eternal truth" way of thinking about computers, rather than the "forget the past, what is the best thing to do NOW" way of thinking about computers. Guess which group focuses most on getting results? Guess which group would rather do things the "right" way than deliver high quality software quickly, whatever it takes?

    Computer Science vs. Software Development

    The math view of history, which is completely valid and appropriate for math, is that you're always building on the past, standing on the shoulders of giants.

    The software development view of history is that while some general things don't change (pay attention to detail, write clean code, there is code and data, inputs and outputs), many important things do change, and the best results are obtained by figuring out optimal approaches (code, technique, methods) for the current situation.

    When math-CS people pay attention to software, they naturally tend to focus on things that are independent of the details of particular computers. The Turing machine is a great example. It's an abstraction that has helped us understand whether something is "computable." Computability is something that is independent (as it should be) of any one computer. It doesn't change as computers get faster and less expensive. Like the math people, the most prestigious CS people like to "prove" things. Again, Donald Knuth is the poster child. His multi-volume work solidly falls in this tradition, and exemplifies the best that CS brings to software development.
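
    To make the contrast concrete, here is roughly what the CS mind's favorite object looks like when you actually write it down. This is a minimal sketch of a Turing machine, not anything canonical: the tape size, state names and rules are invented for illustration, and this particular machine does nothing more than append a 1 to a unary number.

        /* A toy Turing machine: a tape, a head, a current state, and rules.
         * This machine appends a 1 to a unary number ("111" -> "1111"). */
        #include <stdio.h>
        #include <string.h>

        #define TAPE_LEN 32
        #define BLANK '_'

        enum state { SCAN_RIGHT, WRITE_ONE, HALT };

        int main(void) {
            char tape[TAPE_LEN];
            memset(tape, BLANK, TAPE_LEN);
            memcpy(tape, "111", 3);            /* input: the unary number 3 */

            int head = 0;
            enum state s = SCAN_RIGHT;

            while (s != HALT) {                /* read, consult rules, write/move */
                switch (s) {
                case SCAN_RIGHT:               /* skip right past the existing 1s */
                    if (tape[head] == '1') head++;
                    else s = WRITE_ONE;
                    break;
                case WRITE_ONE:                /* write one more 1, then stop */
                    tape[head] = '1';
                    s = HALT;
                    break;
                default:
                    break;
                }
            }
            printf("%.*s\n", TAPE_LEN, tape);  /* prints 1111 followed by blanks */
            return 0;
        }

    Notice what's missing: no clock speed, no memory prices, no operating system. That is exactly the point of the abstraction, and exactly why this way of thinking says nothing about whether your project ships.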

    The CS mind wants to prove stuff, wants to find things that are deeply and eternally true and teach others to apply them.

    The Software Development mind wants to leverage the CS stuff when it can help, but mostly concentrates on the techniques and methods that have been made possible by recent advances in computer capabilities. By concentrating on the newly-possible approaches, the leading-edge software person can beat everyone else using older tools and methods, delivering better software more quickly at lower cost.

    The CS mind tends to ignore ephemeral details like the cost of memory and how much is easily available, because things like that undergo constant change. If you do something that depends on rapidly shifting ground like that, it will soon be irrelevant. True!

    In contrast, the Software Development mind jumps on the new stuff, caring only that it is becoming widespread, and tries to be among the first to leverage the newly-available power.

    The CS mind sits in an ivory tower among like-minded people like math folks, sometimes reading reports from the frontiers, mostly discarding the information as not changing the fundamentals. The vast majority of Software Development people live in the comfortable cities surrounding the ivory towers doing things pretty much the way they always have ("proven techniques!"). Meanwhile, the advanced Software Development people are out there discovering new continents, gold and silver, and bringing back amazing things that are highly valued at home, though not always at first, and often at odds with establishment practices.

    Qualifications

    Yes, I'm exaggerating the contrast between CS and Software Development. Sometimes developers are crappy because they are clueless about simple concepts taught in CS intro classes. Sometimes great CS people are also great developers, and sometimes CS approaches are hugely helpful in understanding development. I'm guilty of this myself! For example, I think the fact that computers evolve with unprecedented speed is itself an "eternal" (at least for now) fact that needs to be understood and applied. I argue strongly that this fact, when applied, changes the way to optimally build software. In fact, that's the argument I'm making now!

    Nonetheless, the contrast between CS-mind and Development-mind exists. I see it in the tendency to stick to widely used, accepted practices that are no longer optimal, given the advances in computers. I see it in the background of developers' preferences, attitudes and general approaches.

    Conclusion

    The problem in essence is simple:

    Math people learn the history of math, get to the present, and stand on the shoulders of giants to advance it.

    Good software developers master the tools they've been given, but ignore and discard the detritus of the past, and invent software that exploits today's computer capabilities to solve today's problems.

    Most software developers plod ahead, trying to apply their obsolete tools and methods to problems that are new to them, ignoring the new capabilities that are available to them, all the while convinced that they're being good computer science and math wonks, standing on the shoulders of giants as they're supposed to do.

    The truly outstanding people may take computer science and math courses, but when they get into software development, figure out that a whole new approach is needed. They come to the new approach, and find that it works, it's fun, and they can just blow past everyone else using it. Naturally, these folks don't join big software bureaucracies and do what everyone else does. They somehow find like-minded people and kick butt. They take from computer science in the narrow areas (typically algorithms) where it's useful, but then take an approach that is totally different for the majority of their work.

  • Continents and Islands in the World of Computers

    The vast majority of people appear to think that the world of computers and software is pretty uniform. While everyone recognizes that there are differences between the systems used by consumers and the ones used in business, most people assume that pretty much the same thing is going on inside.

    The reality is that vast cultural and practical differences separate the various clusters of computer and software applications. I look forward to the first anthropological studies devoted to this subject, illustrating and spelling out the untraveled oceans that separate the diverse lands of computer and software practice. Meanwhile, there are both obstacles and opportunities that arise from these facts.

    A Diversity of Tongues

    The Bible gives a vivid explanation of how the various languages arose. In the post-flood world, there was said to be a single people with a single tongue. They built a city with a tower that reached to the sky, to make a name for themselves.

    [Image: The Tower of Babel]

    As a single people with a single tongue, "the sky was the limit" for how far they could go. God didn't like this. He reached down and confounded their speech, and scattered them over the face of the earth.

    Whether it's the fault of God or humans, it's well understood that groups of people develop their own languages, which then evolve and splinter. In fact, by studying the relationship of various languages, you gain insight into how humans migrated over the earth. The Indo-European family of languages is an excellent example of this.

    A Diversity of Software Languages

    Early computer programmers in the 1950's clearly saw the advantage of having a single language for software. They tried hard to create a universal software language, FORTRAN being the most well-known and successful such early language. While FORTRAN (short for "formula translator") was great for math people, people working with business records weren't impressed. Thus COBOL ("COmmon Business-Oriented Language") was invented. Things were still pretty simple in the mid-1950's.

    [Image: the programming language landscape, 1957]

    But of course it didn't stop there. People migrated to different "lands," confronted new issues, and created new languages suitable for the new problems. Here's a snapshot of some of the major developments in the mid-1970's.

    [Image: the programming language landscape, 1976]

    By now, literally thousands of general-purpose languages have been created, not counting "esoteric" languages and thousands more for narrow problem domains.

    Idioms and Layers

    As anyone who has learned a new human language as an adult is well aware, learning a language is one thing — but learning all the incidental aspects of the language, particularly the idioms, is a task that never seems to end. This is because there are lots of them — an estimated 25,000 of them in English, for example.

    There are equivalents of idioms in software languages, in multiple categories. I'll just give a basic example: the run-time library, whose documentation frequently exceeds that of the base language, and without which you can't write practical programs. A more complex example: a "framework," for example the Rails and Sinatra frameworks for the Ruby language.

    Let's say you know English pretty well. Are you qualified to write or even to read and understand a legal brief? Unless you're a lawyer, probably not. Yet there's no denying that the brief is written in English, with a whole pile of idioms and other special things that are not in common use.

    It's the same with computer languages. You may know the language C pretty well, but when you first look at the C code in a driver or an operating system kernel, it's got a strange idiom — "straight" C would stand out as obviously as BBC English would in Brooklyn. The C that's in a compiler is even more rife with idioms and unfamiliar constructs, so much so that a person who is otherwise fluent in C would have trouble figuring out what was going on, just as a normal fluent English-speaker would have trouble following a discussion by two doctors of a difficult medical case.
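
    As a concrete taste of what that dialect difference looks like, here is the same trivial task, walking a linked list, written first in "straight" C and then in a kernel-flavored style. The kernel half is a stripped-down imitation of the intrusive-list idiom (the container_of trick) that the Linux kernel uses; the real kernel macros are more elaborate, and the names here are simplified for illustration.

        /* The same task in two C dialects: walk a list and print each value. */
        #include <stdio.h>
        #include <stddef.h>

        /* Straight C: a node owns its data and a next pointer. */
        struct plain_node {
            int value;
            struct plain_node *next;
        };

        /* Kernel-flavored C: a generic link embedded inside the payload. */
        struct list_head {
            struct list_head *next;
        };

        /* Recover the containing struct from a pointer to its embedded link. */
        #define container_of(ptr, type, member) \
            ((type *)((char *)(ptr) - offsetof(type, member)))

        struct task {
            int value;
            struct list_head link;    /* the list lives *inside* the payload */
        };

        int main(void) {
            /* Straight C traversal: obvious to any C programmer. */
            struct plain_node c = { 3, NULL }, b = { 2, &c }, a = { 1, &b };
            for (struct plain_node *p = &a; p; p = p->next)
                printf("plain: %d\n", p->value);

            /* Kernel-style traversal: opaque until you know the idiom. */
            struct task z = { 30, { NULL } }, y = { 20, { &z.link } }, x = { 10, { &y.link } };
            for (struct list_head *p = &x.link; p; p = p->next)
                printf("kernel-ish: %d\n", container_of(p, struct task, link)->value);

            return 0;
        }

    Both halves do the same thing, but a C programmer who has never seen the second idiom will stare at container_of for a while. That's the point about idioms: fluency in the base language is necessary, and nowhere near sufficient.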

    Different Continents, Different Cultures

    Everyone knows that human languages are part of overall human culture. Cultures differ at least as much as languages. It's not widely appreciated that this is also true in the world of computers. While of course there are commonalities, you actually think differently in different languages, whether human or software. Lots of incidental things tend to get wrapped up in the cultures as well. To take a simple example: in the US, when you walk into a store, prices tend to be marked on the goods. If you like that thing at that price, you buy it; otherwise you don't. It doesn't work that way in much of South Korea. Prices may not be marked at all; and negotiation is assumed and expected. There's a whole set of cultural norms that has to be understood in order to thrive.

    Differences in software cultures are just as strong as differences in human cultures. For example, one of the major forces in the world of hospital automation is Epic. Epic is written in a language originally called MUMPS. While few write programs from scratch in MUMPS anymore, you have to use the language to customize the Epic system, just as you have to use ABAP to customize the SAP manufacturing system. In either case, learning the peculiar language is the tip of the iceberg — your programs "live in" the Epic system, and so knowing that system inside out is the key to success, far more important than fluency in the language itself. This is perhaps comparable to the importance of knowing all the relevant prior case law in writing a legal brief in support of your position, vs. simply knowing the vocabulary and syntax of English.

    Contact Between Distant Cultures

    We know that in human life, cultures developed in isolation from each other for many thousands of years. Migrating peoples would clash, and the cultures with superior weapons and warrior traditions tended to decimate the others. A culture that develops superior methods of war, usually with distinct technology, tends to expand. This was true, for example, of the Comanche in North America and the Mongols, each of which developed unique competence and technology with horses, and used that to rapidly expand their spheres of influence.

    This is an area of both similarity and difference with software cultures. Within a company, there are frequently members of different software "tribes," who usually neither understand nor like each other. They are constantly at war to establish primacy, the winning culture grudgingly conceding resource-poor reservations to which the losers are confined.

    It's different with software cultures that are separated by "oceans," usually different problem domains or industries. You might like to think that everyone who does software has all the information available, and therefore is at a similar cultural level, the equivalent of, say, the different countries in Europe. They may speak different languages and like different foods, but they all have cars, telephones, and electrical appliances. In software, this is not the case! In software, there are cultural differences that are the equivalent of cars being widely available in one place, while ox carts are the standard mode of transport in another. It's that extreme.

    What's even more shocking is that the denizens of these culturally isolated software continents are comfortable and secure in what outsiders see as their "backwardness" or "ignorance," and find ways to denigrate and disparage outsiders who dare to suggest there might be a better way of doing things.

    How big are these culturally retarded continents? Just a few distant places, the equivalent of Australia? If you have the opportunity to see a wide swath of software culture, what you find is that the vast majority of software groups have cultures that are dramatically inferior to the best places. Moreover, the variance in just how primitive things are is huge. In any given place, it is likely that practices that are unknown there, but would dramatically enhance results, are already standard practice in other places. Judging software isn't like auditing the financial books of a company, where you either pass or fail; it's more like figuring out which part of which software continent the place is on, how many years or decades behind the best known methods it is, and to what extent its near neighbors are slightly ahead or behind.

    Conclusion

    It's useful to compare human language and culture to software language and culture. Just as with humans, language is an important part of culture, but thriving involves a whole lot more than just grasping the basics of a language. Just like with humans, there are varying levels and there are conflicts. But what's most interesting are the differences, which are the equivalent of humans living in environments that are physically next to each other, but using tools and methods that are hundreds of years different in terms of evolution. This fact has huge implications on many levels.

  • Lessons for Software from the History of Scurvy

    Software is infected by horrible diseases. These awful diseases cause painfully long gestation periods requiring armies of support people, after which deformed, barely-alive products struggle to be useful, live crippled existences, and are finally forgotten. Software that functions reasonably well is surprisingly rare, and even then typically requires extensive support staffs to remain functional.

    Similarly, sailors suffered from the dread disease of scurvy until quite recently in human history. The history of scurvy sheds surprising light on the diseases which plague software. I hope applying the lessons of scurvy will lead to a world of disease-free, healthy software sooner than would otherwise happen.

    Scurvy

    Scurvy is caused by a lack of vitamin C. It's a rotten disease. First you get depressed and weak. Then you pant while walking and your bones hurt. Next your skin goes bad,

    [Image: the skin lesions of scurvy, from the journal of Henry Walsh Mahon]

    your gums rot and your teeth fall out.

    [Image: scorbutic gums]

    You get fevers and convulsions. And then you die. Yuck.

    The Impact of scurvy

    Scurvy has been known since the time of the Egyptians and Greeks. It's been estimated that between 1500 and 1800 it killed two million sailors. For example, in 1520, Magellan lost 208 out of a crew of 230, mainly to scurvy. During the Seven Years' War, the Royal Navy reported that it conscripted 184,899 sailors, of whom 133,708 died, mostly of scurvy. Even in the early 20th century, when most British sailors were scurvy-free, expeditions to the Antarctic were plagued by the disease.

    The Long path to Scurvy prevention and cure

    The cure for scurvy was discovered repeatedly. In 1614, the Surgeon General of the East India Company published a book with a cure. Another book with a cure was published in 1734. Some admirals kept their sailors healthy by providing them daily doses of fresh citrus. In 1747, the Scottish naval surgeon James Lind proved (in the first-ever clinical trial!) that scurvy could be prevented and cured by eating citrus fruit.

    [Image: James Lind]

    Finally, during the Napoleonic Wars, the British Navy implemented the use of fresh lemons and solved the problem. In 1867, the Scot Lachlan Rose invented a method to preserve lime juice without alcohol, and daily doses of the new product were soon standard for sailors, which is how "limey" became synonymous with "sailor."

    Competing Theories and Establishment Resistance

    The effective cures that had been known and used by some people for centuries did not exist in a vacuum. There were competing theories. Rival cures included urine mouthwashes, sulphuric acid and bloodletting. As recently as 100 years ago, the prevailing theory was that scurvy was caused by "tainted" meat. How could this be?

    We've seen this movie before. Over and over again. I told the story of Lister and the discovery of antiseptic surgery — and the massive resistance to the new method by the leading authorities at the time.

    Software Diseases

    This brings us back to software. However esoteric and difficult it may be, software is a human endeavor: people create, change and use software and the devices it powers. Like any human endeavor, some of what happens is because of the subject matter, but a great deal is due to human nature. People are, after all, people, regardless of what they do. Patients were killed for lack of antiseptic surgery — and the surgical establishment fought it tooth and nail. Millions of sailors were killed by scurvy, when a cure had been known, practiced and proved for centuries. Why would we expect any other reaction to cures for software diseases, when the "only" consequence of the diseases is explosive growth in the time, cost and risk to build and maintain software, which is nonetheless crappy and late?

    Is there a general outcry about this dismal software situation? No! Why would anyone expect there would be? Everyone thinks it's just the way software is, just like they thought scurvy in sailors and deaths after surgery were part of life. Government software screws up,

    [Image: HealthCare.gov wait screen]
    software from major corporations is awful,

    [Image: Hertz website failure]

    software from cool new social media companies is inexcusably bad. Examples of bad software can be listed at endless, boring, tedious, like-forever length.

    Toward Healthy Software Development

    If I had spent my life in the normal way (for a software guy), I wouldn't be on this kick. But I didn't, and I am on this most-software-sucks kick. Early on, I had enough exposure to large-group software practices to convince me that I wanted none of it. I'd rather actually get stuff done, thank you very much. Now, having looked at many young software ventures over a couple of decades, I've seen the patterns emerge clearly.

    I have described the main sources of the problems. I have described the key features of disease-free software development. I have explained the main sources of the resistance to a cure, for example in this post. And I have no illusion that things will change any time soon.

    It will sure be nice when the pockets of healthy software excellence that I see start proliferating more quickly, and when an anti-establishment consensus consolidates and gains visibility. In the meantime, there is good news: groups that use healthy, disease-free software methods will have a massive competitive advantage over the rest. It's like ninjas vs. a collection of retired security guards. It's just not fair!

  • Fundamental Concepts of Computing: Speed of Evolution

    Nothing we encounter in our daily lives changes or evolves as quickly as computing. All our habits of thinking are geared towards things that evolve slowly, compared to computing. This is a simple concept. It is disputed by no one. But it has implications that are vast and largely undiscussed and unexplored. It clearly deserves to be a fundamental concept of computing, along with a few corollaries.

    Normal evolution speeds

    Our planet evolves. Most of the changes are slow, with the occasional cataclysm mixed in. Life forms evolve slowly, over tens of thousands of years. Human culture evolves more quickly — too fast for some people, and not nearly fast enough for others. Human capabilities also evolve, but for something like speed to double over a period of decades is astounding.

    Take people running as an example. Here are the records for running a mile over the last 150 years or so.

    [Chart: the men's world record in the mile over time]

    The time has been reduced by roughly 25% over all those years. Impressive for the people involved, and amazingly fast for human change.

    Once you shift to things made by humans, the rate increases, particularly as science and technology have kicked in. We've invented cars, and they've gotten much faster since the early ones. Here is a lady in a race car in 1908. She was called the "fastest lady on earth" for driving at 97 mph in 1906.

    [Image: Dorothy Levitt in a 26hp Napier, Brooklands, 1908]

    In 2013, Danica Patrick won pole position during qualifying rounds at the Daytona Speedway by averaging over 196 mph.

    [Image: Danica Patrick]

    In this case, speed roughly doubled over about a hundred years.

    Computer evolution speeds

    Computers are different. Moore's Law is widely known: power doubles roughly every 18 months. And then doubles again. And again. And again. Every time someone predicts an end to the doubling, someone else figures out a way to keep it going. This is a fact, and it's no secret. It's behind the fact that my cell phone has vastly more computational power and storage than the room-sized computer I learned on in 1966.

    This chart should blow anyone's mind, even if you've seen it before. It shows processor transistor count (which roughly correlates with power) increasing by a factor of 10, then another, then another, and so on. In sum, it shows that power has increased about one million times over the last forty years.

    [Chart: processor transistor counts over time]

    It would be mind-blowing enough that we have something in our lives that increases in speed at such an incomprehensible rate. But that's not all! Everything about computers has also gotten less expensive! For example, the following chart shows how DRAM storage prices have gone down over the last twenty years, from over $50,000 per GB to around $10. In other words, to about 2% of 1% of the price twenty years ago.

    [Chart: DRAM price per GB over the last twenty years]

    That's faster and cheaper in a nutshell.
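
    If you'd rather check the arithmetic than take my word for it, it's a few lines of code. One sanity check worth noting: an 18-month doubling period over forty years gives a factor closer to a hundred million; the chart's million-fold figure corresponds to a doubling period of about two years, which is the other commonly quoted version of Moore's Law. The DRAM figures below are the ones from the chart above.

        /* Back-of-envelope checks on the growth figures quoted above. */
        #include <stdio.h>
        #include <math.h>

        int main(void) {
            /* Growth over 40 years at an 18-month (1.5-year) doubling period. */
            printf("18-month doubling, 40 years: factor of about %.0e\n",
                   pow(2.0, 40.0 / 1.5));

            /* What doubling period yields a million-fold increase in 40 years? */
            printf("million-fold in 40 years implies doubling every %.1f years\n",
                   40.0 / (log(1e6) / log(2.0)));

            /* DRAM: over $50,000/GB then, around $10/GB now. */
            double ratio = 10.0 / 50000.0;              /* = 0.0002 */
            printf("DRAM now costs %.1f%% of 1%% of the old price\n",
                   ratio / 0.0001);                     /* 0.0001 = 1% of 1% */
            return 0;
        }

    Either way you slice it, the point stands: nothing else in our experience gets a hundred-fold better, let alone a million-fold better, within a single working lifetime.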

    So what? Everyone knows that things change quickly with computers, what's the big deal? It's just the way things are!

    Here's what: this simple fact has profound implications.

    Everything about human beings is geared for people and things that evolve at "normal" speed. Our patterns of thought, the things we do, most of our behaviors were developed for "normal-speed" evolvers, i.e., everything but computers (EBC). A surprising number of them break, are wrong or yield crappy results when applied to computers. There is no reason at all to be surprised by this; in fact, anything else would be surprising. What is surprising and interesting is that the implications are rarely discussed.

    Computers evolve more quickly: it matters!

    When you apply patterns of thought and behavior that may be appropriate for EBC to fast-evolving computers, those thoughts and behaviors typically fail miserably. This "impedance mismatch" explains failure patterns that persist for decades. Of course everyone knows that computers are different from other things — just as they know that Indian food is different from Chinese food. What they tend not to know is that computers are different in a different way (because of the speed of evolution) from EBC.

    Here are a few examples.

    The mainstream "wisdom" for software project management is essentially the same, with minor modifications, as managing anything else, from building a house to a new brand of toothpaste. It's not! That's one of the many reasons why the larger organizations that depend on those techniques fail to build software effectively.

    We treat software programming skills like any other kind of specialized knowledge, like labor law. They're not! Thinking that they are is one of the many reasons why great software people are 10 times better than really good ones, who are themselves 10 times better than average ones.

    The normal ways people go about hiring software engineers is crap. They think that hiring folks who deal with a subject matter that changes so dramatically is the same as hiring anyone else. It's not! They also think that extended, in-depth experience with a particular set of technologies is really valuable. It's not! It's actually a detriment!

    Software engineers tend to learn to program in a certain way, using a given set of tools, techniques and thought patterns. Those tools were designed to solve the set of problems that existed with computers at a certain point in their evolution. But computers evolved from that point! Quickly! The programmers are doing the equivalent of hunting for rabbits with weapons designed for hunting mastodons, blissfully unaware of what's appropriate for computers the way they are today!

    Software is hard and isn't getting easier

    Software isn't a problem that gets solved — oh, now we know how to do it, finally. It doesn't get solved because the underlying reality (computers) evolves more quickly than anything else. The examples I mentioned are things that "everyone" is sure must apply to software. How could they not? The fact that they yield consistently horrible results seems not to break the widespread faith in the mainstream approach to software. 

    These and many other broadly accepted falsehoods explain why so many things about software are broken and (worse) don't seem to get fixed, decade after decade, when superior methods have been proven in practical application. Why is everyone so resistant to change? Is everyone stupid?

    Aw shucks, I admit it: "everyone is stupid" was my working hypothesis for explaining things like this. But I now have a more satisfying hypothesis: everyone is so used to dealing with "normal-speed" things, EBCs, that they just can't help applying to computers the methods, patterns of thought and behaviors that work reasonably well in most of their lives. Since nearly everyone gets the same lousy results, the conclusion everyone draws is that there's something about computers that's just miserable.

    Conclusion

    There is nothing comparable to computers in the rest of our human experience. Nothing evolves at anywhere close to the speed of computers, getting more powerful while getting cheaper at hard-to-comprehend rates. We apply methods and patterns of thought that work well with practically everything, and those methods fail when applied to computers. But they fail for everyone! The conclusion everyone draws from this is that computers are just nasty things, best to stay away from them and avoid blame. It's the wrong conclusion.

    Computers are understandable. The typical failures are completely explained by the mis-matched methods we bring to them, like trying to catch butterflies with a lasso. When people use methods that are adapted to the unprecedented evolutionary speed of computers, things go well.

  • The Big Data Technology Fashion

    Where there are people, there are fashions. Why should technology be immune? The current fashion of "big data" is a classic exemplar of the species.

    The Books

    Books are a good place to observe the common themes of technology fashions. You'll see patterns that resemble the ones I previously pointed out for project management.

    I think it's fair to say it's not a legitimate technology trend if it's not covered in an "X for Dummies" book.


    [Image: a "Big Data for Dummies" book cover]

    Similarly, it's got to be big. Be Revolutionary. Transform lots of stuff.


    [Image: a book cover promising a big data revolution]

    It's got to be a big, scary thing that needs taming.


    [Image: a book cover about taming big data]

    For any fashion trend, it's important to make sure that other things are hitched to its wagon.


    [Image: a book cover pairing big data with analytics]

    Let's not forget that, if it's worth paying attention to, there's got to be a way to make money from it.


    [Image: a book cover on making money from big data]

    It's never too soon to start adding layers of process and paranoia to it, to ensure that costs skyrocket and that hardly anything ever gets done; in other words, governance.


    [Image: a book cover on big data governance]

    Finally, anything, but anything, has to have a human side.


    [Image: a book cover on the human side of big data]

    I swear, I sometimes think there's a central planning committee for technology fashions. They plan when the next new label on something old and not all that interesting is going to come out, grab their standard set of titles, and pass them out to people to write the books.

    But then, I guess it can't really be that organized, because there are usually so very many books, each of them covering the same small set of themes over and over and over, with slightly different language. The themes always seem to include:

    • X is revolutionary; it will change lots of important stuff.
    • X is big and scary, and you need help to tame it or bring it under control
    • There are lots of ways to screw up doing X, so you need to pay lots of money for Y to get it right
    • You're a Dummy, but I'll help you understand what you need to know about X anyway.
    • X has a human side

    The Conferences

    Things aren't that different with conferences. They take the themes established in the books and embellish them a bit.

    There are conferences for people who work in particular sectors.

    [Image: a big data conference for the public sector]

    You can't pass up an opportunity to learn from the very best.

    [Image: a conference ad promising lessons from the very best]

    Who can resist going to a conference which cuts through all the crap and helps you do stuff?

    [Image: a conference ad promising practical how-to]

    Anyway, you get the idea — there are lots of conferences. The themes are predictable, even without the aid of big data or predictive analytics. Because they apply to any technology fashion trend.

    Conclusion

    Technology fashions — they are forever in fashion!

  • What Can Software Learn from Steamboats and Antiseptic Surgery?

    Software is among the most advanced, rapidly changing fields of technology. Only the "kids" who grew up with the latest techniques seem to be able to master them. At the same time, really bad ideas spread through software groups like the plague; they take hold and resist cure, in spite of producing terrible results. How can we make sense out of a field that advances rapidly and resists change at the same time?

    History

    As I've pointed out, software people are strongly averse to learning about computer history. In some fields (e.g. physics), the very terms used are named after historical figures; in others, history is treated with reverence (e.g., Santayana: "Those who cannot remember the past are condemned to repeat it."); in software, by contrast, we use the phrase "that's history" to dismiss anything that happened in the past as obviously irrelevant to the present.

    I think studying history is the only way to understand the present, software included. I think we can understand the strange software phenomenon of rapid change combined with resistance to change by taking two examples from history: one in which new methods in technology were rapidly accepted by all concerned parties, and the other in which clearly superior new methods were resisted for many long years by the leading people in the field.

    Steamboats

    It would be great if software advances were adopted quickly, like the way steam technology rapidly overcame wind as a method for moving boats.

    The displacement of wind by steam is clearly laid out in T.J. Stiles's excellent biography of one of the major figures in the transformation, Cornelius Vanderbilt.

    [Image: daguerreotype of Cornelius Vanderbilt]

    Vanderbilt started in business by running a sailing-boat "taxi" service from Staten Island to Manhattan. He transitioned into the rapidly emerging steamboat transportation business, not only as a captain and owner, but (surprisingly to me) as an engineer.

    The public took to the new steamboats quickly. The reason is clear: speed. There was, at the time, no quicker way to get from point A to point B if there were a water route between them. The speed of the boat was immediately obvious to the simple observer, and easy to verify by noting departure and arrival times. To prove whose boat was the fastest, there were races.

    [Image: the steamboat Cornelius Vanderbilt]

    Vanderbilt's steamboats were judged by a clear standard: whose was the fastest? The criteria were easy to measure.

    Antiseptic Surgery

    The benefits of antiseptic surgery, as introduced by Joseph Lister, were clear: instead of a large number of patients dying of infection after surgery, they would live. Ego clearly played a role in resisting the adoption of the new method. But, to be fair, there is another important factor.

    What made surgery different from steamboats? They were both major technical advances. They both involved major changes in what you did and how you did it — more so with boats than with surgery! So why did steam catch on quickly, even though it required whole new boats of radically different design and operation, while the antiseptic method was resisted for decades, even though it was subsidiary to the surgery itself, which was left largely unchanged?

    Boats and Surgery

    The fact that steamboats were faster than sailboats was easy and unambiguous to measure, while the surgery outcomes were difficult and ambiguous to measure.

    The time of each boat trip is easy to measure. It's just a duration. When you watch two boats, anyone can see which one moves more quickly. By contrast, every surgery is different. The patient is different, the trouble being fixed is different, and the ultimate outcome may not be determined for weeks. Many surgery patients continued to die with antiseptic methods because antiseptics weren't the only factor influencing the outcome. Furthermore, excellent surgeons who were dirty could save patients who would have been killed by crappy surgeons who happened to use antiseptic methods, since after all not every patient got infected.

    In retrospect, it's completely maddening that surgeons failed to be swayed by the arguments and evidence in favor of Lister's carbolic acid methods, and ego certainly played a role. But the case of the rapid acceptance of the more radical change to steam in boats makes it clear that something more than ego is at work here. Simply put, the question is: how comparable and measurable are the outcomes of the new technology? With steamboats, you can tell the difference in seconds with the naked eye, and verify it with a stopwatch. No arguments. With surgery, the cases are not clearly and unambiguously comparable, statistics are needed, and there is major variability. There is room for arguments.
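
    To see how much room for argument the variability creates, here's a little simulation. The survival rates and sample sizes are invented for illustration; the point is only that with small, noisy samples, a worse method can look as good as a better one in any given batch, which is why statistics (and Lind-style controlled trials) were needed at all.

        /* Toy illustration: race times are deterministic and obvious,
         * surgery outcomes are noisy. All the numbers are invented. */
        #include <stdio.h>
        #include <stdlib.h>

        /* Count survivors among n patients, each surviving with probability p. */
        static int survivors(int n, double p) {
            int alive = 0;
            for (int i = 0; i < n; i++)
                if ((double)rand() / RAND_MAX < p) alive++;
            return alive;
        }

        int main(void) {
            srand(42);
            /* Steamboats: the faster boat wins every race, visibly. */
            printf("race: steam 6.0h vs sail 9.5h -> steam wins, every time\n");

            /* Surgery: 20 patients per surgeon per trial, invented rates. */
            for (int trial = 0; trial < 5; trial++)
                printf("trial %d: antiseptic %2d/20 survive, dirty %2d/20 survive\n",
                       trial, survivors(20, 0.80), survivors(20, 0.55));
            return 0;
        }

    Run it a few times: every so often a "dirty" batch looks as good as an "antiseptic" one. A stopwatch never has that problem.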

    Software, steamboats and antiseptic surgery

    Is any given advance in software like moving from sailboats to steamboats, or is it more like adding antiseptic methods to surgery?

    That's easy: unlike straightforward competitions like races, every software project is different. In a race, the competitors take off from a starting line at the same time, and whichever crosses the finish line first is the winner. Simple! But in the real world of software, every project is different; you can always point to differences in requirements, conditions, deployment, or other things to explain why this project took more time and resources than that project. It sounds like software is kind of like surgery!

    Conclusion

    It is my personal experience and judgment that ego can play a significant role in explaining why many software groups stay mired in the same old methods, getting the same lousy results, year after year. But I think that if software projects were as comparable as transportation schedules, the evidence would simply force more rapid change, like it or not, on intransigent software groups. Because of how genuinely challenging it is to compare software projects to each other, it is at least understandable that only the most enterprising and eager-to-be-the-best software groups seek out and adopt the very best methods.

  • Computer History

    In software, history is ignored and its lessons spurned. What little software history we are taught is often simply wrong. Everyone who writes or uses software pays for this, and pays big.

    But we know about history in software — there's Babbage, the ENIAC, etc.

    Yes, we've all heard about various people who are said to have invented modern computing. A shocking amount of what we are taught is WRONG.

    Babbage is a case in point. People just love to go on and on about him. There are problems, though. I'll just mention a couple.

    [Image: Charles Babbage, 1860]

    One problem is that his machines simply didn't work, even after decades of work and huge amounts of skilled help and money. He must have known they wouldn't; although he was personally wealthy, it was other people's money he spent on his famous dalliance.

    Another problem is that his best idea wasn't his. The idea of using punched cards

    [Image: Jacquard loom punched cards]
    to contain the program was invented in France and was a key aspect of the Jacquard Loom — a machine that pre-dated all his work, and a machine that actually worked and was in widespread use.

    The ENIAC is another good example of what appears to be the typical pattern in computing: someone invents a good thing and makes it work, and then someone else steals it, takes credit for it and tries to cover up the theft, often without delivering results as good as the original.

    [Image: the ENIAC]

    If you only read the standard literature, you would still be convinced that the ENIAC and its inventors were giants of the field. Once you read everything, you discover that reality is more interesting. It turns out that the inventors of the ENIAC were "inspired" by prior inventions, much like Babbage and the Jacquard Loom. In this case, the inspiration was the Atanasoff-Berry Computer.

    [Image: drawing of the Atanasoff-Berry Computer]
    Here is an excerpt from the ruling in the patent dispute that settled the issue:

    Judge Larson had ruled that John Vincent Atanasoff and Clifford Berry had constructed the first electronic digital computer at Iowa State College in the 1939-1942 period. He had also ruled that John Mauchly and J. Presper Eckert, who had for more than twenty-five years been feted, trumpeted, and honored as the co-inventors of the first electronic digital computer, were not entitled to the patent upon which that honor was based. Furthermore, Judge Larson had ruled that Mauchly had pirated Atanasoff's ideas, and for more than thirty years had palmed those ideas off on the world as the product of his own genius.

    Other fields don't need history — why should software?

    Not true. Other fields are saturated with history.

    Politicians study history in general and the last election in particular. Fiction writers frequently read fiction, current and historic. Generals study old battles for their lessons; even today at West Point, they read about the Civil War. Learning physics is like going through the history of physics, from Galileo and Newton through Planck and Einstein to the present. Even the terms used in physics remind you of its history: hertz, joules and Brownian motion.

    Software, by contrast, is almost completely ahistorical. Not only are most people involved uninterested in what happened ten years ago, even the last project is unworthy of consideration – it’s “history.”

    Consequences of the lack of history

    War colleges study past wars for the highly pragmatic purpose of finding out how they were won or lost. What was it the winner did right? Was it better weapons? Better strategy? Better people? Some combination? And how exactly did the loser manage to lose? Was it a foregone conclusion, or was defeat snatched from the jaws of victory? People who conduct wars are serious about their history — they want to win!!

    In software, no one is interested in history. Everyone thinks they know the "right" way to build software, and thinks that the only possible source of loss is failing to do things the "right" way — the requirements weren't clear; the requirements were changed; I wasn't given enough time to do a proper design; there was no proper unit testing; the lab for testing was insufficiently realistic. The list of complaints and excuses is endless, and their net effect is always the same: crappy software and whining: I need more people, more time and more money. Because studying history is so rare, few are exposed to the software "wars" that are fought and won by teams that didn't follow their rules.

    There is only one conclusion to be drawn: software people would rather lose with lots of excuses than win by doing things the "wrong" way. Ignoring history is a great way to stay in this comfortable cocoon.

    When software history becomes as important a part of computer science education as physics history is of physics, we'll know it's approaching credibility. Until then, everything about computer science, education and practice will continue to be a cruel joke.

  • The Name Game of “Moving to the Cloud”

    The most recent in a long string of technology fashion trends, "the cloud" is hot. Like its hot technology fashion predecessors, it mostly consists of old ideas with a little spicy sauce on top and fresh packaging. If you mindlessly follow the fashion and just "go to the cloud," you are likely to end up in the same unhappy place where most mindless followers of fashion trends end up. What is "the cloud?" Simple. Clouds are belated enterprise IT implementations of the consumer internet.

    What is the Cloud?
    Is the "cloud" a new development? Well, it is a new name…

    Ancient Clouds
    I first encountered the cloud more than 40 years ago. Before that fateful meeting, my only experience with computers had been up close and personal. You had to get in the room, push buttons, flick switches, feed card decks or punched paper tape, listen to whirring sounds and watch blinking lights. Like with this computer:
    [Image: an IBM System/360 Model 50]

    But The Cloud! Ahhh, the Cloud…I remember it vividly.

    Of course, things were a bit different then. We only had "fat clients," as in "takes two guys to lug it" fat. The principle was identical, however: I was in one place, with a "fat client" like this:
    [Image: a Teletype with paper-tape punch and reader]

    …and in another place was a computer like this:
    [Image: a DEC PDP-10]

    …and everything worked.
    Remember the importance of labeling, however: what we did 40 years ago wasn't "cloud computing," which hadn't been "invented" yet — it was merely "telecommunications."

    Creating the Modern Cloud
    Between then and now, lots of iterations of Moore's Law have come and gone. All the hardware has gotten smaller, cheaper and faster, while the software has gotten larger, more expensive and slower — but, fortunately for all of us, the rate of hardware evolution is greater than the rate of software devolution, giving the impression of net progress.

    Where this leaves us, 40 years later, is faster remote computers talking with lighter remote clients over incredibly faster networks, all at lower cost. Sprinkle with a little extra software and drizzle with some marketing hoo-haw, and — kazaam! — you've got today's hottest technology fashion trend, Cloud Computing.

    Clouds aren't always friendly
    When we think of clouds, we're likely to think of this kind of cloud:
    [Image: cumulus clouds in fair weather]

    friendly, fluffy shapes floating in an otherwise sunny sky. When people think about cloud computing, this is the kind of cloud they seem to have in mind. But as we know, there are other kinds of clouds. There are dark, oppressive clouds that make everyone depressed. And there are really mean clouds that wreck things horribly, creating the situations for which "disaster recovery plans" are made. Is this just a metaphor? Of course not. But just like financial fraud, the big, juicy examples are usually hushed up in order to protect the guilty.

    OK, So What is "the Cloud"?

    There is a little-discussed trend that is deeply embarrassing to IT professionals: there is a wide and growing gap between the use of computing technology in the consumer world and in the corporate, data center world. When the average user of corporate data systems is home, he works in a very advanced computing environment. His local machines and devices are amazingly capable and pretty easy to use. When connected to the internet, he can access a nearly limitless world of cloud computing resources — which are themselves largely run out of data centers that are remote from the people who set up and administer the software in them, and which contain an ever-evolving mix of dedicated and shared resources and services. The consumer internet has been based on a cloud computing model for a long time.

    The corporate world is a whole different thing. It has been consumed with consolidating its diverse data centers. Corporate IT groups are finally beginning to confront the extreme flexibility and ease of use that consumers enjoy every day, and are finding it increasingly difficult to explain why the computing they run at such high capital and operating cost is so cumbersome, error-prone and inflexible.

    In this context, there is no way that anyone associated with corporate computing is ever going to plainly admit that what they are basically doing is trying to catch up with the consumer internet. So they must be doing something else. Oh, yeah — they're evolving to the latest, smartest trend in corporate computing, adopting the latest technologies and being really leading-edge: they're "moving to the cloud," but of course in a "smart" way, with large doses of "private cloud" technology along the way.

    Summary
    What's a "private cloud?" A corporate data center with a fancier name.

    What's a corporation "moving to the cloud?" A corporate IT group trying to play catch-up with the consumer internet, and desperately trying to make it look like something else.

    What's new about "cloud computing?" Very little; mostly naming and marketing fluff.

    Is anything real happening when a corporation "moves to the cloud?" Sometimes yes! Sometimes, they really are copying a couple proven techniques of the consumer internet, slowly and at great cost and trouble, but nonetheless creeping towards a 21st century computing model.

  • Why Computer Software is So Bad

    There are aspects of the theory and practice of computer software that drive me nuts.

    I look at a widely accepted theory or practice, and am aghast that the cancer spread and became part of the mainstream. I look at a genuinely good practice and am shocked at the way everyone lies and slithers to make it sound to the unsuspecting that whatever lousy thing they do is a shining example of a good thing. I look at a nearly-universal approach to problem solving that may have made sense many Moore's-Law generations ago, but has long since been rendered irrelevant by advances in hardware. And I look at proven, understandable techniques that improve productivity by whole-number factors that are spurned and/or ignored by most people in the field.

    How significant is this stuff I'm complaining about? A minimum of a factor of 10. Sometimes a great deal more. So it matters! This is theoretical, but it has real-world, practical implications. It's the only way, for example, that small companies can beat large, established ones that have software staffs 10 to 1,000 times larger.

    I think I understand some of the causes of this deplorable state of affairs. But that doesn't make me feel better. And it definitely doesn't cure the sick or empower the unproductive.

    A number of my private-distribution papers go into these subjects in considerable depth. But I thought it might be interesting to summarize a couple of the main observations here.

    Among the causes of bad software are:

    The blind and deaf leading the blind. In the majority of cases, the people in charge have little to no personal experience of creating software, and no interest in how it's done. This is about as sensible as having someone in charge of a baseball team who not only can't play baseball, but can't see onto the field where it's being played; all he can do is see the score at the end and hear reports from the players about how things are going. So everything turns into politics — convincing the blind, clueless boss that you're the great contributor, everyone else is a chump. The larger the organization, the more likely it is to be led by such a genius.

    Process over substance. The larger and older the organization, the more likely it is to elevate the importance of process, and the more elaborate and all-consuming that process is likely to be. The more process there is, the less code gets written, and the productivity of the innocent few who actually want to work gets ground down to the abysmal norm.

    Lawyers. Shoot the lawyers! Shoot them! They and their legal ways are a plague. When lawyers see a problem, they want to write a law or create a regulation that makes the problem go away. Except that instead of saying this in simple, results-oriented terms (e.g. "programmers should not be allowed to die unnecessarily while writing code"), they say it in terms of incredibly elaborate, micro-managing, this-is-how-thou-shalt-do-it regulations (e.g., "programmers should be offered a nutritional meal no later than 12 noon local time each day consisting of no less than…" and 538 similar instructions, constantly growing). If regulations of this kind actually led to good results, it would be one thing. All I should have to say at this point is "credit card theft" and PCI.

    Fashion over function. Programmers are supposed to be nerds, aren't they? How can programmers let their decisions be made by "fashion," whatever that's supposed to be in the world of software? I refer you to the first point, about software teams being led by people who are clueless. These people want to seem smart in the eyes of their bosses and peers, who know even less than they do. So they make decisions that are guided by C-office fashion trends, which are usually laughably out of step with true optimal decisions.

    A preference for bad new ideas over good older ones. How do bad ideas catch on? Some company promotes them. Books are written. Appealing rhetoric is created and repeated. Suddenly a fashion trend is born. Someone can appear to be smart and modern just by advocating the cool new stuff and sounding smart, without actually knowing anything or doing anything. Everyone involved thinks they're helping advance the state of software, when in fact they're digging the hole of despair deeper. It's remarkably like when the dumbest guy in Scotland moves to England, thereby increasing the average intelligence of both places.

    A preference for bad old ideas over good newer ones. There aren't many good new ideas; mostly there are good old ideas that are still good. But because conditions change (like processors getting faster and memory getting cheaper), ideas emerge to respond intelligently to the new state of affairs (like this one), instead of blindly and stupidly continuing on as though nothing had changed, the way most people do.

    Wrong scope; usually too narrow. This is classic optimization over too narrow a range. Much bad software results from compartmentalism, or simply from failing to look at things through the eyes of the ultimate consumer. This is an incredibly common mistake. The trouble gets really bad when the practices that result from the little "silos of excellence" get elevated into industry "best practices." (Whenever I hear the phrase "best practice," it's really hard for me to avoid getting a serious look and pronouncing in deep tones "you folks had best practice a bit longer before you inflict yourselves on the world of paid, professional programming.")

    I'm sure there are many more causes of bad software than the ones I've listed here — we haven't even gotten into bad programmers in their innumerable incarnations! But every item on the list above thoroughly deserves inclusion, all the more because most of them rarely appear on anyone else's list of causes of software malaise.

  • Databases and Applications

    When databases were invented, they solved a huge problem that couldn't be solved any other way. Anyone who cares to look can see that the original problem that caused us computer programmers to invent databases has largely gone away. So why is it exactly that application programmers reflexively put their data in a database? In a surprisingly wide range of cases, it sure isn't because of necessity. Could it, perhaps, perhaps, be nothing but habit and the little-discussed fact that change happens in software at roughly the same rate that change happens in glaciers?

    From the beginning of (computer) time, instructions have needed to be in memory to be executed, and data has needed to be in memory to be operated on by instructions. The memory in which instructions execute and fiddle with data has always been way faster and way more expensive than the large, slow but cheap places they are put when they're not in memory (call it storage, whether the storage is punch cards or tape or disk). It was this way at the beginning of time and it's true now.

    Think of memory as your work table. Eons ago, your work table was really tiny, like this:

    Tiny table

    You can hardly fit anything at all on it! So you'd better have a really big storage place to keep all your stuff, like a pantry:

    Pantry

    OK, that's cool. You've got all your stuff in storage, but you can only work on it when it's in memory (on the table). What do you need? You need to get the stuff you want to work on now from the pantry, and you need to put the stuff you're done with back in the pantry. In other words (if you're in the world of computers)…you need a database!

    The very most basic function of a database is pretty simple: its job is to shuffle your data between memory and disk. It's also nice if it keeps everything straight, avoids dropping bits on the floor, and cleans everything up when something goes wrong during the shuffling.

    That was then. But things have changed. Remember Moore's Law? The amount of memory available to us at surprisingly reasonable prices has grown hugely. Exponentially. Our work tables now look more like this:

    Giant table

    And our pantries? Well, they've grown a bit too:

    Giant warehouse

    So how much data do you have? Run the numbers. It goes without saying that it's going to fit in storage. But how about that work table? Here's the question you have to think about:

    Will all your data fit on the work table (i.e., in memory)?

    With 64GB and even 256GB of memory available at reasonable prices, the answer is often YES! It will!

    Hmmmm. What was it the database does? If all my data fits in memory, why was it I needed that database???

    I know, I know. Databases can be nice for reporting and data analysis and "persistence" and a few other things. I'm not saying you never use them. But for your real application, the one that takes user requests and responds to them, if you don't need to have the database shuttling stuff between the work table and the pantry… Hmmmm.

  • The Xiotech ISE and Technology Fashion

    We all know what fashion is. Think of Vogue Magazine, or impossibly tall, thin women walking in that special way down the elevated runway, wearing something no normal person would be able to wear, or would want to wear if they could.

    SAIC_Fashion_Show_2008

    But fashion extends way beyond women's clothing. Let's start with men's clothing: how many guys wear suits and ties to the office today? Then cars — how many modern cars have those giant fins that were popular in the '60s? The kind of popular music you like dates you at least as much as wrinkles on your skin. The more you think about it, the more you realize how pervasive fashion is.

    Fins_close_up

    "Technology is a counter-example," perhaps you say. "It's bits and bytes and silicon, no fashion there!" Well, that's true, except that it's people who buy the technology, and people are fashion-driven creatures. Let's face it: the cool kids who once drove sporty cars now pull out their iPhones at the slightest excuse. Waiting on line to see the Beatles; waiting on line to get the latest iPhone — what's the difference? They're both fashion-driven fads.

    Iphone3g_line_2

    "I concede that consumer technology is fashion-driven," perhaps you admit. "But hard-core computing technology, where nerds are building things for nerds; how can that possibly be driven by fashion?" I fully concur that no nerd techie would ever admit that his choices, selections and designs are driven by fashion, not even to himself. But all too often, that's exactly what's happening. The techie nerd who comes up with a design approach for solving a problem almost always prides himself on originality and foresight, without any awareness of how fashion-determined his most important decisions are. These decisions are often not made consciously; they are assumptions. "It's not worth discussing, of course we'll take approach X," the techie would respond in the unlikely event that the assumption is questioned — by some "ignorant" (which is tech-talk for "unfashionable") person. Just to be clear: we're not talking about how nerds dress; we're talking about how nerds think.

    Gisele_nerd

    There are examples in every field of computing technology. The Java/J2EE fad during and after the internet bubble is an obvious example, and before it client/server computing was a huge fad.

    There is a clear example in storage technology today. The fashion is as clear and obvious as short skirts, and moreover is explicitly stated by its adherents: storage functionality should be provided as a body of software, independent of any hardware embodiment, and without regard to any particular storage hardware. Companies that previously sold storage hardware no longer have real hardware design functions — all they do is bundle their software with hardware provided by others and sell the combination. The most popular form of this approach is to buy drive bays from an OEM and connect them to controllers built from specially configured off-the-shelf processors; 3Par and many others do this, for example. IBM's XIV implements a variation on this theme, using all IBM commodity server hardware. While there are still loads of dollars being spent on old-style, hardware-centric storage systems (think EMC), engineers building new storage systems are uniformly following the software-centric fashion.

    In this sense, the Xiotech ISE is decidedly unfashionable. The ISE was invented at Seagate, in response to CEO Steve Luczo's desire to create a storage product with higher value than spinning magnetic disk drives. The idea was simple: build a fixed-format super-disk, with many Seagate drives, intelligence, etc. It would be bigger than a disk, but smaller than a SAN. It would emphasize basic storage functions (write, protect, read) and leave the "high level" storage functions to the SAN vendors.

    What is interesting is that Steve Sicola and a group of other storage industry veterans ended up working side-by-side with Seagate engineers, something they never would have done at a SAN vendor. Sicola and his team knew the evolving fashions in storage quite well: ignore the details of the drives, that's "just storage." Build fancy high-level functions.

    But since they were stuck with the drive engineers, they did something unusual: they actually listened to them! They learned about the amazing functions the engineers embedded in the drives that all the SAN vendors ignore. They learned how annoyed the Seagate engineers were at all the drives marked "bad" by SANs, the vast majority of which are actually good; they learned about error codes and performance details that all the other storage engineers in the industry were studiously ignoring.

    Before long, they got absorbed in what you could really do once you really knew the hardware. And, being good nerds, they invented a bunch of stuff, like how to virtualize over a fixed number of heads so that top performance is maintained even when the disks fill up. They also invented a bunch of stuff that provides major, persistent advantages as new drives with higher capacities come out.

    Since I know a fair amount about Xiotech's ISE, I want to go on and on about it. But I won't, because the point of this post is technology fashion. The purpose of bringing up the ISE is that it's a great illustration of the power of fads and fashion in technology. Any normal group of self-respecting storage nerds would have built a completely hardware-independent storage system. As such, it may have had nice features, but it would be pretty much like all the others in terms of its basic functions of reading and writing disks. But because these storage engineers were sequestered with hardware types and had a unique mission imposed from above, they did something very rare: they built a leading-edge storage system that is decidedly unfashionable. Because the engineers actually paid attention to the hardware, the ISE does things (performance, reliability, density, scalability, energy use, etc.) that no other storage system on the market today does, even though it uses the same disks available to others.

    Fashion is, of course, a relative term. Fashion is one thing at diplomatic receptions, and quite another hiking in the wilderness or in a war zone. What is appropriate for one doesn't work for the other. Shoes that are appropriate for a salon can cripple you in the woods.

    Well, it turns out that the modern storage fashion of ignoring the storage hardware may be acceptable in salon-type environments (where appearance and style are important but there's no heavy lifting to be done), but is as crippling as high heels in the I/O-intensive environments that are increasingly found in virtualized, cloud data centers. The ISE is storage fashion for the war zones of data, for data-intensive applications like virtualized servers, where the applications are concentrated in a small number of servers, all fighting to get their data. Most storage systems know how to hold their tea cups, conduct refined discussions, and do the other things that matter in environments where getting your data sometime today would be nice, thanks.

    A-Tea-Party

    But when you've got a crowd of rowdy, tense applications all of whom are demanding their data NOW, perhaps more of a war-time style is appropriate; that's what the "unfashionable" nerds at Xiotech created in the ISE.

    An-Angry-Crowd-Giclee-Print-C12371290.jpeg

  • Paleolithic Mainframes Discovered Alive in Data Center!

    A recent article on forbes.com quotes me on what many people find to be the surprising longevity of mainframe computers.

    Don’t things in computing just get better and better – not to mention faster, smaller and less expensive? Which implies that after a few years of use, it’s just not worth keeping the old stuff around anymore? So we throw out (oops, please excuse me, we meticulously recycle…) the useless old stuff and bring in the cool, cost-effective new stuff, right?

    Like most common wisdom in modern computing, this contains elements of truth, but isn’t quite right.

    The element of truth behind this thought is the astounding continued progress of Moore’s Law, which posits that electronics gets smaller and faster at a rate that boggles the mind. This is what gives us iPhones and portable computers that have more speed and capacity than the room-sized mainframes of the past.

    But there is more to computing than electronics. First of all, there is this little matter of physical devices that have mass and inertia, that no amount of Moore’s-Law-driven advance will free from the tyranny of the laws of physics. This leads to growing storage problems that Moore’s Law actually makes worse. See here for a description of the issue.

    Second of all, there is this thing called software. Yes, software, the invisible-to-the-human-eye “stuff” that makes all that amazing electronics actually do something. Software is really hard, complicated stuff, like most things that are essentially mental, conceptual and invisible (think math). Once some software actually gets working well enough, sensible people are loath to change it. Even worse, the amazing increases in speed and capacity of electronics mask simply awful problems in software.

    Building most real, practical production software tends to be a nightmare that rarely ends. Re-building software that more-or-less works is a nightmare that visits all the circles of hell in round-robin. So if the credit card companies can process their transactions, and the software that gets the job done happens to be written in totally-out-of-fashion-squared COBOL that runs best on a mainframe – that's a great reason for IBM to build a new implementation of the mainframe instruction set out of modern electronics (thus getting most of the benefits of all the advances), just so it can run the code. It's kind of like a horse and buggy built out of modern materials and powered by a fuel cell – it looks funny, but it's modern and efficient and gets the job done.

    So, yes, the electronic part of computers gets faster, better and cheaper. And the software seems to get better because it's along for the ride, but it actually tends to get worse, which is why Paleolithic mainframes have been discovered, alive and working, in otherwise modern data centers.
