Category: Dictionary

  • Devil’s Dictionary for 21st Century Finance

    Another in the series introduced here.

    Ambrose Bierce

    A couple of definitions from his book:

    Finance

    Wall Street
    Definitions for 21st Century Finance

    A couple from the student:

    Alternative Data
    The shocking practice among certain rogue financial investors of evaluating investments using non-financial data, from sources such as social media. Leading progressive investors are talking with the appropriate authorities in hopes that strong regulations will be issued to put a halt to this kind of prejudicial and unfair practice.

    Quantitative Fund

    A fund that uses methods that involve lots of numbers, in sharp contrast to earlier methods that emphasized more qualitative measures such as “lots and lots,” and “not too much.”

    Quantamental

    An approach to investing that purports to combine two incompatible investment methods, quantitative and fundamental, to their mutual advantage. It is based on the success of other combinations of opposites, like combining overpaying and underpaying in order to pay just the right amount for something.

    Conclusion

    Some bad things never seem to end…

  • Devil’s Dictionary for 21st Century Computing 3

    More cynical definitions in the series introduced here, for Deep Learning and Blockchain.

    Ambrose Bierce

    A couple of definitions from his book:

    Cynic

    Conversation

    Consult

    Stan Kelly-Bootle

    Mr. Kelly-Bootle sometimes provided extended explanations of the words he defined:

    Alpha

    Sometimes he even needed illustrations. See the two definitions below, followed by illustrations:

    ASCII

    ASL

    Definitions for 21st Century Computing

    A couple more from the student:

    Deep Learning

    Deep learning is an evolution of shallow neural networks, in which the neural networks are stacked in many layers, making them “deep.”
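Taken at face value, the sentence above can be sketched in a few lines. This is a toy, framework-free illustration (the function names and layer sizes are invented here, not taken from any particular library) of how stacking layers turns a shallow network into a "deep" one:

```python
import numpy as np

def relu(x):
    # Standard elementwise nonlinearity between layers.
    return np.maximum(0.0, x)

def forward(x, layers):
    """Pass the input through each (weights, bias) layer in turn."""
    for w, b in layers:
        x = relu(x @ w + b)
    return x

rng = np.random.default_rng(0)
# Three stacked layers -- the stacking is what makes the network
# "deep" rather than "shallow" (one layer would be the old idea).
sizes = [4, 8, 8, 2]
layers = [(rng.normal(size=(m, n)), np.zeros(n))
          for m, n in zip(sizes, sizes[1:])]

out = forward(rng.normal(size=(1, 4)), layers)
print(out.shape)  # (1, 2)
```

Each layer feeds its output to the next, so adding entries to `sizes` makes the network deeper with no other change to the code.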

    Decades after the 1959 biological model introduced by Nobel Prize-winning scientists Hubel and Wiesel inspired artificial intelligence pioneers at MIT and elsewhere to invent neural network technology, someone noticed that biological neurons are connected in many layers, unlike the single-layer neural networks that AI researchers had been touting for years as the basis for recreating human intelligence inside a machine. Since everyone knows that prestigious artificial intelligence researchers don’t commit errors, or at least simple ones, “deep learning” was introduced as a brand-new idea that would finally crack the code of making machines as smart as the average fifth grader. Someday. Maybe.

    Blockchain

    A hot new technology that is sweeping through the world of finance, healthcare, and elsewhere, whose greatest practical success to date has been the secret transfer of funds between cooperating parties in a criminal enterprise.

    A newly discovered database that has recently been freed from the nearly unbreakable bonds of its cryptocurrency prison; however, as a new kind of database, it stubbornly refuses to be classified as a “database,” preferring to be known as a “distributed ledger,” of which it is apparently the only known exemplar. A cynic might point out that the stubborn refusal to agree to be part of genus database-imus may be due to the wholly inadequate functionality and performance of blockchain on generally accepted measures of database value, but this is almost certainly unfair to such a widely hailed future solution to problems that undoubtedly are pressing, and have resisted solution for many years.

    Conclusion

    I apologize in advance: there could be more to come.

  • Devil’s Dictionary for 21st Century Computing 2

    Another in the series introduced here.

    Ambrose Bierce

    A couple of definitions from his book:

    Telephone

    Telescope

    Stan Kelly-Bootle

    Mr. Kelly-Bootle sometimes provided extended explanations of the words he defined:

    Algo

    Algo 2

    Definitions for 21st Century Computing

    A couple more from the student:

    Cognitive Computing

    A totally, absolutely brand-new approach to making computers that are really smart. Cognitive computing is already a success primarily because it has NOTHING whatsoever to do with certain lame technologies that have a decades-long, proven track record of achieving perpetually imminent success. Cognitive computing is primarily backed by a giant company whose roots go back to the technology that popularized the term “hanging chads,” and whose TLA name is alphabetically adjacent to HAL, the star of a movie set in 2001.

    Machine Learning

    The term for a growing collection of dozens of techniques that have been developed in the continuing quest to teach machines enough so that they can score better than they currently do on the college entrance exams. Until the quest for effective machine learning yields better results, machines will continue to be relegated to second-class status among the company of educated things.

    The advocates of machine learning are known to be a fiercely contentious lot, each asserting that its own approach is superior to all others, and that any evidence adduced to the contrary is propaganda, fake news of the worst sort, stemming from jealous advocates of inferior approaches. The closest approximation to the internecine warfare of the machine learning field is the human learning field, in which advocates of public, government-run and union-staffed schools exchange harsh words with advocates of charter schools, with a level of invective and passion that indicates that someone is strongly in favor of hopelessly uneducated machines and/or humans.

    Conclusion

    I apologize in advance: there could be more to come.

  • Devil’s Dictionary for 21st Century Computing

    Ambrose Bierce wrote The Devil’s Dictionary in 1911, delighting and edifying cynics everywhere. Stan Kelly-Bootle wrote a new version for the world of computing, called The Devil’s DP Dictionary, in 1981, and a later edition in 1995 called The Computer ContraDictionary. These are timeless works, providing valuable insight and inspiration for cynics to this day. But there are modern computing terms that came into use after these geniuses had passed on to their reward. It’s time for at least a first draft of a Computer Cynic’s Dictionary for the 21st Century.

    Ambrose Bierce

    Mr. Bierce started publishing definitions many years before the first book appeared. Here is the start of a column from 1881:

    Devil

    You can see that from the very start, Mr. Bierce had the ability to get at the heart of things using few words.

    Stan Kelly-Bootle

    Ambrose Bierce was clearly a tough act to follow, but the new computer technology was such rich soil that Mr. Kelly-Bootle felt that an attempt had to be made. And a heroic attempt it was, providing insight and edification all these years later. The following couple of simple definitions get right to the point:

    Stan

    In other definitions, he gets a bit more cutting:

    CS

    Cynicism in the 21st Century

    Many new terms have entered the world of computing since Mr. Kelly-Bootle last graced us with his wisdom. Reasonable people may ask, “Is cynicism dead?” “Will such juicy targets remain unskewered?”

    I have searched high and (especially) low, and found nothing but piles of dry computer-babble, peppered with ignorance and misinformation. I have yet to find a good source of penetrating definitions for any of the terms being thrown wildly about in today’s discourse. I feel I have no choice but to offer some of my own definitions, sad exemplars of the type though they be, in hope of challenging those with the true, deep knowledge of a Bierce or Bootle to counter with their own superior definitions.

    Here is the first installment. Should I somehow avoid assassination, more will follow in future posts.

    Big Data

    A subject of which no self-respecting executive may claim ignorance; an expensive, ever-growing collection of hardware and software managed by people who spout a dizzying array of acronyms with confidence and certainty, with mounting expenses and benefits that are just about to be realized.

    A collection of data, presumed to be large but normally fitting in a backpack with room to spare, which is said to contain untold riches if only they can be found and unlocked with mysterious keys like Hadoop.

    An approach to analyzing incredibly huuuuge collections of data that has been recently invented, bearing no resemblance whatsoever to outdated technologies such as data warehousing and business intelligence, and sharing none of their drawbacks.

    Artificial Intelligence

    A kind of intelligence, sometimes implemented by computers, which would be decisively rejected by all right-thinking people if it were food. It is the opposite of organic, free-range, unprocessed intelligence – it is chock-full of GMOs, fructose and artificial ingredients of many kinds.

    The growing crisis of insufficient intelligence is being addressed by some leading scientists, who are leading the way in the creation of artificial intelligence to fill in the gaps left by inadequate supplies of naturally-occurring intelligence. Like the green revolution in agriculture, many hope that this emerging “grey revolution” will put a stop to the persistent intelligence shortages that make so many miserable. While some elites sneer that artificial, non-organic intelligence is deeply harmful, most of the deprived are glad to be served intelligence of any kind, however artificial it may be, rather than their current meager diets containing precious little intelligence of any kind.

    A purposely vague term, referring to an ever-growing set of tools and techniques, that are said to do stuff that people usually do, only better. AI programs have advanced from early victories in playing checkers to wins against chess masters. They have finally achieved the pinnacle of human intelligence, winning the game show Jeopardy. After decades of marching from success to success, today's leaders of Artificial Intelligence anticipate that practical applications of the technology are certain to emerge. If not, they threaten to further inflate the definition of Artificial Intelligence to encompass normal computer programs written by ordinary human beings, at which point success will be theirs — since a computer program is, without doubt, artificial.

    Conclusion

    I expect to release more definitions in the course of this year.
