We all know that information about our adversaries and partner countries is, and always has been, critical to national security. Before the interconnected world of the internet, information operations were largely a matter of human spying, one person to another. As technology has progressed, information has only grown more critical, and today it is the most critical component of modern warfare.
The root of today’s information technology can be traced back to the humble binary numbering system of ones and zeros. Before that, information was locked into the analog world of sine-wave radio-frequency communications, decimal numbers, and language-specific alphabets.
Using the binary numbering system of 1’s and 0’s, called bits, changed all of that, because bits have the revolutionary ability to be processed through electronic circuits that easily translate an “on” transistor as a 1 bit and an “off” transistor as a 0 bit. It is this simplicity that enables noisy electronic signals to be correctly interpreted as 1’s or 0’s. Combining this electronic simplicity with the invention of integrated circuits, which now pack billions of transistors into a single chip, has given rise to our modern digital world.
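To make the idea concrete, here is a minimal sketch (in Python, with an illustrative two-character message) of how ordinary text reduces to patterns of 1’s and 0’s, each digit of which a transistor can hold as “on” or “off”:

```python
# Illustrative example: each character is stored as a number,
# and each number as a pattern of eight binary digits (bits).
message = "Hi"

for ch in message:
    code = ord(ch)              # the character's numeric code (e.g. 'H' -> 72)
    bits = format(code, "08b")  # the same number as eight 1's and 0's
    print(ch, code, bits)       # H 72 01001000 / i 105 01101001
```

Eight transistors, each switched on (1) or off (0), are enough to hold any one of these characters, and a circuit only has to distinguish two states rather than a continuous signal.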
Likewise, in wired and wireless communication systems, detecting 1’s and 0’s in transmitted information is much easier than accurately reproducing an analog sine wave. So the simple idea of binary bits has enabled orders of magnitude more information to be transmitted within a given frequency band, wire, or glass fiber strand, and stored within a magnetic or electronic medium. As an example, undersea fiber-optic cables carry roughly 95% of today’s intercontinental data traffic, and a single fiber can now transmit as much as one terabit (10^12 bits) per second.
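To put that terabit figure in everyday terms, a quick back-of-the-envelope calculation (illustrative numbers only):

```python
# One terabit per second, converted to more familiar units.
terabit_per_s = 10**12            # bits per second
bytes_per_s = terabit_per_s / 8   # 8 bits per byte -> 1.25e11 bytes/s (125 GB/s)

# Time to push a hypothetical 8-gigabit (1 GB) file through the fiber:
file_bits = 8 * 10**9
seconds = file_bits / terabit_per_s   # 0.008 s -- eight milliseconds
print(bytes_per_s, seconds)
```

In other words, a single strand of glass can move on the order of a hundred gigabytes every second.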
The net result is our current world, where everything is represented, stored, moved, processed, and controlled by bits, and by the machinery that translates those bits for our analog human brains. I used to have a picture of planet earth on my college dorm wall with a caption that said, “The world is analog, isn’t it?”
Humans are the dominant species on our planet because we are superior information sharers. And when information isn’t shared with us, we steal it. As such, warfare, as a component of tribal or national security, has always been a cat-and-mouse game of staying ahead of one’s adversaries. In the past, tribal or national security was about obtaining and controlling physical space, through military and/or police force, to enable rule of law.
In today’s world, physical space is still critical, but bits now control all of the primary elements of our human lives. By that I mean that bits literally control all of the primary flows on our planet: food, water, money, commodities, manufactured goods, real estate, banking, stocks & bonds, and energy in every form. Therefore, if the right bits don’t get to the right place, at the right time, our planet’s flows are disrupted!
On our multi-thousand-year human timeline, cybersecurity has only been a real thing for the last microsecond. The term cybersecurity emerged around 1989, when the prefix cyber began to be attached to words to make them sound more futuristic and interesting. Cyber comes from the term cybernetics, coined in 1948 and most famously defined by Norbert Wiener, who characterized cybernetics as “the scientific study of control and communication in the animal and the machine.”
As we often hear, the foundation of our modern internet, and all of the computing machinery that makes it possible, was not designed with cybersecurity in mind. Security wasn’t considered important at the time, and the inventors had no idea that DoD’s ARPANET would become the dominant network technology on our planet. Yet less than twenty years after ARPANET’s beginnings, the first foreign cyber intrusions against our military appeared in the early 90s. Not until the late 90s did the DoD begin to take stronger action to prevent cyber intrusions, and DARPA’s very first cybersecurity research project was initiated in the late 90s as well.
In 1999, the futurist Ray Kurzweil, in his book “The Age of Spiritual Machines,” predicted that by 2009, “The security of computation and communications is the primary focus of the Department of Defense. There is general recognition that the side that can maintain the integrity of computational resources will dominate the battlefield.” Interestingly, U.S. Cyber Command stood up in mid-2009, and likewise, each of the military services stood up an operational cyber command around that time. The Navy revived the name 10th Fleet for its cyber command (it was originally a special anti-submarine command during WWII).
Kurzweil’s predictions largely build on Moore’s Law, first articulated by Gordon Moore in 1965 and revised in 1975, which holds that the number of transistors on an integrated circuit doubles roughly every two years. Following Moore’s Law, it has been predicted that by 2024 a single $1,000 chip will have more computing power than the human brain, and by 2045 more than the human race. That Moore’s Law has held up since 1975 led Kurzweil to embrace the idea, in his 2005 book “The Singularity Is Near,” that “By the 2040’s, non-biological intelligence will be a billion times more capable than biological intelligence (a.k.a. us).” [Recall from your math classes that a singularity is a point where a function goes to infinity or is otherwise not well behaved.]
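The power of that doubling is easy to underestimate, so here is a rough sketch of exponential Moore’s-Law growth. The 1975 baseline transistor count and the exact doubling period are illustrative assumptions, not historical data:

```python
# Idealized Moore's-Law growth: transistor count doubling every two years.
BASE_YEAR, BASE_COUNT = 1975, 65_000   # illustrative mid-1970s chip

def transistors(year, doubling_years=2):
    """Projected transistor count under a pure doubling model."""
    return BASE_COUNT * 2 ** ((year - BASE_YEAR) / doubling_years)

# Fifty years of doubling turns tens of thousands into trillions.
print(f"{transistors(2025):.2e}")
```

Under this idealized model, 25 doublings multiply the count by 2^25 (about 33 million), which is the kind of compounding behind Kurzweil’s projections.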
So in our world now dominated by bits and preyed upon by criminal and nation-state hackers, where does it all leave our human race? It leaves us in a mess, where we can only hope that new technologies and the right human choices will carry us safely forward!
In future posts I will discuss our changing world of cryptocurrency, blockchain contracts, quantum computing, and software 2.0. Could these new technologies outpace the cybersecurity threats that plague our current world?