Paperback: $16 | E-book: $10 | Print + E-book: $20
From its beginning, the World Wide Web seemed both inherently singular and global. Today, Scott Malcomson contends in Splinternet, the Internet is cracking apart into discrete groups no longer willing, or able, to connect.
An interview with Scott L. Malcomson, author of Splinternet: How Geopolitics and Commerce Are Fragmenting the World Wide Web (publication March 17, 2016)
“Virtual reality,” in the sense of creating a virtual space you can interact with technologically, is at the heart of the relationship between war and computing.
That relationship goes back to World War I, when rival navies tried to find ways of hitting each other’s ships when they couldn’t see them. Navies had to create systems that could take in a number of information inputs—such as the ship’s speed and heading and the gun’s size and angle—and crunch the numbers to arrive at the best way to fire their guns so as to hit the enemy ship even when they weren’t certain where it was. These were called “fire control systems.” The calculating machinery—pre-digital—was called a “computer.” Modern gunnery required the invention of computers because no group of humans could calculate and recalculate the variables quickly enough on their own. With a human to direct the system, it became a computer-based command-and-control system.
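To make the arithmetic concrete, here is a minimal sketch of the simplest deflection problem a fire-control computer mechanized: given a target’s current position and velocity and a shell’s speed, solve for when and where shell and target coincide. This is an illustration only, not anything from the book—the function name is invented, and it assumes a flat plane, constant speeds, and a stationary firing ship; real naval systems also folded in own-ship motion, wind, roll, and ballistics tables.

```python
import math

def firing_solution(tx, ty, tvx, tvy, shell_speed):
    """Return (bearing_degrees, time_of_flight) for a shell fired from the
    origin at constant shell_speed to hit a target now at (tx, ty) moving
    with constant velocity (tvx, tvy). Returns None if no intercept exists."""
    # The shell and target coincide at time t when
    #   |(tx, ty) + (tvx, tvy) * t| = shell_speed * t,
    # which rearranges to the quadratic a*t^2 + b*t + c = 0:
    a = tvx**2 + tvy**2 - shell_speed**2
    b = 2.0 * (tx * tvx + ty * tvy)
    c = tx**2 + ty**2
    if abs(a) < 1e-12:                  # shell exactly as fast as the target
        if b >= 0:
            return None
        t = -c / b
    else:
        disc = b * b - 4.0 * a * c
        if disc < 0:
            return None                 # shell too slow ever to catch up
        roots = [(-b - math.sqrt(disc)) / (2.0 * a),
                 (-b + math.sqrt(disc)) / (2.0 * a)]
        hits = [t for t in roots if t > 0]
        if not hits:
            return None
        t = min(hits)                   # earliest possible intercept
    # Aim at where the target will be, not where it is now.
    aim_x, aim_y = tx + tvx * t, ty + tvy * t
    return math.degrees(math.atan2(aim_y, aim_x)), t

# A target 10 km dead ahead, steaming 15 m/s across the line of sight,
# engaged with an 800 m/s shell: lead it by about a degree, ~12.5 s flight.
print(firing_solution(10_000, 0, 0, 15, 800))
```

Even this toy version means solving a fresh quadratic for every new sighting, which is exactly the kind of relentless recalculation that outran human plotters and forced the machinery into existence.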
The use of computers to refine the command and control of guns in war was the basis for developing computing in the 20th century.
There were other very important strands—cryptography, telegraphy, telephony—but command and control of weaponry was absolutely the crux. In terms of computing, imagining warfare—creating a virtual reality from actual conflict variables—was where it all began.
In important ways, that hasn’t changed. Drone warfare is command-and-control computing.
One “first” is how the book demonstrates, from 1916 through to the present, the continuous centrality of warfare to the development of computing. People have sometimes thought that computing developed as an accidental byproduct of war or military spending—like nylon or Tang. But war was fundamental to computing and shaped it profoundly as a command-and-control system that could take in many different inputs (or “intelligence,” as it was sometimes called), compute their relationships, and deliver a result that would serve the purposes of whoever was in command. War is in the DNA of computing.
A second “first” is the deep link between computing and intelligence. Computers in command-and-control systems took in intelligence—first about ships and weapons, then about planes and submarines, then about missiles, and eventually about cellphones and other geolocated devices. Advances like radar, decryption, and GPS improved the quality of intelligence inputs. But the essence was that computers could analyze intelligence, and calculate the best options for responding to it, at a speed unavailable to humans. Many of the pioneers of computing, from 1916 to the 1970s and indeed later, were part of the intelligence world. I demonstrate that in this book, and I don’t think many people were really aware of it, even some within the intelligence community. But it is a century-long, continuous, and still ongoing relationship.
A third “first” is the tie I show between military spending and venture capital. We think of these now as being very distinct, but the idea of spending a lot of money on a range of potential technologies, accepting that some would be losers but that the breakthroughs would carry the rest, was born in the context of modern, technologically driven military spending and migrated from there directly into the private sector after World War II, indeed with many of the same personnel. The U.S. government effectively invented venture capital.
A fourth “first” is the demonstration that the U.S. military-industrial-academic complex really dates to World War I, when it was partly a reaction against the increased specialization of scientific disciplines. This three-part complex immediately proved to have a tremendous power to innovate, not least because it was cross-disciplinary.
Finally, this book is probably unique in how it places computing today, especially the World Wide Web, in a century of state-led investment and innovation directed above all at advancing foreign policy—overwhelmingly, of course, U.S. foreign policy, but increasingly the foreign policies of all states.
Oh, no, not at all! The book has a wonderful cast of characters and is told, as much as I could do it, as a story. The first generation of computer pioneers—Vannevar Bush, Norbert Wiener, and Warren Weaver are major characters in my book—all served in World War I as very young men and shaped computing through the Second World War and beyond. A younger group—I focus on J.C.R. Licklider, Frederick Terman, Georges Doriot, and Claude Shannon, as well as Leonard Kleinrock, Tommy Flowers, George Stibitz, Kenneth Olsen, and some others—takes the story through World War II and into the Cold War. The ’60s and ’70s crew includes Steve Jobs, John McCarthy, Stewart Brand, and Bill Draper; in the ’80s we get Al Gore and then Tim Berners-Lee, Jon Postel, and J. Robert Beyster. From there on the actors become a bit more institutional: the NSA and CIA, Network Solutions, ICANN, Google, Amazon, the Cult of the Dead Cow, though also Edward Snowden, Tim Cook, and James Comey.
Overall it is a compellingly quirky cast of characters. All white men, you’ll note, and most (but not all) of a rather megalomaniacal bent.
They are mostly there too, or the institutions they built, but it is a short book! I wanted to make this a brief, accessible guide to the past, present, and future of computing and, especially, the Internet: something specialists could learn from, but one the general reader could also enjoy and get a lot out of. The issues it treats affect all of our lives quite profoundly.
That is the Internet I grew up with and that’s where my personal sympathies lie. I was born in 1961 and so am a pseudo-baby-boomer—“pseudo” in that Jim Morrison was dead well before my first date. I also write as a native Californian. The combination carries with it a bedrock optimism and a natural taste for liberation and novelty. It is at this point a rueful, tempered optimism. Yet I do believe that the emancipatory side of technology is not just a dream and that to look at Web reality squarely is a service to optimism. It toughens it up.
True. But it is what we are getting. States are slowly carving up the Internet. The universal Internet, in my view, was ultimately dependent on a high level of American geopolitical, as well as technological, dominance, and I see no reason to think either that that dominance will recur or that a universal Internet can be achieved by some other means. What we need to do now is figure out how to preserve those universal aspects of the Internet that must and can be preserved, and how to deal at the same time with the growing assertiveness of states. My own recommendations are that the global engineering subculture—one hero of the book—continue to dominate Internet governance; that we accept that a modest sanitizing of the public Internet is the price to pay for having one at all; that the dark Web is the necessary place for criminals and law enforcers, spies, terrorists and certain kinds of hackers; that the private sector’s passion for scalability is an indispensable counterforce to state control; and that the nation-state’s passion for security must never be allowed to drag cyberspace into cyberwar.