From the Editor: When Technologies Collide
I've been reading a short history of computing, aptly titled "Computing: A Concise History," by Paul E. Ceruzzi. The book covers a lot of familiar ground (transistors, IBM, Tim Berners-Lee) peppered with many surprises—I never knew the term "digital" was coined during World War II to describe a type of anti-aircraft gun. Or that the circuit board was invented for the "proximity fuse," which used radio waves to remotely detonate shells (the electronics had to be shock-resistant during launch).
One of Ceruzzi's most interesting insights concerns the importance of convergence to technological development. Computing as we now know it, he writes, "represents a convergence of many different streams of techniques, devices and machines, each coming from its own separate historical avenue of development." He cites the smartphone, in which a variety of once-separate technologies (telephone, radio, television, computer, phonograph) came together, unexpectedly, to become more than the sum of their parts.
Because convergence can be hard to predict, we are often surprised by the impact of emerging technologies. Smartphones and tablets were not on the radar six years ago; now, they threaten to upend print as a primary book and magazine reading format. Those who see technologies as static or siloed (print here, digital there) risk overlooking critical trends. Publishers must be flexible enough to adjust to rapid development and unexpected turns—what Lynda Hammes, publisher of Foreign Affairs, calls "agility," "energy" and "a kind of start-up attitude." Foreign Affairs, profiled in this issue, designed its mobile products to be iterative, ready to respond to evolving technologies and trends.
"Today, you cannot go behind the curtain and [unveil] something—'Ta Da!'—many months later," Hammes tells Publishing Executive. "You have to be in a constant mode of improvement and responding to new technology."
What does convergence mean for the future? It means: look out! What seems odd today might be the norm tomorrow. In the 1960s, IBM never imagined its mainframe computer business (first enabled by the vacuum tube, which was developed for radios and telephones) could be threatened by a bunch of hippie hobbyists working with computer kits. The personal computer revolution was made possible by transistors, microprocessors, memory chips built on the circuit board concept, business machines like the Teletype, and communications technology with roots in the telegraph. Each of these components was developed with a different purpose in mind.
In this issue, Samir "Mr. Magazine" Husni makes a bold statement bordering on the downright eccentric: "If we can imagine a day when print may no longer exist, why do we not imagine the reverse: a day when digital may disappear?" It seems a very odd prediction, but if history teaches us anything, it is that we should rule nothing out. The other day I was reading an article in Foreign Affairs (more convergence!) about the coming revolution in digital fabrication, which will allow machines similar to 3-D printers to manufacture anything, on demand. As with the history of computing, the author sees this technology eventually trickling down to the individual level: "personal fabrication," Star Trek replicator style. Imagine being able to ask a machine to create the latest issue of People, and it appears, right in front of you. Could this unexpected use of a new technology, a new way to manufacture and deliver print, supersede digital reading devices?
The possibilities are intriguing.