Guest Column: Eat Your Own Lunch
I graduated from the Scripps School of Journalism at Ohio University in 1982 having never used a computer during my undergraduate education. When I began selling for Popular Science and Times Mirror magazines in 1987, our offices at 380 Madison Ave. had rotary dial phones (clearly without voice mail). We had no fax machines or Federal Express; insertion orders came in via the U.S. Postal Service. No computers, no database-management systems of any kind, no Internet—and obviously no iPods, HD flat-screen TVs or smartphones. My secretary actually took dictation.
Boy, have things changed!
Now my entire life is on my BlackBerry; my iPod is a trusted friend; and my Garmin has gotten me out of many a jam. I can barely remember what the world was like before these devices—and I can't wait for the next technological advancement that will compel me to buy my next gadget. Most of the nation, and the world, is now hooked on technology.
In the mid-'90s, as we began to see what the Internet was and dream about what it could be, many of us recommended that management invest in developing robust 1.0 Web sites to bring our trusted content into the new, digital world.
However, without a real business model established for monetizing (and thus paying for) this development, prudent bottom-line-oriented managers chose not to fund the research and development (R&D) and instead recommended we refocus our attention on our core businesses: making great content, selling subscriptions and selling advertising. This was short-term thinking, driven by shrinking margins as both of our revenue streams eroded. Agencies and clients had begun negotiating off open page rates, and inexpensive subscription generation from the likes of American Family Publishers and Publishers Clearing House was accelerating.