Things don’t replace things; they just splinter. I can’t tell you how exhausting it is to keep hearing pundits say that some product is the “iPhone killer” or the “Kindle killer.” Listen, dudes: the history of consumer tech is branching, not replacing.
TV was supposed to kill radio. The DVD was supposed to kill the Cineplex. Instant coffee was supposed to replace fresh-brewed.
But here’s the thing: it never happens. You want to know what the future holds? O.K., here you go: there will be both iPhones and Android phones. There will be both satellite radio and AM/FM. There will be both printed books and e-books. Things don’t replace things; they just add on.

This has obvious implications for the relationship among the arts, sciences, humanities, and social sciences: fund them as though they were related branches, not as though some were replacing others.
Second, there's this:
Forget about forever – nothing lasts a year. Of the thousands of products I’ve reviewed in 10 years, only a handful are still on the market.
Oh, you can find some gadgets whose descendants are still around: iPod, BlackBerry, Internet Explorer and so on. But it’s mind-frying to contemplate the millions of dollars and person-years that were spent on products and services that now fill the Great Tech Graveyard: Olympus M-Robe. PocketPC. Smart Display. MicroMV. MSN Explorer.
Much technology comes from university research -- though not as much as we often assume, since three-quarters of R&D, measured in dollars spent, takes place in for-profit companies. But technology's commercial cycles are the opposite of what we're trying to do around here, in a university. We're creating permanent knowledge and skills. They continuously evolve, but they grow out of activities focused on things that last.
Everybody knows that’s the way tech goes. The trick is to accept your gadget’s obsolescence at the time you buy it, so you feel no sense of loss when it’s discontinued next fall.
Belated happy Thanksgiving wishes to all.
2 comments:
I would be really interested to hear your thoughts on the failure of academe to catch on (early) that enterprise systems (the financials, HR, etc.) for universities specifically should have been developed through some sort of collaborative open-source arrangement among universities. Instead, we gave and continue to give hundreds of millions of dollars to Larry Ellison and his ilk -- and also made our staff and faculty provide feedback to make THEIR products better, so that every couple of years we can pay for another new upgrade to fix the bugs in the previous versions...
(and then we have Bill Gates and his thoughts on how unnecessary a physical campus is etc.)
Why didn't the faculty -- at various institutions -- see this coming? Especially if they want to claim patriarchal pride for having fostered it all?
Cloudminder -- sorry to have missed this. I think it's two things: first, faculty have very little contact with administrative systems development, especially at the systemwide level. Second, UC officials are impressed and even intimidated by the private sector, and especially by its leading firms. I spent a lot of time early in the 2000s on such things as the Technology Transfer Advisory Committee, wondering how UC had learned to act like the junior partner in these deals -- not just like the less powerful but like the less intelligent partner. It reflected a cultural "buy-in" to tech transfer, which in simpler terms, in my presence at least, expressed itself as a lack of confidence. In general, UC isn't internally integrated and doesn't use its own distributed expertise very well.