My first column for Tech Decisions was in the summer of 1999. It wasn't called Trends & Technology back then, but the gist was the same: what is happening in technology, and how those changes affect our lives and the way we do business.
In 1999, Y2K issues were occupying a great deal of our time. It was the last good time to be a COBOL programmer, although most of us were trying to hide that skill and leave all that legacy stuff behind. The Internet had arrived—Google was founded in 1998. Silicon Valley was booming and the dot-com bust was a year away. The harsh reality, though, was that the dot-com boom occurred a few years before business and technology were fully ready to embrace the Internet and e-commerce. The bust was inevitable, but the Internet promised years of interesting work for developers.
So where are we today vis-à-vis trends and technology? The emergence of the personal, general-purpose device is the defining factor in the tech world today. Smart phones and tablets have transformed the way we interact with each other and the way we do business. Teenagers have never known a world without the Internet; they have always had access to a personal computer. For them a computer is not an enabling technology—it is part of their life experience.
Three Big Steps
We can identify three game-changing paradigms in computing: the development of the mainframe; the commercialization of the personal computer (1981); and ubiquitous connectivity. Everything else builds on those paradigms and on continual improvements in engineering.
I recently saw a sketch of a personal device created by Nathan Myhrvold (former Microsoft CTO) in 1991. For all practical purposes it was a design for a smart phone—including email, messaging, GPS technology, a notepad, and wireless network connectivity. It may have taken Steve Jobs to complete that vision when the necessary pieces were in place (very small, efficient processors; touch screen technology; ubiquitous cellular communications; etc.), but the idea was there 16 years before the execution.
Big Iron
Mainframe technology evolved throughout the 1940s and reached its pinnacle in the mid-'60s with the release of the IBM System/360. When I was in college I was fortunate enough to get weekend time on the university's 360. I don't even remember a keypad on the door to the data center. I suspect the only reason we were permitted as much computer time as we were was because no one really knew what to do with it, so it was better to let a bunch of undergraduate geeks burn up expensive computer time than let it sit idle. Early mainframes were characterized by enormous size, high cost, tiny volatile memories, and a total lack of interactivity.
They did do a couple of things very well: they could perform complex calculations and analysis on large amounts of data, and they could store that data. Mainframes were not particularly efficient at either task, but that wasn't the point. They may have been slow and inefficient, but they were far more accurate than armies of green-shaded clerks cranking mechanical calculators and making entries in ledgers and journals.
Mainframes are still with us, and while they have embraced new technologies and paradigms (like virtualization), their core system architecture is still based on 1960s thinking. Like COBOL, they will linger for another decade or so and then fade into the past, like piston-driven commercial airliners.
Personal Computers
The release of the Altair 8800 kit in 1975 marked the birth of the personal computer. Bill Gates and Paul Allen got one and developed what they called Altair BASIC. You probably know the rest of the story. The Altair reshaped the way we all think about computers. For the first time, individuals could completely control a computing device, limited only by their skill and imagination.
The first commercially successful PCs were the Apple II (1977) and the IBM PC (1981), and those machines are credited with popularizing the PC. The bottom line, though, was that they were too expensive for most would-be programmers. Machines like the TRS-80 and the Commodore PET and 64 were relatively affordable and spawned an entire generation of innovative, youthful programmers who taught themselves BASIC and machine instructions and made those machines sing and dance.
That generation of self-taught developers created the computing paradigms that defined the phenomenal growth of computing: object-oriented programming, multi-tier technology, distributed computing, and enterprise databases that run on PC-like devices. When everyone was writing object-oriented code in languages like C++, universities were still teaching FORTRAN or silly things like RPG. Small wonder that the best and brightest developers of the 1980s and '90s were, by and large, anti-establishment types.
Intelligent Workers?
The personal computer revolution not only put machines in the hands of developers, it also put them in front of business users and individuals. The so-called information worker was born from the need to justify thousands of managers and workers sitting in front of these machines day in and day out. I think the jury is still out on the actual productivity gains we have realized by transforming office workers into information workers. Most of the business-related tasks the average employee performs on a computer are purely clerical.
I suspect a substantial part of an employee's workload is now consumed by work that never existed before we created the information worker and the wealth of self-perpetuating tasks and reports fostered by that model. Most business is about selling goods and/or services at a price that exceeds our cost. Computers improve our ability to do that—making it possible to react immediately to market trends, an obvious business value.
What is not so obviously valuable is all the busy little worker bees churning out endless email, documents, spreadsheets, charts, and all the other artifacts we have come to associate with the business world. Information workers justify their existence through defining and fulfilling tasks when the business might be better served by creative thinking. A whiteboard is a more compelling medium for the creative process than a keyboard.
Online all the Time
John Gage, former chief researcher and vice president at Sun Microsystems, is credited as the originator of the phrase "the network is the computer," although others have occasionally taken credit. The realization of that statement—continuously connected computing—characterizes the technology trend we are currently experiencing. Apple may be riding the crest of the wave right now, but the rest of the field is not far behind. The defining characteristics of the personal device are few. It must support common standards. It must connect easily to the Internet and local networks. It must be intuitively easy to use.
A Computing Swiss Army Knife
The most interesting characteristic of the current best-of-breed devices like the iPhone and iPad is that they replace a multitude of other devices. I think I once paid $400 or so for a GPS for my car. That device had two memorable traits: it kept falling off my windshield and flying across the dashboard during hard cornering, and every year the manufacturer sent a CD with updated maps that I could unlock for another $200. That never happened.
I haven't purchased a music CD in about four years. My component stereo system is gone. I haven't purchased a paper book in about two years. I no longer take any daily newspapers that are made from paper. Using Skype or FaceTime is as easy as making a phone call. I no longer need cable TV because I can stream video to my sets. During a recent tornado warning I was able to stay comfortably in my family room when the sirens sounded because I was able to track the progress of the storm in real time.
Personal devices are truly universal multimedia devices. They don't have anywhere near the computing capacity of a laptop, but most users, for most use cases, don't require all that power. I am typing this on a very expensive machine that is probably wishing I would get back to work so it could do what it was designed to do.
A Re-emerging Player?
Windows 8 could be the tipping point that puts Microsoft back in the catbird seat. I am very impressed with the UI. They have taken a lesson from the Apple and Android devices and incorporated what they learned into a touchscreen-ready OS for the x64 platform. All that's needed is some killer hardware to support the software.
One interesting thing about personal devices is the way in which generations and social groups define how the device is used. Teenagers spend endless hours texting while users in my generation are more likely to email or use instant messaging.
Does that mean that Millennials or Gen Y-ers are onto something that Boomers can't grasp? No. The pundits always want to say that one generation or another doesn't get it, but it is just socio-economic reality. Texting is easier and quicker than using voice—and it is silent. Kids don't want anyone—particularly parents or teachers—listening in on their conversations.
Texting is fairly surreptitious. And it is cheaper: kids don't always have the expensive data plans that make email possible. It has the additional benefit of being cool and kind of secret. Your mom may know what GR8 means but probably doesn't get AYSOS. The reality is that these devices are so unspecialized that everyone finds a use that suits their needs. If you are unable to do so, you just may be from Generation Luddite.
Perfect Storm
This convergence of technology, engineering, and connectivity leads us into a really interesting area. Employees now want to bring their personal devices into the workplace and connect them to corporate networks, data, and email. And because the desire to do so has been driven from above (senior managers and executives are generally the first to bring these devices inside the firewall), it has generally been accepted.
If the VP of Sales wants to check out the daily sales data on her tablet, the security team has little recourse but to make it happen. That leads to a trickle-down effect, and soon we have a multitude of devices in the workplace.
Security aside, there are other issues with the "bring your own device" movement. I can control what is installed on a corporate computer. I have little control over personal devices.
I was recently in a conference room where a gentleman was tapping away furiously on his Amazon Fire. He could have been taking very good notes…or he could have been responding to personal email…or he could have been playing "Where's My Water?" Judging by his lack of response when asked a question, I suspect it was the latter.
Personal devices in the corporate environment are here and can't be ignored. We had better find a way to embrace them and still maintain productivity, without alienating a generation or two who have never known a world without them. TD
Please address comments, complaints, and suggestions to the author at [email protected].