The insurance industry has welcomed technological advances and has continually been among the leading buyers of technology.

From the advent of the earliest insurance systems to modern state-of-the-art application suites, the one constant is that the insurance business continues to move through financial cycles, business cycles, and changes in culture, demographics, regulations, work processes, and products. Many of these changes require the acquisition of new technology or systems. What stands out most in insurance technology is the overall need for systems to be flexible, adaptable, and designed for an unknown future.

The industry's focus on technology is necessary because of the rapidly changing business and consumer insurance marketplace — but it does cause concern in the executive ranks. Insurance CEOs recognize the importance of technology in their business, but they don't want to be known as a technology company that happens to sell insurance. They would much rather be insurance companies that effectively use technology to help the company grow and meet strategic objectives.

Overcoming history

Insurance companies have been major users of computing since the early data processing machines were the norm. Big mainframe computers took up huge amounts of space, required heavy-duty air-conditioning, needed raised floors to hide all the cables and wires, and consumed whole forests of paper. Companies found that computers were very good at managing accounting and financial data, information on customers, agents, coverage and claims, and other administrative tasks. And as companies grew, so did their data processing requirements, causing them to buy more computers, add technology staff, and begin a cycle of ever-increasing technology and administrative costs.

With no real standardization in computing, computer vendors created programming languages that were unique to their own systems. Univac, IBM, Hewlett-Packard and other companies all had programs that could, for example, manage accounting, but each vendor had its own computing language, complicating how different systems could exchange information. The development of COBOL as a common programming language helped reduce the complexity and redundancy of data processing, making it easier to manage programmers and other technology staff.

To make technology more useful and viable, data-processing companies had to develop programs that did more than simply add up numbers and print reports. In other words, they had to create software that solved business problems and either reduced costs or improved processes, rather than just generating printouts. Modern information technology truly began with the development of databases that could manage more information than the 80 columns of a punch card. Once that occurred, computing costs, especially in financial services, started to decline as innovation sparked by advances in technology began to enable true information management.

Apple Macintosh

Drexel University President William Hagerty, left, talks with chemistry professor Allan Smith after the university helped unveil Apple Computer's new Macintosh on Jan. 25, 1984, in Philadelphia. The Macintosh became the mandatory electronic workhorse for the school's 1,870-student freshman class. (Photo: George Widman/AP Photo)

Usefulness comes in through the mini-computer

Although the main use of technology centered on financial and operational reporting, the real change came with the advent of the mini-computer and distributed data processing. The mini-computer, not the personal computer, revolutionized how information was gathered, collected, managed, used, and stored. With the mini-computer, software developers began to create applications aimed at reducing the footprint of "big iron" and getting computing power, and its costs, out of the home office and into branch locations and departments such as underwriting and claims.

As the mainframe market shrank, the market for smaller computing systems grew. Most people are aware of the emergence of two operating systems: Apple's and Microsoft's. What Apple showed was out-of-the-box thinking with the personal computer. Between the Macintosh and the Apple LS2, business began to see the potential for downsized computing needs and upsized capabilities. What Microsoft created was the concept of a standardized operating system that enabled software applications to be developed that would work on different computers. In short, Apple proved the concept; Microsoft paved the way. And the way was for software developers to invent and create tools and applications that made business easier to manage, on machines that took up a fraction of the space and were friendly to users.

Applications: The true beginning

The road to "modern" technology systems begins with the era of Microsoft, Apple and Intel in the mid-1980s. The Apple Macintosh, introduced in 1984, was the first mass-market personal computer. With several prepackaged applications for word processing and spreadsheets, it became a favorite in the world of accounting, proving that the Mac wasn't a toy but a real business tool. Microsoft, at around the same time, developed MS-DOS, which enabled other companies to write software applications that would run on different manufacturers' computers. As a software-development company, Microsoft soon realized that the usefulness and functionality of the personal computer depended on applications that could do word processing, build spreadsheets, handle business mail and messages, and automate other common functions.

Looking back 15 years or so, the depth and breadth of growth in the technology have been breathtaking, and so too are the myriad business and consumer tasks and processes that are now highly automated, functional, and taken for granted as technology continues to advance. A major force behind this Moore's Law-speed innovation has been the insurance industry, which has always been first in line to try, and in some cases to create or innovate, these new systems. As mentioned at the outset, the insurance business is in a constant state of change as our culture changes. To stay on the cutting edge, and even to remain relevant, insurance technology needs to take a cue from its history and keep adapting, stretching, and reaching for the emerging unknown.

John Sarich is vice president of strategy at VUE Software. He has more than 25 years of insurance industry experience and has closely tracked the history of technology's role in insurance. Sarich can be reached at [email protected].
