What is ‘legacy IT’ and how scared should we be of it?

Everyone agrees that modernization of federal IT infrastructure and systems is necessary, and the swift passage of the Modernizing Government Technology Act will help. But one area of disagreement is the viability of “legacy” technology and what exactly should be done about it.

The government has been reporting that 75 to 80 percent of the federal IT budget is spent on running legacy (or existing) systems. That may sound alarming to those who aren't familiar with the inner workings of a large IT organization, but the percentage is in line with the industry average. Gartner says the average distribution of IT spending between run, grow and transform activities — across all industries — is 70 percent, 19 percent and 11 percent, respectively. Those numbers have been consistent over the past decade.

In addition, the mere fact that a technology system is legacy, or old, does not mean that it isn't effective. When it comes to working code, the longer it's been running, the better it becomes, as bugs and inefficiencies are eliminated over time. CIOs from the Department of Health and Human Services, the IRS and the Defense Department are just a few who have said that their legacy systems, written in Assembler and COBOL, are well written and can be kept current through ongoing stewardship.

Unfortunately, some technology leaders look past these facts and advocate an overall “rip, rewrite and replace” strategy. In my nearly 30 years with major IT companies, I’ve seen how blanket replacement strategies and new, unproven technologies have fared. More often than you might expect, the result is a waste of precious time, money and energy on an outcome that doesn’t achieve the stated goals. Consider Pennsylvania’s lawsuit against IBM, where a major modernization project to deal with its legacy technology resulted in complete failure.

So, with $500 million in taxpayer dollars ready to be spent, it’s imperative that we take a look at legacy modernization approaches that have worked and those that have failed.

Cloud-first mentality

The cloud is not the answer to every technological need.

A blanket re-platforming of core legacy applications is an enormously expensive and risky undertaking. The key is understanding when the cloud is appropriate and when other platforms offer the optimal approach. Large private organizations with the same "can't fail" reliability needs as the government have long depended on the mainframe for their systems of record. Is their mainframe code "legacy"? Well, it may have started decades ago, but the hardware and software have been continually updated. Ninety-two of the world's top 100 banks still use mainframes and wouldn't think of migrating off the platform.

Cloud-first has become a misguided proxy for what really matters: continuously improving the citizen experience in engaging with federal services. This is the perfect time for federal CIOs to think deeply and critically about a cloud-first strategy, and to ask themselves whether they are really doing what's in the best interest of U.S. citizens or wasting time and taxpayer dollars chasing the "next big thing." This fundamental question should drive every federal CIO's vision and mission.

The two-platform approach

What is working well is a hybrid approach we call two-platform IT, which smartly uses the best of both worlds in appropriate situations. Part one takes advantage of the mainframe's unmatched security, reliability and performance-at-scale for transaction processing and other systems of record — for example, the U.S. federal tax system. Part two uses the cloud for other services or applications that, while essential, don't require the near-perfect reliability the mainframe offers — for example, HR applications.

The mainframe modernization issue

It is true that the experienced mainframe workforce is primarily made up of Baby Boomers who are retiring. What's not true is that this workforce needs to be replaced with employees of equal experience. A new generation of post-modern mainframe solutions brings agility and ease of use to the mainframe. This includes everything from updating the mainframe interface, making it familiar to new IT workers, to speeding up development processes on the mainframe, a platform not traditionally known for rapid development cycles. The result is a multi-platform system that can keep pace with the demands of our mobile, digital economy.

It's also important to note that a well-managed and well-maintained mainframe is the most secure computing infrastructure available, requiring far less protection from outside attack. You may be wondering about the massive OPM breach: Wasn't that a mainframe problem? Actually, no. The distributed systems surrounding the mainframe allowed the breach, and if simple, available security measures had been taken, it could have been avoided. The post-modern mainframe requires 69 percent less effort to secure than other systems, according to IBM research.

So as the MGT Act's new funding and flexible budgeting opportunities are clarified in the coming months, federal CIOs will have to interpret the law to best leverage their legacy systems and determine what changes will truly serve the citizens paying for them.

Is a system old and in need of replacing? Or is it a proven technology that simply requires modernization? MGT dollars will be best spent with careful consideration of these questions.

Chris O'Malley is CEO of Compuware, where he is responsible for setting the company's vision, mission and strategy. With more than 30 years of IT experience, Chris has led the company's transformation into becoming the "mainframe software partner for the next 50 years."
