IRS IT Investments Face Serious Risks, GAO Says

JUN. 1, 2018

GAO-18-298

INFORMATION TECHNOLOGY

IRS Needs to Take Additional Actions to Address Significant Risks to Tax Processing

June 2018

Highlights of GAO-18-298, a report to congressional committees

Why GAO Did This Study

IRS relies extensively on IT investments to annually collect more than $3 trillion in taxes, distribute more than $400 billion in refunds, and carry out its mission of providing service to America's taxpayers in meeting their tax obligations. For fiscal years 2016 and 2017, the agency reported spending approximately $2.7 billion and $2.6 billion, respectively, for IT investments.

GAO was asked to review IRS's IT operations. GAO's specific objectives were to (1) evaluate the performance of selected IRS IT investments, (2) summarize any risks associated with selected legacy systems and evaluate the steps the agency has taken to manage such risks, and (3) determine the extent to which IRS has implemented key IT workforce planning practices.

GAO analyzed planned versus actual performance information for nine selected investments for fiscal year 2016 and the first 2 quarters of fiscal year 2017 — four in development and five in the operations and maintenance phase; identified risks facing three legacy investments and analyzed IRS's efforts to manage these risks against key practices; and analyzed IRS's IT workforce planning efforts against best practices.

What GAO Recommends

GAO recommends that IRS perform operational analyses consistent with guidance, implement key risk management practices, and fully implement key IT workforce planning practices. IRS neither agreed nor disagreed with the recommendations but said it would provide a plan for addressing each recommendation.

View GAO-18-298. For more information, contact David A. Powner at (202) 512-9286 or pownerd@gao.gov.

What GAO Found

The performance of the Internal Revenue Service's (IRS) selected information technology (IT) investments that GAO reviewed varied. Specifically, the four selected investments in the development phase that GAO reviewed spent less than planned, but most were behind schedule and delivered less scope than planned (see table below). In addition, the five selected investments in the operations and maintenance phase that GAO reviewed had performed internal qualitative assessments of performance as required by the Office of Management and Budget (OMB); however, none of the analyses addressed all key factors specified in OMB guidance.

Reported Performance of Selected Internal Revenue Service (IRS) Investments in Development during Fiscal Year 2016 and the First Two Quarters of Fiscal Year 2017

Three investments GAO reviewed in the operations and maintenance phase that are legacy investments — Individual Master File (IMF), Integrated Data Retrieval System (IDRS), and Mainframes and Servers Services and Support (MSSS) — are facing significant risks due to their reliance on legacy programming languages, outdated hardware, and a shortage of human resources with critical skills. For example, IRS reported that it used assembly language code and Common Business Oriented Language (both developed in the 1950s) for IMF and IDRS, which exposes these investments to a rise in procurement and operating costs and a decrease in the availability of staff with the proper skill sets. Further, MSSS relies on a significant amount of outdated hardware, exposing the investment to rising warranty and maintenance fees, as well as equipment failures. Despite these risks, the agency has not fully implemented key risk management practices and may be challenged in mitigating risks effectively so that they do not impact the agency's ability to carry out its mission.

IRS has not yet fully implemented any of the key IT workforce planning practices GAO has previously identified. Specifically, the agency has developed a tool to automate the IT workforce planning process, but the tool is in the initial stages of implementation. IRS officials attributed the limited progress in implementing IT workforce planning practices to resource constraints and competing priorities. Nevertheless, until the agency fully implements these practices, it will continue to face challenges in assessing and addressing the gaps in knowledge and skills that are critical to the success of its key IT investments.


Contents

Letter

Background

Performance of Selected IRS IT Investments Varied

Selected Legacy Investments Face Significant Risks and IRS Has Not Implemented Steps Needed to Effectively Manage These Risks

IRS Has Not Implemented Key IT Workforce Planning Practices

Conclusions

Recommendations for Executive Action

Agency Comments and Our Evaluation

Appendix I Objectives, Scope, and Methodology

Appendix II Comments from the Internal Revenue Service

Appendix III GAO Contact and Staff Acknowledgments

Tables

Table 1: Reported Performance of Selected Internal Revenue Service (IRS) Investments in Development during Fiscal Year 2016 and the First 2 Quarters of Fiscal Year 2017

Table 2: Operational Performance for Five Internal Revenue Service (IRS) Investments in the Operations and Maintenance Phase during Fiscal Year 2016 and the First 2 Quarters of Fiscal Year 2017

Table 3: Assessment of Selected Internal Revenue Service (IRS) Investments' Operational Analyses

Table 4: Key Practices for Risk Management, Based on Leading Practices from the Software Engineering Institute and Office of Management and Budget Guidance

Table 5: Extent to Which the Internal Revenue Service (IRS) Implemented Key Risk Management Practices for Selected Legacy Investments

Table 6: Key Practices for Information Technology (IT) Workforce Planning

Figures

Figure 1: Internal Revenue Service's Information Technology Organization

Figure 2: Hardware Inventory Associated with the Internal Revenue Service's (IRS) Mainframes and Servers Services and Support, as of Fiscal Year 2017

Abbreviations

ACA  Affordable Care Act Administration
CADE 2  Customer Account Data Engine 2
CIO  Chief Information Officer
COBOL  Common Business Oriented Language
ECM  Enterprise Case Management
EUSS  End User Systems and Services
IDRS  Integrated Data Retrieval System
IMF  Individual Master File
IRS  Internal Revenue Service
IT  information technology
MSSS  Mainframes and Servers Services and Support
OMB  Office of Management and Budget
RRP  Return Review Program
Treasury  Department of the Treasury
TSS  Telecommunications Systems and Support


June 28, 2018

The Honorable James Lankford
Chairman
The Honorable Christopher Coons
Ranking Member
Subcommittee on Financial Services and General Government
Committee on Appropriations
United States Senate

The Honorable John Thomas Graves, Jr.
Chairman
The Honorable Michael B. Quigley
Ranking Member
Subcommittee on Financial Services and General Government
Committee on Appropriations
House of Representatives

The Internal Revenue Service (IRS) relies extensively on information technology (IT) investments to annually collect more than $3 trillion in taxes, distribute more than $400 billion in refunds, and carry out its mission of providing service to America's taxpayers in meeting their tax obligations. For fiscal years 2016 and 2017, the agency reported spending approximately $2.7 billion and $2.6 billion, respectively, for these investments.

Given the size and significance of IRS's investments and the challenges inherent in successfully delivering them, you asked us to review the performance of key IT investments and the agency's associated risk management efforts.1 Specifically, our objectives were to (1) evaluate the performance of selected IRS IT investments, (2) summarize any risks associated with selected IRS legacy systems and evaluate the steps the agency has taken to manage them, and (3) determine the extent to which IRS has implemented key IT workforce planning practices.

To address the first objective, we initially identified a non-generalizable sample of the agency's investments based on several factors, including (1) mission criticality; (2) funding for fiscal year 2016, as reported on the Federal IT Dashboard; and (3) investment risk, as determined by IRS. This resulted in the selection of nine investments for our review.

Four of the selected investments were primarily in development during fiscal year 2016 and five investments were primarily in the operations and maintenance phase during the same time frame.2 For the four investments in development, we reviewed documentation, including quarterly reports showing planned versus actual cost, schedule, and scope of work delivered for fiscal year 2016 and the first 2 quarters of fiscal year 2017. We also interviewed relevant officials.

For the five investments in the operations and maintenance phase, we obtained operational performance metrics for fiscal year 2016 and the first 2 quarters of fiscal year 2017, as well as the planned versus actual results against these metrics. We also compared IRS's efforts to assess the operational performance of these investments with key factors identified in the Office of Management and Budget's (OMB) capital programming guidance. These factors included, for example, (1) analyzing how well the investment contributes to achieving the organization's strategic goals; and (2) determining the extent to which the investment supports customer processes as designed, and how well the investment is delivering the goods or services it was designed to deliver.

For the second objective, we chose three investments from our initial selection of nine — Individual Master File (IMF), Integrated Data Retrieval System (IDRS), and Mainframes and Servers Services and Support (MSSS) — because they were originally placed into operation in the late 1960s and early 1970s and, thus, are considered legacy systems. To summarize the risks associated with these systems, we reviewed risk logs, risk detail reports, and reports identifying the number of staff supporting the systems. In addition, we reviewed agency documentation identifying the legacy programming languages used and the age of the supporting hardware.

To evaluate the steps IRS has taken to manage the risks, we analyzed documentation, including risk management plans; risk logs; risk detail reports; and meeting minutes from IRS's Applications Development Risk Review Board. In addition, we evaluated the agency's risk management efforts by comparing the aforementioned documentation to key practices identified in the Software Engineering Institute's Capability Maturity Model® Integration for Development and OMB guidance.3 We also interviewed IRS officials involved in the risk management process.

To address the third objective, we obtained documentation describing a tool that IRS is planning to implement agency-wide to address IT workforce planning. Further, we viewed a demonstration of the functionality provided by this tool, and interviewed officials in IRS's human capital office, as well as investment staff, to determine the extent to which the tool has been implemented across the agency.

We also obtained and reviewed information relative to the agency's cross-functional acquisition training, efforts intended to strengthen IT program management, and results of IT skills assessments. We compared IRS's current and planned IT workforce planning efforts to key practices for IT workforce planning derived from sources such as the Clinger-Cohen Act of 1996;4 Office of Personnel Management workforce planning guidance; and OMB Circular A-130;5 and as identified in our report on IT workforce planning efforts.6

In addition, we interviewed staff responsible for managing four investments to identify the extent to which IT workforce planning best practices were being implemented. We selected the four investments from our initial selection of nine investments according to one or more of the following factors: (1) mission critical designation by IRS, (2) exposure to human capital risk, or (3) status as a key development effort at IRS. Additional details regarding our objectives, scope, and methodology can be found in appendix I.

We conducted this performance audit from November 2016 to June 2018 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.

Background

The mission of the IRS, an agency within the Department of the Treasury (Treasury), is to provide America's taxpayers with top quality service by helping them understand and meet their tax responsibilities; and to enforce the tax law with integrity and fairness to all. In carrying out its mission, IRS annually collects over $3 trillion in taxes from millions of individual taxpayers and numerous other types of taxpayers. It also manages the distribution of over $400 billion in refunds. To guide its future direction, the agency has six strategic goals: (1) empower and enable all taxpayers to meet their tax obligations; (2) protect the integrity of the tax system by encouraging compliance through administering and enforcing the tax code; (3) collaborate with external partners proactively to improve tax administration; (4) cultivate a well-equipped, diverse, flexible and engaged workforce; (5) advance data access, usability and analytics to inform decision making and improve operational outcomes; and (6) drive increased agility, efficiency, effectiveness and security in IRS operations.

The mission of IRS's Information Technology organization is to deliver IT services and solutions that drive effective tax administration to ensure public confidence. It is led by the Chief Information Officer (CIO), who oversees several subordinate offices. Figure 1 shows the structure of IRS's Information Technology organization.

Figure 1: Internal Revenue Service’s Information Technology Organization

IRS Relies on Major IT Investments for Tax Processing

For fiscal year 2016, IRS's IT portfolio contained 137 investments, of which 23 were classified as major.7 According to the agency, it spent approximately $2.7 billion on its IT investments during fiscal year 2016. Of the $2.7 billion, approximately $1.9 billion (70 percent) was spent for operations and maintenance activities, and approximately $800 million (30 percent) was spent for development, modernization, and enhancement.

Among the agency's investments that we selected for our review, the following four were primarily in development during fiscal year 2016:8

  • The Affordable Care Act (ACA) Administration investment encompasses the planning, development, and implementation of IT systems needed to support IRS's tax administration responsibilities associated with the Patient Protection and Affordable Care Act. The agency reported spending $253 million on this investment in fiscal year 2016.9

  • Customer Account Data Engine 2 (CADE 2) is to, among other things, provide daily processing of taxpayer accounts, address a financial material weakness, and maintain a clean audit opinion. It is expected to replace the nearly 50-year-old IMF system that IRS is using to process individual taxpayer accounts. A key project supporting CADE 2 is the Individual Tax Processing Engine project, which, according to the agency, is a complex effort to, among other things, convert approximately 200,000 lines of IMF's legacy assembly language code to Java.10

    According to IRS, the agency has completed an initial phase of converting the assembly language code for core IMF components to Java; however, significant work remains to complete the conversion. Specifically, in October 2017, IRS's CIO stated that the agency could deliver a system to replace the core IMF components in 5 years if it were provided with 50 to 60 employees with the right skills, direct hire authority to hire such employees,11 and associated funding of approximately $85 million each year. The agency reported spending $182.6 million on CADE 2 in fiscal year 2016. (An illustrative sketch of the kind of language conversion involved appears after this list.)

  • The Return Review Program (RRP) is IRS's primary system for fraud detection. As such, it supports the agency's capabilities to detect, resolve, and prevent criminal and civil tax noncompliance. According to IRS, as of May 2017, the system had helped protect over $4.5 billion in revenue. The agency reported spending $100.2 million on this investment in fiscal year 2016.

  • Enterprise Case Management (ECM) is to provide an enterprise solution for performing case management across IRS's business units. According to the agency, its current systems provide limited visibility into case management across programs and involve process redundancies and multiple handoffs that can lead to, among other things, increased risks; ECM is expected to address these limitations. The agency reported spending $38.1 million on the investment in fiscal year 2016.
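
To give a concrete sense of the kind of language conversion described above for CADE 2, the following is a minimal, purely illustrative sketch of what a converted routine might look like in Java, the target language IRS identified. The calculation, class and field names, rates, and rounding rules below are invented for illustration only and are not drawn from IMF, CADE 2, or any other IRS system.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

// Hypothetical example only: illustrates the general shape of tax-processing
// logic after conversion from legacy assembly language to Java. The rates,
// cap, and rounding rule used here are invented and do not reflect IMF logic.
public class IllustrativePenaltyCalculator {

    // Assumed monthly penalty rate (0.5 percent) and an assumed 25 percent cap.
    private static final BigDecimal MONTHLY_RATE = new BigDecimal("0.005");
    private static final BigDecimal MAX_RATE = new BigDecimal("0.25");

    // Computes a capped penalty on an unpaid balance. Logic that legacy
    // assembly code expresses through register manipulation and packed-decimal
    // arithmetic becomes explicit, typed, and unit-testable in Java.
    public static BigDecimal compute(BigDecimal unpaidBalance, int monthsLate) {
        BigDecimal rate = MONTHLY_RATE.multiply(BigDecimal.valueOf(monthsLate));
        if (rate.compareTo(MAX_RATE) > 0) {
            rate = MAX_RATE; // apply the assumed cap
        }
        return unpaidBalance.multiply(rate).setScale(2, RoundingMode.HALF_UP);
    }

    public static void main(String[] args) {
        // Example: a $1,000.00 balance that is 7 months late -> 3.5 percent -> 35.00
        System.out.println(compute(new BigDecimal("1000.00"), 7));
    }
}
```

The sketch is not intended to represent IRS's actual conversion approach; it simply shows why expressing this kind of logic in a widely supported, strongly typed language can make it easier to maintain and to staff, which is consistent with the modernization concerns discussed later in this report.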

Five other investments that we selected for our review were in the operations and maintenance phase during fiscal year 2016:12

  • MSSS represents approximately 73 percent of IRS's IT infrastructure. Specifically, this investment encompasses the design, development, and deployment of servers, middleware and large systems, and enterprise storage infrastructures, including systems software products, databases, and operating systems. The MSSS investment began in 1970. The agency reported spending $499.4 million on this investment in fiscal year 2016.

  • Telecommunications Systems and Support (TSS) provides the voice and data network infrastructure services, video services, and engineering throughout IRS. The TSS investment began in 2001. The agency reported spending $336.4 million on this investment in fiscal year 2016.

  • End User Systems and Services (EUSS) provides desktops, laptops, mobile devices, software, incident management services, and asset management services to end users in IRS. The EUSS investment began in 2002. The agency reported spending $238.0 million on this investment in fiscal year 2016.

  • IDRS is used by IRS employees to review tax information, issue notices to taxpayers, and update taxpayer records. The IDRS investment began in 1973. The agency reported spending $15.8 million on this investment in fiscal year 2016.

  • IMF is IRS's system for processing individual taxpayer account data. Using this system, accounts are updated, taxes are assessed, and refunds are generated as required during each tax filing period. Virtually all IRS information system applications and processes depend on output, directly or indirectly, from this data source. As previously noted, the agency uses assembly language code to program this system, which began in the late 1960s. The agency intends to decommission IMF once CADE 2 is fully implemented; however, as we recently reported, the agency has not provided a target date for decommissioning IMF.13 The agency reported spending $14.3 million on this investment in fiscal year 2016.

GAO Has Made Recommendations to Improve IRS's Major IT Investments, Government-Wide Legacy Systems, and IT Workforce Planning

For several years, we have reported on the performance of IRS's IT investments and identified opportunities for improving the management of these investments.

  • In February 2015, we reported that the agency had provided summary-level Chief Technology Officer risk assessment ratings for the majority of IT investments in quarterly reporting to Congress. However, the agency did not provide such ratings for selected investments for which Congress required detailed reporting, including CADE 2 and RRP, which are the subject of this review.14 We noted that summary-level risk assessment ratings would improve the visibility into risks faced by the investments and provide Congress with the information to more easily determine the investments requiring greater attention. Consequently, we recommended that IRS provide summary-level risk assessment ratings for all major investments in its quarterly reporting to Congress.

    In response to our recommendation, IRS began providing summary-level risk information for all major investments in its fiscal year 2015 second quarter report to Congress. In its report for the fourth quarter of fiscal year 2016, the agency reported that selected aging systems were facing increased risks. In this regard, the agency CIO provided a risk assessment rating for major IT investments, which incorporated risks associated with people, infrastructure, deferred scope, and delivery of agreed-upon scope.

IRS reported that two investments had risk ratings that went from yellow to red.15 Specifically, IMF received a red risk rating for the people factor and CADE 2 received a red risk rating for the delivery of agreed upon scope factor.

  • In a report in June 2016, we noted, among other things, that CADE 2 and ACA did not report information on planned versus actual delivery of functionality in accordance with best practices.16 In addition, ACA did not report timely information on planned versus actual costs. Accordingly, we recommended that IRS report, at least quarterly, scope and cost performance for CADE 2 and ACA, consistent with best practices.

    In response to our recommendation, the agency began reporting on planned versus actual delivery of functionality for CADE 2 starting in fiscal year 2016. However, the agency has not reported on planned versus actual functionality for ACA. In March 2017, officials responsible for managing the investment told us that the agency had not implemented the recommendation because it did not see the benefit in doing so given that the remaining development work was minimal. IRS subsequently completed the development work for ACA in September 2017, at which point the investment transitioned to operations and maintenance. Given the status of the investment, we agree that the recommendation is no longer applicable.

In addition, our prior work has emphasized the importance of IRS more effectively managing its legacy systems.

  • As part of a government-wide review in November 2013, we reported on the extent to which 10 agencies' large investments had undergone operational analyses — a key performance evaluation and oversight mechanism required by OMB to ensure investments in the operations and maintenance phase continue to meet agency needs.17 We noted that MSSS had not had an operational analysis for fiscal year 2012. As a result, we recommended that Treasury perform an operational analysis for the investment. The department did not comment on our recommendation but subsequently performed an operational analysis for the MSSS investment.

  • In May 2016, we reported on legacy IT systems across the federal government, noting that these systems were becoming increasingly obsolete and that many of them used outdated software languages and hardware parts that were unsupported by the vendor.18 As part of that work, we highlighted Treasury's use of assembly language code and Common Business Oriented Language (COBOL) — a programming language developed in the late 1950s and early 1960s — to program its legacy systems.

    We noted the need for agencies to move to more modern, maintainable languages, as appropriate and feasible. Further, we noted that a leading IT research and advisory company had reported that organizations using COBOL should consider replacing the language, and that there should be a shift in focus to using more modern languages for new products. We also pointed out that the use of COBOL presents challenges for agencies given that procurement and operating costs associated with this language will steadily rise, and because fewer people with the proper skill sets are available to support the language.

    Further, we reported that IMF was over 50 years old and, although IRS was working to modernize it, the agency did not have a time frame for completing the modernization or replacement. In addition, we noted that IRS did not have specific activities or timelines for updating MSSS and EUSS. Thus, we recommended that Treasury direct the CIO to identify and plan to modernize or replace IRS's legacy systems. The department had no comments on our recommendation.

We have also previously reported on agencies' IT workforce planning efforts.19 Specifically, in November 2016 we identified eight key IT workforce planning activities based on relevant laws and guidance and noted that the five federal departments in our review, including Treasury, had mixed progress in addressing the activities. We made one recommendation to Treasury and the department agreed with our recommendation.

Performance of Selected IRS IT Investments Varied

The performance of selected IRS IT investments has varied. In this regard, we found that the four selected investments in development had spent less than planned, and that most were behind schedule and had delivered less scope than planned. In addition, most of these investments had significant variances, meaning that actual cost, schedule, or scope varied from plans by more than 10 percent. For the five selected investments in the operations and maintenance phase, we found that most had met all of their operational performance targets and all performed operational analyses required by OMB. However, none of the analyses addressed all key factors specified in OMB's guidance.

The Selected Investments in Development Spent Less than Planned, but Most Were Behind Schedule and Had Delivered Less Scope than Planned

Best practices highlight the importance of monitoring the performance of projects in development by comparing actual cost, schedule, and scope to plans in order to allow appropriate corrective actions if actual performance deviates significantly from planned performance.20

With regard to the four selected investments in development that we reviewed, IRS reported cost, schedule, and scope performance information for ECM, CADE 2, and RRP, but the agency reported only cost and schedule information for ACA. Table 1 provides details reported by the agency on the performance of these IT investments.

Table 1: Reported Performance of Selected Internal Revenue Service (IRS) Investments in Development during Fiscal Year 2016 and the First 2 Quarters of Fiscal Year 2017

Regarding ECM, the agency reported that it spent $1.5 million less than budgeted, had an approximately 9 percent schedule overrun for the three projects it worked on during the time frame of our review, and delivered about 90 percent of planned scope. However, after 18 months of working with a contractor, the agency paused all development activities for the investment because the product that was being delivered did not meet the agency's needs.

Specifically, according to agency officials, including the CIO, the contractor's solution was not sufficiently automated to be scalable across the agency. Thus, IRS subsequently established a new effort to acquire a product that would be aligned with its business needs. The officials stated that the strategy for acquiring the new product includes collaboration with other agencies on experiences in implementing enterprise case management systems and requesting information on potential solutions from commercial vendors.

Regarding CADE 2, IRS reported that it spent $4 million less than it budgeted, had a 54 percent schedule overrun for the 15 projects it worked on during the time frame of our review, and delivered 46 percent of the planned scope. Officials responsible for managing CADE 2 stated that the cost, schedule, and scope variances from planned performance were due to human resource and funding shortages.

Specifically, the agency reported that it does not have an adequate number of staff with expertise in assembly language code and tax processing to perform development work on both its core tax processing system (IMF) and its tax processing system modernization effort (CADE 2), or enough Java programmers to develop and maintain new code. As a result, the agency paused 7 of the 15 projects. (IRS's efforts to address its human resources constraints are discussed later in this report.)

For RRP, the agency reported that it spent $29.5 million less than budgeted, had a 19 percent schedule overrun for the 4 projects it worked on during the time frame of our review, and delivered about 80 percent of planned scope. According to the agency, these variances were due to, among other things, overestimation of planned costs, deferral of planned scope to a future release, and additional time needed to address a development defect.

For ACA, the agency reported that it spent $41.6 million less than planned and was on time for the four projects it worked on. According to the agency, the cost variance was due to, among other things, an initial overestimation of the costs to complete planned work. IRS did not track scope delivery for ACA and, as a result, we could not determine the scope performance for the investment.

As previously mentioned, in June 2016, we recommended that IRS report on actual scope information for ACA at least quarterly.21 In its March 2017 response to the recommendation, the agency stated that it did not see the benefits of implementing the recommendation for the investment given the minimal development work remaining. IRS has since completed this development work and transitioned ACA to the operations and maintenance phase. We agree that the recommendation is no longer applicable given the status of the investment.

The Majority of Selected Investments in the Operations and Maintenance Phase Met Performance Targets

According to OMB's fiscal year 2016 capital planning guidance, ongoing performance of operational investments should be monitored to ensure the investments are meeting the needs of the agency, are delivering expected value, and/or are consistent with the agency's enterprise architecture. To achieve these goals, agencies are required to establish and publicly report on five operational metrics for major IT investments, as well as planned and actual performance against these metrics.22 According to OMB, these metrics seek to answer more subjective questions about areas such as customer satisfaction and financial performance.

IRS reported operational performance metrics, as required, for the five selected investments we reviewed that were in the operations and maintenance phase. Further, three of these investments — IMF, MSSS, and TSS — met all of their operational performance targets, and the remaining two investments — IDRS and EUSS — met four of five operational performance targets during the time frame that we reviewed. Table 2 lists the operational performance metrics for each of the five investments, the metrics' areas of focus, as well as the extent to which IRS met planned performance targets.

Table 2: Operational Performance for Five Internal Revenue Service (IRS) Investments in the Operations and Maintenance Phase during Fiscal Year 2016 and the First 2 Quarters of Fiscal Year 2017

With regard to the investments that did not meet their performance targets:

  • IDRS did not meet its target for IRS employees' usage of the investment for 9 out of the 18 months we reviewed. Officials responsible for managing IDRS stated that this was likely due to a reduction in the number of staff at the agency who access taxpayer accounts and to a lag experienced early in the months before notices are sent out for the filing year.

  • EUSS did not meet its target for the average amount of time IRS employees wait to receive telephone support for 6 out of the 18 months we reviewed. According to officials responsible for managing EUSS, this target was missed due to the attrition of telephone support staff and the agency's inability to hire additional support staff.

Operational Analyses for Selected Investments in the Operations and Maintenance Phase Addressed Most, but Not All Key Factors

OMB's fiscal year 2016 capital programming guidance highlights the importance of operational analyses in examining the ongoing performance of operational investments. The guidance further notes that such analyses should be conducted at least annually and should address, among other things, the following:23

  • the extent to which the investment supports customer processes as designed, and how well the investment is delivering the goods or services it was designed to deliver;

  • how well the investment contributes to achieving the organization's strategic goals;

  • a comparison of current performance with a pre-established cost baseline;

  • alternative methods of achieving the same mission needs and strategic goals; and

  • greater utilization of technology or consolidation of investments to better meet organizational goals.

The five selected investments that we reviewed in the operations and maintenance phase performed operational analyses that addressed most, but not all of these key factors identified in OMB guidance. Specifically, four of the investments addressed five of the six factors, and one investment addressed four of the factors. Table 3 provides our assessment of the investments' operational analyses.

Table 3: Assessment of Selected Internal Revenue Service (IRS) Investments’ Operational Analyses

With regard to the investments that did not address all key factors identified in OMB's guidance:

  • The IMF operational analysis did not address the factor associated with greater utilization of technology or consolidation of investments to better meet organizational goals. Specifically, the analysis stated that the agency is researching the validity of converting legacy assembly language code to a modern programming language. However, the analysis did not more broadly address greater utilization of technology or consolidation to better meet organizational goals, consistent with the key factor in OMB's guidance. In addition, the analysis did not reflect IRS's progress to date in modernizing IMF and the associated challenges. This omission is concerning given the risk exposure from the agency's continued use of the legacy assembly language code. (Such risk is further discussed later in this report.)

  • With respect to IDRS, while the investment is intended to, among other things, provide for systemic review of tax information, issue notices to taxpayers, and update taxpayer records, IRS's performance metrics generally focused on system availability and usage and did not address the extent to which the intended functionality was being provided. Agency officials agreed that IDRS's metrics could be improved to address the extent to which intended functionality is being provided.

  • For TSS, while the investment provides, among other things, video conferencing and enterprise voice and fax services, the operational analysis did not address how well these service offerings were being delivered. IRS officials stated that they had instead evaluated these service offerings in a post-implementation review, which is a one-time effort conducted after an investment has completed development. However, by not addressing the factor in the operational analysis, which is an annual exercise, IRS risks not being continually informed of the extent to which the investment is meeting the needs of the agency. In addition, the operational analysis for the TSS investment did not appropriately include a comparison of current performance with a pre-established cost baseline. Specifically, while the analysis included planned and actual cost figures for fiscal year 2016, the planned cost figure was not complete as it did not account for reimbursable costs and user fees.

  • Regarding MSSS, the operational analysis did not address alternative methods of achieving the same mission needs and strategic goals. Senior officials in IRS's Information Technology organization stated that the agency had performed analyses of alternative methods for achieving mission needs and strategic goals, but these analyses were not included in the operational analysis for the investment.

  • The operational analysis for EUSS did not appropriately include a comparison of current performance with a pre-established cost baseline. Specifically, while the analysis included planned and actual cost figures for fiscal year 2016, the planned cost figure was not complete as it did not include multi-year funding and user fees.

A Branch Chief for the IDRS investment stated that IRS has used the same operational performance metrics for the investment for 10 to 15 years, and the agency has not revisited them to justify their validity over time or to modify them. The Branch Chief further noted that the operational performance metrics are usage-based and do not provide a qualitative measure of how well the investment is delivering intended services. IRS officials did not identify the causes for the deficiencies we noted with the other selected investments' operational analyses.

Until IRS addresses the shortcomings noted for the selected operational investments, the agency risks not having critical information needed to determine whether the investments fully meet intended objectives and whether there are alternative ways to efficiently meet the agency's mission.

Selected Legacy Investments Face Significant Risks and IRS Has Not Implemented Steps Needed to Effectively Manage These Risks

Three selected investments we reviewed — IMF, IDRS, and MSSS — are facing significant risks due to their reliance on legacy programming languages, outdated hardware, and a shortage of human resources with critical skills.24 However, IRS has not implemented steps needed to effectively manage these risks — and thus, the agency's ability to carry out its tax processing and modernization efforts may be impacted.

Two of the three selected investments — IMF and IDRS — rely on legacy programming languages, resulting in increased risk to continuing operation of these investments. Specifically, IRS reported that IMF is written in assembly language code and COBOL, and IDRS is written in COBOL. As we previously reported, reliance on assembly language code and COBOL has risks, such as a rise in procurement and operating costs, and a decrease in the availability of individuals with the proper skill sets.25

In addition, one investment in our review — MSSS — relies on a significant amount of outdated hardware. Specifically, at the start of fiscal year 2017, the agency reported an inventory of approximately $684.2 million in hardware associated with this investment. Of this amount, approximately $430.3 million, or 63 percent, was for outdated hardware, with about 21 percent of that amount directly supporting tax processing. The $430.3 million is broken down as follows:

  • $112.6 million in communications equipment, which includes devices such as network switches26 and telephone systems;

  • $171.5 million in systems supporting IRS employees, which includes desktop and laptop computers, scanners, and printers;

  • $88.9 million in equipment directly supporting tax processing, which includes servers and UNISYS mainframes;27 and

  • $57.3 million in storage equipment, which includes automated tape libraries28 and disk arrays.29

 

Figure 2 illustrates the categories of hardware associated with MSSS, including outdated hardware.

Figure 2: Hardware Inventory Associated with the Internal Revenue Service’s (IRS) Mainframes and Servers Services and Support, as of Fiscal Year 2017

IRS officials stated that the outdated hardware associated with MSSS is expensive to maintain because it is often past its warranty period. Specifically, after a warranty for hardware ends, the maintenance fees for this hardware commonly increase by approximately 25 percent per year. In addition, the officials stated that relying on this hardware has the potential to expose IRS to equipment failures that could preclude its systems from supporting the annual tax filing season and from expanding the systems and tools for enforcement approaches, among other things.
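
To illustrate the compounding effect of that estimate with a hypothetical figure (the starting amount below is assumed for illustration and is not an IRS number): a component whose post-warranty maintenance costs $100,000 in the first year after the warranty expires, and whose fees then rise by roughly 25 percent annually, would cost about $125,000 the following year ($100,000 × 1.25), about $156,000 the year after that, and about $195,000 in the fourth year ($100,000 × 1.25 × 1.25 × 1.25), nearly doubling within three years.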

The three selected investments — IMF, MSSS, and IDRS — are also facing risks due to the attrition of key personnel. For example, IMF program officials noted that developers are responsible for maintaining taxpayer accounts and applying business rules associated with the tax process for a given situation or tax year, and thus require skills beyond creating or updating lines of code. However, according to an internal staffing report for IMF for 2011 to 2017, the agency experienced attrition of developers skilled in legacy programming languages and tax processing, exposing the investment to increased risks of not being able to successfully process tax information. For example:

  • According to the report, from 2011 to 2017, 24 developers responsible for performing work on the IMF investment retired or were transferred to other positions. In addition, as a result of this attrition, 32 developers were available to perform IMF system updates for the 2017 tax filing season, which was about 4 developers (5,840 work hours) fewer than needed to perform the work. Further, as of July 2017, IMF projected a shortage of 3 developers (4,042 work hours) needed for the 2018 tax filing season.

In an internal document identifying options to address the loss of knowledge caused by the attrition of staff for IMF, IRS reported that it has taken various actions as a result of the ongoing attrition of developers. Among others, these actions include: (1) cancelation of planned system enhancements; (2) training and transfer of developers from other projects to perform work on IMF; and (3) reduction in the amount of development work being completed for CADE 2 to address a financial material weakness.30

According to IMF risk logs, the investment also reported potential impacts on tax processing as a result of the attrition. These impacts include (1) the agency's delay in implementing modifications to IMF for the filing season to reflect changes in the tax law, (2) tax processing delays due to the lack of adequate institutional knowledge to resolve complex issues, and (3) a lack of necessary data from IMF, which the agency uses as input for other tax processing systems.

Further, according to the agency's CIO, it takes 4 to 5 years to train developers performing work on the IMF investment. The agency, however, is facing challenges with such training and development. For example, IMF program staff stated that the agency has historically recruited and trained future developers from within the agency, where staff had an understanding of IRS business processes and concepts. However, according to the program staff, budgetary reductions limiting travel, moving costs, or stipends have prevented the agency from continuing such efforts.

  • According to our analysis of an IRS report showing staffing allocations for MSSS, as of April 2017, there were 12 COBOL developers supporting the investment. Agency officials noted impacts as a result of attrition among these developers, such as the loss of historical knowledge and expertise required to ensure proper maintenance of systems and prevent disruptions during the tax filing season.

  • With regard to IDRS, IRS officials reported the attrition of 30 developers from January 2012 through January 2017. In addition, these officials noted that, as of March 2017, the attrition had resulted in a shortage of 20 developers required to complete work on the investment. The agency also identified 10 “single points of failure” for this investment, meaning that only one staff member is available to support a function. Further, the officials noted that attrition of staff may result in (1) a delay in updating systems to reflect tax law changes and (2) IRS's inability to complete critical IDRS project activities on time.

The Three Selected Legacy Investments Did Not Fully Implement Key Practices for Managing Risks

We established an evaluation framework based on leading practices from the Software Engineering Institute and OMB guidance.31 The framework consists of 6 practices and 22 associated activities for managing IT investment risks. Table 4 identifies these practices and activities.

Table 4: Key Practices for Risk Management, Based on Leading Practices from the Software Engineering Institute and Office of Management and Budget Guidance

IRS has not fully implemented all of the key practices for managing risks for any of the three selected legacy investments that we reviewed. Specifically, based on our analysis, the agency fully implemented one key practice and partially implemented the remaining five for IMF; fully implemented two key practices, partially implemented two, and did not implement the remaining two for IDRS; and fully implemented one key practice, partially implemented three, and did not implement the remaining two for MSSS. Table 5 provides our assessment of the extent to which IRS implemented key risk management practices for the three selected legacy investments.

Table 5: Extent to Which the Internal Revenue Service (IRS) Implemented Key Risk Management Practices for Selected Legacy Investments

IMF

IRS fully implemented one key risk management practice for IMF. Specifically, IRS officials responsible for managing IMF risks stated that the agency continuously identified risks through, among other things, monthly meetings. Further, these risks were documented using the Item Tracking Reporting and Control tool, IRS's risk and issue repository.

In addition, we determined that IRS partially implemented the remaining five key risk management practices for IMF. Specifically, the agency prepared for risk management by using IRS's Application Development organization risk management strategy along with the Item Tracking Reporting and Control tool, which describe how projects are to identify, analyze, prioritize, mitigate, and monitor risks and issues. However, the risk management strategy did not address risk constraints or risk assumptions. In addition, IRS's risk analysis for IMF included criteria for evaluating and quantifying risk likelihood and severity, but it did not address residual risk, which is the exposure remaining after action has been taken to manage a risk. Further, the agency's prioritization of risks for IMF included consideration of risk criticality, but did not include the creation of a risk profile, which documents the highest priority risks.

With respect to risk mitigation, IRS developed a Stabilization Plan in December 2016 for IMF, and used the Item Tracking Reporting and Control tool. The Stabilization Plan and tool addressed risk mitigation plans, which included specific actions to be taken, as well as an assignment of responsibility and commitment of resources. Further, the agency documented the rationale for accepted IMF risks and established a schedule or period of performance for risk handling activities. However, IRS did not meet the activities for developing alternative courses of action for all critical risks and for establishing threshold values for the acceptability of risks or for each risk category.

Finally, regarding monitoring, IRS reviewed risks at least annually, but did not implement a strategy to escalate and monitor unresolved risks, even though the Application Development risk management strategy outlined a process for doing so. Further, the agency did not compare risk status to acceptability thresholds to determine the need for implementing a risk mitigation plan. Officials responsible for overseeing risk management activities for the investment also did not review its risk management process at least annually to ensure that the process remains appropriate and effective.

IDRS

IRS fully implemented two of the key risk management practices for IDRS. Specifically, the agency continuously identified IDRS risks via bi-weekly meetings and documented risks using the Item Tracking Reporting and Control tool. In addition, the agency prioritized risks based on a documented risk inventory and risk profile.

In addition, IRS partially implemented two key practices. Specifically, similar to IMF, the agency prepared the IDRS investment for risk management by using IRS's Application Development organization risk management strategy; however, it did not address risk constraints or risk assumptions. Further, the agency partially analyzed IDRS risks by including criteria for evaluating and quantifying risk likelihood and severity; however, it did not include residual risks and, thus, may not be aware of additional risk mitigation actions that are needed.

Further, IRS did not implement the remaining two practices. With respect to risk mitigation, as of March 2017, the agency did not include risk mitigation plans, as required, for 15 of the 20 risks identified for IDRS. Further, the agency did not maintain dates for risk handling activities for the investment, as the majority of completion dates and projected impact dates for identified risks had passed and these dates were not updated. The agency also did not meet any of the key activities for monitoring identified risks for the investment. For example, it did not compare risk status to acceptability thresholds to determine the need for implementing a risk mitigation plan. Further, the agency did not provide evidence that executive leadership is monitoring all top risks for IDRS. For example, while IRS officials closed 19 of 20 identified risks noting that these risks were tracked by the Applications Development Risk Review Board, the meeting minutes that we received from this board did not show that the risks were being monitored. Lastly, IRS did not review the IDRS risk management process at least annually.

MSSS

IRS fully addressed one key risk management practice for MSSS. Specifically, the agency prioritized risks in the MSSS risk log by assigning “red,” “yellow,” and “green” indicators to identified risks. Further, IRS identified the most significant risks for the MSSS investment in a weekly status report which is intended to address the agency's readiness to support the tax filing season.

IRS partially implemented three practices for MSSS. The agency continuously identified the investment's risks via several processes. For example, the agency uses its Sustaining Infrastructure Program to address infrastructure components in need of replacement.32 According to IRS officials and documentation on the Sustaining Infrastructure Program, the program includes (1) an identification of aging infrastructure components; and (2) risk scoring and ranking of the components based on age, the extent to which the asset is associated with critical IRS processes, and the asset's impact on operations. IRS officials stated that this process results in a prioritized list of assets that are candidates for replacement. In addition, the agency documents MSSS risks via a risk log; however, the risk log does not include all risks for the investment. For example, while officials responsible for managing the MSSS investment told us about human resource risks, these risks were not included in the risk log.

The agency also partially mitigated the investment's risks by developing risk mitigation plans and specific actions for the items identified in its risk log; however, the actions did not include a schedule or period of performance. IRS also did not establish threshold values for MSSS risk categories, or alternative courses of action for critical risks.

The agency partially monitored MSSS risks by reviewing risks regularly, providing executive monitoring of top risks, and implementing a strategy to escalate unresolved risks. However, the agency did not compare risk status to acceptability thresholds to determine the need for implementing a risk mitigation plan. In addition, an IRS official responsible for managing risk and issue data for the MSSS investment stated that IRS did not review the risk management process for MSSS annually. Instead, the official only updated the date and version number on the document that captures this process.

In addition, IRS did not implement two key practices. First, the agency did not prepare the MSSS investment for risk management. Specifically, while the agency provided the risk management plan intended to document how it prepares the investment for risk management, we found — and IRS's Director for Demand Management and Project Governance, who approved this plan, confirmed — that the plan did not describe the risk management activities that the agency was carrying out for the investment. In addition, the agency did not meet key activities for analyzing MSSS risks. For example, the agency used criteria for evaluating and quantifying risk likelihood, but it did not document criteria to analyze the severity of all of its risks or consider inherent and residual risks.

IRS cited various reasons for the inconsistent implementation of key risk management practices for the three selected investments. For example, IDRS officials responsible for risk management stated that their current guidance did not clearly address some of the key practices. Further, MSSS officials responsible for risk management stated that the majority of the risk management activities for the investment are not documented. In addition, MSSS officials stated that selected risks were not documented due to their perception that a reduced budget and hiring freeze would not allow the agency to mitigate the risks. However, documenting these risks would help ensure that IRS has an appropriate baseline for initiating risk management activities.

Until IRS fully implements all of the key practices for managing risks, it will be challenged to successfully identify and mitigate risks before they adversely impact the agency's ability to carry out its mission.

IRS Has Not Implemented Key IT Workforce Planning Practices

As we have previously reported, implementing effective IT workforce planning practices can better position agencies to address human capital risks. Accordingly, our prior work has highlighted four key IT workforce planning practices and supporting activities identified in various laws enacted and guidance issued over the past 20 years that call for agencies to perform workforce planning activities.33 These key practices are (1) setting the strategic direction for workforce planning, (2) analyzing the workforce to identify skill gaps, (3) developing strategies to address skill gaps, and (4) monitoring and reporting on progress in addressing skill gaps. The key IT workforce planning practices and supporting activities are identified below.

Table 6: Key Practices for Information Technology (IT) Workforce Planning

While IRS has initiated IT workforce planning efforts, the agency has not yet implemented any of the four IT workforce planning practices. Specifically, we found that the Human Capital Office and IT organization have collaboratively developed a tool to automate the IT workforce planning process, but the tool is in the initial stages of implementation, and IRS has not yet performed any of the activities associated with setting the strategic direction for workforce planning. In addition, the agency has developed an inventory of its current IT workforce, but it has not yet developed competency and staffing requirements or conducted any of the activities associated with analyzing the workforce to identify skill gaps, developing strategies to address skill gaps, or monitoring and reporting on progress in addressing skill gaps for the agency.

While IRS has not implemented key practices for IT workforce planning at the agency level, staff for IMF and CADE 2 — two of the four investments selected for our review — provided evidence of efforts they had taken to address their workforce needs. For example,

  • For IMF, in 2016, IRS established a process for continuously matching the current workforce capacity, in terms of skills and staffing, with a projected level of work. In addition, IMF staff identified competencies and staffing requirements for the investment, and assessed the gaps in competencies and staffing by comparing net available staff hours with needed staff hours for particular skill types. Lastly, IMF staff developed strategies and implemented activities in an effort to address IT skill gaps by creating a Stabilization Plan, which includes short- and long-term activities for training and realignment of resources.

  • For CADE 2, IRS conducted an assessment in 2015 to identify government and contractor resource needs and utilization. IRS also identified skill gaps and developed strategies and implemented activities such as knowledge transfer sessions to begin addressing these skill gaps. The CADE 2 program manager stated that the program is waiting for additional guidance and direction from the human capital office, as the work in this area was a rudimentary one-time effort.

Staff for the remaining two investments we reviewed — IDRS and RRP — stated that they were awaiting further implementation of the agency-wide workforce planning tool to address their IT workforce planning needs.

IRS officials attributed the limited progress in implementing IT workforce planning practices to resource constraints and competing priorities.

Nevertheless, until the agency implements these practices, it will continue to face challenges in assessing and addressing the gaps in knowledge and skills that are critical to the success of its key investments, some of which we identified earlier in the report.

Conclusions

IRS has performed operational analyses to examine the ongoing performance of some IT investments, but it has not fully addressed key factors specified in OMB guidance. Until IRS fully addresses all key factors for performing operational analyses, the agency risks not having the information it needs to determine whether the investments fully meet intended objectives, or if there are alternative or more efficient ways to do so.

In addition, IRS faces significant risks that could impact key tax processing investments. Specifically, IMF, IDRS, and MSSS are reliant on legacy programming languages and outdated hardware, and the agency is experiencing shortages of staff with the skills to support these investments. However, the agency has not implemented key risk management practices, placing the tax processing and modernization efforts at risk. By fully implementing key risk management practices, IRS will have better assurance that it is proactively addressing risks before they can impact the agency's ability to carry out its mission. Further, although human capital risks have in part led to significant cost, schedule, and scope variances for CADE 2, a key modernization system, IRS has not implemented key IT workforce planning practices. Specifically, while the agency has initiated efforts to address workforce planning agency-wide, which it plans to continue, the efforts have not yet been implemented for all of the agency's IT investments. Until IRS implements effective key workforce planning practices, it will not be best positioned to address the human capital risks it faces and ensure the timely and effective delivery of its investments.

Recommendations for Executive Action

We are making the following 21 recommendations to IRS:

The Commissioner of the IRS should ensure the operational analysis for IMF fully addresses greater utilization of technology or consolidation of investments to better meet organizational goals. (Recommendation 1)

The Commissioner of the IRS should ensure the operational analysis for IDRS addresses the extent to which the investment supports customer processes as designed, and how well the investment is delivering the goods or services it was designed to deliver. (Recommendation 2)

The Commissioner of the IRS should ensure the operational analysis for TSS addresses the extent to which the investment supports customer processes as designed, and how well the investment is delivering the goods or services it was designed to deliver. (Recommendation 3)

The Commissioner of the IRS should ensure the operational analysis for TSS includes a comparison of current performance with a pre-established cost baseline. (Recommendation 4)

The Commissioner of the IRS should ensure the operational analysis for EUSS includes a comparison of current performance with a pre-established cost baseline. (Recommendation 5)

The Commissioner of the IRS should ensure the operational analysis for MSSS addresses alternative methods of achieving the same mission needs and strategic goals. (Recommendation 6)

The Commissioner of the IRS should fully implement the risk management key practice associated with preparing for risk management for the IMF investment. (Recommendation 7)

The Commissioner of the IRS should fully implement the risk management key practice associated with analyzing risk for the IMF investment. (Recommendation 8)

The Commissioner of the IRS should fully implement the risk management key practice for prioritizing risk for the IMF investment. (Recommendation 9)

The Commissioner of the IRS should fully implement the risk management key practice associated with mitigating risk for the IMF investment. (Recommendation 10)

The Commissioner of the IRS should fully implement the risk management key practice associated with monitoring, reporting, and controlling risk for the IMF investment. (Recommendation 11)

The Commissioner of the IRS should fully implement the risk management key practice associated with preparing for risk management for the IDRS investment. (Recommendation 12)

The Commissioner of the IRS should fully implement the risk management key practice associated with analyzing risk for the IDRS investment. (Recommendation 13)

The Commissioner of the IRS should fully implement the risk management key practice associated with mitigating risk for the IDRS investment. (Recommendation 14)

The Commissioner of the IRS should fully implement the risk management key practice associated with monitoring, reporting, and controlling risk for the IDRS investment. (Recommendation 15)

The Commissioner of the IRS should fully implement the risk management key practice associated with preparing for risk management for the MSSS investment. (Recommendation 16)

The Commissioner of the IRS should fully implement the risk management key practice associated with identifying risk for the MSSS investment. (Recommendation 17)

The Commissioner of the IRS should fully implement the risk management key practice associated with analyzing risk for the MSSS investment. (Recommendation 18)

The Commissioner of the IRS should fully implement the risk management key practice associated with mitigating risk for the MSSS investment. (Recommendation 19)

The Commissioner of the IRS should fully implement the risk management key practice associated with monitoring, reporting, and controlling risk for the MSSS investment. (Recommendation 20)

The Commissioner of the IRS should fully implement IT workforce planning practices, including the following actions: (1) setting the strategic direction for workforce planning; (2) analyzing the workforce to identify skill gaps; (3) developing strategies and implementing activities to address skill gaps; and (4) monitoring and reporting on progress in addressing skill gaps. (Recommendation 21)

Agency Comments and Our Evaluation

We received written comments on a draft of this report from IRS. In its comments, which are reproduced in appendix II, IRS did not state whether it agreed or disagreed with our recommendations. However, the agency acknowledged the importance of strengthening its risk management process by implementing the key leading practices we identified, and described actions underway which confirm the significance of the risks described in our report. The agency also reported actions it has taken since the end of our review to address the IT workforce planning recommendation and stated that it would provide a detailed corrective action plan addressing each of our recommendations. Further, IRS provided technical comments, which we incorporated, as appropriate.

We are sending copies of this report to interested congressional committees, the Commissioner of IRS, and other interested parties. This report will also be available at no charge on our website at http://www.gao.gov.

If you or your staffs have any questions on matters discussed in this report, please contact me at (202) 512-9286 or pownerd@gao.gov. Contact points for our Offices of Congressional Relations and Public Affairs may be found on the last page of this report. GAO staff who made major contributions to this report are listed in appendix III.

David A. Powner
Director
Information Technology Management Issues


Appendix I: Objectives, Scope, and Methodology

Our objectives were to (1) evaluate the performance of selected Internal Revenue Service (IRS) information technology (IT) investments, (2) summarize any risks associated with selected legacy systems and evaluate the steps IRS has taken to manage them, and (3) determine the extent to which IRS has implemented key IT workforce planning practices.1

To select investments for our review, we first considered investments identified by IRS as essential to tax processing. We then considered the following factors: (1) investments impacting the greatest number of IRS business areas and associated services based on our review of IRS's 2016 Technology Roadmap; (2) investments with the highest levels of funding for fiscal year 2016, as reported on the Federal IT Dashboard; (3) investments that IRS's Chief Information Officer rated as having significant risk with respect to human capital or infrastructure; and (4) investments with planned future system migration efforts as outlined in IRS's Enterprise Transition Plan.

In addition, we considered a mix of investments in development and operations and maintenance. We selected the following nine investments (presented in order of those considered mission critical, followed by those most prominently meeting the above selection factors): (1) Individual Master File (IMF), (2) Integrated Data Retrieval System (IDRS), (3) Telecommunications Systems and Support (TSS), (4) Mainframes and Servers Services and Support (MSSS), (5) End User Systems and Services (EUSS), (6) Enterprise Case Management (ECM), (7) Customer Account Data Engine 2 (CADE 2), (8) Return Review Program (RRP), and (9) Affordable Care Act Administration (ACA).

To address our first objective, we analyzed IRS's reporting on the performance of the nine investments in our selection. The investments included four that were primarily in development — CADE 2, RRP, ECM, and ACA — and five that were primarily in the operations and maintenance phase — IMF, IDRS, TSS, MSSS, and EUSS. For the three investments that were using IRS's Investment Performance Tool — CADE 2, RRP, and ECM — we compiled and analyzed quarterly reports showing planned versus actual cost, schedule, and scope for work IRS was performing on these investments during fiscal year 2016 through the first 2 quarters of 2017.2 For the fourth investment — ACA — we compared reported planned and actual costs, as well as planned and actual completion dates for development activities for fiscal year 2016 and the first 2 quarters of 2017. IRS did not report information on ACA's performance in meeting scope goals.
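The planned-versus-actual comparison described above can be illustrated with a brief, purely hypothetical sketch. The figures, field names, and QuarterlyReport structure are assumptions for illustration only and do not reflect the layout of IRS's Investment Performance Tool reports.

# Hypothetical sketch of a planned-versus-actual comparison for one quarter.
# Field names and figures are illustrative assumptions, not IRS data.

from dataclasses import dataclass
from datetime import date

@dataclass
class QuarterlyReport:
    planned_cost: float           # dollars
    actual_cost: float
    planned_finish: date
    actual_finish: date
    planned_scope_items: int      # e.g., planned requirements or work items
    delivered_scope_items: int

def variances(r: QuarterlyReport) -> dict:
    """Compute simple cost, schedule, and scope variances for one report."""
    return {
        "cost_variance": r.planned_cost - r.actual_cost,                  # positive = under planned cost
        "schedule_slip_days": (r.actual_finish - r.planned_finish).days,  # positive = behind schedule
        "scope_delivered_pct": 100 * r.delivered_scope_items / r.planned_scope_items,
    }

report = QuarterlyReport(10_000_000, 9_200_000,
                         date(2016, 9, 30), date(2016, 11, 15),
                         40, 33)
print(variances(report))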

In addition, for the five operational investments, we compiled and analyzed operational performance information reported for the selected investments for fiscal year 2016 and the first 2 quarters of 2017; this information included, where reported, the performance target and actual results for each metric. Further, we determined the extent to which an operational analysis was performed for each of the investments in accordance with best practices. To do so, we obtained operational analyses for fiscal year 2016 and evaluated them against relevant practices outlined in the Office of Management and Budget's (OMB) fiscal year 2016 capital programming guidance.

To assess the reliability of the data for the investments in development used for this objective, we interviewed officials responsible for overseeing the use of the Investment Performance Tool to confirm the completeness of the data generated from the tool, as well as our understanding of what these data represent. We also followed up with these officials to discuss anomalies we detected in the performance data. In addition, we relied on data reliability assessments we previously completed on IRS's financial management system because it is a source of the actual cost data found in the Investment Performance Tool.

Finally, we followed up on IRS's actions to address recommendations we previously made to improve the reliability of the cost, schedule, and scope performance data.3 While we found additional actions are needed to address our recommendations, we determined the investments' data are sufficiently reliable for our purposes. With respect to the reported operational performance data, we reviewed documentation describing the performance metrics and interviewed IRS officials regarding the process for reporting such metrics. We determined these data were sufficiently reliable for purposes of reporting on operational performance.

For our second objective, we selected three investments from our initial selection of nine — IMF, IDRS, and MSSS — because they were placed into operation in the late 1960s and early 1970s, and are therefore considered legacy investments. To summarize the risks associated with these investments, we reviewed, among other things, risk logs captured in IRS's Item Tracking Reporting and Control tool and risk detail reports. In addition, we obtained resource assessment documentation, where available, and documentation from IRS identifying staff availability and the legacy programming languages supporting these investments. We also identified aged hardware components supporting the selected investments by obtaining reports from the Knowledge, Incident/Problem, Service Asset Management system. This system is used for tracking and managing IRS assets, including recording and tracking asset acquisitions, transfers, and disposals.4 We supplemented our review of documentation with interviews of IRS officials responsible for software and infrastructure maintenance.

To evaluate the steps IRS has taken to mitigate risks, we analyzed documentation such as risk management plans; risk logs captured in IRS's Item Tracking Reporting and Control tool and through other means; risk detail reports that included the probability, impact, and overall status for identified risks; risk mitigation plans; and meeting minutes from IRS's Applications Development Risk Review Board. We also interviewed IRS officials involved in the risk management process, including software developers responsible for maintaining aging programming languages, system administrators, and risk coordinators. We selected officials based on (1) the median number of years the officials had worked to support the investment; (2) their position as an investment Branch Chief or Section Chief in order to fairly represent management's perspective on matters discussed; (3) consideration of employees who serve in more than one role in the risk management process in order to obtain diverse perspectives on the process; and (4) their fair representation of all programming language types and infrastructure supporting the investments. We evaluated IRS's risk management efforts by comparing information obtained to key practices from the Software Engineering Institute's Capability Maturity Model® Integration for Development, as well as OMB guidance.
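As a purely illustrative aside, the kind of risk detail record referenced above, capturing probability, impact, and overall status, can be sketched as follows. The scoring scale, thresholds, and sample risk are assumptions for illustration and do not represent IRS's Item Tracking Reporting and Control tool.

# Hypothetical sketch of a risk detail record with probability, impact, and
# overall status. The scale and thresholds below are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class RiskRecord:
    title: str
    probability: int   # 1 (unlikely) to 5 (near certain) -- assumed scale
    impact: int        # 1 (negligible) to 5 (severe) -- assumed scale

    @property
    def exposure(self) -> int:
        """Simple exposure score: probability multiplied by impact."""
        return self.probability * self.impact

    @property
    def status(self) -> str:
        """Map exposure to an overall status using assumed thresholds."""
        if self.exposure >= 15:
            return "red"
        if self.exposure >= 8:
            return "yellow"
        return "green"

risk = RiskRecord("Shortage of staff with legacy programming skills", 4, 4)
print(risk.exposure, risk.status)   # 16 red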

To assess the reliability of the data used for our second objective, we interviewed officials responsible for overseeing the use of IRS's Item Tracking Reporting and Control tool and users of this tool to determine if the tool was being consistently implemented. In addition, as part of our interviews with software developers responsible for maintaining aging programming languages, we determined the extent to which risks shared by these officials were consistent with formally documented risks. Lastly, we corroborated IRS's identification of legacy programming languages and infrastructure components with our staff possessing expert knowledge of IRS's IT environment. We determined these data were sufficiently reliable for purposes of describing risks faced by selected investments, as well as for evaluating IRS's risk management efforts.

For our third objective, we obtained documentation describing a tool that IRS is planning to implement agency-wide to address IT workforce planning. Further, we obtained a demonstration of the functionality provided by this tool, and interviewed officials in IRS's human capital office, as well as investment staff, to determine the extent to which this tool has been implemented across the agency. We also obtained and reviewed information relative to cross-functional acquisition training, efforts intended to strengthen IT program management, and results of IT skills assessments. We compared IRS's efforts to key practices for IT workforce planning derived from sources, including the Clinger-Cohen Act of 1996,5 Office of Personnel Management workforce planning guidance, and OMB Circular A-130, and identified in our report on IT workforce planning efforts.6

In addition, we selected four investments from our initial selection of nine — IMF, IDRS, CADE 2, and RRP — to identify efforts they had taken to address their workforce needs. We selected the four investments based on one or more of the following factors: (1) mission critical designation by IRS, (2) exposure to human capital risk, or (3) status as a key development effort at IRS.

Officials responsible for managing two of the investments — IMF and CADE 2 — provided information on their efforts. For IMF, we reviewed workforce capacity planning documentation as well as short- and long-term workforce plans to assess IMF's implementation of workforce planning efforts such as skills gap analysis, development of strategies and implementation of activities to address IT skills gaps, and monitoring and reporting progress in addressing IT skills gaps. For CADE 2, we reviewed documentation from its resource assessment conducted in 2015, which included information relative to resource needs and skills gaps. In addition, we reviewed documentation of efforts to address skills gaps, including training and knowledge transfer programs.

We conducted this performance audit from November 2016 to June 2018 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.


Appendix II: Comments from the Internal Revenue Service

June 14, 2018

Mr. David A. Powner
Director, Information Technology Management Issues
U.S. Government Accountability Office
441 G Street, N.W.
Washington, D.C. 20548

Dear Mr. Powner:

Thank you for the opportunity to comment on the draft report titled Information Technology: IRS Needs to Address Significant Risks to Tax Processing (GAO-18-298). We recognize the importance of continuously improving the performance of the Internal Revenue Service's (IRS's) major Information Technology (IT) investments.

The IRS has a comprehensive enterprise risk management process that is actively followed by the IT organization. IRS IT also employs its own robust risk management process, applied across its investments, with extra scrutiny on Filing Season and Security. The IRS acknowledges the importance of strengthening this process by implementing the key leading practices identified by the Government Accountability Office (GAO) throughout its investments. Additionally, the IRS is specifically addressing the significant risks noted by GAO (outdated hardware, reliance on legacy programming language, shortage of human resources with critical skills) as indicated below.

The IRS identified aging technology infrastructure as the top IRS enterprise risk during Fiscal Year (FY) 2017. Since then, IRS has greatly increased focus on the issue and made progress stabilizing the aging infrastructure. Since this audit was initiated, IRS IT has redefined how aged hardware is measured to improve tracking precision, accuracy, and consistency. Further, in order to protect our core processing hardware, IT uses risk-based decisions in determining which systems to upgrade using our available resources. As a result, most of the IRS's aged hardware is in end user computing (workstations, printers, scanners, etc.).

The IRS established the Customer Account Data Engine 2 (CADE 2) program to modernize and retire the Individual Master File (IMF), and CADE 2 has continuously delivered capabilities towards this end. Key progress to date includes accelerating IMF processing cycles from weekly to daily, resulting in faster refunds and notices; modernizing one of the IMF's most complex sets of reports, Financial Recap Reports; and launching the CADE 2 Database, successfully migrating all individual account information to a modern relational database which now provides data to 16 IRS systems. The program is currently focused on modernizing the IMF's core components, where the majority of tax law is embedded, from legacy Assembler Language Code (ALC) to Java. While the program continues to make progress, there are still significant challenges such as the rapidly declining number of critical IMF resources.

IRS IT continues to review, monitor, rationalize, and identify ongoing efforts associated with Common Business Oriented Language (COBOL) applications, and to integrate and drive awareness and correlations to build an actionable assessment and get-well plan. IRS IT has established a COBOL get-well plan outlining key activities and initiatives to sustain and modernize COBOL programs, including the Integrated Data Retrieval System (IDRS).

The IRS faces significant workforce challenges as the number of developers who know and understand the technology and tax law business rules embedded in the ALC code is decreasing at an alarming rate. These limited resources are working at capacity in supporting annual tax code updates and have very limited time to help with modernization efforts. Additionally, past hiring freezes have impacted the IRS's ability to bring in government leadership and oversight to drive the program forward. This has resulted in heavy contractor use, which increases program costs and places the government in a vulnerable position in terms of vendor dependency.

The IRS appreciates GAO acknowledging the shortage of human resources with critical skills. IRS IT is currently working to hire approximately 1,400 staff to address tax reform implementation, aging infrastructure, single points of dependencies and other needs to help mitigate this risk.

With regard to fully implementing IT workforce planning practices to address skill gaps, IRS completed prioritizing its critical skills gaps and developed gap mitigation strategies that are being implemented through FY19/20 Training Plans. The skill gap mitigation plans are monitored in the Project and Portfolio Management (PPM) system to track skill gap closures through the Training Needs Assessment Process. These initiatives have been accomplished since the audit closed.

The IRS appreciates GAO recognizing that the Operational Analysis (OA) provides an effective process to examine the ongoing performance of our operational investments. IRS IT looks at opportunities to improve the investment performance and OA process for each annual cycle, and will consider the audit findings in subsequent OAs. Additionally, it was noted, as part of a government-wide review GAO conducted, that the Mainframe and Servers Services and Support (MSSS) investment had not had an OA for FY 2012. While the OA was not identified to GAO through this review, IRS did complete the analysis for the MSSS investment in FY 2012. The OA document was provided to GAO during the review of the Statement of Facts.

We will provide a detailed corrective action plan addressing each of your recommendations with our response to the final report. If you have any questions, please contact me or Gina Garza, Chief Information Officer, at (202) 317-5000.

Sincerely,

David J. Kautter
Acting Commissioner


Appendix III: GAO Contact and Staff Acknowledgments

GAO Contact

David A. Powner, (202) 512-9286, or pownerd@gao.gov

Staff Acknowledgments

In addition to the individual named above, the following staff made key contributions to this report: Sabine Paul (Assistant Director), Bradley Roach (Analyst in Charge), Andrew Banister, Mark Canter, Vern Cumarasegaran, Rebecca Eyler, Paul Middleton, and Martin Skorczynski.

GAO's Mission

The Government Accountability Office, the audit, evaluation, and investigative arm of Congress, exists to support Congress in meeting its constitutional responsibilities and to help improve the performance and accountability of the federal government for the American people. GAO examines the use of public funds; evaluates federal programs and policies; and provides analyses, recommendations, and other assistance to help Congress make informed oversight, policy, and funding decisions. GAO's commitment to good government is reflected in its core values of accountability, integrity, and reliability.

Obtaining Copies of GAO Reports and Testimony

The fastest and easiest way to obtain copies of GAO documents at no cost is through GAO's website (https://www.gao.gov). Each weekday afternoon, GAO posts on its website newly released reports, testimony, and correspondence. To have GAO e-mail you a list of newly posted products, go to https://www.gao.gov and select “E-mail Updates.”

Order by Phone

The price of each GAO publication reflects GAO's actual cost of production and distribution and depends on the number of pages in the publication and whether the publication is printed in color or black and white. Pricing and ordering information is posted on GAO's website, https://www.gao.gov/ordering.htm.

Place orders by calling (202) 512-6000, toll free (866) 801-7077, or TDD (202) 512-2537.

Orders may be paid for using American Express, Discover Card, MasterCard, Visa, check, or money order. Call for additional information.

Connect with GAO

Connect with GAO on Facebook, Flickr, Twitter, and YouTube.

Subscribe to our RSS Feeds or E-mail Updates. Listen to our Podcasts.

Visit GAO on the web at https://www.gao.gov.

To Report Fraud, Waste, and Abuse in Federal Programs

Contact:

Website: https://www.gao.gov/fraudnet/fraudnet.htm

Automated answering system: (800) 424-5454 or (202) 512-7470

Congressional Relations

Orice Williams Brown, Managing Director, WilliamsO@gao.gov, (202) 512-4400, U.S. Government Accountability Office, 441 G Street NW, Room 7125, Washington, DC 20548

Public Affairs

Chuck Young, Managing Director, youngc1@gao.gov, (202) 512-4800, U.S. Government Accountability Office, 441 G Street NW, Room 7149, Washington, DC 20548

Strategic Planning and External Liaison

James-Christian Blockwood, Managing Director, spel@gao.gov, (202) 512-4707, U.S. Government Accountability Office, 441 G Street NW, Room 7814, Washington, DC 20548

FOOTNOTES

1Senate Appropriations Committee report, S. Rep. No. 114-280, at 40-41 (2016) and House of Representatives Appropriations Committee report, H.R. Rep. No. 114-624, at 24-25 (2016), as approved by the joint explanatory statement of the conference, 163 Cong. Rec. H3327, H3782 (daily ed. May 3, 2017) (statement of Chairman Frelinghuysen), specifically referenced in section 4 of the Consolidated Appropriations Act, 2017, Pub. L. No. 115-31, § 4, 131 Stat. 135, 137 (May 5, 2017). In communication with committee staff, we agreed to focus on the three objectives detailed in this report.

2The investments primarily in development were Enterprise Case Management, Customer Account Data Engine 2, Return Review Program, and Affordable Care Act Administration. The investments primarily in the operations and maintenance phase are Individual Master File, Integrated Data Retrieval System, Telecommunications Systems and Support, Mainframes and Servers Services and Support, and End User Systems and Services.

3Software Engineering Institute, Capability Maturity Model® Integration for Development (CMMI-DEV), Version 1.3 (Pittsburgh, Pa.: November 2010); Office of Management and Budget, Managing Information as a Strategic Resource, OMB Circular No. A-130 (Washington, D.C.: revised 2016); Office of Management and Budget, Management's Responsibility for Enterprise Risk Management and Internal Control, OMB Circular No. A-123 (Washington, D.C.: 2016).

440 U.S.C. § 11315(c)(3).

5Office of Management and Budget, Managing Information as a Strategic Resource, OMB Circular No. A-130 (Washington, D.C.: revised 2016).

6GAO, IT Workforce: Key Practices Help Ensure Strong Integrated Program Teams; Selected Departments Need to Assess Skill Gaps, GAO-17-8 (Washington, D.C.: Nov. 30, 2016).

7IRS defines a major investment as one that costs $10 million or more in either current year or budget year, or $50 million or more over the 5-year period extending from the prior year through the budget year +2.

8IRS allocated funding to both development and operations activities for the selected investments for fiscal year 2016, but over 50 percent of the funding was allocated to development activities.

9IRS completed its last developmental release for ACA in September 2017, at which point the investment transitioned to the operations and maintenance phase.

10Assembly language code is a low-level computer language initially used in the 1950s, and Java is a programming language first released in 1995.

11Congress created direct hire authority to help agencies fill vacancies in the competitive service under certain circumstances. This direct hire authority expedites the typical hiring process.

12IRS allocated funding to both development and operations activities for the selected investments for fiscal year 2016, but over 50 percent of the funding was allocated to operations activities.

13GAO, Information Technology: Federal Agencies Need to Address Aging Legacy Systems, GAO-16-468 (Washington, D.C.: May 25, 2016).

14GAO, Information Technology: Management Needs to Address Reporting of IRS Investments' Cost, Schedule, and Scope Information, GAO-15-297 (Washington, D.C.: Feb. 25, 2015).

15A yellow rating means there are risks or issues for which mitigation is in progress and that, if they occur, will have an impact but will not impede the core mission of the program. A red rating means there is an imminent threat that puts the program deliverables in jeopardy, that the risks or issues will have a major impact because the mitigation activities are likely to fail, or that the time available for mitigation is inadequate.

16GAO, Information Technology: IRS Needs to Improve Its Processes for Prioritizing and Reporting Performance of Investments, GAO-16-545 (Washington, D.C.: June 29, 2016).

17GAO, Information Technology: Agencies Need to Strengthen Oversight of Multibillion Dollar Investments in Operations and Maintenance, GAO-14-66 (Washington, D.C.: Nov. 6, 2013).

20GAO, GAO Cost Estimating and Assessment Guide: Best Practices for Developing and Managing Capital Program Costs, GAO-09-3SP (Washington, D.C.: Mar. 2, 2009).

22Agencies publicly report operational metrics information on the Federal IT Dashboard, a public website maintained by OMB with information on the performance of federal agencies' investments (https://itdashboard.gov).

23Office of Management and Budget, Capital Programming Guide V 3.0: Supplement to Circular A-11, Planning, Budgeting, and Acquisition of Capital Assets (Washington, D.C.: Executive Office of the President, 2016).

24IRS refers to outdated hardware as hardware beyond its useful life.

26A switch serves as a controller, enabling networked devices to communicate with each other.

27A mainframe is a central data repository in an entity's data processing center, linked to users through less powerful devices such as workstations or terminals.

28An automated tape library automates the use of tape cartridges by using a robotic handler.

29A disk array is a data storage system that contains multiple disk drives and distributes data across these drives.

30We have previously reported on IRS's material weakness in internal control over unpaid assessments. An unpaid assessment is a legally enforceable claim against a taxpayer and consists of taxes, penalties, and interest that have not been collected or abated (i.e., the assessment reduced by IRS). Internal Revenue Manual § 1.34.4, Unpaid Assessments (Aug. 25, 2015). See, for example, GAO-17-140.

31Key practices for managing risks associated with IT investments are specifically found in the Software Engineering Institute's Capability Maturity Model® Integration for Development and OMB circulars A-130 and A-123.

32We did not evaluate IRS's Sustaining Infrastructure Program.

1Senate Appropriations Committee report, S. Rep. No. 114-280, at 40-41 (2016) and House of Representatives Appropriations Committee report, H.R. Rep. No. 114-624, at 24-25 (2016), as approved by the joint explanatory statement of the conference, 163 Cong. Rec. H3327, H3782 (daily ed. May 3, 2017) (statement of Chairman Frelinghuysen), specifically referenced in section 4 of the Consolidated Appropriations Act, 2017, Pub. L. No. 115-31, § 4, 131 Stat. 135, 137 (May 5, 2017). In communication with committee staff, we agreed to focus on the three objectives detailed in this report.

2IRS does not consider this tool to be a formal Earned Value Management System. As a result, we did not evaluate the extent to which the tool was compliant with the American National Standards Institute's guidelines for an Earned Value Management System.

3GAO, Information Technology: IRS Needs to Improve Its Processes for Prioritizing and Reporting Performance of Investments, GAO-16-545 (Washington, D.C.: June 29, 2016).

4We did not evaluate IRS's process for documenting its infrastructure inventory.

540 U.S.C. § 11315(c)(3).

6GAO, IT Workforce: Key Practices Help Ensure Strong Integrated Program Teams; Selected Departments Need to Assess Skill Gaps, GAO-17-8 (Washington, D.C.: Nov. 30, 2016).

END FOOTNOTES
