4.2.8 Guidelines for SB/SE National Quality Review

Manual Transmittal

October 06, 2020

Purpose

(1) This transmits revised IRM 4.2.8, Examining Process, General Examining Procedures, Guidelines for SB/SE National Quality Review.

Material Changes

(1) The internal controls section of this IRM was updated to comply with IRM 1.11.2.2.5, Address Management and Internal Controls.

(2) Significant changes to this IRM are reflected in the table below:

Former IRM Section Number | Description of Change
4.2.8.1 Program Scope and Objectives | Updated and clarified content, added cite for General Overview and Contact Information
4.2.8.1.1 Quality Review Program Authority | Cite renamed to Background, new content added and original content moved to 4.2.8.1.2 and updated
4.2.8.1.1.1 Taxpayer Bill of Rights (TBOR) | Content updated and moved to 4.2.8.1.2(3), Authority
4.2.8.1.2 Field and Specialty Exam Quality - Roles and Responsibilities | Content moved to 4.2.8.1(7) and 4.8.2.1.5 and updated
4.2.8.3 National Quality Review Program | Content moved to 4.2.8.1.1(2) and 4.2.8.1.4(2)
4.2.8.1.4 Terms and Acronyms | Moved to 4.2.8.1.6 and updated
4.2.8.1.5 Field and Specialty Exam Job Aids | Content moved to 4.2.8.1.7 and updated hyperlinks
4.2.8.2 Field and Specialty Exam Quality - Program Manager Responsibilities | Content moved to 4.2.8.1.3.1 and updated
4.2.8.2.1 Field and Specialty Exam Quality - Analyst Responsibilities | Content moved to 4.2.8.1.3.2 and updated
4.2.8.3 National Quality Review Manager Responsibilities | Content moved to 4.2.8.1.3.3 and updated
4.2.8.4 National Reviewer Responsibilities | Content moved to 4.2.8.1.3.4 and updated
4.2.8.5 Overview of The National Quality Review System (NQRS) | Content moved to 4.2.8.2 and updated
4.2.8.6 Quality Attributes | Content moved to 4.2.8.2.1 and updated
4.2.8.7 National Quality Review Process and Completion of Data Collection Instrument (DCI) | Content moved to 4.2.8.3 and updated
4.2.8.7.1 Review of Specialty Exam Electronic Case File | Content moved to 4.2.8.3.1 and updated
4.2.8.7.2 Review of the Correspondence Examination Automation Support (CEAS) Case File | Content moved to 4.2.8.3.1 and updated
4.2.8.7.3 National Quality Review Scoring System | Content moved to 4.2.8.2.3
4.2.8.7.4 DCI Header Input Procedures | Content moved to 4.2.8.3.2
4.2.8.7.4.1 DCI Process Measures Fields | Content moved to 4.2.8.2.3 and updated
4.2.8.7.5 Evaluating and Coding the Attributes | Content moved to 4.2.8.2.2 and updated
4.2.8.7.6 Reason Codes and Attribute Narratives | Content moved to 4.2.8.3.3 and updated
4.2.8.7.6.1 Guidelines for Writing Attribute Narratives | Content moved to 4.2.8.3.3 and updated
4.2.8.8 National Quality Review Case Selection Procedures | Content moved to 4.2.8.6 and updated
4.2.8.8.1 Unagreed Appeals Case Selection Procedures | Content moved to 4.2.8.6.1 and updated
4.2.8.8.2 Defaulted Case Selection Procedures | Content moved to 4.2.8.6.2 and updated
4.2.8.8.3 Shipping Sample Select Cases | Content moved to 4.2.8.6.3 and updated
4.2.8.8.4 Sample Select Case Control Procedures | Content moved to 4.2.8.6.4 and updated
4.2.8.8.5 BSA Case Selection Procedures | Content moved to 4.2.8.6 and updated
4.2.8.9 Field Exam Case Sampling Criteria | Content moved to 4.2.8.4
4.2.8.9.1 Specialty Exam Case Sampling Criteria | Content moved to 4.2.8.5
4.2.8.10 Guidelines for Consistent Case Reviews | Content moved to 4.2.8.7 and updated
4.2.8.10.1 Conducting Consistency Case Reviews | Content moved to 4.2.8.7 and updated
4.2.8.11 Use and Limitations of National Quality Review Data | Content moved to 4.2.8.8 and updated
4.2.8.12 Case Return Criteria | Content moved to 4.2.8.9
Exhibit 4.2.8-1 Quality Attributes Rated by Field and Office Exam National Quality Reviewers | Deleted. Information located in EQRS via help menu hyperlink
Exhibit 4.2.8-2 Quality Attributes Rated by Excise Tax National Quality Reviewers | Deleted. Information located in EQRS via help menu hyperlink
Exhibit 4.2.8-3 Quality Attributes Rated by Employment Tax National Quality Reviewers | Deleted. Information located in EQRS via help menu hyperlink
Exhibit 4.2.8-4 Quality Attributes Rated by Estate and Gift Tax National Quality Reviewers | Deleted. Information located in EQRS via help menu hyperlink
Exhibit 4.2.8-5 Quality Attributes Rated by Bank Secrecy Act National Quality Reviewers | Deleted. Information located in EQRS via help menu hyperlink
Exhibit 4.2.8-6 NQRS Time Frames for Case Action | Content moved to Exhibit 4.2.8-1 and updated

(3) Editorial changes were made throughout this IRM to add clarity, readability, and to eliminate redundancies. Website addresses, legal references, and IRM references were reviewed and updated as necessary.

Effect on Other Documents

IRM 4.2.8, dated March 5, 2018, is superseded.

Audience

Small Business/Self-Employed (SB/SE) Field and Specialty Exam Employees.

Effective Date

(10-06-2020)

Pamela Drenthe
Director, Exam Quality and Technical Support
Small Business/Self-Employed

Program Scope and Objectives

(1) General Overview. Field and Specialty Exam Quality (FSEQ) supports the Small Business/Self-Employed (SB/SE) quality improvement program by providing an assessment of the quality of Field and Specialty Examination case work.

(2) Purpose. This IRM section contains general information and procedural guidance relating to the SB/SE Field and Specialty Exam National Quality Review program.

(3) Audience. The audience is employees and management officials in FSEQ as well as SB/SE stakeholders.

(4) Policy Owner. The Director, Exam Quality and Technical Support (EQ&TS), is responsible for the policies related to the National Quality Review Program. Refer to IRM 1.1.16.3.5.5, Exam Quality and Technical Support, for more information.

(5) Program Owner. The Program Manager, FSEQ, is responsible for overseeing the National Quality Review program. Refer to IRM 1.1.16.3.5.5.5, Field and Specialty Exam Quality, for more information.

(6) Program Goals. The goal of the National Quality Review Program is to provide a practical and accurate method of assessing organizational performance in support of the Balanced Measures.

(7) Primary Stakeholder. The Director, Examination, SB/SE. Additional stakeholders are Directors located in:

  • Headquarters Exam

  • Field Exam

  • Field and Campus Policy

  • Specialty Policy

  • Specialty Tax


(8) Contact Information. To recommend changes or make any other suggestions related to this IRM section, see Providing Feedback About an IRM Section - Outside of Clearance.

Background

(1) Embedded Quality (EQ) creates a link between individual performance and organizational goals. This linkage is achieved through a common set of attributes that both National Quality Reviewers in FSEQ and front-line managers use to evaluate the quality of case work.

(2) EQ reviews focus on whether the examiner took the right actions at the right time while protecting taxpayer rights.

(3) National Quality Reviewers in FSEQ use the National Quality Review System (NQRS), an automated web-based system used to record results from case reviews for:

  • Field and Office Exam Program

  • Bank Secrecy Act (BSA) Program

  • Employment Tax Program

  • Estate and Gift Tax Program

  • Excise Tax Program

(4) Reports generated from NQRS provide data which may be used to evaluate organizational processes, procedures and successes, and identify areas in need of improvement.

(5) The NQRS database is accessed through the Embedded Quality home page at http://mysbse.web.irs.gov/examination/examinationeq/default.aspx.

(6) Managers use the Embedded Quality Review System (EQRS) database to evaluate employee performance. For more information regarding front-line manager use of EQRS, see IRM 1.4.40.3.7, Performance Feedback.

Note: NQRS data is never used to evaluate employee performance.

Authority

(1) The requirement for an organizational measure of quality for the IRS was established by 26 CFR 801.6(b), Quality measures, as part of the Restructuring and Reform Act of 1998 (RRA 98).

(2) 26 CFR 801.6(b) states that quality measures focus on whether IRS personnel:

  • Devoted an appropriate amount of time to a matter

  • Properly analyzed the facts of the situation

  • Complied with statutory, regulatory and IRS procedures

  • Took timely actions

  • Provided adequate notification and made required contacts with taxpayers

(3) The Taxpayer Bill of Rights (TBOR) lists the fundamental rights taxpayers have when working with the IRS, including the right to quality service. Consideration of these rights in every interaction with taxpayers helps to reinforce the fairness of the tax system. All IRS employees must:

  • Be informed about taxpayer rights

  • Be conscientious in the performance of their duties to honor and respect those rights

  • Communicate effectively those rights that aid in reducing taxpayer burden

  • Administer the law with integrity and fairness

  • Exercise professional judgment in conducting enforcement activities

(4) See IRM 4.10.1.2, Taxpayer Rights, for more information about TBOR.

Roles and Responsibilities

(1) Listed below are the primary roles and responsibilities of the Program Manager, Quality Analysts, management, and quality reviewers involved in the quality review process.

Program Manager Responsibilities

(1) The FSEQ Program Manager's primary responsibilities include:

  • Overseeing and allocating resources for FSEQ

  • Coordinating the development of the annual case review sample plan for FSEQ

  • Ensuring that case review inventory is sufficient for each Field and Specialty Exam Area or program based on the sample plan

  • Monitoring the delivery of the Field and Specialty Exam national sampling plan

  • Coordinating issues relating to interpreting and rating the quality attributes

  • Establishing protocol to measure, monitor, and improve reviewer accuracy and consistency

  • Sharing analysis of NQRS data to aid in organizational improvement and influence quality performance

  • Providing quality review data and/or analysis to internal/external stakeholders on an ad hoc or recurring basis

  • Coordinating with stakeholders in the development of attributes and requirements for quality reviews

  • Providing recommendations to enhance NQRS

Quality Analyst Responsibilities

(1) The FSEQ Analyst is responsible for:

  • Developing and distributing quality performance reports

  • Developing the annual case review sample plan

  • Reviewing attribute narratives on a regular basis to ensure guidelines are followed

  • Participating in group meetings to promote consistency, including the discussion of specific attributes and case scenarios

  • Developing and clarifying review criteria and procedures to promote consistency

  • Providing quality review data and/or analysis to internal/external stakeholders on an ad hoc or recurring basis

  • Coordinating with stakeholders in their quality improvement initiatives

  • Collaborating with stakeholders in the development of attributes and requirements for quality reviews

  • Working with stakeholders in monitoring and updating Job Aids, instructional guides and quality review procedures in accordance with IRM and program guidelines

(2) In addition, the Quality Analyst assigned to Specialty Exam is also responsible for:

  • Maintaining Specialty Exam program EQRS and NQRS database reference tables via On Line Matrix web application

  • Coordinating Specialty Exam EQRS/NQRS end user and system support

Front Line Manager Responsibilities

(1) The FSEQ Manager's responsibilities include:

  • Providing guidance for program objectives

  • Ensuring that reviewers understand and adhere to program guidelines

  • Ensuring accurate and consistent application of the quality attributes

  • Reviewing attribute narratives on a regular basis to ensure guidelines are followed

  • Critiquing completed reviews on a regular basis and providing meaningful feedback to reinforce expectations for quality case reviews

  • Conducting group meetings to promote consistency, including the discussion of specific attributes and case scenarios

  • Ensuring accuracy of data input

  • Ensuring that the sample plan is followed

  • Monitoring sample plan and recommending actions to address imbalances

  • Maintaining instructional guides for national quality reviewers

  • Reviewing and approving case returns that meet the criteria found in IRM 4.2.8.9, Case Return Criteria

  • Reviewing and approving rejection of cases that do not meet the case sampling criteria found in IRM 4.2.8.4, Field Exam Case Sampling Criteria, and IRM 4.2.8.5, Specialty Exam Case Sampling Criteria

  • Sharing trends and issues that may have nationwide impact

  • Providing input during the attribute development or update process

Reviewer Responsibilities

(1) FSEQ Reviewers' responsibilities include:

  • Evaluating examination case quality by conducting reviews of completed SB/SE Field and Specialty Exam cases

  • Accurately and consistently applying the attributes utilizing the appropriate Job Aid and tools such as the IRM and Internal Revenue Code

  • Completing timely case reviews using the Data Collection Instrument (DCI)

  • Completing timely and accurate input of review data into the NQRS database

  • Identifying the appropriate reason code(s) for each not met attribute rating

  • Writing clear and meaningful attribute narrative comments for each not met attribute rating

  • Elevating potential conflicts in the IRM and the Job Aid for resolution

  • Assisting in data analysis as warranted

Program Reports and Effectiveness

(1) Program reports are available on NQRS by selecting Reports from the Main Menu screen.

(2) FSEQ also generates quarterly performance reports for stakeholders. These reports provide data to aid in:

  • Establishing baselines to assess program performance

  • Identifying quality strengths and weaknesses

  • Determining specific training/educational needs

  • Identifying opportunities to improve work processes

  • Measuring the success of quality improvement efforts

(3) An overall quality score serves as the Balanced Measure for Business Results – Quality. This measure is reported to various levels of the organization and to external stakeholders such as Congress.

Program Controls

(1) Access to EQRS and NQRS data and reports is controlled based on the user’s assigned permission level, function, and organization. System coordinators are responsible for assigning users to the appropriate permission level based on the user’s role in the organization. Users are given only the privileges required to perform their jobs and do not have access to security and other functions/features that require elevated privileges.

(2) Operations Support in collaboration with EQRS and NQRS Site System Coordinators (SSC) has oversight of the Online Form 5081, Information User Registration/Change Request, which is used to request access to the EQRS and NQRS applications. See http://mysbse.web.irs.gov/sbseorg/eq/syscoordguidance/default.aspx for more information.

(3) EQRS/NQRS checks information inputs for accuracy and completeness (see the sketch following this list) by:

  • Restricting data input to established system parameters to ensure data accuracy

  • Using drop-down lists as much as possible to restrict users from typing invalid information

  • Displaying an error message if invalid data is input into the system

  • Requiring data field input before proceeding
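
These checks follow a common input-validation pattern. The following minimal sketch illustrates that pattern in Python; it is not EQRS/NQRS code, and the field names and allowed values are hypothetical.

```python
# Minimal sketch of the validation pattern described above.
# Field names and allowed values are hypothetical, not the NQRS schema.
REQUIRED_FIELDS = {"review_number", "program"}   # mandatory fields
DROPDOWN_VALUES = {                              # drop-down restrictions
    "program": {"Field Exam", "Excise", "Employment", "Estate and Gift", "BSA"},
}

def validate_input(record):
    """Return error messages; an empty list means the input is accepted."""
    errors = []
    for field in sorted(REQUIRED_FIELDS):        # require input before proceeding
        if not record.get(field):
            errors.append(f"{field}: entry is required")
    for field, allowed in DROPDOWN_VALUES.items():
        if record.get(field) and record[field] not in allowed:
            errors.append(f"{field}: invalid value")  # error message on invalid data
    return errors

print(validate_input({"review_number": "R-001", "program": "Payroll"}))
# ['program: invalid value']
```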

(4) Operations Support, Technology Solutions, Collection Systems provides core information technology management and support services for both EQRS and NQRS. This office is responsible for:

  • Ensuring compliance with the Federal Information Security Management Act (FISMA)

  • Managing Unified Work Requests (UWR) for system updates and changes

  • Leading the development of enhanced data and computer security process and controls

Terms and Acronyms

(1) The following table contains commonly used terms and acronyms found in this IRM:

Terms and Acronyms | Definition
BSA | Bank Secrecy Act
CCP | Centralized Case Processing
CEAS | Correspondence Examination Automation Support
CJE | Critical Job Element
DCI | Data Collection Instrument
EQ | Embedded Quality
EQ&TS | Exam Quality & Technical Support
EQRS | Embedded Quality Review System
ERCS | Examination Returns Control System
FSEQ | Field and Specialty Exam Quality
IMS | Issue Management System
IRM | Internal Revenue Manual
ITAMS | Information Technology Asset Management System
NQRS | National Quality Review System
SB/SE | Small Business/Self-Employed business unit
SB/SE Field Exam | Cases selected for quality review from Revenue Agents, Tax Compliance Officers, and Tax Auditors located in Field Examination
SB/SE Specialty Exam | Cases selected for quality review from Revenue Agents, Tax Compliance Officers, Attorneys, Revenue Officer Examiners, and Fuel Compliance Agents located in Specialty Examination
SPRG | Specialized Product Review Group
UWR | Unified Work Request

Related Resources

(1) Field and Specialty Exam Job Aids are reference tools for Field and Specialty Exam management and FSEQ review staff to aid in rating the quality attributes in a uniform and consistent manner. Guidelines in the Job Aids align the EQ concepts to current Field and Specialty Exam procedures. IRM references support each quality attribute.

(2) Headquarters Examination, Examination Field and Campus Policy is responsible for ensuring the consistency of the Field and Office Exam Job Aids and training materials, along with the IRM and other guidelines.

(3) Headquarters Examination, Specialty Policy is responsible for ensuring the consistency of the Specialty Exam Job Aids and training materials, along with the IRM and other guidelines.

(4) The help menu, located at the top of the EQRS/NQRS main menu screen, contains links to the Program Job Aids.

Overview of National Quality Review Process

(1) The Quality Review process provides data to measure, monitor and improve the quality of work.

(2) Organizational performance is measured by conducting independent case reviews from a statistically valid sample of examination case work.

(3) Specific measurement criteria, referred to as quality attributes, are used to evaluate the quality of case work.

Quality Attributes

(1) Quality attributes address whether:

  • Timely service was provided to the taxpayer

  • Facts of the case were properly analyzed

  • Law was correctly applied

  • Taxpayer rights were protected by following applicable IRS policies and procedures including timeliness, adequacy of notifications, and required contacts with taxpayers

  • Appropriate determination was reached regarding liability for tax and ability to pay

(2) Quality attributes are organized into measurement categories which allow quality data to be generated based on the following criteria:

  • Timeliness - resolving issues in the most efficient manner through proper time utilization and workload management techniques

  • Professionalism - promoting a positive image of the Service by using effective communication techniques

  • Regulatory Accuracy - adhering to statutory/regulatory process requirements

  • Procedural Accuracy - adhering to internal process requirements

(3) Quality attributes can also be organized by DCI attribute group:

  • Planning

  • Income Determination (Field Exam)

  • Investigative/Audit Techniques

  • Timeliness

  • Customer Relations/Professionalism

  • Documentation/Reports

Evaluating and Coding the Attributes

(1) Reviewers evaluate case work utilizing attributes specific to their Specialized Product Review Group (SPRG).

(2) Reviewers rate all attributes that apply to the case being reviewed.

(3) Attribute ratings must be accurate and consistent. Reviewers must strive for consistency in rating similar case actions.

Attribute Scoring System

(1) The scoring system provides for the equal weighting of each attribute. Each attribute is rated as Yes, No, or in some instances Not Applicable.

(2) The quality score is computed as a percentage: the number of Yes ratings divided by the combined number of Yes and No ratings. Not Applicable ratings are excluded from the computation. A maximum score of 100 is possible for each case.
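
The score computation in paragraph (2) can be expressed directly. The following Python sketch is illustrative only and is not NQRS code:

```python
def case_quality_score(ratings):
    """Equal-weight attribute scoring: the percentage of Yes ratings
    among Yes and No ratings; Not Applicable is excluded."""
    rated = [r for r in ratings if r in ("Yes", "No")]
    if not rated:
        return None  # no rateable attributes on this case
    return 100 * rated.count("Yes") / len(rated)

# Three Yes ratings and one No rating yield a score of 75.0;
# the Not Applicable rating does not affect the result.
print(case_quality_score(["Yes", "Yes", "No", "N/A", "Yes"]))  # 75.0
```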

Case Review Procedures

(1) The DCI is the principal documentation for the reviewer’s case evaluation and conclusions. A DCI is completed for each case review in NQRS. Reviewers must ensure that all entries on the DCI are accurate and records are not duplicated.

(2) Reviewers will review one case at a time to completion before starting another case review.

(3) Steps in the review process include:

  • Review the paper case file and the electronic file, where applicable

  • Input case review data on the DCI and prepare narratives to explain each not met attribute rating

  • Review the DCI for accuracy and narrative quality

  • Edit the DCI as necessary

  • Complete the case review

Review of Electronic Case File

(1) The physical case file is the primary source for case reviews for Field and Specialty Exam reviewers. Reviewers may need to view case information stored electronically.

(2) Documents found in the electronic case file might not be in the physical case file because they were not printed or were inadvertently removed. If there are indications in the physical case file that electronic documents exist, reviewers should access the electronic file to determine if additional information is available.

(3) Electronic case files for the Excise, Employment and Estate and Gift Programs are located on the Issue Management System (IMS) Team Website.

Note: An OL5081 is required for access to the IMS Team Site.

(4) Electronic case files for Field Exam are located in Correspondence Examination Automation Support (CEAS).

DCI Header Input Procedures

(1) The first input section of the DCI consists of the header fields, which capture basic case information. The bold header fields are mandatory and must be entered to complete the DCI.

(2) Header information is categorized into four groupings:

  • Review Information - specific information about the review itself

  • Case Information - specific information about the case

  • Process Measures - case actions taken by the examiner that are used to measure the efficiency of the examination process

  • Special Use - special tracking for local or national purposes

(3) Process Measures data may be analyzed in conjunction with the quality attributes. Process Measures data fields capture:

  • Specific tasks performed during the examination

  • How these tasks were completed

  • Key dates

  • Delays in activities

  • Hours associated with the case

Reason Code Selection and Writing Guidelines for Attribute Narratives

(1) When a quality attribute is rated not met, at least one reason code, if available, must be selected that supports the not met rating.

(2) The most appropriate reason code should be selected for the error. Multiple reason codes may be selected for multiple errors, if warranted.

(3) A narrative is required, describing the facts, for each not met attribute rating.

(4) Reviewers should contact their manager when "Other" is regularly used as a reason code to determine if additional reason codes should be added to NQRS.

(5) Reviewer narratives must be thorough, providing clear, concise, and specific descriptions of any errors, offering sufficient detail to allow for specific recommendations for improvement.

(6) Reviewers must avoid using canned statements in their narratives.

(7) Attribute narratives should:

  • Clearly state the facts that resulted in the attribute rating

  • Identify the nature of the error in the first sentence of the narrative

  • Indicate what was not done, not what should have been done

  • Evaluate the case, not the examiner

(8) Reviewers should not include taxpayer-specific data or personally identifiable information (PII) in the narrative comments.

Field Exam Case Sampling Criteria

(1) The following Field Exam cases are included in the review sample:

  • SB/SE revenue agent and tax compliance officer income tax cases (corporations, partnerships, and individual returns)

  • Agreed, partially agreed, unagreed, no-change, and cases protested to Appeals

  • Secured delinquent returns not accepted as filed

  • Training cases

  • Form 1041, U.S. Income Tax Return for Estates & Trusts; Form 1042, Annual Withholding Tax Return for U.S. Source Income of Foreign Persons; and Form 1120-F, U.S. Income Tax Return of a Foreign Corporation, returns examined by revenue agents

  • Correspondence cases examined by revenue agents, tax auditors, and tax compliance officers

  • Pre-assessment innocent spouse cases

  • Claims

  • Audit reconsideration cases

  • Employment tax cases closed as related cases to an income tax case (the entire related case package is included)

(2) The following Field Exam cases are excluded from the national quality review sample:

  • Secured delinquent returns accepted as filed

  • Penalty cases not included as part of an examination case

  • Surveyed returns

  • Offers in Compromise cases

  • Post-assessment innocent spouse cases

  • Surveyed claim cases (Disposal Code 34)

  • No show/no response cases

  • Protested cases with 395 days or less remaining on the statute

    Note: If the case selected for review is a protested case to Appeals, there must be at least 395 days remaining on the statute. Appeals policy requires a non-docketed case to have at least 365 days remaining on the statute as of the date the case is received in Appeals. The additional 30 days is required to allow for the completion of the review and for Technical Services to receive and prepare the case for closing to Appeals.

  • Petitioned cases

  • Cases updated to suspense status

  • Cases updated to group status after 90 Day Letter issued

  • Cases closed via Form 906, Closing Agreement

  • Specific project codes as determined by Headquarters

Specialty Exam Case Sampling Criteria

(1) The following Specialty Exam cases are included in the review sample:

  • Excise tax cases

  • Estate and Gift tax cases

  • Employment tax cases where there is no related income tax case

  • BSA Title 31 and Form 8300 cases

(2) The following Specialty Exam cases are excluded from the national quality review sample:

  • Secured delinquent returns accepted as filed

  • Penalty cases not included as part of an examination case

  • Surveyed returns

  • Offers in Compromise cases

  • Post-assessment innocent spouse cases

  • Surveyed claim cases (Disposal Code 34)

  • No show/no response cases

  • Protested cases with 395 days or less remaining on the statute

    Note: If the case selected for review is a protested case to Appeals, there must be at least 395 days remaining on the statute. Appeals policy requires a non-docketed case to have at least 365 days remaining on the statute as of the date the case is received in Appeals. The additional 30 days is required to allow for the completion of the review and for Technical Services to receive and prepare the case for closing to Appeals.

  • Petitioned cases

  • Cases updated to suspense status

  • Cases updated to group status after 90 Day Letter issued

  • Cases closed via Form 906, Closing Agreement

  • Specific project codes as determined by Headquarters

  • Activity Code 421 returns - Gift Form 706GS(D), Generation-Skipping Transfer Tax Return for Distributions, and Form 706GS(T), Generation-Skipping Transfer Tax Return for Terminations

  • Cases worked by Estate and Gift support staff: Paraprofessional (position code 316) and Audit Accounting Aide (position code 301)

  • Estate and Gift returns assigned outside of the Estate and Gift area

  • Excise Form 2290, Heavy Highway Vehicle Use Tax Return cases

  • Excise returns assigned outside of the Excise area

  • Employment Tax Form 1040 Tip cases

National Quality Review Case Selection Procedures

(1) The Examination Returns Control System (ERCS) Sample Review program automates the process of randomly selecting a valid sample of cases meeting the sampling criteria for review.

(2) The sample size is statistically valid at the Field Exam Area level and the Specialty Exam Program level. The annual sample plan is based on projected fiscal year closures for each SB/SE program.

(3) Cases meeting the sample criteria are selected by the ERCS Sample Review program at the designated sample rate for the Field Exam Area and for three of the Specialty Exam Programs (Excise, Employment, Estate and Gift). Cases are subject to the sample at the point they move to Status Code 51 or 21. When a case is selected for sample review, the ERCS user is notified to print the Sample Selection Sheet to place on the file. CCP is responsible for updating sample selected cases to Status Code 90 and sending them to the appropriate review site.

(4) BSA cases are not controlled on ERCS but are selected using a pull rate selection process. Form 8300 cases are pulled from the weekly extract of closed cases maintained by Enterprise Computing Center – Detroit (ECC-DET). Title 31 cases are pulled from the closed case Title 31 database using the NQ interface.
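
Conceptually, selection at a designated sample rate is an independent random draw applied to each closing case. The sketch below illustrates only that concept; the actual ERCS selection algorithm is not reproduced here, and the 2 percent rate shown is hypothetical.

```python
import random

def sample_selected(sample_rate, rng):
    """Bernoulli draw: True means the closing case enters the review sample."""
    return rng.random() < sample_rate

rng = random.Random(2020)        # seeded for a repeatable illustration
closures = 1000                  # hypothetical fiscal-year closures
picked = sum(sample_selected(0.02, rng) for _ in range(closures))
print(picked)                    # roughly 20 of 1,000 cases at a 2% rate
```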

Unagreed Appeals Case Selection Procedures

(1) The ERCS Sample Review program may select unagreed cases as part of the random sample of cases for review.

(2) Technical Services is responsible for sending unagreed Appeals cases and unagreed Appeals cases with at least one agreed/no-change year that are selected for sample review to the appropriate review site. These cases are high priority and procedures are established to ensure their timely review. Refer to IRM 4.8.2.3.4, Technical Services, Case Processing, for more information.

(3) When “open” cases are transmitted to the review site by Technical Services, they should be updated to Status Code 23, Sample Review, and Review Type 33 on ERCS.

(4) Field and Specialty Exam reviewers will complete their review of the open unagreed case within 10 business days and return the case to Technical Services via ground mail service.

(5) Appeals policy requires a non-docketed case to have at least 365 days remaining on the statute as of the date the case is received in Appeals. If a non-docketed case is selected for sample review there needs to be an additional 30 days on the statute to allow for the completion of the review and for Technical Services to receive and prepare the case for closing to Appeals.

(6) Cases that do not meet these criteria will be deselected and returned to Technical Services.
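
The 395-day requirement is the sum of the Appeals minimum (365 days) and the 30-day review and processing buffer. The following sketch illustrates the eligibility arithmetic; it is not IRS code, and the dates are hypothetical.

```python
from datetime import date

APPEALS_MINIMUM_DAYS = 365  # required on the statute when Appeals receives the case
REVIEW_BUFFER_DAYS = 30     # time for the review and Technical Services processing
REQUIRED_DAYS = APPEALS_MINIMUM_DAYS + REVIEW_BUFFER_DAYS  # 395

def meets_statute_criteria(statute_expiration, as_of):
    """True if enough time remains on the statute for sample review."""
    return (statute_expiration - as_of).days >= REQUIRED_DAYS

# 411 days remain, so the case stays in the sample; at 390 days it is deselected.
print(meets_statute_criteria(date(2021, 12, 31), date(2020, 11, 15)))  # True
print(meets_statute_criteria(date(2021, 12, 10), date(2020, 11, 15)))  # False
```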

(7) Reviewers prepare an Appeals Advisory Memo to Technical Services when tax application/computation errors are found or if any taxpayer confidentiality issues are discovered. Technical Services decides whether to forward the case to Appeals or return it to the group.

Defaulted Case Selection Procedures

(1) The ERCS Sample Review program may also select unagreed cases closing for issuance of statutory notice of deficiency as part of the random sample of cases for review.

(2) Technical Services will affix the sample selection sheet to these case files and update all returns with an account transfer out freeze code ("M" freeze code).

(3) If the case defaults, Technical Services will send the case to CCP. The "M" freeze code along with the Sample Selection Sheet will alert CCP that the case must be sent to the appropriate review site.

(4) CCP will update the case to Status Code 90, remove the "M" freeze code and forward the case to the appropriate review site.

Shipping Sample Select Cases

(1) Cases selected for review should be transmitted to their respective review site.

(2) Field Exam cases are shipped to NQRS-Oakland.

(3) Specialty Exam cases are held at the Covington campus for screening. After screening, cases are shipped directly to reviewers.

(4) Closed case files should remain intact after they leave CCP and Technical Services. Dismantling, purging, or discarding documents from a case file could negatively affect the case if legal actions are pursued.

(5) A separate Form 3210, Document Transmittal, shall be attached to the closed case files. Each selected case shall include the full physical case file.

Sample Select Case Control Procedures

(1) Each review site will maintain an inventory control system. This will facilitate an orderly flow of case files and supporting documents between closing units, the review site, and the reviewer.

(2) All closed (Status 90) physical case files, along with Form 3210, are transported via ground shipment for final disposition.

Case Review Consistency

(1) The reviewer’s case evaluation must be accurate and consistent to provide reliable and meaningful results.

(2) The FSEQ Manager should periodically perform consistency checks to ensure consistent and accurate application of the quality attributes and accurate data input.

(3) The FSEQ Manager may conduct consistency reviews in several ways, including:

  • Have each reviewer independently review the same case and discuss any inconsistencies in attribute rating

  • Critique completed case reviews and provide feedback to reinforce expectations of the review outcomes

  • Utilize NQRS reports, including reviewer narratives, to evaluate consistency, ensure guidelines are followed, and ensure the narratives are clearly and professionally written

  • Hold group meetings to discuss specific attributes and case scenarios

(4) Results of the consistency reviews are maintained and updated as warranted.

Use and Limitations of National Quality Review Data

(1) The fundamental purpose of the National Quality Review Program is to provide an overall organizational assessment of case quality.

(2) Quality review results are statistically valid and reliable measurements of the overall quality of casework completed by SB/SE Field Exam only at the Area level. Specialty Exam national quality results are statistically valid at the Program level. Results stratified to any lower organizational segment are not statistically reliable measurements of the quality of casework at those levels.

(3) Lower organizational segment stratifications are indicators. They should be relied upon only to the extent that they are confirmed by other reliable management measures of quality.

(4) The design and format of quality review reports within NQRS as well as access to the reports and data will be determined by the Field and Specialty Exam Program Manager.

(5) No attempt should be made to identify an examiner or otherwise associate specific review results to a particular case.

(6) Review data is used to assess program performance and will not be used to evaluate individual employee performance. Any feedback or other data generated from NQRS will not be used as a substitute for EQRS case reviews, on the job visits, workload or any other reviews.

Case Return Criteria

(1) FSEQ reviewers will follow guidance found in the Technical Services IRM 4.8.2.9, Returning Cases to the Field, which outlines return criteria for cases with potential for significant impact to taxpayer compliance or to tax revenues.

National Standard Time Frames for Case Action

(1) Each row of the table below lists:

  • Activity - Type of exam action or activity measured

  • Days - Maximum number of calendar days permitted for the exam action or activity

  • Measured From - Start of the exam action or activity

  • Measured To - End of the exam action or activity

(2) The national recommended standard time frames (unless noted, measured in calendar days) are shown in the table below:

Activity | Program | Days | Measured From | Measured To
Start Examination | Field Exam, Excise, Employment, BSA | 45 | First action | First appointment
Contact | Estate and Gift | 45 | Examiner’s receipt of case | Date examiner sends an initial contact letter to the taxpayer with a copy to the representative, or surveys the assigned case
Significant Activity | Field and Specialty Exam | 45 | Last significant action | Next significant activity
Response to call | Field and Specialty Exam | 1 business day | Taxpayer or representative telephone call | Return telephone call to the taxpayer or representative
Response to correspondence | Field and Specialty Exam | 14 | Receipt of correspondence or documentation from taxpayer or representative | Follow-up response to taxpayer or representative
POA Processing | Specialty Exam | As soon as possible, or within 24 hours of the receipt date | Receipt of Form 2848 | Submission to CAF Unit for processing
Agreed/No Change Case Closing | Field Exam | 10 | Date the report is received or the date the no-change status is communicated to the taxpayer | Date the case is closed from the group
Agreed/No Change Case Closing | Estate and Gift, Excise | 30 | Date the report is received or the date the no-change status is communicated to the taxpayer/financial institution | Date the case is closed from the group
Agreed/No Change Case Closing | Employment | 20 | Date the report is received or the date the no-change status is communicated to the taxpayer | Date the case is updated to Status 51 and closed from the group
Agreed/No Change Case Closing | BSA | 20 | Date the closing letter is finalized | Date the case is closed from the group
Agreed cases with unpaid proposed assessments of $100,000 and greater | Field and Specialty Exam | 4 | Date the report is received | Date the case is closed from the group
Unagreed Case Closing | Field and Specialty Exam | 20 | Date the 30-Day Letter defaults or the date the request for Appeals conference is received | Date the case is closed from the group
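
Measuring an activity against these standards amounts to comparing the calendar days elapsed between the Measured From and Measured To dates against the Days column. The sketch below is illustrative only; it encodes two sample rows from the table above and uses hypothetical dates.

```python
from datetime import date

# Maximum calendar days, keyed on (activity, program); two rows from the table above.
STANDARD_DAYS = {
    ("Start Examination", "Field Exam"): 45,
    ("Agreed/No Change Case Closing", "Employment"): 20,
}

def within_standard(activity, program, start, end):
    """True if the elapsed calendar days do not exceed the standard."""
    return (end - start).days <= STANDARD_DAYS[(activity, program)]

# 39 calendar days from first action to first appointment: within the 45-day standard.
print(within_standard("Start Examination", "Field Exam",
                      date(2020, 1, 2), date(2020, 2, 10)))  # True
```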
