OFFICE
OF
THE INSPECTOR GENERAL
SOCIAL SECURITY ADMINISTRATION
PERFORMANCE
INDICATOR AUDIT:
CUSTOMER SATISFACTION
September
2007
A-15-07-17129
AUDIT REPORT
Mission
By conducting independent and objective audits, evaluations and investigations, we inspire public confidence in the integrity and security of SSA's programs and operations and protect them against fraud, waste and abuse. We provide timely, useful and reliable information and advice to Administration officials, Congress and the public.
Authority
The Inspector General Act created independent audit and investigative units, called the Office of Inspector General (OIG). The mission of the OIG, as spelled out in the Act, is to:
Conduct and supervise independent and objective audits and investigations
relating to agency programs and operations.
Promote economy, effectiveness, and efficiency within the agency.
Prevent and detect fraud, waste, and abuse in agency programs and operations.
Review and make recommendations regarding existing and proposed legislation
and regulations relating to agency programs and operations.
Keep the agency head and the Congress fully and currently informed of problems
in agency programs and operations.
To ensure objectivity, the IG Act empowers the IG with:
Independence to determine what reviews to perform.
Access to all information necessary for the reviews.
Authority to publish findings and recommendations based on the reviews.
Vision
We strive for continual improvement in SSA's programs, operations and management by proactively seeking new ways to prevent and deter fraud, waste and abuse. We commit to integrity and excellence by supporting an environment that provides a valuable public service while encouraging employee development and retention and fostering diversity and innovation.
MEMORANDUM
Date: September 24, 2007 Refer To:
To: The Commissioner
From: Inspector General
Subject: Performance Indicator Audit: Customer Satisfaction (A-15-07-17129)
We contracted with PricewaterhouseCoopers, LLP (PwC) to evaluate 13 of the Social Security Administration's (SSA) performance indicators established to comply with the Government Performance and Results Act. Attached is the final report presenting the results of one of the performance indicators PwC reviewed.
For the performance indicator included in this audit, PwC's objectives were to:
Assess the effectiveness of internal controls and test critical controls over data generation, calculation, and reporting processes for the specific performance indicator.
Assess the overall reliability of the performance indicator's computer processed data. Data are reliable when they are complete, accurate, consistent and not subject to inappropriate alteration.
Test the accuracy of results presented and disclosed in SSA's Fiscal Year 2006 Performance and Accountability Report.
Assess if the performance indicator provides a meaningful measurement of the program it measures and the achievement of its stated objective.
This report contains the results of the audit for the following indicator:
Percent of individuals who do business with SSA rating the overall service as "excellent", "very good", or "good".
Please provide within 60 days a corrective action plan that addresses each recommendation. If you wish to discuss the final report, please call me or have your staff contact Steven L. Schaeffer, Assistant Inspector General for Audit, at (410) 965-9700.
Patrick P. O'Carroll, Jr.
MEMORANDUM
Date: September 12, 2007
To: Inspector General
From: PricewaterhouseCoopers, LLP
Subject: Performance Indicator Audit: Customer Satisfaction (A-15-07-17129)
OBJECTIVE
The Government Performance and Results Act (GPRA) of 1993 requires the Social Security Administration (SSA) to develop performance indicators that assess the relevant service levels and outcomes of each program activity. GPRA also calls for a description of the means employed to verify and validate the measured values used to report on program performance.
Our audit was conducted in accordance with generally accepted government auditing standards for performance audits. For the performance indicator included in this audit, our objectives were to:
1. Assess the effectiveness of internal controls and test critical controls over the data generation, calculation, and reporting processes for the specific performance indicator.
2. Assess the overall reliability of the performance indicator's computer processed data. Data are reliable when they are complete, accurate, consistent and not subject to inappropriate alteration.
3. Test the accuracy of results presented and disclosed in SSA's Fiscal Year (FY) 2006 Performance and Accountability Report (PAR).
4. Assess if the performance indicator provides a meaningful measurement of the program it measures and the achievement of its stated objective.
BACKGROUND
We audited the following performance indicator as stated in the SSA FY 2006 PAR:
Performance Indicator: Percent of individuals who do business with SSA rating the overall service as "excellent," "very good," or "good"
FY 2006 Goal: 83%
FY 2006 Actual Results: 82%
SSA provides a range of services to the general public, including the issuance of Social Security cards and the payment of retirement and long-term disability benefits. SSA provides the general public a variety of service options for conducting business and obtaining information. These options consist of customers calling SSA's national toll-free 800 number, calling and/or visiting local field and hearing offices, and utilizing SSA's website. The majority of SSA's customers prefer to conduct business by telephone, and many choose to deal with 1 of the 1,336 local Field Offices (FO) or 143 Hearing Offices (HO).
This performance indicator is linked to SSA's strategic objective to "Improve service through technology, focusing on accuracy, security and efficiency," which is linked to SSA's strategic goal "To deliver high quality, citizen-centered service." This strategic goal is linked to one of the five government-wide goals on the President's Management Agenda, "Expanded Electronic Government," which addresses all Government agencies' ability to simplify the delivery of high quality service while reducing the cost of delivering those services.
To assess the indicator's progress in meeting this objective and goal, SSA's Office of Quality Performance (OQP) annually conducts a series of tracking surveys to measure a customer's satisfaction with his or her last contact with SSA. SSA conducts three surveys: the 800-Number Caller Survey, the FO Caller Survey, and the Office Visitor (OV) Survey. OQP uses a 6-point rating scale ranging from "excellent" to "very poor." To report the final overall service satisfaction, OQP combines the three customer satisfaction surveys, weighting each survey by the customer universe it represents.
For additional details on the surveys and reporting process, refer to the flowcharts in Appendix C.
Indicator Background
800-Number Caller Survey
The 800-Number Caller Survey is conducted with a sample of individuals who received customer service using the 800-number during the month of March. When a customer calls the toll free number, the Automatic Number Identifier (ANI) system collects data about the call, i.e., phone number, date, time, and duration. The sample is randomly drawn on a biweekly basis from data supplied by the telephone company over the course of the sample month. OQP excludes calls from blocked numbers, businesses, pay phones, or other locations where the customer is not identifiable. OQP contracts with a firm to call the customer, conduct the survey, and compile the responses. OQP then analyzes and reports the results.
Field Office Caller Survey
The FO Caller Survey is conducted with a sample of individuals who called 1 of approximately 52 randomly selected FOs during the month of April. During the survey period, an OQP contractor installs caller-ID equipment in sampled FOs on its main incoming phone lines. Each FO's caller-ID system records the date and time of contact, length of call, and phone number of the caller. From the caller population accumulated, OQP makes a biweekly, random selection of callers over the course of the sample month. OQP excludes calls from blocked numbers, businesses, pay phones, or other locations where the customer is not identifiable. OQP contracts with a firm to call the customer, conduct the survey, and compile the responses. OQP then analyzes and reports the results.
Office Visitor Survey
The OV Survey is conducted with a sample of individuals who visit either 1 of approximately 52 FOs or 13 HOs randomly selected over the course of an 8-week period from July to September. FOs are selected by region, with all 10 SSA regions represented. Each sampled office participates for 1 week of the sample period. The FO or HO records each customer's name, address, telephone number, and reason for the visit and forwards this information electronically to OQP daily. Every 2 to 3 days, OQP selects a random sample of customers to participate in the survey. A contractor mails the survey to the selected customers. Customers are asked to return the survey directly to OQP, for analysis and reporting of results.
Combination of the Three Survey Results
Once all three of the OQP annual surveys have been completed, OQP combines the three individual survey results (each weighed by the customer universe) and reports this information to the Office of Strategic Management (OSM). OSM in turn reports and publishes the survey results in SSA's PAR.
RESULTS OF REVIEW
Our assessment of the indicator included in this report did not identify any exceptions related to the accuracy or presentation of information included in the PAR. We were able to recalculate and verify SSA's overall customer satisfaction percentage presented in the FY 2006 PAR. We also found that SSA management improved the transparency of the FY 2006 performance results by disclosing its participation in a Government-wide survey on customer satisfaction results across multiple Federal Government agencies.
However, SSA management could improve the performance indicator's linkage to the strategic objective. In addition, we did identify issues with internal controls related to documentation of the survey processes. We also concluded that a potential bias exists in the customer satisfaction survey sample selection process and response rate. It should be noted that these issues were initially identified in a previous audit report. We did not identify any new issues during this year's audit.
Findings
Internal Controls
Although we were able to recalculate and verify the result reported in the PAR, we found that SSA did not have sufficient and current documentation of the performance indicator calculation process. During the prior audit of this indicator, Performance Indicator Audit: Overall Service Rating (A-15-05-15118), we noted that several calculations were not completely and accurately documented. During the current audit, we found that this issue had not been addressed. The absence of "well-defined documentation processes that contain an audit trail, verifiable results, and specify document retention periods so that someone not connected with the procedures can understand the assessment process" did not comply with standards defined by the Office of Management and Budget (OMB) Circular A-123, Management's Responsibility for Internal Control.
A potential bias exists within SSA's survey results. This potential bias exists because SSA conducts the surveys during a limited time period (e.g., 1 or 2 months of the FY) and then projects the results from these surveys to the entire FY. The results from the time periods not sampled may have been substantially different from the results of the time period sampled. SSA did not provide analysis to substantiate the assertion that the results from the sample period were representative of the months not included within the survey.
Furthermore, an additional potential bias exists because the survey does not seek to compensate or adjust for non-response. Non-response can often lead to biased results if respondents and non-respondents have different views on customer satisfaction. Typically, organizations periodically evaluate the effect of non-response, for example by performing statistical tests comparing respondents and non-respondents on characteristics believed to be correlated with customer satisfaction.
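One such test, sketched below with entirely hypothetical counts, is a two-proportion z-test comparing respondents and non-respondents on a binary characteristic (here, an assumed share of disability-related contacts); this is an illustration of the kind of analysis described above, not SSA's actual method or data.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-proportion z-test statistic: do respondents and non-respondents
    differ on a binary characteristic believed to be correlated with
    customer satisfaction?"""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)              # pooled proportion
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical data: 400 of 1,800 respondents vs. 700 of 2,400
# non-respondents contacted SSA about a disability claim.
z = two_proportion_z(400, 1800, 700, 2400)
print(f"z = {z:.2f}")  # |z| > 1.96 suggests a difference at the 5% level
```

A statistically significant difference would indicate that the characteristic is distributed differently among non-respondents, a warning sign of non-response bias.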
Performance Indicator Meaningfulness
This indicator measures the service satisfaction of SSA customers who called the 800 number, called a FO, or visited a FO or HO. During the prior audit of this indicator, Performance Indicator Audit: Overall Service Rating (A-15-05-15118), we noted that the performance indicator did not clearly support SSA's strategic objective to "Improve service through technology, focusing on accuracy, security and efficiency" since there was no linkage defined between the performance indicator and the improvement of service through technology. During the current audit, we found that this issue had not been addressed. We do, however, believe that this performance indicator supports the strategic goal "to deliver high quality, citizen-centered service."
CONCLUSION AND RECOMMENDATIONS
We reaffirm our previous recommendations noted in the prior audit of this indicator. We continue to recommend that SSA take action to address these recommendations. (Refer to Appendix E for the prior audit recommendations.)
In addition, we recommend SSA:
1. Increase the period of time covered through these surveys or gather and maintain evidence which supports the assertion that the results of these surveys would not vary if additional survey periods were used. In addition, if SSA continues to use the current sampling parameters, the Agency should ensure that the limited sampling periods are clearly articulated within the description of this performance measure in the PAR.
2. Periodically analyze respondent and non-respondent characteristics and their respective representation of the population. Segments of the population lacking coverage may be an indication of non-response bias. Model-based techniques using information gathered on respondents may be helpful to assess the impact of non-response.
AGENCY COMMENTS
SSA partially agreed with recommendation 1. SSA agreed to ensure the limited sampling period is articulated in the FY 2007 PAR description. However, SSA did not agree with increasing the period of time covered by the surveys. SSA stated that the benefit of conducting surveys more frequently is not sufficient to justify the time and expense to administer the surveys.
SSA agreed in theory with recommendation 2. SSA agreed to conduct analysis on the characteristics of the Office Visitor Survey including the reason for the visit. However, SSA stated there is little meaningful data available regarding characteristics of the sample population that could be correlated to customer satisfaction.
The full text of SSA's comments can be found in Appendix F.
PWC RESPONSE
In response to SSA's comments on recommendation 1, we believe SSA should continue to review and analyze the survey results for trends. If significant trends are noted, SSA should conduct surveys more frequently, and consider performing abbreviated surveys on smaller samples.
For recommendation 2, we believe that SSA should use the results of the Office Visitor Survey analysis to determine other characteristics that may be related to customer satisfaction. If limited sample data hinders this analysis, SSA may want to consider trying to capture additional characteristics for its survey data. In addition, SSA should consider the effect of nonresponse for characteristics believed to be correlated with customer satisfaction.
Appendices
APPENDIX A - Acronyms
APPENDIX B - Scope and Methodology
APPENDIX C - Process Flowcharts
APPENDIX D - Statistical Methodology
APPENDIX E - Prior Audit Recommendations
APPENDIX F - Agency Comments
Appendix A
Acronyms
ANI Automatic Number Identifier
CATI Computer Assisted Telephone Interviewing
FO Field Office
FY Fiscal Year
GAO Government Accountability Office
GPRA Government Performance and Results Act
HO Hearing Office
OMB Office of Management and Budget
OQP Office of Quality Performance
OSM Office of Strategic Management
OTSO Office of Telecommunications and Systems Operations
OV Office Visitor
PAR Performance and Accountability Report
SSA Social Security Administration
U.S.C. United States Code
Appendix B
Scope and Methodology
We updated our understanding of the Social Security Administration's (SSA) Government Performance and Results Act (GPRA) processes. This was completed through research and inquiries of SSA management. We also requested that SSA provide various documents regarding the specific programs being measured, as well as the specific measurement used to assess the effectiveness and efficiency of the related program.
Through inquiry, observation, and other substantive testing, including testing of source documentation, we performed the following:
Reviewed prior SSA, Office of the Inspector General, and other reports related to SSA's GPRA performance and related information systems.
Reviewed applicable laws, regulations and SSA policy.
Met with the appropriate SSA personnel to confirm our understanding of the performance indicator.
Flowcharted the process (see Appendix C).
Tested key controls related to manual or basic computerized processes (e.g., spreadsheets, databases, etc.).
Conducted and evaluated tests of the automated and manual controls within and surrounding each of the critical applications to determine whether the tested controls were adequate to provide and maintain reliable data to be used when measuring the specific indicator.
Identified attributes, rules, and assumptions for each defined data element or source document.
Recalculated the metric or algorithm of the performance indicator to ensure mathematical accuracy.
We assessed the completeness and accuracy of the data to determine the data's reliability as it pertains to the objectives of the audit.
As part of this audit, we documented our understanding, as conveyed to us by Agency personnel, of the alignment of the Agency's mission, goals, objectives, processes, and related performance indicators. We analyzed how these processes interacted with related processes within SSA and the existing measurement systems. Our understanding of the Agency's mission, goals, objectives, and processes were used to determine if the performance indicator appeared to be valid and appropriate given our understanding of SSA's mission, goals, objectives and processes.
We conducted this performance audit in accordance with generally accepted government auditing standards.
In addition to these steps, we specifically performed the following to test the indicator included in this report:
Assessed the sample selection methodology, including the creation of the sample frame.
Recalculated the survey results, including the survey weights, for each of the three types of performance surveys.
Recalculated the combined survey results for the SSA customer satisfaction performance indicator.
Tested key controls over the Blaise software used to calculate the Office Visitor Survey results.
Appendix C
Flowcharts
800-Number Caller Survey - Process Flowchart C-1
Field Office Caller Survey - Process Flowchart C-3
Office Visitor Survey - Process Flowchart C-5
Survey Calculation - Process Flowchart C-7
Customer Service Survey - 800-Number Caller:
The customer calls the Social Security Administration (SSA) 800-Number.
The Automatic Number Identifier (ANI) system records the customer data.
Contractor furnishes the Office of Quality Performance (OQP) with the ANI data.
OQP selects the completed calls within the sampling period from the ANI data.
A completed call is a call where the customer has chosen to speak with an SSA representative or selected an option from the automated menu.
OQP selects eligible calls from the population of completed calls. An eligible call is a call made between 7 a.m. and 7 p.m. during the caller's local time, originating from a phone number that made fewer than 100 calls to SSA that day, and made during the survey sample period. During Fiscal Year 2006, the hours for an eligible call were expanded to 7 a.m. to midnight EST.
OQP selects a random sample of callers to participate in the survey from the list of eligible calls.
OQP provides the contractor with the survey questions.
The contractor converts the questions into Computer Assisted Telephone Interviewing (CATI) software.
OQP tests and validates the converted questionnaire.
OQP sends an electronic file with the selected customers' information to the contractor.
The contractor administers the survey.
The contractor compiles survey responses and sends them electronically to OQP.
OQP analyzes the final results.
OQP applies the individual survey weight to the sample data and calculates the individual survey results.
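The eligibility rules in the steps above can be sketched as a simple filter; the record layout and field names below are illustrative, not SSA's actual ANI data format.

```python
from collections import Counter
from datetime import date, datetime

def is_eligible(call, daily_counts, start, end):
    """Eligibility rules described above: local call time between 7 a.m.
    and 7 p.m., fewer than 100 calls from that number on that day, and
    a call date inside the survey sample period."""
    when = call["local_time"]
    return (7 <= when.hour < 19
            and daily_counts[(call["number"], when.date())] < 100
            and start <= when.date() <= end)

# Hypothetical call records for the March sample period
calls = [
    {"number": "410-555-0101", "local_time": datetime(2006, 3, 6, 10, 15)},
    {"number": "410-555-0101", "local_time": datetime(2006, 3, 6, 22, 40)},  # after hours
]
daily_counts = Counter((c["number"], c["local_time"].date()) for c in calls)
eligible = [c for c in calls
            if is_eligible(c, daily_counts, date(2006, 3, 1), date(2006, 3, 31))]
print(len(eligible))  # 1
```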
Customer Service Survey - FO Caller:
OQP selects the FOs to participate in the survey.
The contractor installs caller-ID equipment in each of the sampled FOs to capture call data.
Customers call the FOs.
The telephone contractor downloads and extracts customer information from the caller-ID equipment.
The contractor provides an electronic file of the caller information to OQP.
OQP extracts the data from the electronic file.
OQP selects a sample of eligible FO callers to participate in the survey.
OQP provides the contractor with the survey questions.
The contractor converts the questions into CATI software format.
OQP remotely tests and validates the converted questionnaire.
OQP sends an electronic file with the selected customers' information to the contractor.
The contractor administers the survey.
The contractor compiles survey responses and sends them electronically to OQP.
OQP analyzes the final results.
OQP applies the individual survey weight to the sample data and calculates the individual survey results.
Customer Service Survey - Office Visitor:
OQP selects a random sample of FOs and HOs to participate in the survey.
OQP notifies the FO and HO of their selection to participate in the survey.
The customer visits the FO or HO.
The FO or HO enters the customer's information into a tracking system when the customer checks in at the reception desk.
The FO or HO sends the electronic list of customers and their information to OQP.
OQP selects a random sample of customers to participate in the mailed survey.
OQP electronically sends the names and addresses of selected customers to the contractor.
The contractor administers the survey via mail.
The customer returns the survey to OQP after completion.
OQP date stamps and enters the survey responses into Blaise.
OQP reviews the information entered into Blaise for completion.
OQP analyzes the final results.
OQP then extracts the results to a spreadsheet for tabulation, applies the individual survey weight to the sample data, and calculates the individual survey results.
Customer Service Survey - Calculation of Overall Survey Results:
OQP analyzes the individual survey results.
OQP applies individual survey weights to the sample data for each individual survey and calculates the final individual survey results.
OQP analyzes these final results.
OQP combines the three individual survey results weighted by customer universe to calculate the overall customer satisfaction survey results.
OQP writes and publishes a report on customer satisfaction and distributes the report throughout SSA.
OQP reports customer satisfaction for the performance indicator to OSM.
Appendix D
Statistical Methodology
PricewaterhouseCoopers reviewed this performance indicator during the Fiscal Year (FY) 2004 Government Performance and Results Act (GPRA) audit. During the FY 2006 GPRA audit, we found no significant changes to the sampling and estimation models used to calculate this performance indicator. As such, our previous assessment of the sampling and estimation models used by Social Security Administration (SSA) management in FY 2004 is consistent with the sampling and estimation models used in FY 2006. Our assessment of the sampling and estimation models is included below.
Methodology for the Sample Selection of Survey Participants
800 Number Caller Survey
In FY 2006, the Office of Quality Performance (OQP) randomly selected a sample of 2,600 phone numbers of customers who called SSA's 800 number to participate in the 800-Number Caller Survey. The sample was selected from a population of all phone numbers of callers that received customer service through the 800 number during the month of March. As the samples were drawn, OQP excluded phone numbers sampled previously during the month, phone numbers that reached the 800 number 100 times or more within a single day, and calls made between 7 p.m. and 7 a.m. local time. Calls from blocked numbers, businesses, pay phones, or other locations where the customer is not identifiable were excluded during the interview process. Service to SSA's 800-number callers is provided by agents in teleservice centers and units in SSA's program service centers. The Field Offices (FO) and Hearing Offices (HO) do not provide customer service via the 800 number; therefore, OQP does not select FOs or HOs to participate in this survey.
Field Office Caller Survey
Each year, OQP selects a sample of 110 offices to participate in its FO Telephone Service Evaluation, a review in which calls are monitored from remote locations to ensure accuracy. OQP selects the sample without replacement from the current population of eligible FOs. Eligible FOs are offices that have not been selected in previous years.
From this initial sample of 110 FOs, OQP selects a sub-sample of offices to participate in the FO Caller Survey. The sub-sample is a systematic sample from the parent sample, after sorting the parent sample by telephone system, region, and area. Thus, the sample of offices included for the FO Caller Survey has a distribution of telephone system type, region, and area similar to the parent FO Telephone Service Evaluation sample. In FY 2006, the sub-sample of offices selected for the FO Caller Survey consisted of 75 of the 110 offices from the FO Telephone Service Evaluation. Because of phone system limitations, not all sub-sampled offices can be equipped to enable identification of phone numbers for survey sample selection purposes; in FY 2006, 52 of the 75 offices initially sub-sampled could be equipped as necessary. From the population of phone numbers identified in the sub-sampled offices, OQP selected a sample on a biweekly basis during the month of April of 2,600 phone numbers of field office callers to participate in the survey. OQP excluded from the sample all phone numbers that appeared to be invalid and phone numbers sampled in the previous biweekly selections for the FY 2006 survey. In addition, during the interview process, calls from blocked numbers, businesses, pay phones, or other locations where the customer was not identifiable were excluded, as were personal calls to field office employees.
The FO Caller Survey sample is drawn from all incoming lines an office has for public use. Although all FOs handle the same types of business over the phone, depending on the size of the office, some have one type of line and some have two types available to receive incoming calls. The sample selection process takes into account whether a particular office has one or two types of public lines available to ensure that all incoming phone numbers have an opportunity for selection. An approximately equal number of calls is sampled from each FO.
Office Visitor Survey
OQP performs the Office Visitor Survey on an annual basis. The FY 2006 Office Visitor Survey was conducted at 52 FOs and 13 HOs for the year. For each survey execution, the offices are selected without replacement from the current list of eligible FOs or HOs. Eligible offices are those that have not been selected in previous years.
While HOs are selected as a simple random sample, OQP selects FOs in a stratified random sample by region. The number of FOs from each region is proportional to the number of FOs in that region.
Each sampled office participates for 1 week of the 8-week survey period, which extends from July to September. During their designated week, sampled offices submit identifying information to OQP every day for each individual who visited. OQP selects a random sample of 5,000 customers at the rate of 130 per day from the reported population to participate in the Office Visitor Survey. Customer records without a valid address and visitors already sampled for the current survey in a previous daily selection were excluded from the sample selection. Office visitors of offices that were not part of the initial field office and hearing office sample did not have an opportunity for selection.
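The proportional-by-region selection of FOs can be sketched with a largest-remainder allocation; the regional office counts below are hypothetical (only the 1,336 total is stated in this report).

```python
def proportional_allocation(stratum_sizes, sample_size):
    """Allocate a sample across strata (regions) in proportion to each
    stratum's share of the population, using largest-remainder rounding
    so the allocations sum exactly to sample_size."""
    total = sum(stratum_sizes.values())
    quotas = {r: sample_size * n / total for r, n in stratum_sizes.items()}
    alloc = {r: int(q) for r, q in quotas.items()}
    remainder = sample_size - sum(alloc.values())
    # Give the leftover slots to the strata with the largest fractional parts.
    for r in sorted(quotas, key=lambda r: quotas[r] - alloc[r], reverse=True)[:remainder]:
        alloc[r] += 1
    return alloc

# Hypothetical regional FO counts summing to 1,336
regions = {"Boston": 90, "New York": 160, "Philadelphia": 150, "Atlanta": 250,
           "Chicago": 200, "Dallas": 180, "Kansas City": 90, "Denver": 50,
           "San Francisco": 120, "Seattle": 46}
alloc = proportional_allocation(regions, 52)
print(sum(alloc.values()))  # 52
```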
Calculation of the Survey Results
When determining the estimated percentage of Excellent (6) responses for overall satisfaction, SSA summed the weights for the Excellent responses and divided by the total weight of all valid responses:
Percent Excellent = ( Σj W6j ) / ( Σi Σj Wij )
where i is the value of the response (ranging from 1-6), j indexes the respondents, and Wij is the weight of the jth person whose response was i.
When determining the estimated percentage of Excellent/Very Good/Good (6/5/4) responses for overall satisfaction, SSA summed the weights for the Excellent/Very Good/Good responses and divided by the total weight of all valid responses:
Percent Excellent/Very Good/Good = ( Σi=4..6 Σj Wij ) / ( Σi=1..6 Σj Wij )
where i is the value of the response (ranging from 1-6), j indexes the respondents, and Wij is the weight of the jth person whose response was i.
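The weighted-percentage calculation can be illustrated as follows; the response values and survey weights are hypothetical.

```python
def weighted_percent(responses, weights, favorable):
    """Sum of weights for responses in `favorable`, divided by the
    total weight of all valid responses."""
    hit = sum(w for r, w in zip(responses, weights) if r in favorable)
    return hit / sum(weights)

responses = [6, 5, 4, 3, 6, 2]             # ratings on the 6-point scale
weights = [1.2, 0.8, 1.0, 1.1, 0.9, 1.0]   # hypothetical survey weights
pct_excellent = weighted_percent(responses, weights, {6})
pct_e_vg_g = weighted_percent(responses, weights, {4, 5, 6})
print(round(pct_excellent, 2), round(pct_e_vg_g, 2))  # 0.35 0.65
```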
Calculation of the Combined Overall Survey Result
SSA combined the results from the three surveys to derive one overall measure of customer satisfaction. The percentage of Excellent responses for a survey (or Excellent/Very Good/Good) was multiplied by its projected universe for each of the universes: 800 Number Caller, FO Caller, and Office Visitors (FO Visitor and HO Visitor). These numbers were then added together and divided by the total universe:
Overall Result = ( Σk Rk × Uk ) / ( Σk Uk )
where k represents either 800 Number Caller, FO Caller, or Office Visitors (FO Visitor and HO Visitor), Rk is the percentage of Excellent (or E/VG/G) responses in the kth universe, and Uk is the population total for the kth universe.
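The universe-weighted combination can be illustrated with hypothetical satisfaction rates and universe sizes (the actual figures are not stated in this appendix).

```python
def combined_satisfaction(results):
    """Universe-weighted combination: sum of R_k * U_k over the three
    surveys, divided by the total customer universe."""
    total = sum(u for _, u in results.values())
    return sum(r * u for r, u in results.values()) / total

# Hypothetical (rate, projected universe) pairs for the three surveys
surveys = {
    "800 Number Caller": (0.83, 60_000_000),
    "FO Caller": (0.80, 25_000_000),
    "Office Visitor": (0.82, 20_000_000),
}
print(round(combined_satisfaction(surveys), 3))  # 0.821
```

Because the 800-number universe dominates in this sketch, the combined result sits closest to that survey's rate, which is the intended effect of universe weighting.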
Appendix E
Prior Audit Recommendations
During a prior audit, "Performance Indicator Audit: Overall Service Rating (A-15-05-15118)," we made seven recommendations to the Social Security Administration (SSA) for the performance indicator "Percent of individuals who do business with SSA rating the overall service as 'excellent,' 'very good,' or 'good.'" During the current audit, we found that the following two recommendations had not been addressed:
1. Improve documentation of policies and procedures. The documentation should be complete and accurate to define all of the steps ultimately used to report performance results in the Performance Accountability Report. Estimates and calculations used to generate the survey results should be clearly documented and supported to readily enable an independent assessment. SSA management should annually update and take ownership of the documentation in compliance with Office of Management and Budget (OMB) A-123 guidance.
2. Provide a direct linkage of the performance indicator to the Agency's strategic goals and objectives. SSA should expressly comply with criteria established by the Government Performance and Results Act and OMB PART guidance, which require performance indicators to be linked to the Agency's relevant strategic goal and objective. If a direct linkage cannot be supported, the Agency should disclose the basis for selecting the indicator and why it deviates from established criteria. SSA should further consider indicators which include satisfaction from internet users and other technology investments that support the strategic objective.
During the fiscal year (FY) 2006 audit, SSA management stated that it would provide a direct linkage between this indicator and the Agency's strategic goals and objectives. Per SSA, this linkage will be available in the FY 2007 Performance and Accountability Report that is scheduled to be issued in November 2007.
Appendix F
Agency Comments
MEMORANDUM
Date: September 10, 2007
To: Patrick P. O'Carroll, Jr.
Inspector General
From: Larry W. Dye
Subject: Office of the Inspector General (OIG) Draft Report, "Performance
Indicator Audit: Customer Satisfaction" (A-15-07-17129)--INFORMATION
We appreciate OIG's efforts in conducting this review. Our comments on the recommendations are attached.
Please let me know if we can be of further assistance. Staff inquiries may be directed to Ms. Candace Skurnik, Director, Audit Management and Liaison Staff, on (410) 965-4636.
COMMENTS ON THE OFFICE OF THE INSPECTOR GENERAL'S (OIG) DRAFT REPORT, "PERFORMANCE INDICATOR AUDIT: CUSTOMER SATISFACTION" (A-15-07-17129)
Thank you for the opportunity to review and provide comments on this draft report. Our comments on the draft recommendations are as follows.
Recommendation 1
Increase the period of time covered through these surveys or gather and maintain evidence which supports the assertion that the results of these surveys would not vary if additional survey periods were used. In addition, if SSA continues to use the current sampling parameters, the Agency should ensure that the limited sampling periods are clearly articulated within the description of this performance measure in the Performance and Accountability Report (PAR).
Comment
We partially agree. We have taken the necessary action to ensure that the limited sampling period is clearly articulated within the description of this performance measure in the fiscal year (FY) 2007 PAR. We do not believe the benefits would be sufficient to justify the additional expenditure of resources, either in staff time or contractor funding, that would be necessary to support an increase in the period of time covered through these surveys. As we explained in our response to the FY 2005 audit recommendation, the 800 Number Caller Survey began in the late 1980s on a quarterly basis and was later reduced to a semi-annual and subsequently an annual survey because the fluctuations in results were minimal and resources to conduct the surveys were constrained. Similarly, the Field Office Caller and Office Visitor Surveys were originally implemented on a semi-annual basis and later became annual surveys because, given the consistency of results, the resource expenditure could not be justified.
Recommendation 2
Periodically analyze respondent and non-respondent characteristics and their respective representation of the population. Segments of the population lacking coverage may be an indication of non-response bias. Model-based techniques using information gathered on respondents may be helpful to assess the impact of non-response.
Comment
We agree in theory. However, as we explained in our response to the FY 2005
audit recommendation, for the performance measure surveys, there is little meaningful
data available regarding characteristics of the sample population that could
be correlated to customer satisfaction. The sample data available for the 800
Number and Field Office Caller Surveys consists essentially of the incoming
telephone number and the date and time of the call and does not include any
demographic or programmatic information about the caller. Sample data for the
Office Visitor Survey does include the broadly categorized reason for the visit,
which we agree
could be a characteristic to analyze for responders and non-responders in relation
to satisfaction. We will undertake this analysis for the Office Visitor Survey
that will be used to calculate the FY 2007 performance indicator. It should
also be noted that for each of the surveys, we compare the proportions of the
various categories of nonresponse (refusal, unable to reach the caller despite
repeated attempts, etc.) to previous years to determine whether there have been
any major changes in the nature of nonresponse.
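The year-over-year comparison of nonresponse categories that the comment describes can be sketched as follows. The category names, counts, and the 5-percentage-point threshold are all invented for illustration; SSA's actual procedure and tolerances are not specified in the report.

```python
# Hypothetical sketch of a year-over-year nonresponse comparison.
# Category names, counts, and the flagging threshold are invented.

def nonresponse_proportions(counts):
    """Convert raw nonresponse counts into proportions of all nonresponse."""
    total = sum(counts.values())
    return {category: n / total for category, n in counts.items()}

# Invented counts of nonresponse, by category, for two survey years
prior = {"refusal": 120, "unreachable": 300, "other": 80}
current = {"refusal": 150, "unreachable": 310, "other": 90}

prior_p = nonresponse_proportions(prior)
current_p = nonresponse_proportions(current)

# Flag any category whose share of nonresponse shifted by more than
# 5 percentage points, as a possible change in the nature of nonresponse
shifts = {k: current_p[k] - prior_p[k] for k in prior}
flagged = [k for k, d in shifts.items() if abs(d) > 0.05]
```

A stable mix of nonresponse categories from year to year (an empty `flagged` list here) is consistent with, though not proof of, unchanged nonresponse behavior.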
Overview of the Office of the Inspector General
The Office of the Inspector General (OIG) comprises our Office of Investigations
(OI), Office of Audit (OA), Office of the Chief Counsel to the Inspector General
(OCCIG), and Office of Resource Management (ORM). To ensure compliance with
policies and procedures, internal controls, and professional standards, we also
have a comprehensive Professional Responsibility and Quality Assurance program.
Office of Audit
OA conducts and/or supervises financial and performance audits of the Social
Security Administration's (SSA) programs and operations and makes recommendations
to ensure program objectives are achieved effectively and efficiently. Financial
audits assess whether SSA's financial statements fairly present SSA's financial
position, results of operations, and cash flow. Performance audits review the
economy, efficiency, and effectiveness of SSA's programs and operations. OA
also conducts short-term management and program evaluations and projects on
issues of concern to SSA, Congress, and the general public.
Office of Investigations
OI conducts and coordinates investigative activity related to fraud, waste,
abuse, and mismanagement in SSA programs and operations. This includes wrongdoing
by applicants, beneficiaries, contractors, third parties, or SSA employees performing
their official duties. This office serves as OIG liaison to the Department of
Justice on all matters relating to the investigations of SSA programs and personnel.
OI also conducts joint investigations with other Federal, State, and local law
enforcement agencies.
Office of the Chief Counsel to the Inspector General
OCCIG provides independent legal advice and counsel to the IG on various matters,
including statutes, regulations, legislation, and policy directives. OCCIG also
advises the IG on investigative procedures and techniques, as well as on legal
implications and conclusions to be drawn from audit and investigative material.
Finally, OCCIG administers the Civil Monetary Penalty program.
Office of Resource Management
ORM supports OIG by providing information resource management and systems security.
ORM also coordinates OIG's budget, procurement, telecommunications, facilities,
and human resources. In addition, ORM is the focal point for OIG's strategic
planning function and the development and implementation of performance measures
required by the Government Performance and Results Act of 1993.