
The Federal Bureau of Prisons Inmate Release Preparation and Transitional Reentry Programs

Report No. 04-16
March 2004
Office of the Inspector General


Findings and Recommendations

I. REENTRY PROGRAM COMPLETIONS

The BOP does not demonstrate that its institutions maximize the number of inmates who complete programs designed to prepare inmates for successful reentry into society.  We found that the BOP does not ensure that:  (1) institutions set realistic occupational and educational completion goals, (2) institutions are held accountable for meeting goals, (3) data for occupational and educational programs is reviewed to identify low performance, and (4) statistical data related to psychological programs and RPP performance is maintained and utilized.

As stated previously in this report, the research we reviewed related to inmate recidivism concludes that the completion of occupational, educational, psychological, and other programs during incarceration leads to a reduction in recidivism and an increase in post-release employment opportunities. We reviewed the types of programs offered by the BOP and found that the 82 institutions included in our audit offer a full range of occupational, educational, psychological, and other programs.  We compared the programs offered by the BOP to the research and concluded that the BOP offers the types of programs that have been shown in these studies to better prepare inmates for successful reentry into society.  Therefore, our audit focused on whether the BOP ensures that each of its institutions maximizes the number of inmates that successfully complete its reentry programs.

To determine the process by which the BOP monitors its reentry programs, we conducted site visits at institutions of each security level, three regional offices, and the BOP Central Office.  At the institutions, we reviewed inmate files to determine whether the unit teams assessed inmate reentry program needs and monitored inmate participation in reentry programs.  We found that at the institutions we visited, release planning was continuous from initial classification through final release as documented in inmate files.  Additionally, our review of the inmate files revealed that the BOP staff strongly encouraged participation in its reentry programs. At the BOP Central Office and regional offices, we identified the process by which the BOP monitors reentry program performance at its institutions.  We found that the BOP relies on its program review process and staff assistance visits conducted by regional office officials to monitor reentry program performance.  We reviewed program review reports and staff assistance reports prepared by the BOP and found that the reports generally focused on compliance with BOP policies rather than actual program performance.

The BOP has a process that requires each institution to establish annual program completion goals.  If this process is used effectively, it could ensure that each of its institutions maximizes the number of inmates that participate in and complete occupational and educational programs.  Each fiscal year the institution’s Supervisor of Education is required to report on achievements towards the occupational and educational program goals in an Annual Program Report for Education and Recreation Services.  The BOP does not require its institutions to establish goals and outcomes for psychological programs or the RPP.

According to BOP officials, since 1998 the BOP’s Education Branch has been working systematically to establish an effective strategic management process for monitoring and evaluating education program outcomes.  This effort included the development of draft outcome-based program review guidelines, which were issued in December 2003; directing regional staff to negotiate FY 2003 education completion goals with the institutions; and the development of a Quarterly Performance Indicator Report, operational since February 2002, that provides detailed educational program data to the institutions for verification.

For each of the 82 institutions included in our audit, we reviewed the institution’s Annual Program Report for Education and Recreation Services for FY 1999 through FY 2002 to determine whether the institutions met their occupational and educational program completion goals.  We analyzed the annual reported completion goals and the institution’s outcomes for occupational, GED, ESL, ACE, and parenting programs to determine whether stated goals were achieved and, as a result, whether the BOP as a whole was able to maximize the number of inmates who completed these reentry programs.  For FY 2002, we also reviewed the BOP’s goals and outcomes for the overall percentage of inmates enrolled in one or more educational programs during the year.

We also compared the number of inmates who completed occupational and educational programs to the number of inmates who eventually withdrew from the programs.  We calculated a program performance factor, based on the number of completions divided by the number of completions plus total withdrawals for each fiscal year.  (We used completion and withdrawal data that was reported in the BOP’s Key Indicators system for educational and occupational programs to calculate the performance factor.)  In our judgment, this comparison is an important indicator of an institution’s success and can be used to compare program performance among institutions.  We were unable to analyze the percentage of inmates who completed the BOP’s psychological programs and the RPP because the BOP does not maintain completion and withdrawal statistics for these programs in its Key Indicators system.  The results of our performance factor calculations and comparison are described later in this report.
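The performance factor described above can be sketched as a small calculation.  The figures in the example below are invented for illustration and do not come from the BOP’s Key Indicators data:

```python
def performance_factor(completions: int, withdrawals: int):
    """Program performance factor as described in the audit methodology:
    completions divided by completions plus total withdrawals."""
    exits = completions + withdrawals
    if exits == 0:
        return None  # no completions or withdrawals reported for the year
    return completions / exits


# Hypothetical institution: 103 completions and 37 withdrawals in a fiscal year.
factor = performance_factor(103, 37)
print(f"{factor:.0%}")  # prints 74%
```

Because the denominator counts only inmates who exited the program (by completing or withdrawing), the factor isolates how well a program retains its participants through completion, independent of enrollment volume.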

Institution’s Annual Goals and Outcomes

As stated previously, the BOP stated that it has been working to establish an effective strategic management process for monitoring and evaluating occupational and educational goals and outcomes since 1998.  However, we found that the BOP has not implemented a standardized process followed by all institutions to establish occupational and educational completion goals.  Our review of the goals and outcomes reported in each institution’s Annual Program Report for Education and Recreation Services for FY 1999 through FY 2002 revealed that the institutions did not always set realistic occupational, GED, ACE, and parenting goals.  The institutions, in conjunction with the BOP regional offices, establish their own completion goals.  Our review revealed that the goal setting process is inadequate and inconsistent, resulting in institutions setting their goals too high or too low when compared to the prior year’s performance.  We found instances where institutions consistently exceeded their goals for each fiscal year by a significant margin, yet failed to establish goals for the following fiscal year that adequately reflected prior years’ outcomes.

  • One institution with an occupational completion goal of 35 inmates and an actual outcome of 103 inmates completing the program in FY 2001, decreased its occupational completion goal to 20 inmates in FY 2002, but had an actual outcome of 111 inmates completing the program.
  • One institution with an ESL completion goal of 60 inmates and an actual outcome of 74 inmates completing the program in FY 2001, kept the same ESL completion goal of 60 inmates in FY 2002, and had an actual outcome of 72 inmates completing the program.  The same ESL completion goal of 60 inmates was also established for FY 2003.
  • One institution with an ACE completion goal of 65 inmates and an actual outcome of 192 inmates completing the program in FY 2001, only increased its ACE completion goal to 120 inmates in FY 2002, and had an actual outcome of 293 inmates completing the program.

Conversely, we found that institutions consistently missed their goals by a significant margin, yet failed to establish goals for the following fiscal year that reflected prior years’ outcomes.  For example, we found the following instances where the established goals appear inconsistent with the prior year’s performance and the annual report did not include an adequate explanation for the increase or decrease in the goals from the prior year.

  • One institution with a GED completion goal of 150 inmates had an actual outcome of 98 inmates completing the program in FY 1999, but increased its GED completion goal to 240 inmates in FY 2000, and had an actual outcome of 161 inmates completing the program.
  • One institution with an ACE completion goal of 88 inmates had an actual outcome of 49 inmates completing the program in FY 2000, but increased its FY 2001 ACE goal to 132 completions.

Additionally, we found a lack of consistency in setting goals between institutions with similar security levels and populations.  These institutions had set very different goals for the fiscal year.  Further, we found that the program completion goals are stated as the number of completions rather than a percentage of completions, which does not take into account the number of enrollments or the effect the inmate population could have when comparing among institutions.
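The distortion that count-based goals introduce can be illustrated with two hypothetical institutions (all figures below are invented for illustration):

```python
# Two hypothetical institutions of similar security level but different size.
institutions = {
    "Institution A": {"completions": 100, "enrollments": 400},
    "Institution B": {"completions": 60,  "enrollments": 120},
}

for name, data in institutions.items():
    rate = data["completions"] / data["enrollments"]
    print(f"{name}: {data['completions']} completions, "
          f"{rate:.0%} of enrollees completed")

# Institution A reports more raw completions (100 vs. 60), but a far lower
# completion rate (25% vs. 50%).  A goal stated only as a number of
# completions would rank A ahead of B despite B's stronger performance.
```

A percentage-based goal normalizes for enrollment and population size, which is why it supports comparison among institutions while a raw count does not.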

During our audit, BOP officials we interviewed agreed that the BOP should standardize the goal setting process among institutions to enhance consistency based on security level and population.  In our judgment, the factors considered in setting program goals should include not only the security level of the institution and the inmate population, but also other factors such as classroom size, number of classes, number of instructors, whether the institution has a wait list for its program, and historical program completion data.

Although our audit concluded that the BOP’s goal setting process is inadequate and inconsistent, we found that the completion goals were the only available source of data within the BOP we could use to determine whether the BOP institutions maximize the number of inmates that complete occupational and educational reentry programs.  Therefore, we analyzed the goals and outcomes reported for each institution’s occupational, GED, ESL, ACE, and parenting programs using this information.  For FY 2002, we were also able to use the National Strategic Plan goal and outcome for the percentage of inmates enrolled in one or more educational programs during the fiscal year at each institution, which was only included in the annual report for FY 2002.  The details of our analysis of each institution’s completion goals and outcomes for FY 2001 and FY 2002 are included in Appendix IV for occupational programs, Appendix V for GED programs (FY 2001 only), Appendix VI for ESL programs, Appendix VII for ACE programs, and Appendix VIII for parenting programs.

Overall, based on the BOP’s reported information, we found that during FY 1999 through FY 2002, a large percentage of the 82 institutions included in our audit failed to meet their annual completion goals established in the Annual Program Report for Education and Recreation Services for occupational, GED, ESL, ACE, and parenting programs, as shown in the following chart.

Percentage of Institutions Failing to Meet
Completion Goals by Program
FY 1999 through FY 2002

FY     Occupational   GED     ESL    ACE    Parenting
1999   38%            49%     55%    38%    34%
2000   42%            51%     58%    31%    53%
2001   51%            39%     66%    47%    49%
2002   64%            N/A40   69%    45%    47%

We also found that 46 percent of the institutions we looked at failed to meet their stated National Strategic Plan goal for the percentage of inmates enrolled in one or more education programs in FY 2002 (Appendix IX).

To determine which factors may have contributed to the large percentages of institutions not meeting their occupational and educational goals, we sent questionnaires to 24 of the 82 institutions included in our audit.  We selected our sample based on institution security level and inmate population as of the end of FY 2002.  For each of the four security levels (minimum, low, medium, and high), we selected six institutions consisting of the two institutions with the largest inmate populations, the two institutions with the lowest inmate populations, and the two institutions with inmate populations in the middle range.  We received a response to our questionnaire from all 24 institutions included in our sample.  Based on the responses to our questions, we found that institutions that met their completion goals cited the following factors:

  • The education department at the institution was fully staffed or received increased staffing during the years that the goals were met or exceeded.
  • The institution added additional classes to meet its goals.
  • The inmate population at the institution increased resulting in increased enrollments.
  • The institution increased enrollments by shortening weekly class time, which allowed more inmates to enroll, or by increasing class size.
  • Institution staff encouraged inmates not to withdraw from the reentry programs.
  • Institution staff screened inmates prior to enrollment in voluntary programs to ensure that they have the ability and are willing to commit to completing the courses.
  • The unit team assisted in providing inmates with information about the program benefits and encouraged participation.

Based on the responses to our questions, we found that institutions that did not meet their completion goals cited the following factors:

  • The education department was not fully staffed.
  • The Supervisor of Education did not receive GED testing authorization for a significant period of time; as a result, no testing was performed during the period.
  • The institution experienced a decrease in the inmate population or the number of inmates with GED needs.
  • The institution’s GED testing license was suspended because of a GED testing security breach.
  • The institution experienced prolonged periods of lock-down.
  • The institution reduced the number of classes offered.
  • Inmates did not complete the course because the course was too long.
  • Inmates were transferred prior to completing the program.

Overall, we noted from the responses that institutions that met their goals appear to promote a proactive management approach (i.e., strongly encouraging inmate participation, screening applicants, and unit team involvement) and effectively used their available resources (i.e., shortening class time and adding more classes).  Those institutions that did not meet their goals attributed their failure to inadequate staffing, an insufficient number of classes offered, and factors outside the control of staff such as inmate population, inmate transfers, and prolonged periods of lock-down.  However, in our judgment some of these adverse factors could have been recognized and mitigated if, as noted below, the BOP had a process in place to determine why goals were not met and timely action taken to remedy poor performance.

We conducted interviews with BOP officials related to the large percentage of institutions that failed to meet their annual occupational and educational goals during FY 1999 through FY 2002.  BOP officials stated that currently institutions are not held accountable for failing to meet their goals.  BOP officials also stated that the Central Office has been moving towards performance monitoring since 1998.  Currently, institutions are held accountable only for the program review guidelines, which focus on compliance with BOP policy rather than performance.

Despite the fact that the BOP’s annual report process was in place and a large number of institutions failed to meet one or more of their annual occupational or educational goals during FY 1999 through FY 2002, the BOP did not have a mechanism to assess the information included in the required reports, hold institutions accountable, or redirect resources to meet emerging deficiencies.  Institutions were not required to develop or implement corrective action plans to remedy performance and ensure that their goals are met in the future.  In our judgment, the BOP’s failure to hold institutions accountable for low performance contributed to its institutions not meeting their completion goals. 

During our audit, the BOP issued a draft program review guideline that includes a review of the institution’s performance towards meeting its occupational, GED, and percent participation program goals.  We noted that the BOP’s program reviews generally only occur every 3 years for institutions that receive a superior or good program review rating, every 2 years for institutions that receive an acceptable rating, and every 18 months for institutions that receive a deficient rating.  As a result, it could be up to 3 years before corrective actions are taken for institutions that failed to meet their annual occupational and educational program goals.  Further, the draft program review guidelines do not include a review of the ESL, ACE, or parenting program goals.  In our judgment, the draft program review guidelines are not sufficient to ensure that corrective actions are implemented timely.  We recommend that the BOP establish and implement an annual process to ensure that institutions are held accountable for meeting their occupational and educational goals and that corrective action plans are developed to remedy program performance and ensure that goals are met in future fiscal years.

We also noted during our audit that the FY 2001 Annual Program Report for Education and Recreation Services only includes the occupational and educational outcomes reported for FY 2000 and FY 2001, and the projected outcomes for FY 2002.  The FY 2001 report does not include the FY 2001 occupational and educational goals for comparison with the outcomes.  The FY 2001 goals can only be found in the FY 2000 Annual Program Report for Education and Recreation Services.  Similarly, the FY 2002 Annual Program Report for Education and Recreation Services does not include the FY 2002 occupational and educational goals for comparison with the outcomes.  We recommend that the Annual Program Report for Education and Recreation Services be revised to include both the goals and outcomes for the reported fiscal year, so that the BOP can readily determine whether its institutions meet their completion goals.

Percentage of Reentry Program Completions

In addition to reviewing each institution’s occupational and educational goals and outcomes, we also compared the number of inmates who completed occupational and educational programs to the number of inmates who eventually withdrew from the programs.  We calculated a program performance factor based on the number of completions divided by the number of completions plus total withdrawals for each fiscal year.41  In our judgment, this comparison is an important indicator of an institution’s success and can be used to compare program performance among institutions.

According to BOP officials, the security level of an institution is one of the factors that can have an impact on occupational and educational program performance.  For example, BOP officials stated that inmates at high security institutions are more likely to refuse programs because they are generally serving longer sentences than inmates at minimum, low, and medium security institutions.  As a result, we also analyzed the performance factor by security level and determined the range in performance among the institutions within the same security level.

The BOP offers two types of occupational programs – occupational technical programs and occupational vocational programs (which includes apprenticeship programs).  We calculated the performance factor for both the occupational technical and vocational programs during FY 1999 through FY 2002, based on completion and withdrawal data reported in the BOP’s Key Indicators system for each of the institutions included in our audit.  The details of our calculations and analysis of each institution’s performance factors for FY 2001 and FY 2002 are included in Appendix IX for occupational technical programs and Appendix X for occupational vocational programs.

Based on the occupational technical and vocational performance factors, we found that during FY 1999 through FY 2002 there was a significant range in the percentage of inmates that completed occupational technical and vocational programs for each security level, as shown below.

Range in Performance Factors
Among Minimum Security Institutions
FY 1999 through FY 2002
       Occupational Technical           Occupational Vocational
FY     Institutions   Performance       Institutions   Performance
       Reporting      Factor Range      Reporting      Factor Range
1999   2              100%              10             0% - 100%
2000   1              100%              10             25% - 100%
2001   1              100%              11             50% - 100%
2002   6              75% - 100%        11             0% - 98%
Source:  The OIG analysis of the completions and withdrawals for occupational technical and occupational vocational programs reported for each minimum security institution in the BOP’s Key Indicators system.
Range in Performance Factors
Among Low Security Institutions
FY 1999 through FY 2002
       Occupational Technical           Occupational Vocational
FY     Institutions   Performance       Institutions   Performance
       Reporting      Factor Range      Reporting      Factor Range
1999   13             0% - 100%         20             8% - 98%
2000   15             0% - 98%          22             0% - 100%
2001   15             0% - 100%         24             0% - 100%
2002   18             6% - 100%         24             0% - 100%
Source:  The OIG analysis of the completions and withdrawals for occupational technical and occupational vocational programs reported for each low security institution in the BOP’s Key Indicators system.
Range in Performance Factors
Among Medium Security Institutions
FY 1999 through FY 2002
       Occupational Technical           Occupational Vocational
FY     Institutions   Performance       Institutions   Performance
       Reporting      Factor Range      Reporting      Factor Range
1999   13             0% - 100%         30             0% - 100%
2000   13             0% - 99%          30             17% - 100%
2001   12             55% - 100%        32             0% - 100%
2002   23             0% - 100%         31             25% - 100%
Source:  The OIG analysis of the completions and withdrawals for occupational technical and occupational vocational programs reported for each medium security institution in the BOP’s Key Indicators system.
Range in Performance Factors
Among High Security Institutions
FY 1999 through FY 2002
       Occupational Technical           Occupational Vocational
FY     Institutions   Performance       Institutions   Performance
       Reporting      Factor Range      Reporting      Factor Range
1999   2              79% - 97%         9              0% - 100%
2000   3              75% - 99%         8              0% - 100%
2001   5              0% - 100%         9              0% - 97%
2002   9              0% - 100%         11             0% - 98%
Source:  The OIG analysis of the completions and withdrawals for occupational technical and occupational vocational programs reported for each high security institution in the BOP’s Key Indicators system.

As shown in the previous tables, there are significant ranges in the performance factors among institutions of the same security level.  The following charts further demonstrate the wide range in performance factors among institutions of the same security level for FY 2002.

Occupational Technical FY 2002 Performance Factor - Minimum Security (chart).  A text version of this data is in Appendix 10, in the FY 2002 performance factor column.
Source:  The OIG analysis of the FY 2002 performance factor for occupational technical programs reported for each minimum security institution in the BOP’s Key Indicators system.

Occupational Vocational FY 2002 Performance Factor - Minimum Security (chart).  A text version of this data is in Appendix 11, in the FY 2002 performance factor column.
Source:  The OIG analysis of the FY 2002 performance factor for occupational vocational programs reported for each minimum security institution in the BOP’s Key Indicators system.

Occupational Technical FY 2002 Performance Factor - Low Security (chart).  A text version of this data is in Appendix 10, in the FY 2002 performance factor column.
Source:  The OIG analysis of the FY 2002 performance factor for occupational technical programs reported for each low security institution in the BOP’s Key Indicators system.

Occupational Vocational FY 2002 Performance Factor - Low Security (chart).  A text version of this data is in Appendix 11, in the FY 2002 performance factor column.
Source:  The OIG analysis of the FY 2002 performance factor for occupational vocational programs reported for each low security institution in the BOP’s Key Indicators system.

Occupational Technical FY 2002 Performance Factor - Medium Security (chart).  A text version of this data is in Appendix 11, in the FY 2002 performance factor column.
Source:  The OIG analysis of the FY 2002 performance factor for occupational technical programs reported for each medium security institution in the BOP’s Key Indicators system.

Occupational Vocational FY 2002 Performance Factor - Medium Security (chart).  A text version of this data is in Appendix 11, in the FY 2002 performance factor column.
Source:  The OIG analysis of the FY 2002 performance factor for occupational vocational programs reported for each medium security institution in the BOP’s Key Indicators system.

Occupational Technical FY 2002 Performance Factor - High Security (chart).  A text version of this data is in Appendix 11, in the FY 2002 performance factor column.
Source:  The OIG analysis of the FY 2002 performance factor for occupational technical programs reported for each high security institution in the BOP’s Key Indicators system.

Occupational Vocational FY 2002 Performance Factor - High Security (chart).  A text version of this data is in Appendix 11, in the FY 2002 performance factor column.
Source:  The OIG analysis of the FY 2002 performance factor for occupational vocational programs reported for each high security institution in the BOP’s Key Indicators system.

Despite the wide range in the percentage of inmates that completed occupational technical and vocational programs at institutions of the same security level, we found that, as with the occupational and educational completion goals and outcomes, the BOP did not have a formal process for reviewing performance data at each institution to identify low performance.

To determine the factors that may have contributed to an institution having a low, average, or high performance factor for its occupational technical and vocational programs for FY 1999 through FY 2001, we included questions pertaining to the performance factors in our questionnaires to 24 of the 82 institutions.  Based on the questionnaires, we found that institutions that had a high performance factor for occupational technical and vocational programs most commonly cited the following factors:

  • Screening of inmates prior to enrollment to ensure that they have the ability and are willing to commit to completing the course.
  • Support from the unit team to ensure that inmates are not transferred prior to completing the course.
  • Shortening weekly class time to allow more inmates to complete the program.
  • Expanding the occupational program.
  • Offering programs based on inmate interests.

Based on the responses to our questions, we found that institutions that had a low performance factor for occupational technical and occupational vocational programs most commonly cited the following factors:

  • Occupational programs were not fully staffed.
  • Programs were eliminated because of contract or security reasons.
  • Inmates were released or transferred prior to completing the program.
  • Inmates withdrew from the program in order to maintain work assignments.
  • Curriculum was too difficult or too long for inmates to complete.

Similar to the responses we received related to goals and outcomes, institutions with high performance rates appear to be proactive in the management and use of available resources (i.e., inmate screening, sufficient number of classes offered to meet inmate needs, and strongly encouraging inmate participation).  Those institutions that did not meet their goals attributed their failure to inadequate staffing, difficult classes, and factors outside the control of staff such as inmate transfers and security issues.

We also reviewed the GED performance factor as reported in the BOP’s Key Indicators system.  Similar to the other areas we reviewed, we found a wide range in the percentage of inmates that completed the GED program among the BOP institutions.  For example, during FY 2002:

  • the GED performance factor for minimum security institutions ranged from 22 to 57 percent,
  • the GED performance factor for low security institutions ranged from 8 to 45 percent,
  • the GED performance factor for medium security institutions ranged from 0 to 45 percent, and
  • the GED performance factor for high security institutions ranged from 7 to 53 percent.

Our discussions with BOP officials during the audit revealed that they did not believe that the GED performance factor included in the BOP’s Key Indicators system was an accurate assessment of its institutions’ literacy programs.  BOP officials felt that, since the GED literacy program is mandatory, a GED performance factor based only on voluntary withdrawals would be a better measure of performance because involuntary withdrawals are often outside the control of the education department (e.g., the inmate may have been released or transferred prior to completing the program).

To account for the BOP’s concerns, we recalculated the GED performance factor based only on voluntary withdrawals for each of the institutions included in our audit.42  The results of our calculation revealed higher completion rates for each institution; however, we continued to find wide ranges among institutions in the percentage of inmates that completed the GED program.  For example, during FY 2002:

  • the GED performance factor based on voluntary withdrawals for minimum security institutions now ranged from 63 to 99 percent (compared to the 22 to 57 percent reported),
  • the GED performance factor based on voluntary withdrawals for low security institutions now ranged from 49 to 97 percent (compared to the 8 to 45 percent reported),
  • the GED performance factor based on voluntary withdrawals for medium security institutions now ranged from 0 to 79 percent (compared to the 0 to 45 percent reported), and
  • the GED performance factor based on voluntary withdrawals for high security institutions ranged from 17 to 82 percent (compared to the 7 to 53 percent reported).
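The difference between the two measures, counting all withdrawals versus voluntary withdrawals only, can be sketched as follows; the institution figures are hypothetical:

```python
def ged_performance_factor(completions: int, voluntary_wd: int,
                           involuntary_wd: int, voluntary_only: bool = False):
    """GED performance factor: completions over completions plus withdrawals.
    With voluntary_only=True, involuntary withdrawals (e.g., releases and
    transfers) are excluded, per the alternative measure BOP officials
    suggested."""
    withdrawals = voluntary_wd if voluntary_only else voluntary_wd + involuntary_wd
    exits = completions + withdrawals
    return completions / exits if exits else None


# Hypothetical institution: 40 completions, 20 voluntary withdrawals,
# and 60 involuntary withdrawals during the fiscal year.
print(f"{ged_performance_factor(40, 20, 60):.0%}")                       # 33%
print(f"{ged_performance_factor(40, 20, 60, voluntary_only=True):.0%}")  # 67%
```

As the example shows, excluding involuntary withdrawals raises the factor substantially, which is consistent with the higher completion rates our recalculation produced, yet the relative spread among institutions can remain wide.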

When we presented this analysis to the BOP education officials, they stated that they also did not believe that a GED performance factor based on voluntary withdrawals provided an accurate assessment of the literacy program performance at its institutions.  They felt that factors such as inmates dropping one literacy class and subsequently enrolling in a different class could increase the number of voluntary withdrawals and negatively impact the GED performance factor based on voluntary withdrawals.  However, we were unable to obtain any data from the BOP that supports that the wide range in performance is solely a result of factors such as inmates dropping one literacy class and subsequently enrolling in a different class.

In our judgment, the BOP needs to develop a suitable measure of literacy program performance at its institutions.  If BOP officials believe that the GED performance factor included in its Key Indicators system does not accurately measure literacy program performance, then it should be changed.  The new performance measure should provide an accurate picture of the percentage of all inmates who arrive at the BOP institutions without a GED credential or high school diploma and complete the literacy program during incarceration.

In the absence of reliable information to measure GED program performance, we asked BOP officials how they monitor the GED program performance at the institutions.  They stated that they closely track the percentage of citizen inmates required to participate in the literacy program that have dropped out and are therefore not promotable above the maintenance pay grade for BOP work programs.  These inmates are designated as GED Dropped Non-promotable (GED DN) in the BOP’s SENTRY system.43  The BOP’s Key Indicators system also includes data on the number and percentage of GED Dropped Non-promotable inmates at each institution as of the end of the fiscal year.44

In our judgment, the percentage of GED Dropped Non-promotable inmates based on the BOP’s SENTRY system also is not an accurate measure of literacy program performance.  The percentage of GED Dropped Non-promotable inmates does not provide an accurate picture of the total number of inmates that arrive at the institution without a GED credential or high school diploma who subsequently complete the literacy program during incarceration.  Further, the percentage is based on the total institution population that includes inmates who arrive at the institution with a GED credential or high school diploma and inmates whose GED status is unknown.

Nonetheless, since BOP officials indicated that they closely track the percentage of GED Dropped Non-promotable inmates, and this information is included in the BOP’s Key Indicators system, we also reviewed these percentages for each security level and each institution included in our audit for FY 1999 through FY 2002.  The details of our review of the percentage of GED Dropped Non-promotable inmates at each institution for FY 2001 and FY 2002 are included in Appendix XI.  We found that for each security level the percentage of GED Dropped Non-promotable inmates generally decreased from FY 1999 through FY 2002.  However, as with the other areas we looked at, we found a significant range among institutions at the same security level.  Specifically,

  • For FY 1999, there was an 88 percent difference in the percentage of GED Dropped Non-promotable inmates between the lowest and highest minimum security institutions, a 90 percent difference between the low security institutions, a 96 percent difference between the medium security institutions, and a 60 percent difference between the high security institutions.
  • For FY 2000, there was a 90 percent difference in the percentage of GED Dropped Non-promotable inmates between the lowest and highest minimum security institutions, an 85 percent difference between the low security institutions, a 94 percent difference between the medium security institutions, and a 72 percent difference between the high security institutions.
  • For FY 2001, there was a 93 percent difference in the percentage of GED Dropped Non-promotable inmates between the lowest and highest minimum security institutions, a 93 percent difference between the low security institutions, a 91 percent difference between the medium security institutions, and a 60 percent difference between the high security institutions.
  • For FY 2002, there was a 96 percent difference in the percentage of GED Dropped Non-promotable inmates between the lowest and highest minimum security institutions, a 93 percent difference between the low security institutions, a 93 percent difference between the medium security institutions, and a 59 percent difference between the high security institutions.

We also noted that the BOP maintains data on the percentage of noncitizen inmates required to participate in the literacy program that have dropped out and are therefore subject to a loss of good conduct time.  These inmates are designated as Exempt GED Non-promotable (GED XN) in the BOP’s SENTRY system.  The BOP’s Key Indicators system also includes data on the number and percentage of Exempt GED Non-promotable inmates at each institution as of the fiscal year-end.

We reviewed the percentage of Exempt GED Non-promotable inmates for each security level during FY 1999 through FY 2002 based on the data contained in the BOP’s Key Indicators system.  We found that, with the exception of minimum security institutions, for each security level as a whole the percentage of Exempt GED Non-promotable inmates increased from FY 1999 through FY 2002, as shown in the following table.

Exempt GED Non-promotable (XN) Inmates by Security Level, FY 1999 through FY 2002
Security Level    FY 1999    FY 2000    FY 2001    FY 2002
Minimum             0.20%      0.10%      0.20%      0.20%
Low                17.50%     19.30%     21.00%     21.00%
Medium              6.50%      9.60%     11.50%     11.30%
High                3.00%      4.50%      5.10%      5.50%
Source:  The OIG analysis of the percentage of Exempt GED Non-promotable inmates at each security level reported in the BOP’s Key Indicators system for FY 1999 through FY 2002.

As shown above, the percentage of Exempt GED Non-promotable inmates increased from FY 1999 through FY 2002 for each security level except minimum security.  Therefore, in our judgment, the BOP has not adequately monitored the percentage of noncitizen inmates that have dropped out of the GED program.

Psychological Programs and RPP Participation

As stated previously, we were unable to analyze the percentage of inmates who complete the BOP’s psychological programs because the BOP does not maintain completion and withdrawal statistics for these programs in its Key Indicators system.

In its budget, the BOP tracks the following performance indicators related to its psychological programs:

  • percentage of inmates in residential drug treatment,
  • number of inmates in non-residential drug treatment,
  • percentage of intake assessments,
  • number of individual therapy/crisis counseling sessions provided, and
  • number of suicide risk assessments.

However, these performance indicators only cover a small portion of the psychological programs offered by the BOP.

Prior to January 2003, the BOP did not report performance data for most of its psychological programs.  As a result, we were unable to use such data to analyze performance trends, such as completion rates, failure rates, and withdrawal rates.

Since January 2003, the BOP has reported monthly participation data for the majority of its psychological programs; however, this data is still not included in its Key Indicators system.  The data provided in the monthly reports prepared by the BOP’s Office of Research and Evaluation includes: (1) admissions, (2) completions, (3) expulsions, (4) failures, (5) withdrawals, (6) incompletes, and (7) waiting lists for the following programs:

  • Residential Drug Abuse Treatment Program,
  • Non-Residential Drug Abuse Treatment Program,
  • Transitional Drug Abuse Treatment Program,
  • BRAVE Program,
  • CODE Program,
  • New Pathways Program, and
  • Sex Offender Treatment Program.

BOP officials stated that the participation data is a tool but not necessarily a measure of performance.  For example, BOP officials stated that a large failure or expulsion rate in a psychological program is not necessarily an indication of low performance because sometimes expulsions are necessary to hold inmates accountable for their actions.

BOP officials also stated that it is difficult to evaluate the performance of psychological programs because these programs deal with human behavior, which is not easily measured by completion rates or other data, and that statistical data for these programs is best used as a tool to evaluate trends over time.  In our judgment, participation data of this nature is also relevant and should be used by management as an indicator of potential immediate concerns.  For example, a large number of withdrawals could indicate that inmates are being transferred to another institution before completing a program.  In addition, a large number of expulsions, failures, and withdrawals may indicate a problem related to a specific psychological program at an institution.

At each of the three regional offices included in our audit, we identified the process for evaluating the monthly psychological program participation data for the institutions in their respective areas.  We found that the BOP regional offices did not have a standardized process for evaluating the participation data or holding its institutions accountable for low participation.  Generally, regional officials stated that if they noted trends in the participation data, such as a high failure rate in a particular program on the monthly participation report, they would follow up by telephone.  However, we found no formal review process in place at the regional level.

As with its psychological programs, we found that the BOP does not maintain completion and withdrawal statistics for the RPP in its Key Indicators system.  Additionally, one of the expected outcomes of the BOP’s RPP is that inmate recidivism will be reduced.  Yet, the BOP has not conducted any studies demonstrating that successful participation in its RPP leads to a reduction in recidivism.

All eligible inmates committed to BOP custody are required to participate in the RPP and must enroll in the program no later than 30 months prior to release to the community or a CCC.  Although the BOP can determine the RPP status of each inmate at any given point in time, no statistical data related to RPP performance is tracked.

At each of the three regional offices included in our audit, we identified the process for evaluating RPP participation for its institutions.  We found that two of the three regional offices did not review RPP participation.  The third, the Northeast Regional Office, sends a monthly roster to each of its institutions listing those inmates within 30 months of release that have not enrolled in the RPP.  However, no formal response is required from the institutions.  The BOP should, at a minimum, track participation data for its institutions to determine the percentage of eligible inmates that have completed the RPP prior to release into the community.  Based on our discussions with BOP officials, we determined that the percentage of eligible inmates that have completed the RPP prior to release into the community could be included in its Key Indicators system.
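The measure we recommend would require only two counts per institution: the number of eligible inmates released and the number of those who completed the RPP before release.  The following Python sketch illustrates the computation; the figures are hypothetical, since the BOP does not currently track this data.

```python
# Percentage of eligible inmates who completed the RPP prior to release.
# Counts below are hypothetical; the BOP does not currently maintain
# this measure in its Key Indicators system.

def rpp_completion_rate(eligible_released: int, completed_rpp: int) -> float:
    """Return the share of eligible released inmates who completed the RPP,
    as a percentage of all eligible inmates released in the period."""
    if eligible_released == 0:
        return 0.0
    return 100.0 * completed_rpp / eligible_released

# Example: 400 eligible releases, of which 310 completed the RPP
print(round(rpp_completion_rate(400, 310), 1))  # -> 77.5
```

An institution releasing 400 eligible inmates, 310 of whom completed the RPP, would report a completion rate of 77.5 percent under this measure.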

In addition to tracking the percentage of eligible inmates that have completed the RPP prior to release into the community, the BOP needs to establish a mechanism to hold institutions accountable for RPP performance and implement corrective action plans to remedy low performance.

Conclusion

In summary, we found that the BOP does not ensure that:  (1) institutions set realistic occupational and educational completion goals, (2) institutions are held accountable for meeting goals, (3) data for occupational, educational, and psychological programs is reviewed to identify low performance, and (4) statistical data related to RPP performance is maintained.  As a result, we concluded that the BOP does not provide assurance that each of its institutions maximizes the number of inmates that complete programs designed to prepare inmates for successful reentry into society.

Recommendations

We recommend that the BOP:

  1. Ensure that a formalized process is established to set realistic occupational and educational completion goals stated as a percentage of completions to account for total enrollments and inmate population.  The factors considered in setting educational goals should include the security level of the institution, inmate population, classroom size, number of classes, number of instructors, whether the institution has a wait list for its programs, and historical educational program completion data.
  2. Establish and implement a formal process to ensure that institutions are held accountable for meeting their occupational and educational goals and that corrective action plans are developed to remedy performance so that goals are met in future years.
  3. Revise the Annual Program Report for Education and Recreation Services to include both the occupational and educational goals and outcomes for the reported fiscal year so that the BOP can readily determine whether the institution met its goals.
  4. Establish and implement a formal standardized process for evaluating the performance factor for occupational technical and vocational programs on an annual basis to ensure that the BOP institutions are held accountable for low performance and that corrective action plans are developed to remedy occupational program performance.
  5. Ensure that a formal standardized process is developed and implemented to screen all inmates prior to enrollment in all occupational programs to ensure that they have the ability and are willing to commit to completing the course.
  6. Ensure that a suitable measure of literacy program performance is developed to evaluate its institutions.  The new performance measure should provide an accurate picture of the percentage of all inmates that arrive at the BOP institutions without a GED credential or high school diploma who complete the literacy program during incarceration.
  7. Ensure that the percentage of citizen inmates required to participate in the literacy program that have dropped out at each institution is more closely evaluated.
  8. Ensure that the percentage of noncitizen inmates that have dropped out of the literacy program at each institution is monitored.
  9. Establish and implement a mechanism to hold institutions accountable for the monthly psychological program participation data that includes corrective action plans for institutions with low participation.
  10. Ensure that participation data is tracked for all of the BOP institutions to determine the percentage of eligible inmates that have completed the RPP prior to release into the community.
  11. Establish and implement a mechanism to hold institutions accountable for RPP performance that includes corrective action plans for institutions with low performance.

II. Community Corrections Centers (CCC)

The BOP offers transitional services to inmates through CCC placement, which has been found to increase the chances of successful reentry into society.  The BOP establishes CCC utilization targets for its minimum, low, and medium security institutions.  However, our audit revealed that a large number of institutions failed to meet their CCC utilization targets during FY 2000 through FY 2002.  Also, the BOP has not developed a CCC utilization target for its high security institutions, and does not adequately ensure that all eligible inmates are provided the opportunity to transition through a CCC in preparation for reentry into society.

In addition to reentry programs offered to inmates while serving their sentences at BOP institutions, the BOP provides services that assist inmates when they transition from incarceration into the community.  The primary transitional service provided by the BOP is the placement of inmates in CCCs, also known as halfway houses.  Pursuant to 18 U.S.C. § 3624(c), the BOP is required, to the extent possible, to assure that inmates spend a reasonable part of their term of incarceration under conditions that will afford the prisoner a reasonable opportunity to adjust to and prepare for reentry into the community.  The BOP believes the transitional services provided through a CCC meet the requirements of 18 U.S.C. § 3624(c).  Pursuant to this federal statute, the BOP can place inmates in a CCC for a period not to exceed the last 6 months of confinement or a period equal to 10 percent of the inmate’s sentence, whichever is less.

At the institutions we visited, we reviewed inmate files to determine whether eligible inmates were placed in CCCs prior to release.  We found that the unit team generally referred eligible inmates for CCC placement; however, not all inmates referred for CCC placement were transitioned through a CCC.  The reasons that referred inmates were not transitioned through a CCC included that they (1) were not eligible (e.g., were deportable aliens), (2) were considered a flight risk, (3) were considered a high risk, or (4) refused placement.

BOP policy requires that eligible inmates be released to the community through a CCC.  However, the policy also states that the BOP does not ordinarily consider the following inmates for CCC programs.45

  • Inmates who are assigned a “Sex Offender” Public Safety Factor.
  • Inmates who are assigned a “Deportable Alien” Public Safety Factor.
  • Inmates who require inpatient medical, psychological, or psychiatric treatment.
  • Inmates who refuse to participate in the Inmate Financial Responsibility Program.
  • Inmates who refuse to participate, withdraw, are expelled, or otherwise fail to meet attendance and examination requirements in a required Drug Abuse Education program.
  • Inmates serving sentences of 6 months or less.
  • Inmates who refuse to participate in the RPP.
  • Inmates who pose a significant threat to the community.  These are inmates whose current offense or behavioral history suggests a substantial or continuing threat to the community.

BOP officials we interviewed believe that CCCs provide an excellent transitional environment for inmates nearing the end of their sentences.  According to the BOP, during the transitional period at a CCC, inmate activities are closely monitored, and inmates are provided with a suitable residence, structured programs, job placement, and counseling.  CCCs also offer drug testing and counseling for alcohol and drug-related problems.  Further, during their stay inmates are required to pay a subsistence charge to defray the cost of their confinement in a CCC (25 percent of their gross income, not to exceed the average daily cost of their CCC placement).
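The subsistence charge described above reduces to taking the lesser of 25 percent of gross income and the average daily cost of placement.  The following Python sketch illustrates that cap with hypothetical dollar amounts; the income period assumed here is illustrative, not drawn from BOP policy.

```python
# CCC subsistence charge: 25 percent of gross income, not to exceed the
# average daily cost of the inmate's CCC placement.  Dollar amounts are
# hypothetical, for illustration only.

def subsistence_charge(gross_income: float, avg_daily_cost: float) -> float:
    """Return the charge owed: 25% of gross income, capped at the
    average daily cost of the CCC placement."""
    return min(0.25 * gross_income, avg_daily_cost)

# Example: $200 gross income against a $45 average daily cost
print(subsistence_charge(200.0, 45.0))  # -> 45.0 (capped at the daily cost)
```

In this example, 25 percent of $200 would be $50, so the charge is capped at the $45 average daily cost; an inmate earning $100 would instead owe the uncapped $25.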

A strategic objective of the BOP sets target CCC utilization rates for minimum, low, and medium security institutions.46  (The CCC utilization rate is the percentage of inmates transitioned into the community through a CCC, as compared to the percentage of inmates released directly into the community.)  The target CCC utilization rates are shown below.

Target CCC Utilization Rates
Security Level      CCC Utilization Target
Minimum Security    80 percent
Low Security        70 percent
Medium Security     65 percent

As stated previously, BOP policy requires that eligible inmates be released to the community through a CCC, regardless of security level.47   Nonetheless, we noted that the BOP has not established a CCC utilization target for its high security institutions.  We found that the average CCC utilization rate for the BOP high security institutions was 23 percent in FY 2000, 39 percent in FY 2001, and 45 percent in FY 2002.  In its policy, the BOP also states that one reason for referring an inmate to a CCC prior to release directly into the community is to increase public safety by aiding in the transition of an inmate into the community.  In our judgment, inmates in high security institutions have the greatest need for transitioning through the controlled CCC environment prior to being released directly into the community, especially since the average sentence of inmates placed in high security institutions was 12 years as of the end of FY 2002.

Historically, BOP officials at high security institutions have been reluctant to place their inmates in CCCs prior to release because they were considered a public safety risk.  Nonetheless, in our judgment the BOP should also establish a CCC utilization target for its high security institutions to ensure that eligible inmates released from these institutions are provided with the same opportunity to transition through a CCC prior to release into the community.  During the course of our audit, several BOP officials at the regional offices concurred that a CCC utilization target should be set for the high security institutions.  In establishing a CCC utilization target for its high security institutions, the BOP should consider the average CCC utilization rates noted in the preceding paragraph.

The 82 institutions we reviewed included 13 high security institutions and 1 maximum security institution.  Therefore, we were only able to review the CCC utilization targets and outcomes during FY 2000 through FY 2002 for the remaining 68 minimum, low, and medium security institutions.  We used the total number of inmates transferred to a CCC and the total number of inmates released directly to the community as reported in the BOP’s Key Indicators system to calculate the CCC utilization rate for each institution and compared this calculation to the BOP’s CCC utilization targets.  The details of our calculations and analysis of each institution’s CCC utilization rate for FY 2001 and FY 2002 are included in Appendix XII.
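The CCC utilization rate we calculated for each institution is the number of inmates transferred to a CCC divided by total releases (CCC transfers plus direct releases), expressed as a percentage.  The following Python sketch shows the calculation with hypothetical counts, not actual Key Indicators data.

```python
# CCC utilization rate: inmates transferred to a CCC as a share of all
# inmates released (CCC transfers plus direct releases).  Counts are
# hypothetical, for illustration only.

def ccc_utilization_rate(transferred_to_ccc: int, released_directly: int) -> float:
    """Return the percentage of released inmates who transitioned
    through a CCC rather than being released directly."""
    total_released = transferred_to_ccc + released_directly
    if total_released == 0:
        return 0.0
    return 100.0 * transferred_to_ccc / total_released

# Example: 150 CCC transfers and 50 direct releases at a low security
# institution, measured against the 70 percent target for that level
rate = ccc_utilization_rate(150, 50)
target = 70.0
print(f"rate: {rate}, met target: {rate >= target}")
```

In this example the institution achieves a 75 percent utilization rate and meets the 70 percent target for low security institutions.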

Overall, the results of our review revealed that a large number of institutions failed to meet the BOP’s stated CCC utilization targets for FY 2000 through FY 2002.  Specifically, we found the following for each fiscal year.

  • For the 67 institutions reporting in FY 2000, 36 (54 percent) failed to meet their CCC utilization target for that fiscal year.
  • For the 67 institutions reporting in FY 2001, 19 (28 percent) failed to meet their CCC utilization target for that fiscal year.
  • For the 68 institutions reporting in FY 2002, 27 (40 percent) failed to meet their CCC utilization target for that fiscal year.

Since the BOP established its CCC utilization targets by security level, we analyzed the CCC utilization targets and outcomes by security level and determined the range in performance among the institutions within the same security level.

Our analysis of the CCC utilization targets and outcomes by security level revealed that for FY 2000, none of the 11 minimum security institutions reporting failed to meet their CCC utilization target, 14 (58 percent) of the 24 low security institutions reporting failed to meet their CCC utilization target, and 22 (69 percent) of the 32 medium security institutions reporting failed to meet their CCC utilization target.

Additionally, for each security level there was generally a significant range in the CCC utilization rates achieved by each institution (Appendix XIII), as shown in the following charts.

CCC Utilization Rates Achieved Among Minimum Security Institutions, FY 2000 through FY 2002
FY      Institutions Reporting      CCC Utilization Rate Range
2000    11                          81% - 97%
2001    11                          81% - 94%
2002    11                          80% - 96%


CCC Utilization Rates Achieved Among Low Security Institutions, FY 2000 through FY 2002
FY      Institutions Reporting      CCC Utilization Rate Range
2000    24                          47% - 86%
2001    24                          56% - 88%
2002    24                          52% - 83%


CCC Utilization Rates Achieved Among Medium Security Institutions, FY 2000 through FY 2002
FY      Institutions Reporting      CCC Utilization Rate Range
2000    32                          42% - 100%
2001    32                          54% - 89%
2002    33                          35% - 88%


CCC Utilization Rates Achieved Among High Security Institutions, FY 2000 through FY 2002
FY      Institutions Reporting      CCC Utilization Rate Range
2000    9                           0% - 36%
2001    11                          0% - 56%
2002    13                          0% - 75%

To determine the factors that may have contributed to institutions not meeting their CCC utilization targets during FY 2000 through FY 2002, we included questions regarding the CCC utilization rates in the questionnaires we sent to 24 institutions, as discussed in Finding I of this report.  Based on the responses to our questions, we found that institutions that met their CCC utilization targets most commonly cited the following factors:

  • The prior BOP Director and executive staff strongly encouraged institutions to refer inmates for CCC placement.
  • Institution staff stressed the use and referral for CCC placement at unit team meetings and staff strongly encouraged inmates to participate.
  • The institution started the CCC referral process early, especially in cases of inmates with short sentences.
  • Eligible inmates who have completed the Residential Drug Abuse Program and qualified for the one-year sentence reduction received mandatory CCC placement.
  • Institution staff counseled inmates who initially refused CCC placement about the benefits of the program.

Based on the responses to our questionnaires, we found that institutions that did not meet their CCC utilization targets most commonly cited the following factors:

  • The institution applied a conservative interpretation of the BOP’s policy regarding the eligibility of inmates for CCC placement.
  • The institution had a large number of inmates who declined CCC placement.
  • The institution had a large number of inmates with pending charges in other districts and inmates with short sentences.

Overall, institutions that met their goals attributed their success to support from executive staff and unit teams that strongly encourage inmate participation.  Those institutions that did not meet their goals attributed their failure to conservative interpretation of policy and factors outside the control of staff, such as a large percentage of inmates that were ineligible for CCC placement.

According to BOP officials, at each quarterly executive staff meeting CCC utilization rates are reviewed and the regional directors may be asked to comment on any utilization rate outliers (institutions with CCC utilization rates that are significantly lower than the target utilization rate).  BOP officials also stated that the regional directors are ultimately held responsible for monitoring the CCC utilization rates within their region.  Although quarterly meetings are held and regional directors monitor their respective regional progress, only one specific security level (minimum, low, medium or high) is addressed at each quarterly meeting and each regional director may have a different process for monitoring CCC utilization rates.  However, as shown previously, we found that during FY 2000 through FY 2002, between 28 and 54 percent of institutions we looked at failed to meet their CCC utilization targets.

As with the other areas we reviewed, this may be attributed to the fact that the BOP regional offices did not follow a formal standardized process to ensure that institutions are held accountable for meeting their targets and that corrective action plans are developed to remedy low CCC utilization.  At each of the three regional offices, we identified the process for reviewing CCC utilization rates and found that each office had a different process.  For example, one regional office did not have a process in place for reviewing CCC utilization rates, while another reviewed the rates but did not consistently follow up with institutions that did not meet their targets.

Conversely, the Northeast Regional Office has established a formal process to ensure that all eligible inmates at each of its institutions are provided the opportunity to transition into the community through a CCC.  The Northeast Regional Office officials review the CCC utilization data contained in the Key Indicators system and the quarterly CCC utilization report prepared by the BOP’s Central Office in order to determine if its institutions are “on track” to meet established CCC utilization targets.  Further, the Northeast Regional Office requires each of its institutions to submit a monthly CCC utilization report identifying both the number of inmates referred for and those denied CCC placement.  For all inmates denied CCC placement, regional officials ask the institution to provide a detailed explanation regarding the basis for the inmate’s CCC denial.  Regional officials then review each denial in order to determine whether the denial is in compliance with the CCC utilization policy.

We also found that the CCC utilization rates and targets cannot be used to determine whether all eligible inmates at each institution were released to the community through a CCC, as required by BOP policy.  Currently, the CCC utilization targets range from 65 percent for medium security level institutions to 80 percent for minimum security level institutions (a CCC utilization target has not been established for high security level institutions).  Therefore, even if an institution achieves or exceeds the CCC utilization target for its security level, the BOP does not assure that all eligible inmates were transitioned through a CCC.  In our judgment, implementing a formal process for reviewing eligible inmates that are denied CCC placement, similar to that of the Northeast Regional Office, would help ensure that all eligible inmates are placed in a CCC prior to release.

It should be noted that subsequent to our audit, the BOP proposed a revision to the CCC utilization targets, including the establishment of a CCC utilization target for high security level institutions and an increase in the targets for minimum, low, and medium security level institutions.  However, to date the BOP has not approved or implemented the proposed revisions to the CCC utilization targets.

Recommendations

We recommend that the BOP:

  1. Establish a CCC utilization target for its high security institutions.
  2. Establish and implement a formal process to ensure that all eligible inmates are placed in a CCC prior to release.


Footnotes
  40. The BOP did not require its institutions to establish goals for its GED programs for FY 2002 because of a change in the GED testing format that was implemented at the beginning of calendar year 2002.
  41. We used completion and withdrawal data that was reported in the BOP's Key Indicators system for educational and occupational programs to calculate the performance factor.
  42. The GED performance factor based on voluntary withdrawals was calculated as completions divided by completions plus voluntary withdrawals for the fiscal year.
  43. SENTRY is the BOP's national on-line automated information system used to provide operational and management information requirements.
  44. The BOP's Key Indicators, Current Educational Needs Fact Sheet.
  45. BOP Program Statement No. 7310.04, Community Corrections Center (CCC) Utilization and Transfer Procedure, dated December 16, 1998.
  46. The BOP, State of the Bureau 2002, Accomplishments and Goals.
  47. BOP Program Statement No. 7310.04, Community Corrections Center (CCC) Utilization and Transfer Procedure, dated December 16, 1998.