Speech

Assistant Attorney General for the Office of Justice Programs Karol V. Mason Delivers Remarks at the American Society of Criminology

Location: United States

Thank you, Charles.  Good afternoon.  It’s a pleasure to be here today with all of you.

Before I go any further, I’d like to mention that I’m very pleased to have with me today several leaders and staff from the Office of Justice Programs.  I won’t identify them all now because I’ll be asking them to chime in as I talk, but I did want you to see, by their presence, just how important science and research are to the leadership of OJP.

I would like to point out one member of my staff.  This summer, I created a Science Policy Advisor position to help me identify and carry forward OJP-wide policy objectives related to science and technology.  I appointed the estimable Dr. Phelan Wyrick to that position.  Many of you know Phelan and the contributions he’s made leading the Evidence Integration Initiative and establishing CrimeSolutions.gov.  He was the clear choice for the position, and I know our scientific work will profit immeasurably by his guidance.

I have to start by telling you the same thing that I tell many others about my job – I love being the Assistant Attorney General for the Office of Justice Programs!  I’m so excited to lead an agency charged with helping states, communities, and tribes improve their justice systems, because those systems play such an important role in people’s lives.

Some of what I’m most proud of is the work we’re doing through our two primary science agencies – the National Institute of Justice and the Bureau of Justice Statistics – as well as the work we’re doing to integrate evidence into grant making in our programmatic offices, the Bureau of Justice Assistance, the Office for Victims of Crime, and the Office of Juvenile Justice and Delinquency Prevention.

Support for the sciences and emphasis on evidence comes from the very top of the Obama Administration.  The President has set a management agenda that calls for a “smarter, more innovative, and more accountable government.”  The Office of Management and Budget has emphasized the use of evidence to support budget requests by all federal agencies.

And Attorney General Eric Holder has emphasized on many occasions that we must not stop searching for new ways to be “smart on crime.”  His support for OJP’s work in this area has been unwavering.

Today I want to talk with you about what that work looks like.  And let me begin with a caveat:  I can’t possibly convey to you all of the things we do to advance science and integrate evidence into our programs and activities.  There is no single answer, no one-size-fits-all solution to how we enhance the quality of evidence available to the field.  There’s no one way to improve our ability to use that evidence in practice.

We know we have to take a careful approach that looks at each challenge individually, in context.  Then we have to develop short- and long-term strategies to move forward and help advance the field.

I want to touch on four critical areas that OJP is working to address:

• Improving the data infrastructure;
• Improving the quality of program evaluations;
• Integrating evidence into programmatic grants;
• And integrating evidence into technical assistance that’s available to states, tribes, and localities.

First, to be “smart on crime,” I believe we have to be able to accurately measure crime and delinquency, how justice agencies are performing, and so on.  You can’t fix a problem you can’t see.  Without good data, whether it’s crime data or administrative data, we’re limited in what we can do to make improvements.

Think about how much things have changed in just the last 10 to 15 years.  The Internet, social media, smart phones, and other technologies have changed the way we live.  We’ve also seen dramatic changes in criminal activity involving these technologies.  Identity theft, sexual exploitation, stalking, bullying, and other crimes have taken new forms and presented new challenges.

Now think about how much has changed in the past 80 years!  That’s how long it’s been since the criminal justice field developed the basic definitions that underlie our most widely used indicators of crime in America – the Uniform Crime Reports (known as UCR) and the National Crime Victimization Survey.

We’re not getting enough information from these systems to take us through the 21st century.  We need to get more out of our common systems of crime measurement to support a more evidence-based approach – a “smart on crime” approach.

This year, I’m pleased that BJS has begun to work with the National Academy of Sciences to come up with ideas for developing a modern system of crime measurement.  Among other things, they’re carefully looking at how a modern system can be created to minimize the data collection burden on criminal justice agencies that are already stretched for resources.

BJS is also continuing an effort, started under the leadership of its former Director, Jim Lynch, known as the National Crime Statistics Exchange (or NCS-X).  They’re trying to achieve a goal that has eluded this nation for more than 30 years – a goal of producing national, incident-based crime statistics.  The value of this effort should be clear to any of you who have considered the analytical difference between incident-based data and the current UCR data. 

UCR data give you counts and rates per 100,000 population for an offense.  But incident-level data provide the opportunity to analyze multiple characteristics of the victim, the offender, and the offense itself.  When these data become available, researchers and policy makers will be able to answer far more sophisticated questions about the nature of crime.  These data will also help improve the quality and reduce the costs of program evaluations.

Which brings me to the second critical area of work – program evaluations.  In recent years, OJP has made changes to the way we support program evaluations.  Of course, program evaluations are at the heart of answering the “does it work” questions.

We’ve emphasized high quality, randomized evaluation designs to help produce the most valid and reliable results possible.  Our National Institute of Justice has worked closely with our Bureau of Justice Assistance to build randomized field experiments that coordinate program funding with research funding.

Some of you may have already heard of the work that NIJ and BJA are doing to replicate the Hawaii Opportunity Probation with Enforcement program, known as HOPE.  The idea behind HOPE is to use swift and certain, but not severe, sanctions for probation violators.  The NIJ-funded random assignment evaluation of the Hawaii program found remarkable results in decreasing drug use and rearrests.  Now we’re replicating this model in multiple locations outside of Hawaii. 

This year, BJA and NIJ have partnered with the National Institute of Corrections to launch another demonstration field experiment on Fostering Desistance through Effective Supervision.  This project is designed to improve offender outcomes and reduce recidivism.  The evaluation will test a new approach to training parole officers and a promising cognitive behavioral curriculum for parolees.

I think these strong partnerships between the agencies that support programs and those that support research are essential to advancing a “smart on crime” approach.

I’m also going to ask Denise O’Donnell to speak in just a few minutes.  As the Director of the Bureau of Justice Assistance, Denise has led the way in integrating evidence into grant making at BJA.

For example, the Smart Policing Initiative and the Smart Probation Initiative are just two programs that emphasize the use of evidence-based approaches and researcher-practitioner partnerships to measure and enhance the performance of criminal justice agencies.

As we work to integrate evidence into our grant making, we recognize that there need be no conflict between focusing on evidence and encouraging innovation.  In fact, these two areas of emphasis are very compatible.

A few years ago, BJA opened a field-initiated solicitation called Encouraging Innovation.  They went on to fund a program run by the Bronx Defenders that advances a “holistic” model of criminal defense.  Clients work with an interdisciplinary group of experts who work as a team not only on the criminal defense, but also to improve clients’ well-being and help them avoid further contact with the criminal justice system.

The original grant was based on promising, but preliminary, internal evaluation findings.  Further evaluation and case studies have provided more encouraging findings, and we consider this model promising.  Now, NIJ has committed resources to support a more rigorous evaluation as BJA continues to support technical assistance to help the field replicate this model with fidelity.

What this example shows to me is how we can make strategic investments to build our confidence over time by incrementally improving the evidence base in an area.  As our confidence in the evidence grows, our confidence in the quality of the investment also grows.

I should take a moment to say just how happy I am that Denise is here today.  It’s not typical for the Director of the Bureau of Justice Assistance to attend ASC.  Like me, Denise is here because she’s excited to learn about the latest research and meet with the leading researchers in this field.  Thank you, Denise. 

As I mentioned earlier, so much depends on context.  I want to emphasize just how different it can be to integrate evidence into different topic areas.  Let’s take two examples: mentoring and crime victim services.

A 2011 meta-analysis by David DuBois found that mentoring programs generally produce modest positive effects.  The study included an examination of factors that appear to be associated with stronger positive effects – factors known as “moderators” to you researchers.

The Office of Juvenile Justice and Delinquency Prevention integrated these findings into programmatic solicitations for mentoring.  OJJDP asked applicants to develop mentoring programs that emphasized these moderators – specifically the use of an “advocacy” role for mentors and putting the mentors into teaching roles.

What OJJDP found was that there were knowledge gaps in how best to operationalize these moderators at the programmatic level.  So they launched a randomized field experiment in a total of 32 sites to better assess the outcomes associated with these moderators.

So in the case of mentoring, we have a body of evidence that is mature enough to support a meta-analysis requiring numerous program evaluations.  We have a general awareness of moderators that contribute to a higher likelihood of success, and we’re explicitly testing those moderators in a rigorous way.

Let’s compare that with our work in crime victim services.  The Office for Victims of Crime released a report this year on transforming victim services, titled Vision 21.  Vision 21 represents a major, multi-year effort by OVC to work with experts and stakeholders in the field to assess the current state of affairs in victim services, and chart a course for the future.  “The stakeholders’ most singular finding was the dearth of data and research in the field.”  Those are not my words!  That is a direct quote from the report.

The report goes on to state that “that gap is reflected in every other finding of the Vision 21 initiative,” and that’s why they emphasized the importance of research at the front of the report.  They did not originally set out to write a report about research, but the report makes a strong statement about the need for basic research, program evaluation, and evidence-based practices.

My point in making this comparison between mentoring and crime victim services is only to underscore the fact that evidence-integration looks different based on the state of practice and the state of evidence in the area you’re addressing.  You should know that the Administration is working to support the recommendations of the Vision 21 report.  And I encourage all of you, as researchers, to look up the report and consider what work you may be able to do in this area. 

Beyond our grants for programs, services, or research, OJP makes a large impact on the field through a wide range of technical assistance providers.  BJA alone supports over 100 awards for technical assistance.  Much of our technical assistance is very specific and targeted.  But from time to time, we’re approached by localities, states, or tribes who are facing more complex and sometimes long-standing challenges. 

Last year, OJP launched the Diagnostic Center to provide technical assistance to build local capacity to use data to analyze crime problems and implement evidence-based solutions.  Diagnostic Center engagements begin with an assessment of local data to help define the problem to be addressed.  Then they move to a set of evidence-based recommendations, and end with an assessment of the effectiveness of the interventions that are implemented. 

In summary, much of what OJP is doing to advance evidence in the field of justice is included in our work to improve the data infrastructure, improve program evaluations, and integrate evidence into grants and technical assistance.

I know that many of you have heard of our CrimeSolutions.gov website that provides information about evidence-based programs and practices.  We’ve presented on that at previous conferences.  In fact, there’s another panel on that topic being held at this very moment.

Finally, I would be remiss if I didn’t acknowledge some very important people who have done a great deal to make this work possible for OJP.  First, I’d like to thank my predecessors.  Most of you know Laurie Robinson.  In her service as Assistant Attorney General earlier in the Administration, she did more than anyone to set the course for a more evidence-based OJP.  Thank you, Laurie.

Also, Mary Lou Leary, who is currently my Deputy, carried the flame forward and never lost sight of the importance of this issue during her time as Acting AAG.

We know we can’t do this work alone.  And one of the mechanisms that we use to learn from the field is the OJP Science Advisory Board, which was launched under Laurie’s leadership in 2010.  This federal advisory board has 18 members and roughly 20 additional subcommittee members who provide input and advice to all of OJP.  We were very disappointed that our most recently scheduled meeting was canceled during the government shutdown in October.  But I’m excited that the Board is scheduled to meet again in January.

I won’t mention all of the names, but I do want to thank all of the people who contribute to the Board.  And I will mention a handful of people who have given a great deal of their time.  I know that some of them are in the room.  Our Board Chair is Al Blumstein.  Thank you, Al.  And our subcommittee chairs are: Rick Rosenfeld, David Weisburd, Mark Lipsey, Ed Mulvey, Tony Fabelo, and Alan Leshner.  Thank you all so much for the work that you do.

I’m really excited about the work we’re doing at OJP to integrate evidence into our work.  We’ve got a great team of leaders and professionals committed to making science central to the way we do business.  I’m looking forward to moving the ball forward and to working with all of you to advance the role of research in criminal justice.

Thank you to all of you for your attention.  Now I’d like to open the floor for questions or comments.  


Updated September 17, 2014