General

Study Overview

Title:
Judicial Performance Improvement Impact Evaluation
Study is 3ie funded:
No
Study ID:
RIDIE-STUDY-ID-582c76c5d65ab
Initial Registration Date:
11/16/2016
Last Update Date:
11/03/2016
Study Status:
Ongoing
Abstract:

The impact evaluation will test the effects of alternative implementation strategies for performance feedback in the Kenyan Judiciary on court efficiency and services. It will test three variations of feedback: (i) simple feedback that shows how a court is performing against its performance targets and compared to peer courts; (ii) complementing the simple feedback with calls from senior judicial managers offering mentoring support and some supervisory accountability; and (iii) sharing the simple feedback with Court User Committees as a means of local accountability.

Registration Citation:

Sandefur, J. and Menzies, N., 2016. Judicial Performance Improvement Impact Evaluation. Registry for International Development Impact Evaluations (RIDIE). Available at: 10.23846/ridie102

Categories:
Public Sector Management
Other
Additional Keywords:
Governance & Accountability
Secondary ID Number(s):

Principal Investigator(s)

Name of First PI:
Justin Sandefur
Affiliation:
Center for Global Development
Name of Second PI:
Nick Menzies
Affiliation:
The World Bank Group

Study Sponsor

Name:
i2i Fund
Study Sponsor Location:
Kenya

Research Partner

Type of Organization:
Government agency (e.g., statistics office, Ministry of Health)
Location:
Kenya

Intervention

Intervention Overview

Intervention:

The Judiciary IE will test measures that could enhance the impact of performance management and measurement. In particular, it will test three mechanisms: (i) simple information that illustrates to court stations how they are performing against their performance targets and compared to their peers; (ii) complementing the information with calls or visits from senior judicial management for peer-to-peer support and supervisory accountability; and (iii) sharing this information with Court User Committees as a means of local accountability. The interventions will be assigned using a randomized controlled trial design. They will be implemented at the court-station level, as the presiding judicial officer has managerial control over the other judicial officers and staff within a station.
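
As a purely illustrative sketch of the arm (i) mechanism, the simple feedback could be assembled from court-level data along the following lines. The station names, figures, and the 90% target are assumptions for illustration, not the Judiciary's actual reporting template.

    # Illustrative sketch of the arm (i) feedback: each court station sees its own figure
    # against a performance target and against the peer average. Station names, numbers,
    # and the 90% target are assumed for illustration only.
    from statistics import mean

    # Share of matters concluded within 360 days of filing, by station (hypothetical).
    timeliness = {"Station A": 0.72, "Station B": 0.81, "Station C": 0.64}
    target = 0.90  # assumed performance target

    peer_avg = mean(timeliness.values())
    for station, share in timeliness.items():
        print(f"{station}: {share:.0%} of cases concluded within 360 days "
              f"(target {target:.0%}, peer average {peer_avg:.0%})")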

Theory of Change:
Multiple Treatment Arms Evaluated?
Yes

Implementing Agency

Type of Organization:
Public Sector, e.g. Government Agency or Ministry

Program Funder

Name of Organization:
The Kenyan Judiciary
Type of Organization:
Public Sector, e.g. Government Agency or Ministry

Intervention Timing

Intervention or Program Started at time of Registration?
No
Start Date:
06/01/2016
End Date:
06/01/2017

Evaluation Method

Evaluation Method Overview

Primary (or First) Evaluation Method:
Randomized control trial
Other (not Listed) Method:
Additional Evaluation Method (If Any):
Other (not Listed) Method:

Method Details

Details of Evaluation Approach:

A randomized controlled trial design will be used to assign the interventions. Baseline and endline data will be collected to test for balance and impact, respectively. There will be three treatment arms and one control group.
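
A minimal sketch of how the assignment could be drawn, assuming simple randomization of court stations into equal-sized arms (the placeholder station list, arm labels, and seed are assumptions; the registration does not specify the randomization procedure):

    # Minimal randomization sketch: assign court stations to three treatment arms and a
    # control group. The placeholder station list, arm labels, and fixed seed are assumptions.
    import random

    stations = [f"station_{i:03d}" for i in range(1, 129)]  # placeholder identifiers
    arms = ["control", "simple_feedback", "feedback_plus_calls", "feedback_to_cucs"]

    rng = random.Random(2016)  # fixed seed so the draw is reproducible
    rng.shuffle(stations)

    arm_size = len(stations) // len(arms)  # 32 court stations per arm, as in the design
    assignment = {
        arm: stations[i * arm_size:(i + 1) * arm_size]
        for i, arm in enumerate(arms)
    }

    for arm, members in assignment.items():
        print(arm, len(members))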

Outcomes (Endpoints):

Primary outcomes include: expeditious determination of cases (e.g. the percentage of civil and criminal matters concluded within 360 days of filing); time to disposition (i.e. the average and median time between the filing of a case, or entry of plea, and its conclusion); case clearance rate (e.g. initiated cases over resolved cases); court productivity (e.g. change in the judges' productivity index); case backlog (e.g. percentage reduction in backlog cases); and submission of court returns (as a proxy for the administrative functioning of the court). To ensure that the interventions do not lead judicial officers to trade off access and quality in the name of timeliness and efficiency, the impact evaluation will also assess the quality of court services through a court user and employee engagement survey. The survey will ask about perceptions of access to justice, fairness and integrity, and timeliness.
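
For concreteness, a sketch of how two of the timeliness outcomes could be computed from case-level records; the record layout and dates below are hypothetical, whereas the study itself will draw these measures from the Daily Court Return Template.

    # Hypothetical case-level records; field names and dates are illustrative only.
    from datetime import date
    from statistics import mean, median

    cases = [
        {"filed": date(2015, 1, 10), "concluded": date(2015, 9, 1)},
        {"filed": date(2015, 2, 20), "concluded": date(2016, 8, 15)},
        {"filed": date(2015, 6, 5), "concluded": date(2016, 3, 30)},
    ]

    days_to_disposition = [(c["concluded"] - c["filed"]).days for c in cases]

    # Expeditious determination: share of matters concluded within 360 days of filing.
    within_360 = sum(d <= 360 for d in days_to_disposition) / len(days_to_disposition)

    # Time to disposition: average and median days from filing to conclusion.
    avg_days, median_days = mean(days_to_disposition), median(days_to_disposition)

    print(f"Concluded within 360 days: {within_360:.0%}")
    print(f"Time to disposition: mean {avg_days:.0f} days, median {median_days:.0f} days")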

Unit of Analysis:
The court-station
Hypotheses:

Does the simplified feedback form provided to courts improve performance? Does simplified feedback plus top-down accountability, in the form of phone calls from judicial headquarters, improve performance? Does simplified feedback shared with locally based stakeholders organized in Court User Committees improve performance?
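
The registration does not spell out an estimating equation; as a sketch only, the three hypotheses could be tested in a single court-station-level regression of the form

    Y_s = \alpha + \beta_1 T^{\text{simple}}_s + \beta_2 T^{\text{calls}}_s + \beta_3 T^{\text{CUC}}_s + X_s'\gamma + \varepsilon_s

where Y_s is an outcome for court station s, the T indicators mark assignment to each treatment arm (with the control group as the omitted category), X_s are optional baseline covariates, and \beta_1, \beta_2, and \beta_3 correspond to the three questions above.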

Unit of Intervention or Assignment:
Heads of court stations
Number of Clusters in Sample:
4 clusters with 32 court-stations per cluster
Number of Individuals in Sample:
126 heads of court stations
Size of Treatment, Control, or Comparison Subsamples:
32 court-stations per cluster, including the control cluster

Supplementary Files

Analysis Plan:
Other Documents:

Data

Outcomes Data

Description:
Court administrative data collected through a template called the Daily Court Return Template, used primarily to measure the timeliness and efficiency of case handling. Court user satisfaction and employee engagement surveys will provide a means to determine whether efforts to improve timeliness and efficiency have negative side effects on the quality of court services.
Data Already Collected?
No
Data Previously Used?
Data Access:
Data Obtained by the Study Researchers?
Data Approval Process:
Approval Status:

Treatment Assignment Data

Participation or Assignment Information:
Yes
Description:
Data Obtained by the Study Researchers?
Data Previously Used?
Data Access:
Data Approval Process:
Approval Status:

Data Analysis

Data Analysis Status:

Study Materials

Upload Study Materials:

Registration Category

Registration Category:
Prospective, Category 1: Data for measuring impacts have not been collected

Completion

Completion Overview

Intervention Completion Date:
Data Collection Completion Date:
Unit of Analysis:
Clusters in Final Sample:
Total Observations in Final Sample:
Size of Treatment, Control, or Comparison Subsamples:

Findings

Preliminary Report:
Preliminary Report URL:
Summary of Findings:
Paper:
Paper Summary:
Paper Citation:

Data Availability

Data Availability (Primary Data):
Date of Data Availability:
Data URL or Contact:
Access procedure:

Other Materials

Survey:
Survey Instrument Links or Contact:
Program Files:
Program Files Links or Contact:
External Link:
External Link Description:
Description of Changes:

Study Stopped

Date:
Reason: