One of the problems faced when developing a SIB is how people end up as part of the intervention cohort. If the process involves government or some other body referring participants to the service delivery organisation, how does this service delivery organisation manage the risk that not enough people will be referred, or that the wrong kind of people (i.e. those with little potential for change) will be referred?
The referral mechanisms used can be split into three categories:
- Eligibility: eligibility criteria are defined and everyone who meets them is considered part of the SIB
- Self-selection: either the delivery organisation chooses participants or people choose to join
- Referred: government refers individuals into the program
All of these mechanisms involve some eligibility criteria that participants must meet to be included.
Examples of each mechanism are listed below.
Eligibility – everybody eligible is considered part of the intervention cohort and measured
Peterborough: All male prisoners exiting HMP Peterborough after serving a sentence of less than 12 months.
Other examples: New York City, New York State
One of the key benefits to government of this approach is that the responsibility and incentive for convincing people to participate in the program lie with the service provider. It also suits a more rigorous measurement approach, where the intervention cohort is compared to another cohort, because the potential for bias in the selection process is reduced.
Self-selection – either the service delivery organisation chooses people to be part of their program or participants choose to join, usually with some eligibility criteria that must be met
DWP Innovation Fund: Service delivery organisations were asked to propose which young people they would work with and how they would attract them. “Your proposal must clearly demonstrate how you are identifying and working with the most disadvantaged and socially excluded young people, the vast majority of whom would otherwise not achieve educational and employment outcomes” (DWP, Round Two Specifications). Some programs asked schools or other organisations to refer students to them.
One of the key benefits of this approach is that the service delivery organisation is in control of how many participants are included in the program, so if they need more people they can do something about it themselves. They can also make sure that the participants have the type of needs that their program was designed for.
Referred – Government refers people it considers suitable for the program, usually using some eligibility or referral criteria
Australia The Benevolent Society: Referrals are made by the Department of Family and Community Services according to the processes and criteria set out in an operations manual. The minimum number of referrals is 400 over the referral period (Deed of Implementation Agreement for TBS Social Benefit Bond Pilot).
Other examples: Essex County Council, Australia Newpin
This referral system is sometimes preferred by government, as it retains control over who participates in the program, but it exposes providers to the risk that not enough people are referred.
But how can a SIB make sure enough participants are referred? Let's look at two SIBs where this issue has been addressed.
Educating/marketing to government referrers: Essex County Council
The Essex County Council SIB relied on referrals from the council; however, it wasn't receiving enough referrals of the right type. To fix this, the delivery organisation went into the council to educate and encourage staff about the program and who they should refer to it.
“We discovered that Essex weren’t referring enough children who would most benefit from the intervention, due to a combination of things, including competing priorities of senior staff and referral staff not knowing the program existed, or who and how to refer. Solving this problem would not usually fall in the remit of a service provider, but our performance managers went into the council to do a marketing push and went right up to a senior level to change the way they were referring. The board is also considering whether to add an additional ‘marketing’ function to the service, to ensure that the barriers to referral are continually being addressed proactively” (Andrew Levitt, of Bridges Ventures, deliveringthepromise.org).
Incentives for government to refer (or penalties if they don’t): The Benevolent Society
The Australian Social Benefit Bond delivered by the Benevolent Society has a second part to its payment formula that corrects for a lack of referrals. The first part is the improvement that the program makes against its three outcome metrics: out-of-home care entries, Safety and Risk Assessments, and Helpline Reports. The second part adjusts that improvement according to whether adequate referrals have been made and whether the children in the program can be matched against children not in the program. The result is called the 'Performance percentage'.
While payments will not be made until 2018, we can use the 2014 preliminary results to understand how the metric works.
In 2014, the program made a minus one percent improvement against its outcomes.
The second part of the payment metric combines the improvement percentage with two metrics that address the risk related to (1) the measurement and (2) the referral methods.
- The counterfactual for this program is a propensity-score-matched control group, so to manage the risk of children not being able to be matched, a fixed improvement of 15% is assigned to each unmatched child.
- The SIB sets a number of children that the Department of Family and Community Services guarantees to refer. For each child under the guaranteed number, a fixed improvement of 40% is assigned. In the first year of the program, referrals from government were 21% below the guaranteed number, which, combined with the score for unmatched children, lifted the performance percentage from -1% to 8%. This is a huge incentive for the Department to make sure it meets its referral guarantee.
So the 2014 results carried a huge penalty for government falling short in referrals.
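The adjustment described above can be sketched as a head-count-weighted blend of the measured improvement and the two fixed scores. This is a hypothetical reconstruction, not the contractual formula (which is set out in the Deed of Implementation Agreement): the function name and the child counts below are illustrative assumptions, chosen only to show how a negative measured result can be lifted to roughly the reported level.

```python
def performance_percentage(measured_improvement, n_matched, n_unmatched,
                           n_guaranteed, n_referred,
                           unmatched_score=0.15,   # fixed improvement per unmatched child
                           shortfall_score=0.40):  # fixed improvement per child short of the guarantee
    """Blend the measured improvement with the two fixed scores,
    weighting each component by its head-count."""
    shortfall = max(n_guaranteed - n_referred, 0)
    total = n_matched + n_unmatched + shortfall
    return (measured_improvement * n_matched
            + unmatched_score * n_unmatched
            + shortfall_score * shortfall) / total

# Illustrative (assumed) numbers: a -1% measured improvement, referrals
# 21% below the guarantee, and a handful of unmatched children.
result = performance_percentage(measured_improvement=-0.01,
                                n_matched=74, n_unmatched=5,
                                n_guaranteed=100, n_referred=79)
print(f"{result:.1%}")  # prints 8.4%
```

With these assumed counts, the blend lands near the 8% reported for 2014, which illustrates the mechanism's direction: every child the Department fails to refer drags the payable percentage upwards at a fixed 40%, regardless of measured outcomes.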
Interestingly, the 2015 results show the guaranteed referrals still falling short, by 13%. This could be due to an overestimate of the number of eligible families during contract development, or it could indicate a continuing lack of referrals by government staff to the program. It will be interesting to watch how this mechanism plays out as we head towards payment in 2018.