Nurturing the null: Leveraging unexpected null results to improve case management for housing services

Authors:
Kimberly Dodds

Null results—when a study does not find significant impacts on chosen outcomes—can provide valuable insights for research and policy alike. However, it can be difficult for stakeholders to identify and leverage these insights. In J-PAL's null results blog series, we highlight randomized evaluations that yielded null results in order to elevate their lessons learned and inform future research. In this piece, Kimberly Dodds, homeless prevention program manager in King County, Washington, discusses her experience of discovering that her program had null results and how those findings were used to improve the Youth Family Homelessness Prevention Initiative.

The Youth Family Homelessness Prevention Initiative (YFHPI)—a program run by King County, Washington, in partnership with two dozen private social service agencies—provides youth and families with intensive case management and financial assistance with the aim of ensuring that they can stay housed when facing the imminent threat of housing loss or eviction. 

Youth and families on the verge of homelessness face many obstacles during this unstable time: negotiating with landlords, keeping themselves or their children in school, continuing to pay for food, childcare, utilities, and transportation, and keeping track of their belongings, among other challenges. The YFHPI program is powered by the belief that having a case manager to help a family work through these barriers together may provide stronger support than cash alone. A professional case manager is trained to navigate the complexities of finding new housing, school requirements and resources, eviction proceedings, and other logistical barriers that may prevent youth and families from finding permanent, stable housing. 

As the program manager for YFHPI, I (Kim) have myriad responsibilities, including ensuring that our program works. To me, rigorously evaluating our program was essential for understanding the impact of our work and, hopefully, being able to prove that it is possible to impact homelessness if we use the right interventions. 

To understand the YFHPI’s impact, particularly the effect of case managers, we partnered with J-PAL affiliated professor David Phillips and other researchers at the Lab for Economic Opportunities (LEO) at the University of Notre Dame. We tested the impact of our services (case management plus financial assistance) compared to financial assistance alone on eviction and take-up of housing services. The evaluation concluded in 2020, after 631 families had received a combination of our services for at least twelve months.

We found that people assigned to case management plus financial assistance did not experience a lower eviction rate than people assigned to financial assistance alone. In other words, we found null results.

This news sparked a lot of fear among my team: fear that we had been utilizing important and limited dollars inefficiently, fear that these results would catch on in the news and programs across the country would be defunded, and fear that we weren’t serving our families as effectively as possible. However, our research partners at LEO and program administrators at King County helped us realize that we could still learn a lot from these results, even if they were disappointing.

In close partnership with the researchers, other King County staff, and service providers, we then began our journey of understanding why we found no measurable impact of case management on evictions and what we could do about it. We spoke with our program implementers, who had been on the ground throughout the course of this study, about which aspects of our theory of change may not have held true throughout the study period. 

We found that there was a lot of variability in the training and preparation case managers received and in the quality and depth of services they provided. Relatedly, we noted that many case managers worked in stressful and unstable conditions. Over the study period, case managers faced an 80 percent turnover rate, with many of them citing low wages as a primary reason for leaving their position. Additionally, we took notice of the high caseloads that our case managers were juggling—some of them assigned to as many as forty families and youth at a time.

In response to these findings, we piloted different programmatic changes to identify levers that we could adjust, and we implemented crucial changes to improve how we serve families on the verge of homelessness in King County. For example, we

  • established a minimum wage standard for case managers in our program, 
  • created an upper limit of fifteen families per case manager in an effort to improve retention and working conditions,
  • increased financial support to supervisors and contractual requirements for supervisor support of case managers, and
  • added weekly group training for all case managers to increase opportunities for shared learning, along with weekly check-ins between program managers, case managers, and their supervisors.  

We believe that with these changes in place, families receiving intensive case management will now be able to move through our theory of change as we initially intended, ultimately securing permanent, stable housing, employment, and other positive downstream outcomes.  

While we were initially disheartened by our null result, we treated it as an opportunity to carefully re-evaluate our service delivery model. Unexpected results are often the most important ones you can get.

Reflecting on this experience, I believe two things were central in transforming this disappointing result into a learning opportunity. First, our partnerships with the researchers and program implementers were key parts of understanding these results and responding to them with sustainable change. Specifically, the researchers incorporated our reflections into the final paper, validating the programmatic changes we were hoping to pilot and implement. Second, once we acknowledged the fear of receiving these results, we were able to embrace our curiosity about them. Asking questions throughout the process, finding opportunities to learn and improve, and considering factors that could have led to these results were important parts in landing us where we are with YFHPI today. 

I deeply understand the fear that many program managers face when deciding to evaluate, or when discovering null results about, a program into which they pour so many resources and so much time and effort. However, even knowing what I know now—that this study was going to produce null results—I would still go through with this evaluation again, because this process improved our program, pushed us to create better conditions for our case managers, and strengthened our belief that this model can work.
 
