Through a public records request and subsequent lawsuit, Justice in Aging obtained California’s “performance measure” data from the state’s Medicaid assisted living program. The data demonstrate deep problems in both federal and state Medicaid policies, as documented in Justice in Aging’s comprehensive report, An Illusion of Protection: Meaningless Federal “Quality Measures” Endanger Assisted Living Residents (Feb. 2024, with a March 2024 update). This paper summarizes that report’s findings, including its federal and state policy recommendations.

Under federal law and policy, assisted living quality measures don’t measure assisted living or quality.

Although the Medicaid program can pay for assisted living care, federal Medicaid law sets few standards for assisted living. Instead, state Medicaid programs issue formal “assurances” that facility residents’ health and welfare are adequately protected. States also must develop “performance measures” to monitor program performance and identify potential problems. If a performance measure results in a score of 85% or lower, the state must conduct remediation activities to address the identified problem.

You might expect these performance measures to be available on the Internet. In fact, no. States generally share performance measures with the federal government and no one else.

You also might expect performance measures to relate to assisted living quality, and to distinguish between strong and weak facilities. Again the answer is no. Most performance measures pertain to the state’s operation of its Medicaid program, rather than the Medicaid-funded assisted living care. The measures provide little information about the overall quality of assisted living care, and none whatsoever about the quality of any particular facility or facilities.

Federal and state governments keep performance measure information from the public.

Performance measure information is collected by the states and shared with the federal government through mandatory reporting. The public, however, is kept in the dark, as demonstrated by the obstacles encountered in a recent attempt to obtain performance measure information from California.

In December 2021, Justice in Aging submitted a public records request to the California Department of Health Care Services (DHCS) for the state’s assisted living performance measure data. DHCS never claimed that the requested information was exempt from public records requirements. Instead, DHCS requested delays over a period of several months, and then began ignoring Justice in Aging’s continued inquiries. Finally, given no other realistic option, Justice in Aging filed a public records lawsuit in December 2022, one year after the original request. Soon after the lawsuit was served on the State, DHCS finally began providing Justice in Aging with the requested data, covering the three-year period from March 2019 through February 2022.

California administers 18 Assisted Living Performance Measures, but none provide useful information about the care received by assisted living residents.

The information eventually turned over by DHCS shows that the department administers 18 performance measures within California’s assisted living program.

The 18 measures consist of:

  • Seven easy-to-satisfy measures with purported 100% compliance;
  • Six case-file-review measures with suspiciously identical results;
  • Two measures based on facilities meeting basic licensure and in-service training requirements;
  • One faulty measure; and
  • Two measures relating to payment processes, with no relevance to assisted living quality.

The seven measures with purported 100% compliance serve no useful purpose.

As across-the-board 100% results might suggest, complying with these seven measures was virtually automatic. Thus, none of these seven measures is useful in monitoring or improving California’s assisted living Medicaid program.

Most of these measures with “perfect” scores are trivial on their face – for example, the enrollment of applying care coordination agencies during the same year in which they apply, or the fact that assisted living facilities and the one home health agency were qualified to provide the services for which they were reimbursed. In a similar vein, of course all level-of-care determinations were performed by a registered nurse, since program standards require exactly that.

The negligible utility of two other measures is demonstrated by the large numbers involved, combined with the “perfect” results. Careful evaluation of level-of-care determinations and service plans would require time and professional judgment. But clearly, as a practical matter, no such judgment was applied to 20,236 level-of-care determinations and 1,020 waiver service plans. Instead, all evidence suggests that the relevant computer programs generated “in compliance” findings as a matter of course.

Finally, little needs to be said about the performance measure based on resolving cases of abuse, neglect or exploitation. Zero divided by zero simply does not equal 100%, as the sketch following the table illustrates.

Seven Performance Measures with 100% Scores (data cover the full three-year period, unless otherwise indicated):

  • Care coordination agencies enrolled in year of application: 15 of 15 (100%)
  • Facilities qualified to provide assisted living services: 360 of 360 (100%)
  • Home health agencies maintaining provider qualifications: 1 of 1 each year (100%)
  • Level-of-care determinations performed by registered nurse, as required by program standards: 7,541 of 7,541 (100%)
  • Level-of-care determinations performed with appropriate processes and instruments: 20,236 of 20,236 (100%)
  • Service plans revised to address changed needs identified in reassessment: 1,020 of 1,020 (100%)
  • Resolved cases from the cases of abuse, neglect or exploitation identified in Serious Incident Reports submitted by facilities: 0 of 0 (100%)
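
The arithmetic failure behind the table’s last row is easy to reproduce. The following minimal sketch is hypothetical (DHCS’s actual reporting code is not public); it simply assumes the state’s tooling computed each score as compliant cases divided by reviewed cases, defaulting to 100% when there was nothing to review:

    # Hypothetical sketch; DHCS's actual reporting code is not public.
    def compliance_rate(compliant: int, reviewed: int) -> float:
        """Percent compliance, as a naive reporting script might compute it."""
        if reviewed == 0:
            # The flaw: an empty denominator is treated as full compliance,
            # so a measure that examined nothing still reports a perfect score.
            return 100.0
        return 100.0 * compliant / reviewed

    print(compliance_rate(15, 15))        # 100.0 (care coordination agencies)
    print(compliance_rate(20236, 20236))  # 100.0 (level-of-care determinations)
    print(compliance_rate(0, 0))          # 100.0, yet no cases were examined at all

On this logic, the abuse and neglect measure’s “perfect” score reflects only that no cases were counted, not that any case was resolved.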

The six case-file-review measures all have identical results, indicating that the data are not meaningful.

Six performance measures based on case review produced exactly the same results in each of the three years. The lockstep nature of these results indicates that neither the case review process nor the data are meaningful. As a practical matter, the data suggest that reviewers found compliance whenever a case file was available and intact. In the rare instances in which a file was unavailable or compromised, the reviewer found noncompliance across all six measures, producing identical percentages for every measure, as the sketch following the table illustrates.

Six Performance Measures with Identical Results. Each of the following measures scored 26 of 27 (96%) in Year 1; 75 of 75 (100%) in Year 2; and 478 of 483 (99%) in Year 3:

  • Service plans reflecting the assisted living resident’s needs
  • Service plans submitted within ten days of completing assessment
  • Residents receiving services delivered in accordance with service plan
  • Residents offered choice between assisted living and nursing facility care
  • Service plans not calling for restraints or seclusion
  • Cases indicating that the State monitored overall health care standards

The two measures based on licensure and in-service training requirements are trivial.

Two performance measures – one based on licensure standards, the other on in-service training – have almost-identical, almost-perfect results and are largely inconsequential. Licensure and certification are the bare minimum for participating in the Medicaid program, and in-service training is mandatory. Facility compliance here indicates nothing more than facilities clearing the lowest of bars. Also, the lockstep data over the three years again suggest weakness in data collection, with a distinct possibility that compliance was registered virtually automatically, unless information for a particular facility was inaccessible for some reason.

Two Trivial Performance Measures:

  • Facilities maintained licensure and certification through the year: 189 of 191 (99%) in Year 1; 29 of 30 (97%) in Year 2; 190 of 193 (98%) in Year 3
  • Facilities conducted mandatory in-service staff training: 190 of 191 (99%) in Year 1; 29 of 30 (97%) in Year 2; 190 of 193 (98%) in Year 3

The faulty measure and the two payment-related measures do not provide any information about care received by assisted living residents.

The measure deemed faulty by DHCS is based on facility-submitted Serious Incident Reports, and compares the number of reported incidents of abuse, neglect or exploitation to the total number of reported incidents of any type. In rejecting the measure as faulty, DHCS explains that it had intended the measure to evaluate how well its system resolves problems related to abuse, neglect and exploitation – which the measure obviously fails to do.

The final two measures each relate to payment: the percentage of participants who were enrolled prior to the submission of claims, and the percentage of claims coded and paid in accordance with the reimbursement methodology.

It may be that these two measures offer some value to DHCS in evaluating current payment procedures. Notably, however, these two measures have almost nothing to say about the quality of assisted living care.

Five of the measures relate in some way to assisted living quality, but none are meaningful.

Another way to look at the 18 measures is to focus on the five measures that arguably could relate to assisted living quality. Three of these measures consider only qualifications: the one home health agency that maintained its licensure over the three-year period, facilities’ “perfect” record in maintaining qualifications to provide assisted living services during the relevant year, and facilities’ almost-perfect record in maintaining licensing and certification. A fourth quality-related measure merely records whether the provider offered in-service training.

The fifth measure purportedly documents whether services were delivered in accordance with service plans, but the results from that measure are drawn from the dubious case review process discussed above.

As explained previously, none of these five measures are helpful in evaluating assisted living quality. The measures are often trivial, generating perfect or almost-perfect scores in a way that suggests an effort to produce those high scores rather than to conscientiously examine assisted living quality.

Recommendations: The Federal and California governments should revise their policies to improve assisted living quality, including changes to make performance measures useful to government officials, providers, and consumers.

Specifically, the Federal and California governments should:

  • Monitor assisted living performance in a way that allows for evaluation of individual facilities.
  • Improve quality monitoring of assisted living facilities, including developing and enforcing standards.
  • Use performance measures to address health care disparities.
  • Make performance measure information available to the public.
  • Reorient performance measures towards actual policy improvements.

March 2024 Update

In 2023, DHCS asked for public comment on a draft renewal application for the Assisted Living Waiver (ALW). In response, Justice in Aging and several dozen partners submitted a comment letter that pointed out that the ALW’s performance measures had “little to say about the quality of care received by program participants,” and recommended that the measures be “substantially rewritten … to better measure access, health and welfare.”

Subsequently – and soon after this report’s initial publication – CMS approved California’s application for a five-year renewal, effective March 1, 2024. As one of the “significant changes” to the renewed waiver, the approved waiver application listed “Update Performance Measures.”

The updated waiver measures, however, do not address the problems identified in this report: the renewed waiver calls for the same 18 performance measures. Eight of the performance measures are word-for-word identical; the rest are revised slightly, generally to provide greater specificity in calculating measures. For example, several of the performance measures now specify that data analysis is based on the “waiver year” or the “current waiver year.”