Child-Maltreatment-Research-L (CMRL) List Serve

Database of Past CMRL Messages


Message ID: 7980
Date: 2009-01-05

Author: Chaffin, Mark J. (HSC)

Subject: RE: Evidence-based and "Mix and Match" Programs

Tom,



Yaiiii....where to start. There is considerable interest in the idea of extracting common elements from across evidence-based models, then applying these depending on assessed case characteristics and some systematic algorithm. Probably the most detailed system for how this process might be undertaken has been described by Chorpita and colleagues. Note, however, that the processes for identifying both the elements themselves and the matching algorithm, as described by Chorpita, are NOT just a matter of logic-model 'mix and match' eyeballing based on the clinician's gut or personal preferences. It is a quite structured and quantitative process. It is still a protocol (and a complicated one at that), and not at all the same thing as the free-styling, fly-by-the-seat-of-the-pants services often advocated by the anti-EBT crowd. If anything, it is a system probably requiring more expertise and training than most simple EBT's. How well these algorithm-driven, elements-based systems work in practice is a subject of ongoing study; the jury is not in. Some EBT's, such as MST, have done essentially this same thing for years, and it's worked out well. MST, for example, is not a single protocol, but is assessment driven and tailored, yet it focuses the intervention on what we know matters, using basic elements that are known to work. In this sense, there are fairly complex EBT's, such as MST, which target broad populations, and more specific EBT's, such as TF-CBT, which are designed for particular types of well-defined problems (i.e., PTSD).



The elements-based idea does have a clear appeal--EBT's are probably not monolithic entities incapable of being subdivided. Most within a given domain share common elements (e.g., most evidence-based parenting programs use labeled praise, structured time-out protocols, etc.). However, I would be very skeptical of efforts to use this rationale to neuter EBT's, or to simply say "oh, we're already doing all that" or "this is the same thing" when that really isn't the case. One need look no further than the Blueprints implementations to find evidence of this, where well-intended shooting from the hip led to blending in ad hoc crap with EBT's and spoiled the results. The point is that the elements-based approach is not an excuse to allow anything and everything to come into the intervention, or to mix-and-match without some fairly tight limits, such as those described by Chorpita and colleagues.



The area of most concern in your question lies in the idea of whether more is better. There is excellent evidence at this point that this is not only false but that more can become harmful. For example, where parenting interventions are concerned, it appears that adding additional services to a parenting program actually poisons the benefits (see the Kaminski et al. meta-analysis). CPS and courts are notorious for this misconception--often prescribing so many services that whatever benefits any one of them might have offered may be quickly lost in the confusion. The analogy to "polypharmacy" in psychiatry is not a bad one. Focus (i.e., directly and behaviorally targeting the top priority that needs to change), rather than comprehensiveness (trying to fix everything), is the new watchword in many service systems. Exactly how much is too much, and what demands the highest priority, is an unanswered empirical question. But the emerging science in this area does, IMO, suggest a couple of general principles. First, tailoring might work...IF the elements are selected carefully based on scientific evidence and are clearly essential common components across EBT's. Second, there are clearly both practical and therapeutic limits to how many things can be done well at once; that limit is reached rapidly, and past it the overall service benefit quickly begins to erode. So "focus," not "comprehensive," should be the watchword. We need to emphasize this because practitioners have been so imbued with the idea that "comprehensive" is necessary that it takes considerable effort to disabuse them of this unfortunate misconception. This is another reason why any novel algorithm-driven, elements-based protocol needs to be structured, and why it needs to be rigorously evaluated. Keep in mind that implementations of novel blended or elements-based programs cannot properly be called evidence-based just because the sources for the elements were evidence-based. For that matter, they cannot even be presumed to be effective, although we might predict that they would be.



Mark





________________________________________

From: Tom Hanna [tph3@cornell.edu]

Sent: Thursday, December 18, 2008 8:25 AM

Subject: Evidence-based and "Mix and Match" Programs



On another list, there is an active discussion underway about adding an "ancillary" parenting education program to an existing "core" home visitation program. The conversation quickly turned to the topic of "evidence-based," and then to funders and their requirements.

The picture quickly got cloudy for me:

1. Some folks who already have an ancillary parenting education program reported that "blending" aspects of two evidence-based programs allowed them to tailor the trainings to the specific needs of their "home visited" parents. Others quickly pointed out that this is "wrong" and should not be done -- neither evidence-based program is being followed precisely, and therefore both are "contaminated." Funders frown.

2. No one has said what additive effect, if any, is expected from providing a parenting program on top of a home visitation program. The underlying assumption is that families will be better off with two distinct interventions instead of one. (In fact, many centers in this home visiting network have many ancillary programs that serve some if not many of their home visited families.)

3. I know that lots of funders are demanding that agencies use "evidence-based" programs. But I now learn that funders are pushing implementation of a "matrix" of "evidence-based" programs. The underlying assumption is that "if one evidence-based program is good for families, then many are better."

My question: Is there any research that helps multi-service agencies make their way through this minefield when working with a cohort of families?

-- Any study of the "deterioration of effects" from blending two evidence-based models for the same intervention?

-- Any classical studies of "additive effects" of multiple targeted interventions?

-- Any evidence that a "matrix" of evidence-based programs has a stronger effect than a "pure" one-program approach?

-- Any analysis showing whether evidence-based programs for different interventions (home visitation vs. parenting ed vs. therapy groups) are (or are not) internally consistent? (i.e., my doctor gave me one instruction about diet, my nutritionist gave me a contradictory instruction, and my home visitor's instruction differed from the other two.)



TIA

Tom

--

Tom Hanna, Director

Child Abuse Prevention Network

www.child-abuse.com

tom@child-abuse.com

tph3@cornell.edu

off 607.275.9360

cel 607.227.4524

fax: 415.962.0510

--


