Evidence Based Medicine: Pharmaceutical Funding of Clinical Research



Evidence Based Medicine

Pharmaceutical Funding of Clinical Research

Conflict of interest slide

• This presentation has been brought to you by:

The letter: T

The number: 3

How funding affects research

• In the US, the pharmaceutical industry spends about $15 billion on research

• Together, drug companies spend more money than the NIH on research

Influence of industry

1. Direction of medical research

2. Sharing of information

3. Discontinuation of clinical trials

4. Publication of research results

5. Outcome of clinical research

6. Adaptation of research

Direction of medical research

• When choosing research topics, faculty members with industry support are much more likely to take commercial considerations into account.

– 35% of those with industry funding

– 14% of those without industry funding

Direction of medical research

• Effect of industry on Institutional Review Boards (IRBs)

– 36% of IRB members receive industry funding

– IRB members felt that industry funding influenced IRB-related decisions 14.4% of the time

Campbell EG et al. NEJM 2006;355:2321-9

Sharing of information

• Sharing of scientific information is necessary to avoid duplication of research and helps to increase the pace of innovation.

• Faculty members with industry support are almost twice as likely to refuse to share research results.

Publication of research results

• Receiving money from commercial sources can lead to suppression or delay of publication.

– Faculty who delay publication of results by more than 6 months are almost twice as likely to receive funding from drug companies.

Stopping clinical trials

• CONVINCE trial: compared verapamil to either HCTZ or atenolol

– 16,000 patients with a planned mean follow-up of 5 years

– Stopped after 3 years "for commercial reasons"

– Preliminary analysis showed a nonsignificant increase in the combined end-point of death, stroke, or MI (4.9% vs. 4.7%)

• Significant increase in the risk of heart failure and of death or hospitalization due to bleeding

Stopping of clinical trials

• Other examples of trials stopped early

– Paclitaxel for ovarian carcinoma

– N-acetylcysteine in dialysis patients to reduce erythropoietin need

– Iron chelation therapy for transfusion-dependent thalassemia

• The sponsor then issued a legal warning to prevent the investigator from talking about the results

– Many other examples

Psaty and Rennie in JAMA 2003;289:2128-2131

Oversight of research results

• Unlike 20 years ago, many companies now design the trials themselves and bring in outside investigators to run them.

• The raw data are often stored at the company, and investigators may only receive portions of the data.

– One executive explained why: "We are reluctant to provide the data tape because some investigators want to take the data beyond where the data should go."

Bodenheimer in NEJM 2000;342:1539-1544

Oversight of research results

• Accuracy of RCT data

• Study of 37 reanalyses that took raw data and re-evaluated it

• "35% of published reanalyses led to changes in findings that implied conclusions different from those of the original article about the types and number of patients that should be treated."

– 4 changed the direction of the treatment effect

– 2 changed the magnitude of the effect

– 3 showed that different patients should be treated

– 9 showed that more patients should be treated

Ebrahim et al. in JAMA 2014;312(10):1024-32

Oversight of research results: How the article is written

• Drug companies employ "ghostwriters" who write the article and may be instructed to insert favorable wording.

– Ghostwriters are not named in the study.

• Named authors are instead used to enhance the prestige of the article.

Bodenheimer in NEJM 2000;342:1539-1544

Control over publication

• 30-50% of companies that fund research submit contracts with clauses that may allow them to stop publication based on the study findings.

Bodenheimer in NEJM 2000;342:1539-1544

Control over publication

• A Canadian investigator was sued by Apotex after she reported results that showed her study drug (deferiprone) worsened hepatic fibrosis.

Control over publication

• Abbott Pharmaceuticals refused to allow a researcher to publish a study that showed that their Synthroid did not work better than generic preparations of levothyroxine.

Control over publication

• For the NEJM article, 6 investigators interviewed for the report cited cases of articles whose publication was stopped or whose content was changed.

– The actual frequency of this occurring is unknown.

• One "investigator found that a drug he was studying caused adverse reactions... the company vowed never to fund his work again and published a competing article with scant mention of the adverse effect."

Bodenheimer in NEJM 2000;342:1539-1544

Control over publication

• A 2007 study evaluated clinical-trial agreements with academic institutions.

– 15% of academic institutions agreed to allow industry sponsors to revise manuscripts or to cancel publication.

– 50% allowed industry sponsors to draft manuscripts.

Mello et al. in NEJM 2007;352(21):2202-10

SSRIs in pediatric depression

• Several years ago, published trials supported a favorable risk-benefit profile for SSRIs in pediatric depression.

• Unpublished data of paroxetine showed increased incidence of suicidal behavior and no benefit compared to placebo.

Rennie in JAMA 2004;292:1359-1362

Publication bias

• Publication of only the data supporting an intervention leads to an inflated perception of its benefit

• Mandatory trial registration should decrease this

Publication Bias

[Figure: funnel plot]

Publication Bias

Rutjes AW et al. Viscosupplementation for Osteoarthritis of the Knee. Ann Intern Med 2012;157(3):180-191

Publication Bias

Turner et al. NEJM. 2008;358:252-60.

Comparison of outcomes from registered trials

• A 2009 study in JAMA assessed mandatory trial registration.

• Only 45% of published trials were registered correctly.

• 31% of trials showed evidence of selective reporting.

– The outcome registered differed from the outcome published.

Mathieu et al. in JAMA 2009;302(9):977-984

Outcome of clinical research

Association of industry funding and statistically significant findings

• Consecutive series of randomized controlled trials published in journals between 1/99 and 6/01

Bhandari in CMAJ 2004;170:477-480

Industry funding

• Analyzed 332 RCTs

• Industry funding was associated with a significant result in favor of the new industry product (OR 1.9; see the worked odds-ratio example after this slide)

• When these results were combined with additional studies of 1,240 trials, the association was found to be higher (OR 2.3)

Industry funding

• Observational study of 370 randomized drug trials

• Concluded that the best predictor of whether the experimental treatment is recommended is how the study was funded (OR 5.3)

Als-Nielsen et al in JAMA 2003;290:921-928

Effect of funding on conclusion

Source of funding               % recommending experimental drug
Nonprofit organization          16%
No funding reported             30%
Both nonprofit and for-profit   35%
For-profit organization         51%

Effect of funding on conclusion

• This association could not be explained by:

– Study quality

– Type of control

– Sample size

– Year of publication

– Publication in high impact journal

Industry sponsored research

Possible explanations of why industry-sponsored research shows favorable results:

1. Drug companies may fund trials of products they consider to be superior.

• No evidence that researchers can predict trial outcomes

2. Positive results may be the result of poor-quality research.

• Low-quality studies exaggerate the benefits of treatment by an average of 34%.

• Research sponsored by drug companies was found to be at least as good as non-industry-funded research.

Industry sponsored research

3. Are appropriate comparison drugs selected?

• In studies where two drugs were compared and the doses were not equivalent, the higher-dose drug was the sponsor's drug.

• The comparison drug may not be equivalent, e.g. PO fluconazole vs. PO amphotericin B.

• Using atenolol as the control: atenolol has not been shown to reduce cardiovascular outcomes.

4. Publication bias

• Negative studies are not published

Lexchin in BMJ 2003;326:1167-77
Carlberg et al. in Lancet 2004;364:1684-1689

Industry sponsored research

5. Selection of the patient population

• To keep side effects low, use a younger population

• To increase efficacy, exclude everyone but those most likely to benefit

• Only about 10% of patients in California receiving acetylcholinesterase inhibitors for dementia would have been considered for inclusion in the initial drug company studies

Industry sponsored research

6. Distort or 'spin' the results of RCTs with nonsignificant results for the primary outcome

– Review of RCTs published in December 2006; 72 of 616 reports were eligible

– Examples of spin: focusing on subgroup analyses or only on significant results

– 18% of trials had 'spin' in the title

– 37.5% had 'spin' in the Results and Conclusion sections

Boutron et al. in JAMA 2010;303(20):2058-2064

Adaptation of clinical research

• The drug industry used to spend $8,000 to $15,000 per physician every year on direct marketing.

• There were 90,000 drug reps in the United States.

– One for every 4.7 office-based physicians.

Blumenthal in NEJM 2004;351:1885-1890.

Effects of drug reps

• Pharmaceutical reps focus on the 3 F's:

– Food

– Flattery

– Friendship

• Sociologists have found that after receiving gifts, it is natural that the recipient feels indebted.

How do drug rep interactions, drug samples, food, and small gifts affect physicians?

• Evidence suggests that physicians who receive such gifts:

1. Are more likely to request the inclusion of the company's drugs on formularies.

2. Are more likely to prescribe the company's drug.

3. Are less likely to prescribe generic medications.

Effect of direct-to-physician marketing

• There is evidence that the more gifts physicians receive, the more likely they are to believe that gifts do not affect their prescribing.

• The practice of giving small gifts ended in 2009

Drug rep money at UMC/VA

• A one-hour guest lecturer for a management conference at the VA receives $2,500-$3,000 for the lecture.

• An average lunch at UMC used to cost $150-$250.

Adaptation of clinical research: "seeding trials"

• “Clinical trials, deceptively portrayed as patient studies, which are used to promote drugs recently approved or under review by the FDA”

• Used to expose physicians to new drugs and have them interact with sales reps.

• Physician "investigators" are the actual trial subjects.

Krumholz et al. Arch Intern Med 2011;171(12):1100-1107

Adaptation of clinical research: "seeding trials"

• Examples of "seeding trials":

– Vioxx vs. naproxen (ADVANTAGE)

– Neurontin, titrate to effect (STEPS)

• There is no current estimate of how often this occurs.

• Most of the evidence of seeding trials comes from tort litigation documents against pharmaceutical companies.

Krumholz et al. Arch Intern Med 2011;171(12):1100-1107

Adaptation of clinical research: "seeding trials"

• STEPS study: titrate to effect, profile of safety

– Uncontrolled and unblinded study designed to assess efficacy, safety, and tolerability

– Site investigators had little or no clinical trial experience

– Sales reps collected individual subject trial data

– The company recruited 5,000 "investigators", with 1,500 attending the introductory study

• The study showed a 38% increase in prescriptions after attendance

– Physicians were given a free lunch after recruiting 4 patients and a free dinner after 7

Krumholz et al. Arch Intern Med 2011;171(12):1100-1107

Adaptation of clinical research: "seeding trials"

• 2,759 patients were enrolled in STEPS

– 11 died

– 73 had serious adverse effects

– 997 experienced less serious adverse effects

• Seeding trials are not illegal and are not regulated by the FDA

Krumholz et al. Arch Intern Med 2011;171(12):1100-1107