BACKGROUND: In US clinical practice, many patients who undergo placement of an implantable cardioverter-defibrillator (ICD) for primary prevention of sudden cardiac death receive dual-chamber devices. The superiority of dual-chamber over single-chamber devices in reducing the risk of inappropriate ICD shocks in clinical practice has not been established. The objective of this study was to compare risk of adverse outcomes, including inappropriate shocks, between single- and dual-chamber ICDs for primary prevention.
METHODS AND RESULTS: We identified patients receiving a single- or dual-chamber ICD for primary prevention who did not have an indication for pacing from 15 hospitals within 7 integrated health delivery systems in the Longitudinal Study of Implantable Cardioverter-Defibrillators from 2006 to 2009. The primary outcome was time to first inappropriate shock. ICD shocks were adjudicated for appropriateness. Other outcomes included all-cause hospitalization, heart failure hospitalization, and death. Patient-, clinician-, and hospital-level factors were accounted for using propensity score weighting methods. Among 1042 patients without pacing indications, 54.0% (n=563) received a single-chamber device and 46.0% (n=479) received a dual-chamber device. In a propensity-weighted analysis, device type was not significantly associated with inappropriate shock (hazard ratio, 0.91; 95% confidence interval, 0.59-1.38; P=0.65), all-cause hospitalization (hazard ratio, 1.03; 95% confidence interval, 0.87-1.21; P=0.76), heart failure hospitalization (hazard ratio, 0.93; 95% confidence interval, 0.72-1.21; P=0.59), or death (hazard ratio, 1.19; 95% confidence interval, 0.93-1.53; P=0.17).
CONCLUSIONS: Among patients who received an ICD for primary prevention without indications for pacing, dual-chamber devices were not associated with a lower risk of inappropriate shock or with differences in hospitalization or death compared with single-chamber devices. These findings do not support the use of dual-chamber devices for the purpose of minimizing inappropriate shocks.
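The propensity score weighting described above can be illustrated with a minimal inverse-probability-of-treatment-weighting (IPTW) sketch. Everything below is simulated for illustration; the confounder, treatment, outcome, and effect sizes are invented and are not from the study.

```python
import numpy as np

# Toy illustration of inverse probability of treatment weighting (IPTW),
# one common propensity score weighting method. All data are simulated.
rng = np.random.default_rng(0)
n = 5000
x = rng.normal(size=n)                    # a confounder (e.g., comorbidity burden)
p = 1 / (1 + np.exp(-x))                  # true propensity of receiving treatment
t = rng.binomial(1, p)                    # treatment indicator (e.g., dual-chamber)
y = 2 * x + rng.normal(size=n)            # outcome depends on x only: true effect is 0

# Unadjusted comparison is confounded by x
naive = y[t == 1].mean() - y[t == 0].mean()

# Weight each patient by the inverse probability of the treatment received
w = np.where(t == 1, 1 / p, 1 / (1 - p))
iptw = (np.average(y[t == 1], weights=w[t == 1])
        - np.average(y[t == 0], weights=w[t == 0]))
# iptw is close to the null effect; naive is not
```

The weighted comparison balances the confounder across treatment groups, which is why the weighted hazard ratios in the abstract can be read as adjusted contrasts.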
Objective: To compare healthcare delivery for acute myocardial infarction (AMI) between contrasting health systems, using comparable representative data from Europe and the USA.
Design: Repeated cross-sectional retrospective cohort study.
Setting: Acute care hospitals in Portugal and the USA during 2000-2010.
Participants: Adults discharged with AMI.
Interventions: Coronary revascularization procedures (percutaneous coronary intervention (PCI), coronary artery bypass graft (CABG) surgery).
Main Outcome Measures: In-hospital mortality and length of stay.
Results: We identified 1 566 601 AMI hospitalizations. Relative to the USA, more hospitalizations in Portugal presented with ST-segment elevation, and fewer had documented comorbidities. Age- and sex-adjusted AMI hospitalization rates decreased in the USA but increased in Portugal. Crude procedure rates were generally lower in Portugal (PCI: 44% vs. 47%; CABG: 2% vs. 9%, 2010), but only CABG rates differed significantly after standardization. PCI use increased annually in both countries, but CABG use decreased only in the USA (odds ratios: USA 0.95 [0.94, 0.95]; Portugal 1.04 [1.02, 1.07]). Both countries observed annual decreases in risk-adjusted mortality (hazard ratios: USA 0.97 [0.965, 0.969]; Portugal 0.99 [0.979, 0.991]). While between-hospital variability in procedure use was larger in the USA, the risk of dying in a high- relative to a low-mortality hospital (hospitals at the 95th and 5th percentiles) was 2.65 in Portugal versus only 1.03 in the USA.
Conclusions: Although in-hospital mortality due to an AMI improved in both countries, patient management in the USA appears more effective, and alarming disparities in quality of care across hospitals are more likely to exist in Portugal.
Background To isolate hospital effects on risk-standardized hospital-readmission rates, we examined readmission outcomes among patients who had multiple admissions for a similar diagnosis at more than one hospital within a given year. Methods We divided the Centers for Medicare and Medicaid Services hospital-wide readmission measure cohort from July 2014 through June 2015 into two random samples. All the patients in the cohort were Medicare recipients who were at least 65 years of age. We used the first sample to calculate the risk-standardized readmission rate within 30 days for each hospital, and we classified hospitals into performance quartiles, with a lower readmission rate indicating better performance (performance-classification sample). The study sample (identified from the second sample) included patients who had two admissions for similar diagnoses at different hospitals that occurred more than 1 month and less than 1 year apart, and we compared the observed readmission rates among patients who had been admitted to hospitals in different performance quartiles. Results In the performance-classification sample, the median risk-standardized readmission rate was 15.5% (interquartile range, 15.3 to 15.8). The study sample included 37,508 patients who had two admissions for similar diagnoses at a total of 4272 different hospitals. The observed readmission rate was consistently higher among patients admitted to hospitals in a worse-performing quartile than among those admitted to hospitals in a better-performing quartile, but the only significant difference was observed when the patients were admitted to hospitals in which one was in the best-performing quartile and the other was in the worst-performing quartile (absolute difference in readmission rate, 2.0 percentage points; 95% confidence interval, 0.4 to 3.5; P=0.001). 
Conclusions When the same patients were admitted with similar diagnoses to hospitals in the best-performing quartile as compared with the worst-performing quartile of hospital readmission performance, there was a significant difference in rates of readmission within 30 days. The findings suggest that hospital quality contributes in part to readmission rates independent of factors involving patients. (Funded by Yale-New Haven Hospital Center for Outcomes Research and Evaluation and others.).
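Risk-standardized readmission rates of the kind used above generally take a predicted-over-expected form. The sketch below shows that shape only; the counts are invented, and the actual CMS measure derives both quantities from hierarchical logistic models rather than raw counts.

```python
# Sketch of the predicted/expected form behind risk-standardized
# readmission rates (RSRR). Numbers are invented for illustration;
# the real measure estimates both terms from hierarchical models.
national_rate = 0.155   # e.g., roughly the median rate reported in this cohort
predicted = 62.0        # model-predicted readmissions, including the hospital's own effect
expected = 55.0         # readmissions expected for the same patients at an "average" hospital
rsrr = predicted / expected * national_rate   # above the national rate => worse than average
```

Because the same case mix appears in numerator and denominator, the ratio isolates the hospital's contribution, which is the logic this study tests by following the same patients across hospitals.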
New state-level health insurance markets, denoted Marketplaces, created under the Affordable Care Act, use risk-adjusted plan payment formulas derived from a population that differs from the one expected to participate in the Marketplaces. We develop methodology to derive a sample from the target population and to assemble information to generate improved risk-adjusted payment formulas using data from the Medical Expenditure Panel Survey and Truven MarketScan databases. Our approach requires multi-stage data selection and imputation procedures because both data sources have systemic missing data on crucial variables and arise from different populations. We present matching and imputation methods adapted to this setting. The long-term goal is to improve risk-adjustment estimation utilizing information found in Truven MarketScan data supplemented with imputed Medical Expenditure Panel Survey values.
OBJECTIVES: To evaluate the incidence and characteristics of nursing home (NH) use after implantable cardioverter-defibrillator (ICD) implantation.
DESIGN: Cohort study.
SETTING: Medicare beneficiaries in the National Cardiovascular Data Registry-ICD Registry.
PARTICIPANTS: Individuals aged 65 and older receiving ICDs between January 1, 2006, and March 31, 2010 (N = 192,483).
MEASUREMENTS: Proportion of ICD recipients discharged to NHs directly after device placement, cumulative incidence of long-term NH admission, and factors associated with immediate discharge to a NH and time to long-term NH admission.
RESULTS: Over 4 years, 40.6% of the cohort died, and 35,939 (18.7%) experienced at least one NH admission, including 4.0% directly discharged to a NH after ICD implantation and 2.8% admitted to long-term NH care during follow-up. The cumulative incidence of long-term NH admission, accounting for the competing risk of death, was 1.7% at 1 year, 3.8% at 3 years, and 4.6% at 4 years; 20.1% of individuals admitted to a NH died there. Factors most strongly associated with direct NH discharge and time to long-term NH care were older age (adjusted odds ratio (AOR) = 2.09, 95% confidence interval (CI) = 2.01-2.17 per 10-year increment; adjusted hazard ratio (AHR) = 1.88, 95% CI = 1.80-1.97, respectively), dementia (AOR = 2.60, 95% CI = 2.25-3.01; AHR = 2.50, 95% CI = 2.14-2.93, respectively), and Medicare Part A claim for NH stay in prior 6 months (AOR = 3.96, 95% CI = 3.70-4.25; AHR = 2.88, 95% CI = 2.65-3.14, respectively).
CONCLUSION: Nearly one in five individuals was admitted to a NH over a median of 1.6 years of follow-up after ICD implantation. Understanding these outcomes may help inform the clinical care of these individuals.
BACKGROUND: The process of assuring the safety of medical devices is constrained by reliance on voluntary reporting of adverse events. We evaluated a strategy of prospective, active surveillance of a national clinical registry to monitor the safety of an implantable vascular-closure device that had a suspected association with increased adverse events after percutaneous coronary intervention (PCI).
METHODS: We used an integrated clinical-data surveillance system to conduct a prospective, propensity-matched analysis of the safety of the Mynx vascular-closure device, as compared with alternative approved vascular-closure devices, with data from the CathPCI Registry of the National Cardiovascular Data Registry. The primary outcome was any vascular complication, which was a composite of access-site bleeding, access-site hematoma, retroperitoneal bleeding, or any vascular complication requiring intervention. Secondary safety end points were access-site bleeding requiring treatment and postprocedural blood transfusion.
RESULTS: We analyzed data from 73,124 patients who had received Mynx devices after PCI procedures with femoral access from January 1, 2011, to September 30, 2013. The Mynx device was associated with a significantly greater risk of any vascular complication than were alternative vascular-closure devices (absolute risk, 1.2% vs. 0.8%; relative risk, 1.59; 95% confidence interval [CI], 1.42 to 1.78; P<0.001); there was also a significantly greater risk of access-site bleeding (absolute risk, 0.4% vs. 0.3%; relative risk, 1.34; 95% CI, 1.10 to 1.62; P=0.001) and transfusion (absolute risk, 1.8% vs. 1.5%; relative risk, 1.23; 95% CI, 1.13 to 1.34; P<0.001). The initial alerts occurred within the first 12 months of monitoring. Relative risks were greater in three prespecified high-risk subgroups: patients with diabetes, those 70 years of age or older, and women. All safety alerts were confirmed in an independent sample of 48,992 patients from April 1, 2014, to September 30, 2015.
CONCLUSIONS: A strategy of prospective, active surveillance of a clinical registry rapidly identified potential safety signals among recipients of an implantable vascular-closure device, with initial alerts occurring within the first 12 months of monitoring. (Funded by the Food and Drug Administration and others.).
BACKGROUND: Regulators must act to protect the public when evidence indicates safety problems with medical devices. This requires complex tradeoffs among risks and benefits, which conventional safety surveillance methods do not incorporate.
OBJECTIVE: To combine explicit regulator loss functions with statistical evidence on medical device safety signals to improve decision making.
METHODS: In the Hospital Cost and Utilization Project National Inpatient Sample, we select pediatric inpatient admissions and identify adverse medical device events (AMDEs). We fit hierarchical Bayesian models to the annual hospital-level AMDE rates, accounting for patient and hospital characteristics. These models produce expected AMDE rates (a safety target), against which we compare the observed rates in a test year to compute a safety signal. We specify a set of loss functions that quantify the costs and benefits of each action as a function of the safety signal. We integrate the loss functions over the posterior distribution of the safety signal to obtain the posterior (Bayes) risk; the preferred action has the smallest Bayes risk. Using simulation and an analysis of AMDE data, we compare our minimum-risk decisions to a conventional Z score approach for classifying safety signals.
RESULTS: The 2 rules produced different actions for nearly half of hospitals (45%). In the simulation, decisions that minimize Bayes risk outperform Z score-based decisions, even when the loss functions or hierarchical models are misspecified.
LIMITATIONS: Our method is sensitive to the choice of loss functions; eliciting quantitative inputs to the loss functions from regulators is challenging.
CONCLUSIONS: A decision-theoretic approach to acting on safety signals is potentially promising but requires careful specification of loss functions in consultation with subject matter experts.
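The decision rule in the methods above can be sketched in a few lines: compute each action's expected loss over the posterior of the safety signal and choose the smaller. The posterior draws, loss magnitudes, and Z threshold below are all invented for illustration, not taken from the study.

```python
import numpy as np

# Toy version of a minimum-Bayes-risk decision on a safety signal.
# Posterior draws and loss values are invented for illustration.
rng = np.random.default_rng(1)
expected_rate = 0.05                               # model-based safety target
posterior_rate = rng.normal(0.08, 0.02, 10_000)    # posterior draws of a hospital's AMDE rate
signal = posterior_rate - expected_rate            # safety signal: excess over target

def loss(action, s):
    if action == "investigate":
        return np.full_like(s, 0.02)               # hypothetical fixed cost of acting
    return np.maximum(s, 0.0)                      # inaction costs scale with excess risk

# Integrate each loss over the posterior; pick the action with smallest Bayes risk
bayes_risk = {a: loss(a, signal).mean() for a in ("no_action", "investigate")}
decision = min(bayes_risk, key=bayes_risk.get)

# A conventional rule flags only a large standardized signal
z = signal.mean() / signal.std()
z_flags = z > 2                                    # here the two rules disagree
```

With this invented posterior, the Bayes-risk rule prefers investigating while the Z-score rule does not, illustrating how the two approaches can diverge, as the study reports for 45% of hospitals.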
OBJECTIVES: To investigate the association between hospital safety culture and 30-day risk-adjusted mortality for Medicare patients with acute myocardial infarction (AMI) in a large, diverse hospital cohort.
SUBJECTS: The final analytic cohort consisted of 19,357 Medicare AMI discharges (MedPAR data) linked to 257 AHRQ Hospital Survey on Patient Safety Culture surveys from 171 hospitals between 2008 and 2013.
STUDY DESIGN: Observational, cross-sectional study using hierarchical logistic models to estimate the association between hospital safety scores and 30-day risk-adjusted patient mortality. Odds ratios of 30-day all-cause mortality were estimated for several types of safety culture scores (composite, average, and overall), adjusting for patient covariates and hospital characteristics (size and teaching status).
PRINCIPAL FINDINGS: No significant association was found between any measure of hospital safety culture and adjusted AMI mortality.
CONCLUSIONS: In a large cross-sectional study from a diverse hospital cohort, AHRQ safety culture scores were not associated with AMI mortality. Our study adds to a growing body of investigations that have failed to conclusively demonstrate a safety culture-outcome association in health care, at least with widely used national survey instruments.
OBJECTIVE: Second-generation antipsychotics increase the risk of diabetes and other metabolic conditions among individuals with schizophrenia. Although metabolic testing is recommended to reduce this risk, low testing rates have prompted concerns about negative health consequences and downstream medical costs. This study simulated the effect of increasing metabolic testing rates on ten-year prevalence rates of prediabetes and diabetes (diabetes conditions) and their associated health care costs.
METHODS: A microsimulation model (N=21,491 beneficiaries) with a ten-year time horizon was used to quantify the impacts of policies that increased annual testing rates in a Medicaid population with schizophrenia. Data sources included California Medicaid data, National Health and Nutrition Examination Survey data, and the literature. In the model, metabolic testing increased diagnosis of diabetes conditions and diagnosis prompted prescribers to switch patients to lower-risk antipsychotics. Key inputs included observed diagnoses, prescribing rates, annual testing rates, imputed rates of undiagnosed diabetes conditions, and literature-based estimates of policy effectiveness.
RESULTS: Compared with 2009 annual testing rates, policies that achieved universal testing reduced ten-year exposure to higher-risk antipsychotics by 14%, time to diabetes diagnosis by 57%, and diabetes prevalence by 0.6%. These policies were associated with higher spending because of testing and earlier treatment.
CONCLUSIONS: The model showed that policies promoting metabolic testing provided an effective approach to improve the safety of second-generation antipsychotic prescribing in a Medicaid population with schizophrenia; however, the policies led to additional costs at ten years. Simulation studies are a useful source of information on the potential impacts of these policies.
BACKGROUND: Risk-adjustment algorithms typically incorporate demographic and clinical variables to equalize compensation to insurers for enrollees who vary in expected cost, but including information about enrollees' socioeconomic background is controversial.
METHODS: We studied 1 182 847 continuously insured 0- to 19-year-olds using 2008-2012 Blue Cross Blue Shield of Massachusetts and American Community Survey data. We characterized enrollees' socioeconomic background using a validated area-based socioeconomic measure and calculated annual plan payments using paid claims. We evaluated the relationship between annual plan payments and geocoded socioeconomic background using generalized estimating equations (γ distribution and log link). We expressed outcomes as the percentage difference in spending and utilization between enrollees with high and low socioeconomic backgrounds.
RESULTS: Geocoded socioeconomic background had a significant, positive association with annual plan payments after applying standard adjusters. Every 1 SD increase in socioeconomic background was associated with a 7.8% (95% confidence interval, 7.2% to 8.3%; P < .001) increase in spending. High socioeconomic background enrollees used higher-priced outpatient and pharmacy services more frequently than their counterparts from low socioeconomic backgrounds (eg, 25% more outpatient encounters annually; 8% higher price per encounter; P < .001), which outweighed greater emergency department spending among low socioeconomic background enrollees.
CONCLUSIONS: Higher socioeconomic background is associated with greater levels of pediatric health care spending in commercially insured children. Including socioeconomic information in risk-adjustment algorithms may address concerns about adverse selection from an economic perspective, but it would direct funds away from those caring for children and adolescents from lower socioeconomic backgrounds who are at greater risk of poor health.
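A gamma model with a log link, as used in the methods above, makes coefficients interpretable multiplicatively: exp(β) − 1 is the percent change in spending per one-SD increase in the predictor. Below is a minimal sketch fitting such a model by iteratively reweighted least squares on simulated data; the ~7.8%-per-SD effect is mimicked for illustration, and the within-enrollee GEE correlation structure is omitted.

```python
import numpy as np

# Minimal gamma GLM with log link, fit by IRLS (for the gamma/log pair the
# working weights are 1, so each step is OLS on a working response).
# Data are simulated; the ~7.8%/SD effect size mimics the text.
rng = np.random.default_rng(2)
n = 4000
ses = rng.normal(size=n)                         # standardized SES measure
mu = np.exp(6.0 + np.log(1.078) * ses)           # mean spending rises ~7.8% per SD
y = rng.gamma(shape=2.0, scale=mu / 2.0)         # gamma outcome with mean mu

X = np.column_stack([np.ones(n), ses])
beta, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)   # initialize on the log scale
for _ in range(25):
    eta = X @ beta
    m = np.exp(eta)
    z = eta + (y - m) / m                        # IRLS working response
    beta, *_ = np.linalg.lstsq(X, z, rcond=None)

pct_per_sd = (np.exp(beta[1]) - 1) * 100         # percent spending change per 1 SD
```

The log link keeps fitted spending positive and right-skewed spending well behaved, which is why this family is a common choice for claims data.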
BACKGROUND: People admitted to psychiatric hospitals with a diagnosis of schizophrenia may display behavioural problems. These may require management approaches such as coercive practices, which affect the well-being of staff members, visiting families and friends, and peers, as well as patients themselves. Studies have proposed that not only patients' conditions but also the treatment environment and ward culture may affect patients' behaviour. Seclusion and restraint could possibly be prevented through staff education about user-centred, more humane approaches. Staff education could also increase collaboration between patients, family members and staff, which may further improve treatment culture and lower the need for coercive treatment methods.
METHODS: This is a single-blind, two-arm cluster randomised controlled trial involving 28 psychiatric hospital wards across Finland. Units will be randomised to receive either a staff educational programme delivered by the team of researchers or standard care. The primary outcome is the incidence of use of patient seclusion rooms, assessed from the local/national health registers. Secondary outcomes include use of other coercive methods (limb restraint, forced injection, and physical restraint), service use, treatment satisfaction, general functioning among patients, and team climate and employee turnover (nursing staff).
DISCUSSION: The study, designed in close collaboration with staff members, patients and their relatives, will provide evidence for a co-operative and user-centred educational intervention aiming to decrease the prevalence of coercive methods and service use in the units, increase the functional status of patients and improve team climate in the units. We have identified no similar trials.
TRIAL REGISTRATION: ClinicalTrials.gov NCT02724748. Registered on 25 April 2016.
There is an active public debate about whether patients' socioeconomic status should be included in the readmission measures used to determine penalties in Medicare's Hospital Readmissions Reduction Program (HRRP). Using the current Centers for Medicare and Medicaid Services methodology, we compared risk-standardized readmission rates for hospitals caring for high and low proportions of patients of low socioeconomic status (as defined by their Medicaid status or neighborhood income). We then calculated risk-standardized readmission rates after additionally adjusting for patients' socioeconomic status. Our results demonstrate that hospitals caring for large proportions of patients of low socioeconomic status have readmission rates similar to those of other hospitals. Moreover, readmission rates calculated with and without adjustment for patients' socioeconomic status are highly correlated. Readmission rates of hospitals caring for patients of low socioeconomic status changed by approximately 0.1 percent with adjustment for patients' socioeconomic status, and only 3-4 percent fewer such hospitals reached the threshold for payment penalty in Medicare's HRRP. Overall, adjustment for socioeconomic status does not change hospital results in meaningful ways.
Approved medical devices frequently undergo FDA-mandated post-approval studies (PAS). However, there is uncertainty as to the value of PAS in assessing the safety of medical devices, and the cost of these studies to the healthcare system is unknown. Because PAS are funded by device manufacturers, who do not share cost information with regulators, we estimated total PAS costs by interviewing a panel of experts in medical device clinical trial design, designing a general cost model for PAS, and applying it to the FDA-mandated PAS. A total of 277 PAS were initiated between March 1, 2005, and June 30, 2013, with a median cost of $2.16 million per study and an overall cost of $1.22 billion over the 8.25-year study period. Although these costs are funded by manufacturers, the ultimate cost is borne by the healthcare system through medical device prices. Given concerns regarding the informational value of PAS, the resources used to support mandated PAS may be better allocated to other approaches to assuring safety.
BACKGROUND: Little is known regarding the relationship between hospital performance on adverse event rates and hospital performance on 30-day mortality and unplanned readmission rates for Medicare fee-for-service patients hospitalized for acute myocardial infarction (AMI).
METHODS AND RESULTS: Using 2009-2013 medical record-abstracted patient safety data from the Agency for Healthcare Research and Quality's Medicare Patient Safety Monitoring System and hospital mortality and readmission data from the Centers for Medicare & Medicaid Services, we fitted a mixed-effects model, adjusting for hospital characteristics, to evaluate whether hospital performance on patient safety, as measured by the hospital-specific risk-standardized occurrence rate of 21 common adverse event measures for which patients were at risk, is associated with hospital-specific 30-day all-cause risk-standardized mortality and unplanned readmission rates for Medicare patients with AMI. The unit of analysis was the hospital. The final sample included 793 acute care hospitals that treated 30 or more Medicare patients hospitalized for AMI and had 40 or more adverse events for which patients were at risk. The occurrence rate of adverse events for which patients were at risk was 3.8%. A 1-percentage-point change in the risk-standardized occurrence rate of adverse events was associated with average changes in the same direction of 4.86 percentage points (95% CI, 0.79-8.94) and 3.44 percentage points (95% CI, 0.19-6.68) for the risk-standardized mortality and unplanned readmission rates, respectively.
CONCLUSIONS: For Medicare fee-for-service patients discharged with AMI, hospitals with poorer patient safety performance were also more likely to have poorer performance on 30-day all-cause mortality and on unplanned readmissions.
BACKGROUND: Guideline-based admission therapies for acute myocardial infarction (AMI) significantly improve 30-day survival, but little is known about their association with long-term outcomes.
OBJECTIVES: This study evaluated the association of 5 AMI admission therapies (aspirin, beta-blockers, acute reperfusion therapy, door-to-balloon [D2B] time ≤90 min, and time to fibrinolysis ≤30 min) with life expectancy and years of life saved after AMI.
METHODS: We analyzed data from the Cooperative Cardiovascular Project, a study of Medicare beneficiaries hospitalized for AMI, with 17 years of follow-up. Life expectancy and years of life saved after AMI were calculated using Cox proportional hazards regression with extrapolation using exponential models.
RESULTS: Survival for recipients and non-recipients of the 5 guideline-based therapies diverged early after admission and continued to diverge during 17-year follow-up. Receipt of aspirin, beta-blockers, and acute reperfusion therapy on admission was associated with longer life expectancy of 0.78 (standard error [SE]: 0.05), 0.55 (SE: 0.06), and 1.03 (SE: 0.12) years, respectively. Patients receiving primary percutaneous coronary intervention (PCI) within 90 min lived 1.08 (SE: 0.49) years longer than patients with D2B times >90 min, and door-to-needle (D2N) times ≤30 min were associated with 0.55 (SE: 0.12) more years of life. A dose-response relationship was observed between longer D2B and D2N times and shorter life expectancy after AMI.
CONCLUSIONS: Guideline-based admission therapy for AMI is associated with both early and late survival benefits, yielding meaningful gains in life expectancy and large numbers of years of life saved in elderly patients.
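The life-expectancy calculations above rest on integrating the survival curve over the observed 17 years and extrapolating beyond it with an exponential model. A toy version with invented constant hazards (the study's actual estimates came from Cox regression):

```python
import numpy as np

# Toy illustration of life expectancy as the area under the survival curve,
# split into the observed follow-up window plus an exponential tail beyond
# it. The annual hazards are invented; the study's models were Cox-based.
followup = 17.0                                   # years of observed follow-up

def life_expectancy(lam, horizon=followup):
    within = (1 - np.exp(-lam * horizon)) / lam   # area under exp(-lam*t) on [0, horizon]
    tail = np.exp(-lam * horizon) / lam           # exponential extrapolation past horizon
    return within + tail                          # for a constant hazard this equals 1/lam

# Hypothetical annual hazards with vs. without a guideline-based admission therapy
years_saved = life_expectancy(0.10) - life_expectancy(0.12)
```

Splitting the integral this way makes the role of the extrapolation explicit: the tail term is what the exponential model contributes beyond the data, and years of life saved is simply the difference in the two areas.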