New Publications

A Donation After Circulatory Death Program Has the Potential to Increase the Number of Donors After Brain Death*

Critical Care Medicine - Mon, 01/02/2016 - 08:00
Objectives: Donation after circulatory death has been responsible for 75% of the increase in the number of deceased organ donors in the United Kingdom. There has been concern that the success of the donation after circulatory death program has come at the expense of donation after brain death. The objective of this study was to ascertain the impact of the donation after circulatory death program on donation after brain death in the United Kingdom. Design: Retrospective cohort study. Setting: A national organ procurement organization. Patients: Patients referred and assessed as donation after circulatory death donors in the United Kingdom between October and December 2013. Interventions: None. Measurements and Main Results: A total of 257 patients were assessed for donation after circulatory death. Of these, 193 were eligible donors. Three patients were deemed medically unsuitable following surgical inspection, 56 patients did not proceed due to asystole, and 134 proceeded to donation. Four donors had insufficient data available for analysis, leaving 186 cases for analysis. Organ donation would not have been possible in 79 of the 130 actual donors had donation after circulatory death not been available. Thirty-six donation after circulatory death donors (28% of actual donors) were judged to have had the potential to progress to brain death if withdrawal of life-sustaining treatment had been delayed by up to a further 36 hours. A further 15 donation after circulatory death donors had brain death confirmed or had clinical indications of brain death, with clear mitigating circumstances in all but three cases. We determined that the maximum potential donation after brain death to donation after circulatory death substitution rate observed was 8%; however, due to mitigating circumstances, only three patients (2%) could have undergone brain death testing. Conclusions: The development of a national donation after circulatory death program has had minimal impact on the number of donation after brain death donors. The number of donation after brain death donors could increase with changes in end-of-life care practices that allow brain death to evolve and with greater availability of ancillary testing.
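For readers tracking the cohort flow, the counts in the abstract can be reconstructed as below. This is a minimal sketch of the reported arithmetic; the denominators behind the 28% and 8% figures are our inference (they are consistent with the abstract but not stated explicitly).

```python
# Sketch of the donor-flow arithmetic reported in the abstract.
# All counts come from the abstract; the percentage denominators are inferred.
eligible = 193
unsuitable = 3            # medically unsuitable on surgical inspection
asystole = 56             # did not proceed due to asystole
proceeded = eligible - unsuitable - asystole
assert proceeded == 134

insufficient_data = 4
analyzed = eligible - unsuitable - insufficient_data  # 186 cases analyzed
actual_donors = proceeded - insufficient_data         # 130 actual donors
assert analyzed == 186 and actual_donors == 130

potential_dbd = 36        # judged able to progress to brain death within 36 h
print(f"potential DBD among actual donors: {potential_dbd / actual_donors:.0%}")   # ~28%

confirmed_or_indicated = 15
print(f"max DBD-for-DCD substitution rate: {confirmed_or_indicated / analyzed:.0%}")  # ~8%
```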

Randomized, Double-Blind, Placebo-Controlled Trial of Thiamine as a Metabolic Resuscitator in Septic Shock: A Pilot Study

Critical Care Medicine - Mon, 01/02/2016 - 08:00
Objective: To determine whether intravenous thiamine would reduce lactate in patients with septic shock. Design: Randomized, double-blind, placebo-controlled trial. Setting: Two US hospitals. Patients: Adult patients with septic shock and elevated (> 3 mmol/L) lactate between 2010 and 2014. Interventions: Thiamine 200 mg or matching placebo twice daily for 7 days or until hospital discharge. Measurements and Main Results: The primary outcome was the lactate level 24 hours after the first study dose. Of 715 patients meeting the inclusion criteria, 88 were enrolled and received the study drug. There was no difference in the primary outcome of lactate level at 24 hours between the thiamine and placebo groups (median, 2.5 mmol/L [1.5, 3.4] vs 2.6 mmol/L [1.6, 5.1]; p = 0.40). There was no difference in secondary outcomes, including time to shock reversal, severity of illness, and mortality. Thirty-five percent of patients were thiamine deficient at baseline. In this predefined subgroup, those in the thiamine treatment group had statistically significantly lower lactate levels at 24 hours (median, 2.1 mmol/L [1.4, 2.5] vs 3.1 mmol/L [1.9, 8.3]; p = 0.03), and there was a statistically significant decrease in mortality over time among those receiving thiamine (p = 0.047). Conclusion: Administration of thiamine did not improve lactate levels or other outcomes in the overall group of patients with septic shock and elevated lactate. In those with baseline thiamine deficiency, the thiamine group had significantly lower lactate levels at 24 hours and a possible decrease in mortality over time.

Multicenter Comparison of Machine Learning Methods and Conventional Regression for Predicting Clinical Deterioration on the Wards

Critical Care Medicine - Mon, 01/02/2016 - 08:00
Objective: Machine learning methods are flexible prediction algorithms that may be more accurate than conventional regression. We compared the accuracy of different techniques for detecting clinical deterioration on the wards in a large, multicenter database. Design: Observational cohort study. Setting: Five hospitals, from November 2008 until January 2013. Patients: Hospitalized ward patients. Interventions: None. Measurements and Main Results: Demographic variables, laboratory values, and vital signs were used in a discrete-time survival analysis framework to predict the combined outcome of cardiac arrest, intensive care unit transfer, or death. Two logistic regression models (one using linear predictor terms and a second using restricted cubic splines) were compared with several machine learning methods. The models were derived in the first 60% of the data by date and validated in the next 40%. For model derivation, each event time window was matched to a non-event window. All models were compared with each other and with the Modified Early Warning Score (MEWS), a commonly cited early warning score, using the area under the receiver operating characteristic curve (AUC). A total of 269,999 patients were admitted, and 424 cardiac arrests, 13,188 intensive care unit transfers, and 2,840 deaths occurred during the study. In the validation dataset, the random forest model was the most accurate (AUC, 0.80 [95% CI, 0.80-0.80]). The logistic regression model with spline predictors was more accurate than the model with linear predictors (AUC, 0.77 vs 0.74; p < 0.01), and all models were more accurate than the MEWS (AUC, 0.70 [95% CI, 0.70-0.70]). Conclusions: In this multicenter study, several machine learning methods predicted clinical deterioration more accurately than logistic regression. Detection algorithms derived from these techniques may improve the identification of critically ill patients on the wards.
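As a rough illustration of the kind of comparison the study describes (not the authors' actual pipeline), the sketch below fits a linear-term logistic regression, a spline-based logistic regression, and a random forest on synthetic data with a temporal-style 60/40 split, then compares validation AUCs. scikit-learn's SplineTransformer stands in for restricted cubic splines, and the features and outcome are placeholders rather than real ward data.

```python
# Illustrative comparison of regression vs machine learning for
# ward-deterioration prediction. Data and features are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import SplineTransformer, StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 6))  # e.g., vitals and labs per time window
y = (X[:, 0] + X[:, 1] ** 2 + rng.normal(size=5000) > 2).astype(int)

# Temporal-style split: derive on the first 60%, validate on the last 40%.
cut = int(0.6 * len(X))
X_tr, X_va, y_tr, y_va = X[:cut], X[cut:], y[:cut], y[cut:]

models = {
    "logistic (linear terms)": make_pipeline(
        StandardScaler(), LogisticRegression(max_iter=1000)
    ),
    "logistic (spline terms)": make_pipeline(
        SplineTransformer(degree=3, n_knots=5), LogisticRegression(max_iter=1000)
    ),
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_va, model.predict_proba(X_va)[:, 1])
    print(f"{name}: AUC = {auc:.2f}")
```

Because the synthetic outcome depends nonlinearly on the features, the spline model and the random forest should outperform the linear model, mirroring the ordering the study reports.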

Value-Based Medicine: Dollars and Sense

Critical Care Medicine - Mon, 01/02/2016 - 08:00
No abstract available

Evaluating Tele-ICU Cost—An Imperfect Science*

Critical Care Medicine - Mon, 01/02/2016 - 08:00
No abstract available

Endotoxin: Back to the Future*

Critical Care Medicine - Mon, 01/02/2016 - 08:00
No abstract available

Thrombolytic-Enhanced Extracorporeal Cardiopulmonary Resuscitation After Prolonged Cardiac Arrest

Critical Care Medicine - Mon, 01/02/2016 - 08:00
Objective: To investigate the effects of combining extracorporeal cardiopulmonary resuscitation with thrombolytic therapy on the recovery of vital organ function after prolonged cardiac arrest. Design: Laboratory investigation. Setting: University laboratory. Subjects: Pigs. Interventions: Animals underwent 30 minutes of untreated ventricular fibrillation cardiac arrest followed by extracorporeal cardiopulmonary resuscitation for 6 hours. Animals were allocated to two experimental groups: the thrombolytic-enhanced extracorporeal cardiopulmonary resuscitation (t-ECPR) group, which received 1 million units of streptokinase, and the control extracorporeal cardiopulmonary resuscitation (c-ECPR) group, which did not. In both groups, the resuscitation protocol included the following physiologic targets: mean arterial pressure greater than 70 mm Hg, cerebral perfusion pressure greater than 50 mm Hg, PaO2 150 ± 50 torr (20 ± 7 kPa), PaCO2 40 ± 5 torr (5 ± 1 kPa), and core temperature 33°C ± 1°C. Defibrillation was attempted after 30 minutes of extracorporeal cardiopulmonary resuscitation. Measurements and Main Results: A cardiac resuscitability score was assessed on the basis of success of defibrillation, return of spontaneous heart beat, weanability from extracorporeal cardiopulmonary resuscitation, and left ventricular systolic function after weaning. The addition of a thrombolytic to extracorporeal cardiopulmonary resuscitation significantly improved cardiac resuscitability (3.7 ± 1.6 in t-ECPR vs 1.0 ± 1.5 in c-ECPR). Arterial lactate clearance was higher in t-ECPR than in c-ECPR (40% ± 15% vs 18% ± 21%). At the end of the experiment, intracranial pressure was significantly higher in c-ECPR than in t-ECPR. Recovery of brain electrical activity, as assessed by quantitative analysis of the electroencephalogram signal, and ischemic neuronal injury on histopathologic examination did not differ between groups. Animals in the t-ECPR group did not have increased bleeding complications, including intracerebral hemorrhage. Conclusions: In a porcine model of prolonged cardiac arrest, t-ECPR improved cardiac resuscitability and reduced brain edema without increasing bleeding complications. However, early electroencephalogram recovery and ischemic neuronal injury were not improved.

Posttraumatic Propofol Neurotoxicity Is Mediated via the Pro–Brain-Derived Neurotrophic Factor-p75 Neurotrophin Receptor Pathway in Adult Mice*

Critical Care Medicine - Mon, 01/02/2016 - 08:00
Objectives: The gamma-aminobutyric acid modulator propofol induces neuronal cell death in healthy immature brains by unbalancing neurotrophin homeostasis via p75 neurotrophin receptor signaling. In adulthood, the p75 neurotrophin receptor is down-regulated and propofol loses its neurotoxic effect. However, acute brain lesions, such as traumatic brain injury, reactivate developmental-like programs and increase p75 neurotrophin receptor expression, probably to foster reparative processes, which in turn could render the brain sensitive to propofol-mediated neurotoxicity. This study investigates the influence of a delayed single propofol bolus given at the peak of p75 neurotrophin receptor expression after experimental traumatic brain injury in adult mice. Design: Randomized laboratory animal study. Setting: University research laboratory. Subjects: Adult C57BL/6N and nerve growth factor receptor–deficient mice. Interventions: Sedation by delayed IV propofol bolus after controlled cortical impact injury. Measurements and Main Results: Propofol sedation at 24 hours after traumatic brain injury increased lesion volume, enhanced calpain-induced αII-spectrin cleavage, and increased cell death in perilesional tissue. Motor function at 30 days postinjury, determined by CatWalk (Noldus Information Technology, Wageningen, The Netherlands) gait analysis, was significantly impaired in propofol-sedated animals. Propofol increased the pro–brain-derived neurotrophic factor/brain-derived neurotrophic factor ratio, which aggravates p75 neurotrophin receptor–mediated cell death. Propofol toxicity was abolished both by pharmacologic inhibition of the cell death domain of the p75 neurotrophin receptor (TAT-Pep5) and in mice lacking the extracellular neurotrophin binding site of the p75 neurotrophin receptor. Conclusions: This study provides the first evidence that propofol sedation after acute brain lesions can have a deleterious impact and implicates the pro–brain-derived neurotrophic factor-p75 neurotrophin receptor pathway. This observation is important because propofol and other compounds with GABA receptor activity are frequently used in patients with acute brain pathologies to facilitate sedation or surgical and interventional procedures.

How to Solve the Underestimated Problem of Overestimated Sodium Results in the Hypoproteinemic Patient

Critical Care Medicine - Mon, 01/02/2016 - 08:00
Objectives: The availability of a fast and reliable sodium result is a prerequisite for the appropriate correction of a patient’s fluid balance. Blood gas analyzers and core laboratory chemistry analyzers measure electrolytes via different ion-selective electrode methodologies, that is, direct and indirect ion-selective electrodes, respectively. Sodium concentrations obtained via the two methods are not always concordant. A comparison of results between both methods was performed, and the impact of the total protein concentration on the sodium concentration was investigated. Furthermore, we sought to develop an adjustment equation to correct between both ion-selective electrode methods. Design: A model was developed using a pilot study cohort (n = 290) and a retrospective patient cohort (n = 690) and validated using a prospective patient cohort (4,006 samples). Setting: ICU and emergency department at Ghent University Hospital. Patients: Patient selection was based on the concurrent availability of routine blood gas Na+direct and core laboratory Na+indirect results. Interventions: In the pilot study, left-over blood gas syringes were collected for further laboratory analysis. Measurements and Main Results: There was a significant negative linear correlation between the difference between Na+indirect and Na+direct and the total protein concentration (Pearson r = –0.69; p < 0.0001). In our setting, for each change of 10 g/L in total protein concentration, a deviation of ~1.3 mmol/L is observed in the Na+indirect result. The validity of our adjustment equation, protein-corrected Na+indirect = Na+indirect – 10.53 + (0.1316 × total protein), was demonstrated on a prospective patient cohort. Conclusions: As Na+direct measurements on a blood gas analyzer are not influenced by the total protein concentration of the sample, they should be preferred in patients with abnormal protein concentrations. However, as blood gas analyzers are not available on all clinical wards, the implementation of a protein-corrected sodium result may provide an acceptable alternative.
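The published adjustment equation is straightforward to implement; below is a minimal sketch (the function name and the worked example are ours, not from the study). Note that the slope reproduces the reported ~1.3 mmol/L deviation per 10 g/L of total protein (0.1316 × 10 ≈ 1.3), and the correction vanishes at a total protein of roughly 80 g/L (10.53 / 0.1316 ≈ 80).

```python
def protein_corrected_sodium(na_indirect_mmol_l: float, total_protein_g_l: float) -> float:
    """Apply the abstract's adjustment equation to an indirect-ISE sodium result.

    protein-corrected Na+indirect = Na+indirect - 10.53 + (0.1316 x total protein)
    Sodium in mmol/L, total protein in g/L. At ~80 g/L the correction is ~0.
    """
    return na_indirect_mmol_l - 10.53 + 0.1316 * total_protein_g_l

# Hypothetical example: a hypoproteinemic patient (total protein 45 g/L) with a
# core-lab (indirect) sodium of 138 mmol/L. The corrected value is ~133.4 mmol/L,
# i.e., the indirect method overestimated sodium by ~4.6 mmol/L.
print(round(protein_corrected_sodium(138.0, 45.0), 1))
```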