Similar documents
20 similar documents found (search time: 55 ms)
1.
Background: Alpha-fetoprotein-producing gastric cancer (AFPGC) poses a therapeutic challenge worldwide because of its poor prognosis. This study aimed to evaluate the efficacy and safety of the antiangiogenic drug apatinib in advanced AFPGC in a real-world setting.
Methods: From September 2015 to December 2017, twenty-one patients with AFPGC identified in the clinical trial AHEAD-G202, an open-label, prospective, multicenter, non-interventional study of apatinib for advanced metastatic gastric cancer, were enrolled in this analysis. Patients received oral apatinib as monotherapy or combination therapy. A treatment cycle was defined as 28 days. The primary outcomes were progression-free survival (PFS) and overall survival (OS); the secondary outcomes included safety, objective response rate (ORR), and disease control rate (DCR).
Results: Twenty patients were evaluable for the efficacy analysis. The ORR of apatinib was 10%, and the DCR was 70%. The median PFS was 3.5 months [95% confidence interval (CI): 2.34–4.66]. The median OS was 4.5 months (95% CI: 3.49–5.51). Median OS of AFPGC patients without carcinoembryonic antigen (CEA) elevation reached 30.8 months. On multivariate analysis, CEA elevation was a potential independent predictive factor for OS (P = 0.030) and PFS (P = 0.047). The most common grade 3 to 4 adverse events (AEs) were hypertension (4.8%), hand-foot syndrome (4.8%), anorexia (4.8%), and vomiting and nausea (4.8%).
Conclusion: Apatinib showed promising efficacy and an acceptable safety profile in patients with advanced AFPGC. Antiangiogenic therapy may be a good strategy for the treatment of AFPGC, a rare subtype of gastric cancer.
Trial registration: AHEAD-G202 (NCT02668380).

2.

Correction to: EMBO Reports (2019) 20: e47074. DOI 10.15252/embr.201847074 | Published online 6 May 2019
The authors noticed that the control and disease labels had been inverted in their data analysis, resulting in publication of incorrect data in Figure 1C. The corrected figure is displayed below. This change affects the conclusions as detailed below. The authors apologize for this error and any confusion it may have caused.
In the legend of Figure 1C, change from: "Differential gene expression analysis of pediatric ileal CD patient samples (n = 180) shows increased (> 4-fold) IMP1 expression as compared to non-inflammatory bowel disease (IBD) pediatric samples (n = 43)".
[Figure 1C shown in corrected and original versions]
To: "Differential gene expression analysis of pediatric ileal CD patient samples (n = 180) shows decreased (> 4-fold) IMP1 expression as compared to non-inflammatory bowel disease (IBD) pediatric samples (n = 43)".
In the abstract, change from: "Here, we report increased IMP1 expression in patients with Crohn's disease and ulcerative colitis".
To: "Here, we report increased IMP1 expression in adult patients with Crohn's disease and ulcerative colitis".
In the results, change from: "Consistent with these findings, analysis of published the Pediatric RISK Stratification Study (RISK) cohort of RNA-sequencing data 38 from pediatric patients with Crohn's disease (CD) patients revealed that IMP1 is upregulated significantly compared to control patients and that this effect is specific to IMP1 (i.e., other distinct isoforms, IMP2 and IMP3, are not changed; Fig 1C)".
To: "Contrary to our findings in colon tissue from adults, analysis of published RNA-sequencing data from the Pediatric RISK Stratification Study (RISK) cohort of ileal tissue from children with Crohn's disease (CD) 38 revealed that IMP1 is downregulated significantly compared to control patients in the RISK cohort and that this effect is specific to IMP1 (i.e., other distinct isoforms, IMP2 and IMP3, are not changed; Fig 1C)".
In the discussion, change from: "Indeed, we report that IMP1 is upregulated in patients with Crohn's disease and ulcerative colitis and that mice with Imp1 loss exhibit enhanced repair following DSS-mediated damage".
To: "Indeed, we report that IMP1 is upregulated in adult patients with Crohn's disease and ulcerative colitis and that mice with Imp1 loss exhibit enhanced repair following DSS-mediated damage".

3.
Objective: To assess the effect of pioglitazone on renal outcome, including urinary albumin excretion and estimated glomerular filtration rate (eGFR), in diabetic patients.
Design: A prospective, randomized, open-label, controlled study.
Setting: Taipei Veterans General Hospital.
Patients: Sixty type 2 diabetic patients treated with sulfonylureas and metformin, whose glycated hemoglobin (HbA1c) levels were between 7% and 10% and whose eGFR was between 45 and 125 mL/min/1.73 m2.
Intervention: The patients were randomized to receive acarbose or pioglitazone and followed up for 6 months: 30 patients were randomly assigned to receive acarbose, and 30 to receive pioglitazone.
Measurements: The primary study endpoint was the change in the urinary albumin-to-creatinine ratio (UACR). The secondary endpoints were the change in eGFR and other parameters.
Results: After 6 months of treatment, the mean changes in UACR were −18 ± 104 and 12 ± 85 (p = 0.25 between groups) for the acarbose and pioglitazone groups, respectively. The mean changes in eGFR were 0 ± 14 and −7 ± 16 mL/min/1.73 m2 (p = 0.09 between groups), respectively. The reductions in HbA1c were similar in both groups. Fasting blood glucose was lower in the pioglitazone group than in the acarbose group. Significant body weight gain was observed in the pioglitazone group compared with the acarbose group (1.3 ± 2.8 vs. −0.6 ± 1.5 kg, p = 0.002).
Conclusion: In type 2 diabetic patients treated with sulfonylureas and metformin who had HbA1c levels between 7% and 10%, the addition of acarbose or pioglitazone for 6 months provided similar glycemic control and similar eGFR and UACR changes. Patients in the pioglitazone group exhibited significant body weight gain.

Trial Registration

ClinicalTrials.gov NCT01175486

4.
Background: The dose of protamine required following cardiopulmonary bypass (CPB) is often determined by the dose of heparin required pre-CPB, expressed as a fixed ratio. Dosing based on mathematical models of heparin clearance is postulated to improve protamine dosing precision and coagulation. We hypothesised that protamine dosing based on a 2-compartment model would improve thromboelastography (TEG) parameters and reduce the dose of protamine administered, relative to a fixed ratio.
Methods and findings: We undertook a 2-stage, adaptive randomised controlled trial, allocating 228 participants to receive protamine dosed according to a mathematical model of heparin clearance or a fixed ratio of 1 mg of protamine for every 100 IU of heparin required to establish anticoagulation pre-CPB. A planned, blinded interim analysis was undertaken after the recruitment of 50% of the study cohort. Following this, the randomisation ratio was adapted from 1:1 to 1:1.33 to increase recruitment to the superior arm while maintaining study power. At the conclusion of trial recruitment, we had randomised 121 patients to the intervention arm and 107 patients to the control arm. The primary endpoint was kaolin TEG r-time measured 3 minutes after protamine administration at the end of CPB. Secondary endpoints included the ratio of kaolin TEG r-time pre-CPB to the same metric following protamine administration, requirement for allogeneic red cell transfusion, intercostal catheter drainage at 4 hours postoperatively, and the requirement for reoperation due to bleeding. The trial was listed on a clinical trial registry (ClinicalTrials.gov identifier: NCT03532594).
Participants were recruited between April 2018 and August 2019. Those in the intervention/model group had a shorter mean kaolin r-time (6.58 [SD 2.50] vs. 8.08 [SD 3.98] minutes; p = 0.0016) post-CPB. The post-protamine thromboelastogram of the model group was closer to pre-CPB parameters (median pre-CPB to post-protamine kaolin r-time ratio 0.96 [IQR 0.78–1.14] vs. 0.75 [IQR 0.57–0.99]; p < 0.001). We found no evidence of a difference in median mediastinal/pleural drainage at 4 hours postoperatively (140 [IQR 75–245] vs. 135 [IQR 94–222] mL; p = 0.85) or in the requirement (as a binary outcome) for packed red blood cell transfusion at 24 hours postoperatively (19 [15.8%] vs. 14 [13.1%]; p = 0.69). Those in the model group had a lower median protamine dose (180 [IQR 160–210] vs. 280 [IQR 250–300] mg; p < 0.001). Important limitations of this study include an unblinded design and lack of generalisability to certain populations deliberately excluded from the study (specifically children, patients with a total body weight >120 kg, and patients requiring therapeutic hypothermia to <28°C).
Conclusions: Using a mathematical model to guide protamine dosing in patients following CPB improved TEG r-time and reduced the dose administered relative to a fixed ratio. No differences were detected in postoperative mediastinal/pleural drainage or red blood cell transfusion requirement in our cohort of low-risk patients.
Trial registration: ClinicalTrials.gov unique identifier NCT03532594.

Lachlan Miles and co-workers report on a randomized controlled trial seeking to optimise protamine dosing after cardiopulmonary bypass.
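The contrast between the two dosing strategies can be sketched in code. This is a minimal illustration only: the rate constants and the 30,000 IU heparin dose below are invented, not the model fitted in the trial. The fixed-ratio arm doses protamine against the total heparin given, while the model arm doses against the heparin predicted to remain in the central (plasma) compartment.

```python
def heparin_two_compartment(dose_iu, k10, k12, k21, t_end_h, dt=0.001):
    """Forward-Euler integration of a generic two-compartment model.

    a1 = heparin amount in the central (plasma) compartment,
    a2 = amount in the peripheral compartment; k10 is the elimination
    rate, k12/k21 are inter-compartment transfer rates (per hour).
    Returns the central amount remaining after t_end_h hours.
    """
    a1, a2 = float(dose_iu), 0.0
    for _ in range(int(t_end_h / dt)):
        da1 = (-(k10 + k12) * a1 + k21 * a2) * dt
        da2 = (k12 * a1 - k21 * a2) * dt
        a1, a2 = a1 + da1, a2 + da2
    return a1

# Hypothetical rate constants and dose, for illustration only:
remaining_iu = heparin_two_compartment(30000, k10=0.7, k12=0.3, k21=0.2, t_end_h=2.0)
model_dose_mg = remaining_iu / 100   # 1 mg protamine per 100 IU still circulating
fixed_dose_mg = 30000 / 100          # fixed-ratio arm doses against the full 30,000 IU
```

Because circulating heparin decays during bypass, the model-based dose is necessarily smaller than the fixed-ratio dose, which is consistent with the lower median protamine dose reported in the intervention arm.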

5.
Introduction: Patients with active rheumatoid arthritis (RA) despite anti-tumor necrosis factor (anti-TNF) agent treatment can switch to either a subsequent anti-TNF agent or a biologic with an alternative mechanism of action, such as rituximab; however, there are limited data available to help physicians decide between these 2 strategies. The objective of this analysis was to examine the effectiveness and safety of rituximab versus a subsequent anti-TNF agent in anti-TNF-experienced patients with RA using clinical practice data from the Corrona registry.
Methods: Rituximab-naive patients from the Corrona registry with prior exposure to ≥1 anti-TNF agent who initiated rituximab or anti-TNF agents (2/28/2006–10/31/2012) were included. Two cohorts were analyzed: the trimmed population (excluding patients who fell outside the propensity score distribution overlap) and the stratified-matched population (stratified by 1 vs ≥2 anti-TNF agents, then matched based on propensity score). The primary effectiveness outcome was achievement of low disease activity (LDA)/remission (Clinical Disease Activity Index ≤10) at 1 year. Secondary outcomes included achievement of modified American College of Rheumatology (mACR) 20/50/70 responses and meaningful improvement (≥0.25) in modified Health Assessment Questionnaire (mHAQ) score at 1 year. New cardiovascular, infectious and cancer events were reported.
Results: Estimates for LDA/remission, mACR response and mHAQ improvement were consistently better for rituximab than for anti-TNF agent users in adjusted analyses. The odds ratio for likelihood of LDA/remission in rituximab versus anti-TNF patients was 1.35 (95% CI, 0.95–1.91) in the trimmed population and 1.54 (95% CI, 1.01–2.35) in the stratified-matched population. Rituximab patients were significantly more likely than anti-TNF patients to achieve mACR20/50 and mHAQ improvement in the trimmed population, and mACR20 and mHAQ improvement in the stratified-matched population. The rate of new adverse events per 100 patient-years was similar between groups.
Conclusions: In anti-TNF-experienced patients with RA, rituximab was associated with an increased likelihood of achieving LDA/remission, mACR response and physical function improvement, with a comparable safety profile, versus subsequent anti-TNF agents.
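The stratified-matched cohort above relies on propensity-score matching. As a minimal sketch of the idea, greedy 1:1 nearest-neighbour matching without replacement can be written as below; the registry analysis will have used a more sophisticated procedure, and the patient IDs and scores here are hypothetical.

```python
def nearest_neighbor_match(treated, control):
    """Greedy 1:1 nearest-neighbour matching on propensity score.

    treated, control: lists of (patient_id, propensity_score).
    Returns (treated_id, control_id) pairs; each control is used
    at most once (matching without replacement).
    """
    pairs, available = [], dict(control)
    for pid, ps in treated:
        if not available:
            break  # no controls left to match
        best = min(available, key=lambda cid: abs(available[cid] - ps))
        pairs.append((pid, best))
        del available[best]
    return pairs

# Hypothetical patients for illustration:
treated = [("t1", 0.62), ("t2", 0.35)]
control = [("c1", 0.60), ("c2", 0.30), ("c3", 0.90)]
pairs = nearest_neighbor_match(treated, control)  # [("t1", "c1"), ("t2", "c2")]
```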

Trial registration

ClinicalTrials.gov NCT01402661. Registered 25 July 2011.

Electronic supplementary material

The online version of this article (doi:10.1186/s13075-015-0776-1) contains supplementary material, which is available to authorized users.

6.
Introduction: New-onset left bundle branch block (LBBB) following transcatheter or surgical aortic valve replacement (LBBBAVI) implies a proximal pathogenesis of LBBB. This study compares electrocardiographic characteristics and concordance with LBBB definitions between LBBBAVI and non-procedure-induced LBBB controls (LBBBcontrol).
Methods: All LBBBAVI patients at Ghent University Hospital between 2013 and 2019 were enrolled in the study. LBBBAVI patients were matched for age, sex, ischaemic heart disease and ejection fraction to LBBBcontrol patients in a 1:2 ratio. For inclusion, a non-strict LBBB definition was used (QRS duration ≥ 120 ms, QS or rS in V1, absence of Q waves in V5-V6). Electrocardiograms were digitally analysed and classified according to three LBBB definitions: European Society of Cardiology (ESC), Strauss and American Heart Association (AHA).
Results: A total of 177 patients (59 LBBBAVI and 118 LBBBcontrol) were enrolled in the study. LBBBAVI patients had more lateral QRS notching/slurring (100% vs 85%, p = 0.001), included a higher percentage with a QRS duration ≥ 130 ms (98% vs 86%, p = 0.007) and had a less leftward oriented QRS axis (−15° vs −30°, p = 0.013) compared to the LBBBcontrol group. ESC and Strauss criteria were fulfilled in 100% and 95% of LBBBAVI patients, respectively, but only 18% met the AHA criteria. In LBBBcontrol patients, concordance with LBBB definitions was lower than in the LBBBAVI group: ESC 85% (p = 0.001), Strauss 68% (p < 0.001) and AHA 7% (p = 0.035). No differences in electrocardiographic characterisation or concordance with LBBB definitions were observed between LBBBAVI and LBBBcontrol patients with lateral QRS notching/slurring.
Conclusion: Non-uniformity exists among current LBBB definitions concerning the detection of proximal LBBB. LBBBAVI may provide a framework for more consensus on defining proximal LBBB.
Supplementary Information: The online version of this article (10.1007/s12471-021-01565-8) contains supplementary material, which is available to authorized users.
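The non-strict inclusion definition used for enrolment is simple enough to express directly. A small sketch, where the morphology string and the boolean Q-wave flag are my own framing of the stated criteria:

```python
def meets_nonstrict_lbbb(qrs_ms, v1_pattern, q_waves_v5_v6):
    """Non-strict LBBB inclusion criteria as stated in the methods:
    QRS duration >= 120 ms, a QS or rS pattern in V1, and the
    absence of Q waves in leads V5-V6."""
    return qrs_ms >= 120 and v1_pattern in ("QS", "rS") and not q_waves_v5_v6

included = meets_nonstrict_lbbb(130, "rS", False)   # True: meets all three criteria
excluded = meets_nonstrict_lbbb(110, "QS", False)   # False: QRS too narrow
```

The ESC, Strauss and AHA definitions the study compares layer further requirements (e.g. notching/slurring, longer QRS cut-offs) on top of a skeleton like this one.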

7.
Introduction: Tofacitinib is an oral Janus kinase inhibitor for the treatment of rheumatoid arthritis (RA). During the clinical development programme, increases in mean serum creatinine (SCr) of approximately 0.07 mg/dL and 0.08 mg/dL were observed, which plateaued early. This study assessed changes in measured glomerular filtration rate (mGFR) with tofacitinib relative to placebo in patients with active RA.
Methods: This was a randomised, placebo-controlled, Phase 1 study (NCT01484561). Patients were aged ≥18 years with active RA. Patients were randomised 2:1 to oral tofacitinib 10 mg twice daily (BID) in Period 1 then placebo BID in Period 2 (tofacitinib → placebo), or oral placebo BID in both periods (placebo → placebo). Change in mGFR was evaluated by iohexol serum clearance at four time points (run-in, pre-dose in Period 1, Period 1 end, and Period 2 end). The primary endpoint was the change in mGFR from baseline to Period 1 end. Secondary endpoints included: change in mGFR at other time points; change in estimated GFR (eGFR; Cockcroft–Gault equation) and SCr; efficacy; and safety.
Results: 148 patients were randomised to tofacitinib → placebo (N = 97) or placebo → placebo (N = 51). Baseline characteristics were similar between groups. A reduction of 8% (90% confidence interval [CI]: 2%, 14%) from baseline in adjusted geometric mean mGFR was observed during tofacitinib treatment in Period 1 vs placebo. During Period 2, mean mGFR returned towards baseline during placebo treatment, and there was no difference between the two treatment groups at the end of the study: the ratio (tofacitinib → placebo/placebo → placebo) of adjusted geometric mean fold change of mGFR was 1.04 (90% CI: 0.97, 1.11). Post-hoc analyses, focussed on mGFR variability in placebo → placebo patients, were consistent with this conclusion. At study end, similar results were observed for eGFR and SCr. Clinical efficacy and safety were consistent with prior studies.
Conclusion: Increases in mean SCr and decreases in eGFR in tofacitinib-treated patients with RA may occur in parallel with decreases in mean mGFR; mGFR returned towards baseline after tofacitinib discontinuation, with no significant difference vs placebo, even after post-hoc analyses. Safety monitoring will continue in ongoing and future clinical studies and routine pharmacovigilance.
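The eGFR secondary endpoint used the Cockcroft–Gault equation, which estimates creatinine clearance from age, weight, sex and SCr. A sketch follows; the example values are invented, and the study's exact implementation (e.g. any weight adjustment) is not described in the abstract.

```python
def egfr_cockcroft_gault(age_years, weight_kg, scr_mg_dl, female=False):
    """Estimated creatinine clearance (mL/min) by Cockcroft-Gault:
    ((140 - age) * weight) / (72 * SCr), multiplied by 0.85 for women."""
    crcl = (140 - age_years) * weight_kg / (72 * scr_mg_dl)
    return 0.85 * crcl if female else crcl

# Invented example: a ~0.07 mg/dL rise in SCr lowers the estimate slightly.
baseline = egfr_cockcroft_gault(55, 70, 0.90)  # ≈ 91.8 mL/min
on_drug = egfr_cockcroft_gault(55, 70, 0.97)   # ≈ 85.2 mL/min
```

Because SCr sits in the denominator, the small SCr increases reported in the development programme translate directly into the eGFR decreases the study set out to compare against measured GFR.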

Trial registration

ClinicalTrials.gov NCT01484561. Registered 30 November 2011.

Electronic supplementary material

The online version of this article (doi:10.1186/s13075-015-0612-7) contains supplementary material, which is available to authorized users.

8.
Background: Cryoballoon pulmonary vein isolation (PVI) is a common therapy for atrial fibrillation (AF). While a moderately increased sinus rhythm heart rate (HR) after PVI has been observed, inappropriate sinus tachycardia (IST) is a rare phenomenon. We aimed to investigate the prevalence and natural history of an abnormal sinus HR response after cryoballoon PVI.
Methods: We included 169 of 646 (26.2%) patients with AF undergoing PVI who had Holter recordings available before and 3, 6 and 12 months after the procedure. Patients with AF on Holter monitoring were excluded. A mean HR increase ≥ 20 bpm or an IST-like pattern (mean HR > 90 bpm, or > 80 bpm when beta-blocking agents were used) following PVI was categorised as an abnormal sinus HR response.
Results: Following PVI, mean HR ± standard deviation increased in the entire group from 63.5 ± 8.4 to 69.1 ± 9.9 bpm at 3 months (p < 0.001) and to 71.9 ± 9.4 bpm at 6 months (p < 0.001). At 12 months, mean HR was 71.2 ± 10.1 bpm (p < 0.001). Only 7 of 169 patients (4.1%) met the criteria for an abnormal sinus HR response: mean HR was 61.9 ± 10.6 bpm (pre-ablation), 84.6 ± 9.8 bpm (3 months), 80.1 ± 6.5 bpm (6 months) and 76.3 ± 10.1 bpm (12 months). Even at 12 months, mean HR in this group remained significantly different from the pre-ablation value (p = 0.033). However, in patients meeting the IST-like pattern criteria, mean HR at 12 months was no longer significantly different from the pre-ablation value.
Conclusion: Few patients had an abnormal sinus HR response after PVI. Peak HR was observed 3 months after PVI, but HR was still significantly increased 12 months post-ablation compared with pre-ablation. An IST-like pattern was rarely observed; in these patients, HR decreased to pre-ablation values within a year.
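The categorisation rule stated in the methods can be written down directly. A sketch using the thresholds above; the function name and argument framing are mine:

```python
def abnormal_hr_response(pre_hr, post_hr, on_beta_blocker):
    """Classify the sinus heart-rate response after PVI per the study criteria.

    Returns (abnormal, ist_like): abnormal if mean HR rose >= 20 bpm or an
    IST-like pattern is present; IST-like means mean HR > 90 bpm
    (> 80 bpm when beta-blocking agents are used).
    """
    threshold = 80 if on_beta_blocker else 90
    ist_like = post_hr > threshold
    abnormal = (post_hr - pre_hr) >= 20 or ist_like
    return abnormal, ist_like

# Illustrative values echoing the group means above:
abnormal_hr_response(63, 85, False)  # rise of 22 bpm -> abnormal, not IST-like
abnormal_hr_response(63, 72, False)  # rise of 9 bpm -> normal response
```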

9.
Introduction: This randomized, double-blind, phase II study evaluated the pharmacodynamics, safety and tolerability of ISIS 329993 (ISIS-CRPRx), an antisense oligonucleotide, in patients with active rheumatoid arthritis (RA).
Methods: Patients with active RA of at least six months' duration were randomized into three cohorts to receive ISIS-CRPRx (100 mg, 200 mg or 400 mg) or placebo (3 active : 1 placebo within each cohort) via subcutaneous (SC) injection on Days 1, 3, 5 and 8 and then once weekly for the next 11 weeks. The effect of study treatment on high-sensitivity C-reactive protein (hs-CRP) level was evaluated. An exploratory analysis of disease activity was assessed via the American College of Rheumatology 20% improvement criteria (ACR20). Safety was evaluated via adverse events and laboratory measures.
Results: Fifty-one patients received one of the following treatments: ISIS-CRPRx 100 mg, n = 12; 200 mg, n = 13; 400 mg, n = 14; placebo, n = 12. In the ISIS-CRPRx treatment groups there were dose-dependent reductions in hs-CRP. At Day 36 the mean percent change from baseline was: placebo, −14.4%; ISIS-CRPRx 100 mg, −19.5%; 200 mg, −56.6%; and 400 mg, −76.7% (P = 0.0015, placebo compared to 400 mg). There were no differences between the treatment groups and placebo in the ACR20 at Day 36 or Day 92. There were no serious infections and no elevations in liver function tests, lipids, creatinine or other laboratory abnormalities related to ISIS-CRPRx.
Conclusions: In this study, ISIS-CRPRx selectively reduced hs-CRP in a dose-dependent manner and was well tolerated in patients with RA. Its utility as a therapy in RA remains unclear.

Trial registration

ClinicalTrials.gov NCT01414101. Registered 21 July 2011.

Electronic supplementary material

The online version of this article (doi:10.1186/s13075-015-0578-5) contains supplementary material, which is available to authorized users.

10.
Introduction: The coronavirus disease 2019 (COVID-19) pandemic has put tremendous pressure on healthcare systems. Most transcatheter aortic valve implantation (TAVI) centres have adopted different triage systems and procedural strategies to serve the highest-risk patients first and to minimise the burden on hospital logistics and personnel. We therefore assessed the impact of the COVID-19 pandemic on patient selection, type of anaesthesia and outcomes after TAVI.
Methods: We used data from the Netherlands Heart Registration to examine all patients who underwent TAVI between March 2020 and July 2020 (COVID cohort) and between March 2019 and July 2019 (pre-COVID cohort). We compared patient characteristics, procedural characteristics and clinical outcomes.
Results: We examined 2131 patients who underwent TAVI (1020 patients in the COVID cohort, 1111 patients in the pre-COVID cohort). EuroSCORE II was comparable between cohorts (COVID 4.5 ± 4.0 vs pre-COVID 4.6 ± 4.2, p = 0.356). The number of TAVI procedures under general anaesthesia was lower in the COVID cohort (35.2% vs 46.5%, p < 0.001). Incidences of stroke (COVID 2.7% vs pre-COVID 1.7%, p = 0.134), major vascular complications (2.3% vs 3.4%, p = 0.170) and permanent pacemaker implantation (10.0% vs 9.4%, p = 0.634) did not differ between cohorts. Thirty-day and 150-day mortality were comparable (2.8% vs 2.2%, p = 0.359 and 5.2% vs 5.2%, p = 0.993, respectively).
Conclusions: During the COVID-19 pandemic, patient characteristics and outcomes after TAVI did not differ from those before the pandemic. This suggests that TAVI procedures can be performed safely during the COVID-19 pandemic, without an increased risk of complications or mortality.
Supplementary Information: The online version of this article (10.1007/s12471-022-01704-9) contains supplementary material, which is available to authorized users.

11.
Background: We hypothesized that obstructive sleep apnea (OSA) can predispose individuals to lower airway infections and community-acquired pneumonia (CAP) due to upper airway microaspiration. This study evaluated the association between OSA and CAP.
Methods: We performed a case-control study that included 82 patients with CAP and 41 patients with other infections (control group). The controls were matched according to age, sex and body mass index (BMI). Respiratory polygraphy (RP) was performed upon admission for patients in both groups. The severity of pneumonia was assessed according to the Pneumonia Severity Index (PSI). The associations between CAP and the Epworth Sleepiness Scale (ESS), OSA, OSA severity and other sleep-related variables were evaluated using logistic regression models. The associations of OSA and OSA severity with CAP severity were evaluated with linear regression models and non-parametric tests.
Findings: No significant differences were found between CAP and control patients regarding anthropometric variables, toxic habits and risk factors for CAP. Patients with OSA, defined as individuals with an Apnea-Hypopnea Index (AHI) ≥ 10, showed an increased risk of CAP (OR = 2.86, 95% CI 1.29–6.44, p = 0.01). Patients with severe OSA (AHI ≥ 30) also had a higher risk of CAP (OR = 3.18, 95% CI 1.11–11.56, p = 0.047). In addition, OSA severity, defined according to the AHI quartile, was significantly associated with CAP (p = 0.007). Furthermore, OSA was significantly associated with CAP severity (p = 0.0002), and OSA severity was also associated with CAP severity (p = 0.0006).
Conclusions: OSA and OSA severity are associated with CAP when compared with patients admitted to the hospital for non-respiratory infections. In addition, OSA and OSA severity are associated with CAP severity. These results support a potential role for OSA in the pathogenesis of CAP and could have clinical implications. This link between OSA and infection risk should be explored to investigate the relationships among gastroesophageal reflux, silent aspiration, laryngeal sensory dysfunction and CAP.
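The odds ratios above come from logistic regression models. As a refresher on the underlying quantity, an unadjusted odds ratio and a Woolf-type 95% CI can be computed from a 2×2 table; the counts below are hypothetical, since the abstract reports only the fitted estimates.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf (log-scale) 95% CI from a 2x2 table:
    a = exposed cases, b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for illustration only:
or_, lo, hi = odds_ratio_ci(a=50, b=15, c=32, d=26)
```

A fitted logistic model reports the same quantity as exp(coefficient), adjusted for the covariates in the model.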

Trial Registration

ClinicalTrials.gov NCT01071421

12.
Background: As coronavirus disease 2019 (COVID-19) reached pandemic status, authors from the most severely affected countries reported reduced rates of hospital admissions for patients with acute coronary syndrome (ACS).
Aim: The aim of the present study was to investigate the influence of the COVID-19 outbreak on hospital admissions and outcomes in ACS patients in a single high-volume centre in southeastern Europe.
Methods: This retrospective observational study investigated the number of hospital admissions for ACS, clinical findings at admission, length of hospitalisation, major complications and in-hospital mortality during the COVID-19 outbreak, and compared these data with the same parameters during an equivalent time frame in 2019. For the ST-elevation myocardial infarction (STEMI) subgroup of patients, changes in ischaemic times were analysed as well.
Results: There was a significant reduction of 44.3% in the number of patients admitted for ACS during the COVID-19 outbreak compared with the same period in 2019 (151 vs 271; 95% confidence interval 38.4–50.2, p < 0.01), with a higher mortality rate (13.2% vs 7.2%, p = 0.03). In 2020, patients with non-ST-elevation myocardial infarction presented more often with acute heart failure (3.3% vs 0.7%, p = 0.04). During the COVID-19 outbreak, we observed increases in the total ischaemic time (303 ± 163.4 vs 200.8 ± 156.8 min, p < 0.05) and door-to-balloon time (69.2 ± 58.4 vs 50.5 ± 31.3 min, p < 0.01) in STEMI patients.
Conclusions: These findings should increase awareness of the morbidity and mortality related to missed or delayed treatment of ACS among the public and the healthcare services.

13.
Background: Left bundle branch (LBB) pacing is a novel pacing technique which may serve as an alternative to both right ventricular pacing for symptomatic bradycardia and cardiac resynchronisation therapy (CRT). A substantial amount of data has been reported by relatively few, highly experienced centres. This study describes the first experience of LBB pacing in a high-volume device centre.
Methods: Success rates (i.e. the ability to achieve LBB pacing), electrophysiological parameters and complications at implantation and up to 6 months of follow-up were prospectively assessed in 100 consecutive patients referred for various pacing indications.
Results: The mean age was 71 ± 11 years and 65% were male. The primary pacing indication was atrioventricular (AV) block in 40%, CRT in 42%, and sinus node dysfunction or refractory atrial fibrillation prior to AV node ablation in 9% each. Baseline left ventricular ejection fraction was < 50% in 57% of patients; mean baseline QRS duration was 145 ± 34 ms. Overall, LBB pacing was successful in 83 of 100 (83%) patients but tended to be lower in patients with a CRT pacing indication (69%, p = ns). Mean left ventricular activation time (LVAT) during LBB pacing was 81 ms and paced QRS duration was 120 ± 19 ms. LBB capture threshold and R-wave sense at implantation were 0.74 ± 0.4 V at 0.4 ms and 11.9 ± 5.9 mV, respectively, and remained stable at 6-month follow-up. No complications occurred during implantation or follow-up.
Conclusion: LBB pacing for bradycardia pacing and resynchronisation therapy can be readily adopted by experienced implanters, with favourable success rates and safety profile.

14.
Objectives: Since vitamin D insufficiency is common worldwide in people with HIV, we explored the safety and efficacy of high-dose cholecalciferol (D₃) in Botswana and evaluated potential modifiers of the change in serum 25-hydroxyvitamin D (Δ25D).
Design: Prospective, randomized, double-blind, 12-week pilot trial of subjects aged 5.0–50.9 years.
Methods: Sixty subjects were randomized within five age groups to either 4000 or 7000 IU per day of D₃ and evaluated for vitamin D, parathyroid hormone, HIV, safety and growth status. Efficacy was defined as serum 25-hydroxyvitamin D (25D) ≥ 32 ng/mL, and safety as no simultaneous elevation of serum calcium and 25D. Also assessed were HIV plasma RNA viral load (VL), CD4%, antiretroviral therapy (ART) regimen, and height-adjusted (HAZ), weight-adjusted (WAZ) and body mass index (BMIZ) Z-scores.
Results: Subjects were 50% male, age (mean ± SD) 19.5 ± 11.8 years, CD4% 31.8 ± 10.4, with a baseline log₁₀ VL range of <1.4 to 3.8; VL was detectable (>1.4) in 22%. From baseline to 12 weeks, 25D increased from 36 ± 9 ng/mL to 56 ± 18 ng/mL (p < 0.0001), and 68% and 90% had 25D ≥ 32 ng/mL, respectively (p = 0.02). Δ25D was similar by dose. No subjects had simultaneously increased serum calcium and 25D. WAZ and BMIZ improved by 12 weeks (p < 0.04). HAZ and CD4% increased and VL decreased in the 7000 IU/d group (p < 0.04). Younger (5–13 y) and older (30–50 y) subjects had greater Δ25D than those aged 14–29 y (26 ± 17 and 28 ± 12 vs. 11 ± 11 ng/mL, respectively, p ≤ 0.001). Δ25D was higher with efavirenz- or nevirapine-based treatment than with protease inhibitor-based treatment (22 ± 12 and 27 ± 17 vs. 13 ± 10, respectively, p ≤ 0.03).
Conclusions: In this pilot study in Botswana, 12-week high-dose D₃ supplementation was safe and improved vitamin D, growth and HIV status; age and ART regimen were significant effect modifiers.
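The per-subject efficacy and safety definitions above reduce to two boolean checks; a trivial sketch, with names of my own choosing:

```python
def vitd_endpoints(d25_ng_ml, calcium_elevated, d25_elevated):
    """Per-subject endpoint flags as defined in the methods:
    efficacy = serum 25D >= 32 ng/mL; a safety failure is the
    simultaneous elevation of serum calcium and 25D."""
    efficacious = d25_ng_ml >= 32
    safety_failure = calcium_elevated and d25_elevated
    return efficacious, safety_failure

# Illustrative subject at 12 weeks: 25D elevated but calcium normal,
# so efficacious and safe under these definitions.
vitd_endpoints(56, calcium_elevated=False, d25_elevated=True)
```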

Trial Registration

ClinicalTrials.gov NCT02189902

15.
Background: Oral bleeding after dental extraction in patients on non-vitamin K oral anticoagulants (NOACs) is a frequent problem. We investigated whether a 10% tranexamic acid (TXA) mouthwash decreases post-extraction bleeding in patients treated with NOACs.
Methods and findings: The EXTRACT-NOAC study is a randomized, double-blind, placebo-controlled, multicenter clinical trial. Patients were randomly assigned to 10% TXA or placebo mouthwash and were instructed to use the mouthwash once prior to dental extraction, and thereafter 3 times a day for 3 days. The primary outcome was the number of patients with any post-extraction oral bleeding up to day 7. Secondary outcomes included periprocedural, early and delayed bleeding; safety outcomes included all thrombotic events. The first patient was randomized on February 9, 2018 and the last on March 12, 2020. Of 222 randomized patients, 218 were included in the full analysis set, of whom 106 were assigned to TXA (74.8 ± 8.8 years; 81 men) and 112 to placebo (72.7 ± 10.7 years; 64 men). Post-extraction bleeding occurred in 28 (26.4%) patients in the TXA group and in 32 (28.6%) patients in the placebo group (relative risk, 0.92; 95% confidence interval [CI], 0.60 to 1.42; P = 0.72). There were 46 bleeds in the TXA group and 85 in the placebo group (rate ratio, 0.57; 95% CI, 0.31 to 1.05; P = 0.07). TXA did not reduce the rate of periprocedural bleeding (bleeding score 4 ± 1.78 versus 4 ± 1.82, P = 0.80) or early bleeding (rate ratio, 0.76; 95% CI, 0.42 to 1.37). Delayed bleeding (rate ratio, 0.32; 95% CI, 0.12 to 0.89) and bleeding after multiple extractions (rate ratio, 0.40; 95% CI, 0.20 to 0.78) were lower in the TXA group. One patient in the placebo group had a transient ischemic attack while interrupting NOAC therapy in preparation for the dental extraction. Two limitations of the study were the premature interruption of the trial following a futility analysis and the assessment of patients' compliance, which was based on self-reported information during follow-up.
Conclusions: In patients on NOACs undergoing dental extraction, TXA does not seem to reduce the rate of periprocedural or early postoperative oral bleeding compared with placebo. TXA appears to reduce delayed bleeds and postoperative oral bleeding if multiple teeth are extracted.
Trial registration: ClinicalTrials.gov NCT03413891; EudraCT number 2017-001426-17 (eudract.ema.europa.eu).
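The primary-outcome relative risk and its confidence interval can be reproduced from the counts given above, using the standard log-normal (Katz-type) interval:

```python
import math

def relative_risk_ci(events_1, n_1, events_0, n_0, z=1.96):
    """Relative risk of two proportions with a log-normal 95% CI."""
    rr = (events_1 / n_1) / (events_0 / n_0)
    se = math.sqrt(1/events_1 - 1/n_1 + 1/events_0 - 1/n_0)  # SE of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Primary outcome above: 28/106 patients bled on TXA vs 32/112 on placebo.
rr, lo, hi = relative_risk_ci(28, 106, 32, 112)  # ≈ 0.92 (0.60 to 1.42)
```

Because the interval comfortably spans 1, the trial could not show an effect of TXA on the primary outcome, consistent with the P = 0.72 reported.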

Anna Ockerman and co-workers evaluate mouthwash containing tranexamic acid for people on non-vitamin K oral anticoagulants undergoing dental extraction.  相似文献   
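The relative risk and its 95% confidence interval quoted in the EXTRACT-NOAC abstract above follow the standard large-sample log-transform method for a two-by-two table. A minimal sketch reproducing the reported figures (28/106 bleeding events with TXA vs. 32/112 with placebo) might look like this; the function name and structure are illustrative, not taken from the trial's analysis code.

```python
import math

def relative_risk(events_a, n_a, events_b, n_b, z=1.96):
    """Relative risk of group A vs. group B with a Wald (log-scale) 95% CI.

    Uses the standard large-sample standard error:
    SE(ln RR) = sqrt(1/a - 1/n_a + 1/b - 1/n_b).
    """
    rr = (events_a / n_a) / (events_b / n_b)
    se = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Reported counts: 28/106 patients bled with TXA vs. 32/112 with placebo
rr, lo, hi = relative_risk(28, 106, 32, 112)
print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
# RR = 0.92 (95% CI 0.60 to 1.42)
```

The computed values match the abstract's "relative risk, 0.92; 95% CI, 0.60 to 1.42", confirming the standard formula is the one in play.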

16.
BackgroundLeft bundle branch area pacing (LBBAP) has recently been introduced as a physiological pacing technique with synchronous left ventricular activation. It was our aim to evaluate the feasibility and learning curve of the technique, as well as the electrical characteristics of LBBAP.Methods and resultsLBBAP was attempted in 80 consecutive patients and electrocardiographic characteristics were evaluated during intrinsic rhythm, right ventricular septum pacing (RVSP) and LBBAP. Permanent lead implantation was successful in 77 of 80 patients (96%). LBBAP lead implantation time and fluoroscopy time shortened significantly from 33 ± 16 and 21 ± 13 min to 17 ± 5 and 12 ± 7 min, respectively, from the first 20 to the last 20 patients. Left bundle branch (LBB) capture was achieved in 54 of 80 patients (68%). In 36 of 45 patients (80%) with intact atrioventricular conduction and narrow QRS, an LBB potential (LBBpot) was present with an LBBpot to onset of QRS interval of 22 ± 6 ms. QRS duration increased significantly more during RVSP (141 ± 20 ms) than during LBBAP (125 ± 19 ms), compared to 130 ± 30 ms without pacing. An even clearer difference was observed for QRS area, which increased significantly more during RVSP (from 32 ± 16 µVs to 73 ± 20 µVs) than during LBBAP (41 ± 15 µVs). QRS area was significantly smaller in patients with LBB capture compared to patients without LBB capture (43 ± 18 µVs vs 54 ± 21 µVs, respectively). In patients with LBB capture (n = 54), the interval from the pacing stimulus to R‑wave peak time in lead V6 was significantly shorter than in patients without LBB capture (75 ± 14 vs 88 ± 9 ms, respectively).ConclusionLBBAP is a safe and feasible technique, with a clear learning curve that seems to flatten after 40–60 implantations. LBB capture is achieved in two-thirds of patients. 
Compared to RVSP, LBBAP largely maintains ventricular electrical synchrony at a level close to intrinsic (narrow QRS) rhythm.Supplementary InformationThe online version of this article (10.1007/s12471-022-01679-7) contains supplementary material, which is available to authorized users.  相似文献   

17.
BackgroundCardiovascular guidelines recommend (bi-)annual computed tomography (CT) or magnetic resonance imaging (MRI) for surveillance of the diameter of thoracic aortic aneurysms (TAAs). However, no previous study has demonstrated the necessity for this approach. The current study aims to provide patient-specific intervals for imaging follow-up of non-syndromic TAAs.MethodsA total of 332 patients with non-syndromic ascending aortic aneurysms were followed over a median period of 6.7 years. Diameters were assessed using all available imaging techniques (echocardiography, CT and MRI). Growth rates were calculated from the differences between the first and last examinations. The diagnostic accuracy of follow-up protocols was calculated as the percentage of subjects requiring pre-emptive surgery in whom timely identification would have occurred.ResultsThe mean growth rate in our population was 0.2 ± 0.4 mm/year. The highest recorded growth rate was 2.0 mm/year, while 40.6% of patients showed no diameter expansion during follow-up. Females exhibited significantly higher growth rates than males (0.3 ± 0.5 vs 0.2 ± 0.4 mm/year, p = 0.007). Conversely, a bicuspid aortic valve was not associated with more rapid aortic growth. The optimal imaging protocol comprises triennial imaging of aneurysms 40–49 mm in diameter and yearly imaging of those measuring 50–54 mm. This strategy is as accurate as annual follow-up, but reduces the number of imaging examinations by 29.9%.ConclusionsIn our population of patients with non-syndromic TAAs, we found aneurysm growth rates to be lower than those previously reported. Yearly imaging does not lead to changes in the management of small aneurysms. Thus, lower imaging frequencies might be a good alternative approach.  相似文献   

18.
IntroductionIn the present study, we sought to identify markers in patients with anti-neutrophil cytoplasmic antibody (ANCA)-associated vasculitis (AAV) that distinguish those achieving remission at 6 months following rituximab or cyclophosphamide treatment from those for whom treatment failed in the Rituximab in ANCA-Associated Vasculitis (RAVE) trial.MethodsClinical and flow cytometry data from the RAVE trial were downloaded from the Immunology Database and Analysis Portal and Immune Tolerance Network TrialShare public repositories. Flow cytometry data were analyzed using validated automated gating and joined with clinical data. Lymphocyte and granulocyte populations were measured in patients who achieved or failed to achieve remission.ResultsLymphocyte subsets were not associated with treatment outcome for either treatment. We defined a Granularity Index (GI) that measures the difference between the percentage of hypergranular and hypogranular granulocytes. We found that rituximab-treated patients who achieved remission had a significantly higher GI at baseline than those who did not (p = 0.0085) and that this pattern was reversed in cyclophosphamide-treated patients (p = 0.037). We defined optimal cutoff values of the GI using the Youden index. Cyclophosphamide was superior to rituximab in inducing remission in patients with GI below −9.25% (67% vs. 30%, respectively; p = 0.033), whereas rituximab was superior to cyclophosphamide for patients with GI greater than 47.6% (83% vs. 33%, respectively; p = 0.0002).ConclusionsWe identified distinct subsets of granulocytes found at baseline in patients with AAV that predicted whether they were more likely to achieve remission with cyclophosphamide or rituximab. Profiling patients on the basis of the GI may lead to more successful trials and therapeutic courses in AAV.

Trial registration

ClinicalTrials.gov identifier (for original study from which data were obtained): NCT00104299. Date of registration: 24 February 2005.

Electronic supplementary material

The online version of this article (doi:10.1186/s13075-015-0778-z) contains supplementary material, which is available to authorized users.  相似文献   
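The optimal GI cutoffs in the RAVE analysis above were chosen with the Youden index, i.e. the threshold that maximizes J = sensitivity + specificity − 1. A minimal, self-contained sketch of that selection step could look like the following; the marker values are made up for illustration and are not from the RAVE data set.

```python
def youden_optimal_cutoff(responders, non_responders):
    """Return the cutoff maximizing Youden's J = sensitivity + specificity - 1.

    responders:     marker values for patients who achieved remission
                    (assumed to be higher on average).
    non_responders: marker values for patients who did not.
    """
    best_cut, best_j = None, -1.0
    # Every observed value is a candidate threshold ("value >= cut" = positive)
    for cut in sorted(set(responders) | set(non_responders)):
        sens = sum(v >= cut for v in responders) / len(responders)
        spec = sum(v < cut for v in non_responders) / len(non_responders)
        j = sens + spec - 1
        if j > best_j:
            best_cut, best_j = cut, j
    return best_cut, best_j

# Hypothetical Granularity Index values, not trial data
remission = [48, 55, 61, 70]
no_remission = [-20, -12, -3, 10]
cut, j = youden_optimal_cutoff(remission, no_remission)
print(cut, j)  # 48 1.0 (perfect separation in this toy example)
```

In practice the same maximization is usually run over an ROC curve (e.g. scikit-learn's `roc_curve` output) rather than raw values, but the criterion is identical.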

19.
BackgroundIn Phase II/III randomized controlled clinical trials for the treatment of acute uncomplicated malaria, pyronaridine–artesunate demonstrated high efficacy and a safety profile consistent with that of comparators, except that asymptomatic, mainly mild-to-moderate transient increases in liver aminotransferases were reported for some patients. Hepatic safety, tolerability, and effectiveness have not been previously assessed under real-world conditions in Africa.Methods and findingsThis single-arm, open-label, cohort event monitoring study was conducted at 6 health centers in Cameroon, Democratic Republic of Congo, Gabon, Ivory Coast, and Republic of Congo between June 2017 and April 2019. The trial protocol was designed to resemble, as closely as possible, real-world clinical practice for the treatment of malaria at the centers. Eligible patients were adults or children of either sex, weighing at least 5 kg, with acute uncomplicated malaria who did not have contraindications for pyronaridine–artesunate treatment as per the summary of product characteristics. Patients received fixed-dose pyronaridine–artesunate once daily for 3 days, dosed by body weight, without regard to food intake. A tablet formulation was used in adults and adolescents and a pediatric granule formulation in children and infants under 20 kg body weight. The primary outcome was the hepatic event incidence, defined as the appearance of the clinical signs and symptoms of hepatotoxicity confirmed by a >2× rise in alanine aminotransferase/aspartate aminotransferase (ALT/AST) versus baseline in patients with baseline ALT/AST >2× the upper limit of normal (ULN). As a secondary outcome, this was assessed in patients with ALT/AST >2× ULN prior to treatment versus a matched cohort of patients with normal baseline ALT/AST. The safety population comprised 7,154 patients, of mean age 13.9 years (standard deviation (SD) 14.6), around half of whom were male (3,569 [49.9%]). 
Patients experienced 8,560 malaria episodes; 158 occurred in patients with baseline ALT/AST elevations >2×ULN. No protocol-defined hepatic events occurred following pyronaridine–artesunate treatment of malaria patients with or without baseline hepatic dysfunction. Thus, no cohort comparison could be undertaken. Also, as postbaseline clinical chemistry was only performed where clinically indicated, postbaseline ALT/AST levels were not systematically assessed for all patients. Adverse events of any cause occurred in 20.8% (1,490/7,154) of patients, most frequently pyrexia (5.1% [366/7,154]) and vomiting (4.2% [303/7,154]). Adjusting for Plasmodium falciparum reinfection, clinical effectiveness at day 28 was 98.6% ([7,369/7,746] 95% confidence interval (CI) 98.3 to 98.9) in the per-protocol population. There was no indication that comorbidities or malnutrition adversely affected outcomes. The key study limitation was that postbaseline clinical biochemistry was only evaluated when clinically indicated.ConclusionsPyronaridine–artesunate had good tolerability and effectiveness in a representative African population under conditions similar to everyday clinical practice. These findings support pyronaridine–artesunate as an operationally useful addition to the management of acute uncomplicated malaria.Trial registrationClinicalTrials.gov NCT03201770.

Gaston Tona Lutete and co-workers report on safety and effectiveness of the antimalarial drug pyronaridine-artesunate in African countries.  相似文献   

20.
BackgroundPraziquantel (PZQ) is currently the only recommended drug for infection and disease caused by the schistosome species that infects humans; however, the current tablet formulation is not suitable for pre-school age children, mainly due to its bitterness and the large tablet size. We assessed the palatability of two new orally disintegrating tablet (ODT) formulations of PZQ.MethodologyThis randomized, single-blind, crossover, swill-and-spit palatability study (NCT02315352) was carried out at a single school in Tanzania in children aged 6–11 years old, with or without schistosomiasis infection, as infection status was not part of the assessment. Children were stratified according to age group (6–8 years or 9–11 years) and gender, then randomized to receive each formulation in a pre-specified sequence. Over 2 days, the children assessed the palatability of Levo-Praziquantel (L-PZQ) ODT 150 mg and Racemate Praziquantel (Rac-PZQ) ODT 150 mg disintegrated in the mouth without water on the first day, and L-PZQ and Rac-PZQ dispersed in water and the currently available PZQ 600 mg formulation (PZQ-Cesol) crushed and dispersed in water on the second day. The palatability of each formulation was rated using a 100 mm visual analogue scale (VAS) incorporating a 5-point hedonic scale, immediately after spitting out the test product (VASt = 0; primary outcome) and after 2–5 minutes (VASt = 2–5).Principal findingsIn total, 48 children took part in the assessment. Overall, there was no reported difference in the VASt = 0 between the two ODT formulations (p = 0.106) without water. Higher VASt = 0 and VASt = 2–5 scores were reported for L-PZQ ODT compared with Rac-PZQ ODT in older children (p = 0.046 and p = 0.026, respectively). The VASt = 0 and VASt = 2–5 were higher for both ODT formulations compared with the standard formulation (p<0.001 for both time points). 
No serious adverse events were reported.Conclusions/SignificanceThe new paediatric-friendly formulations dispersed in water were both found to be more palatable than the existing standard formulation of PZQ. There may be gender and age effects on the assessment of palatability. Further research is needed to assess the efficacy and tolerability of the new PZQ ODT formulations in younger children.Trial registrationThe trial was registered on ClinicalTrials.gov (NCT02315352) and in the Pan African Clinical Trials Registry (PACTR201412000959159).  相似文献   
