Similar Articles (20 results)
1.

Background

Given the prevalence of non-valvular atrial fibrillation in the geriatric population, thromboembolic prevention by means of vitamin K antagonists (VKA) is one of practitioners' most frequent daily concerns. The effectiveness and safety of VKA treatment correlate directly with maximizing the time in therapeutic range, i.e., an International Normalized Ratio (INR) of 2.0-3.0. The older population concentrates many of the factors known to influence INR control, particularly concomitant medications and concurrent medical conditions, also defined as comorbidities.

Objective

To determine whether a high burden of comorbidities, defined by a Charlson Comorbidity Index (CCI) of 3 or greater, is associated with lower quality of INR control.

Study Design

Cross-sectional study.

Settings

French geriatric care units nationwide.

Participants

2164 patients aged 80 and over and treated with vitamin K antagonists.

Measurements

Comorbidities were assessed using the Charlson Comorbidity Index (CCI). The recorded data included age, sex, falls, kidney failure, hemorrhagic event, VKA treatment duration, and the number and type of concomitant medications. Quality of INR control, defined as time in therapeutic range (TTR), was assessed using the Rosendaal method.
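The Rosendaal method assumes the INR changes linearly between consecutive measurements and reports the fraction of person-days with an interpolated INR inside the target range. A minimal day-by-day sketch in Python, using hypothetical visit data rather than the study's actual implementation:

```python
from datetime import date

def rosendaal_ttr(visits, low=2.0, high=3.0):
    """Fraction of days with interpolated INR inside [low, high].

    visits: list of (date, INR) tuples sorted by date. Between
    consecutive measurements the INR is assumed to change linearly
    (a day-by-day approximation of the Rosendaal interpolation).
    """
    in_range = total = 0
    for (d0, inr0), (d1, inr1) in zip(visits, visits[1:]):
        span = (d1 - d0).days
        if span <= 0:
            continue
        total += span
        for step in range(span):
            inr = inr0 + (inr1 - inr0) * step / span
            if low <= inr <= high:
                in_range += 1
    return in_range / total if total else 0.0

# Hypothetical visit history (not study data):
visits = [(date(2023, 1, 1), 1.8),
          (date(2023, 1, 15), 2.5),
          (date(2023, 2, 1), 3.4)]
print(round(rosendaal_ttr(visits), 2))  # 0.65
```

Here the patient is sub-therapeutic for the first few interpolated days and supra-therapeutic at the end, so roughly two-thirds of the 31 observed days count as in range.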

Results

487 patients were identified in the low-quality INR control group. On multivariate logistic regression analysis, low-quality INR control was independently associated with a CCI ≥3 (OR = 1.487; 95% CI [1.15; 1.91]). The other variables associated with low-quality INR control were a hemorrhagic event (OR = 3.151; 95% CI [1.64; 6.07]) and hospitalization (OR = 1.614; 95% CI [1.21; 2.14]).

Conclusion

An elevated CCI score (≥3) was associated with low-quality control of INR in elderly patients treated with VKA. Further research is needed to corroborate this finding.

2.

Background

Center-specific TTR (c-TTR) is a measure reporting the mean patient TTR within an anticoagulation clinic, describing the quality of anticoagulant monitoring offered by that clinic. c-TTR shows considerable between-center variation, but its determinants are poorly understood.

Objectives

We aimed to evaluate which clinical, procedural, or laboratory factors are associated with c-TTR variability in a multicenter, observational, cross-sectional study over a five-year period.

Patients/Methods

Data from 832,204 individual patients followed for VKA therapy in 292 Centers affiliated with the Italian Federation of Anticoagulation Clinics (FCSA) were analyzed. c-TTR was computed based on the TTR of patients followed at each Center, and a mixed linear regression model was used for a predefined set of explanatory variables.

Results

The Center next-visit interval ratio (the mean number of days to the next visit after a visit with an INR outside the therapeutic range, divided by the mean number of days to the next visit after a visit with an INR within the therapeutic range), the Center mean patient INR, and the Center laboratory performance at EQA proficiency testing were the only variables independently associated with c-TTR (β-coefficients -17.32, 9.67, and -0.11, respectively; r² = 0.635).
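The next-visit interval ratio defined above can be computed directly from a patient's visit history. A hedged sketch of that definition (the function name, target range, and visit data are our assumptions; the FCSA analysis averages this quantity over each Center's patients):

```python
from datetime import date

def next_visit_interval_ratio(visits, low=2.0, high=3.0):
    """Mean days to the next visit after an out-of-range INR,
    divided by the mean days to the next visit after an in-range
    INR. A low ratio means out-of-range patients are recalled
    sooner, the proactive pattern associated with higher c-TTR."""
    out_iv, in_iv = [], []
    for (d0, inr0), (d1, _) in zip(visits, visits[1:]):
        days = (d1 - d0).days
        (in_iv if low <= inr0 <= high else out_iv).append(days)
    if not out_iv or not in_iv:
        return None  # ratio undefined without both kinds of visit
    return (sum(out_iv) / len(out_iv)) / (sum(in_iv) / len(in_iv))

# Hypothetical patient: recalled after 7 days when out of range,
# after 21 days when in range.
visits = [(date(2023, 1, 1), 1.5), (date(2023, 1, 8), 2.5),
          (date(2023, 1, 29), 2.6), (date(2023, 2, 19), 3.5),
          (date(2023, 2, 26), 2.4)]
print(round(next_visit_interval_ratio(visits), 2))  # 0.33
```

The negative β-coefficient (-17.32) reported above fits this reading: the smaller the ratio, the sooner out-of-range patients are re-checked, and the higher the center's c-TTR.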

Conclusions

These findings suggest that c-TTR is associated with proactive strategies aimed at keeping patients very close to their target INR, with prompt re-evaluation of patients with under- or over-therapeutic INRs.

3.

Background

Patient self-management (PSM) of oral anticoagulant therapy with vitamin K antagonists has demonstrated efficacy in randomized, controlled trials. However, the effectiveness of PSM in clinical practice, and whether outcomes differ between females and males, has been sparsely investigated. The objective was to evaluate the sex-dependent effectiveness of PSM of oral anticoagulant therapy in everyday clinical practice.

Methods

All patients performing PSM affiliated with Aarhus University Hospital and Aalborg University Hospital, Denmark in the period 1996–2012 were included in a case-series study. Effectiveness was estimated using the following parameters: stroke, systemic embolism, major bleeding, intracranial bleeding, gastrointestinal bleeding, death, and time spent in the therapeutic international normalized ratio (INR) target range. Prospectively registered patient data were obtained from two databases at the two hospitals. Cross-linkage between the databases and national registries provided detailed information on the incidence of death, bleeding and thromboembolism at the individual level.

Results

A total of 2068 patients were included, representing 6,900 patient-years in total. Males achieved a significantly better therapeutic INR control than females; females spent 71.1% of the time within therapeutic INR target range, whereas males spent 76.4% (p<0.0001). Importantly, death, bleeding and thromboembolism were not significantly different between females and males.

Conclusions

Among patients treated with self-managed oral anticoagulant therapy, males achieve a higher effectiveness than females in terms of time spent in therapeutic INR range, but the incidence of major complications is low and similar in both sexes.

4.

Objectives

This study evaluated the waiting list for elective electrical cardioversion (ECV) for persistent atrial fibrillation (AF), focusing on when and why procedures were postponed. We compared the effects of management of the waiting list conducted by physicians versus management by nurse practitioners (NPs) and we evaluated the safety of our anticoagulating policy by means of bleeding or thromboembolic complications during and after ECV.

Background

Not all patients selected for ECV receive their treatment at the first planned instance due to a variety of reasons. These reasons are still undocumented.

Methods

We evaluated 250 consecutive patients with persistent AF admitted to our clinic for elective ECV.

Results

Within 5 to 6 weeks, 186 of 242 patients (77%) received ECV. The main reason for postponing an ECV was an inadequate international normalised ratio (INR); other reasons included spontaneous sinus rhythm and a switch to rate control. A total of 23 of the 147 patients (16%) managed by the research physician were postponed due to an inadequate INR at admission, versus 4 of 98 patients (4%) managed by NPs (p = 0.005).

Conclusion

An inadequate INR is the main reason for postponing an ECV. Management of ECV by NPs is safe and leads to fewer postponements at admission.

5.

Background

This study sought to investigate the relative efficacy and safety of non-vitamin K oral anticoagulants (NOACs) for the treatment of venous thromboembolism (VTE) in cancer patients.

Methods

A systematic search of the PubMed, EMBASE, and ClinicalTrials.gov databases identified all multicentre, randomised phase III trials investigating the initial use of NOAC against a vitamin K antagonist (VKA) together with subcutaneous heparin or low molecular weight heparin (upstart) for treatment of VTE. Outcomes of interest were recurrent VTE (deep venous thrombosis or pulmonary embolism), and clinically relevant bleeding.

Results

Four randomised controlled phase III trials were included, comprising a total of 19,060 patients randomised to either NOAC or VKA. For patients with active cancer (N = 759), the analysis of the efficacy outcomes demonstrated a trend in favour of NOAC (OR 0.56, 95% CI 0.28–1.13). Similarly, analyses of the safety outcomes comparing NOAC to VKA and enoxaparin demonstrated a trend in favour of NOAC (OR 0.88, 95% CI 0.57–1.35).

Conclusion

Point estimates of the effect size suggest a beneficial effect of NOACs in the treatment of VTE in cancer, in terms of both efficacy and safety, but given the small number of cancer patients in the randomised trials, statistical significance was not achieved.

7.

Background

Patients who have had an unprovoked deep venous thrombosis (DVT) or pulmonary embolus (PE) are at high risk for recurrent venous thromboembolism (VTE). Extended "life-long" anticoagulation has been recommended in these patients. However, the risk-benefit ratio of this approach is controversial, and the role of the direct oral anticoagulants (DOACs) and aspirin is unclear. Furthermore, in some patients with a "weak provoking factor" there is clinical equipoise regarding continuation or cessation of anticoagulant therapy after treatment of the acute VTE event.

Objective

A systematic review and meta-analysis to determine the risks (major bleeding) and benefits (recurrent VTE and mortality) of extended anticoagulation with vitamin K antagonists (VKA), DOACs and aspirin in patients with an unprovoked VTE, and in patients with clinical equipoise regarding continuation or cessation of anticoagulant therapy. In addition, we sought to determine the risk of recurrent VTE events once extended anti-thrombotic therapy was discontinued.

Data Sources

MEDLINE, Cochrane Register of Controlled Trials, citation review of relevant primary and review articles.

Study Selection

Randomized placebo-controlled trials (RCTs) that compared the risk of recurrent VTE in patients with an unprovoked DVT or PE who had been treated for at least 3 months with a VKA or a DOAC and were then randomized to receive an oral anti-thrombotic agent or placebo for at least 6 additional months. We also included studies enrolling patients in whom clinical equipoise existed regarding the continuation or cessation of anticoagulant therapy.

Data Extraction

Independent extraction of articles by both authors using predefined data fields, including study quality indicators. Data were abstracted on study size, study setting, initial event (DVT or PE), percentage of patients where the initial VTE event was unprovoked, the number of recurrent VTE events, major bleeds and mortality during the period of extended anticoagulation in the active treatment and placebo arms. In addition, we recorded the event rate once extended treatment was stopped. Meta-analytic techniques were used to summarize the data. Studies were grouped according to the type of anti-thrombotic agent.

Data Synthesis

Seven studies, enrolling a total of 6778 patients, met our inclusion criteria; two studies evaluated the extended use of Coumadin, three evaluated a DOAC, and two evaluated aspirin. The duration of follow-up varied from 6 to 37 months. In the Coumadin and aspirin studies, 100% of the randomized patients had an unprovoked VTE, while in the DOAC studies between 73.5% and 93.2% of the VTE events were unprovoked. In the control group, recurrent VTE occurred in 9.7% of patients, compared to 2.8% in the active treatment group (OR 0.21; 95% CI 0.11–0.42, p<0.0001). VKA, DOACs and aspirin significantly reduced the risk of recurrent VTE, with VKA and DOACs being significantly more effective than aspirin. Major bleeding events occurred in 12 patients in the control group (0.4%) and in 25 of 3815 (0.6%) patients in the active treatment group (OR 1.64; 95% CI 0.69–3.90, NS). There were 39 (1.3%) deaths in control patients and 33 (0.9%) deaths in the anti-thrombotic group during the treatment period (OR 0.73; 95% CI 0.40–1.33, NS). Patients whose initial VTE event was a PE were more likely to have a recurrent PE than a DVT. The annualized event rate after discontinuation of extended antithrombotic therapy was 4.4% in the control group and 6.5% in the active treatment arm.
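As a rough check on numbers like these, a crude odds ratio with a Wald confidence interval can be computed from the aggregate 2x2 counts. Note this naive pooling ignores the per-study stratification of the meta-analysis, so it only approximates the reported OR of 1.64; the control-arm size of 2963 is our inference from 6778 total minus 3815 treated:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with a Wald 95% CI from a 2x2 table:
    a/b = events/non-events in treatment, c/d = in control."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# Major bleeds: 25/3815 on extended treatment; 12 bleeds in the
# control arm, whose size (2963) is inferred from 6778 - 3815.
or_, lo, hi = odds_ratio_ci(25, 3815 - 25, 12, 2963 - 12)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 1.62 0.81 3.23
```

The crude estimate (1.62, CI crossing 1) is close to, but not identical with, the stratified meta-analytic result quoted in the abstract, as expected.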

Conclusions

VKA, DOACs and aspirin significantly reduced the risk of recurrent VTE, with DOACs and VKA being more effective than aspirin. The decision regarding life-long anticoagulation following an unprovoked DVT or PE should depend on the patients’ risk for recurrent PE as well as the patients’ values and preferences.

8.

Background

Recently, using the patient’s genotype to guide warfarin dosing has gained interest; however, whether pharmacogenetics-based dosing (PD) improves clinical outcomes compared to conventional dosing (CD) remains unclear. Thus, we performed a meta-analysis to evaluate these two strategies.

Methods

The PubMed, Embase, Cochrane Library, China National Knowledge Infrastructure (CNKI), Chinese VIP and Chinese Wan-fang databases were searched. The Cochrane Collaboration’s tool was used to assess the risk of bias in randomized controlled trials (RCTs). The primary outcome was time within the therapeutic range (TTR); the secondary end points were the time to maintenance dose and time to first therapeutic international normalized ratio (INR), an INR greater than 4, adverse events, major bleeding, thromboembolism and death from any cause.

Results

A total of 11 trials involving 2,678 patients were included in our meta-analysis. The results showed that PD did not improve the TTR compared to CD, although PD significantly shortened the time to maintenance dose (MD = -8.80; 95% CI: -11.99 to -5.60; P<0.00001) and the time to first therapeutic INR (MD = -2.80; 95% CI: -3.45 to -2.15; P<0.00001). Additionally, PD significantly reduced the risk of adverse events (RR = 0.86; 95% CI: 0.75 to 0.99; P = 0.03) and major bleeding (RR = 0.36; 95% CI: 0.15 to 0.89; P = 0.03), although it did not reduce the percentage of INRs greater than 4, the risk of thromboembolic events, or death from any cause. Subgroup analysis showed that PD yielded greater improvement in the TTR and over-anticoagulation endpoints with a fixed initial dose than with a non-fixed initial dose.

Conclusions

The use of genotype testing in the management of warfarin anticoagulation was associated with significant improvements in INR-related and clinical outcomes. Thus, genotype-based regimens can be considered a reliable and accurate method to determine warfarin dosing and may be preferred over fixed-dose regimens.

Trial Registration PROSPERO

Database registration: CRD42015024127.

9.

Background

Warfarin-related nephropathy (WRN) is a recently described disease entity in which excessive warfarinization (international normalized ratio (INR) >3.0) causes acute kidney injury. Previous reports on WRN included few Asian patients, who might differ from Western WRN patients in terms of genetic and environmental factors.

Methods

Between March 2003 and December 2011, data from a total of 1297 patients who had serum creatinine (sCr) levels measured within 1 week after an INR >3.0 and within 6 months before it were analyzed through a retrospective review of the electronic medical records of a single tertiary hospital in Korea.

Results

WRN developed in 19.3% of patients with excessive warfarinization. The incidence was higher in the chronic kidney disease (CKD) group than in the non-CKD group. The risk of WRN increased as the basal serum albumin level decreased, and was strongly associated with a highest-quartile serum AST level after the INR elevation and with the presence of congestive heart failure. However, the presence of atrial fibrillation was protective against the development of WRN. Neither the presence of CKD nor the basal estimated glomerular filtration rate (eGFR) was an independent risk factor for WRN. Despite no difference in the basal sCr level, the sCr level after follow-up was higher in patients with WRN than in those without. Mortality rates were also higher in patients with WRN.

Conclusions

WRN developed in 19.3% of patients with excessive warfarinization. Lower basal serum albumin, a highest-quartile serum AST level after the INR elevation, and congestive heart failure were associated with the occurrence of WRN. The development of WRN adversely affected renal and patient outcomes.

10.

Background

Essential information regarding the efficacy and safety of vitamin K antagonist (VKA) treatment for atrial fibrillation (AF) in non-dialysis-dependent chronic kidney disease (CKD) is still lacking in the current literature. The aim of our study was to compare the risks of stroke or transient ischemic attack (TIA) and major bleeds between patients without CKD (eGFR >60 ml/min) and those with moderate (eGFR 30–60 ml/min) or severe non-dialysis-dependent CKD (eGFR <30 ml/min).

Methods

We included 300 patients without CKD, 294 with moderate, and 130 with severe non-dialysis dependent CKD, who were matched for age and sex. Uni- and multivariate Cox regression analyses were performed reporting hazard ratios (HRs) for the endpoint of stroke or TIA and the endpoint of major bleeds as crude values and adjusted for comorbidity and platelet-inhibitor use.

Results

Overall, 6.2% (45/724, 1.7/100 patient-years) of patients developed stroke or TIA and 15.6% (113/724, 4.8/100 patient-years) a major bleeding event. Patients with severe CKD were at higher risk of stroke or TIA and major bleeds during VKA treatment compared with those without renal impairment, HR 2.75 (95%CI 1.25–6.05) and 1.66 (95%CI 0.97–2.86), or with moderate CKD, HR 3.93 (95%CI 1.71–9.00) and 1.86 (95%CI 1.08–3.21), respectively. These risks were similar for patients without and with moderate CKD. Importantly, both less time spent within the therapeutic range and high INR variability were associated with increased risks of stroke or TIA and major bleeds in severe CKD patients.

Conclusions

VKA treatment for AF in patients with severe CKD has a poor safety and efficacy profile, likely related to suboptimal anticoagulation control. Our study findings stress the need for better-tailored, individualised anticoagulant treatment approaches for patients with AF and severe CKD.

11.

Background

Vitamin K antagonists (VKAs) are an effective anticoagulant treatment for deep venous thrombosis (DVT). However, their use is limited by the risk of bleeding and the need for frequent, long-term laboratory monitoring. Therefore, new oral anticoagulant drugs (NOACs) such as dabigatran, which have lower rates of (major) intracranial bleeding than VKAs and do not require monitoring, may be considered.

Objectives

To estimate resource utilization and costs of patients treated with the VKAs acenocoumarol and phenprocoumon, for the indication DVT. Furthermore, a formal cost-effectiveness analysis of dabigatran compared to VKAs for DVT treatment was performed, using these estimates.

Methods

A retrospective observational study design in the thrombotic service of a teaching hospital (Deventer, The Netherlands) was applied to estimate real-world resource utilization and costs of VKA monitoring. A pooled analysis of data from RE-COVER and RE-COVER II on DVT was used to derive the probabilities for events in the cost-effectiveness model. Dutch costs, utilities and specific data on coagulation monitoring levels were incorporated in the model. In addition to the base-case analysis, univariate, probabilistic sensitivity, and scenario analyses were performed.

Results

Real-world resource utilization in the thrombotic service for patients treated with VKA for the indication of DVT consisted of 12.3 measurements of the international normalized ratio (INR), with corresponding INR monitoring costs of €138 for a standardized treatment period of 180 days. In the base case, dabigatran treatment compared to VKAs in a cohort of 1,000 DVT patients resulted in savings of €18,900 (95% uncertainty interval (UI) -95,832, 151,162) and 41 (95% UI -18, 97) quality-adjusted life-years (QALYs) gained, calculated from a societal perspective. The probability that dabigatran is cost-effective at a conservative willingness-to-pay threshold of €20,000 per QALY was 99%. Sensitivity and scenario analyses also indicated cost savings or cost-effectiveness below this same threshold.
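At a given willingness-to-pay threshold, a base-case result like this can be summarized as an incremental net monetary benefit. A small illustrative calculation using the abstract's cohort-level figures (the function name is ours, not the study's):

```python
def net_monetary_benefit(delta_cost, delta_qalys, wtp):
    """Incremental net monetary benefit: positive values favour
    the new treatment at willingness-to-pay wtp per QALY."""
    return wtp * delta_qalys - delta_cost

# Base case per 1,000 patients: dabigatran saved EUR 18,900
# (delta_cost = -18900) and gained 41 QALYs, evaluated at the
# EUR 20,000/QALY threshold used in the abstract.
print(net_monetary_benefit(-18900, 41, 20000))  # 838900
```

Because dabigatran both lowers costs and adds QALYs here, it is "dominant" over VKA: the net monetary benefit stays positive at any non-negative threshold.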

Conclusions

Total INR monitoring costs per patient were estimated at a minimum of €138. When these real-world data were inserted into a cost-effectiveness analysis for patients diagnosed with DVT, dabigatran appeared to be a cost-saving alternative to VKAs in the Netherlands in the base case. Cost savings or favorable cost-effectiveness were robust in sensitivity and scenario analyses. Our results warrant confirmation in other settings and locations.

12.

Background

Tumor recurrence is a major problem after curative resection of hepatocellular carcinoma (HCC). The current study evaluated the effects of adjuvant iodine-125 (125I) brachytherapy on postoperative recurrence of HCC.

Methodology/Principal Findings

From July 2000 to June 2004, 68 HCC patients undergoing curative hepatectomy were randomly assigned to a 125I adjuvant brachytherapy group (n = 34) or a best-care control group (n = 34). Patients in the 125I adjuvant brachytherapy group received 125I seed implantation on the raw surface of the resection. Patients in the best-care control group received identical treatments except for the 125I seed implantation. Time to recurrence (TTR) and 1-, 3- and 5-year overall survival (OS) were compared between the two groups. The follow-up ended in January 2010 and lasted 7.7–106.4 months, with a median of 47.6 months. TTR was significantly longer in the 125I group (mean of 60.0 months vs. 36.7 months in the control group). The 1-, 3- and 5-year recurrence-free rates were 94.12%, 76.42%, and 73.65% in the 125I group vs. 88.24%, 50.00%, and 29.41% in the control group, respectively. The 1-, 3- and 5-year OS rates were 94.12%, 73.53%, and 55.88% in the 125I group vs. 88.24%, 52.94%, and 29.41% in the control group, respectively. 125I brachytherapy decreased the risk of recurrence (HR = 0.310) and the risk of death (HR = 0.364). The most frequent adverse events in the 125I group included nausea, vomiting, arrhythmia, and decreased white blood cell and/or platelet counts, and were generally mild and manageable.

Conclusions/Significance

Adjuvant 125I brachytherapy significantly prolonged TTR and increased the OS rate after curative resection of HCC.

Trial Registration

Australian New Zealand Clinical Trials Registry ACTRN12610000081011.

13.

Setting

An Australian metropolitan TB clinic where treatment for latent tuberculosis infection (LTBI) comprises six months of isoniazid, self-administered but dispensed monthly by the clinic.

Objective

To determine the proportion of patients who complete treatment for LTBI and to identify factors associated with non-completion.

Methods

Clinical files of all patients receiving treatment for LTBI between 01/2000 and 12/2010 were reviewed. The study population comprised all patients who were commenced on isoniazid as treatment for LTBI. Odds ratios (OR) for completing treatment were estimated by logistic regression.

Results

Of 216 patients who commenced isoniazid treatment for LTBI, 163 (75%) completed six months of treatment. Fifty-three percent of the 53 patients who did not complete treatment dropped out after three months of treatment. The mean (SD) age of the patients was 27 (16) years and 123 (57%) were female. The majority of patients (59%) were born overseas, and 69% received treatment for LTBI because they were contacts of patients with TB. Patients' sex, age, country of birth, time since immigration for overseas-born people, health care worker status, TST conversion status, chest X-ray findings, language, employment status and the indication for which treatment of LTBI was prescribed were not significantly related to treatment completion.

Conclusion

In a setting where isoniazid is dispensed monthly by the TB clinic, a relatively high proportion of patients who commence treatment for LTBI complete the scheduled six-month course. The study did not identify any patient characteristics that predicted treatment completion. Interventions to improve completion rates should extend over the whole duration of treatment.

14.

Background

New drugs and regimens with the potential to transform tuberculosis treatment are presently in early stage clinical trials.

Objective

The goal of the present study was to infer the required duration of these treatments.

Method

A meta-regression model was developed to predict relapse risk using treatment duration and month 2 sputum culture positive rate as predictors, based on published historical data from 24 studies describing 58 regimens in 7793 patients. Regimens in which rifampin was administered for the first 2 months but not subsequently were excluded. The model treated study as a random effect.

Results

The model predicted that new regimens of 4 or 5 months' duration, with month 2 culture positivity rates of 1% or 3% respectively, would yield relapse rates of 4.0% or 4.1%. In both cases, the upper limit of the 2-sided 80% prediction interval for relapse in a hypothetical trial with 680 subjects per arm was <10%. Analysis of published month 2 data for moxifloxacin-containing regimens using this model indicated that they would result in relapse rates similar to standard therapy only if administered for ≥5 months.

Conclusions

This model is proposed to inform the required duration of treatment of new TB regimens, potentially hastening their accelerated approval by several years.

15.

Background

Various studies have reported culture conversion at two months as a predictor of successful treatment outcome in multidrug-resistant tuberculosis (MDR-TB).

Objectives

The present study was conducted with the aim to evaluate the rate and predictors of culture conversion at two months in MDR-TB patients.

Methods

All confirmed pulmonary MDR-TB patients enrolled for treatment at Lady Reading Hospital, Peshawar, Pakistan from 1 January to 31 December 2012 who met the inclusion criteria were reviewed retrospectively. The rate and predictors of culture conversion at two months were evaluated.

Results

Eighty-seven (53.4%) of 163 patients achieved culture conversion at two months. In a multivariate analysis, lung cavitation on the baseline chest X-ray (P = 0.006, OR = 0.349) and resistance to ofloxacin (P = 0.041, OR = 0.193) and streptomycin (P = 0.017, OR = 0.295) had statistically significant negative associations with culture conversion at two months.

Conclusion

A reasonable proportion of patients achieved culture conversion at two months. Factors negatively associated with culture conversion at two months can be easily identified either before diagnosis or early in the course of MDR-TB treatment. This may help in better care of individual patients by identifying them early and treating them vigorously.

16.

Aim

To report outcomes for patients with para-aortic lymph node positive cervical cancer treated with a dynamic field-matching technique.

Background

PET staging of cervical cancer has increased identification of patients with para-aortic lymph node metastasis. IMRT enables dose escalation in this area, but matching IMRT fields with traditional whole pelvis fields presents a challenge.

Materials and methods

From 2003 to 2012, 20 patients with cervical cancer and para-aortic lymph node metastasis were treated utilizing the dynamic field-matching technique. As opposed to single-isocenter half-beam junction techniques, this technique employs wedge-shaped dose junctions for the abutment of fields. We reviewed the records of all patients who completed treatment with the technique and abstracted treatment, toxicity, and disease-related outcome data for analysis.

Results

The median prescribed dose was 45 Gy to the whole pelvis field and 50.4 Gy to the para-aortic IMRT field. All but 3 patients underwent HDR (13 pts) or LDR (4 pts) brachytherapy. All patients developed lower GI toxicity: 10 grade 1, 9 grade 2, and 1 grade 4 (enterovaginal fistula). Median DFS was 12.4 months, with 1- and 2-year DFS of 60.0% and 38.1%. One-year OS was 83.7% and 2-year OS 64.4%. A total of 10 patients developed recurrence; none occurred at the matched junction.

Conclusions

The dynamic field-matching technique provides a means for joining conventional whole pelvis fields and para-aortic IMRT fields that substantially reduces dose deviations at the junction due to field mismatch. Treatment with the dynamic matching technique is simple, effective, and tolerated with no apparent increase in toxicity.

17.

Objective

The HAS-BLED score enables a risk estimate of major bleeds in patients with atrial fibrillation on vitamin K-antagonists (VKA) treatment, but has not been validated for patients with venous thromboembolism (VTE). We analyzed whether the HAS-BLED score accurately identifies patients at high risk of major bleeds during VKA treatment for acute VTE.

Methods

Medical records of 537 patients with acute VTE (primary diagnosis pulmonary embolism in 223, deep vein thrombosis in 314) starting VKA treatment between 2006 and 2007 were searched for items on the HAS-BLED score and the occurrence of major bleeds during the first 180 days of follow-up. The hazard ratio (HR) for the occurrence of major bleeds, comparing non-high-risk with high-risk patients as defined by a HAS-BLED score ≥3 points, was calculated using Cox regression analysis.

Results

Major bleeds occurred in 11/537 patients (2.0%, 5.2/100 person years, 95% CI 2.8-9.2). Cumulative incidences of major bleeds were 1.3% (95% CI 0.1-2.5) in the non-high (HAS-BLED < 3) and 9.6% (95%CI 2.2-17.0) in the high-risk group (HAS-BLED ≥ 3), (p <0.0001 by Log-Rank test), with a HR of 8.7 (95% CI 2.7-28.4). Of the items in the HAS-BLED score, abnormal renal function (HR 10.8, 95% CI 1.9-61.7) and a history of bleeding events (HR 10.4, 95% CI 2.5-42.5) were independent predictors of major bleeds during follow-up.

Conclusion

Acute VTE patients with a HAS-BLED score ≥3 points are at increased risk of major bleeding. These results warrant correction of potentially reversible risk factors for major bleeding and careful International Normalized Ratio monitoring in acute VTE patients with a high HAS-BLED score.

18.

Aim

To evaluate different treatment modalities, sequences, and prognostic factors in patients with brain metastases from stomach cancer.

Background

Brain metastases from gastric cancer are a rare and late manifestation of the disease, occurring in less than 1% of gastric cancer patients. The prognosis is poor, and median overall survival is 1.3–2.4 months. A standard treatment scheme has not yet been described, and most studies present small sample sizes. The choice of treatment is based individually on performance status; the number, location and size of metastases; the status of the primary tumor; and the presence of other metastases.

Materials and methods

Sixteen patients diagnosed with brain metastases from gastric cancer at the Maria Sklodowska-Curie Memorial Cancer Center and Institute of Oncology, Gliwice Branch, were included. Patients were mostly men (69%), aged 51–75 years (median 68.5 years). Thirteen (81.25%) had received treatment of the primary tumor before the diagnosis of brain metastases. Primary metastatic gastric cancer was diagnosed in 6 patients (37.5%); in 3 cases (18.75%) the brain was the site of those metastases. Treatment schemes were chosen individually.

Results

We identified prognostic factors influencing OS: performance status, number of brain metastases, and type of treatment. Median OS was 2.8 months. Median time to brain metastases was 12.3 months and was shorter in patients with pretreatment metastases to other organs. Patients who received combined treatment had a median survival of 12.3 months.

Conclusions

Aggressive treatment schemes are needed to improve the outcome. Prognostic factors such as performance status, number of metastases, and dissemination to other organs are helpful in considering the best treatment options.

19.

Context

We have previously shown that serum VEGF-D is elevated at baseline, correlates with kidney angiomyolipoma size at baseline and 12 months, and decreases with sirolimus treatment in adults with tuberous sclerosis complex (TSC). To further investigate the utility of serum VEGF-D for longer-term monitoring of TSC kidney disease, we present VEGF-D results with 24-month follow-up.

Objective

To compare 24 month VEGF-D levels in two subgroups of sirolimus treated patients (OFF SIROLIMUS AFTER 12 MONTHS or ON SIROLIMUS AFTER 12 MONTHS).

Design and Intervention(s)

Serum VEGF-D was measured in samples collected from subjects enrolled in a phase 2 multicenter trial evaluating sirolimus for the treatment of kidney angiomyolipomas associated with TSC or TSC/LAM. All participants were treated with sirolimus from 0–12 months. During months 12–24, sirolimus was discontinued in one subgroup. The other subgroup was treated with additional sirolimus.

Setting

Adult TSC participants were recruited from six clinical sites in the United States (comprehensive TSC clinics, 5; urology clinic, 1).

Patients

Twenty-eight TSC patients completed all 24 months of the study, and 24-month serum samples were available from 18 of the 28 patients.

Main Outcome Measure(s)

We compared the percent change in VEGF-D levels (baseline to 24 months) in patients from the two treatment subgroups.

Results

At 24 months, VEGF-D levels decreased by 67% compared with baseline (to 787±426 pg/ml) in the ON SIROLIMUS AFTER 12 MONTHS group versus a 13% decrease (to 2971±4014 pg/ml) in the OFF SIROLIMUS AFTER 12 MONTHS group (p = 0.013, Mann-Whitney test). A similar trend was observed in kidney angiomyolipoma size but not in pulmonary function tests.

Conclusions

Serum VEGF-D may be useful for monitoring response to treatment with sirolimus and kidney angiomyolipoma size in patients with TSC, but confirmation is needed.

Trial Registration

ClinicalTrials.gov NCT00126672.

20.

Background

Familial amyloidotic polyneuropathy (FAP) is a neurodegenerative disease caused by the extracellular deposition of mutant transthyretin (TTR), with special involvement of the peripheral nervous system (PNS). Currently, hepatic transplantation is considered the most efficient therapy to halt the progression of clinical symptoms in FAP since more than 95% of TTR is produced by the liver. However, less invasive and more reliable therapeutic approaches have been proposed for FAP therapy, namely based on drugs acting as inhibitors of amyloid formation or as amyloid disruptors. We have recently reported that epigallocatechin-3-gallate (EGCG), the most abundant catechin in green tea, is able to inhibit TTR aggregation and fibril formation, “in vitro” and in a cellular system, and is also able to disrupt pre-formed amyloid fibrils “in vitro”.

Methodology and Principal Findings

In the present study, we assessed the effect of subchronic EGCG administration on TTR amyloidogenesis “in vivo”, using well-characterized animal models of FAP. Semiquantitative immunohistochemistry (SQ-IHC) and Western blot analysis of mouse tissues after treatment demonstrated that EGCG inhibits deposition of toxic TTR aggregates by about 50% along the gastrointestinal tract (GI) and peripheral nervous system (PNS). Moreover, EGCG treatment considerably lowered the levels of several biomarkers associated with non-fibrillar TTR deposition, namely endoplasmic reticulum (ER) stress, protein oxidation and apoptosis markers. Treatment of old FAP mice with EGCG resulted not only in decreased non-fibrillar TTR deposition but also in disaggregation of amyloid deposits. Consistently, matrix metalloproteinase (MMP)-9 and serum amyloid P component (SAP), both markers of amyloid deposition, were also reduced in treated old FAP mice.

Conclusions and Significance

The dual effect of EGCG as both a TTR aggregation inhibitor and an amyloid fibril disruptor, together with its high tolerability and low toxicity in humans, points towards the potential use of this compound, or optimized derivatives, in the treatment of TTR-related amyloidoses.
