Similar Articles
 20 similar articles found
1.

Background

Publication of results is a key step in improving the transparency of clinical trials.

Objective

To investigate the result publication rate of Chinese trials registered in World Health Organization (WHO) primary registries.

Method

We searched 11 WHO primary registries for records of Chinese trials. The progress of each trial was analyzed. We searched for the full texts of result publications cited in the registration records. For completed trials without citations, we searched PubMed, Embase, the Chinese Biomedical Literature Database (in Chinese), the China Knowledge Resource Integrated Database, and the Chinese Science and Technology Periodicals Database for result publications. The search was conducted on July 14, 2009. We also called the investigators of completed trials to ask about result publication.

Results

We identified 1294 records of Chinese trials (428 in ChiCTR, 743 in clinicaltrials.gov, 55 in ISRCTN, 21 in ACTRN). A total of 443 trials had been completed. The publication rate of Chinese trials in the WHO primary registries was 35.2% (156/443). The publication rate of Chinese trials in clinicaltrials.gov, ChiCTR, ISRCTN, and ACTRN was 36.5% (53/145), 36.3% (89/245), 26.0% (9/44), and 55.6% (5/9), respectively. The publication rate of trials sponsored by industry (23.8%) was lower than that of trials sponsored by central and local government (31.7%), hospitals (35.1%), and universities (40.7%). The publication rate for randomized trials (33.2%) was higher than that for cohort studies (16.7%) and case-control studies (22.2%). The publication rates for interventional and observational studies were similar (33.4% versus 33.3%).
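As an illustration only (not the authors' analysis), the rates above can be recomputed from the reported counts, and the ChiCTR versus clinicaltrials.gov comparison checked with a simple chi-square test of proportions; a minimal Python sketch:

```python
# Recompute the reported publication rates and compare the two largest
# registries with a chi-square test (illustrative; the paper does not state
# which test was used).
from scipy.stats import chi2_contingency

published = {
    "clinicaltrials.gov": (53, 145),
    "ChiCTR": (89, 245),
    "ISRCTN": (9, 44),
    "ACTRN": (5, 9),
}
for registry, (pub, completed) in published.items():
    print(f"{registry}: {pub}/{completed} = {pub / completed:.1%}")

# 2x2 table: published vs. not published
table = [[53, 145 - 53],   # clinicaltrials.gov
         [89, 245 - 89]]   # ChiCTR
chi2, p, _, _ = chi2_contingency(table)
print(f"ChiCTR vs clinicaltrials.gov: chi2 = {chi2:.2f}, p = {p:.2f}")
```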

Conclusion

The publication rate of the registered Chinese trials was low, with no significant difference between ChiCTR and clinicaltrials.gov. An effective mechanism is needed to promote publication of results for registered trials in China.

2.
3.

Background

Leptospirosis, a zoonosis associated with potentially fatal consequences, has long been a grossly underreported disease in India. There is no accurate estimate of the problem of leptospirosis in non-endemic areas such as north India.

Methods/Principal Findings

To understand the clinical spectrum and risk factors associated with leptospirosis, we carried out a retrospective study in patients with acute febrile illness in north India over the last 5 years (January 2004 to December 2008). The incidence of leptospirosis, as diagnosed by IgM ELISA and microscopic agglutination titer in paired acute and convalescent sera, increased from 11.7% in 2004 to 20.5% in 2008. The disease showed a peak during the rainy season (August and September). We followed up 86 cases of leptospirosis regarding their epidemiological pattern, clinical features, laboratory parameters, complications, therapy, and outcome. Mean age of patients was 32.6 years (range 2.5–78 years) and males (57%) outnumbered females (43%). Infestation of dwellings with rats (53.7%), working in farm lands (44.2%), and contact with animals (62.1%) were commonly observed epidemiological risk factors. Outdoor workers, including farmers (32.6%), labourers (11.6%), para-military personnel (2.3%), and sweepers (1.2%), were commonly affected. Modified Faine's criteria could diagnose 76 cases (88.3%). Renal failure (60.5%), respiratory failure (20.9%), neuroleptospirosis (11.6%), and disseminated intravascular coagulation (DIC) (11.6%) were the commonest complications. Five patients died, giving a case fatality rate of 5.9%.

Conclusions/Significance

There has been a rapid rise in the incidence of leptospirosis in north India. Severe complications such as renal failure, respiratory failure, neuroleptospirosis, and DIC are being seen with increasing frequency. Increased awareness among physicians, and early diagnosis and treatment, may reduce mortality due to leptospirosis.

4.

Background

PCR has evolved into one of the most promising tools for T. cruzi detection in the diagnosis and control of Chagas disease. However, general use of the technique is hampered by its complexity and the lack of standardization.

Methodology

We here present the development and phase I evaluation of the T. cruzi OligoC-TesT, a simple and standardized dipstick format for detection of PCR-amplified T. cruzi DNA. The specificity and sensitivity of the assay were evaluated on blood samples from 60 control persons from Chagas non-endemic regions and 48 from endemic regions, and on biological samples from 33 patients, 7 reservoir animals, and 14 vectors collected in Chile.

Principal Findings

The lower detection limits of the T. cruzi OligoC-TesT were 1 pg and 1 to 10 fg of DNA from T. cruzi lineages I and II, respectively. The test showed a specificity of 100% (95% confidence interval [CI]: 96.6%–100%) on the control samples and a sensitivity of 93.9% (95% CI: 80.4%–98.3%), 100% (95% CI: 64.6%–100%), and 100% (95% CI: 78.5%–100%) on the human, rodent, and vector samples, respectively.
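The reported intervals are consistent with Wilson score intervals computed from the denominators given above (33 human, 7 rodent, 14 vector and 108 control samples); a minimal sketch of that back-calculation, assuming the Wilson method:

```python
# Reproduce the sensitivity/specificity confidence intervals with Wilson
# score intervals (an assumption; the interval method is not stated in the
# abstract). Counts are back-calculated from the reported percentages.
from statsmodels.stats.proportion import proportion_confint

samples = {
    "specificity, controls": (108, 108),  # 60 non-endemic + 48 endemic
    "sensitivity, humans":   (31, 33),    # 93.9%
    "sensitivity, rodents":  (7, 7),      # 100%
    "sensitivity, vectors":  (14, 14),    # 100%
}
for label, (hits, n) in samples.items():
    lo, hi = proportion_confint(hits, n, alpha=0.05, method="wilson")
    print(f"{label}: {hits}/{n} = {hits / n:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```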

Conclusions

The T. cruzi OligoC-TesT showed high sensitivity and specificity on a diverse panel of biological samples. The new tool is an important step towards simplified and standardized molecular diagnosis of Chagas disease.

5.

Background

Few population-based studies have examined the contribution of treatment to the mortality of dementia.

Objective

To investigate the effects of anti-dementia and nootropic treatments on the mortality of dementia using a population-based cohort study.

Methods

A total of 12,193 incident dementia patients were identified from 2000 to 2010. Their data were compared with those of 12,193 age- and sex-matched non-dementia controls randomly selected from the same database. Dementia was classified into vascular (VaD) and degenerative dementia. Mortality incidence and hazard ratios (HRs) were calculated.

Results

The median survival time was 3.39 years (95% confidence interval [CI]: 2.88–3.79) for VaD without medication, 6.62 years (95% CI: 6.24–7.21) for VaD with nootropics, 3.01 years (95% CI: 2.85–3.21) for degenerative dementia without medication, 8.11 years (95% CI: 6.30–8.55) for degenerative dementia with anti-dementia medication, 6.00 years (95% CI: 5.73–6.17) for degenerative dementia with nootropics, and 9.03 years (95% CI: 8.02–9.87) for degenerative dementia with both anti-dementia and nootropic medications. Compared to the non-dementia group, the HRs among individuals with degenerative dementia were 2.69 (95% CI: 2.55–2.83) without medication, 1.46 (95% CI: 1.39–1.54) with nootropics, 1.05 (95% CI: 0.82–1.34) with anti-dementia medication, and 0.92 (95% CI: 0.80–1.05) with both nootropic and anti-dementia medications. VaD with nootropics had a lower mortality (HR: 1.25, 95% CI: 1.15–1.37) than VaD without medication (HR: 2.46, 95% CI: 2.22–2.72).
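Hazard ratios of this kind are typically estimated with a Cox proportional hazards model; the sketch below is purely illustrative, and the data file, column names and covariates are hypothetical rather than the study database.

```python
# Illustrative Cox proportional hazards sketch (lifelines); not the authors'
# code. The CSV file and column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

# expected columns: survival_years, died (0/1), nootropics (0/1),
# antidementia (0/1), age, sex
df = pd.read_csv("dementia_cohort.csv")

cph = CoxPHFitter()
cph.fit(df, duration_col="survival_years", event_col="died",
        formula="nootropics + antidementia + age + sex")
cph.print_summary()  # the exp(coef) column gives hazard ratios with 95% CIs
```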

Conclusion

Pharmacological treatments have a beneficial effect on survival in patients with dementia.

6.

Background

Longevity is a multifactorial trait with a genetic contribution, and mitochondrial DNA (mtDNA) polymorphisms have been found to be involved in longevity.

Methodology/Principal Findings

To explore the effects of mtDNA haplogroups on the prevalence of extreme longevity (EL), a population-based case-control study was conducted in Rugao, a prefecture city in Jiangsu, China. Case subjects included 463 individuals aged ≥95 years (EL group). Control subjects included 926 individuals aged 60–69 years (elderly group) and 463 individuals aged 40–49 years (middle-aged group) randomly recruited from Rugao. We observed a significant reduction of haplogroup M9 in longevity subjects (0.2%) compared with both elderly subjects (2.2%) and middle-aged subjects (1.7%). A linear-by-linear association test revealed a significant decreasing trend in N9 frequency from middle-aged subjects (8.6%) through elderly subjects (7.2%) to longevity subjects (4.8%) (p = 0.018). In subsequent analysis stratified by gender, the linear-by-linear association test revealed a significant increasing trend in D4 frequency from middle-aged subjects (15.8%) through elderly subjects (16.4%) to longevity subjects (21.7%) in females (p = 0.025). Conversely, a significant decreasing trend in B4a frequency was observed from middle-aged subjects (4.2%) through elderly subjects (3.8%) to longevity subjects (1.7%) in females (p = 0.045).
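The linear-by-linear association (trend) test referred to above can be computed as M^2 = (N - 1) * r^2, where r is the Pearson correlation between the ordered group scores and carrier status; the sketch below is illustrative, with counts back-calculated approximately from the reported N9 percentages.

```python
# Linear-by-linear association (trend) test: M^2 = (N - 1) * r^2, compared to
# a chi-square distribution with 1 df. Counts are approximate reconstructions
# from the reported percentages, for illustration only.
import numpy as np
from scipy.stats import chi2

def linear_by_linear(counts, row_scores, col_scores):
    counts = np.asarray(counts, dtype=float)
    row_scores = np.asarray(row_scores, dtype=float)
    col_scores = np.asarray(col_scores, dtype=float)
    n = counts.sum()
    r_mean = counts.sum(1) @ row_scores / n
    c_mean = counts.sum(0) @ col_scores / n
    cov = (np.outer(row_scores - r_mean, col_scores - c_mean) * counts).sum()
    r_var = counts.sum(1) @ (row_scores - r_mean) ** 2
    c_var = counts.sum(0) @ (col_scores - c_mean) ** 2
    r = cov / np.sqrt(r_var * c_var)
    m2 = (n - 1) * r ** 2
    return m2, chi2.sf(m2, df=1)

# rows: middle-aged, elderly, longevity; columns: N9 carriers, non-carriers
n9 = [[40, 423], [67, 859], [22, 441]]
stat, p = linear_by_linear(n9, row_scores=[0, 1, 2], col_scores=[1, 0])
print(f"M2 = {stat:.2f}, p = {p:.3f}")
```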

Conclusions

Our observations support the association of mitochondrial DNA haplogroups with exceptional longevity in a Chinese population.

7.

Background

There is a commonly held assumption that early August is an unsafe period to be admitted to hospital in England, as newly qualified doctors start work in NHS hospitals on the first Wednesday of August. We investigate whether in-hospital mortality is higher in the week following the first Wednesday in August than in the previous week.

Methodology

We conducted a retrospective study in England using administrative hospital admissions data. Two retrospective cohorts were constructed of all emergency patients admitted on the last Wednesday in July and the first Wednesday in August for 2000 to 2008, each followed up for one week.

Principal Findings

The odds of death for patients admitted on the first Wednesday in August were 6% higher (OR 1.06, 95% CI 1.00 to 1.15, p = 0.05) after controlling for year, gender, age, socio-economic deprivation and co-morbidity. When admissions were subdivided into medical, surgical and neoplasm admissions, medical patients admitted on the first Wednesday in August had 8% higher odds of death (OR 1.08, 95% CI 1.01 to 1.16, p = 0.03). In 2007 and 2008, when the system for junior doctors' job applications changed, patients admitted on Wednesday August 1st had 8% higher adjusted odds of death than those admitted the previous Wednesday, but this was not statistically significant (OR 1.08, 95% CI 0.95 to 1.23, p = 0.24).
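Adjusted odds ratios of this kind are obtained from a logistic regression of early death on the admission cohort plus the listed covariates; the sketch below only illustrates the approach, and the file and variable names are invented.

```python
# Logistic regression sketch for the adjusted odds of death; hypothetical
# data file and column names, not the English administrative dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("admissions.csv")  # one row per emergency admission

model = smf.logit(
    "died_within_7d ~ august_wednesday + C(year) + sex + age"
    " + imd_quintile + charlson_score",
    data=df,
).fit()

or_est = np.exp(model.params["august_wednesday"])
or_ci = np.exp(model.conf_int().loc["august_wednesday"])
print(f"OR = {or_est:.2f}, 95% CI {or_ci[0]:.2f} to {or_ci[1]:.2f}")
```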

Conclusions

We found evidence that patients admitted on the first Wednesday in August have a higher early death rate in English hospitals compared with patients admitted on the previous Wednesday. This was higher for patients admitted with a medical primary diagnosis.

8.

Background

Monochorionic (MC) twins are at increased risk for perinatal mortality and serious morbidity due to the presence of placental vascular anastomoses. Cerebral injury can be secondary to haemodynamic and hematological disorders during pregnancy (especially twin-to-twin transfusion syndrome (TTTS) or intrauterine co-twin death) or from postnatal injury associated with prematurity and low birth weight, common complications in twin pregnancies. We investigated neurodevelopmental outcome in MC and dichorionic (DC) twins at the age of two years.

Methods

This was a prospective cohort study. Cerebral palsy (CP) was studied in 182 MC infants and 189 DC infants matched for weight and age at delivery, gender, ethnicity of the mother and study center. After losses to follow-up, 282 of the 366 infants without CP were available to be tested with the Griffiths Mental Developmental Scales at 22 months corrected age; all were born between January 2005 and January 2006 in nine perinatal centers in The Netherlands. Because of phenotypic (un)alikeness in mono- or dizygosity, the principal investigator was not blinded to chorionic status; perinatal outcome, with the exception of co-twin death, was not known to the examiner.

Findings

Four out of 182 MC infants had CP (2.2%); two of the four CP cases were due to complications specific to MC twin pregnancies (TTTS and co-twin death) and the other two were the result of cystic PVL after preterm birth. In comparison, one sibling of a DC twin had CP (0.5%; OR 4.2, 95% CI 0.5–38.2), of unknown origin. The follow-up rate of neurodevelopmental outcome by the Griffiths test was 76%. The majority of 2-year-old twins had normal developmental status. There were no significant differences between MC and DC twins. One MC infant (0.7%) had a developmental delay compared to 6 DC infants (4.2%; OR 0.2, 95% CI 0.0–1.4). Birth weight discordancy did not influence long-term outcome, though the smaller twin had slightly lower developmental scores than its larger co-twin.

Conclusions

There were no significant differences between MC and DC twins in the occurrence of cerebral palsy or in neurodevelopmental outcome. The outcome of MC twins seems favourable in the absence of TTTS or co-twin death.

9.

Background

Repeated mass azithromycin distributions are effective in controlling the ocular strains of chlamydia that cause trachoma. However, it is unclear when treatments can be discontinued. Investigators have proposed graduating communities when the prevalence of infection identified in children decreases below a threshold. While this can be tested empirically, results will not be available for years. Here we use a mathematical model to predict results with different graduation strategies in three African countries.

Methods

A stochastic model of trachoma transmission was constructed, using the parameters that maximized the likelihood of the results observed in studies in Tanzania (16% infection in children pre-treatment), The Gambia (9%), and Ethiopia (64%). The expected prevalence of infection at 3 years was obtained for different graduation thresholds and for varying characteristics of the diagnostic test.

Results

The model projects that three annual treatments at 80% coverage would reduce the mean prevalence of infection to 0.03% in the Tanzanian, 2.4% in the Gambian, and 12.9% in the Ethiopian communities. If communities graduate when the prevalence of infection falls below 5%, then the mean prevalence at 3 years with the new strategy would be 0.3%, 3.9%, and 14.4%, respectively. Graduation reduced antibiotic usage by 63% in Tanzania, 56% in The Gambia, and 11% in Ethiopia.
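The abstract does not describe the model structure, so the following is only a toy stochastic SIS-style simulation of community prevalence under annual mass treatment, with invented parameters; it illustrates the kind of computation involved but is not the authors' calibrated model.

```python
# Toy stochastic SIS-type simulation of infection prevalence in one community
# under annual mass azithromycin treatment. All parameters are invented.
import numpy as np

rng = np.random.default_rng(0)

def simulate(n=100, prev0=0.16, beta=0.05, gamma=0.02,
             coverage=0.8, efficacy=0.95, years=3, steps_per_year=52):
    infected = rng.random(n) < prev0
    for _ in range(years):
        treated = rng.random(n) < coverage          # annual mass treatment
        cured = treated & (rng.random(n) < efficacy)
        infected &= ~cured
        for _ in range(steps_per_year):             # weekly transmission steps
            prev = infected.mean()
            new_inf = ~infected & (rng.random(n) < beta * prev)
            recovered = infected & (rng.random(n) < gamma)
            infected = (infected | new_inf) & ~recovered
    return infected.mean()

runs = [simulate() for _ in range(200)]
print(f"mean prevalence after 3 annual treatments: {np.mean(runs):.1%}")
```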

Conclusion

Models suggest that graduating communities from a program when infection is reduced to below 5% is a reasonable strategy and could more than halve the amount of antibiotic distributed in some areas.

10.
11.

Background

CASTLE compared the efficacy of atazanavir/ritonavir with lopinavir/ritonavir, each in combination with tenofovir-emtricitabine in ARV-naïve subjects from 5 continents.

Objectives

To determine the baseline rate and clinical significance of transmitted drug resistance (TDR) mutations, using ultra-deep sequencing (UDS), in ARV-naïve subjects in CASTLE.

Methods

A case-control study was performed on baseline samples from all 53 subjects with virologic failure (VF) at Week 48 and 95 subjects with virologic success (VS), randomly selected and matched by CD4 count and viral load. UDS was performed using 454 Life Sciences/Roche technology.

Results

Of 148 samples, 141 had successful UDS (86 subtype B, 55 non-B subtypes). Overall, 30.5% of subjects had a TDR mutation at baseline; 15.6% had TDR(s) only at <20% of the viral population. There was no difference in the rate of TDRs between B (30.2%) and non-B subtypes (30.9%). VF (n = 51) and VS (n = 90) subjects had similar rates of any TDRs (25.5% vs. 33.3%), NNRTI TDRs (11.1% vs. 11.8%) and NRTI TDRs (24.4% vs. 25.5%). Of 9 (6.4%) subjects with M184V/I (7 at <20% levels), 6 experienced VF. Sixteen (11.3%) subjects had multiple TAMs, and 7 experienced VF. Three (2.1%) subjects had both multiple TAMs and M184V, and all experienced VF. Of 14 (9.9%) subjects with PI TDRs (11 at <20% levels), only 1 experienced virologic failure. The majority of PI TDRs were found in isolation (e.g. 46I) at <20% levels and had low resistance algorithm scores.

Conclusion

Among a representative sample of ARV-naïve subjects in CASTLE, TDR mutations were common (30.5%); B and non-B subtypes had similar rates of TDRs. Subjects with multiple PI TDRs were infrequent. Overall, TDRs did not affect virologic response for subjects on a boosted PI by week 48; however, a small subset of subjects with extensive NRTI backbone TDR patterns experienced virologic failure.

12.

Background

HIV-infected persons suffering from tuberculosis experience high mortality. No programmatic studies from India have documented the delivery of mortality-reducing interventions, such as cotrimoxazole prophylactic treatment (CPT) and antiretroviral treatment (ART). To guide TB-HIV policy in India we studied the effectiveness of delivering CPT and ART to HIV-infected persons treated for tuberculosis in three districts in Andhra Pradesh, India, and evaluated factors associated with death.

Methods and Findings

We retrospectively abstracted data for all HIV-infected tuberculosis patients diagnosed from March 2007 through August 2007, using standard treatment outcome definitions. 734 HIV-infected tuberculosis patients were identified; 493 (67%) were male and 569 (80%) were aged 24–44 years. 710 (97%) initiated CPT, and 351 (50%) collected >60% of their monthly cotrimoxazole pouches provided throughout TB treatment. Access to ART was documented in 380 (51%) patients. Overall, 130 (17%) patients died during TB treatment. Patients receiving ART were less likely to die (adjusted hazard ratio [HR] 0.4, 95% confidence interval [CI] 0.3–0.6), while males and those with pulmonary TB were more likely to die (HR 1.7, 95% CI 1.1–2.7, and HR 1.9, 95% CI 1.1–3.2, respectively).

Conclusions

Among HIV-infected TB patients in India death was common despite the availability of free cotrimoxazole locally and ART from referral centres. Death was strongly associated with the absence of ART during TB treatment. To minimize death, programmes should promote high levels of ART uptake and closely monitor progress in implementation.

13.
14.

Objective

A patient's chances of cure and palliation for a variety of malignancies may be greatly affected by the care provided by the treating hospital. We sought to determine the effect of volume and teaching status on patient outcomes for five gynecologic malignancies: endometrial, cervical, ovarian and vulvar carcinoma and uterine sarcoma.

Methods

The Florida Cancer Data System dataset was queried for all patients undergoing treatment for gynecologic cancers from 1990 to 2000.

Results

Overall, 48,981 patients with gynecologic malignancies were identified. Endometrial tumors were the most common, representing 43.2% of the entire cohort, followed by ovarian cancer (30.9%), cervical cancer (20.8%), vulvar cancer (4.6%), and uterine sarcoma (0.5%). By univariate analysis, patients treated at high-volume centers (HVC), although significantly younger, benefited from improved short-term (30-day and/or 90-day) survival for cervical, ovarian and endometrial cancers. Multivariate analysis (MVA), however, failed to demonstrate a significant survival benefit for gynecologic cancer patients treated at teaching facilities (TF) or HVC. Significant prognostic factors at presentation by MVA were age over 65 (HR = 2.6, p<0.01), African-American race (HR = 1.36, p<0.01), and advanced stage (regional HR = 2.08, p<0.01; advanced HR = 3.82, p<0.01). Surgery and use of chemotherapy were each significantly associated with improved survival.

Conclusion

No difference in patient survival was observed for any gynecologic malignancy based upon treating hospital teaching or volume status. Although instances of improved outcomes may occur, overall further regionalization would not appear to significantly improve patient survival.

15.

Background

Chikungunya virus (CHIKV) is a recently re-emerged arthropod-borne virus responsible for a massive outbreak in the Indian Ocean region and India, which has extended to Southeast Asia as well as Italy. CHIKV has adapted to Aedes albopictus, an anthropophilic mosquito species widely distributed in Asia, Europe, Africa and America. Our objective was to determine the clinical and biological features of patients at the acute phase of CHIKV infection.

Methods and Findings

A prospective study enrolled 274 consecutive patients with febrile arthralgia recorded at the Emergency Department of the Groupe Hospitalier Sud-Réunion between March and May 2006. Three groups were defined: one group of 180 viremic patients (positive CHIKV RT-PCR), one group of 34 patients with acute post-viremic infection (negative CHIKV RT-PCR, positive anti-CHIKV IgM and negative IgG), and one group of 46 uninfected patients (negative CHIKV RT-PCR, anti-CHIKV IgM and IgG). Bivariate analyses of clinical and biological features between groups were performed. Patients with CHIKV viremia typically presented with asymmetrical bilateral polyarthralgia (96.5%) affecting the lower (98%) and small (74.8%) joints, as well as asthenia (88.6%), headache (70%), digestive trouble (63.3%), myalgia (59%), exanthems (47.8%), conjunctival hyperaemia (23%) and adenopathy (8.9%). Vertigo, cutaneous dysesthesia, pharyngitis and haemorrhages were seldom observed. Previously unreported symptoms, such as chondrocostal arthralgia (20%), enthesopathies (1.6%) and talalgia (14%), were also noted. Pruritus was less frequent during the viremic than the post-viremic phase (13.9% vs. 41.2%; p<0.001), whereas lymphopenia was more frequent (87.6% vs. 39.4%; p<0.001). Other biological abnormalities included leukopenia (38.3%), thrombocytopenia (37.3%), increased ASAT and ALAT blood levels (31.6% and 7.3%, respectively) and hypocalcemia (38.7%). Lymphopenia <1,000/mm3 was very closely associated with viremia (Yule coefficient 0.82, positive predictive value 92.3%). Age under 65 was associated with a benign course, as no patients younger than 65 had to be hospitalized (Yule coefficient 0.78).
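For reference, the Yule coefficient is commonly taken to be Yule's Q, (ad - bc)/(ad + bc) for a 2x2 table, and the positive predictive value is a/(a + b); the sketch below assumes that definition and uses placeholder counts, not the study's cross-tabulation of lymphopenia against viremic status.

```python
# Yule's Q and positive predictive value for a 2x2 table. The counts below
# are placeholders to be replaced with the study's actual cross-tabulation:
# a: marker+/viremic, b: marker+/non-viremic,
# c: marker-/viremic, d: marker-/non-viremic.
def yule_q(a, b, c, d):
    return (a * d - b * c) / (a * d + b * c)

def ppv(a, b):
    return a / (a + b)

a, b, c, d = 100, 10, 20, 80  # placeholder counts, not study data
print(f"Yule Q = {yule_q(a, b, c, d):.2f}, PPV = {ppv(a, b):.1%}")
```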

Conclusions

The diagnosis of CHIKV infection in the acute phase is based on commonly accepted clinical criteria (fever and arthralgia); however, clinical and biological differences exist in the acute phase depending on whether or not the patient is within the viremic phase of the infection.

16.

Background

The demand for inpatient medical services increases during influenza season. A scoring system capable of identifying influenza patients at low risk of death or ICU admission could help clinicians make hospital admission decisions.

Methods

Hospitalized patients with laboratory-confirmed influenza were identified over 3 influenza seasons at 25 Ontario hospitals. Each patient was assigned scores for 6 pneumonia severity scores and 2 sepsis scores using the first data available following registration in the emergency room. In-hospital mortality and ICU admission were the outcomes. Score performance was assessed using the area under the receiver operating characteristic curve (AUC) and the sensitivity and specificity for identifying low-risk patients (risk of outcome <5%).

Results

The cohort consisted of 607 adult patients. Mean age was 76 years; 12% of patients died (71/607) and 9% required ICU care (55/607). None of the scores examined demonstrated good discriminatory ability (AUC≥0.80). The Pneumonia Severity Index (AUC 0.78, 95% CI 0.72–0.83) and the Mortality in Emergency Department Sepsis score (AUC 0.77, 95% CI 0.71–0.83) demonstrated fair predictive ability (AUC≥0.70) for in-hospital mortality. The best predictor of ICU admission was SMART-COP (AUC 0.73, 95% CI 0.67–0.79). All other scores were poor predictors (AUC <0.70) of either outcome. If patients classified as low risk for in-hospital mortality using the PSI had been discharged, 35% of admissions would have been avoided.
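Discrimination is summarised as the area under the ROC curve of each score against each outcome; a minimal sketch with scikit-learn, using hypothetical file and column names (confidence intervals, e.g. by bootstrap, would be computed separately):

```python
# AUC of each severity score for each outcome; file and column names are
# hypothetical. Higher score values are assumed to indicate higher risk.
import pandas as pd
from sklearn.metrics import roc_auc_score

df = pd.read_csv("influenza_cohort.csv")

for score in ["PSI", "CURB65", "SMARTCOP", "MEDS"]:
    auc_death = roc_auc_score(df["in_hospital_death"], df[score])
    auc_icu = roc_auc_score(df["icu_admission"], df[score])
    print(f"{score}: AUC death = {auc_death:.2f}, AUC ICU = {auc_icu:.2f}")
```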

Conclusions

None of the scores studied were good predictors of in-hospital mortality or ICU admission. The PSI and MEDS score were fair predictors of death, and if these results are validated, their use could reduce influenza admission rates significantly.

17.

Background

Neuronal damage is correlated with vascular dysfunction in the diseased retina, but the underlying mechanisms remain controversial because of the lack of suitable models in which vasoregression related to neuronal damage initiates in the mature retinal vasculature. The aim of this study was to assess the temporal link between neuronal damage and vascular patency in a transgenic rat (TGR) overexpressing a mutant of the cilia gene polycystin-2.

Methods

Vasoregression, neuroglial changes and expression of neurotrophic factors were assessed in TGR and control rats over a time course. Neuronal changes were determined by quantitative morphometry of paraffin-embedded vertical sections. Vascular cell composition and patency were assessed by quantitative retinal morphometry of digest preparations. Glial activation was assessed by western blot and immunofluorescence. Expression of neurotrophic factors was detected by quantitative PCR.

Findings

At one month, the number and thickness of the outer nuclear cell layers (ONL) in TGR rats were reduced by 31% (p<0.001) and 17% (p<0.05), respectively, compared to age-matched control rats. Furthermore, the reduction progressed from 1 to 7 months in TGR rats. Apoptosis was selectively detected in photoreceptors in the ONL, starting after one month. Nevertheless, TGR and control rats showed normal electroretinogram responses at one month. From the second month onwards, TGR retinas had significantly more acellular capillaries (p<0.001) and a reduction of endothelial cells (p<0.01) and pericytes (p<0.01). Upregulation of GFAP was first detected in glial cells of TGR retinas after 1 month, in parallel with an increase of FGF2 (fourfold) and CNTF (60%), followed by upregulation of NGF (40%) at 3 months.

Interpretation

Our data suggest that the TGR is an appropriate animal model for vasoregression related to neuronal damage. Similarities to experimental diabetic retinopathy render this model suitable for understanding general mechanisms of maturity-onset vasoregression.

18.

Background

Short cycle treatment interruption could reduce toxicity and drug costs and contribute to further expansion of antiretroviral therapy (ART) programs.

Methods

A 72-week non-inferiority trial enrolled 146 HIV-positive persons receiving ART (CD4+ cell count ≥125 cells/mm3 and plasma HIV RNA <50 copies/ml) into one of three arms: continuous treatment, 7 days on/7 days off, and 5 days on/2 days off. The primary endpoint was ART treatment failure, determined by plasma HIV RNA level, CD4+ cell count decrease, death attributed to study participation, or opportunistic infection.

Results

Following enrollment of 32 participants, the 7 days on/7 days off arm was closed because of a failure rate of 31%. Six of 52 (11.5%) participants in the 5 days on/2 days off arm failed: five had virologic failure and one had immunologic failure. Eleven of 51 (21.6%) participants in the continuous treatment arm failed: nine had virologic failure, one died (lactic acidosis) and one had clinical failure (extra-pulmonary TB). The upper 97.5% confidence boundary for the difference between the percentage of non-failures in the 5 days on/2 days off arm (88.5% non-failure) and continuous treatment (78.4% non-failure) was 4.8%, well within the preset non-inferiority margin of 15%. No significant difference was found in time to failure between the 2 study arms (p = 0.39).
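The non-inferiority comparison can be illustrated from the reported counts (6/52 failures versus 11/51). The sketch below uses a simple Wald interval for the difference in non-failure proportions; it gives a bound close to, though not exactly, the reported 4.8%, since the trial's exact interval method is not stated.

```python
# Wald-type upper confidence bound for the difference in non-failure
# proportions, checked against the 15% non-inferiority margin.
from math import sqrt

fail_int, n_int = 6, 52    # 5 days on / 2 days off arm
fail_ct, n_ct = 11, 51     # continuous treatment arm

p_int = 1 - fail_int / n_int   # 88.5% non-failure
p_ct = 1 - fail_ct / n_ct      # 78.4% non-failure

diff = p_ct - p_int            # continuous minus intermittent
se = sqrt(p_int * (1 - p_int) / n_int + p_ct * (1 - p_ct) / n_ct)
upper = diff + 1.96 * se       # upper 97.5% confidence boundary

print(f"difference = {diff:.1%}, upper bound = {upper:.1%}")
print("non-inferior at the 15% margin:", upper < 0.15)
```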

Conclusions

Short cycle 5 days on/2 days off intermittent ART was at least as effective as continuous therapy.

Trial Registration

ClinicalTrials.gov NCT00339456

19.

Background

The objective of this study was to assess the efficacy and determine the optimal indications of selective arterial embolisation (SAE) in patients with life-threatening post-partum haemorrhage (PPH).

Methodology/Principal Findings

One hundred and two patients with PPH who underwent SAE between January 1998 and January 2002 in our university care center were included. Embolisation was considered effective when no other surgical procedure was required. Univariate and multivariate statistical analyses were performed. SAE was effective for 73 patients (71.5%), while 29 required surgical procedures. SAE was effective in 88.6% of women with uterine atony, which was associated with a positive outcome (OR 4.13, 95% CI 1.35–12.60), whereas caesarean delivery (OR 0.16, 95% CI 0.04–0.50) and haemodynamic shock (OR 0.21, 95% CI 0.07–0.60) were associated with high failure rates (47.6% and 39.1%, respectively).

Conclusions/Significance

The success rate of SAE observed in this large population is lower than previously reported. SAE is most likely to succeed for uterine atony but is not recommended in cases of haemodynamic shock or after caesarean section.

20.

Background

Twin studies offer a ‘natural experiment’ that can estimate the magnitude of environmental and genetic effects on a target phenotype. We hypothesised that fidgetiness and enjoyment of activity would be heritable but that objectively-measured daily activity would show a strong shared environmental effect.

Methodology/Principal Findings

In a sample of 9–12 year-old same-sex twin pairs (234 individuals; 57 MZ, 60 DZ pairs) we assessed three dimensions of physical activity: i) objectively measured physical activity using accelerometry, ii) 'fidgetiness' using a standard psychometric scale, and iii) enjoyment of physical activity from both parent ratings and children's self-reports. Shared environment effects explained the majority (73%) of the variance in objectively measured total physical activity (95% confidence interval (CI): 0.63–0.81), with a smaller unshared environmental effect (27%; CI: 0.19–0.37) and no significant genetic effect. In contrast, fidgetiness was primarily under genetic control, with additive genetic effects explaining 75% (CI: 62–84%) of the variance, as were parents' reports of children's enjoyment of low- (74%; CI: 61–82%), medium- (80%; CI: 71–86%), and high-impact activity (85%; CI: 78–90%), and children's expressed activity preferences (60%; CI: 42–72%).
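For intuition, variance decompositions of this kind can be approximated from MZ and DZ twin correlations with Falconer's formulas (A = 2(rMZ - rDZ), C = 2rDZ - rMZ, E = 1 - rMZ); the sketch below uses placeholder correlations chosen only to be consistent with the reported C = 73% and E = 27% for objectively measured activity, and is not the modelling used in the study.

```python
# Falconer approximation of additive genetic (A), shared environment (C) and
# unshared environment (E) components from twin correlations. The input
# correlations are placeholders, not values reported in the paper.
def falconer(r_mz, r_dz):
    a2 = 2 * (r_mz - r_dz)   # additive genetic variance (heritability)
    c2 = 2 * r_dz - r_mz     # shared environment
    e2 = 1 - r_mz            # unshared environment plus measurement error
    return a2, c2, e2

a2, c2, e2 = falconer(r_mz=0.73, r_dz=0.73)  # consistent with C = 73%, E = 27%
print(f"A = {a2:.2f}, C = {c2:.2f}, E = {e2:.2f}")
```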

Conclusions

Consistent with our hypothesis, the shared environment was the dominant influence on children's day-to-day activity levels. This finding gives a strong impetus to research into the specific environmental characteristics influencing children's activity, and supports the value of interventions focused on home or school environments.
