Similar literature
 20 similar documents found
1.
The number and size of free-range laying hen (Gallus gallus domesticus) production systems are increasing within Australia in response to consumer demand for perceived improvements in hen welfare. However, variation in outdoor stocking density has generated consumer dissatisfaction, leading to the development of a national information standard on free-range egg labelling by the Australian Consumer Affairs Ministers. The current Australian Model Code of Practice for Domestic Poultry states a guideline of 1500 hens/ha, but no maximum density is set. Radio-frequency identification (RFID) tracking technology was used to measure daily range usage by individual ISA Brown hens housed in six small flocks (150 hens/flock, 50% of hens tagged), each with access to one of three outdoor stocking density treatments (two replicates per treatment: 2000, 10 000 and 20 000 hens/ha), from 22 to 26, 27 to 31 and 32 to 36 weeks of age. There was some variation in range usage across the sampling periods, and by weeks 32 to 36 individual hens from the lowest stocking density on average used the range for longer each day (P<0.001), with fewer visits and longer maximum durations per visit (P<0.001). Individual hens within all stocking densities varied in the percentage of days they accessed the range, with 2% of tagged hens in each treatment never venturing outdoors and a large proportion accessing the range daily (2000 hens/ha: 80.5%; 10 000 hens/ha: 66.5%; 20 000 hens/ha: 71.4%). On average, 38% to 48% of hens were seen on the range simultaneously, and hens used all available areas of all ranges. These results from experimental-sized flocks have implications for determining optimal outdoor stocking densities for commercial free-range laying hens, but further research is needed to determine the effects of increased range usage on hen welfare.

2.
Cryptococcal meningitis (CM) remains a common life-threatening AIDS-defining illness, mainly in resource-limited settings. Previous reports suggested that baseline cytokine profiles can be associated with fungal burden and clinical outcome. This study aimed to evaluate baseline cytokine profiles in AIDS patients with CM and their relation to outcome at weeks 2 and 10. Thirty AIDS patients with CM, diagnosed by cerebrospinal fluid (CSF) Cryptococcus neoformans positive culture, India ink stain and cryptococcal antigen test, were prospectively evaluated. As controls, 56 HIV-infected patients without CM and 48 non-HIV individuals were included. Baseline CSF and serum levels of IL-2, IL-4, IL-8, IL-10, IL-12p40, IL-17A, IFN-γ and TNF-α were measured by ELISA. Of the 30 CM patients, 24 (80%) were male, with a median age of 38.1 years. High baseline CSF fungal burden and a positive blood culture were associated with a positive CSF culture at week 2 (p = 0.043 and 0.029, respectively). Most CSF and serum cytokines were present at higher levels in CM patients than in control subjects (p < 0.05). CSF levels of IL-8, IL-12p40, IL-17A, TNF-α and IFN-γ, and serum TNF-α, were significantly higher among survivors at weeks 2 and 10 (p < 0.05). Patients with increased intracranial pressure exhibited high CSF IL-10 levels and a poor outcome at week 10 (p = 0.032). Conversely, baseline CSF log10 IFN-γ and IL-17A were negatively correlated with fungal burden (r = -0.47 and -0.50; p = 0.0175 and 0.0094, respectively). The mortality rate was 33% (10/30) at week 2 and 57% (17/30) at week 10. The severity of CM and advanced immunodeficiency at admission were related to a poor outcome in these patients. The predominant Th1 cytokine profile among survivors confirms its pivotal role in infection control and may serve as a prognostic marker in cryptococcal meningitis.

3.
On-farm hatching systems for broiler chicks are increasingly used in practice. We studied whether performance, health and welfare differed between commercial flocks hatched on-farm or in a hatchery (control). In two successive production cycles on seven farms, a total of 16 on-farm hatched flocks were paired to 16 control flocks, housed at the same farm. Paired flocks originated from the same batch of eggs and were subjected to similar on-farm management. On-farm hatched and control flocks differed only with respect to hatching conditions, with on-farm hatched flocks not being exposed to, for example, chick handling, post-hatch feed and water deprivation, and transport, in contrast to control flocks, which were subjected to standard hatchery procedures, subsequently transported and placed in the poultry house. Day-old chick quality (navel and hock scores), 1st week mortality, total mortality, BW at day (d) 0, d7 and at depopulation, and (total) feed conversion ratio were determined. Prevalence of footpad dermatitis, hock burn, breast discoloration/blisters and cleanliness, litter quality and gait score were determined at d21 of age and around depopulation (d39 on average). Gross pathology and gut morphology were examined at depopulation age in a sample of birds from five flocks per treatment. On-farm hatching resulted in a higher BW at d0 (Δ=5.4 g) and d7 (Δ=11.5 g) (P<0.001), but day-old chick quality as measured by navel (P=0.003) and hock (P=0.01) scores was worse for on-farm hatched than for control birds. Body weight, 1st week and total mortality, and feed conversion ratio at slaughter age were similar for on-farm hatched and control flocks. On-farm hatched flocks had less footpad dermatitis (P=0.05), indicating better welfare. This was likely related to a tendency for better litter quality in on-farm hatched flocks at 21 days of age in comparison to control flocks (P=0.08).
No major differences in gross pathology or in intestinal morphology at depopulation age were found between treatments. In conclusion, on-farm hatching resulted in better 1st week broiler performance and better welfare compared with conventional hatching in a hatchery.

4.
Keel bone damage (KBD) in laying hens is an important welfare problem in both conventional and organic egg production systems. We aimed to identify possible risk factors for KBD in organic hens by analysing cross-sectional data on 107 flocks assessed in eight European countries. Due to partly missing data, the final multiple regression model was based on data from 50 flocks. Keel bone damage included fractures and/or deviations and was recorded, alongside other animal-based measures, by palpation and visual inspection of at least 50 randomly collected hens per flock between 52 and 73 weeks of age. Management and housing data were obtained by interviews, inspection and feed analysis. Keel bone damage flock prevalences ranged from 3% to 88%. Compiled on the basis of literature and practical experience, 26 potential associative factors of KBD entered a univariable selection by Spearman correlation analysis or Mann–Whitney U test (at the P<0.1 level). The resulting nine factors entered stepwise forward linear regression modelling. Aviary v. floor systems, absence of natural daylight in the hen house, a higher proportion of underweight birds, and a higher laying performance were significantly associated with a higher percentage of hens with KBD. The final model explained 32% of the variation in KBD between farms. The moderate explanatory value of the model underlines the multifactorial nature of KBD. Based on these results, increased attention should be paid to adequate housing design and lighting that allow the birds easy orientation and safe manoeuvring in the system. Furthermore, feeding management should aim at bird live weights that fulfil breeder weight standards. Further investigations are needed to achieve a better understanding of the relationships between laying performance, feed management and KBD.

5.

Background

Mainland Tanzania scaled up multiple malaria control interventions between 1999 and 2010. We evaluated whether, and to what extent, reductions in all-cause under-five child mortality (U5CM) tracked with malaria control intensification during this period.

Methods

Four nationally representative household surveys permitted trend analysis for malaria intervention coverage, severe anemia (hemoglobin <8 g/dL) prevalence (SAP) among children 6–59 months, and U5CM rates stratified by background characteristics, age, and malaria endemicity. Prevalence of contextual factors (e.g., vaccination, nutrition) likely to influence U5CM were also assessed. Population attributable risk percentage (PAR%) estimates for malaria interventions and contextual factors that changed over time were used to estimate magnitude of impact on U5CM.
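The population attributable risk percentage used in the Methods is conventionally computed with Levin's formula, PAR% = 100·p(RR−1)/(1 + p(RR−1)), where p is the exposure prevalence and RR the relative risk. A minimal sketch, with purely illustrative numbers that are not taken from the surveys:

```python
def par_percent(p_exposed: float, relative_risk: float) -> float:
    """Population attributable risk percent (Levin's formula):
    the share of outcome incidence attributable to the exposure."""
    excess = p_exposed * (relative_risk - 1)
    return 100 * excess / (1 + excess)

# Illustrative values only (not from the Tanzanian surveys): suppose
# 60% of households lack ITNs and lacking an ITN carries RR = 1.25
# for under-five mortality.
print(round(par_percent(0.60, 1.25), 1))  # about 13% of deaths attributable
```

The same formula, applied with the observed ITN coverage change and mortality relative risk, underlies the study's reported PAR% estimate for ITNs.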

Results

Household ownership of insecticide-treated nets (ITNs) rose from near zero in 1999 to 64% (95% CI, 61.7–65.2) in 2010. Intermittent preventive treatment of malaria in pregnancy reached 26% (95% CI, 23.6–28.0) by 2010. Sulfadoxine-pyrimethamine replaced chloroquine in 2002 and artemisinin-based combination therapy was introduced in 2007. SAP among children 6–59 months declined 50% between 2005 (11.1%; 95% CI, 10.0–12.3%) and 2010 (5.5%; 95% CI, 4.7–6.4%) and U5CM declined by 45% between baseline (1995–9) and endpoint (2005–9), from 148 to 81 deaths/1000 live births, respectively. Mortality declined 55% among children 1–23 months of age in higher malaria endemicity areas. A large reduction in U5CM was attributable to ITNs (PAR% = 11) with other malaria interventions adding further gains. Multiple contextual factors also contributed to survival gains.

Conclusion

Marked declines in U5CM occurred in Tanzania between 1999 and 2010 with high impact from ITNs and ACTs. High-risk children (1–24 months of age in high malaria endemicity) experienced the greatest declines in mortality and SAP. Malaria control should remain a policy priority to sustain and further accelerate progress in child survival.

6.
Background: Monocyte count and serum albumin (Alb) have both been shown to be involved in systemic inflammation. We therefore investigated the prognostic value of the monocyte-to-albumin ratio (MAR) in patients who underwent percutaneous coronary intervention (PCI). Methods: We enrolled a total of 3561 patients in the present study from January 2013 to December 2017. They were divided into two groups according to the MAR cut-off value (MAR < 0.014, n=2220; MAR ≥ 0.014, n=1119) determined by receiver operating characteristic (ROC) curve analysis. The average follow-up time was 37.59 ± 22.24 months. Results: The two groups differed significantly in the incidences of all-cause mortality (ACM; P<0.001), cardiac mortality (CM; P<0.001), major adverse cardiovascular events (MACEs; P=0.038), and major adverse cardiovascular and cerebrovascular events (MACCEs; P=0.037). Multivariate Cox regression analyses revealed MAR as an independent prognostic factor for ACM and CM. The incidence of ACM increased by 56.5% (hazard ratio [HR] = 1.565; 95% confidence interval [CI], 1.086–2.256; P=0.016) and that of CM by 76.3% (HR = 1.763; 95% CI, 1.106–2.810; P=0.017) in the higher-MAR group. Kaplan–Meier survival analysis suggested that patients with higher MAR tended to have an increased cumulative risk of ACM (log-rank P<0.001) and CM (log-rank P<0.001). Conclusion: The findings of the present study suggest that MAR is a novel independent predictor of long-term mortality in patients who underwent PCI.
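A common way to derive the kind of ROC-based cut-off used here to dichotomize MAR is to maximize Youden's J (sensitivity + specificity − 1) over candidate thresholds. Whether the authors used exactly this criterion is not stated; the sketch below uses synthetic data, not the study cohort:

```python
def youden_cutoff(values, events):
    """Pick the threshold maximizing sensitivity + specificity - 1,
    testing each observed value as a candidate cutoff (>= counts as positive)."""
    n_event = sum(events)
    n_no = len(events) - n_event
    best_j, best_cut = -1.0, None
    for cut in sorted(set(values)):
        tp = sum(1 for v, e in zip(values, events) if v >= cut and e)
        fp = sum(1 for v, e in zip(values, events) if v >= cut and not e)
        j = tp / n_event + (n_no - fp) / n_no - 1
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut

# Synthetic monocyte-to-albumin ratios with event indicators (1 = death)
mar = [0.008, 0.010, 0.012, 0.013, 0.015, 0.016, 0.020, 0.025]
died = [0, 0, 0, 1, 0, 1, 1, 1]
print(youden_cutoff(mar, died))
```

Patients are then split at the returned cutoff into lower- and higher-MAR groups for the survival analyses.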

7.
The risk factors for cerebral malaria (CM) and the wide variation in clinical manifestations of malaria are poorly understood. Recent studies indicate that the interferon-gamma-inducible chemokine CXCL10 is a strong predictor of both human and experimental cerebral malaria, and increased plasma and cerebrospinal fluid levels of CXCL10 were tightly associated with fatal CM in Indian and Ghanaian patients. In the present study, we hypothesized that in a subset of malaria patients, CM susceptibility is associated with variation in CXCL10 expression. We determined whether polymorphisms in the CXCL10 gene promoter region played a role in the clinical status of malaria patients and addressed the genetic basis of CXCL10 expression during malaria infection. Following extensive bioinformatics analyses, two reported single nucleotide polymorphisms in the CXCL10 promoter (−135G>A [rs56061981] and −1447A>G [rs4508917]) were genotyped in 66 CM and 69 non-CM Indian patients using a PCR-restriction fragment length polymorphism assay. Individuals with the −1447(A/G) genotype were more susceptible to CM (adjusted odds ratio [AOR] = 2.60, 95% CI = 1.51–5.85, p = 0.021) and had significantly higher plasma CXCL10 levels than individuals with the −1447(A/A) genotype. When patients were stratified by sex, the observed association of CM with overexpression of CXCL10 was more pronounced in male than in female patients (AOR = 5.47, 95% CI = 1.34–22.29, p = 0.018). Furthermore, the −135G>A polymorphism conferred a decreased risk of CM among males (AOR = 0.19, 95% CI = 0.05–0.78, p = 0.021). Polymorphisms in the CXCL10 promoter sequence were thus associated with increased CXCL10 production, which is linked to the severity of CM. These results suggest that the −1447A>G polymorphism in the CXCL10 gene promoter could be partly responsible for the reported variation in severity of CM outcomes, particularly in males.

8.
Objective: We aimed to assess whether oxidative stress is a predictor of mortality in HIV-infected patients. Methods: We conducted a nested case-control study in CoRIS, a contemporary, multicentre cohort of HIV-infected patients, antiretroviral-naïve at entry, launched in 2004. Cases were patients who died and had stored plasma samples available. Two age- and sex-matched controls were selected for each case. We measured F2-isoprostane (F2-IsoP) and malondialdehyde (MDA) plasma levels in the first blood sample obtained after cohort engagement. Results: 54 cases and 93 controls were included. Median F2-IsoP and MDA levels were significantly higher in cases than in controls. After adjustment for age, HIV-transmission category, CD4 cell count and HIV viral load at cohort entry, and subclinical inflammation measured by highly sensitive C-reactive protein (hsCRP), the association of F2-IsoPs with mortality remained significant (adjusted OR per 1 log10 increase, 2.34 [1.23–4.47], P = 0.009). The association of MDA with mortality was attenuated after adjustment (adjusted OR per 1 log10 increase, 2.05 [0.91–4.59], P = 0.080). Median hsCRP was also higher in cases and proved to be an independent predictor of mortality in the adjusted analyses: OR per 1 log10 increase, 1.39 (1.01–1.91), P = 0.043, and 1.46 (1.07–1.99), P = 0.014, respectively, when adjustment included F2-IsoPs and MDA. Conclusion: Oxidative stress is a predictor of all-cause mortality in HIV-infected patients. For plasma F2-IsoPs, this association is independent of HIV-related factors and subclinical inflammation.

9.
Laying hens housed in free-range systems have access to an outdoor range, and individual hens within a flock differ in their ranging behaviour. Whether there is a link between ranging and laying hen welfare remains unclear. We analysed the relationships between ranging by individual hens on a commercial free-range layer farm and behavioural, physiological and health measures of animal welfare. We hypothesised that hens that access the range more would (1) be less fearful in general and in response to novelty and humans, (2) have better health in terms of physical body condition, and (3) have a reduced physiological stress response to behavioural tests of fear and health assessments, compared with hens that use the range less. Using radio-frequency identification tracking across two flocks, we recorded individual hens’ frequency, duration and consistency of ranging. We also assessed how far hens ventured into the range based on three zones: 0 to 2.4, 2.4 to 11.4, or >11.4 m from the shed. We assessed hen welfare using a variety of measures, including tonic immobility, open field, novel object, human approach and human avoidance (HAV) behavioural tests; stress-induced plasma corticosterone response and faecal glucocorticoid metabolites; and live weight, comb colour, and beak, plumage, footpad and keel bone condition. Range use was positively correlated with plasma corticosterone response, faecal glucocorticoid metabolites, and greater flight distance during HAV. Hens that used the range more moved towards, rather than away from, the novel object more often than hens that ranged less. Distance ranged from the shed was significantly associated with comb colour and beak condition, in that hens with darker combs and more intact beaks ranged further. Overall, the findings suggest that there is no strong link between outdoor range usage and laying hen welfare.
Alternatively, hens that differed in their ranging behaviour may have shown few differences in welfare measures because free-range systems provide hens with adequate choice to cope with their environment. Further research into the relationship between individual range access and welfare is needed to test this possibility.

10.

Background

To determine the effect of the day 1 urinary excretion of cadmium (D1-UE-Cd) on mortality of patients admitted to a coronary care unit (CCU).

Methods

A total of 323 patients were enrolled in this 6-month study. Urine and blood samples were taken within 24 h after CCU admission. Demographic data, clinical diagnoses, and hospital mortality were recorded. The scores of established systems for prediction of mortality in critically ill patients were calculated.

Results

Compared with survivors (n = 289), non-survivors (n = 34) had higher levels of D1-UE-Cd. Stepwise multiple linear regression analysis indicated that D1-UE-Cd was positively associated with pulse rate and level of aspartate aminotransferase, but negatively associated with serum albumin level. Multivariate Cox analysis, with adjustment for other significant variables and measurements from mortality scoring systems, indicated that respiratory rate and D1-UE-Cd were independent and significant predictors of mortality. For each 1 μg/day increase of D1-UE-Cd, the hazard ratio for CCU mortality was 3.160 (95% confidence interval: 1.944–5.136, p < 0.001). The chi-square value of Hosmer-Lemeshow goodness-of-fit test for D1-UE-Cd was 10.869 (p = 0.213). The area under the receiver operating characteristic curve for D1-UE-Cd was 0.87 (95% confidence interval: 0.81–0.93).
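The reported AUC of 0.87 has a direct probabilistic reading via the Mann–Whitney formulation: the AUC equals the probability that a randomly chosen non-survivor has a higher D1-UE-Cd value than a randomly chosen survivor (ties counted as one half). A sketch on synthetic values, not the study's data:

```python
def auc_mann_whitney(pos, neg):
    """AUC = P(random positive case scores above random negative case),
    with ties counted as 1/2 (the Mann-Whitney U formulation)."""
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

# Synthetic urinary cadmium excretion (ug/day); higher among non-survivors
non_survivors = [2.1, 3.4, 1.8, 2.9]
survivors = [0.9, 1.2, 2.0, 1.1, 0.7]
print(auc_mann_whitney(non_survivors, survivors))  # 0.95
```

An AUC of 0.5 would mean the marker discriminates no better than chance; values near 1 indicate nearly complete separation of the two groups.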

Conclusions

The D1-UE-Cd, an objective variable with no inter-observer variability, accurately predicted hospital mortality of CCU patients and outperformed other established scoring systems. Further studies are needed to determine the physiological mechanism of the effect of cadmium on mortality in CCU patients.

11.
The aim of the present large population-based cohort study was to explore risk factors for age-related mortality in liver transplant recipients in Taiwan. Basic information and data on medical comorbidities for 2938 patients who received liver transplants between July 1, 1998, and December 31, 2012, were extracted from the National Health Insurance Research Database on the basis of ICD-9 codes. Mortality risks were analyzed after adjusting for preoperative comorbidities and compared among age cohorts. All patients were followed up until the study endpoint or death. The study included 2588 adults and 350 children [2068 (70.4%) male and 870 (29.6%) female patients]. The median age at transplantation was 52 (interquartile range, 43–58) years. Recipients were categorized into the following age cohorts: <20 (n = 350, 11.9%), 20–39 (n = 254, 8.6%), 40–59 (n = 1860, 63.3%), and ≥60 (n = 474, 16.1%) years. In the total population, 428 deaths occurred after liver transplantation, and the median follow-up period was 2.85 years (interquartile range, 1.2–5.5 years). Dialysis patients showed the highest risk of mortality irrespective of age, and the risk of death increased with age at transplantation. Older liver transplant recipients (≥60 years), especially those on dialysis, have a higher mortality rate, possibly because they have more medical comorbidities. Our findings should make clinicians aware of the need for better risk stratification among elderly liver transplantation candidates.

12.

Background

The high-sensitivity C-reactive protein to albumin ratio (hs-CRP/Alb) predicts mortality risk in patients with acute kidney injury; however, it varies dynamically over time. This study was conducted to evaluate whether variation in this marker was associated with long-term outcome in clinically stable hemodialysis (HD) patients.

Methods

hs-CRP/Alb was checked bimonthly in 284 clinically stable HD outpatients throughout 2008. Based on the slope (β) of the trend equation derived from the 5–6 hs-CRP/Alb ratios for each patient, the patients were divided into quartiles: Group 1, β ≤ −0.13 (n = 71); Group 2, −0.13 < β ≤ 0.003 (n = 71); Group 3, 0.003 < β ≤ 0.20 (n = 71); and Group 4, β > 0.20 (n = 71). The observation period was from January 1, 2009 to August 31, 2012.
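The per-patient "slope of the trend equation" is naturally read as the ordinary least-squares slope of the 5–6 bimonthly ratios against measurement index; the exact time coding used by the authors is not stated. A sketch on a hypothetical patient's values:

```python
def trend_slope(ys):
    """OLS slope of y against measurement index 0..n-1
    (the beta of the per-patient trend equation)."""
    n = len(ys)
    mean_x = (n - 1) / 2
    mean_y = sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(ys))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

# Hypothetical patient: six bimonthly hs-CRP/Alb ratios drifting upward
ratios = [0.5, 0.6, 0.9, 1.1, 1.4, 1.6]
beta = trend_slope(ratios)
print(round(beta, 3))  # 0.231 > 0.20, so this patient would fall in Group 4
```

Patients with strongly negative (Group 1) or strongly positive (Group 4) slopes are the "fluctuating" groups analysed together in the Results.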

Results

Groups 1 and 4 combined showed worse long-term survival (p = 0.04) and a longer 5-year hospitalization stay than Groups 2 and 3 combined (38.7±44.4 vs. 16.7±22.4 days, p<0.001). Membership of Groups 1 and 4 was associated with older age (OR = 1.03, 95% CI = 1.01–1.05) and a high prevalence of congestive heart failure (OR = 2.02, 95% CI = 1.00–4.11). The standard deviation (SD) of hs-CRP/Alb was associated with male sex (β = 0.17, p = 0.003), a higher Davies co-morbidity score (β = 0.16, p = 0.03), and baseline hs-CRP (β = 0.39, p<0.001). Patients with a lower baseline and a stable trend of hs-CRP/Alb had a better prognosis. In multivariate Cox proportional hazards models, the SD of hs-CRP/Alb (HR: 1.05, 95% CI: 1.01–1.08), rather than baseline hs-CRP/Alb, was an independent predictor of long-term mortality after adjusting for sex and HD vintage.

Conclusion

Clinically stable HD patients with a fluctuating hs-CRP/Alb are characterized by older age and more co-morbidity, and they tend to have longer subsequent hospitalization stays and a higher mortality risk.

13.
Predicting mortality in dialysis patients based on low intact parathyroid hormone (iPTH) levels is difficult, because aluminum intoxication, malnutrition, older age, race, diabetes, or peritoneal dialysis may influence these levels. We investigated the clinical implications of low iPTH levels in relation to the mortality of dialysis patients using sensitive, stratified, and adjusted models and a nationwide dialysis database. We analyzed data from 2005 to 2012 held in the Taiwan Renal Registry Data System; 94,983 hemodialysis patients with valid iPTH data were included in this study. The patient cohort was subdivided based on iPTH and alkaline phosphatase levels. The mean hemodialysis duration within this cohort was 3.5 years, and the mean (standard deviation) age was 62 (14) years. After adjusting for age, sex, diabetes, hemodialysis duration, serum albumin, hematocrit, calcium and phosphate levels, and hemodialysis adequacy (single-pool Kt/V), both the crude and adjusted all-cause mortality rates increased when alkaline phosphatase levels were higher or iPTH levels were lower. In general, at any given level of serum calcium or phosphate, patients with low iPTH levels had higher mortality rates than those with normal or high iPTH levels. At a given alkaline phosphatase level, the hazard ratio for all-cause mortality was 1.33 (p < 0.01, 95% confidence interval 1.27–1.39) in the group with iPTH < 150 pg/mL and serum calcium > 9.5 mg/dL, whereas in the group with iPTH > 300 pg/mL and serum calcium > 9.5 mg/dL the hazard ratio was 0.92 (95% confidence interval 0.85–1.01). Hence, maintaining high albumin-corrected serum calcium levels (> 9.5 mg/dL) may correlate with poor prognoses in patients with low iPTH levels.

14.

Background and Objectives

Numerous substances accumulate in the body in uremia, but those contributing to cardiovascular morbidity and mortality in dialysis patients remain undefined. We examined the association of baseline free levels of four organic solutes that are secreted by the native kidney — p-cresol sulfate, indoxyl sulfate, hippurate and phenylacetylglutamine — with outcomes in hemodialysis patients.

Design, Setting, Participants and Measurements

We measured these solutes in stored specimens from 394 participants of a US national prospective cohort study of incident dialysis patients. We examined the relation of each solute and a combined solute index to cardiovascular mortality and morbidity (first cardiovascular event) using Cox proportional hazards regression adjusted for demographics, comorbidities, clinical factors and laboratory tests including Kt/VUREA.

Results

Mean age of the patients was 57 years, 65% were white and 55% were male. In fully adjusted models, a higher p-cresol sulfate level was associated with a greater risk (HR per SD increase; 95% CI) of cardiovascular mortality (1.62; 1.17–2.25; p=0.004) and first cardiovascular event (1.60; 1.23–2.08; p<0.001). A higher phenylacetylglutamine level was associated with a greater risk of first cardiovascular event (1.37; 1.18–1.58; p<0.001). Patients in the highest quintile of the combined solute index had a 96% greater risk of cardiovascular mortality (1.96; 1.05–3.68; p=0.04) and 62% greater risk of first cardiovascular event (1.62; 1.12–2.35; p=0.01) compared with patients in the lowest quintile. Results were robust in sensitivity analyses.

Conclusions

Free levels of uremic solutes that are secreted by the native kidney are associated with a higher risk of cardiovascular morbidity and mortality in incident hemodialysis patients.

15.
Decreased bioavailability of nitric oxide (NO) is a major contributor to the pathophysiology of severe falciparum malaria. Tetrahydrobiopterin (BH4) is an enzyme cofactor required for NO synthesis from L-arginine. We hypothesized that systemic levels of BH4 would be decreased in children with cerebral malaria, contributing to low NO bioavailability. In an observational study in Tanzania, we measured urine levels of biopterin in its various redox states (fully reduced [BH4] and the oxidized metabolites dihydrobiopterin [BH2] and biopterin [B0]) in children with uncomplicated malaria (UM, n = 55), cerebral malaria (CM, n = 45), non-malaria central nervous system conditions (NMC, n = 48), and in 111 healthy controls (HC). The median urine BH4 concentration in CM (1.10 [IQR: 0.55–2.18] μmol/mmol creatinine) was significantly lower than in each of the other three groups — UM (2.10 [IQR: 1.32–3.14]; p<0.001), NMC (1.52 [IQR: 1.01–2.71]; p = 0.002), and HC (1.60 [IQR: 1.15–2.23]; p = 0.005). Oxidized biopterins were increased, and the BH4:BH2 ratio markedly decreased, in CM. In a multivariate logistic regression model, each log10-unit decrease in urine BH4 was independently associated with a 3.85-fold (95% CI: 1.89–7.61) increase in the odds of CM (p<0.001). Low systemic BH4 levels and increased oxidized biopterins contribute to the low NO bioavailability observed in CM. Adjunctive therapy to regenerate BH4 may have a role in improving NO bioavailability and microvascular perfusion in severe falciparum malaria.
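The "3.85-fold increase in odds per log10-unit decrease" is the standard logistic-regression reading: for a predictor coded as log10(BH4) with coefficient β, the odds ratio for a one-unit decrease is exp(−β). In the sketch below the coefficient is back-derived from the reported OR, purely for illustration:

```python
import math

def odds_ratio_per_unit_decrease(beta: float) -> float:
    """In logistic regression, the odds ratio associated with a one-unit
    DECREASE of a predictor whose coefficient is beta equals exp(-beta)."""
    return math.exp(-beta)

# Hypothetical coefficient on log10(urine BH4): a reported OR of 3.85
# per log10-unit decrease corresponds to beta = -ln(3.85), about -1.348.
beta = -math.log(3.85)
print(round(odds_ratio_per_unit_decrease(beta), 2))  # 3.85
```

So a child whose urine BH4 is one log10 unit (i.e. ten-fold) lower has, under this model, 3.85 times the odds of cerebral malaria, all else equal.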

16.
Life history theory predicts trade-offs between reproductive effort and maternal survivorship in energy-restricted environments. However, empirical evidence for a positive association between maternal mortality and reproductive effort in energetically challenged human populations is mixed, and the physiological mechanisms that may underlie this association are poorly understood. We hypothesized that increases in aerobic metabolism during repeated periods of pregnancy and lactation result in increased oxidative stress that may contribute to somatic deterioration, vulnerability to illness, and accelerated aging. We therefore predicted that lifetime gravidity and parity would be related to levels of biomarkers of oxidative stress, as well as of antioxidative defence enzymes, in post-menopausal women. Our hypothesis was supported by positive linear associations between the number of lifetime pregnancies and levels of 8-OHdG, a biomarker of oxidative DNA damage (β = 0.21, p<0.05), and of the antioxidative defence enzyme Cu-Zn SOD (β = 0.25, p<0.05). Furthermore, independent of age and health status, post-menopausal women with higher gravidity and parity (≥4 pregnancies per lifetime) had 20% higher levels of 8-OHdG and 60% higher levels of Cu-Zn SOD than women with lower gravidity and parity (<4 pregnancies per lifetime). Our results present the first evidence for oxidative stress as a possible cost of reproductive effort in humans.

17.
Research on mate choice has primarily focused on preferences for quality indicators, assuming that all individuals show consensus about who is the most attractive. However, in some species, mating preferences seem largely individual-specific, suggesting that they might target genetic or behavioral compatibility. Few studies have quantified the fitness consequences of allowing versus preventing such idiosyncratic mate choice. Here, we report on an experiment that controls for variation in overall partner quality and show that zebra finch (Taeniopygia guttata) pairs that resulted from free mate choice achieved a 37% higher reproductive success than pairs that were forced to mate. Cross-fostering of freshly laid eggs showed that embryo mortality (before hatching) primarily depended on the identity of the genetic parents, whereas offspring mortality during the rearing period depended on foster-parent identity. Therefore, preventing mate choice should lead to an increase in embryo mortality if mate choice targets genetic compatibility (for embryo viability), and to an increase in offspring mortality if mate choice targets behavioral compatibility (for better rearing). We found that pairs from both treatments showed equal rates of embryo mortality, but chosen pairs were better at raising offspring. These results thus support the behavioral, but not the genetic, compatibility hypothesis. Further exploratory analyses reveal several differences in behavior and fitness components between “free-choice” and “forced” pairs.

18.
Objective: Dehydroepiandrosterone sulphate (DHEA-s) is an anabolic, protective hormone important for the maintenance of health. DHEA-s levels peak in young adulthood and decline thereafter with age. DHEA-s has previously been shown to be lower in individuals reporting prolonged stress. This study investigates DHEA-s levels in patients with clinical burnout, a disorder caused by long-term psychosocial stress. Methods: 122 patients (51% men) and 47 controls (51% men) aged 25–54 years were included in the study. DHEA-s levels were compared between patients and controls in the whole sample and within each of three 10-year age groups. Results: In the youngest age group (25–34 years), DHEA-s levels were on average 25% lower in the patients (p = 0.006). The differences in DHEA-s levels between patients and controls were more pronounced among female than male participants (on average 32% and 13% lower, respectively). There were no differences in DHEA-s levels between patients and controls in the age groups 35–44 years (p = 0.927) and 45–54 years (p = 0.897), or when analysing all age groups together (p = 0.187). Conclusion: The study indicates that levels of the health-promoting “youth” hormone DHEA-s are low in younger burnout patients. The fact that younger adults have much higher DHEA-s levels and more pronounced inter-subject variability than older individuals might explain why burnout status differentiated patients from controls only among the youngest patients in this study.

19.
Through social interactions, individuals can affect one another’s phenotype. The heritable effect of an individual on the phenotype of a conspecific is known as an indirect genetic effect (IGE). Although IGEs can have a substantial impact on heritable variation and response to selection, little is known about the genetic architecture of traits affected by IGEs. We studied IGEs for survival in domestic chickens (Gallus gallus), using data on two purebred lines and their reciprocal cross. Birds were kept in groups of four. Feather pecking and cannibalism caused mortality, as beaks were kept intact. Survival time was shorter in crossbreds than in purebreds, indicating outbreeding depression and the presence of nonadditive genetic effects. IGEs contributed the majority of heritable variation in crossbreds (87% and 72%) and around half of heritable variation in purebreds (65% and 44%). There was no evidence of dominance variance, either direct or indirect. The absence of dominance variance combined with considerable outbreeding depression suggests that survival is affected by many loci. Direct–indirect genetic correlations were moderately to highly negative in crossbreds (−0.37 ± 0.17 and −0.83 ± 0.10), but low and not significantly different from zero in purebreds (0.20 ± 0.21 and −0.28 ± 0.18). Consequently, unlike purebreds, crossbreds would fail to respond positively to mass selection. The direct genetic correlation between the two crosses was high (0.95 ± 0.23), whereas the indirect genetic correlation was moderate (0.41 ± 0.26). Thus, for IGEs, it mattered which parental line provided the sire and which provided the dam. This indirect parent-of-origin effect appeared to be paternally transmitted and is probably Z chromosome linked.
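The large IGE shares reported above follow from how indirect effects scale with group size: in the standard direct–indirect genetic effects model, each bird expresses its indirect effect on each of its n − 1 group mates, so indirect genetic variance enters the total heritable variance with a factor (n − 1)². A minimal sketch of that arithmetic with hypothetical variance components (the function names and the numbers are illustrative, not the study's estimates):

```python
# Sketch of how IGEs scale up heritable variation under the standard
# direct-indirect genetic effects model. Variance components below are
# hypothetical; only the group size (n = 4) matches the study.

def total_heritable_variance(var_direct, cov_di, var_indirect, n):
    """Variance of total breeding values for a trait affected by IGEs.

    var_direct   : direct genetic variance
    cov_di       : direct-indirect genetic covariance
    var_indirect : indirect genetic variance
    n            : group size (each bird affects n - 1 group mates)
    """
    return var_direct + 2 * (n - 1) * cov_di + (n - 1) ** 2 * var_indirect

def indirect_share(var_direct, cov_di, var_indirect, n):
    """Fraction of total heritable variance coming from the IGE terms."""
    total = total_heritable_variance(var_direct, cov_di, var_indirect, n)
    ige_part = 2 * (n - 1) * cov_di + (n - 1) ** 2 * var_indirect
    return ige_part / total

# Groups of four, as in the study; toy values with equal direct and
# indirect variance and zero covariance.
print(total_heritable_variance(1.0, 0.0, 1.0, 4))  # 10.0
print(indirect_share(1.0, 0.0, 1.0, 4))            # 0.9
```

With groups of four, an indirect variance merely equal to the direct variance already makes IGEs contribute 90% of the heritable variation, which illustrates why IGEs can dominate even when per-mate effects are modest.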

20.

Purpose

We examined individual-level and neighborhood-level predictors of mortality in colorectal cancer (CRC) patients diagnosed in Florida to identify high-risk groups for targeted interventions.

Methods

Demographic and clinical data from the Florida Cancer Data System registry (2007–2011) were linked with Agency for Health Care Administration and US Census data (n = 47,872). Cox hazard regression models were fitted with candidate predictors of CRC survival and stratified by age group (18–49, 50–64, 65+).

Results

Stratified by age group, the increase in mortality risk per comorbidity was highest in the youngest group (21%), followed by the middle (19%) and oldest (14%) groups. The two younger age groups had higher mortality risk with proximal than with distal cancer. Compared with the privately insured, patients in the middle age group were at higher risk of death if they were uninsured (HR = 1.35) or received healthcare through Medicare (HR = 1.44), Medicaid (HR = 1.53), or the Veterans Administration (HR = 1.26). Only Medicaid in the youngest group (52% higher risk) and lack of insurance in the oldest group (24% lower risk) differed significantly from privately insured counterparts. Among the 50–64 and 65+ age groups there was higher mortality risk in the two lowest SES categories compared to the highest SES (1.17- and 1.23-fold higher in the middle age group and 1.12- and 1.17-fold higher in the oldest age group, respectively). Married patients fared significantly better than divorced/separated (HR = 1.22), single (HR = 1.29), or widowed (HR = 1.19) patients.
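The per-comorbidity figures above come from Cox models, where hazard ratios act multiplicatively on the hazard scale: a fixed percentage increase per comorbidity compounds across comorbidities. A small sketch of that arithmetic, assuming proportional hazards and a log-linear comorbidity effect (the comorbidity counts are illustrative; only the 1.21 per-comorbidity HR for the youngest group is taken from the results):

```python
# Hazard ratios in a Cox model multiply: "21% higher risk per comorbidity"
# compounds as 1.21 ** k for k comorbidities, under proportional hazards
# and a log-linear covariate effect.

def combined_hr(per_unit_hr, units):
    """Multiplicative hazard ratio for `units` increments of a covariate."""
    return per_unit_hr ** units

# Youngest age group: HR = 1.21 per comorbidity.
print(round(combined_hr(1.21, 1), 2))  # 1.21
print(round(combined_hr(1.21, 3), 2))  # three comorbidities -> 1.77
```

The same arithmetic applies to the insurance and marital-status HRs above, each interpreted relative to its reference category (private insurance, married).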

Conclusion

Factors associated with increased mortality risk among individuals with CRC included being older, uninsured, and unmarried; having more comorbidities; living in lower-SES neighborhoods; and being diagnosed at a later disease stage. Higher risk among younger patients was attributed to proximal cancer site, Medicaid coverage, and distant disease; lower SES and being unmarried were not risk factors in this age group. Targeted interventions to improve survivorship, combined with greater social support and attention to age group, may assist these high-risk groups.
