Similar Articles (20 results)
1.
Introduction

Delayed graft function (DGF) is a prevalent clinical problem in renal transplantation for which there is no objective system to predict occurrence in advance. It significantly increases the need for post-transplant hospitalisation and is a significant risk factor for other post-transplant complications.

Methodology

MicroRNAs (miRNAs), a specific subclass of small RNAs, have been clearly demonstrated to influence many pathways in health and disease. To investigate the influence of miRNAs on renal allograft performance post-transplant, the expression of a panel of miRNAs was measured in pre-transplant renal biopsies using qPCR. Expression was then related to clinical parameters and outcomes in two independent renal transplant cohorts.

Results

Here we demonstrate, in two independent cohorts of pre-implantation human renal allograft biopsies, that a novel pre-transplant renal performance scoring system (GRPSS) can determine the occurrence of DGF with high sensitivity (>90%) and specificity (>60%) for donor allografts pre-transplant, using just three senescence-associated microRNAs combined with donor age and type of organ donation.

Conclusion

These results demonstrate a relationship between pre-transplant microRNA expression levels, cellular biological ageing pathways and clinical outcomes for renal transplantation. They provide a simple, rapid, quantitative molecular pre-transplant assay to determine post-transplant allograft function, with scope for future intervention. Furthermore, these results demonstrate the involvement of senescence pathways in ischaemic injury during organ transplantation and indicate accelerated bio-ageing as a consequence of both warm and cold ischaemia.
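The abstract does not spell out the GRPSS formula, but the idea of combining three senescence-associated miRNA levels with donor age and donation type into a binary DGF predictor can be sketched with a generic logistic-regression model. The Python snippet below is an illustrative sketch only: the feature names, the donation-type coding, the synthetic data and the 0.3 probability cut-off are assumptions, not the authors' model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
n = 120
X = np.column_stack([
    rng.normal(size=n),            # senescence-associated miRNA 1 (normalised expression) -- placeholder
    rng.normal(size=n),            # senescence-associated miRNA 2 -- placeholder
    rng.normal(size=n),            # senescence-associated miRNA 3 -- placeholder
    rng.normal(55, 12, size=n),    # donor age in years -- placeholder
    rng.integers(0, 2, size=n),    # donation type: 0 = brainstem death, 1 = circulatory death -- assumed coding
])
y = rng.integers(0, 2, size=n)     # DGF outcome labels -- placeholder

model = LogisticRegression(max_iter=1000).fit(X, y)
# Pick a probability cut-off that favours sensitivity over specificity,
# mirroring the >90% sensitivity / >60% specificity trade-off reported above.
pred = (model.predict_proba(X)[:, 1] >= 0.3).astype(int)
tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")
```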

2.
Older and marginal donors have been used to meet the shortfall in available organs for renal transplantation. Post-transplant renal function and outcome from these donors are often poorer than from chronologically younger donors. Some organs, however, function adequately for many years. We have hypothesized that such organs are biologically younger than poorer-performing counterparts. We have tested this hypothesis in a cohort of pre-implantation human renal allograft biopsies (n = 75) assayed by real-time polymerase chain reaction for the expression of known markers of cellular damage and biological aging, including CDKN2A, CDKN1A, SIRT2 and POT1. These were investigated for associations with traditional factors affecting transplant outcome (donor age, cold ischaemic time) and organ function post-transplant (serum creatinine levels). Linear regression analyses indicated a strong association of serum creatinine at 6 months post-transplant with pre-transplant CDKN2A levels (p = 0.001) and donor age (p = 0.004). Both these markers correlated significantly with urinary protein to creatinine ratios (p = 0.002 and p = 0.005 respectively), an informative marker for subsequent graft dysfunction. POT1 expression also showed a significant association with this parameter (p = 0.05). Multiple linear regression analyses for CDKN2A and donor age accounted for 24.6% (p = 0.001) of observed variability in serum creatinine levels at 6 months and 23.7% (p = 0.001) at 1 year post-transplant. Thus, these data indicate that allograft biological age is an important novel prognostic determinant of renal transplant outcome.
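As a rough illustration of the kind of multiple linear regression reported above (serum creatinine at 6 months regressed on pre-transplant CDKN2A expression and donor age, with the fraction of explained variance read off as R²), here is a minimal Python sketch using synthetic placeholder data rather than the study's measurements.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 75
cdkn2a = rng.normal(size=n)              # relative pre-transplant CDKN2A expression -- placeholder scale
donor_age = rng.normal(50, 15, size=n)   # donor age in years -- placeholder
# Simulated 6-month serum creatinine with some dependence on both predictors
creatinine = 100 + 8 * cdkn2a + 0.6 * donor_age + rng.normal(0, 20, size=n)

X = sm.add_constant(np.column_stack([cdkn2a, donor_age]))
fit = sm.OLS(creatinine, X).fit()
print(f"R^2 = {fit.rsquared:.3f}")       # the study reports ~24.6% of variance explained at 6 months
print(fit.pvalues)                        # per-coefficient p-values (constant, CDKN2A, donor age)
```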

3.

Background

Cellular senescence may be a key factor in HIV-related premature biological aging. We assessed features of the corneal endothelium that are known to be associated with biological aging, and cellular senescence markers in HIV-infected adults.

Methods

Case-control study of 242 HIV-infected adults and 249 matched controls. Using specular microscopy, the corneal endothelium was assessed for features of aging (low endothelial cell density [ECD], high variation in cell size, and low hexagonality index). Data were analysed by multivariable regression. CDKN2A expression (a cell senescence mediator) was measured in peripheral blood leukocytes and 8-hydroxy-2′-deoxyguanosine (8-OHDG; an oxidative DNA damage marker) levels were measured in plasma.

Results

The median age of both groups was 40 years. Among HIV-infected adults, 88% were receiving antiretroviral therapy (ART); their median CD4 count was 468 cells/µL. HIV infection was associated with increased odds of high variation in cell size (OR = 1.67; 95% CI: 1.00–2.78, p = 0.04). Among HIV-infected participants, low ECD was independently associated with a current CD4 count <200 cells/µL (OR = 2.77; 95% CI: 1.12–6.81, p = 0.03). In participants on ART with an undetectable viral load, CDKN2A expression and 8-OHDG levels were higher in those with accelerated aging, as reflected by lower ECD.

Conclusions

The corneal endothelium shows features consistent with HIV-related accelerated senescence, especially among those with poor immune recovery.

4.
Although chronological donor age is the most potent predictor of long-term outcome after renal transplantation, it does not capture individual differences in the aging process itself. We therefore hypothesized that an estimate of biological organ age, derived from markers of cellular senescence in zero-hour biopsies, would have higher predictive value. Telomere length and mRNA expression levels of the cell cycle inhibitors CDKN2A (p16INK4a) and CDKN1A (p21WAF1) were assessed in pre-implantation biopsies of 54 patients, and the association of these and various other clinical parameters with serum creatinine after 1 year was determined. In a linear regression analysis, CDKN2A emerged as the best single predictor, followed by donor age and telomere length. A multiple linear regression analysis revealed that combining CDKN2A values with donor age yielded even higher predictive value for serum creatinine 1 year after transplantation. We conclude that the molecular aging marker CDKN2A, in combination with chronological donor age, predicts renal allograft function after 1 year significantly better than chronological donor age alone.

5.

Background

The use of expanded criteria donor (ECD) kidneys has been associated with worse outcomes. Whole-genome gene expression in pre-implantation allograft biopsies from deceased donor kidneys (DDKs) was evaluated to compare the effect of pulsatile pump preservation (PPP) versus cold storage preservation (CSP) on standard and ECD kidneys.

Methodology/Principal Findings

Ninety-nine pre-implantation DDK biopsies were profiled for gene expression using GeneChips. Kidney transplant recipients were followed post-transplantation for 35.8 months (range 24–62). The PPP group included 60 biopsies (cold ischemia time (CIT) = 1,367 ± 509 minutes) and the CSP group included 39 biopsies (CIT = 1,022 ± 485 minutes) (P<0.001). Donor age (42.0±14.6 vs. 34.1±14.2 years, P = 0.009) and the percentage of ECD kidneys (PPP = 35% vs. CSP = 12.8%, P = 0.012) were significantly different between groups. A two-sample t-test was performed, and probe sets with P<0.001 were considered significant. Probe-set-level linear models were then fit using cold ischemia time and CSP/PPP as independent variables to determine probe sets that remained significant (P<0.001) between groups after adjusting for cold ischemia time. In this way, 43 significant genes were identified (P<0.001). Over-expression of genes associated with inflammation (CD86, CD209, CLEC4, EGFR2, TFF3, among others) was observed in the CSP group. Cell-to-cell signaling and interaction, and antigen presentation were the most important pathways with genes significantly over-expressed in CSP kidneys. When the analysis was restricted to ECD kidneys, genes involved in inflammation were also differentially up-regulated in ECD kidneys undergoing CSP. However, graft survival at the end of the study was similar between groups (P = 0.2). Moreover, the incidence of delayed graft function did not differ significantly between groups.
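The analysis described above (per-probe-set two-sample t-tests followed by probe-set-level linear models with preservation method and cold ischemia time as independent variables, keeping probe sets with P < 0.001) can be sketched as follows. The expression matrix, group sizes and CIT values below are simulated placeholders, not the study's GeneChip data.

```python
import numpy as np
from scipy import stats
import statsmodels.api as sm

rng = np.random.default_rng(2)
n_probes, n_samples = 500, 99
expr = rng.normal(size=(n_probes, n_samples))   # simulated log2 expression matrix
group = np.array([1] * 60 + [0] * 39)           # 1 = pulsatile pump (PPP), 0 = cold storage (CSP)
cit = np.where(group == 1,
               rng.normal(1367, 509, n_samples),
               rng.normal(1022, 485, n_samples))  # simulated cold ischemia times (minutes)

significant = []
for i in range(n_probes):
    _, p_t = stats.ttest_ind(expr[i, group == 1], expr[i, group == 0])
    if p_t < 0.001:                                 # first filter: two-sample t-test
        X = sm.add_constant(np.column_stack([group, cit]))
        fit = sm.OLS(expr[i], X).fit()
        if fit.pvalues[1] < 0.001:                  # group term still significant after adjusting for CIT
            significant.append(i)
print(len(significant), "probe sets pass both filters")
```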

Conclusions/Significance

Inflammation was the most important up-regulated pattern in pre-implantation biopsies undergoing CSP, even though the PPP group contained a larger proportion of ECD kidneys. No significant differences were observed in the incidence of delayed graft function or in graft function post-transplantation. These findings support the use of PPP for ECD kidneys.

6.

Background

X-linked adrenoleukodystrophy (ALD) results from mutations in the ABCD1 gene that disrupt the metabolism of very-long-chain fatty acids. The most serious form of ALD, cerebral adrenoleukodystrophy (cALD), causes neuroinflammation and demyelination. Neuroimaging in cALD shows inflammatory changes and indicates blood–brain barrier (BBB) disruption. We hypothesize that this disruption may occur through degradation by matrix metalloproteinases (MMPs) of the extracellular matrix that defines the BBB. MMPs have not previously been evaluated in the setting of cALD.

Methodology/Principal Findings

We used a multiplex assay to correlate the concentrations of MMPs in cerebrospinal fluid (CSF) and plasma with the severity of brain inflammation, as determined by the ALD MRI (Loes) score and the neurologic function score. There were significant elevations of MMP2, MMP9, MMP10, TIMP1 and total protein in the CSF of boys with cALD compared to controls. Levels of MMP10, TIMP1 and total protein in CSF showed significant correlations (p<0.05 for each) with pre-transplant MRI Loes scores (R2 = 0.34, 0.20 and 0.55 respectively). Levels of TIMP1 and total protein in CSF significantly correlated with pre-transplant neurologic functional scores (R2 = 0.22 and 0.48 respectively), and levels of MMP10 and total protein in CSF significantly correlated with one-year post-transplant functional scores (R2 = 0.38 and 0.69). MMP9 levels in plasma were significantly elevated compared to controls but did not correlate with the MRI or neurologic function scores.

Conclusions/Significance

MMPs were found to be elevated in the CSF of boys with cALD and may mechanistically contribute to the breakdown of the blood–brain barrier. MMP concentrations correlate directly with radiographic and clinical neurologic severity. Interestingly, total protein levels showed superior correlation with the MRI score and neurologic function score both before and at one year after transplant.

7.

Background

Expanded criteria donors (ECDs) are currently accepted as potential sources to increase the donor pool and to provide more chances of kidney transplantation for elderly recipients who would not survive long waiting periods. Hypothermic machine perfusion (HMP) is designed to mitigate the deleterious effects of simple cold storage (CS) on the quality of preserved organs, particularly when the donor is in a marginal status.

Methods

We compared the transplant outcomes in patients receiving ECD kidneys with either HMP or CS graft preservation. Articles from the MEDLINE, EMBASE and Cochrane Library databases were searched and all studies reporting outcomes from HMP versus CS methods of kidney preservation were included in this meta-analysis. The parameters analyzed included the incidence of delayed graft function (DGF), primary non-function (PNF) and one-year graft and patient survival.

Results

A total of seven studies qualified for the review, involving 2374 and 8716 kidney grafts preserved with HMP or CS respectively, all from ECD donors. The incidence of delayed graft function (DGF) was significantly reduced with HMP preservation compared to CS, with an odds ratio (OR) of 0.59 (95% CI 0.54–0.66, P<0.001), and one-year graft survival was significantly improved, with an OR of 1.12 (95% CI 1.03–1.21, P = 0.005). However, there was no difference in the incidence of PNF (OR 0.54, 95% CI 0.21–1.40, P = 0.20) or in one-year patient survival (OR 0.98, 95% CI 0.94–1.02, P = 0.36) between HMP and CS preservation.
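A minimal sketch of the inverse-variance pooling behind a meta-analytic odds ratio such as the DGF result above. The seven study-level ORs and confidence intervals are invented placeholders, and the published analysis may have used a random-effects rather than the fixed-effect model shown here.

```python
import numpy as np

# (OR, lower 95% CI, upper 95% CI) for each study -- invented placeholder values
studies = [(0.55, 0.45, 0.68), (0.62, 0.50, 0.77), (0.58, 0.40, 0.84),
           (0.66, 0.51, 0.85), (0.51, 0.38, 0.69), (0.70, 0.55, 0.90),
           (0.60, 0.47, 0.77)]

log_or = np.array([np.log(o) for o, _, _ in studies])
se = np.array([(np.log(u) - np.log(l)) / (2 * 1.96) for _, l, u in studies])  # SE back-calculated from CI width
w = 1.0 / se**2                               # inverse-variance (fixed-effect) weights
pooled = np.sum(w * log_or) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
lo, hi = np.exp(pooled - 1.96 * pooled_se), np.exp(pooled + 1.96 * pooled_se)
print(f"pooled OR = {np.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f})")
```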

Conclusions

HMP was associated with a reduced incidence of DGF and with improved one-year graft survival, but not with the incidence of PNF or with one-year patient survival.

8.
Identifying patients who are potential placebo responders has major implications for clinical practice and trial design. Catechol-O-methyltransferase (COMT), an important enzyme in dopamine catabolism, plays a key role in processes associated with the placebo effect, such as reward, pain, memory and learning. We hypothesized that the COMT functional val158met polymorphism was a predictor of placebo effects and tested our hypothesis in a subset of 104 patients from a previously reported randomized controlled trial in irritable bowel syndrome (IBS). The three treatment arms from this study were: no treatment (“waitlist”), placebo treatment alone (“limited”), and placebo treatment “augmented” with a supportive patient–health care provider interaction. The primary outcome measure was the change from baseline in the IBS Symptom Severity Scale (IBS-SSS) after three weeks of treatment. In a regression model, the number of methionine alleles in COMT val158met was linearly related to placebo response as measured by changes in IBS-SSS (p = 0.035). The strongest placebo response occurred in met/met homozygotes treated in the augmented placebo arm. A smaller met/met-associated effect was observed with limited placebo treatment, and there was no effect in the waitlist control. These data support our hypothesis that the COMT val158met polymorphism is a potential biomarker of placebo response.

9.
Cold ischemia time has a particularly strong impact on the outcomes of expanded-criteria donor (ECD) transplantation. Ischemia-reperfusion (IR) injury produces excessive poly(ADP-ribose) polymerase-1 (PARP-1) activation. The present study explored the hypothesis that increased tubular expression of PARP-1 contributes to delayed renal function in suboptimal ECD kidney allografts and in non-ECD allografts that develop posttransplant acute tubular necrosis (ATN).

Materials and Methods

Nuclear PARP-1 immunohistochemical expression was studied in 326 paraffin-embedded renal allograft biopsies (193 with different degrees of ATN and 133 controls) and in a murine Parp-1 knockout model of IR injury.

Results

PARP-1 expression showed a significant relationship with cold ischemia time (r = 0.603), time to effective diuresis (r = 0.770), serum creatinine levels at biopsy (r = 0.649), and degree of ATN (r = 0.810) (p = 0.001, Pearson test). In the murine IR model, western blotting showed an increase in PARP-1 that was blocked by a PARP-1 inhibitor. In summary, these results suggest a pivotal role for PARP-1 in the ATN of renal transplantation. We propose immunohistochemical assessment of PARP-1 in kidney allograft biopsies for early detection of possible delayed renal function; in addition, the administration of PARP-1 inhibitors may offer a therapeutic option to reduce IR damage in donor kidneys by preventing or minimizing ATN.

10.

Background

Podocyte injury is an early feature of diabetic nephropathy (DN). Recently, urinary exosomal Wilms tumor-1 protein (WT1), shed by renal epithelial cells, has been proposed as a novel biomarker for podocyte injury. However, its usefulness as a biomarker for early diabetic nephropathy has not yet been verified. We investigated urinary exosomal WT1 in type 1 diabetic patients to confirm its role as a non-invasive biomarker for predicting early renal function decline.

Methods

The expression of WT1 protein in urinary exosomes from spot urine samples of type 1 diabetes mellitus patients (n = 48) and healthy controls (n = 25) was analyzed. Patients were divided, based on their urinary albumin-to-creatinine ratio (ACR, mg/g creatinine), into a non-proteinuria group (ACR < 30 mg/g, n = 30) and a proteinuria group (ACR > 30 mg/g, n = 18). Regression analysis was used to assess the association between urinary exosomal WT1 levels and parameters of renal function. Receiver operating characteristic (ROC) curve analysis was used to determine the diagnostic performance of exosomal WT1.

Results

WT1 protein was detected in 33 of 48 diabetic patients and in only 1 healthy control. The level of urinary exosomal WT1 protein was significantly higher (p = 0.001) in patients with proteinuria than in those without proteinuria. In addition, all of the patients with proteinuria, but only half of the patients without proteinuria, were positive for exosomal WT1. We found that the level of exosomal WT1 was associated with a significant increase in urine protein-to-creatinine ratio, albumin-to-creatinine ratio and serum creatinine, as well as a decline in eGFR. Furthermore, patients with WT1-positive urinary exosomes had decreased renal function compared to WT1-negative patients. ROC analysis showed that WT1 effectively predicted eGFR < 60 mL/min/1.73 m2.
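The ROC analysis mentioned above can be illustrated with a short sketch: given a binary label for eGFR < 60 mL/min/1.73 m2 and a continuous exosomal WT1 level, compute the AUC and a candidate cut-off. The labels and WT1 values below are simulated, and the Youden-index rule for choosing a cut-off is an assumption, not necessarily the authors' method.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(3)
low_egfr = np.r_[np.ones(20), np.zeros(28)].astype(int)   # 1 = eGFR < 60 mL/min/1.73 m2 -- invented labels
wt1 = rng.normal(1.0, 0.5, size=48) + 0.8 * low_egfr      # simulated exosomal WT1 level, higher when eGFR is low

auc = roc_auc_score(low_egfr, wt1)
fpr, tpr, thresholds = roc_curve(low_egfr, wt1)
best = np.argmax(tpr - fpr)                                # Youden index as one way to pick a cut-off
print(f"AUC = {auc:.2f}, candidate WT1 cut-off = {thresholds[best]:.2f}")
```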

Conclusion

The predominant presence of WT1 protein in urinary exosomes of diabetic patients, and the increase in its expression level with declining renal function, suggest that it could be useful as an early non-invasive marker for diabetic nephropathy.

11.
Sensitive and specific urinary biomarkers can improve patient outcomes in many diseases by informing early diagnosis. Unfortunately, to date, the accuracy and translation of diagnostic urinary biomarkers into clinical practice have been disappointing. We believe this may be due to inappropriate standardization of diagnostic urinary biomarkers. Our objective was therefore to characterize the effects of standardizing urinary levels of IL-6, IL-8 and VEGF using the commonly applied standards, namely urinary creatinine, osmolarity and protein. First, we report results based on biomarker levels measured in 120 hematuric patients: 80 with pathologically confirmed bladder cancer, 27 with confounding pathologies and 13 in whom no underlying cause for their hematuria was identified, designated “no diagnosis”. Protein levels were related to final diagnostic categories (p = 0.022, ANOVA). Osmolarity (mean = 529 mOsm; median = 528 mOsm) was normally distributed, while creatinine (mean = 10163 µmol/l, median = 9350 µmol/l) and protein (mean = 0.3297 mg/ml, median = 0.1155 mg/ml) distributions were not. When we compared AUROCs for IL-6, IL-8 and VEGF levels, we found that protein-standardized levels consistently produced the lowest AUROCs, suggesting that protein standardization attenuates the “true” differences in biomarker levels between controls and bladder cancer samples. Second, in 72 hematuric patients (48 with bladder cancer and 24 controls) in whom urine samples had been collected at recruitment and at follow-up (median 11 months, range 1 to 20 months), we demonstrate that protein levels were approximately 24% lower at follow-up (Bland–Altman plots). There was an association between differences in individual biomarkers and differences in protein levels over time, particularly in control patients. Collectively, our findings identify caveats intrinsic to the common practice of protein standardization in biomarker discovery studies conducted on urine, particularly in patients with hematuria.
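For readers unfamiliar with biomarker standardization, the three denominators compared above amount to simple ratios of the raw biomarker concentration to urinary creatinine, osmolarity or total protein. The sketch below uses the study's median denominator values with an invented IL-8 reading; the authors' exact normalisation formulas are not given in the abstract, so this is only an illustration of the arithmetic.

```python
# Raw biomarker reading and the three candidate denominators for one urine sample
raw_il8 = 42.0        # pg/mL -- invented value
creatinine = 9350.0   # µmol/L (the study's median)
osmolarity = 528.0    # mOsm  (the study's median)
protein = 0.1155      # mg/mL (the study's median)

standardised = {
    "per_creatinine": raw_il8 / creatinine,
    "per_osmolarity": raw_il8 / osmolarity,
    "per_protein":    raw_il8 / protein,   # the standard the study found attenuated AUROCs
}
print(standardised)
```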

12.
The extracellular domain of the HER-2 (ERBB2) oncoprotein (p105HER-2, ECD HER-2) is shed into the serum and can be detected by immunoassay. The currently approved cutoff for an elevated HER-2 ECD is greater than 15 ng/mL. HER-2 ECD is attractive as a potentially useful serum biomarker. In metastatic breast cancer, serum assay of HER-2 ECD is recommended to assess HER-2 status when it is unknown in the primary tumor and a metastatic biopsy is not feasible. In that case, it has been recommended to consider a 50 ng/mL serum HER-2 ECD cutoff as a criterion to select patients for HER-2 targeted therapy. It might offer additional prognostic or predictive value compared with HER-2 tumor tissue testing. In addition, it can be measured serially and might be able to monitor treatment response, predict relapse or provide a real-time assessment of HER-2 status in metastatic disease.

13.
The phases of bone marrow transplantation can be identified as the pre-transplant period, the immediate post-transplant period and the late post-transplant period. The pre-transplant period is characterized by identification of the appropriate type of transplant to be done and, if necessary, finding an appropriate donor; entry of the patient into the transplant unit; administration of the preparative chemotherapy/irradiation regimen; management of early toxicities; and pre-transplant supportive care. Nurses play an integral role during the entire transplant process. During the pre-transplant phase, nursing expertise is exemplified in the administration of chemotherapy, management of side effects, teaching of transplant procedures to patient and family, and supportive care. This paper reviews the patient care issues during the pre-transplant phase of bone marrow transplantation and identifies nursing management strategies.

14.

Background

Telomeres are involved in cellular ageing and shorten with increasing age. If telomere length is a valuable biomarker of ageing, then telomere shortening should be associated with worse physical performance, an ageing trait, but evidence for such an association is lacking. The purpose of this study was to examine whether change in telomere length is associated with physical performance.

Methods

Using data from four UK adult cohorts (ages 53–80 years at baseline), we undertook cross-sectional and longitudinal analyses. We analysed each study separately and then used meta-analytic methods to pool the results. Physical performance was measured using walking and chair rise speed, standing balance time and grip strength. Telomere length was measured by quantitative real-time polymerase chain reaction (PCR) in whole blood at baseline and follow-up (time 1, time 2).

Results

Total sample sizes in the meta-analyses ranged from 1,217 to 3,707. There was little evidence that telomere length was associated with walking speed, balance or grip strength, though weak associations were seen with chair rise speed and grip strength at baseline (p = 0.02 and 0.01 respectively). Faster chair rise speed at follow-up was associated with a smaller decline in telomere length between time 1 and time 2 (standardised coefficient per SD increase 0.061, 95% CI 0.006, 0.115, p = 0.03), but this was consistent with chance (p = 0.08) after further adjustment.

Conclusions

Whereas shortening of leukocyte telomeres might be an important measure of cellular ageing, there is little evidence that it is a strong biomarker for physical performance.

15.

Aim

The selection criteria for patients with hepatocellular carcinoma (HCC) to undergo liver transplantation should accurately predict posttransplant recurrence while not denying potential beneficiaries. In the present study, we attempted to identify risk factors associated with posttransplant recurrence and to expand the selection criteria.

Patients and Methods

Adult patients with HCC who underwent liver transplantation between November 2004 and September 2012 at our centre were recruited into the current study (N = 241). Clinical and pathological data were retrospectively reviewed. Patients who died during the perioperative period or died of non-recurrence causes were excluded from this study (N = 25). All potential risk factors were analysed using uni- and multi-variate analyses.

Results

Sixty-one of the 216 qualifying recipients developed recurrence. Similar recurrence-free and long-term survival rates were observed between living donor liver transplant recipients (N = 60) and deceased donor liver transplant recipients (N = 156). Total tumour volume (TTV) and preoperative percentage of lymphocytes (L%) were two independent risk factors in the multivariate analysis. We propose a prognostic score model based on these two risk factors. Patients within our criteria achieved recurrence-free survival similar to that of patients within the Milan criteria. Seventy-one patients who were beyond the Milan criteria but within our criteria also had survival comparable to patients within the Milan criteria.
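The abstract proposes a two-factor prognostic score built from TTV and L% but does not report its coefficients or cut-off, so the following Python sketch is purely hypothetical: the weights, signs and threshold are invented to show the general shape of such a score, not the authors' model.

```python
# Hypothetical two-factor prognostic score: all numbers below are invented.
def prognostic_score(ttv_cm3: float, lymphocyte_pct: float) -> float:
    """Higher tumour volume and lower lymphocyte percentage raise the score."""
    return 0.01 * ttv_cm3 - 0.05 * lymphocyte_pct

def within_expanded_criteria(ttv_cm3: float, lymphocyte_pct: float,
                             cutoff: float = 0.0) -> bool:
    """Candidate is within the (hypothetical) expanded criteria if the score is below the cut-off."""
    return prognostic_score(ttv_cm3, lymphocyte_pct) <= cutoff

print(within_expanded_criteria(115.0, 22.0))   # example candidate with placeholder values
```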

Conclusions

TTV and L% are two risk factors that contribute to posttransplant recurrence. The selection criteria proposed in our study, based on these two factors, expand the Milan criteria without increasing the risk of posttransplant recurrence.

16.

Objective

The current study was designed to evaluate the sensitivity, feasibility and effectiveness of the pallidal index (PI) as a biomarker of brain manganese (Mn) accumulation, which could serve as an early diagnostic criterion for Mn neurotoxicity.

Methods

The weighted mean difference (WMD) of the PI between control and Mn-exposed groups was estimated using a random-effects or fixed-effects meta-analysis with 95% confidence intervals (CIs), performed in STATA software version 12.1. The R package “metacor” was used to estimate correlation coefficients between PI and blood Mn (MnB).

Results

A total of eight studies, covering 281 occupationally Mn-exposed workers, met the inclusion criteria, and their results were pooled in the meta-analysis using a random-effects model. The PI of the exposed group was significantly higher than that of the control group (WMD: 7.76; 95% CI: 4.86, 10.65; I2 = 85.7%, p<0.0001). These findings were remarkably robust in the sensitivity analysis, although publication bias was detected among the included studies. Seven of the eight studies reported Pearson correlation (r) values, and a significantly positive correlation between PI and MnB was observed (r = 0.42; 95% CI: 0.31, 0.52).

Conclusions

The PI can be considered a sensitive, feasible, effective and semi-quantitative index for evaluating brain Mn accumulation. MnB may also augment the evaluation of brain Mn accumulation levels in the near future. However, the results should be interpreted with caution.

17.

Background

Hyponatremia is the most common electrolyte disorder in clinical practice, and evidence to date indicates that severe hyponatremia is associated with increased morbidity and mortality. The aim of our study was to perform a meta-analysis that included the published studies that compared mortality rates in subjects with or without hyponatremia of any degree.

Methods and Findings

An extensive Medline, Embase and Cochrane search was performed to retrieve the studies published up to October 1st 2012, using the following words: “hyponatremia” and “mortality”. Eighty-one studies satisfied the inclusion criteria, encompassing a total of 850,222 patients, of whom 17.4% were hyponatremic. The identification of relevant abstracts, the selection of studies and the subsequent data extraction were performed independently by two of the authors, and conflicts were resolved by a third investigator. Across all 81 studies, hyponatremia was significantly associated with an increased risk of overall mortality (RR = 2.60 [2.31–2.93]). Hyponatremia was also associated with an increased risk of mortality in patients with myocardial infarction (RR = 2.83 [2.23–3.58]), heart failure (RR = 2.47 [2.09–2.92]), cirrhosis (RR = 3.34 [1.91–5.83]), pulmonary infections (RR = 2.49 [1.44–4.30]), mixed diseases (RR = 2.59 [1.97–3.40]), and in hospitalized patients (RR = 2.48 [2.09–2.95]). A mean difference in serum [Na+] of 4.8 mmol/L was found between subjects who died and survivors (130.1±5.6 vs 134.9±5.1 mmol/L). A meta-regression analysis showed that the hyponatremia-related risk of overall mortality was inversely correlated with serum [Na+]. This association was confirmed in a multiple regression model after adjusting for age, gender, and diabetes mellitus as an associated morbidity.
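The meta-regression described above (hyponatremia-related mortality risk versus serum [Na+]) is essentially a weighted regression of study-level log relative risks on mean serum sodium. The sketch below uses five invented study summaries and inverse-variance weights; it illustrates the technique, not the published analysis of the 81 studies.

```python
import numpy as np
import statsmodels.api as sm

# Study-level summaries -- invented placeholders, not the published data
log_rr = np.log(np.array([3.1, 2.8, 2.4, 2.1, 1.9]))      # mortality RR per study
mean_na = np.array([126.0, 128.0, 130.0, 132.0, 133.0])    # mean serum [Na+], mmol/L
se = np.array([0.20, 0.18, 0.15, 0.17, 0.16])              # SE of each log RR

X = sm.add_constant(mean_na)
fit = sm.WLS(log_rr, X, weights=1.0 / se**2).fit()
print(fit.params)   # a negative slope would mean higher [Na+] is linked to lower mortality risk
```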

Conclusions

This meta-analysis shows for the first time that even a moderate serum [Na+] decrease is associated with an increased risk of mortality in commonly observed clinical conditions across large numbers of patients.

18.
19.
Bone is a dynamically remodeled tissue that requires gravity-mediated mechanical stimulation for maintenance of mineral content and structure. Homeostasis in bone occurs through a balance in the activities and signaling of osteoclasts, osteoblasts and osteocytes, as well as the proliferation and differentiation of their stem cell progenitors. Microgravity and unloading are known to cause osteoclast-mediated bone resorption; however, we hypothesize that osteocytic osteolysis and cell cycle arrest during osteogenesis may also contribute to bone loss in space. To test this possibility, we exposed 16-week-old female C57BL/6J mice (n = 8) to microgravity for 15 days on the STS-131 space shuttle mission. Analysis of the pelvis by µCT showed decreases of 6.29% in bone volume fraction (BV/TV) and 11.91% in bone thickness. TRAP-positive osteoclast-covered trabecular bone surfaces also increased in microgravity by 170% (p = 0.004), indicating osteoclastic bone degeneration. High-resolution X-ray nanoCT studies revealed signs of lacunar osteolysis, including increases in cross-sectional area (+17%, p = 0.022), perimeter (+14%, p = 0.008) and canalicular diameter (+6%, p = 0.037). Expression of matrix metalloproteinases (MMP) 1, 3 and 10 in bone, as measured by RT-qPCR, was also up-regulated in microgravity (+12.94-, +2.98- and +16.85-fold respectively, p<0.01), with MMP10 localized to osteocytes, consistent with induction of osteocytic osteolysis. Furthermore, expression of CDKN1a/p21 in bone increased 3.31-fold (p<0.01) and was localized to osteoblasts, possibly inhibiting the cell cycle during tissue regeneration as well as conferring apoptosis resistance to these cells. Finally, the apoptosis inducer Trp53 was down-regulated 1.54-fold (p<0.01), possibly reflecting the quiescent, survival-promoting function of CDKN1a/p21. In conclusion, our findings identify the pelvic and femoral region of the mouse skeleton as an active site of rapid bone loss in microgravity and indicate that this loss is not limited to osteoclastic degradation. Therefore, this study offers new evidence for microgravity-induced osteocytic osteolysis and CDKN1a/p21-mediated osteogenic cell cycle arrest.

20.
Aminoacyl-tRNA synthetases (ARSs) are responsible for cellular protein synthesis and have additional domains that function in a versatile manner beyond translation. Eight core ARSs (EPRS, MRS, QRS, RRS, IRS, LRS, KRS, DRS), combined with three nonenzymatic components, form a complex known as the multisynthetase complex (MSC). We hypothesized that single-nucleotide polymorphisms (SNPs) in the eight core ARS coding genes might influence susceptibility to sporadic congenital heart disease (CHD). Thus, we conducted a case-control study of 984 CHD cases and 2953 non-CHD controls in the Chinese Han population to evaluate the associations of 16 potentially functional SNPs within the eight ARS coding genes with the risk of CHD. We observed significant associations with the risk of CHD for rs1061248 [G/A; odds ratio (OR) = 0.90, 95% confidence interval (CI) = 0.81–0.99; P = 3.81×10−2], rs2230301 [A/C; OR = 0.73, 95% CI = 0.60–0.90, P = 3.81×10−2], rs1061160 [G/A; OR = 1.18, 95% CI = 1.06–1.31; P = 3.53×10−3] and rs5030754 [G/A; OR = 1.39, 95% CI = 1.11–1.75; P = 4.47×10−3] of the EPRS gene. After correction for multiple comparisons, rs1061248 conferred no predisposition to CHD. Additionally, a combined analysis showed a significant dose-response effect on CHD risk among individuals carrying different numbers of risk alleles (P trend = 5.00×10−4). Compared with individuals carrying “0–2” risk alleles, those carrying “3”, “4” or “5 or more” risk alleles had a 0.97-, 1.25- or 1.38-fold increased risk of CHD, respectively. These findings indicate that genetic variants of the EPRS gene may influence individual susceptibility to CHD in the Chinese Han population.

