Similar Literature
20 similar documents were retrieved (search time: 46 ms).
1.
Background

Although laparoscopy is widely used in gastrointestinal surgery and is generally accepted as the superior technique, conventional radical resection of proximal gastric cancer is even more risky when performed laparoscopically. This paper explores the feasibility of laparoscopic spleen-preserving splenic hilar lymph node dissection using a retro-pancreatic approach for the treatment of proximal gastric cancer.

Methods

Two cadavers were dissected to examine the pre-pancreatic and retro-pancreatic spaces. Following the cadaveric dissections, ten patients with proximal gastric cancer treated between May 2008 and May 2013 at Nanfang Hospital, Guangzhou, China, underwent total gastrectomy with splenic hilar lymph node dissection via the pre-pancreatic and retro-pancreatic approaches, with preservation of the pancreas and spleen. The clinicopathologic characteristics, as well as the intraoperative and postoperative variables affecting the procedure, were recorded and analyzed.

Results

Dissection of the spaces anterior and posterior to the pancreas in the two cadavers demonstrated the feasibility of the pre-pancreatic and retro-pancreatic approaches. All surgeries were completed laparoscopically; conversion to laparotomy was not necessary in any of the ten patients. The mean operative time was 243.6 ± 45 min, and the mean estimated blood loss was 232 ± 80 ml. At follow-up (median 12 months post-surgery), there had been neither local recurrence nor mortality in any of the patients.

Conclusion

Laparoscopic spleen- and pancreas-preserving splenic hilar lymph node dissection during total gastrectomy, using both the pre-pancreatic and retro-pancreatic approaches, appears to be a safe and feasible method for the treatment of proximal gastric cancer.


2.
3.
The principle of quality by design (QbD) has been widely applied to biopharmaceutical manufacturing processes. Process characterization is an essential step in implementing the QbD concept to establish the design space and to define the proven acceptable ranges (PARs) for critical process parameters (CPPs). In this study, we present the characterization of a Saccharomyces cerevisiae fermentation process using risk assessment analysis, statistical design of experiments (DoE), and a multivariate Bayesian predictive approach. The critical quality attributes (CQAs) and CPPs were identified through a risk assessment. The statistical model for each attribute was established using the results of the DoE study, with consideration given to interactions between CPPs. Both the conventional overlapping contour plot and the multivariate Bayesian predictive approach were used to establish the region of process operating conditions where all attributes meet their specifications simultaneously. The quantitative Bayesian predictive approach was chosen to define the PARs for the CPPs, which are applied in the manufacturing control strategy. Experience from process validation at the 10,000 L manufacturing scale, including 64 continued process verification batches, indicates that the CPPs remain in a state of control and within the established PARs. The end-product quality attributes were within their drug substance specifications. The probability generated with the Bayesian approach was also used as a tool to assess CPP deviations. This approach can be extended to the characterization of other production processes and the quantification of reliable operating regions. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:799–812, 2016
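
The sketch below illustrates the kind of computation described, not the authors' actual model: a Monte Carlo estimate of the joint probability that hypothetical attributes meet hypothetical specifications across a grid of two CPPs, from which a proven acceptable region could be read off. All coefficients, specifications, and uncertainties are assumptions.

```python
# Minimal sketch (not the authors' model): Monte Carlo estimate of the joint
# probability that two hypothetical CQAs meet spec across a grid of two CPPs.
# Coefficients, specs, and uncertainties below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def draw_attribute(x1, x2, coef_mean, coef_sd, resid_sd, n=2000):
    # Sample response-surface coefficients (approximate posterior) and residual noise.
    b = rng.normal(coef_mean, coef_sd, size=(n, len(coef_mean)))
    mean = b[:, 0] + b[:, 1] * x1 + b[:, 2] * x2 + b[:, 3] * x1 * x2
    return mean + rng.normal(0.0, resid_sd, size=n)

# Grid of coded CPP settings (e.g., temperature and pH, scaled to [-1, 1]).
grid = np.linspace(-1, 1, 21)
prob_ok = np.zeros((grid.size, grid.size))

for i, x1 in enumerate(grid):
    for j, x2 in enumerate(grid):
        titer = draw_attribute(x1, x2, [2.0, 0.3, 0.2, -0.1], [0.05] * 4, 0.15)
        purity = draw_attribute(x1, x2, [97.0, -0.5, 0.8, 0.2], [0.1] * 4, 0.4)
        meets = (titer >= 1.8) & (purity >= 96.0)          # hypothetical specs
        prob_ok[i, j] = meets.mean()

# A proven acceptable region could then be defined where the joint probability
# of meeting all specifications exceeds a chosen threshold (e.g., 0.90).
par_mask = prob_ok >= 0.90
print(f"{par_mask.mean():.0%} of the grid meets the 0.90 probability criterion")
```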

4.

In this paper, a graphene-based tunable multi-band terahertz absorber is proposed and numerically investigated. The proposed absorber achieves perfect absorption within both sharp and ultra-broadband absorption spectra. This wide absorption range is obtained through a unique combination of periodically arranged cross- and square-shaped dielectrics sandwiched between two graphene sheets; the double graphene layer offers more absorption than traditional single-layer graphene structures. This top layer is mounted on a gold plate, separated by a Topas layer with zero volume loss. Furthermore, in our approach, we investigated changing the shapes and sizes of the dielectric layers instead of the geometry of the graphene layers, in order to alleviate edge effects and manufacturing complications. In the numerical simulations, parameters such as the graphene Fermi energy and the dimensions of the proposed dielectric layout were optimally tuned to reach perfect absorption. We compared the performance of our dielectric layout, called the fishnet, with two widely investigated dielectric layouts from the literature (namely, cross-shaped and frame-and-square). Our results demonstrate two absorption bands with near-unity absorbance at 1.6–2.3 and 4.2–4.9 THz, with absorption efficiencies of 98% at 1.96 and 4.62 THz, respectively. Moreover, broadband absorption is observed in the 7.77–9.78 THz range, with an absorption efficiency of 99.6% attained between 8.44 and 9.11 THz. Finally, the tunability of the absorber's three operating bands makes it a strong candidate for integration into terahertz optoelectronic devices.
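
As a small illustration of the Fermi-energy tuning referred to above (not the authors' simulation), the sketch below evaluates the standard Kubo intraband (Drude-like) expression for graphene's sheet conductivity at a few Fermi energies and frequencies; the relaxation time and temperature are assumed values.

```python
# Illustrative sketch: intraband (Drude-like) graphene surface conductivity vs.
# Fermi energy, the tuning knob referred to in the abstract. Uses the standard
# Kubo intraband expression; relaxation time and temperature are assumed values.
import numpy as np

e = 1.602176634e-19        # elementary charge [C]
hbar = 1.054571817e-34     # reduced Planck constant [J s]
kB = 1.380649e-23          # Boltzmann constant [J/K]

def sigma_intra(omega, Ef_eV, tau=1e-13, T=300.0):
    """Intraband graphene sheet conductivity [S] at angular frequency omega."""
    Ef = Ef_eV * e
    prefactor = 2 * e**2 * kB * T / (np.pi * hbar**2)
    return prefactor * np.log(2 * np.cosh(Ef / (2 * kB * T))) * 1j / (omega + 1j / tau)

freqs_THz = np.array([1.6, 2.3, 4.2, 4.9, 8.44])
for Ef in (0.3, 0.6, 0.9):                      # example Fermi energies [eV]
    sigma = sigma_intra(2 * np.pi * freqs_THz * 1e12, Ef)
    print(f"Ef = {Ef:.1f} eV -> Re(sigma) [mS] at {freqs_THz} THz:", np.round(sigma.real * 1e3, 3))
```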


5.
In this work we consider the problem of selecting a set of patients from a given waiting list of elective patients and assigning them to a set of available operating room blocks. We assume a block scheduling strategy in which the number and the length of the available blocks are given. As each block is associated with a specific day, assigning a patient to a block also fixes his/her surgery date. Each patient is characterized by a recommended maximum waiting time and an uncertain surgery duration. In practical applications, new patients enter the waiting list continuously, and surgery departments perform patient selection and assignment on a regular short-term basis, usually weekly. We propose a rolling-horizon approach for patient selection and assignment. At each iteration, the short-term patient assignment is decided; however, in a look-ahead perspective, a longer planning horizon is considered for patient selection. The mid-term assignment over the next n weeks is generated by solving an ILP problem that minimizes a penalty function based on the total waiting time and tardiness of patients. The approach is applied iteratively by shifting the mid-term planning horizon ahead. When the first-week solution is implemented, unpredictable extensions of surgeries may disrupt the schedule. Such disruptions are recovered in the next iteration: the mid-term solution is rescheduled while limiting the number of deviations from the previously computed plan. The approach also accommodates new patient arrivals. To limit the number of disruptions due to uncertain surgery durations, we also propose a robust formulation of the ILP problem. The frameworks based on the deterministic and robust formulations are compared over a set of instances, including different stochastic realizations of surgery times.
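
A minimal one-iteration sketch of an assignment ILP of this flavour is shown below, using PuLP with hypothetical patients, blocks, durations, and penalty weights; it is not the authors' exact formulation (no rolling horizon, robustness, or rescheduling step).

```python
# Minimal one-iteration sketch (hypothetical data, not the authors' exact model):
# assign waiting-list patients to OR blocks, minimizing a penalty on waiting time
# and tardiness versus each patient's maximum recommended waiting time.
import pulp

# (expected duration [min], days already waited, max recommended wait [days])
patients = {"p1": (90, 20, 30), "p2": (150, 5, 60), "p3": (60, 45, 30), "p4": (120, 10, 90)}
blocks = {"mon": 240, "wed": 240, "fri": 180}          # block capacities [min]
block_day = {"mon": 1, "wed": 3, "fri": 5}             # surgery day offset within the week

prob = pulp.LpProblem("patient_assignment", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", (patients, blocks), cat=pulp.LpBinary)

def penalty(p, b):
    dur, waited, max_wait = patients[p]
    wait = waited + block_day[b]
    return wait + 5 * max(0, wait - max_wait)          # tardiness weighted more heavily

# Patients left unassigned incur the penalty of waiting one more full week.
unassigned_pen = {p: patients[p][1] + 7 + 5 * max(0, patients[p][1] + 7 - patients[p][2])
                  for p in patients}

prob += pulp.lpSum(penalty(p, b) * x[p][b] for p in patients for b in blocks) + \
        pulp.lpSum(unassigned_pen[p] * (1 - pulp.lpSum(x[p][b] for b in blocks)) for p in patients)

for p in patients:                                      # each patient scheduled at most once
    prob += pulp.lpSum(x[p][b] for b in blocks) <= 1
for b in blocks:                                        # block capacity
    prob += pulp.lpSum(patients[p][0] * x[p][b] for p in patients) <= blocks[b]

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for p in patients:
    for b in blocks:
        if x[p][b].value() > 0.5:
            print(f"{p} -> {b}")
```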

6.
Abstract

5-Carboxy-2′-deoxyuridine is a methyl oxidation product of thymidine. It can be formed by the menadione-mediated photosensitization of thymidine in aerated aqueous solution. Herein we present a new four-step synthesis of the 5-carboxy-2′-deoxyuridine phosphoramidite building block based on the alkaline hydrolysis of 5-trifluoromethyl-2′-deoxyuridine. The phosphoramidite derivative has been incorporated at defined sites into oligonucleotides using solid-phase synthesis.

7.
Purpose

In response to increasing concerns about environmental conservation and energy saving, manufacturers are increasingly keen to demonstrate the ‘green’ performance of their products. Some qualitative eco-design tools are used to support the development of greener products; however, most of these tools require subjective judgement during the evaluation process. This paper therefore proposes an alternative approach that is objective, systematic and efficient, integrating ant colony optimization (ACO) and life cycle assessment (LCA) to facilitate the decision-making process.

Methods

The proposed integrative LCA-ACO approach aims to support the simultaneous, thorough evaluation of multiple design options, yielding a sequence of options with the lowest corresponding environmental impact value. A case application covering various design combinations is presented to demonstrate the applicability of the proposed approach.

Results and discussion

The proposed approach offers decision makers a preliminary fast-track method for screening decisions without the lengthy processes of full LCA studies. It helps decision makers, especially during the early design selection stages, identify the most appropriate design combination from an environmental perspective, and it represents a useful contribution to the fields of LCA and green product design.

Conclusions

Since full-scale LCA studies require significant effort in data collection and expert interpretation of results, conducting a full-scale LCA during early product development would be time consuming and costly. The proposed approach offers a more convenient way for decision makers to assess multiple design options with respect to environmental considerations. The case example presented in this paper demonstrates the practicality of the proposed approach.
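
A toy sketch of the underlying optimization idea is given below: an ant colony picks one option per design decision so as to minimize a summed environmental impact score. The additive impact table, parameter values, and decision names are assumptions; a real LCA-ACO coupling would replace the toy table with characterized LCA results per option.

```python
# Minimal ACO sketch under simplifying assumptions: each design decision has a few
# options, each option carries an additive (hypothetical) impact score, and ants
# build one combination by choosing one option per decision.
import random

impact = [  # impact[decision][option] -> hypothetical impact score
    [5.0, 3.5, 4.2],      # e.g., material choice
    [2.0, 2.8],           # e.g., joining method
    [1.5, 1.0, 2.5, 0.9], # e.g., end-of-life route
]

n_ants, n_iter, rho, Q = 20, 50, 0.1, 1.0
tau = [[1.0] * len(opts) for opts in impact]            # pheromone trails

def build_solution():
    sol = []
    for d, opts in enumerate(impact):
        # Selection probability ~ pheromone * heuristic desirability (1 / impact).
        weights = [tau[d][o] * (1.0 / opts[o]) for o in range(len(opts))]
        sol.append(random.choices(range(len(opts)), weights=weights)[0])
    return sol

best, best_cost = None, float("inf")
for _ in range(n_iter):
    for sol in (build_solution() for _ in range(n_ants)):
        cost = sum(impact[d][o] for d, o in enumerate(sol))
        if cost < best_cost:
            best, best_cost = sol, cost
    tau = [[(1 - rho) * t for t in row] for row in tau]  # evaporation
    for d, o in enumerate(best):                          # reinforce best-so-far solution
        tau[d][o] += Q / best_cost

print("best combination:", best, "total impact:", best_cost)
```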


8.
ABSTRACT

Large-scale disasters cause enormous damage to people living in the affected areas, and providing relief quickly to those affected is critical to recovery. Pre-disaster planning plays an important role in reducing the arrival time of relief items to the affected areas and in allocating them efficiently. In this study, a mixed integer programming model is proposed to pre-position warehouses throughout a potentially affected area and to determine the amount of relief items to be held in those warehouses. The objective is to minimize the time between the onset of the disaster and the arrival of relief items at the affected areas. In addition, using probabilistic constraints, the model ensures that relief items arrive at the affected areas within a certain time window with a specified reliability. Considering the unstable fault lines on which Istanbul is located, the proposed model is applied to the Istanbul case to pre-position warehouses ahead of an expected large-scale earthquake.
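
The sketch below illustrates only the reliability (chance-constraint) idea: for a candidate warehouse, estimate by Monte Carlo the probability that relief reaches each affected area within the time window and compare it with the required reliability. The lognormal travel-time parameters are assumptions, not the paper's data or model.

```python
# Illustrative sketch of the reliability (chance) constraint idea: for a candidate
# warehouse, estimate the probability that relief reaches each affected area within
# the time window, and check it against the required reliability level.
import numpy as np

rng = np.random.default_rng(1)
time_window_h = 12.0          # relief must arrive within 12 hours (assumed)
required_reliability = 0.95   # required probability of on-time arrival (assumed)

# Post-earthquake travel times (hours) from the candidate warehouse to each area,
# modeled as lognormal to reflect congestion and road damage (assumed parameters).
areas = {"A": (np.log(6.0), 0.35), "B": (np.log(9.0), 0.45), "C": (np.log(11.0), 0.50)}

n = 100_000
for area, (mu, sigma) in areas.items():
    travel = rng.lognormal(mu, sigma, size=n)
    p_on_time = np.mean(travel <= time_window_h)
    ok = "satisfies" if p_on_time >= required_reliability else "violates"
    print(f"area {area}: P(arrival <= {time_window_h:.0f} h) = {p_on_time:.3f} -> {ok} the constraint")
```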

9.
Background

We study the adaptation of the Link Grammar Parser to the biomedical sublanguage, with a focus on domain terms not found in a general parser lexicon. Using two biomedical corpora, we implement and evaluate three approaches to addressing unknown words: automatic lexicon expansion, the use of morphological clues, and disambiguation using a part-of-speech tagger. We evaluate each approach separately for its effect on parsing performance and also consider combinations of these approaches.

Results

In addition to a 45% increase in parsing efficiency, we find that the best approach, incorporating information from a domain part-of-speech tagger, offers a statistically significant 10% relative decrease in error.

Conclusion

When available, a high-quality domain part-of-speech tagger is the best solution to unknown word issues in the domain adaptation of a general parser. In the absence of such a resource, surface clues can provide remarkably good coverage and performance when tuned to the domain. The adapted parser is available under an open-source license.
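
A toy sketch of the "surface clues" idea mentioned above is given below: guess a coarse category for an out-of-vocabulary biomedical token from suffixes and character patterns before handing it to the parser. The suffix rules and example lexicon are illustrative assumptions, not the rules or lexicon used in the study.

```python
# Toy sketch of a suffix/pattern-based guesser for unknown biomedical tokens.
# The rules below are illustrative assumptions, not the study's actual resources.
import re

SUFFIX_RULES = [
    (("-ase", "-in", "-ose", "-ol", "-ide", "-ine"), "noun"),       # enzymes, proteins, compounds
    (("-ated", "-ylated", "-ized"), "verb_past_participle"),
    (("-ic", "-al", "-ous", "-genic"), "adjective"),
]

def guess_category(token: str, lexicon: set) -> str:
    if token.lower() in lexicon:
        return "known"
    if re.fullmatch(r"[A-Z0-9\-]{2,}", token):                      # e.g., gene/protein symbols
        return "noun"
    for suffixes, category in SUFFIX_RULES:
        if token.lower().endswith(tuple(s.lstrip("-") for s in suffixes)):
            return category
    return "noun"                                                   # default fallback for the domain

lexicon = {"protein", "binds", "the"}
for tok in ["kinase", "phosphorylated", "mitogenic", "IL-2", "binds"]:
    print(tok, "->", guess_category(tok, lexicon))
```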


10.
Purpose

Composites consist of at least two merged materials. Separating these components for recycling is typically an energy-intensive process with potentially significant impacts on the components’ quality. The purpose of this article is to suggest how allocation for the recycling of products manufactured from composites can be handled in life cycle assessment so as to account for the recycling process and the associated quality degradation of the different composite components, and to describe the challenges involved.

Method

Three prominent recycling allocation approaches were selected from the literature: the cut-off approach, the end-of-life recycling approach with quality-adjusted substitution, and the circular footprint formula. The allocation approaches were adapted to allocate impacts by conceptualizing composite material recycling as a separation process with subsequent recycling of the recovered components, allowing the quality changes in each individual component to be modeled separately. The adapted allocation approaches were then applied in a case study assessing the cradle-to-grave climate impact and energy use of a fictitious product made from a composite material that at the end of life is recycled through grinding, pyrolysis, or supercritical water treatment. Finally, the experiences and results from applying the allocation approaches were analyzed with regard to the incentives they provide and the challenges they entail.

Results and discussion

Modeling the composite as at least two separate materials rather than one helped to clarify the incentives provided by each allocation approach. When the product is produced from primary materials, the cut-off approach gives no incentive to recycle, whereas the end-of-life recycling approach and the circular footprint formula give incentives to recycle and to recover materials of high quality. Each allocation approach comes with inherent challenges, especially when knowledge of future systems is limited, as in prospective studies. This challenge is most evident for the circular footprint formula, for example with regard to the supply and demand balance.

Conclusions

We recommend modeling the composite materials in products as separate, individual materials. This proved useful for capturing changes in quality, trade-offs between recovering high-quality materials and the environmental impact of the recycling system, and the incentives the different approaches provide. The cut-off and end-of-life recycling approaches can both be used in prospective studies, whereas the circular footprint formula should be avoided when no market for the secondary material is established.
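
The simplified numeric sketch below contrasts the cut-off approach with end-of-life recycling using a quality-adjusted substitution credit for one recovered composite component. The impact numbers, quality ratios, and their mapping to recycling routes are illustrative assumptions, not results from the case study, and the circular footprint formula is omitted.

```python
# Simplified numeric sketch (illustrative assumptions, not the case-study results):
# end-of-life burden of one recovered composite component under the cut-off approach
# versus end-of-life recycling with a quality-adjusted substitution credit.
def cut_off(recycling_impact: float) -> float:
    # Cut-off: the studied product carries no recycling burden or credit; recycled
    # material enters the next product system burden-free.
    return 0.0

def eol_substitution(recycling_impact: float, primary_impact: float,
                     quality_ratio: float) -> float:
    # End-of-life recycling: the product carries the recycling impact but receives a
    # credit for displaced primary production, scaled by the quality ratio Qs/Qp.
    return recycling_impact - quality_ratio * primary_impact

recycling_impact = 1.2   # kg CO2-eq per kg recovered fibre (assumed)
primary_impact = 20.0    # kg CO2-eq per kg primary fibre displaced (assumed)

for q in (1.0, 0.7, 0.3):  # assumed quality retention for different recycling routes
    print(f"Qs/Qp = {q:.1f}: cut-off = {cut_off(recycling_impact):.1f}, "
          f"EoL substitution = {eol_substitution(recycling_impact, primary_impact, q):+.1f} kg CO2-eq")
```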


11.

Graphene can be used as a tunable material across a wide range of infrared wavelengths owing to its tunable conductivity. In this paper, we use a Y-shaped silver resonator placed on top of a multilayer graphene–silica structure to realize perfect absorption in the infrared wavelength region. We propose four different designs obtained by placing the graphene sheet over the silica layers. The absorption and reflectance performance of the structures is explored over the 1500–1600 nm wavelength range. The proposed design also exhibits absorption tunability for different values of the graphene chemical potential. We report the negative impedance associated with perfect absorption for the proposed metamaterial absorber structures. All the metamaterial absorbers exhibit absorption peaks of 99% in the infrared wavelength region. These designs can be used as tunable absorbers for narrowband and wideband applications, and can serve as basic building blocks of larger photonic designs applicable to polariser, sensor, and solar applications.


12.
Information Quality (IQ) is a critical factor for the success of many activities in the information age, including the development of data warehouses and implementation of data mining. The issue of IQ risk is recognized during the process of data mining; however, there is no formal methodological approach to dealing with such issues.

Consequently, it is essential to measure IQ risk in a data warehouse to ensure success in implementing data mining. This article presents a methodology to determine three IQ risk characteristics: accuracy, comprehensiveness, and non-membership. The methodology provides a set of quantitative models to examine how the quality risks of source information affect the quality of information outputs produced using the relational algebra operations restriction, projection, and cubic product. It can be used to determine how quality risks associated with diverse data sources affect the derived data. The study also develops a data cube model and an associated algebra to support IQ risk operations.
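
The toy sketch below illustrates the general flavour of such propagation rules, assuming independent errors; it is not the article's quantitative models, only an indication of how a per-tuple accuracy risk might flow through the three operations named above.

```python
# Toy illustration (not the article's models): propagating an "accuracy risk"
# (probability that a tuple's relevant attributes are erroneous) through three
# relational operations, assuming independence of errors between sources.
def restriction_risk(source_risk: float) -> float:
    # Selecting a subset of tuples does not change per-tuple accuracy risk.
    return source_risk

def projection_risk(attr_risks) -> float:
    # A projected tuple is accurate only if every retained attribute is accurate.
    ok = 1.0
    for r in attr_risks:
        ok *= (1.0 - r)
    return 1.0 - ok

def product_risk(risk_a: float, risk_b: float) -> float:
    # A combined tuple (e.g., cubic product) is accurate only if both sources are.
    return 1.0 - (1.0 - risk_a) * (1.0 - risk_b)

print("restriction:", restriction_risk(0.05))
print("projection :", round(projection_risk([0.02, 0.03, 0.01]), 4))
print("product    :", round(product_risk(0.05, 0.10), 4))
```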


13.

The possibility of producing plants edited in multiple genes by means of DNA-free approaches opens new perspectives for breeding purposes and for the acceptance of the resultant genotypes. In this work, we have explored polyethylene glycol (PEG)-mediated delivery of ribonucleoproteins (RNPs) into tomato protoplasts using a multiplexing approach (i.e. two genes targeted simultaneously with two sgRNAs per gene) for the first time. We have analysed the editing outcome in fully developed green calli and demonstrated that tomato protoplasts are a valid cell target for RNP-mediated multiplexed genome editing with high efficiency.


14.
Purpose

Obsolescence, as the premature end of use, increases the overall number of products produced and consumed and can thereby increase environmental impact. Measures to decrease the effects of obsolescence by altering the product or service design have the potential to increase the use time (defined as the realized active service life) of devices, but can themselves have environmental drawbacks, for example because the amount of material required for production increases. Paying special attention to methodological choices when assessing such measures and strategies using life cycle assessment (LCA) is therefore crucial.

Methods

Open questions and key aspects of obsolescence, including the analysis of its effects and preventative measures, are discussed against the backdrop of the principles and framework for LCA given in ISO 14040/44, which includes guidance on how to define a useful functional unit and reference flow in the context of real-life use time.

Results and discussion

The open, foundational requirements of ISO 14040/14044 already form an excellent basis for analysing the phenomenon of obsolescence and its environmental impact in product comparisons. However, any analysis presumes a clear definition of the goal and scope phase, with special attention paid to aspects relevant to obsolescence: the target product and user group need to be placed into context with the analysed “anti-obsolescence” measures. The reference flow needs to reflect a realized use time (and not solely a technical lifetime, when the latter is not relevant for the product under study). System boundaries and types of data also need to be chosen in the context of the anti-obsolescence measure, for example including the production of spare parts to reflect a repairable design and/or manufacturer-specific yields to reflect high-quality manufacturing.

Conclusions

Understanding the relevant obsolescence conditions for the product system under study and how these may differ across the market segment or user types is crucial for a fair and useful comparison and the evaluation of anti-obsolescence measures.
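
The small worked sketch below illustrates the reference-flow point made above: for a functional unit defined over a fixed service period, the number of devices (and hence production impacts) scales with the realized use time, not the technical lifetime. All numbers are illustrative assumptions.

```python
# Worked sketch of the reference-flow point (illustrative numbers): for a functional
# unit of "10 years of service", the number of devices required scales with the
# realized use time, not with the technical lifetime.
import math

service_period_years = 10.0
production_impact = 50.0     # kg CO2-eq per device (assumed)
use_impact_per_year = 8.0    # kg CO2-eq per year of use (assumed)

scenarios = {
    "baseline (realized use time 2.5 y)": 2.5,
    "anti-obsolescence design (realized use time 5 y)": 5.0,
    "technical lifetime, rarely realized (8 y)": 8.0,
}

for name, use_time in scenarios.items():
    devices = math.ceil(service_period_years / use_time)    # reference flow
    total = devices * production_impact + service_period_years * use_impact_per_year
    print(f"{name}: {devices} devices, {total:.0f} kg CO2-eq over {service_period_years:.0f} years")
```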


15.
To a customer, the waiting time for order processing for a product or service is important information for order placement. If the time foreseen for order fulfillment is long, the order might be lost to a competitor. In particular, modern principles of supply chain management strongly encourage information sharing between entities in the chain, and information technology has enabled customers to conveniently consider the waiting time when deciding whether to balk. To help determine the design and operation of a manufacturing or service system in which a customer may balk based on the foreseen waiting time, this paper develops procedures to estimate the average waiting time of an order. The procedures either allow the maximum waiting time for a balking decision to be random, or do not require knowledge of the arrival process of customers prior to balking when the balking limit is known. For generality of the model, this paper considers general inter-arrival and service time distributions and uses a simulation and regression approach.
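
A minimal simulation of the kind of system described is sketched below: customers observe the foreseen waiting time at a single server and balk if it exceeds their own random tolerance, and the average waiting time of accepted orders is estimated. The gamma/exponential distributions and all parameters are assumptions, not the paper's procedures.

```python
# Minimal single-server simulation sketch (assumed distributions and parameters):
# customers balk when the foreseen waiting time exceeds their random tolerance; we
# estimate the average waiting time of accepted orders, which a regression step
# could then relate to design parameters such as service capacity.
import numpy as np

rng = np.random.default_rng(2)
n_customers = 100_000
mean_interarrival, mean_service = 1.0, 0.8      # general distributions allowed; gamma assumed here
balk_tolerance_mean = 3.0                        # mean of the (random) maximum acceptable wait

arrivals = np.cumsum(rng.gamma(2.0, mean_interarrival / 2.0, n_customers))
services = rng.gamma(2.0, mean_service / 2.0, n_customers)
tolerances = rng.exponential(balk_tolerance_mean, n_customers)

server_free_at = 0.0
waits, balked = [], 0
for t, s, tol in zip(arrivals, services, tolerances):
    foreseen_wait = max(0.0, server_free_at - t)   # waiting time quoted to the customer
    if foreseen_wait > tol:
        balked += 1                                # order lost to a competitor
        continue
    waits.append(foreseen_wait)
    server_free_at = t + foreseen_wait + s

print(f"accepted orders: {len(waits)}, balked: {balked} ({balked / n_customers:.1%})")
print(f"average waiting time of accepted orders: {np.mean(waits):.3f}")
```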

16.
《Journal of Asia》2022,25(1):101868
Plodia interpunctella (Hübner) (Lepidoptera: Pyralidae) is a moth species that is able to feed on various vegetable commodities. Its control is economically critical for commercial food storage facilities such as warehouses. P. interpunctella causes quantitative and qualitative damage by eating important stored food crops such as dried Welsh onions (Allium fistulosum L.), and freezing treatment is a common method of control. To examine the effectiveness of freezing treatment, we changed the duration of the conventional freezing method. The conventional method involves treatment below −15 °C for 48 h, but we predicted that freezing at −25 °C for only 24 h would be effective. To test this prediction, we conducted an experiment using three different frozen storage containers and assessed whether the modified method was effective on the eggs and fourth-instar larvae of P. interpunctella. Despite the temporary malfunctioning of one of the containers used in the experiment, after incubation at 28 °C and 70% relative humidity for 10 days the larval mortality rate was 100% and the egg hatching rate was 0% in all samples, regardless of the treatment time. Further research is needed, as this method is expected to decrease production costs and energy consumption and has the potential to be applied to other crops and pests.

17.

The study of the human gut microbiome is essential in microbiology and infectious diseases, as specific alterations in the gut microbiome may be associated with various pathologies, such as chronic inflammatory disease, intestinal infection and colorectal cancer. To identify such dysregulations, several strategies are being used to create a repertoire of the microorganisms composing the human gut microbiome. In this study, we used the “microscomics” approach, which consists of creating an ultrastructural repertoire of all the cell-like objects composing stool samples from healthy donors using transmission electron microscopy (TEM). We used TEM to screen ultrathin sections of 8 resin-embedded stool samples. After examining hundreds of micrographs, we defined ultrastructural categories based on morphological criteria. This approach explained many inconsistencies observed with other techniques, such as metagenomics and culturomics. We highlighted the value of our culture-independent approach by comparing our microscopic images to those of cultured bacteria and those reported in the literature. This study detected “minimicrobes” of the Candidate Phyla Radiation (CPR) in human stool samples for the first time. The “microscomics” approach is not exhaustive, but it complements existing approaches and adds important data to the puzzle of the microbiota.


18.
Species Distribution Modelling (SDM) determines the habitat suitability of a species across geographic areas using macro-climatic variables; however, micro-habitats can buffer or exacerbate the influence of macro-climatic variables, requiring links between physiology and species persistence. Experimental approaches linking species physiology to micro-climate are complex, time consuming and expensive: for example, which combination of exposure time and temperature matters for a species' thermal tolerance is difficult to judge a priori. We tackled this problem using an active learning approach that utilized machine learning methods to guide thermal tolerance experimental design for three kissing-bug species, Triatoma infestans, Rhodnius prolixus, and Panstrongylus megistus (Hemiptera: Reduviidae: Triatominae), vectors of the parasite causing Chagas disease. As with other pathogen vectors, triatomines are well known to exploit micro-habitats, and the associated shift in microclimate, to enhance survival. Using a limited literature-collected dataset, our approach showed that temperature, followed by exposure time, was the strongest predictor of mortality; species played a minor role, and life stage was the least important. Further, we identified complex but biologically plausible nonlinear interactions between temperature and exposure time in shaping mortality, which together set the potential thermal limits of triatomines. These results led to the design of new experiments whose laboratory outcomes produced novel insights into the effects of temperature and exposure time on the triatomines. These results, in turn, can be used to better model the micro-climatic envelope for these species. Here we demonstrate the power of an active learning approach to exploring experimental space when designing laboratory studies that test species' thermal limits. Our analytical pipeline can be easily adapted to other systems, and we provide code to allow practitioners to perform similar analyses. Not only does our approach have the potential to save time and money: it can also increase our understanding of the links between species physiology and climate, a topic of increasing ecological importance.
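
A compact sketch of an active-learning loop of this kind is given below (synthetic placeholder data, not the triatomine dataset or the authors' pipeline): fit a mortality model on temperature and exposure time, then propose the next experiments where the model is most uncertain, here measured by disagreement among trees of a random forest.

```python
# Compact active-learning sketch (synthetic placeholder data, not the study's data):
# fit a mortality model on temperature and exposure time, then propose the next
# experiments where the model is most uncertain (tree disagreement).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)

def observed_mortality(temp_c, hours, n_insects=20):
    # Hidden "ground truth" for the demo: logistic response to thermal stress.
    stress = 0.15 * np.maximum(temp_c - 35.0, 0) * np.log1p(hours)
    p = 1 / (1 + np.exp(-(stress - 2.0)))
    return rng.binomial(n_insects, p) / n_insects

# Small initial "literature-collected" design: (temperature [C], exposure time [h]).
X = np.column_stack([rng.uniform(25, 45, 30), rng.uniform(1, 48, 30)])
y = observed_mortality(X[:, 0], X[:, 1])

# Candidate grid of possible experiments.
tt, hh = np.meshgrid(np.linspace(25, 45, 41), np.linspace(1, 48, 48))
candidates = np.column_stack([tt.ravel(), hh.ravel()])

for round_ in range(3):
    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
    per_tree = np.stack([t.predict(candidates) for t in model.estimators_])
    uncertainty = per_tree.std(axis=0)              # tree disagreement as an uncertainty proxy
    pick = np.argsort(uncertainty)[-5:]             # 5 most informative candidate experiments
    X_new = candidates[pick]
    y_new = observed_mortality(X_new[:, 0], X_new[:, 1])
    X, y = np.vstack([X, X_new]), np.concatenate([y, y_new])
    print(f"round {round_ + 1}: proposed (temp, hours) conditions\n{np.round(X_new, 1)}")

print("feature importance (temperature, exposure time):", np.round(model.feature_importances_, 2))
```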

19.
Tsetse are the insect vectors of the African trypanosomiases. As with many diseases, transmission of trypanosomiasis varies through space and time. Capturing the variation of both vector and disease has, in the past, been attempted separately in the space and time dimensions, usually using deterministic techniques. Very few efforts have used space-time covariation, and most have therefore missed any correlations that may exist between variation in these two dimensions. Here we propose two novel approaches to space-time analysis derived from space-time geostatistics in a kriging framework. The approaches were developed through analysis of a dataset recording the apparent density of Glossina palpalis gambiensis and Glossina tachinoides (Diptera: Glossinidae) at three riparian sites in Burkina Faso over 15 months between 2006 and 2007. The site is fragmented owing to human activity in the area. The first approach, Space-Time Ordinary Kriging, does not consider the effect of fragmentation; it is used as a benchmark against which to test the increased explanatory power of the second method, which does account for fragmentation. The second method, Regression Space-Time Simple Kriging, is a distinct improvement over the first because it allows for a spatial trend in the mean trap catch; this trend is related to, and later predicted from, environmental covariates. The results indicate the presence of space and time effects on tsetse distribution that depend on the size of the habitat fragmentation patches. These effects occur at relatively small geographic scales within a season. Whilst such variation has long been suspected, the new methods presented here quantify it precisely, so that seasonal and spatial comparisons can now be made both within and between species.
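
A compact sketch of space-time ordinary kriging on synthetic data is shown below, using a separable exponential space-time covariance. The ranges, sill, and toy observations are assumptions; the paper's regression space-time kriging variant would first regress the mean on environmental covariates and krige the residuals in the same way.

```python
# Compact space-time ordinary kriging sketch on synthetic data (assumed covariance
# parameters and toy observations, not the Burkina Faso dataset).
import numpy as np

def cov(d_space, d_time, sill=1.0, range_s=5.0, range_t=2.0):
    # Separable (product) exponential covariance in space and time.
    return sill * np.exp(-d_space / range_s) * np.exp(-d_time / range_t)

# Toy observations: (x [km], y [km], month, apparent density of tsetse).
obs = np.array([[0, 0, 1, 4.2], [3, 1, 1, 3.1], [1, 4, 2, 5.0],
                [6, 2, 3, 1.8], [2, 2, 4, 2.9], [5, 5, 5, 1.2]])
xy, t, z = obs[:, :2], obs[:, 2], obs[:, 3]

def st_ordinary_kriging(target_xy, target_t):
    n = len(z)
    ds = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    dt = np.abs(t[:, None] - t[None, :])
    # Ordinary kriging system with a Lagrange multiplier for the unbiasedness constraint.
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = cov(ds, dt)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = cov(np.linalg.norm(xy - target_xy, axis=1), np.abs(t - target_t))
    weights = np.linalg.solve(A, b)[:n]
    return float(weights @ z)

print("predicted apparent density at (2 km, 3 km), month 3:",
      round(st_ordinary_kriging(np.array([2.0, 3.0]), 3.0), 2))
```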

20.
Various approaches have been applied to optimize biological product fermentation processes and define the design space. In this article, we present a stepwise approach to optimizing a Saccharomyces cerevisiae fermentation process through risk assessment analysis, statistical design of experiments (DoE), and a multivariate Bayesian predictive approach. The critical process parameters (CPPs) were first identified through a risk assessment. The response surface for each attribute was modeled using the results of the DoE study, with consideration given to interactions between CPPs. A multivariate Bayesian predictive approach was then used to identify the region of process operating conditions where all attributes meet their specifications simultaneously. The model prediction was verified by twelve consistency runs in which all batches achieved a broth titer of more than 1.53 g/L and quality attributes within the expected ranges. The calculated probability was used to define the reliable operating region. To our knowledge, this is the first case study to apply the multivariate Bayesian predictive approach to process optimization in an industrial application, with corresponding verification at two different production scales. This approach can be extended to the optimization of other fermentation processes and the quantification of reliable operating regions. © 2012 American Institute of Chemical Engineers Biotechnol. Prog., 28: 1095–1105, 2012

