Similar documents
 Found 20 similar documents (search time: 171 ms)
1.

Mechanical circulatory support using ventricular assist devices is a common technique for treating patients with advanced heart failure. The latest generation of devices is characterized by centrifugal turbopumps that employ magnetic levitation bearings to maintain a gap clearance between moving and static parts. Despite the increasing use of these devices as a destination therapy, several long-term complications related to their hemocompatibility persist. The blood damage associated with different pump designs has been investigated extensively in the literature, while hemodynamic performance has hardly been considered. This work presents a novel comparison between the two main devices of the latest generation, the HVAD and the HM3, from both perspectives: hemodynamic performance and blood damage. Computational fluid dynamics simulations are performed to model the considered LVADs, and computational results are compared against experimental measurements of pressure head to validate the model. Enhanced performance and hemocompatibility are found for the HM3, owing to its design incorporating more conventional blades and larger gap clearances.


2.
The usefulness of microbiological standards for frozen foods is currently controversial in the trade and scientific literature. Most reviewers have given arguments both for and against, and have concluded that such standards should be applied with great caution. They have the advantage of putting questions of safety on a convenient numerical basis. Canadian workers have reported that promulgation of standards has invariably raised the hygienic level of the products controlled.

Bacteriological standards have often been associated with the question of safety to the consumer. Everyone recognizes that food poisoning bacteria are a potential danger in any food. But many have argued that the record of food poisoning outbreaks from frozen foods is excellent and that there is no need for standards; on the other hand, proponents of standards have pointed to the incomplete investigation and reporting of outbreaks, and have argued that there may be more outbreaks than we realize. They have pointed to laboratory studies showing that grossly mishandled precooked frozen foods can be truly dangerous. Some have proposed that pathogens should be absent from foods; but others have questioned whether a microbiological standard can accomplish this end. Some pathogens, such as Salmonella or Staphylococcus, have been shown to be so ubiquitous that their presence in some commercial foods is unavoidable. Also, sampling and analytical methods have been described as inadequate to guarantee that pathogens present will be detected. Some have argued that control at the source is a better way: through inspections of the plant operation, by enforcement of handling codes, or by processing procedures such as pasteurization, which would be more certain to result in a pathogen-free food.

A most important part of any of the proposed standards is a “total count” of viable aerobic bacteria. English workers have found that foods causing poisoning outbreaks usually had total viable counts above 10 million per gram. On the other hand, these same workers found Salmonella on meats with very low total viable counts. The assumption by many that a low total count indicates safety has been shown to be not always true. Furthermore, high counts of nonpathogenic organisms, such as psychrophilic saprophytes, would have no public health significance.

The relation between bacterial level and quality is open to less controversy. Some authorities have pointed to bacterial level as a measure of sanitation, adequacy of refrigeration, or speed of handling. Others have indicated that determining which of these factors caused a high count would be impossible with only a total count on the product as a guide. Some investigators have said a high count affects flavor adversely before actual spoilage is evident, and this may be a factor in competition on today's market. It is well established that the initial bacterial level will affect the shelf-life of a chilled product. Methods of analysis are more nearly adequate for counts than for pathogens, but they need improvement and should be clearly specified as part of any bacteriological standard. Foods with high counts could sometimes be brought into compliance merely by storing them frozen for a sufficient period, or by heating them slightly. This has been cited by some authors as a disadvantage of bacteriological standards.

The enterococci and the coliform group (except Escherichia coli) have been shown to be ubiquitous and therefore should not be used alone to indicate fecal contamination. Although E. coli has greater significance, its source should be determined each time it is found.

Various reviewers have expressed the need for caution in the application of standards. The principal precautionary arguments we have found are as follows:

1) A single set of microbiological standards should not be applied to foods as a miscellaneous group, such as “frozen foods” or “precooked foods.”

2) Microbiological standards should be applied first to the more hazardous types of foods on an individual basis, after sufficient data are accumulated on expected bacterial levels, with consideration of variations in composition, processing procedures, and time of frozen storage.

3) When standards are chosen, there should be a definite relation between the standard and the hazard against which it is meant to protect the public.

4) Methods of sampling and analysis should be carefully studied for reliability and reproducibility among laboratories, and chosen methods should be specified in detail as part of the standard.

5) Tolerances should be included in the standard to account for inaccuracies of sampling and analysis.

6) At first, the standard should be applied on a tentative basis to allow for voluntary compliance before becoming a strictly enforced regulation.

7) Microbiological standards will be expensive to enforce.

8) If standards are unwisely chosen, they will not stand in courts of law.


3.
Abstract

We review ways of individually identifying stoats (Mustela erminea) and similar small mammals from visits to bait stations or monitoring devices in the field. Tracking devices are the cheapest and most practical method currently available for detecting the presence of a particular species, but there has been little research on the recognition of individuals. Elongating tracking tunnels, or using sooty plates rather than ink to record prints, may improve the detectability of individual markings. Recording visits to bait stations or tracking tunnels through DNA sequencing of hair or skin samples is likely to be prohibitively expensive for many monitoring programmes. Identification of stoats visiting bait stations or tracking tunnels using electronic devices has great potential, but these techniques are currently impracticably expensive because stoats move over such large areas that individual receivers and data loggers would be needed at each bait station. Chemical bait markers such as rhodamine B may be the most suitable method for identifying which animals have used a particular bait station.

4.
Purpose

Obsolescence, as the premature end of use, increases the overall number of products produced and consumed, and can thereby increase environmental impact. Measures that counter obsolescence by altering the product or service design have the potential to increase the use time (defined as the realized active service life) of devices, but can themselves have environmental drawbacks, for example because the amount of material required for production increases. As such, paying special attention to methodological choices when assessing such measures and strategies using life cycle assessment (LCA) is crucial.

Methods

Open questions and key aspects of obsolescence, including the analysis of its effects and of preventative measures, are discussed against the backdrop of the principles and framework for LCA given in ISO 14040/44, which include guidance on how to define a useful functional unit and reference flow in the context of real-life use time.

Results and discussion

The open, foundational requirements of ISO 14040/14044 already form an excellent basis for analysing the phenomenon of obsolescence and its environmental impact in product comparisons. However, any analysis presumes a clearly defined goal and scope phase, with special attention paid to aspects relevant to obsolescence: the target product and user group need to be placed in context with the analysed “anti-obsolescence” measures. The reference flow needs to reflect the realized use time (and not solely a technical lifetime, when the latter is not relevant for the product under study). System boundaries and types of data also need to be chosen in the context of the anti-obsolescence measure, to include, for example, the production of spare parts to reflect repairable design and/or manufacturer-specific yields to reflect high-quality manufacturing.

Conclusions

Understanding the relevant obsolescence conditions for the product system under study and how these may differ across the market segment or user types is crucial for a fair and useful comparison and the evaluation of anti-obsolescence measures.


5.

Software Defined Networking (SDN) promises exciting new networking functionality. However, there always remains a chance of programming errors that result in unreliable data communication. The centralized programming model, in which a single controller manages the whole network, helps decrease the probability of bugs. Yet many real-time events occur at switches and end hosts, which often disturb the communication process and add delay. One such event is an unannounced migration of the destination host after flow rules have been installed and while data packets are being received. Such destination host movement results in packet loss, because the controller is not aware of this recent event. Therefore, an efficient approach is needed to transmit packets without loss despite destination host migration. This paper proposes a design that achieves this objective by defining a layer named the Intelligent Transmission Control Layer (ITCL). It monitors all end-host connections at their specific locations and performs the necessary actions whenever the connection state changes for one or multiple hosts. The controller collects information on end nodes and state changes through ITCL using the A* search algorithm, and then updates the flow tables accordingly to accommodate a location-change scenario with a route-change policy. ITCL is developed as a prototype implementation on the popular POX controller platform. By comparing ITCL with the existing solution, we conclude that the proposed approach performs efficiently in terms of packet loss, bandwidth usage, and network throughput.
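The abstract does not give implementation details of the controller's route computation; as a rough sketch, recomputing a path with A* over a switch graph might look like the following (the topology, link costs, and the zero heuristic are illustrative assumptions, not taken from the paper):

```python
import heapq

def a_star(graph, start, goal, heuristic):
    """A* search over a weighted graph given as {node: {neighbor: cost}}."""
    frontier = [(heuristic(start), 0, start, [start])]
    visited = set()
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, g
        if node in visited:
            continue
        visited.add(node)
        for nbr, cost in graph.get(node, {}).items():
            if nbr not in visited:
                heapq.heappush(
                    frontier,
                    (g + cost + heuristic(nbr), g + cost, nbr, path + [nbr]),
                )
    return None, float("inf")

# Hypothetical switch topology; link weights stand in for hop latency.
topology = {
    "s1": {"s2": 1, "s3": 4},
    "s2": {"s1": 1, "s3": 1, "s4": 5},
    "s3": {"s1": 4, "s2": 1, "s4": 1},
    "s4": {},
}
# With a zero heuristic A* reduces to Dijkstra; the cheapest route
# s1 -> s2 -> s3 -> s4 has total cost 3.
path, cost = a_star(topology, "s1", "s4", heuristic=lambda n: 0)
```

After a host migration, the controller would rerun such a search from the ingress switch to the host's new attachment point and push the resulting flow rules.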


6.
Bullerdiek  Jörn  Reisinger  Emil  Rommel  Birgit  Dotzauer  Andreas 《Protoplasma》2022,259(6):1381-1395

There is no doubt that genetic factors of the host play a role in susceptibility to infectious diseases. An association between ABO blood groups and SARS-CoV-2 infection, as well as the severity of COVID-19, was suggested relatively early during the pandemic and attracted enormous public interest. It was postulated that blood group A predisposes to a higher risk of infection as well as a much higher risk of severe respiratory disease, and that people with blood group O are less frequently and less severely affected by the disease. However, as to the severity of COVID-19, a thorough summary of the existing literature does not support these assumptions in general. Accordingly, at this time there is no reason to suppose that knowledge of a patient's ABO phenotype should directly influence therapeutic decisions in any way. On the other hand, many data are available supporting an association between the ABO blood groups and the risk of contracting SARS-CoV-2. To explain this association, several interactions between the virus and the host cell membrane have been proposed, which are discussed here.


7.
8.

In this paper, a novel refractive index sensor in the terahertz region is proposed. The proposed structure is prism/(sample/porous Ta2O5)^15/sample/gyroid metal/substrate. The sensor is based on the Tamm plasmon polariton at the interface between a porous one-dimensional photonic crystal and a gyroidal metal. The gyroidal metal is used as an alternative metal, and its refractive index can be tuned via the gyroid parameters. The effects of the metal volume fraction and the sample refractive index on performance are studied to improve the sensor's capability. The proposed sensor achieves a high sensitivity of 6.7 THz/RIU, a high figure of merit of 6 × 10^3 RIU^-1, a high quality factor of 3 × 10^3, and a low detection limit of 9 × 10^-6 RIU. The proposed device can be a good candidate for fabricating gyroid-metal and porous-material-based biosensors, active optoelectronic devices, and polaritonic devices.
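The quoted figures are linked by the standard resonant-sensor definitions, sensitivity S = Δf/Δn, FOM = S/FWHM, and Q = f_res/FWHM. A small sketch (the resonance frequency, shift, and linewidth below are hypothetical values chosen only so the derived figures land near those reported, not data from the paper):

```python
def sensor_metrics(f_res, delta_f, delta_n, fwhm):
    """Standard resonant-sensor figures of merit:
    sensitivity S = delta_f/delta_n, FOM = S/FWHM, Q = f_res/FWHM."""
    s = delta_f / delta_n
    return {"sensitivity": s, "FOM": s / fwhm, "Q": f_res / fwhm}

# Hypothetical resonance values (frequencies in THz, index shift in RIU),
# chosen so S ~ 6.7 THz/RIU, FOM ~ 6e3 RIU^-1 and Q ~ 3e3.
m = sensor_metrics(f_res=3.35, delta_f=0.067, delta_n=0.01, fwhm=3.35 / 3000)
```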


9.

An exponential rise in patient data provides an excellent opportunity to improve the existing health care infrastructure. In the present work, a method to enable a cardiovascular digital twin is proposed using inverse analysis. Conventionally, accurate analytical solutions for inverse analysis have been proposed and used for linear problems. However, these methods fail or are inefficient for nonlinear systems such as blood flow in the cardiovascular system (systemic circulation), which involves a high degree of nonlinearity. To address this, a methodology for inverse analysis of the cardiovascular system using a recurrent neural network is proposed in this work, using a virtual patient database. Blood pressure waveforms in various vessels of the body are inversely calculated with the help of long short-term memory (LSTM) cells by inputting pressure waveforms from three non-invasively accessible blood vessels (the carotid, femoral and brachial arteries). The inverse analysis system built this way is applied to the detection of abdominal aortic aneurysm (AAA) and its severity using neural networks.
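The authors' exact network architecture is not given here; as an illustrative sketch of the LSTM recurrence such a model relies on, the following unrolls a single cell over a time series with three input channels (standing in for the carotid, femoral and brachial signals). The dimensions, random weights, and input samples are placeholders, not the paper's setup:

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step; the four gates are stacked in z as
    [input, forget, candidate, output]."""
    n = h.shape[0]
    z = W @ x + U @ h + b
    i = 1 / (1 + np.exp(-z[:n]))        # input gate
    f = 1 / (1 + np.exp(-z[n:2 * n]))   # forget gate
    g = np.tanh(z[2 * n:3 * n])         # candidate cell state
    o = 1 / (1 + np.exp(-z[3 * n:]))    # output gate
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

rng = np.random.default_rng(0)
n_in, n_hid, T = 3, 8, 50               # 3 input waveforms, toy sizes
W = rng.normal(0, 0.1, (4 * n_hid, n_in))
U = rng.normal(0, 0.1, (4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)
h = np.zeros(n_hid)
c = np.zeros(n_hid)
for t in range(T):                      # unroll over the pressure time series
    x_t = rng.normal(size=n_in)         # placeholder samples of the 3 signals
    h, c = lstm_step(x_t, h, c, W, U, b)
# h would then feed a dense layer regressing the target pressure waveform.
```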


10.

We present a multi-band terahertz absorber formed by a periodic square metallic ribbon with a T-shaped gap and a metallic ground plane separated by a dielectric layer. It is demonstrated that the absorption spectrum of the proposed structure consists of four absorption peaks located at 1.12, 2.49, 3.45, and 3.91 THz, with high absorption coefficients of 98.0, 98.9, 98.7, and 99.6%, respectively. The proposed absorber can be tuned from single-band to broadband by changing the length of the square metallic ribbon, and the operating frequencies can also be selected or tuned by changing the polarization angle. Importantly, the quality factor Q at 3.91 THz is 30.1, which is 5.6 times higher than that at 1.12 THz. These results indicate that the proposed absorber has promising potential for applications such as detection, sensing, and imaging.


11.
Background

In recent years, many mobile devices able to record health-related data in ambulatory patients have emerged. However, well-organised programs incorporating these devices are sparse. Hartwacht Arrhythmia (HA) is such a program, focusing on remote arrhythmia detection using the AliveCor Kardia Mobile (KM) device and its algorithm.

Objectives

The aim of this study was to assess the benefit of the KM device and its algorithm in detecting cardiac arrhythmias in a real-world cohort of ambulatory patients.

Methods

All KM ECGs recorded in the HA program between January 2017 and March 2018 were included. Classification by the KM algorithm was compared with that of the Hartwacht team led by a cardiologist. Statistical analyses were performed with respect to detection of sinus rhythm (SR), atrial fibrillation (AF) and other arrhythmias.

Results

5,982 KM ECGs were received from 233 patients (mean age 58 years, 52% male). The KM algorithm categorised 59% as SR, 22% as possible AF, 17% as unclassified and 2% as unreadable. According to the Hartwacht team, 498 (8%) ECGs were uninterpretable. The negative predictive value for detection of AF was 98%. However, the positive predictive value, as well as the detection of other arrhythmias, was poor. In 81% of the unclassified ECGs, the Hartwacht team was able to provide a diagnosis.
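The predictive values above follow from the standard confusion-matrix definitions. A minimal sketch with hypothetical counts (not the study's data) shows how a high NPV can coexist with a poor PPV when AF prevalence is low:

```python
def predictive_values(tp, fp, tn, fn):
    """PPV = TP/(TP+FP); NPV = TN/(TN+FN)."""
    return tp / (tp + fp), tn / (tn + fn)

# Hypothetical counts, chosen for illustration only: many true negatives,
# relatively few true positives among the algorithm's AF calls.
ppv, npv = predictive_values(tp=60, fp=40, tn=980, fn=20)
# ppv = 0.60, npv = 0.98
```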

Conclusions

This study reports on the first symptom-driven remote arrhythmia monitoring program in the Netherlands. Less than 10% of the ECGs were uninterpretable. However, the current performance of the KM algorithm makes the device inadequate as a stand-alone application, supporting the need for manual ECG analysis in HA and similar programs.


12.
Summary

In the light of all that has been discovered about the mechanism of evolution, it has become tempting to follow Darwin's lead and to “see no limit to this power”. Yet a careful examination of situations in which evolution is known to be occurring in some species shows a complete absence of evolution in others. This is not because these species have not had the opportunity; in many situations there may even be uncolonised bare space.

The explanation must lie in the supply of appropriate variation. A tacit assumption of evolution by natural selection is that the necessary variation is always available, yet there is no a priori justification for this. Evidence from natural populations, particularly of species that are potential colonists of old metal mine workings and similar metal-contaminated habitats, shows that the species that successfully colonise these habitats, by the evolution of metal-tolerant populations, possess the necessary variation within their normal populations. But those species which fail to colonise these habitats, despite the opportunity, do not possess this variation. This applies at the level of the population as well, in the replicated evolutionary situations occurring under electricity pylons.

Such evidence, together with arguments from theory, suggests that the failures of evolution have been as important as its successes in moulding the living world as we see it today, and that the reasons for failure must be sought at the molecular level, in limitations to the origin of new variation.

13.
Abstract

The cationic peanut peroxidase has been studied in detail, not only with regard to its peptide structure, but also with regard to the sites and roles of the three moieties linked to it. Peanut peroxidase lends itself well to close examination as a potential example for other plant peroxidase studies. It was the first plant peroxidase for which a 3-D structure was derived from crystals, with the glycans intact. Subsequent analyses of peroxidase structures from other plants have not shown great differences from that of peanut peroxidase. As the period of proteomics follows the era of genomics, the study of glycans has been brought back into focus. Given the potential use of peroxidase as a polymerization agent in industry, some aspects of the overall structure should be kept in mind for successful use of this enzyme. A variety of techniques are now available to assay for these structures/moieties and their roles. Peanut peroxidase data are reviewed in that light, and some precise terms for isozymes are defined. Because a high yield of the enzyme in pure form has been obtained from cultured cells in suspension culture, a brief review of this is also offered.

14.
Introduction

Expectations of physicians concerning e‑Health and perceived barriers to implementation in clinical practice are scarcely reported in the literature. The purpose of this study was to assess these aspects of cardiovascular e‑Health.

Methods

A survey was sent to members of the Netherlands Society of Cardiology. In total, the questionnaire contained 30 questions about five topics: personal use of smartphones, digital communication between respondents and patients, current e‑Health implementation in clinical practice, expectations about e‑Health and perceived barriers for e‑Health implementation. Age, personal use of smartphones and professional environment were noted as baseline characteristics.

Results

In total, 255 respondents completed the questionnaire (response rate 25%); 89.4% of respondents indicated that they considered e‑Health to be clinically beneficial and that it improves patient satisfaction (90.2%), but also that it would increase the workload (83.9%). Age was a negative predictor, and personal use of smartphones a positive predictor, of having high expectations. Lack of reimbursement was identified by 66.7% of respondents as a barrier to e‑Health implementation, as were a lack of reliable devices (52.9%) and a lack of data integration with electronic medical records (EMRs) (69.4%).

Conclusion

Cardiologists are in general positive about the possibilities of e‑Health implementation in routine clinical care; however, they identify deficient data integration into the EMR, reimbursement issues and lack of reliable devices as major barriers. Age and personal use of smartphones are predictors of expectations of e‑Health, but the professional working environment is not.


15.

Fisheries bycatch is one of the biggest threats to marine mammal populations. A literature review was undertaken to provide a comprehensive assessment and synopsis of gear modifications and technical devices to reduce marine mammal bycatch in commercial trawl, purse seine, longline, gillnet and pot/trap fisheries. Successfully implemented mitigation measures include acoustic deterrent devices (pingers), which have reduced the bycatch of some small cetacean species in gillnets; appropriately designed exclusion devices, which have reduced pinniped bycatch in some trawl fisheries; and various pot/trap guard designs that have reduced marine mammal entrapment. However, substantial development and research of mitigation options is required to address the bycatch of a range of species in many fisheries. No reliably effective technical solutions to reduce small cetacean bycatch in trawl nets are available, although loud pingers have shown potential. There are currently no technical options that effectively reduce marine mammal interactions in longline fisheries, although the development of catch and hook protection devices is promising. Solutions are also needed for species, particularly pinnipeds and small cetaceans, that are not deterred by pingers and continue to be caught in static gillnets. Large whale entanglement in static gear, particularly buoy lines for pots/traps, needs urgent attention, although there is encouraging research on rope-less pot/trap systems and on rope colours that are more detectable to whales. Future mitigation development and deployment requires rigorous scientific testing to determine whether significant bycatch reduction has been achieved, as well as consideration of potentially conflicting mitigation outcomes when multiple species are impacted by a fishery.


16.
Purpose

This literature review aims to present the current methodologies that have been developed to perform social life cycle assessment (sLCA) and to highlight the main differences among them. In addition, it aims to identify the nexus between sLCA and the circular economy (CE) and to determine to what extent this life cycle technique has been applied within CE studies.

Methods

An analysis of the scientific literature using online databases was conducted. A total of 76 publications, covering all industry sectors worldwide, were selected, spanning 11 years (2009 to 2020). Special attention was paid to the methodology used to assess the social impacts, the impact categories analyzed, and whether or not a circular economy case was involved. All the impact categories of both UNEP/SETAC and PSIA were taken into account in the review, and the top three categories are mentioned here.

Results and discussion

The leadership of the UNEP/SETAC methodology is clear, with 58 cases. Almost 90% of the case studies focus on products, while the remaining ones relate to services. Workers are the most frequently considered stakeholder in sLCA research, followed by local communities and society. Regarding the impact assessment, the performance reference point (PRP) was the most commonly used method. With respect to the CE, even when some cases included the end-of-life stage in the system boundaries, the studies did not consider the actors from that stage; excluding these cases, one out of four articles has a link with CE, a promising proportion considering the early stage of both concepts (i.e., sLCA and CE).

Conclusions

The UNEP/SETAC guidelines seem to be the most promising methodology owing to their reception among the scientific community. However, a more industry-oriented approach is proposed by the Roundtable for Product Social Metrics (PSIA) as a way to respond to manufacturing companies' demand. Regardless of the type of methodology implemented, workers represent the key stakeholder when assessing social impacts. The change in usual patterns is changing the way stakeholders interact, and therefore new and additional impacts may arise; this is why it is important to include the CE in sLCA. A series of challenges, such as the feasibility of aggregating all the life cycle techniques into one (life cycle sustainability assessment), data availability, and data quality, remain for the moment.


17.

Real-time, accurate traffic congestion prediction can enable intelligent traffic management systems (ITMSs) that replace traditional systems to improve traffic efficiency and reduce congestion. The ITMS consists of three main layers: the Internet of Things (IoT), edge, and cloud layers. The edge can collect real-time data from different routes through IoT devices such as wireless sensors, and can then compute and store these collected data before transmitting them to the cloud for further processing. Thus, the edge is an intermediate layer between the IoT and cloud layers that receives the transmitted data from the IoT to overcome cloud challenges such as high latency. In this paper, a novel real-time traffic congestion prediction strategy (TCPS) is proposed based on the data collected in the edge layer's cache server. The proposed TCPS contains three stages: (i) the real-time congestion prediction (RCP) stage, (ii) the congestion direction detection (CD2) stage, and (iii) the width change decision (WCD) stage. The RCP stage predicts traffic congestion based on the causes of congestion in the hotspot using a fuzzy inference system. If there is congestion, the CD2 stage detects the congestion direction based on the predictions from the RCP stage using the Optimal Weighted Naïve Bayes (OWNB) method. The WCD stage aims to prevent congestion by changing the width of changeable routes (CR) after the direction of congestion has been detected in CD2. The experimental results show that the proposed TCPS outperforms other recent methodologies: it provides the highest accuracy (95%), precision (74%), and recall (75%), and the lowest error (5%).
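The paper's fuzzy rule base is not reproduced in the abstract; the following is an illustrative two-rule, Mamdani-style sketch of how an RCP-like stage might map traffic density and speed to a congestion level. The membership ranges, rules, and defuzzification are assumptions for illustration only:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def congestion_level(density, speed):
    """Tiny illustrative rule base (not the paper's):
    IF density is high AND speed is low  THEN congestion is high
    IF density is low  OR  speed is high THEN congestion is low"""
    high = min(tri(density, 40, 100, 160), tri(speed, -20, 0, 40))   # AND -> min
    low = max(tri(density, -40, 0, 60), tri(speed, 30, 80, 130))     # OR  -> max
    # Weighted average over two singleton outputs: high -> 1.0, low -> 0.0.
    return (1.0 * high + 0.0 * low) / (high + low) if (high + low) else 0.0

level = congestion_level(density=90, speed=10)   # dense, slow traffic
```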


18.

Fog-cloud computing is a promising distributed model for hosting ever-increasing Internet of Things (IoT) applications. IoT applications have different characteristics, such as deadline, frequency rate, and input file size. Fog nodes are heterogeneous, resource-limited devices and cannot accommodate all IoT applications. Because of these difficulties, designing an efficient algorithm to deploy a set of IoT applications in a fog-cloud environment is very important. In this paper, a fuzzy approach is developed to classify applications based on their characteristics, and then an efficient heuristic algorithm is proposed to place applications on the virtualized computing resources. The proposed policy aims to provide a high quality of service for IoT users while maximizing the profit of fog service providers by minimizing resource wastage. Extensive simulation experiments are conducted to evaluate the performance of the proposed policy. Results show that it outperforms other approaches, improving the average response time by up to 13% and the percentage of deadline-satisfied requests by up to 12%, and reducing resource wastage by up to 26%.
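The abstract does not detail the heuristic itself; a minimal best-fit sketch of deadline-aware placement that minimizes leftover capacity (i.e., resource wastage) could look like this. The node capacities, application sizes, and cloud fallback are illustrative assumptions, not the paper's algorithm:

```python
def place_apps(apps, nodes):
    """Greedy placement: tightest-deadline apps first, each onto the feasible
    fog node that leaves the least leftover capacity (best fit, so resource
    wastage is minimized); apps that fit nowhere fall back to the cloud."""
    placement = {}
    for app in sorted(apps, key=lambda a: a["deadline"]):
        candidates = [n for n in nodes if nodes[n] >= app["size"]]
        if not candidates:
            placement[app["name"]] = "cloud"   # cloud tier as fallback
            continue
        best = min(candidates, key=lambda n: nodes[n] - app["size"])
        nodes[best] -= app["size"]             # reserve capacity on the node
        placement[app["name"]] = best
    return placement

# Hypothetical fog nodes (capacity units) and apps (size, deadline in ms).
nodes = {"fog1": 4, "fog2": 2}
apps = [
    {"name": "cam", "size": 2, "deadline": 10},
    {"name": "log", "size": 3, "deadline": 50},
    {"name": "hvac", "size": 2, "deadline": 30},
]
plan = place_apps(apps, nodes)
# "cam" best-fits fog2, "hvac" takes fog1, and "log" no longer fits anywhere,
# so it is deferred to the cloud.
```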


19.
Abstract

A scheme for the elongation cycle of protein biosynthesis is proposed based on modern quantitative data on the interactions of mRNA and different functional forms of tRNA with 70S ribosomes and their 30S and 50S subunits. This scheme takes into account the recently discovered third ribosomal (E) site, with a presumed exit function. The E site is contributed to the 70S ribosome by its 50S subunit, the codon-anticodon interaction does not take place at the E site, and the affinity of tRNA for the E site is considerably lower than that for the P site. The P and A sites, on the other hand, are located mainly on the 30S subunit, with codon-anticodon interactions realized at both of these sites. The mRNA molecule is placed exclusively on the 30S subunit, where it makes a U-turn. The proposed scheme does not contradict any available data, includes all the main postulates of Watson's initial model (J. D. Watson, Bull. Soc. Chim. Biol. 46, 1399 (1964)), and is considered a natural extension of the latter in light of modern experimental data.

20.
Liu, Fenxiang; Movahedi, Ali; Yang, Wenguo; Xu, Dezhi; Jiang, Chuanbei; Xie, Jigang; Zhang, Yu. Molecular Biology Reports (2021) 48(11): 7113-7125
Background

An ornamental plant often seen in gardens and farmhouses, Musa basjoo Siebold can also be used in Chinese herbal medicine. Its pseudostem and leaves are diuretic; its root can be decocted together with ginger and licorice to treat gonorrhea and diabetes; a decoction of its pseudostem can help relieve heat; and a decoction of its dried flower can treat cerebral hemorrhage. There have not been many chloroplast genome studies on M. basjoo Siebold.

Methods and results

We characterized its complete chloroplast genome using Novaseq 6000 sequencing. This paper shows that the chloroplast genome of M. basjoo Siebold is 172,322 bp in length, with a GC content of 36.45%. It includes a large single-copy (LSC) region of 90,160 bp, a small single-copy (SSC) region of 11,668 bp, and a pair of inverted repeats (IRs) of 35,247 bp each. Comparing the genomic structure and sequence data of closely related species, we revealed the conserved gene order of the IR and LSC/SSC regions, a very encouraging finding for future phylogenetic research.
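The reported quadripartite structure can be checked arithmetically: the two single-copy regions plus two copies of the inverted repeat sum to the stated genome length.

```python
# Region lengths reported in the abstract (base pairs).
lsc, ssc, ir = 90_160, 11_668, 35_247
total = lsc + ssc + 2 * ir   # quadripartite plastome: LSC + SSC + 2 x IR
assert total == 172_322      # matches the reported genome length
```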

Conclusions

Overall, this study has constructed an evolutionary tree of the genus Musa from complete chloroplast genome sequences for the first time. There is no obvious multi-branching within the genus, and M. basjoo Siebold and Musa itinerans are the closest relatives.

