20 similar documents found; search time: 312 ms
1.
Cynthia K. Brown Lucinda Buhse Horst-Dieter Friedel Susanne Keitel Johannes Kraemer J. Michael Morris Mary Stickelmeyer Chikako Yomota Vinod P. Shah 《AAPS PharmSciTech》2009,10(3):924-927
The qualification process for ensuring that a paddle or basket apparatus is suitable for its intended use is a highly debated
and controversial topic. Different instrument qualification and suitability methods have been proposed by the pharmacopeias
and regulatory bodies. In an effort to internationally harmonize dissolution apparatus suitability requirements, the International
Pharmaceutical Federation's (FIP) Dissolution/Drug Release Special Interest Group (SIG) reviewed current instrument suitability
requirements listed in the US, European, and Japanese pharmacopeias and the International Conference on Harmonization (ICH)
Topic Q4B on harmonization of pharmacopoeial methods, in its Annex 7, Dissolution Test General. In addition, the SIG reviewed
the Food and Drug Administration (FDA) Draft Guidance for Industry, “The Use of Mechanical Calibration of Dissolution Apparatus
1 and 2—Current Good Manufacturing Practice (CGMP)” and the related ASTM Standard E2503-07. Based on this review and several
in-depth discussions, the FIP Dissolution/Drug Release SIG recommends that the qualification of a dissolution test instrument
should be performed following the calibration requirements as indicated in the FDA (draft) guidance. If additional system
performance information is desired, a performance verification test using US Pharmacopeia Reference Standard tablet or an
established in-house reference product can be conducted. Any strict requirement on the use of a specific performance verification
test tablet is not recommended at this time.
2.
Kenneth R. Still G. Bruce Briggs Paul Knechtges William K. Alexander Cody L. Wilson 《Human and Ecological Risk Assessment》2000,6(6):1125-1136
The risk assessment process is a critical function for deployment toxicology research. It is essential to the decision-making process for establishing risk reduction procedures and for formulating appropriate exposure levels to protect naval personnel from potentially hazardous chemicals encountered in the military, exposures that could reduce operational readiness. These decisions must be based on quality data from well-planned laboratory animal studies that guide the judgements which result in effective risk characterization and risk management. Risk assessment in deployment toxicology uses essentially the same principles as civilian risk assessment, but adds activities essential to the military mission, including intended and unintended exposure to chemicals and chemical mixtures. Risk assessment and Navy deployment toxicology data are integrated into a systematic, well-planned approach to organizing scientific information. The purpose of this paper is to outline the analytical framework used to develop strategies to protect the health of deployed Navy forces.
3.
Policy instruments are the concrete means or measures a government adopts to solve a public problem, and a single policy can be viewed as a combination of objectives and multiple policy instruments. In ecological governance, different policy instruments stimulate farmers' behaviour to different degrees and therefore produce different policy effects. Separating the effects of different instruments can provide a scientific reference for instrument selection and policy optimization. Taking Yanchi County as a case, we used a policy-instrument dataset built from 316 ecological policy documents issued in the county between 1983 and 2017, together with the impulse-response and variance-decomposition methods of a VAR model, to quantify the effects of three broad instrument classes (mandatory, mixed, and voluntary) and ten sub-instruments on three farmer behaviours: cultivation, grazing, and afforestation. The results show that: (1) the influence of policy instruments on farmer behaviour is time-limited; it generally peaks within 2-3 years of a policy's release, then gradually declines and disappears, persisting for 7-10 years. (2) Overall, the shocks that policy instruments deliver to farmer behaviour are small, ranging between 0 and 0.30, indicating that farmer behaviour is also shaped by many other factors. (3) The 10-year cumulative effect runs, from largest to smallest, mandatory > mixed > voluntary, with the direct-provision and regulation sub-instruments having the largest effects. Grazing responds most strongly to policy instruments, followed by cultivation, with afforestation least affected. (4) The direct-provision instrument has the largest positive effect on cultivation, with a maximum shock of 0.30; the regulation instrument suppresses livestock growth in the short term, while the direct-provision and subsidy instruments promote it, and the positive-negative oscillation of the shock curves suggests a long-running game between government and farmers over livestock raising; only the regulation instrument has a positive effect on afforestation, indicating that afforestation is mainly government-led. We recommend making full use of the short-term effects of policy instruments, highlighting the government's role, evaluating policy effects promptly, and adjusting the instrument mix to help achieve ecological goals.
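The abstract's impulse-response analysis can be illustrated with a minimal VAR(1) sketch. The coefficient matrix below is invented for illustration, not the paper's estimate: a one-time unit shock propagates as y_t = A·y_{t-1} and decays over the horizon, mirroring the reported fade-out of instrument effects.

```python
import numpy as np

# Hypothetical VAR(1) coefficient matrix linking three behaviours
# (cultivation, grazing, afforestation); NOT the paper's estimates.
A = np.array([[0.5, 0.1, 0.0],
              [0.2, 0.4, 0.1],
              [0.0, 0.1, 0.3]])

def impulse_response(A, shock, horizons):
    """Response of each variable to a one-time unit shock under VAR(1)."""
    responses = []
    y = np.asarray(shock, dtype=float)
    for _ in range(horizons):
        responses.append(y.copy())
        y = A @ y          # propagate the shock one period forward
    return np.array(responses)

# Unit shock to the first variable; the effect shrinks toward zero,
# as the abstract's 2-3-year peak and 7-10-year persistence describe.
irf = impulse_response(A, shock=[1.0, 0.0, 0.0], horizons=8)
print(irf[:, 0])
```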
4.
Conclusion An appropriate qualification of the workers' representatives is necessary for this purpose. The foundations for this
are laid by a project promoted by the Foundation on Work and Environment and sponsored by the Federal Environmental Agency,
whose aims include training workers' representatives as critical-review experts involved at various levels in the
environmental balance process. This could be put into effect through a funding solution to which the companies
themselves also contribute.
5.
Joachim Schult Jürgen Querengässer Olaf Breidbach Burghart Scheidt Thomas Erler 《Theorie in den Biowissenschaften》2001,120(2):107-114
Summary Standard EEG risk evaluation relies on scoring systems that use different types of questionnaires. Here, an alternative for
SIDS (Sudden Infant Death Syndrome) risk detection is presented that is based exclusively on EEG data and could potentially
replace the procedure of questioning the parents, allowing a direct qualification of the physiological disposition of the
individual neonate: using EEG characters, a confirmed SIDS case could be discriminated both from the group of “healthy”
infants and from the high-risk group. The results of this study support the view that EEG analysis can be
a promising approach to predicting an increased SIDS risk.
6.
Sensory experimental data, whether collected from consumers or from a laboratory population, have many characteristics in common. Among the more important are that they are relative, skewed, and difficult to replicate. These characteristics result from using human beings, rather than mechanical instruments, to measure multivariate stimuli. The use of human beings as instruments presents several problems in the design and analysis of sensory/consumer studies. In clinical/medical applications of sensory evaluation, we have used people both as the experimental unit (panelists/subjects) and as the instrument (judges) measuring panelists' responses. It is therefore my belief that sensory data are the most difficult scientific data to analyze and interpret statistically. Unfortunately, limited theoretical work has been done to tackle the resulting problems. These problems include the comparison of responses obtained from independent populations (monadic design), sample size estimation with special reference to time-dependent responses, analysis of degree-of-difference data, the invariance property of rating scales, the effect of correlated errors in observations, and the use of laboratory panels to predict results from a consumer panel. Some solutions to these problems have been provided by experience. My purpose here is to discuss each of these problems in greater detail and offer some possible scientific solutions.
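One of the listed problems, sample size estimation, has a classical closed-form approximation for the simple (non-time-dependent) two-group case. A sketch under assumed effect-size values, not figures from the text:

```python
from statistics import NormalDist
import math

def sample_size_per_group(sigma, delta, alpha=0.05, power=0.80):
    """n per group for a two-sided two-sample comparison of means:
    n = 2 * (z_{1-alpha/2} + z_{power})**2 * sigma**2 / delta**2."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96 for alpha = 0.05
    z_b = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    n = 2 * (z_a + z_b) ** 2 * (sigma / delta) ** 2
    return math.ceil(n)

# Illustrative: detect a 0.5-point mean difference on a 9-point hedonic
# scale with SD 1.5 (assumed values, not taken from the article).
print(sample_size_per_group(sigma=1.5, delta=0.5))
```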
7.
《Biochemistry and Molecular Biology Education》2000,28(5):251-255
Capillary electrophoresis, a recent analytical method (the first commercial instrument was sold just 10 years ago), offers an efficient alternative to other current separation techniques. Owing to its wide application range, this method is becoming more and more important in analytical laboratories, so introductory lectures on the technique are of particular interest. Moreover, the use of capillary electrophoresis aids understanding of the numerous parameters which influence electrophoretic migration, such as ion mobilities, electrical field, pH, pKa, Joule heating and buffering capacity. This article presents the theory of capillary electrophoresis and the various steps of an analysis of four carbohydrates using this technique. The data are presented much as the students who were assigned this analysis presented theirs. Representative results obtained by four students out of 100 are shown.
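The migration parameters the students work with combine in one line of arithmetic: the apparent mobility follows from the capillary lengths, the applied voltage, and the migration time. A minimal sketch with assumed instrument values (not the article's data):

```python
def apparent_mobility(L_detector_cm, L_total_cm, voltage_kV, t_migration_s):
    """Apparent electrophoretic mobility in cm^2 V^-1 s^-1:
    mu_app = (L_d * L_t) / (V * t). The apparent value is the sum of
    the analyte's own mobility and the electroosmotic-flow mobility."""
    return (L_detector_cm * L_total_cm) / (voltage_kV * 1e3 * t_migration_s)

# Assumed values: 50 cm to the detector on a 57 cm capillary, 25 kV,
# migration time 180 s. (Illustrative, not the students' results.)
mu = apparent_mobility(50, 57, 25, 180)
print(f"{mu:.2e} cm^2/(V*s)")
```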
8.
Junker B 《Biotechnology and bioengineering》2001,74(1):49-61
The purpose of this article is to provide a few concrete examples of the potential to acceptably reduce the scope of validation and qualification testing based on scientific justification for the specific area of microbial fermentation. The key areas explored include: autoclave operational qualification (OQ) testing, autoclave load pattern testing, vessel sterilize-in-place testing, spore strip use and failure investigation, grouping of D-values for media and concentrated nutrients, influence of temperature on D-values, and equipment clean-in-place cleaning agent/recovery studies. Suggestions are offered based on technical data and engineering analysis of the procedures involved. Methodologies are described for how to evaluate the systems being tested relative to processing requirements to determine which testing might be minimized and which testing might warrant expansion. The ultimate risk to quality then must be evaluated by the designated quality control groups within the organization for the specific process and equipment in use.
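The temperature dependence of D-values discussed here is conventionally captured by the z-value model. A minimal sketch with assumed spore parameters (not data from the article):

```python
def d_value_at(T_target, D_ref, T_ref, z):
    """D-value (min for a tenfold microbial reduction) at T_target,
    given D_ref at T_ref and the z-value (deg C change in temperature
    per tenfold change in D): D(T) = D_ref * 10**((T_ref - T)/z)."""
    return D_ref * 10 ** ((T_ref - T_target) / z)

# Illustrative spore parameters (assumed): D121 = 1.5 min, z = 10 C.
D115 = d_value_at(115, D_ref=1.5, T_ref=121, z=10)
print(round(D115, 2))   # a 6 C drop lengthens D roughly fourfold
```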
9.
《Cytotherapy》2014,16(9):1187-1196
The development of cellular therapeutics (CTP) takes place over many years, and, where successful, the developer will anticipate the product to be in clinical use for decades. Successful demonstration of manufacturing and quality consistency is dependent on the use of complex analytical methods; thus, the risk of process and method drift over time is high. The use of reference materials (RM) is an established scientific principle and as such also a regulatory requirement. The various uses of RM in the context of CTP manufacturing and quality are discussed, along with why they are needed for living cell products and the analytical methods applied to them. Relatively few consensus RM exist that are suitable for even common methods used by CTP developers, such as flow cytometry. Others have also identified this need and made proposals; however, great care will be needed to ensure any consensus RM that result are fit for purpose. Such consensus RM probably will need to be applied to specific standardized methods, and the idea that a single RM can have wide applicability is challenged. Written standards, including standardized methods, together with appropriate measurement RM are probably the most appropriate way to define specific starting cell types. The characteristics of a specific CTP will to some degree deviate from those of the starting cells; consequently, a product RM remains the best solution where feasible. Each CTP developer must consider how and what types of RM should be used to ensure the reliability of their own analytical measurements.
10.
Carolien J. Jansen Anthony R. Absalom Geertruida H. de Bock Barbara L. van Leeuwen Gerbrand J. Izaks 《PloS one》2014,9(12)
Several risk stratification instruments for postoperative delirium in older people have been developed because early interventions may prevent delirium. We investigated the performance and agreement of nine commonly used risk stratification instruments in an independent validation cohort of consecutive elective and emergency surgical patients aged ≥50 years with ≥1 risk factor for postoperative delirium. Data were collected prospectively. Delirium was diagnosed according to DSM-IV-TR criteria. The observed incidence of postoperative delirium was calculated per risk score per risk stratification instrument. In addition, the risk stratification instruments were compared in terms of area under the receiver operating characteristic (ROC) curve (AUC), and positive and negative predictive value. Finally, the positive agreement between the risk stratification instruments was calculated. When the data required for an exact implementation of the original risk stratification instruments were not available, we used comparable alternative data. The study population included 292 patients: 60% men; mean age (SD), 66 (8) years; 90% elective surgery. The incidence of postoperative delirium was 9%. The maximum observed incidence per risk score was 50% (95%CI, 15–85%); for eight risk stratification instruments, the maximum observed incidence per risk score was ≤25%. The AUC (95%CI) for the risk stratification instruments varied between 0.50 (0.36–0.64) and 0.66 (0.48–0.83). No AUC differed significantly from 0.50 (p≥0.11). Positive predictive values of the risk stratification instruments varied between 0% and 25%, negative predictive values between 89% and 95%. Positive agreement varied between 0% and 66%. No risk stratification instrument showed clearly superior performance. In conclusion, in this independent validation cohort, the performance and agreement of commonly used risk stratification instruments for postoperative delirium were poor.
Although some caution is needed because the risk stratification instruments were not implemented exactly as described in the original studies, we think that their usefulness in clinical practice can be questioned.
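The performance measures used in the comparison (ROC AUC, positive and negative predictive value) can be computed from first principles. A sketch on toy data, not the study's cohort:

```python
import numpy as np

def auc(scores, labels):
    """ROC AUC via the Mann-Whitney statistic: the probability that a
    randomly chosen case scores higher than a randomly chosen control."""
    scores, labels = np.asarray(scores, float), np.asarray(labels)
    pos, neg = scores[labels == 1], scores[labels == 0]
    # Compare every case against every control; ties count one half.
    wins = ((pos[:, None] > neg[None, :]).sum()
            + 0.5 * (pos[:, None] == neg[None, :]).sum())
    return wins / (len(pos) * len(neg))

def predictive_values(pred, labels):
    """PPV and NPV from binary predictions and outcomes."""
    pred, labels = np.asarray(pred), np.asarray(labels)
    tp = np.sum((pred == 1) & (labels == 1))
    fp = np.sum((pred == 1) & (labels == 0))
    tn = np.sum((pred == 0) & (labels == 0))
    fn = np.sum((pred == 0) & (labels == 1))
    return tp / (tp + fp), tn / (tn + fn)

# Toy risk scores and delirium outcomes (illustrative only).
labels = np.array([0, 0, 0, 0, 1, 0, 1, 0, 0, 1])
scores = np.array([1, 2, 1, 3, 4, 2, 2, 1, 3, 5])
print(auc(scores, labels))
ppv, npv = predictive_values(scores >= 4, labels)
```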
11.
Reliability of Life Cycle Assessment (LCA) results depends on the availability and quality of Life Cycle Inventory (LCI) data.
In order to provide high-quality LCI data for background systems in LCA and to make it applicable to a wider range of fields,
harmonization strategies for already existing datasets and databases are required. In view of the high significance of LCI
data as a basis of major fields of action within a sustainability strategy, the German Helmholtz Association (HGF), under
the leadership of the Forschungszentrum Karlsruhe (FZK) has taken up this issue in its research programme. In 2002, the FZK
conducted a preliminary study on ‘Quality Assurance and User-oriented Supply of Life Cycle Inventory Data’ funded by the
Federal Ministry of Education and Research (BMBF). Within the framework of this study, a long-term concept for improving the
scientific fundamentals and practical use of LCI data was developed in association with external experts. The focus is on
establishing a permanent German ‘Network on Life Cycle Inventory Data’ which will serve as the German information and cooperation
platform for all scientific and non-scientific actors in the field of life cycle analysis. This network will integrate expertise
on LCA in Germany, harmonise methodology and data, and use the comprehensive expert panel as an efficient basis for further
scientific development and practical use of LCA. At the same time, this network will serve as a platform for cooperation on
an international level. Current developments address methodological definitions for the initial information infrastructure.
As a novel element, user needs are differentiated in parallel according to the broad application fields of LCI-data from product
declaration to process design. Case studies will be used to define tailored interfaces for the database, since different data
quality levels will be encountered.
12.
Bernd Pulverer 《The EMBO journal》2015,34(20):2483-2485
A reliable scientific literature is crucial for an efficient research process. Peer review remains a highly successful quality assurance mechanism, but it does not always prevent data and image aberrations and the publication of flawed data. Journals need to be in a position to detect such problems and take proportionate action. Publishers should apply consistent policies to correcting the published literature and adopt versioning. The scientific community ought to encourage corrections.
13.
The possibility of using minimally invasive analytical instruments to monitor cancerous cells and their interactions with analytes offers great advances in cancer research and toxicology. Real success in developing a reliable sensor for cell monitoring depends on the ability to design powerful instrumentation that facilitates efficient signal transduction from the biological processes occurring in the cellular environment. The resulting sensor should not affect cell viability and must both function under, and be adapted to, the specific conditions imposed by the cell culture. Owing to their performance, electrochemical biosensors can serve as an effective instrument in cancer cell research for studying biochemical processes, cancer development and progression, as well as for toxicity monitoring. Current research in this direction is conducted through high-throughput, compact, portable, and easy-to-use sensors that enable measurement of cells' activity in their optimum environment. This paper discusses the potential of a high-throughput electrochemical multisensor system, the so-called DOX system, for monitoring cancerous cells and their interaction with chemical toxins. We describe the methodology, experiments, and operating principle of this device, and focus on the challenges encountered in optimizing and adapting the system to the specific cell-culture conditions. The DOX system is also compared with conventional cell-culture techniques.
14.
Leila Malik A. Pernille Tofteng Søren L. Pedersen Kasper K. Sørensen Knud J. Jensen 《Journal of peptide science》2010,16(9):506-512
Precise microwave heating has emerged as a valuable method to aid solid‐phase peptide synthesis (SPPS). New methods and reliable protocols, as well as their embodiment in automated instruments, are required to fully use this potential. Here we describe a new automated robotic instrument for SPPS with microwave heating, report protocols for its reliable use and report the application to the synthesis of long sequences, including the β‐amyloid 1‐42 peptide. The instrument is built around a valve‐free robot originally developed for parallel peptide synthesis, where the robotic arm transports reagents instead of pumping reagents via valves. This is the first example of an ‘X‐Y’ robotic microwave‐assisted synthesizer developed for the assembly of long peptides. Although the instrument maintains its capability for parallel synthesis at room temperature, in this paper, we focus on sequential peptide synthesis with microwave heating. With this valve‐free instrument and the protocols developed for its use, fast and efficient syntheses of long and difficult peptide sequences were achieved. Copyright © 2010 European Peptide Society and John Wiley & Sons, Ltd.
15.
Incorporating uncertainty and social values in managing invasive alien species: a deliberative multi-criteria evaluation approach
The management of Invasive Alien Species (IAS) is stymied by complex social values and severe levels of uncertainty. However,
these two challenges are often hidden in the conventional model of management by “value-free” analyses and probability-based
estimates of risk. As a result, diverse social values and wide margins of error in risk assessment carry zero weights in the
decision-making process, leaving IAS risk decisions to be made in the wake of political pressure and the crisis atmosphere
of incursion. We propose to use a Deliberative Multi-Criteria Evaluation (DMCE) to incorporate multiple social values and
profound uncertainty into decision-making processes. The DMCE process combines the advantages of conventional multi-criteria
decision analysis methods with the benefits of stakeholder participation to provide an analytical structure to assess complex
multi-dimensional objectives. It, therefore, offers an opportunity for diverse views to enter the decision-making process,
and for the negotiation of consensus positions. The DMCE process can also function as a platform for risk communication in
which scientists, stakeholders, and decision-makers can interact and discuss the uncertainty associated with biological invasions.
We examine two case studies that demonstrate how DMCE provides scientific rigor and transparency in the decision-making process
of invasion risk management. The first case regards pre-border priority ranking for potential invasive species and the second
relates to selecting the most desirable policy option for managing a post-border invader.
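The aggregation step at the core of a multi-criteria evaluation can be sketched as a weighted sum over criteria. The options, criteria, and weights below are illustrative assumptions, not values from the case studies:

```python
# Weighted-sum aggregation: the simplest multi-criteria scoring rule a
# deliberative panel might start from. Criterion weights would come
# from stakeholder deliberation; everything here is illustrative.
options = {
    # scores per criterion: [ecological benefit, economic cost saved, feasibility]
    "eradicate":  [0.9, 0.3, 0.5],
    "contain":    [0.6, 0.6, 0.7],
    "do_nothing": [0.1, 0.9, 1.0],
}
weights = [0.5, 0.2, 0.3]   # deliberated relative importance, sums to 1

def score(values, weights):
    """Weighted-sum score of one option across all criteria."""
    return sum(v * w for v, w in zip(values, weights))

ranked = sorted(options, key=lambda o: score(options[o], weights), reverse=True)
print(ranked)
```

In a real DMCE the weights themselves are negotiated, and sensitivity of the ranking to the weights is what the deliberation examines.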
16.
Yu B Yang M Wong HY Watt RM Song E Zheng BJ Yuen KY Huang JD 《Applied microbiology and biotechnology》2011,91(1):177-188
Live attenuated Salmonella enterica serovar Typhi Ty21a (Ty21a) is an important vaccine strain used in clinical studies for typhoid fever and as a vaccine vector
for the expression of heterologous antigens. To facilitate the use of Ty21a in such studies, it is desirable to develop improved
strategies that enable the stable chromosomal integration and expression of multiple heterologous antigens. The phage λ Red
homologous recombination system has previously been used in various gram-negative bacterial species to mediate the accurate
replacement of regions of chromosomal DNA with PCR-generated ‘targeting cassettes’ that contain flanking regions of shared
homologous DNA sequence. However, the efficiency of λ Red-mediated recombineering in Ty21a is far lower than in Escherichia coli and other Salmonella typhimurium strains. Here, we describe an improved strategy for recombineering-based methods in Ty21a. Our reliable and efficient method
involves the use of linear DNA-targeting cassettes that contain relatively long flanking ‘arms’ of sequence (ca. 1,000 bp)
homologous to the chromosomal target. This enables multiple gene-targeting procedures to be performed on a single Ty21a chromosome
in a straightforward, sequential manner. Using this strategy, we inserted three different influenza antigen expression cassettes
as well as a green fluorescent protein gene reporter into four different loci on the Ty21a chromosome, with high efficiency
and accuracy. Fluorescence microscopy and Western blotting analysis confirmed that strong inducible expression of all four
heterologous genes could be achieved. In summary, we have developed an efficient, robust, and versatile method that may be
used to construct recombinant Ty21a antigen-expressing strains.
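The cassette layout described (an insert flanked by ~1,000 bp homology arms matching the chromosomal target) reduces to simple sequence assembly. A sketch on randomly generated sequence, not real Ty21a DNA:

```python
import random

random.seed(1)
# Hypothetical chromosome region and antigen cassette (not real Ty21a
# sequence); real work would slice the arms from the genome sequence.
chromosome = "".join(random.choice("ACGT") for _ in range(5000))
antigen_cassette = "ATGAAA" + "".join(random.choice("ACGT") for _ in range(300))

ARM = 1000                               # long arms per the abstract
target_start, target_end = 2000, 2600    # locus to be replaced

left_arm = chromosome[target_start - ARM : target_start]
right_arm = chromosome[target_end : target_end + ARM]
targeting_cassette = left_arm + antigen_cassette + right_arm

print(len(targeting_cassette))   # 1000 + 306 + 1000 = 2306
```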
17.
Cross-cultural adaptation of instruments assessing breastfeeding determinants: a multi-step approach
Background
Cross-cultural adaptation is a necessary process to effectively use existing instruments in other cultural and language settings. The process of cross-culturally adapting existing instruments, including translation, is considered a critical step in establishing a meaningful instrument for use in another setting. Using a multi-step approach is considered best practice in achieving cultural and semantic equivalence of the adapted version. We aimed to ensure the content validity of our instruments in the cultural context of KwaZulu-Natal, South Africa.
Methods
The Iowa Infant Feeding Attitudes Scale, Breastfeeding Self-Efficacy Scale-Short Form and additional items comprise our consolidated instrument, which was cross-culturally adapted utilizing a multi-step approach during August 2012. Cross-cultural adaptation was achieved through steps to maintain content validity and attain semantic equivalence in the target version. Specifically, Lynn's recommendation to apply an item-level content validity index score was followed. The revised instrument was translated and back-translated. To ensure semantic equivalence, Brislin's back-translation approach was utilized, followed by a committee review to address any discrepancies that emerged from translation.
Results
Our consolidated instrument was adapted to be culturally relevant and translated to yield more reliable and valid results for use in our larger research study to measure infant feeding determinants effectively in our target cultural context.
Conclusions
Undertaking rigorous steps to effectively ensure cross-cultural adaptation increases our confidence that the conclusions we make based on our self-report instrument(s) will be stronger. In this way, our aim to achieve strong cross-cultural adaptation of our consolidated instruments was achieved while also providing a clear framework for other researchers choosing to utilize existing instruments for work in other cultural, geographic and population settings.
18.
Background
Today, data evaluation has become a bottleneck in chromatographic science. Analytical instruments equipped with automated samplers yield large amounts of measurement data, which need to be verified and analyzed. Since nearly every GC/MS instrument vendor offers its own data format and software tools, the consequences are problems with data exchange and a lack of comparability between analytical results. To address this situation, a number of commercial and non-profit software applications have been developed. These applications provide functionality to import and analyze several data formats but have shortcomings in terms of the transparency of the implemented analytical algorithms and/or are restricted to a specific computer platform.
19.
An IBM-compatible microcomputer program for teaching purposes is described which simulates the operation of a sedimentation velocity determination of a protein in an analytical ultracentrifuge using schlieren optics. The program operates in speeded-up time and simulates the major procedures which would need to be carried out to operate such an instrument. The position of the sedimenting boundary can be observed at any time during the run, and up to six photographs can be recorded for subsequent analysis. Calculation of sedimentation coefficient, diffusion coefficient and mol. wt can be made from a dot-matrix printout. Ten representative proteins are stored within the program, but provision exists for user-supplied data.
Received on June 25, 1987; accepted on September 9, 1987
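The sedimentation-coefficient calculation such a teaching program performs can be sketched directly: s is the slope of ln r against time divided by ω². The boundary data below are synthetic, not the program's stored proteins:

```python
import math

def sedimentation_coefficient(times_s, radii_cm, rpm):
    """s = d(ln r)/dt / omega^2 from boundary positions r(t); the slope
    is a least-squares fit of ln(r) against t. Result in seconds
    (1 Svedberg = 1e-13 s)."""
    omega = 2 * math.pi * rpm / 60.0
    n = len(times_s)
    xbar = sum(times_s) / n
    ys = [math.log(r) for r in radii_cm]
    ybar = sum(ys) / n
    slope = (sum((t - xbar) * (y - ybar) for t, y in zip(times_s, ys))
             / sum((t - xbar) ** 2 for t in times_s))
    return slope / omega ** 2

# Synthetic run generated for s = 4.4 S at 60,000 rpm (illustrative).
rpm = 60000
omega = 2 * math.pi * rpm / 60
s_true = 4.4e-13
times = [0, 600, 1200, 1800, 2400]
radii = [6.0 * math.exp(s_true * omega ** 2 * t) for t in times]
s_est = sedimentation_coefficient(times, radii, rpm)
print(round(s_est / 1e-13, 2))
```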
20.
Khatami M 《Cell biochemistry and biophysics》2007,47(2):187-198
The purpose of this position article was to design a set of criteria (data elements) for a wide range of cancer biomarkers
(CBs) in an attempt to standardize biomarkers features through a common language as a foundation for a database. Data elements
are described as a set of generic criteria, which should characterize nearly all biomarkers introduced in the literature.
Data elements were extracted from the review of prominent features that biomarkers represent within various categories. The
extracted characteristics of biomarkers produced a short list of shared and unique generic features such as biological nature
and history; stage/phase of study; sensitivity and specificity; modes of action; risk assessment; validation status; technology,
and recommendation status for diversified biomarkers. To tailor data elements on specific markers, a cytokine, such as macrophage-colony
stimulating factor (M-CSF), which has been proposed as a ‘potentially suitable biomarker’ for diagnosis of ovarian, lung,
breast, pancreatic, and colorectal cancers, was selected as a Model biomarker. Small scale clinical studies suggested the
superior usefulness of M-CSF compared with traditional markers for cancer detection. A key criterion for selecting Model marker
and tailoring data elements for detection of cancer was the comparison of data on its specificity and sensitivity with traditional
markers. The design of data elements for standardizing CBs criteria is considered a Research Tool and a foundation for developing
a comprehensive CBs database useful for oncology researchers for a wide range of biomarkers. Validation, integration and proper
packaging, data visualization and recommendation of suitability of CBs, by a panel of experts, for technology development
are important challenging next steps toward developing a reliable database, which would allow professionals to effectively
retrieve and study integrated information on potentially useful markers; identify important knowledge gaps and limitations
of data; and assess state of technologies and commercialization of markers at a point of need. Appropriate use of integrated
information on biomarkers in clinical practices would eventually account for more cost-effective characteristics of an individual’s
state of health.
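The proposed set of generic data elements maps naturally onto a simple record type. The field names below are an illustrative reading of the list in the abstract, not the authors' schema:

```python
from dataclasses import dataclass, field

@dataclass
class BiomarkerRecord:
    """One database row holding generic biomarker data elements; an
    illustrative sketch of a schema, not the article's design."""
    name: str
    biological_nature: str
    study_phase: str            # e.g. "discovery", "small-scale clinical"
    sensitivity: float          # fraction of cancers detected
    specificity: float          # fraction of non-cancers correctly negative
    mode_of_action: str = ""
    validation_status: str = "unvalidated"
    recommended_cancers: list = field(default_factory=list)

# The article's model biomarker, populated with placeholder figures;
# real sensitivity/specificity values would come from clinical studies.
m_csf = BiomarkerRecord(
    name="M-CSF",
    biological_nature="cytokine (macrophage-colony stimulating factor)",
    study_phase="small-scale clinical",
    sensitivity=0.0,   # placeholder only
    specificity=0.0,   # placeholder only
    recommended_cancers=["ovarian", "lung", "breast", "pancreatic", "colorectal"],
)
print(m_csf.name, len(m_csf.recommended_cancers))
```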