Similar Documents
20 similar documents found (search time: 0 ms)
1.
Many studies assume stock prices follow a random process known as geometric Brownian motion. Although approximately correct, this model fails to explain the frequent occurrence of extreme price movements, such as stock market crashes. Using a large collection of data from three different stock markets, we present evidence that a modification to the random model—adding a slow, but significant, fluctuation to the standard deviation of the process—accurately explains the probability of different-sized price changes, including the relatively high frequency of extreme movements. Furthermore, we show that this process is similar across stocks, so that their price fluctuations can be characterized by a single curve. Because the behavior of price fluctuations is rooted in the characteristics of volatility, we expect our results to bring increased interest to stochastic volatility models, and especially to those that can produce the properties of volatility reported here.
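A minimal simulation illustrates the mechanism the abstract describes: if returns are drawn from a Gaussian whose standard deviation itself fluctuates slowly, the unconditional distribution develops fat tails (positive excess kurtosis) that a constant-volatility geometric Brownian motion cannot produce. The AR(1) log-volatility process and all parameter values below are illustrative assumptions, not the paper's fitted model.

```python
import math, random

def simulate_returns(n=20000, mu_h=-4.0, phi=0.98, tau=0.15, seed=42):
    """Returns r_t = sigma_t * z_t, with log-volatility h_t following a slow
    AR(1) process (the 'slow fluctuation of the standard deviation').
    mu_h, phi, tau are illustrative values, not estimates from the paper."""
    rng = random.Random(seed)
    h = mu_h
    out = []
    for _ in range(n):
        h = mu_h + phi * (h - mu_h) + tau * rng.gauss(0, 1)  # slow vol fluctuation
        out.append(math.exp(h) * rng.gauss(0, 1))            # conditionally Gaussian return
    return out

def excess_kurtosis(xs):
    """Sample excess kurtosis: 0 for a Gaussian, positive for fat tails."""
    n = len(xs)
    m = sum(xs) / n
    var = sum((v - m) ** 2 for v in xs) / n
    fourth = sum((v - m) ** 4 for v in xs) / n
    return fourth / var ** 2 - 3.0

rets = simulate_returns()
print(excess_kurtosis(rets))  # clearly positive: fat tails absent from plain GBM
```

Even with a conditionally Gaussian return at every step, the slow mixing of volatility states is enough to generate the heavy tails discussed above.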

2.
We analyse times between consecutive transactions for a diverse group of stocks registered on the NYSE and NASDAQ markets, and we relate the dynamical properties of the intertrade times to those of the corresponding price fluctuations. We report that market structure strongly impacts the scale-invariant temporal organisation in the transaction timing of stocks, which we observe to have long-range power-law correlations. Specifically, we find that, compared to NYSE stocks, stocks registered on the NASDAQ exhibit significantly stronger correlations in their transaction timing on scales within a trading day. Further, we find that companies that transfer from the NASDAQ to the NYSE show a reduction in the correlation strength of transaction timing on scales within a trading day, indicating the influence of market structure. We also report a persistent decrease in the correlation strength of intertrade times with increasing average intertrade time and with the corresponding decrease in companies' market capitalization, a trend which is less pronounced for NASDAQ stocks. Surprisingly, we observe that stronger power-law correlations in intertrade times are coupled with stronger power-law correlations in absolute price returns and higher price volatility, suggesting a strong link between the dynamical properties of intertrade times and the corresponding price fluctuations over a broad range of time scales. Comparing the NYSE and NASDAQ markets, we demonstrate that the stronger correlations we find in intertrade times for NASDAQ stocks are associated with stronger correlations in absolute price returns and with higher volatility, suggesting that market structure may affect price behavior through information contained in transaction timing. These findings do not support the hypothesis of universal scaling behavior in stock dynamics that is independent of company characteristics and stock market structure. Further, our results have implications for utilising transaction-timing patterns in price prediction and risk-management optimization on different stock markets.
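Long-range power-law correlations of the kind described above are conventionally quantified with detrended fluctuation analysis (DFA), whose exponent is about 0.5 for uncorrelated data and above 0.5 for persistent correlations. A compact DFA-1 sketch (the window sizes and the white-noise check are illustrative choices, not the paper's settings):

```python
import math, random

def dfa_exponent(x, scales=(8, 16, 32, 64, 128)):
    """DFA-1: integrate the series, detrend it linearly in windows of each
    scale, and fit the slope of log(fluctuation) vs. log(scale)."""
    n = len(x)
    mean = sum(x) / n
    profile, run = [], 0.0
    for v in x:                      # integrated (cumulative) profile
        run += v - mean
        profile.append(run)
    log_s, log_f = [], []
    for w in scales:
        t = list(range(w))
        tm = (w - 1) / 2.0
        den = sum((ti - tm) ** 2 for ti in t)
        sq_err, count = 0.0, 0
        for start in range(0, n - w + 1, w):
            seg = profile[start:start + w]
            sm = sum(seg) / w
            slope = sum((ti - tm) * (si - sm) for ti, si in zip(t, seg)) / den
            icept = sm - slope * tm  # least-squares linear detrend per window
            sq_err += sum((si - (icept + slope * ti)) ** 2
                          for ti, si in zip(t, seg))
            count += w
        log_s.append(math.log(w))
        log_f.append(0.5 * math.log(sq_err / count))
    ms = sum(log_s) / len(log_s)
    mf = sum(log_f) / len(log_f)
    return (sum((a - ms) * (b - mf) for a, b in zip(log_s, log_f))
            / sum((a - ms) ** 2 for a in log_s))

rng = random.Random(1)
white = [rng.gauss(0, 1) for _ in range(4096)]
print(dfa_exponent(white))  # close to 0.5 for uncorrelated data
```

Applied to intertrade-time series, exponents above 0.5 would indicate the persistent power-law correlations the study reports for NASDAQ stocks.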

3.
The characterization of asset price returns is an important subject in modern finance. Traditionally, the dynamics of stock returns are assumed to lack any temporal order. Here we present an analysis of the autocovariance of stock market indices and unravel temporal order in several major stock markets. We also demonstrate a fundamental difference between developed and emerging markets in the past decade: emerging markets are marked by positive order, in contrast to developed markets, whose dynamics are marked by weakly negative order. In addition, the reaction to financial crises was found to be reversed between developed and emerging markets, which present large positive/negative autocovariance spikes following the onset of these crises. Notably, the Chinese market shows neutral or no order, even though it is regarded as an emerging market. These findings show that, despite the coupling between international markets and global trading, major differences exist between different markets, and demonstrate that the autocovariance of markets is correlated with their stability, as well as with their state of development.
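The central quantity here is the sample autocovariance of a return series: positive at lag 1 for "positive order" (persistence), negative for "negative order" (reversal). A minimal sketch:

```python
def autocovariance(x, lag=1):
    """Sample autocovariance at a given lag: the sign distinguishes the
    'positive order' of emerging markets from the weakly negative order
    of developed markets in the abstract above."""
    n = len(x)
    m = sum(x) / n
    return sum((x[t] - m) * (x[t + lag] - m) for t in range(n - lag)) / n

# an alternating series has negative lag-1 autocovariance (reversal-type order)
print(autocovariance([1.0, -1.0, 1.0, -1.0, 1.0, -1.0]))  # -5/6
```

Feeding in daily index returns and scanning the lag gives the temporal-order profile the paper analyses.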

4.
This paper identifies liquidity spillovers across different time scales based on a wavelet multiscaling method. We decompose daily data from U.S., British, Brazilian, and Hong Kong stock market indices in order to calculate the scale correlation between their illiquidities. The sample is divided into non-crisis, subprime-crisis, and Eurozone-crisis periods. We find that correlations differ across scales and periods: association at the finest scales is smaller than at the coarse scales, and associations rise in periods of crisis. Across frequencies, significant distinctions predominantly involve the coarsest scale, whereas in crisis periods they predominate at the finest scale.
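The scale-correlation idea can be sketched with the simplest wavelet, the Haar: extract detail coefficients at a dyadic scale from each series, then correlate the details. This is a hand-rolled stand-in for the paper's wavelet multiscaling method (a full analysis would use a proper maximal-overlap wavelet transform), and the illiquidity series are assumed inputs.

```python
import math, random

def haar_details(x, level):
    """Non-overlapping Haar detail coefficients at scale 2**level:
    differences of adjacent block averages."""
    w = 2 ** level
    out = []
    for start in range(0, len(x) - 2 * w + 1, 2 * w):
        a = sum(x[start:start + w]) / w
        b = sum(x[start + w:start + 2 * w]) / w
        out.append((a - b) / math.sqrt(2))
    return out

def scale_correlation(x, y, level):
    """Pearson correlation between the two series' details at one scale --
    the 'scale correlation between illiquidities' of the abstract."""
    dx, dy = haar_details(x, level), haar_details(y, level)
    mx, my = sum(dx) / len(dx), sum(dy) / len(dy)
    cov = sum((a - mx) * (b - my) for a, b in zip(dx, dy))
    vx = sum((a - mx) ** 2 for a in dx)
    vy = sum((b - my) ** 2 for b in dy)
    return cov / math.sqrt(vx * vy)

rng = random.Random(0)
x = [rng.gauss(0, 1) for _ in range(64)]
print(scale_correlation(x, x, level=1))  # identical series: correlation 1 at this scale
```

Computing this for several levels, separately for crisis and non-crisis subsamples, reproduces the kind of scale-by-scale comparison the paper reports.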

5.
Psychophysical and neurophysiological studies have suggested that memory is not simply a carbon copy of our experience: memories are modified or new memories are formed depending on the dynamic structure of our experience, and specifically, on how gradually or abruptly the world changes. We present a statistical theory of memory formation in a dynamic environment, based on a nonparametric generalization of the switching Kalman filter. We show that this theory can qualitatively account for several psychophysical and neural phenomena, and present results of a new visual memory experiment aimed at testing the theory directly. Our experimental findings suggest that humans can use temporal discontinuities in the structure of the environment to determine when to form new memory traces. The statistical perspective we offer provides a coherent account of the conditions under which new experience is integrated into an old memory versus those under which a new memory is formed, and shows that memory formation depends on inferences about the underlying structure of our experience.
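The core computational idea can be caricatured in a few lines: maintain a running estimate (a "memory trace") and open a new trace only when an observation is surprisingly far from the current one, i.e., when the environment appears to have changed abruptly rather than gradually. This is a toy surprise-threshold rule, not the paper's nonparametric switching Kalman filter; the threshold `surprise_sd` is a hypothetical parameter.

```python
def memory_traces(observations, obs_sd=1.0, surprise_sd=3.0):
    """Toy change-point memory: gradual changes update the current trace's
    running mean; a jump beyond surprise_sd * obs_sd starts a new trace."""
    traces = []                      # finished traces as (mean, n_obs)
    mean, count = None, 0
    for x in observations:
        if mean is None or abs(x - mean) > surprise_sd * obs_sd:
            if mean is not None:     # abrupt change: close the old trace...
                traces.append((mean, count))
            mean, count = x, 1       # ...and form a new one
        else:
            count += 1               # gradual change: fold into the old trace
            mean += (x - mean) / count
    traces.append((mean, count))
    return traces

obs = [0.1, -0.2, 0.0, 10.0, 10.2, 9.9]
print(memory_traces(obs))  # two traces: the jump to ~10 starts a new memory
```

The full model replaces the hard threshold with probabilistic inference over latent environmental modes, but the qualitative behaviour (integrate when change is gradual, segment when it is abrupt) is the same.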

6.
The analysis of cross-correlations is extensively applied to understand interconnections in stock markets and to estimate portfolio risk. Current studies of correlations in the Chinese market mainly focus on the static correlations between return series, and this calls for an urgent need to investigate their dynamic correlations. Our study aims to reveal the dynamic evolution of cross-correlations in the Chinese stock market, and to offer an exact interpretation of the evolution behavior. The correlation matrices constructed from the return series of 367 A-share stocks traded on the Shanghai Stock Exchange from January 4, 1999 to December 30, 2011 are calculated over a moving window with a size of 400 days. The evolutions of the statistical properties of the correlation coefficients, eigenvalues, and eigenvectors of the correlation matrices are carefully analyzed. We find that the stock correlations are significantly increased in the periods of the two market crashes in 2001 and 2008, during which only five eigenvalues significantly deviate from the random correlation matrix, and the systemic risk is higher in these volatile periods than in calm periods. By investigating the significant contributors of the deviating eigenvectors in different time periods, we observe a dynamic evolution behavior in business sectors such as IT, electronics, and real estate, which lead the rise before, and the drop after, the crashes. Our results provide new perspectives for understanding the dynamic evolution of cross-correlations in the Chinese stock market, and the risk estimates are valuable for risk-management applications.
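The pipeline above rests on two operations: building a correlation matrix from windowed return series, and extracting its leading eigenvalue, whose size relative to the random-matrix bound measures the collective "market mode". A stdlib-only sketch using power iteration (the study itself computes full eigendecompositions over 400-day moving windows):

```python
import math, random

def corr_matrix(returns):
    """Pearson correlation matrix of a list of return series (one per stock),
    as would be computed inside each moving window."""
    def corr(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        va = sum((x - ma) ** 2 for x in a)
        vb = sum((y - mb) ** 2 for y in b)
        return cov / math.sqrt(va * vb)
    return [[corr(a, b) for b in returns] for a in returns]

def largest_eigenvalue(m, iters=200):
    """Power iteration; valid here because a correlation matrix is
    positive semi-definite. A top eigenvalue far above the random-matrix
    prediction signals genuine collective structure."""
    n = len(m)
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(m[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = math.sqrt(sum(x * x for x in w))
        v = [x / lam for x in w]
    return lam

# demo: three perfectly co-moving stocks give top eigenvalue N = 3
rng = random.Random(0)
r = [rng.gauss(0, 1) for _ in range(100)]
print(largest_eigenvalue(corr_matrix([r, r, r])))  # close to 3
```

Sliding this over time and tracking how the top eigenvalues and their eigenvectors change reproduces, in miniature, the dynamic-evolution analysis of the paper.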

7.
Social media increasingly reflect and influence the behavior of other complex systems. In this paper we investigate the relations between the well-known micro-blogging platform Twitter and financial markets. In particular, we consider, over a period of 15 months, the Twitter volume and sentiment about the 30 stock companies that form the Dow Jones Industrial Average (DJIA) index. We find a relatively low Pearson correlation and Granger causality between the corresponding time series over the entire time period. However, we find a significant dependence between the Twitter sentiment and abnormal returns during the peaks of Twitter volume. This is valid not only for the expected Twitter volume peaks (e.g., quarterly announcements), but also for peaks corresponding to less obvious events. We formalize the procedure by adapting the well-known "event study" from economics and finance to the analysis of Twitter data. The procedure makes it possible to automatically identify events as Twitter volume peaks, to compute the prevailing sentiment (positive or negative) expressed in tweets at these peaks, and finally to apply the "event study" methodology to relate them to stock returns. We show that the sentiment polarity of Twitter peaks implies the direction of cumulative abnormal returns. The amount of cumulative abnormal returns is relatively low (about 1–2%), but the dependence is statistically significant for several days after the events.
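Two steps of the adapted event study are mechanical enough to sketch: flagging volume peaks, and cumulating abnormal returns after each flagged event. The mean-plus-k-standard-deviations peak rule and the benchmark-subtraction definition of abnormal return below are simplifying assumptions, not the paper's exact specification.

```python
def volume_peaks(volume, k=2.0):
    """Flag days whose (Twitter) volume exceeds mean + k standard deviations:
    a minimal version of the event-identification step."""
    n = len(volume)
    m = sum(volume) / n
    sd = (sum((v - m) ** 2 for v in volume) / n) ** 0.5
    return [t for t, v in enumerate(volume) if v > m + k * sd]

def cumulative_abnormal_return(returns, benchmark, start, window):
    """CAR over `window` days from `start`, with abnormal return taken as
    stock return minus a benchmark return (an assumption of this sketch)."""
    return sum(r - b for r, b in zip(returns[start:start + window],
                                     benchmark[start:start + window]))

vol = [1.0] * 20 + [100.0]
print(volume_peaks(vol))                                          # [20]
print(cumulative_abnormal_return([0.02] * 5, [0.01] * 5, 0, 3))   # about 0.03
```

The remaining step, classifying the prevailing tweet sentiment at each peak, would then give the sign against which the CAR direction is tested.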

8.
This paper reexamines the profitability of loser, winner, and contrarian portfolios in the Chinese stock market using monthly data on all stocks traded on the Shanghai Stock Exchange and Shenzhen Stock Exchange covering the period from January 1997 to December 2012. We find evidence of short-term and long-term contrarian profitability over the whole sample period when the estimation and holding horizons are 1 month or longer than 12 months, and the annualized return of contrarian portfolios increases with the estimation and holding horizons. We perform subperiod analysis and find that the long-term contrarian effect is significant in both bullish and bearish states, while the short-term contrarian effect disappears in bullish states. We compare the performance of contrarian portfolios based on different grouping schemes in the estimation period and find that decile grouping outperforms quintile grouping and tertile grouping, which is more evident and robust in the long run. Generally, loser portfolios and winner portfolios have positive returns, and loser portfolios perform much better than winner portfolios. Both loser and winner portfolios in bullish states perform better than those in the whole sample period. In contrast, loser and winner portfolios have smaller returns in bearish states, in which loser portfolio returns are significant only in the long term and winner portfolio returns become insignificant. These results are robust to skipping one month between the estimation and holding periods and hold for both stock exchanges. Our findings show that the Chinese stock market is not efficient in the weak form. These findings also have obvious practical implications for financial practitioners.
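The portfolio construction described above can be sketched directly: rank stocks by their estimation-period return, go long the past losers and short the past winners, and evaluate the spread over the holding period. The fraction-based grouping below stands in for the paper's tertile/quintile/decile schemes; the toy "perfect reversal" data are purely illustrative.

```python
def contrarian_return(past, future, frac=0.2):
    """Contrarian spread: average holding-period return of the bottom
    `frac` past performers (losers, long leg) minus that of the top
    `frac` past performers (winners, short leg)."""
    n = len(past)
    k = max(1, int(n * frac))
    order = sorted(range(n), key=lambda i: past[i])  # ascending past return
    losers, winners = order[:k], order[-k:]
    long_leg = sum(future[i] for i in losers) / k
    short_leg = sum(future[i] for i in winners) / k
    return long_leg - short_leg

past = [-0.5, -0.1, 0.0, 0.1, 0.5]
future = [-p for p in past]              # toy data: perfect reversal
print(contrarian_return(past, future))   # 1.0 under full reversal
```

A positive spread on real data over a given estimation/holding horizon is the contrarian profitability the paper documents; skipping a month between the two periods is a one-line index shift.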

9.
10.
Biophysics. Conformational mobility is one of the most important properties of the DNA molecule. A striking example of this mobility is provided by the formation of regions where...

11.
Malaria is a global health problem responsible for nearly one million deaths every year, around 85% of which involve children younger than five years old in Sub-Saharan Africa. In addition, millions of clinical cases are declared every year. The level of infection, expressed as parasite density, is classically defined as the number of asexual parasites per microliter of blood. Microscopy of Giemsa-stained thick blood films is the gold standard for parasite enumeration. Parasite density estimation methods usually involve threshold values: either the number of white blood cells counted or the number of high-power fields read. However, the statistical properties of the parasite density estimators generated by these methods have largely been overlooked. Here, we studied the statistical properties (mean error, coefficient of variation, false-negative rates) of the parasite density estimators of commonly used threshold-based counting techniques as a function of the threshold values. We also assessed the influence of the thresholds on the cost-effectiveness of parasite density estimation methods. In addition, we provide further insight into the behavior of measurement errors under varying threshold values, and into the optimal threshold values that minimize this variability.
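The WBC-threshold estimator discussed above has a simple closed form: parasites are tallied until a preset number of white blood cells (the threshold) has been seen, and the count is scaled by an assumed WBC concentration; 8000 cells per microliter is the conventional default. A sketch:

```python
def parasite_density(parasites_counted, wbc_counted, wbc_per_ul=8000):
    """Thick-film, WBC-threshold estimate of parasite density (parasites/uL):
    density = parasites_counted * wbc_per_ul / wbc_counted.
    wbc_counted is the threshold at which counting stopped; 8000/uL is the
    commonly assumed WBC concentration."""
    return parasites_counted * wbc_per_ul / wbc_counted

print(parasite_density(50, 200))  # 2000.0 parasites per microliter
```

The paper's question is how the statistical properties of this estimator (bias, coefficient of variation, false-negative rate) change as the stopping threshold `wbc_counted` is varied.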

12.
This work presents a specific stock-effort dynamical model. The stocks correspond to two populations of fish moving and growing between two fishery zones. They are harvested by two different fleets. The effort represents the number of fishing boats of the two fleets that operate in the two fishing zones. The bioeconomical model is a set of four ODEs governing the fishing efforts and the stocks in the two fishing areas. Furthermore, the migration of the fish between the two patches is assumed to be faster than the growth of the harvested stock. The displacement of the fleets is also faster than the variation in the number of fishing boats resulting from the investment of the fishing income. So, there are two time scales: a fast one corresponding to the migration between the two patches, and a slow one corresponding to growth. We use aggregation methods that allow us to reduce the dimension of the model and to obtain an aggregated model for the total fishing effort and fish stock of the two fishing zones. The mathematical analysis of the model is presented. Under some conditions, we obtain a stable equilibrium, which is a desired situation, as it leads to a sustainable harvesting equilibrium that keeps the stock at exploitable densities.
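The slow, aggregated dynamics that such methods produce can be illustrated with a generic stock-effort model of Gordon-Schaefer type. This is NOT the paper's four-ODE system or its aggregated reduction; the equations and all parameter values are a standard textbook stand-in, used here only to show the kind of stable sustainable-harvest equilibrium the abstract refers to.

```python
def simulate_fishery(r=1.0, K=1.0, q=1.0, p=2.0, c=1.0, alpha=0.5,
                     n0=0.9, E0=0.1, dt=0.01, steps=200_000):
    """Euler integration of a generic aggregated stock-effort model:
        dn/dt = r*n*(1 - n/K) - q*E*n   (logistic growth minus harvest)
        dE/dt = alpha*E*(p*q*n - c)     (effort follows fishing income)
    The interior equilibrium n* = c/(p*q), E* = (r/q)*(1 - n*/K) is the
    sustainable harvesting state."""
    n, E = n0, E0
    for _ in range(steps):
        dn = r * n * (1.0 - n / K) - q * E * n
        dE = alpha * E * (p * q * n - c)
        n += dt * dn
        E += dt * dE
    return n, E

n_eq, E_eq = simulate_fishery()
print(n_eq, E_eq)  # both approach 0.5 for these parameters
```

With these parameters the trajectory spirals into n* = c/(pq) = 0.5 and E* = 0.5: the stock settles at an exploitable density while the fleet size stabilizes, the qualitative outcome the paper's stability conditions guarantee.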

13.
The statistical properties of one estimator of the absolute genetic distance, D = (1/2) * sum_{i=1}^{K} |p_Xi - p_Yi|, between two populations X and Y are presented. It is shown that using this distance in small samples can be misleading, particularly when the populations are close to each other.
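The estimator itself is a one-liner over the K allele frequencies of the two populations:

```python
def absolute_genetic_distance(px, py):
    """Absolute genetic distance D = (1/2) * sum_i |p_Xi - p_Yi|,
    where px and py are the allele-frequency vectors of populations
    X and Y over the same K alleles."""
    assert len(px) == len(py), "frequency vectors must cover the same alleles"
    return 0.5 * sum(abs(a - b) for a, b in zip(px, py))

print(absolute_genetic_distance([0.7, 0.3], [0.4, 0.6]))  # about 0.3
```

The paper's point is that when px and py are estimated from small samples, the sampling noise in each |p_Xi - p_Yi| term never cancels (absolute values are nonnegative), biasing D upward for nearby populations.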

14.
The present study is an extension of the investigations made by Grieszbach and Schack (1993), in which recursive estimators of the quantile were introduced. Attention is focused on the statistical properties and the control of these estimators, in order to reduce their variance and to improve their capability of adaptation. Using methods of stochastic approximation, several control algorithms have been developed, covering both consistent and adaptive estimation. Owing to the recursive computation formula, the estimators are suitable for the analysis of large data sets and of sets whose elements are obtained sequentially. In this study, application examples from the analysis of EEG records are presented, where quantiles are used as threshold values.
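The stochastic-approximation idea behind such recursive quantile estimators is the Robbins-Monro recursion: nudge the estimate up when a new sample exceeds it and down otherwise, with a decreasing gain. The gain schedule c/n below is one common choice, not necessarily the controlled schedule of the paper.

```python
import random

def recursive_quantile(stream, p, q0=0.0, c=1.0):
    """Robbins-Monro recursion for the p-quantile of a data stream:
    q_{n+1} = q_n + (c/n) * (p - 1{x_n <= q_n}).
    Each sample is seen once, so the estimator suits large or
    sequentially arriving data sets."""
    q = q0
    for n, x in enumerate(stream, start=1):
        q += (c / n) * (p - (1.0 if x <= q else 0.0))
    return q

rng = random.Random(7)
est = recursive_quantile((rng.random() for _ in range(20000)), p=0.5)
print(est)  # near the true median 0.5 of Uniform(0, 1)
```

Because only the current estimate is stored, the same recursion can track a slowly drifting threshold in an EEG record by keeping the gain bounded away from zero (the "adaptive" variant).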

15.
Statistical Properties of a DNA Sample under the Finite-Sites Model
Z. Yang, Genetics, 1996, 144(4):1941-1950
Statistical properties of a DNA sample from a random-mating population of constant size are studied under the finite-sites model. It is assumed that there is no migration and that no recombination occurs within the locus. A Markov process model is used for nucleotide substitution, allowing for multiple substitutions at a single site. The evolutionary rates among sites are treated as either constant or variable. The general likelihood calculation using numerical integration involves intensive computation and is feasible for three or four sequences only; it may be used for validating approximate algorithms. Methods are developed to approximate the probability distribution of the number of segregating sites in a random sample of n sequences, with either constant or variable substitution rates across sites. Calculations using parameter estimates obtained for human D-loop mitochondrial DNAs show that among-site rate variation has a major effect on the distribution of the number of segregating sites; the distribution under the finite-sites model with variable rates among sites is quite different from that under the infinite-sites model.
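The summary statistic whose distribution the paper approximates, the number of segregating sites, is simply the count of polymorphic columns in the aligned sample:

```python
def segregating_sites(alignment):
    """Number of segregating (polymorphic) sites in a sample of aligned
    sequences of equal length: columns with more than one nucleotide.
    Note that under the finite-sites model this can undercount the true
    number of substitutions, since multiple hits at one site may revert
    a column to a single state."""
    return sum(1 for col in zip(*alignment) if len(set(col)) > 1)

sample = ["ACGTACGT",
          "ACGTACGA",
          "ACCTACGA"]
print(segregating_sites(sample))  # 2 (positions 3 and 8 vary)
```

The paper's contribution is the distribution of this count over random samples when multiple substitutions per site, and rate variation across sites, are allowed.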

16.
Primates display a remarkable ability to adapt to novel situations. Determining what is most pertinent in these situations is not always possible based only on the current sensory inputs, and often also depends on recent inputs and behavioral outputs that contribute to internal states. Thus, one can ask how cortical dynamics generate representations of these complex situations. It has been observed that mixed selectivity in cortical neurons contributes to represent diverse situations defined by a combination of the current stimuli, and that mixed selectivity is readily obtained in randomly connected recurrent networks. In this context, these reservoir networks reproduce the highly recurrent nature of local cortical connectivity. Recombining present and past inputs, random recurrent networks from the reservoir computing framework generate mixed selectivity which provides pre-coded representations of an essentially universal set of contexts. These representations can then be selectively amplified through learning to solve the task at hand. We thus explored their representational power and dynamical properties after training a reservoir to perform a complex cognitive task initially developed for monkeys. The reservoir model inherently displayed a dynamic form of mixed selectivity, key to the representation of the behavioral context over time. The pre-coded representation of context was amplified by training a feedback neuron to explicitly represent this context, thereby reproducing the effect of learning and allowing the model to perform more robustly. This second version of the model demonstrates how a hybrid dynamical regime combining spatio-temporal processing of reservoirs, and input-driven attracting dynamics generated by the feedback neuron, can be used to solve a complex cognitive task. We compared reservoir activity to neural activity of the dorsal anterior cingulate cortex of monkeys, which revealed similar network dynamics. We argue that reservoir computing is a pertinent framework to model local cortical dynamics and their contribution to higher cognitive function.
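The reservoir-computing recipe sketched above, a fixed random recurrent network whose states mix present and past inputs, read out by trained linear weights, fits in a short script. This is a minimal echo-state-style sketch on a toy recall task, not the paper's model: the row-norm scaling, delta-rule readout training, and all sizes are illustrative assumptions.

```python
import math, random

def make_reservoir(n=50, density=0.2, scale_target=0.9, seed=3):
    """Random sparse recurrent weights, scaled by the max row-sum norm
    (a crude stand-in for spectral-radius normalisation, guaranteeing
    fading memory)."""
    rng = random.Random(seed)
    W = [[rng.gauss(0, 1) if rng.random() < density else 0.0
          for _ in range(n)] for _ in range(n)]
    s = scale_target / max(sum(abs(w) for w in row) for row in W)
    return [[w * s for w in row] for row in W]

def run_reservoir(W, w_in, inputs):
    """tanh reservoir: each state nonlinearly mixes the current input with
    a fading trace of past inputs -- the source of mixed selectivity."""
    n = len(W)
    x = [0.0] * n
    states = []
    for u in inputs:
        x = [math.tanh(sum(W[i][j] * x[j] for j in range(n)) + w_in[i] * u)
             for i in range(n)]
        states.append(x)
    return states

rng = random.Random(4)
n_units = 50
W = make_reservoir(n_units)
w_in = [rng.gauss(0, 1) for _ in range(n_units)]
u = [rng.choice([-1.0, 1.0]) for _ in range(300)]
target = [0.0] + u[:-1]            # toy task: recall the previous input
S = run_reservoir(W, w_in, u)

w_out = [0.0] * n_units            # linear readout trained by the delta rule
for _ in range(20):
    for x, y in zip(S, target):
        err = y - sum(a * b for a, b in zip(w_out, x))
        w_out = [w + 0.01 * err * a for w, a in zip(w_out, x)]

mse = sum((y - sum(a * b for a, b in zip(w_out, x))) ** 2
          for x, y in zip(S, target)) / len(u)
print(mse)  # well below 1.0, the error of always predicting zero
```

Only the readout is trained; the recurrent weights stay random and fixed, which is exactly why the representations are "pre-coded" and merely amplified by learning.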

17.
S. Datta, M. Kiparsky, D. M. Rand, J. Arnold, Genetics, 1996, 144(4):1985-1992
In this paper we use cytonuclear disequilibria to test the neutrality of mtDNA markers. The data considered here involve sample frequencies of cytonuclear genotypes subject to both statistical sampling variation and genetic sampling variation. First, we obtain the dynamics of the sample cytonuclear disequilibria assuming random drift alone as the source of genetic sampling variation. Next, we develop a test statistic using cytonuclear disequilibria via the theory of generalized least squares to test the random drift model. The null distribution of the test statistic is shown to be approximately chi-squared using an asymptotic argument as well as computer simulation. The power of the test statistic is investigated under an alternative model with drift and selection. The method is illustrated using data from cage experiments utilizing different cytonuclear genotypes of Drosophila melanogaster. A program for implementing the neutrality test is available upon request.

18.
Biophysical Journal, 2020, 118(7):1564-1575
The endothelial glycocalyx layer (EGL), which consists of long proteoglycans protruding from the endothelium, acts as a regulator of inflammation by preventing leukocyte engagement with adhesion molecules on the endothelial surface. The amount of resistance to adhesive events the EGL provides is the result of two properties: EGL thickness and stiffness. To determine these, we used an atomic force microscope to indent the surfaces of cultured endothelial cells with a glass bead and evaluated two different approaches for interpreting the resulting force-indentation curves. In one, we treat the EGL as a molecular brush, and in the other, we treat it as a thin elastic layer on an elastic half-space. The latter approach proved more robust in our hands and yielded a thickness of 110 nm and a modulus of 0.025 kPa. Neither value showed significant dependence on indentation rate. The brush model indicated a larger layer thickness (∼350 nm) but tended to result in larger uncertainties in the fitted parameters. The modulus of the endothelial cell was determined to be 3.0–6.5 kPa (1.5–2.5 kPa for the brush model), with a significant increase in modulus with increasing indentation rates. For forces and leukocyte properties in the physiological range, a model of a leukocyte interacting with the endothelium predicts that the number of molecules within bonding range should decrease by an order of magnitude because of the presence of a 110-nm-thick layer and even further for a glycocalyx with larger thickness. Consistent with these predictions, neutrophil adhesion increased for endothelial cells with reduced EGL thickness because they were grown in the absence of fluid shear stress. These studies establish a framework for understanding how glycocalyx layers with different thickness and stiffness limit adhesive events under homeostatic conditions and how glycocalyx damage or removal will increase leukocyte adhesion potential during inflammation.
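The baseline for both fitting approaches is the classical Hertz relation for a spherical bead indenting an elastic half-space. The sketch below keeps only that leading term; the thin-layer model actually used in the paper adds correction terms in indentation/thickness, and the numbers in the demo are illustrative, not the paper's data.

```python
def hertz_force(indentation, radius, modulus, poisson=0.5):
    """Hertz model for a rigid spherical bead on an elastic half-space:
    F = (4/3) * E / (1 - nu**2) * sqrt(R) * delta**1.5   (SI units:
    indentation and radius in m, modulus in Pa, force in N).
    poisson=0.5 assumes an incompressible material, common for cells."""
    return (4.0 / 3.0) * modulus / (1.0 - poisson ** 2) \
           * radius ** 0.5 * indentation ** 1.5

# illustrative numbers: 5-um bead, 3 kPa cell modulus, 1-um indentation
print(hertz_force(1e-6, 5e-6, 3000.0))  # roughly 1.2e-8 N, i.e. ~12 nN
```

Fitting this curve (or its thin-layer extension) to measured force-indentation data is what yields the layer modulus and, in the extended model, the layer thickness reported above.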

19.
This paper looks at the relationship between negative news and stock markets in times of global crisis, such as the 2008/2009 period. We analysed one year of front-page banner headlines of three financial newspapers, the Wall Street Journal, the Financial Times, and Il Sole24ore, to examine the influence of bad news on both stock market volatility and dynamic correlation. Our results show that the press and the markets influenced each other in generating market volatility, and in particular that the Wall Street Journal had a crucial effect both on volatility and on the correlation between the US and foreign markets. We also found significant differences between the newspapers in their interpretation of the crisis, with the Financial Times being significantly pessimistic even in phases of low market volatility. Our results confirm the reflexive nature of stock markets. When the situation is uncertain and unpredictable, market behaviour may even reflect qualitative, big-picture, and subjective information such as streamers in a newspaper, whose economic and informative value is questionable.

20.
An analytical model based on the statistical properties of open reading frames (ORFs) of eubacterial genomes, such as codon composition and the sequence length of all reading frames, was developed. This new model predicts the average length, maximum length, and length distribution of the ORFs of 70 species with GC contents varying between 21% and 74%. Furthermore, the number of annotated genes is predicted with good agreement. However, the ORF length distribution in the five alternative reading frames shows interesting deviations from the predicted distribution. In particular, long ORFs appear more often than statistically expected. The unexpected depletion of stop codons in these alternative open reading frames cannot be completely explained by a biased codon usage in the +1 frame. While it is unknown whether the stop-codon depletion has a biological function, it could be due to a protein-coding capacity of alternative ORFs exerting a selection pressure that prevents the fixation of stop-codon mutations. The comparison of the analytical model with bacterial genomes therefore leads to a hypothesis suggesting novel gene candidates, which can now be investigated in subsequent wet-lab experiments.
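The core of such an analytical model is the probability that a random codon is a stop codon given the genome's GC content: ORF lengths then follow a geometric waiting time for the first stop. The independent-positions composition model below is a simplified version of the idea, not the paper's full model (which also uses observed codon composition).

```python
def stop_probability(gc):
    """Probability that a random codon is a stop (TAA, TAG, or TGA) when
    each position is A or T with probability (1-gc)/2 and G or C with
    probability gc/2 -- the independence assumption of this sketch."""
    at, gcp = (1.0 - gc) / 2.0, gc / 2.0
    return at * at * at + at * at * gcp + at * gcp * at  # TAA + TAG + TGA

def expected_orf_length(gc):
    """Mean ORF length in codons: geometric waiting time until the first
    stop codon, 1 / P(stop)."""
    return 1.0 / stop_probability(gc)

print(expected_orf_length(0.5))  # about 21.3 codons (= 64/3) at GC = 50%
```

Because all three stop codons are AT-rich, P(stop) falls and the expected ORF length grows sharply as GC content rises, which is why the model must be checked across the 21%-74% GC range; ORFs much longer than this geometric prediction are the "long ORF" excess the abstract highlights.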
