An improved Four-Russians method and sparsified Four-Russians algorithm for RNA folding
Authors: Yelena Frid, Dan Gusfield
Affiliation: Department of Computer Science, UC Davis, Davis, USA
Abstract:

Background

The basic RNA secondary structure prediction problem, or single sequence folding problem (SSF), was solved 35 years ago by a now well-known $O(n^3)$-time dynamic programming method. Recently three methodologies, Valiant, Four-Russians, and Sparsification, have been applied to speed up RNA secondary structure prediction. The Sparsification method exploits two properties of the input: the number $Z$ of subsequences whose endpoints belong to the optimal folding set, and the maximum number $L$ of base pairs. These sparsity properties satisfy $0 \le L \le n/2$ and $n \le Z \le n^2/2$, and the method reduces the running time to $O(LZ)$. The Four-Russians method, in contrast, achieves its speedup by tabling partial results.
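The cubic-time dynamic program referred to above can be sketched for the base-pair maximization variant of SSF. This is a minimal illustrative sketch, not the paper's implementation; the function name, the pairing rules (Watson-Crick plus G-U wobble), and the `min_loop` parameter are assumptions made for the example.

```python
def max_pairs(seq, min_loop=0):
    """Classic O(n^3) DP for base-pair maximization (a sketch).

    W[i][j] holds the maximum number of complementary base pairs
    achievable in the subsequence seq[i..j].
    """
    pair = {('A', 'U'), ('U', 'A'), ('G', 'C'),
            ('C', 'G'), ('G', 'U'), ('U', 'G')}
    n = len(seq)
    W = [[0] * n for _ in range(n)]
    for span in range(1, n):                  # subsequence length minus 1
        for i in range(n - span):
            j = i + span
            best = W[i][j - 1]                # case: position j is unpaired
            for k in range(i, j - min_loop):  # case: j pairs with some k
                if (seq[k], seq[j]) in pair:
                    left = W[i][k - 1] if k > i else 0
                    best = max(best, left + 1 + W[k + 1][j - 1])
            W[i][j] = best
    return W[0][n - 1] if n else 0
```

The three nested loops (over span, start index, and split point) give the $O(n^3)$ bound; both the Four-Russians and Sparsification speedups discussed in the paper target the innermost loop.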

Results

In this paper, we explore three different algorithmic speedups. First, we expand and reformulate the single sequence folding Four-Russians $\Theta(n^3/\log^2 n)$-time algorithm to utilize an on-demand lookup table. Second, we create a framework that combines the fastest Sparsification method and the new fastest on-demand Four-Russians method. This combined method has worst-case running time $O(\tilde{L}\tilde{Z})$, where $\frac{L}{\log n} \le \tilde{L} \le \min\left(L, \frac{n}{\log n}\right)$ and $\frac{Z}{\log n} \le \tilde{Z} \le \min\left(Z, \frac{n^2}{\log n}\right)$. Third, we update the Four-Russians formulation to achieve an on-demand $O(n^2/\log^2 n)$-time parallel algorithm. This then leads to an asymptotic speedup of $O(\tilde{L}\tilde{Z_j})$, where $\frac{Z_j}{\log n} \le \tilde{Z_j} \le \min\left(Z_j, \frac{n}{\log n}\right)$ and $Z_j$ is the number of subsequences with endpoint $j$ belonging to the optimal folding set.
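The on-demand lookup table mentioned above can be illustrated with a toy sketch. In the Four-Russians speedup, adjacent DP cells differ by a bounded amount, so a block of $q$ cells is encoded by its difference vector and the result of processing the whole block is tabled. An on-demand table fills entries lazily, the first time a vector is actually seen, rather than precomputing all $2^q$ possibilities up front. Everything here (the function names, the 0/1 difference encoding, the block-maximum operation) is an illustrative assumption, not the paper's actual table:

```python
def make_on_demand_table():
    """Toy on-demand (lazily memoized) Four-Russians-style table.

    block_max(base, diffs) returns the maximum cell value in a block
    whose cells are base plus the prefix sums of the 0/1 vector diffs.
    Each distinct difference vector is processed only once; its block
    result is cached in `table` and reused on later queries.
    """
    table = {}

    def block_max(base, diffs):
        if diffs not in table:
            running, best = 0, 0
            for d in diffs:          # scan the block once, first time only
                running += d
                best = max(best, running)
            table[diffs] = best      # cache the block's result
        return base + table[diffs]

    return block_max, table
```

Because only encountered vectors are tabled, no work is spent on difference vectors that never arise in the input, which is what lets the on-demand formulation cooperate with sparsity.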

Conclusions

The on-demand formulation not only removes all extraneous computation and allows us to incorporate more realistic scoring schemes, but also lets us take advantage of the sparsity properties. Through asymptotic analysis and empirical testing on the base-pair maximization variant and a more biologically informative scoring scheme, we show that this Sparse Four-Russians framework achieves a speedup on every problem instance that is asymptotically never worse, and empirically better, than the speedup achieved by either of the two methods alone.
This article is indexed in SpringerLink and other databases.