More Time-Space Tradeoffs for Finding a Shortest Unique Substring

Algorithms ◽  
2020 ◽  
Vol 13 (9) ◽  
pp. 234
Author(s):  
Hideo Bannai ◽  
Travis Gagie ◽  
Gary Hoppenworth ◽  
Simon J. Puglisi ◽  
Luís M. S. Russo

We extend recent results regarding finding shortest unique substrings (SUSs) to obtain new time-space tradeoffs for this problem and for the generalization of finding k-mismatch SUSs. Our new results include the first algorithm for finding a k-mismatch SUS in sublinear space, which we obtain by extending an algorithm by Senanayaka (2019) and combining it with a result on sketching by Gawrychowski and Starikovskaya (2019). We first describe how, given a text T of length n and m words of workspace, with high probability we can find an SUS of length L in O(n(L/m) log L) time using random access to T, or in O(n(L/m) log²(L) log log σ) time using O((L/m) log² L) sequential passes over T. We then describe how, for constant k, with high probability, we can find a k-mismatch SUS in O(n^(1+ϵ) L/m) time using O(n^ϵ L/m) sequential passes over T, again using only m words of workspace. Finally, we also describe a deterministic algorithm that takes O(nτ log σ log n) time to find an SUS using O(n/τ) words of workspace, where τ is a parameter.
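As a point of reference for the problem being solved, a shortest unique substring of T is a shortest substring occurring exactly once in T. A naive brute-force finder (not the paper's space-efficient algorithms, which use only m words of workspace) can be sketched as:

```python
def shortest_unique_substring(t: str) -> str:
    """Return a shortest substring of t that occurs exactly once (brute force)."""
    n = len(t)
    for length in range(1, n + 1):            # try lengths from shortest up
        counts = {}                           # substring -> occurrence count
        for i in range(n - length + 1):
            s = t[i:i + length]
            counts[s] = counts.get(s, 0) + 1
        for i in range(n - length + 1):       # return the leftmost unique one
            s = t[i:i + length]
            if counts[s] == 1:
                return s
    return t                                  # t itself is always unique
```

This takes O(n²) time and space at worst; the point of the paper is to trade time for workspace far below this.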

2018 ◽  
Author(s):  
Laurel G. Woodruff ◽  
Suzanne W. Nicholson ◽  
Connie L. Dicken ◽  
Klaus J. Schulz

2021 ◽  
Vol 11 (10) ◽  
pp. 4607
Author(s):  
Xiaozhou Guo ◽  
Yi Liu ◽  
Kaijun Tan ◽  
Wenyu Mao ◽  
Min Jin ◽  
...  

In password guessing, the Markov model is still widely used due to its simple structure and fast inference speed. However, a Markov model that generates passwords by random sampling suffers from a high repetition rate, which leads to a low cover rate. A model based on enumeration has a lower cover rate for high-probability passwords, and it is a deterministic algorithm that always generates the same passwords in the same order, making it vulnerable to attack. We design a dynamic distribution mechanism based on the random sampling method. This mechanism dynamically adjusts the probability distribution of passwords so that it tends strictly toward a uniform distribution during the generation process. We apply the dynamic distribution mechanism to the Markov model and propose a dynamic Markov model. Through comparative experiments on the RockYou dataset, we set the optimal adjustment degree α. Compared with the Markov model without the dynamic distribution mechanism, the dynamic Markov model reduced the repetition rate from 75.88% to 66.50% and increased the cover rate from 37.65% to 43.49%. In addition, the dynamic Markov model had the highest cover rate for high-probability passwords. Finally, the model avoided the drawbacks of a deterministic algorithm, and when it was run five times, it reached almost the same cover rate as OMEN.
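The abstract does not spell out the adjustment rule, but the idea of damping the probability of already-sampled transitions so the output distribution flattens over time can be sketched as follows. The order-1 model, the `^`/`$` boundary markers, and the per-transition damping factor `alpha` are all illustrative assumptions, not the paper's exact mechanism:

```python
import random
from collections import defaultdict

def train_markov(passwords):
    """Order-1 Markov model: character transition counts, with ^ (start) and $ (end)."""
    trans = defaultdict(lambda: defaultdict(int))
    for pw in passwords:
        chars = ["^"] + list(pw) + ["$"]
        for a, b in zip(chars, chars[1:]):
            trans[a][b] += 1
    return trans

def sample_dynamic(trans, n, alpha=0.5, max_len=16):
    """Sample n passwords; each transition taken is damped by alpha,
    so later samples drift toward less-covered strings (a hypothetical
    stand-in for the paper's dynamic distribution mechanism)."""
    weights = {a: dict(bs) for a, bs in trans.items()}   # mutable copy
    out = []
    for _ in range(n):
        pw, state = [], "^"
        while len(pw) < max_len:
            nxt = weights[state]
            chars, w = zip(*nxt.items())
            c = random.choices(chars, weights=w)[0]
            if c == "$":
                break
            pw.append(c)
            nxt[c] *= alpha          # damp the transition just taken
            state = c
        out.append("".join(pw))
    return out
```

Damping rather than forbidding repeated transitions keeps every training-set password reachable while lowering the repetition rate, which is the qualitative behavior the paper reports.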


1995 ◽  
Vol 32 (6) ◽  
pp. 758-767 ◽  
Author(s):  
Stephen R. Hicock ◽  
Olav B. Lian

Sisters Creek Formation is formally defined, stratotypes are established for it, and the time–space chart is updated for the Fraser Lowland, southwestern British Columbia. The Sisters Creek is a Pleistocene formation comprising in situ and reworked organic-rich sediments, and nonorganic silt, sand, and gravel. The formation was deposited during the Port Moody interstade (within the Late Wisconsinan Fraser Glaciation; δ18O stage 2) between the Coquitlam stade (early Fraser Glaciation) and the main Vashon stadial maximum that occurred about 14.5 ka. The Sisters Creek Formation represents a glacial recession in southwestern British Columbia that generally coincided with the timing of the last global glacial maximum. The new time–space chart implies that, in Fraser Lowland, the Fraser Glaciation represents the rapid advances and retreats of glacial lobes issuing from surrounding mountains, which remained ice-covered during interstades.


2014 ◽  
Vol 651-653 ◽  
pp. 2287-2290
Author(s):  
Zi Hua Zhang ◽  
Hua An Zhang ◽  
Zhi Ying Zhong

We point out that Einstein's idea that the speed of light is independent of the observer, i.e., the principle of the invariance of the speed of light, is incorrect. In place of the Lorentz transformation we suggest a new time-space transformation; this revision of relativity will deeply affect the development of science and technology.


2013 ◽  
Vol 56 (6) ◽  
pp. 840-850 ◽  
Author(s):  
LIANG Wen-Quan ◽  
YANG Chang-Chun ◽  
WANG Yan-Fei ◽  
LIU Hong-Wei

2021 ◽  
Author(s):  
Tingting Feng ◽  
Liang Guo ◽  
Hongli Gao ◽  
Tao Chen ◽  
Yaoxiang Yu ◽  
...  

Abstract In order to accurately monitor the tool wear process, it is usually necessary to collect a variety of sensor signals during the cutting process. Different sensor signals in the feature space can provide complementary information. In addition, the monitoring signal is time-series data, which also contains a wealth of tool degradation information in the time dimension. However, how to fuse multi-sensor information in the time and space dimensions is a key issue that needs to be solved. This paper proposes a new time-space attention mechanism driven multi-feature fusion method to realize tool wear monitoring. Firstly, many features are extracted from the different sensor signals and preliminarily selected. Then, a new feature fusion model with a time-space attention mechanism is constructed to fuse features in the time and space dimensions. Finally, the tool degradation model is established according to the predicted wear, and the tool's remaining useful life is predicted by a particle filter. The effectiveness of this method is verified by a tool life-cycle wear experiment. Comparison with other feature fusion models demonstrates that the proposed method realizes tool wear monitoring more accurately and has better stability.
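The abstract does not give the attention formulation, but the general shape of weighting a (timesteps × features) multi-sensor matrix along both axes can be sketched as below. The score vectors `w_t` and `w_f`, and the product fusion, are hypothetical simplifications, not the paper's model:

```python
import numpy as np

def softmax(x, axis):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def time_space_fuse(X, w_t, w_f):
    """Fuse a (T, F) feature matrix: w_t scores timesteps, w_f scores
    feature channels (the 'space' dimension)."""
    a_t = softmax(X @ w_f, axis=0)      # (T,) attention weights over time
    a_f = softmax(X.T @ w_t, axis=0)    # (F,) attention weights over features
    return (a_t @ X) * a_f              # (F,) attention-fused feature vector
```

In a trained model `w_t` and `w_f` would be learned jointly with the downstream wear predictor; here they just stand in for whatever scoring the paper's mechanism uses.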


2021 ◽  
Vol 17 (4) ◽  
pp. 1-26
Author(s):  
Guy Even ◽  
Reut Levi ◽  
Moti Medina ◽  
Adi Rosén

We consider the problem of sampling from a distribution on graphs, specifically when the distribution is defined by an evolving graph model, and consider the time, space, and randomness complexities of such samplers. In the standard approach, the whole graph is chosen randomly according to the randomized evolving process, stored in full, and then queries on the sampled graph are answered by simply accessing the stored graph. This may require prohibitive amounts of time, space, and random bits, especially when only a small number of queries are actually issued. Instead, we propose a setting where one generates parts of the sampled graph on-the-fly, in response to queries, and therefore requires amounts of time, space, and random bits that are a function of the actual number of queries. Yet, the responses to the queries correspond to a graph sampled from the distribution in question. Within this framework, we focus on two random graph models: the Barabási-Albert Preferential Attachment model (BA-graphs) (Science, 286(5439):509–512) (for the special case of out-degree 1) and the random recursive tree model (Theory of Probability and Mathematical Statistics, (51):1–28). We give on-the-fly generation algorithms for both models. With probability 1-1/poly(n), each and every query is answered in polylog(n) time, and the increase in space and the number of random bits consumed by any single query are both polylog(n), where n denotes the number of vertices in the graph. Our work thus proposes a new approach for the access to huge graphs sampled from a given distribution, and our results show that, although the BA random graph model is defined by a sequential process, efficient random access to the graph's nodes is possible. In addition to the conceptual contribution, efficient on-the-fly generation of random graphs can serve as a tool for the efficient simulation of sublinear algorithms over large BA-graphs, and the efficient estimation of their performance on such graphs.
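For concreteness, the sequential process that the paper's on-the-fly generator must reproduce, the out-degree-1 BA model, can be sketched as follows. This is the standard linear-space sequential sampler (the baseline the paper improves on), not the authors' query-driven algorithm; the edge-endpoint list trick makes uniform choice equivalent to degree-proportional choice:

```python
import random

def ba_tree(n, seed=None):
    """Sequential Barabási-Albert process with out-degree 1: each new node i
    attaches to an earlier node chosen with probability proportional to its
    current degree. Returns parent[i] = attachment target of node i."""
    rng = random.Random(seed)
    parent = [-1, 0]        # node 0 is the root; node 1 attaches to it
    endpoints = [0, 1]      # one entry per edge endpoint: a uniform pick
                            # from this list is a degree-proportional pick
    for i in range(2, n):
        p = rng.choice(endpoints)
        parent.append(p)
        endpoints.extend((i, p))
    return parent
```

Generating all n nodes up front costs Θ(n) time, space, and random bits even if only a few nodes are ever queried, which is exactly the overhead the on-the-fly setting avoids.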


1994 ◽  
Vol 21 (6) ◽  
pp. 653-673 ◽  
Author(s):  
K Spiekermann ◽  
M Wegener
