Algorithms and Data Structures
Recently Published Documents





Andrey Makashov ◽  
Andrew Makhorin ◽  
Maxim Terentiev

A wireless sensor network (WSN) with a tree-like topology is considered, which performs measurements and transmits the results to the consumer. Under interference, the low transmitter power of WSN nodes makes the transmitted information vulnerable, leading to significant data loss. To reduce data loss during transmission, a noise-immune WSN model is proposed. Having detected the absence of a stable connection between a pair of nodes, such a WSN transfers the interaction between these nodes to a radio channel free from interference. For this, the model, in addition to forming the network and transferring application data, provides for checking communication availability using a keep-alive mechanism and for restoring the network with a possible channel change. A key feature of the proposed approach is the ability to restore network connectivity under interference of significant power and duration, which makes it impossible to exchange service messages on the channel selected for node interaction. To support the model, algorithms and data structures have been developed, and indicators have been formalized to assess the quality of the anti-jamming system.
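The keep-alive-and-switch idea described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's actual model: the `Link` class, the `MISSED_LIMIT` threshold, and the channel numbers are all hypothetical.

```python
MISSED_LIMIT = 3  # consecutive missed keep-alives before the channel is declared jammed

class Link:
    """Tracks keep-alive state for one pair of WSN nodes (illustrative sketch)."""

    def __init__(self, channels):
        self.channels = list(channels)   # candidate channels assumed free of interference
        self.current = self.channels[0]  # channel currently used for node interaction
        self.missed = 0

    def on_keepalive_ack(self):
        self.missed = 0                  # stable connection confirmed

    def on_keepalive_timeout(self):
        self.missed += 1
        if self.missed >= MISSED_LIMIT:  # absence of a stable connection detected
            self.switch_channel()

    def switch_channel(self):
        # transfer the interaction to the next candidate channel and restart the counter
        idx = self.channels.index(self.current)
        self.current = self.channels[(idx + 1) % len(self.channels)]
        self.missed = 0

link = Link(channels=[11, 15, 20])
for _ in range(3):
    link.on_keepalive_timeout()
print(link.current)  # after three missed keep-alives the link moves to channel 15
```

A real implementation would also need the restoration protocol the paper describes for the case where service messages themselves cannot be exchanged on the jammed channel.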

Sensors ◽  
2021 ◽  
Vol 21 (24) ◽  
pp. 8241
Mitko Aleksandrov ◽  
Sisi Zlatanova ◽  
David J. Heslop

Voxel-based data structures, algorithms, frameworks, and interfaces have been used in computer graphics and many other applications for decades. There is a general need for adequate digital representations, such as voxels, that secure unified data structures, multi-resolution options, robust validation procedures and flexible algorithms for different 3D tasks. In this review, we evaluate the most common properties and algorithms for voxelisation of 2D and 3D objects. Many voxelisation algorithms and their characteristics are presented, targeting points, lines, triangles, surfaces and solids as geometric primitives. For lines, we identify three groups of algorithms: the first two achieve different voxelisation connectivity, while the third covers voxelisation of curves. Surface voxelisation proves more desirable than solid voxelisation, as it can be achieved faster and requires less memory when voxels are stored sparsely. We also evaluate the available voxel data structures, splitting them into static and dynamic grids according to how frequently a data structure is updated. Static grids are dominated by SVO-based data structures focusing on memory footprint reduction and attribute preservation, with SVDAG and SSVDAG the most advanced methods. The state-of-the-art dynamic voxel data structure is NanoVDB, which is superior to the rest in terms of speed as well as support for out-of-core processing and data management, the key to handling large dynamically changing scenes. To our knowledge, this is the first review evaluating the available voxelisation algorithms for different geometric primitives as well as voxel data structures.
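The sparse storage idea the review highlights can be shown in a few lines: occupied voxels are kept in a set of integer coordinates rather than a dense grid. This is a generic point-voxelisation sketch, not code from any of the reviewed systems; the function name and sample cloud are illustrative.

```python
import math

def voxelise_points(points, voxel_size):
    """Sparse voxelisation: map each 3D point to the integer coordinates
    of the voxel containing it, storing only occupied voxels in a set."""
    return {tuple(math.floor(c / voxel_size) for c in p) for p in points}

cloud = [(0.2, 0.9, 0.1), (0.3, 0.8, 0.2), (1.7, 0.1, 0.4)]
occupied = voxelise_points(cloud, voxel_size=1.0)
print(occupied)  # the first two points share a voxel: {(0, 0, 0), (1, 0, 0)}
```

Memory here grows with the number of occupied voxels, not with the grid extent, which is the property that makes surface voxelisation cheaper than solid voxelisation in sparse storage.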

Robert Worden

Bayesian formulations of learning imply that whenever the evidence for a correlation between events in an animal’s habitat is sufficient, the correlation is learned. This implies that regularities can be learnt rapidly, from small numbers of learning examples. This speed of learning gives maximum possible fitness, and no faster learning is possible. There is evidence in many domains that animals and people can learn at nearly Bayesian optimal speeds. These domains include associative conditioning, and the more complex domains of navigation and language. There are computational models of learning which learn at near-Bayesian speeds in complex domains, and which can scale well – to learn thousands of pieces of knowledge (i.e., relations and associations). These are not neural net models. They can be defined in computational terms, as algorithms and data structures at David Marr’s [1] Level Two. Their key data structures are composite feature structures, which are graphs of multiple linked nodes. This leads to the hypothesis that animal learning results not from deep neural nets (which typically require thousands of training examples), but from neural implementations of the Level Two models of fast learning; and that neurons provide the facilities needed to implement those models at Marr’s Level Three. The required facilities include feature structures, dynamic binding, one-shot memory for many feature structures, pattern-based associative retrieval, unification and generalization of feature structures. These may be supported by multiplexing of data and metadata in the same neural fibres.
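The claim that Bayesian updating moves quickly on few examples can be made concrete with the simplest conjugate model. This is a textbook Beta-Binomial illustration, not a model from the paper; the prior and counts are chosen only to show the effect.

```python
from fractions import Fraction

def beta_update(alpha, beta, successes, failures):
    """Conjugate Beta-Binomial update: add observed counts to the prior."""
    return alpha + successes, beta + failures

def posterior_mean(alpha, beta):
    """Mean of a Beta(alpha, beta) distribution, kept exact with Fraction."""
    return Fraction(alpha, alpha + beta)

# Uniform prior Beta(1, 1): no initial belief about the correlation.
# After only five observations (four confirming, one not), the posterior
# mean has already shifted from 1/2 to 5/7.
a, b = beta_update(1, 1, successes=4, failures=1)
print(posterior_mean(a, b))  # 5/7
```

The point is the rate: a handful of observations suffices once the evidence is sufficient, which is the "Bayesian optimal speed" baseline the abstract compares animal learning against.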

2021 ◽  
Thomas R Etherington ◽  
O. Pascal Omondiagbe

Computational geometry algorithms and data structures are widely applied across numerous scientific domains, and there are a variety of R packages that implement computational geometry functionality. However, these packages often work in specific numbers of dimensions, do not have directly compatible data structures, and include additional non-computational-geometry functionality that can be domain specific. Our objective in developing the compGeometeR package is to implement, in a generic and consistent framework, the most commonly used combinatorial computational geometry algorithms so that they can be easily combined and integrated into domain-specific scientific workflows. We briefly explain the discrete and digital combinatorial computational geometry algorithms available in compGeometeR, and identify priorities for future development.
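As a language-agnostic illustration of the kind of combinatorial algorithm such a package wraps (compGeometeR itself is an R package), here is the classic monotone-chain convex hull in Python. The implementation is a standard textbook sketch, not taken from compGeometeR.

```python
def cross(o, a, b):
    """2D cross product of vectors OA and OB; positive for a left turn."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    """Andrew's monotone chain: hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower = []
    for p in pts:                       # build the lower hull left to right
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    upper = []
    for p in reversed(pts):             # build the upper hull right to left
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # drop the duplicated endpoints where the two chains meet
    return lower[:-1] + upper[:-1]

hull = convex_hull([(0, 0), (1, 1), (2, 2), (2, 0), (0, 2), (1, 0)])
print(hull)  # [(0, 0), (2, 0), (2, 2), (0, 2)]
```

The sort dominates, giving O(n log n) overall, which is typical of the combinatorial algorithms the package aims to expose in a consistent interface.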

2021 ◽  
Vol 73 (1) ◽  
pp. 134-141
A.R. Baidalina ◽  
S.A. Boranbayev

The article discusses ways of programming algorithms for complex data structures in Python. Knowledge of these structures and their corresponding algorithms is necessary when choosing the best methods for developing software. When studying the subject "Algorithms and Data Structures", it is important to understand the essence of data structures, because adapting a data structure to a specific problem requires an understanding of its essence and algorithms. Examples are given of programming algorithms for dynamic lists and binary search trees in the currently widely used Python language. Depth-first and breadth-first graph traversal algorithms are implemented optimally and clearly using the Python dictionary.
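The dictionary-based traversal the abstract mentions can be sketched as follows. The graph and vertex names are illustrative, not examples from the article.

```python
from collections import deque

graph = {            # adjacency list stored as a Python dictionary
    'A': ['B', 'C'],
    'B': ['D'],
    'C': ['D'],
    'D': [],
}

def bfs(graph, start):
    """Breadth-first traversal: visit vertices level by level using a queue."""
    visited, order, queue = {start}, [], deque([start])
    while queue:
        v = queue.popleft()
        order.append(v)
        for w in graph[v]:
            if w not in visited:
                visited.add(w)
                queue.append(w)
    return order

def dfs(graph, start, order=None):
    """Depth-first traversal (recursive), following each branch to its end."""
    if order is None:
        order = []
    order.append(start)
    for w in graph[start]:
        if w not in order:
            dfs(graph, w, order)
    return order

print(bfs(graph, 'A'))  # ['A', 'B', 'C', 'D']
print(dfs(graph, 'A'))  # ['A', 'B', 'D', 'C']
```

The dictionary gives O(1) expected lookup of a vertex's neighbour list, which is what makes both traversals read so directly in Python.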

Petr Yu. Bugakov ◽  
Nikita A. Baraev

The article presents the results of developing software for automated generation of graphs and their corresponding adjacency and incidence tables. The program has been tested in the educational process within the laboratory work "Representation of a graph in a computer in the form of an adjacency matrix and a list of edges" in the discipline "Algorithms and Data Structures" for first-year students of Information Systems and Technologies.
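The two tables named in the laboratory work can be built from an edge list in a few lines. This is a generic sketch of the representations, not the article's software; the triangle graph is an arbitrary example.

```python
def adjacency_matrix(n, edges):
    """n x n adjacency matrix of an undirected graph given as an edge list."""
    m = [[0] * n for _ in range(n)]
    for u, v in edges:
        m[u][v] = m[v][u] = 1
    return m

def incidence_matrix(n, edges):
    """n x len(edges) incidence matrix: rows are vertices, columns are edges."""
    m = [[0] * len(edges) for _ in range(n)]
    for j, (u, v) in enumerate(edges):
        m[u][j] = m[v][j] = 1
    return m

edges = [(0, 1), (1, 2), (2, 0)]  # a triangle on vertices 0, 1, 2
print(adjacency_matrix(3, edges))  # [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
print(incidence_matrix(3, edges))  # [[1, 0, 1], [1, 1, 0], [0, 1, 1]]
```

Each column of the incidence matrix contains exactly two ones (the edge's endpoints), a useful invariant for students checking their own generated tables.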

Algorithmica ◽  
2021 ◽  
Vol 83 (3) ◽  
pp. 775-775
Zachary Friggstad ◽  
Jörg-Rüdiger Sack ◽  
Mohammad R. Salavatipour
