large problem
Recently Published Documents

Total documents: 94 (last five years: 29)
H-index: 10 (last five years: 1)

2021 ◽  
Vol 2 (3) ◽  
pp. 1-26
Author(s):  
Timothée Goubault De Brugière ◽  
Marc Baboulin ◽  
Benoît Valiron ◽  
Simon Martiel ◽  
Cyril Allouche

Linear reversible circuits represent a subclass of reversible circuits with many applications in quantum computing. These circuits can be efficiently simulated by classical computers, and their size is polynomially bounded by the number of qubits, making them good candidates for deploying efficient methods to reduce computational costs. We propose a new algorithm for synthesizing any linear reversible operator, using an optimized version of the Gaussian elimination algorithm coupled with a tuned LU factorization. We also improve the scalability of purely greedy methods. Overall, on random operators, our algorithms improve on the state-of-the-art methods for specific ranges of problem sizes: the custom Gaussian elimination algorithm provides the best results for large problem sizes (n > 150), while the purely greedy methods provide quasi-optimal results when n < 30. On a benchmark of reversible functions, we significantly reduce the CNOT count and the depth of the circuit while keeping other metrics of importance (T-count, T-depth) as low as possible.
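For intuition, the baseline primitive behind the paper's custom algorithm can be sketched directly: an n-qubit linear reversible operator is an invertible n×n matrix over GF(2), a CNOT with control c and target t adds row c to row t, and plain Gaussian elimination therefore already yields a valid (though unoptimized) CNOT circuit. A minimal sketch, with an illustrative function name and none of the paper's cost optimizations or LU coupling:

```python
import numpy as np

def synthesize_cnot_circuit(M):
    """Synthesize a CNOT circuit for an invertible 0/1 matrix M over GF(2)
    using plain Gaussian elimination (no cost optimizations).
    Applying the returned (control, target) gates in order, starting from
    the identity, reproduces the linear map M.  Assumes M is invertible."""
    A = np.array(M, dtype=np.uint8) % 2
    n = A.shape[0]
    ops = []  # row additions that reduce A to the identity
    for col in range(n):
        # find a pivot row with a 1 in this column
        pivot = next(r for r in range(col, n) if A[r, col])
        if pivot != col:
            # swap rows via three XOR additions, i.e. three CNOTs
            for (c, t) in [(pivot, col), (col, pivot), (pivot, col)]:
                A[t, :] ^= A[c, :]
                ops.append((c, t))
        # clear every other 1 in this column
        for r in range(n):
            if r != col and A[r, col]:
                A[r, :] ^= A[col, :]
                ops.append((col, r))
    # reducing M to I by these row additions means the same additions,
    # applied in reverse order, build M from I
    return list(reversed(ops))
```

Each (c, t) pair is a CNOT with control c and target t; this baseline typically emits O(n²) gates, which is the count the paper's optimized elimination and purely greedy methods then improve on.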


2021 ◽  
Vol 2021 ◽  
pp. 1-19
Author(s):  
Wichai Srisuruk ◽  
Kanchala Sudtachat ◽  
Paramate Horkaew

Modern factories have been moving toward the just-in-time manufacturing paradigm, so optimal resource scheduling is essential to minimize manufacturing cost and product delivery delay. This paper focuses on scheduling multiple unrelated parallel machines via a Pareto approach. The proposed strategy addresses additional realistic concerns; in particular, contingencies regarding product dependencies, as well as machine capacity and eligibility, are also considered. Given a job list, each job with a distinct resource work-hour capacity, the proposed scheduling aims to minimize manufacturing costs while maintaining balanced machine utilization. To this end, different computational intelligence algorithms, i.e., adaptive nearest neighbour search and modified tabu search, are employed in turn and then benchmarked and validated against a combinatorial mathematical baseline on both small and large problem sets. The experiments reported herein were performed in MATLAB™. The resultant manufacturing plans obtained by these algorithms are thoroughly assessed and discussed.
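As an illustration of one of the metaheuristics named above, a bare-bones tabu search for assigning jobs to unrelated parallel machines might look like the following. The objective (total cost plus a load-spread penalty as a stand-in for utilization balance), the single move operator, and the tenure are simplified assumptions, not the paper's modified tabu search:

```python
import random

def tabu_search(cost, iters=200, tenure=7, seed=0):
    """Minimal tabu-search sketch for unrelated parallel machines.
    cost[j][m] = processing cost of job j on machine m (machine-dependent,
    hence 'unrelated').  Objective: total cost plus the spread between the
    most- and least-loaded machine, a crude utilization-balance penalty."""
    rng = random.Random(seed)
    n_jobs, n_mach = len(cost), len(cost[0])

    def objective(assign):
        loads = [0.0] * n_mach
        total = 0.0
        for j, m in enumerate(assign):
            loads[m] += cost[j][m]
            total += cost[j][m]
        return total + (max(loads) - min(loads))

    assign = [rng.randrange(n_mach) for _ in range(n_jobs)]
    best, best_val = assign[:], objective(assign)
    tabu = {}  # (job, machine) -> iteration until which the move is tabu
    for it in range(iters):
        candidates = []
        for j in range(n_jobs):                 # neighborhood: move one job
            for m in range(n_mach):
                if m == assign[j]:
                    continue
                trial = assign[:]
                trial[j] = m
                val = objective(trial)
                # aspiration: allow a tabu move if it beats the global best
                if tabu.get((j, m), -1) < it or val < best_val:
                    candidates.append((val, j, m))
        if not candidates:
            continue
        val, j, m = min(candidates)
        tabu[(j, assign[j])] = it + tenure      # forbid moving j straight back
        assign[j] = m
        if val < best_val:
            best, best_val = assign[:], val
    return best, best_val
```

The tabu dictionary is what distinguishes this from plain local search: recently reversed moves are forbidden for `tenure` iterations, which lets the search escape local optima.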


2021 ◽  
pp. 558
Author(s):  
Indira Betancourt López ◽  
Mar Riera Spiegelhalder ◽  
Adrián Ferrandis Martínez ◽  
Mar Violeta Ortega-Reig ◽  
Héctor Del Alcázar Indarte ◽  
...  

Abstract: The insecurity and job vulnerability of the 21st century remain unresolved issues. Access to Decent Work has therefore become a necessity in modern societies, bound to a pace of productivity that not only leaves little room for personal and family life but has also turned precariousness into a large-scale problem. Local actors, who play an important part in decision-making, need tools to measure and implement policies that lead to dignity at work, as stipulated by major international organizations such as the ILO and the WHO. This article presents a system of indicators that makes it possible to measure the impact and quality of employment policies, a system that has already been applied successfully in the Catalan comarca of Montsià.   Key words: Decent Work, decent work indicators, work-life balance, employability, productivity.


Author(s):  
J. A. Davis ◽  
S.A. Lorimer

Problem databases in STEM courses are used in tools for the development of student learning and final assessment. In addition, large problem databases are used to develop models for automatic assessment of, and feedback on, students' work. However, the availability of large, open-source problem databases for specific courses is limited, and in-house development of a wide variety of problems can take years. In this paper, the framework for a problem database in STEM courses was created using semantic analysis of sentence structure and composition. Problem statements were analyzed to determine the key grammatical constructs used in commonly posed problems. Based on this analysis, software was developed to create large problem databases that allow for simple extension to other courses. Using a first-year mechanics course, this software was populated with a few generalized question and sentence structures to create a large problem database.
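The kind of template-driven generation described above can be sketched in a few lines. The templates, parameter ranges, and function names below are illustrative placeholders, not the grammatical constructs extracted by the paper's semantic analysis:

```python
import random

# Each template pairs a parameterized statement with a solver so that the
# generated database also carries reference answers for auto-assessment.
TEMPLATES = [
    ("A block of mass {m} kg rests on a frictionless surface. A horizontal "
     "force of {F} N is applied. What is the block's acceleration in m/s^2?",
     lambda m, F: F / m),
    ("A car accelerates uniformly from rest at {F} m/s^2 for {m} s. "
     "What is its final speed in m/s?",
     lambda m, F: F * m),
]

def generate_problems(n, seed=0):
    """Instantiate n (statement, answer) pairs by sampling a template and
    random parameter values."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        text, solve = rng.choice(TEMPLATES)
        m, F = rng.randint(1, 20), rng.randint(1, 50)
        out.append((text.format(m=m, F=F), solve(m, F)))
    return out
```

With a handful of templates per topic, a course-sized database of unique problems follows from the combinatorics of the parameter ranges, which is what makes extension to other courses cheap.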


Author(s):  
J. Jin ◽  
W. Kaewsakul ◽  
J.W.M. Noordermeer ◽  
W.K. Dierkes ◽  
A. Blume

ABSTRACT The dispersion of rubber fillers, such as silica, can be divided into two categories: macro- and micro-dispersion. Both are important; however, to achieve the best reinforcement of rubber, micro-dispersion of silica is crucial. The common view is that these filler dispersions are strongly related: micro-dispersion is understood as the consequence of the continuous breakdown of filler clusters from macro-dispersion. Yet a large problem is that no objective, unequivocal, direct measurement method for micro-dispersion is available. In this study, a set of parameters is defined that are anticipated to influence the micro- as well as the macro-dispersion. Mixing trials are performed with varying silanization temperature and time, different amounts of silane coupling agent, and silicas with different structures and specific surface areas. The degrees of micro- and macro-dispersion are evaluated by measuring the Payne effect as an indirect method for micro-dispersion and by using a dispergrader for quantitative measurement of macro-dispersion. The results show that the two dispersion processes happen simultaneously but independently. These results are supported by earlier work of Blume and Uhrlandt, who likewise stated that micro- and macro-dispersion are independent. The major factors influencing the micro- and macro-dispersion of silica are also identified.


Author(s):  
Emmanuel Sapin ◽  
Matthew C Keller

Abstract. Motivation: Pairwise comparison problems arise in many areas of science. In genomics, datasets are already large and getting larger, and so operations that require pairwise comparisons—either on pairs of SNPs or pairs of individuals—are extremely computationally challenging. We propose a generic algorithm for addressing pairwise comparison problems that breaks a large problem (of order n² comparisons) into multiple smaller ones (each of order n comparisons), allowing for massive parallelization. Results: We demonstrated that this approach is very efficient for calling identical-by-descent (IBD) segments between all pairs of individuals in the UK Biobank dataset, with a 250-fold savings in time and 750-fold savings in memory over the standard approach to detecting such segments across the full dataset. This efficiency should extend to other methods of IBD calling and, more generally, to other pairwise comparison tasks in genomics or other areas of science.
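The blocking idea is straightforward to sketch: split the n items into blocks and emit one independent work unit per (block, block) pair, so each unit performs on the order of block_size² comparisons and all units can run in parallel with no coordination. This is a generic illustration, not the authors' implementation:

```python
from itertools import combinations

def pairwise_blocks(n, block_size):
    """Partition the n*(n-1)/2 pairwise comparisons among n items into
    independent work units.  Each unit covers one block of items against
    another block (or against itself), and every pair (i, j) with i < j
    lands in exactly one unit, so units can be dispatched to separate
    jobs or nodes."""
    blocks = [range(s, min(s + block_size, n)) for s in range(0, n, block_size)]
    units = []
    for a in range(len(blocks)):
        for b in range(a, len(blocks)):
            if a == b:
                # within-block: unordered pairs inside one block
                pairs = list(combinations(blocks[a], 2))
            else:
                # cross-block: every item of block a against block b
                pairs = [(i, j) for i in blocks[a] for j in blocks[b]]
            units.append(pairs)
    return units
```

Each unit also only needs the data for at most two blocks loaded in memory, which is where the large memory savings reported above come from.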


Computation ◽  
2021 ◽  
Vol 9 (2) ◽  
pp. 23
Author(s):  
Narisara Khamsing ◽  
Kantimarn Chindaprasert ◽  
Rapeepan Pitakaso ◽  
Worapot Sirirak ◽  
Chalermchat Theeraviriya

This research presents a solution to the family tourism route problem considering daily time windows. To find the best travel routes, the modified adaptive large neighborhood search (MALNS) method, using four destruction and four reconstruction operators, is applied, and its solution-finding performance is compared with an exact method running in the Lingo program. As shown by various solutions, the MALNS method can produce balanced travel routing designs, including when many tourist attractions are present in each path. For small problem sizes, the MALNS results are not significantly different from those of the exact method; for medium and large problem sizes, the MALNS method shows higher performance and a smaller processing time. The average total travel cost found by the MALNS method deviates from the exact method by approximately 0.18% for medium and 0.05% for large problems, and the average travel satisfaction rating by approximately 0.24% and 0.21%, respectively. Moreover, the MALNS calculation requires far less processing time than the exact method, a reduction of approximately 99.95%. In this case study, the MALNS result shows a suitable balance between satisfaction and the number of tourism places, accounting for differences in satisfaction between family members of different ages and genders in tour route planning. The proposed solution methodology yields high-quality solutions, suggesting that the MALNS method is a competitive algorithm that would be useful for creating route plans for tourism organizations supporting travel route selection for family tours in Thailand.
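The general destroy-and-repair loop behind (M)ALNS can be sketched on a simple route problem. This sketch uses two illustrative destroy operators, one greedy repair, and a crude weight-update rule; the paper's MALNS uses four destruction and four reconstruction operators and handles daily time windows, none of which are modeled here:

```python
import math, random

def alns_tour(points, iters=300, seed=0):
    """Stripped-down adaptive large neighborhood search on a travel route:
    repeatedly destroy part of the current tour, repair it by greedy
    insertion, and adaptively favour whichever destroy operator has been
    producing improvements."""
    rng = random.Random(seed)
    n = len(points)
    dist = lambda a, b: math.hypot(points[a][0] - points[b][0],
                                   points[a][1] - points[b][1])
    length = lambda t: sum(dist(t[i], t[(i + 1) % len(t)]) for i in range(len(t)))

    def destroy_random(tour):                  # remove random stops
        removed = rng.sample(tour, max(2, n // 5))
        return [c for c in tour if c not in removed], removed

    def destroy_segment(tour):                 # remove a contiguous stretch
        k = max(2, n // 5)
        s = rng.randrange(len(tour))
        removed = [tour[(s + i) % len(tour)] for i in range(k)]
        return [c for c in tour if c not in removed], removed

    def repair_greedy(partial, removed):       # cheapest-insertion repair
        tour = partial[:]
        for c in removed:
            best_pos = min(range(len(tour)),
                           key=lambda i: dist(tour[i - 1], c) + dist(c, tour[i])
                                         - dist(tour[i - 1], tour[i]))
            tour.insert(best_pos, c)
        return tour

    destroys = [destroy_random, destroy_segment]
    weights = [1.0, 1.0]                       # adaptive operator weights
    tour = list(range(n))
    best, best_len = tour[:], length(tour)
    for _ in range(iters):
        i = rng.choices(range(len(destroys)), weights=weights)[0]
        partial, removed = destroys[i](tour)
        cand = repair_greedy(partial, removed)
        if length(cand) < length(tour):
            tour = cand
            weights[i] += 0.5                  # reward the successful operator
        if length(tour) < best_len:
            best, best_len = tour[:], length(tour)
    return best, best_len
```

The adaptive weights are the "A" in ALNS: operators that keep finding improvements are sampled more often, which is what lets the method scale to the medium and large instances discussed above.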


2021 ◽  
Vol 249 ◽  
pp. 01004
Author(s):  
Anthony Thornton

Segregation in dense granular flows is a large problem in many areas of industry and the natural environment. In the last few years, an advection-diffusion-style framework has been shown to capture segregation in many geometries. Here, we review the different ways such a framework has been obtained by different authors, compare the forms, and make recommendations for the best form to use. Finally, we briefly outline some of the remaining open questions.
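A representative form of this framework, for the concentration φ of one particle species (notation varies between the works being compared, so this is a hedged sketch rather than the review's exact equation):

```latex
\frac{\partial \phi}{\partial t}
  + \nabla \cdot (\phi\,\mathbf{u})
  - \frac{\partial}{\partial z}\bigl( q\,\phi(1-\phi) \bigr)
  = \frac{\partial}{\partial z}\Bigl( D\,\frac{\partial \phi}{\partial z} \Bigr)
```

where **u** is the bulk velocity, z the slope-normal coordinate, q a segregation rate, and D a diffusivity. The φ(1−φ) term drives one species up and the other down, vanishing in the pure-phase limits, while the right-hand side models remixing by diffusion.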


Author(s):  
Donald Davendra ◽  
Magdalena Metlicka ◽  
Magdalena Bialic-Davendra

This research involves the development of a compute unified device architecture (CUDA) accelerated 2-opt local search algorithm for the traveling salesman problem (TSP). Although 2-opt is one of the fundamental mathematical approaches to solving the TSP, its time complexity has generally limited its efficiency, especially for large problem instances. Graphics processing unit (GPU) programming, especially CUDA, has become more mainstream in high-performance computing (HPC) and has made many intractable problems at least reasonably solvable in acceptable time. This chapter describes two CUDA-accelerated 2-opt algorithms developed to solve the asymmetric TSP. Three separate hardware configurations were used to test the developed algorithms, and the results validate that the execution time decreased significantly, especially for the large problem instances, when deployed on the GPU.
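For reference, the sequential 2-opt move that the chapter accelerates can be sketched as follows. This uses the symmetric Euclidean case for brevity (segment reversal leaves edge costs unchanged there, unlike in the asymmetric TSP the chapter targets); evaluating all O(n²) candidate (i, j) moves per pass is exactly the workload that CUDA threads can share:

```python
import math

def two_opt(points, tour=None):
    """Plain CPU 2-opt local search for the symmetric Euclidean TSP:
    repeatedly reverse the tour segment between positions i+1 and j
    whenever doing so shortens the tour, until no improving move remains."""
    n = len(points)
    tour = list(range(n)) if tour is None else tour[:]
    d = lambda a, b: math.hypot(points[a][0] - points[b][0],
                                points[a][1] - points[b][1])
    improved = True
    while improved:
        improved = False
        for i in range(n - 1):
            for j in range(i + 2, n):
                a, b = tour[i], tour[i + 1]
                c, e = tour[j], tour[(j + 1) % n]
                if a == e:          # same edge wrapped around; skip
                    continue
                # gain from replacing edges (a,b),(c,e) with (a,c),(b,e)
                if d(a, c) + d(b, e) < d(a, b) + d(c, e) - 1e-12:
                    tour[i + 1:j + 1] = reversed(tour[i + 1:j + 1])
                    improved = True
    return tour
```

On the GPU, each thread evaluates the gain of one (i, j) pair and a reduction picks the best move, which is what collapses the per-pass cost for large instances.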


Author(s):  
Gomasa Ramesh ◽  
Mandala Sheshu Kumar ◽  
Palakurthi Manoj Kumar

The finite element method (FEM) is a powerful and widely used numerical technique. It is applied especially in civil and structural engineering, but also in other engineering disciplines and the applied sciences. A main aim of FEM is to reduce the effort of large problem calculations: as a numerical method, it can solve problems accurately in a reasonable time. FEM is used to analyze the structural behaviour of structures and is also useful for non-structural members. It solves engineering problems in a mathematical way and is applied to structural analysis, fluid flow, heat transfer, and mass transfer problems, among others. A number of software packages are available for FEM, including Ansys, Cosmos, Nisa, Nastran, and SAP. In this paper, we discuss an introduction to FEM, discretization, elements and nodes, the types of elements in FEM, and some important equations.
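As a concrete example of the method, the classic 1D axial-bar problem shows all the FEM steps the paper introduces: discretization into elements and nodes, element stiffness matrices, assembly, boundary conditions, and solution. Function and parameter names here are illustrative:

```python
import numpy as np

def solve_bar(n_el, L=1.0, E=1.0, A=1.0, P=1.0):
    """Minimal 1D finite element sketch: an axial bar fixed at x = 0 with a
    point load P at x = L, discretized into n_el equal two-node elements.
    Assembles the global stiffness matrix from element matrices and solves
    K u = f.  Exact solution for this problem: u(x) = P x / (E A)."""
    n_nodes = n_el + 1
    h = L / n_el                                  # element length
    k_el = (E * A / h) * np.array([[1.0, -1.0],   # 2x2 element stiffness
                                   [-1.0, 1.0]])
    K = np.zeros((n_nodes, n_nodes))
    for e in range(n_el):                         # assembly: scatter k_el
        K[e:e + 2, e:e + 2] += k_el
    f = np.zeros(n_nodes)
    f[-1] = P                                     # point load at the free end
    # apply the essential boundary condition u(0) = 0 by elimination
    u = np.zeros(n_nodes)
    u[1:] = np.linalg.solve(K[1:, 1:], f[1:])
    return u
```

For this linear problem the nodal values are exact, so the sketch also illustrates why FEM accuracy hinges on how well the chosen element shape functions can represent the true solution.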

