large matrix
Recently Published Documents

TOTAL DOCUMENTS: 183 (FIVE YEARS: 40)
H-INDEX: 21 (FIVE YEARS: 1)

Author(s):  
Roby Gauthier ◽  
Aidan Luscombe ◽  
Toby Bond ◽  
Michael Bauer ◽  
Michel Johnson ◽  
...  

Abstract Lithium-ion cells tested under different state-of-charge (SOC) ranges, C-rates and cycling temperatures show different degrees of lithium inventory loss, impedance growth and active mass loss. Here, a large matrix of polycrystalline NMC622/natural graphite Li-ion pouch cells was tested with seven different state-of-charge ranges (0-25, 0-50, 0-75, 0-100, 75-100, 50-100 and 25-100%), three different C-rates and two temperatures. First, capacity fade was compared to a model developed by Deshpande and Bernardi. Second, after 2.5 years of cycling, detailed characterization by dV/dQ analysis, lithium-ion differential thermal analysis, volume expansion by Archimedes’ principle, electrode stack growth, ultrasonic transmissivity and X-ray computed tomography was undertaken. These measurements enabled us to develop a complete picture of cell aging for these cells, which in turn led to an empirical predictive model for cell capacity loss versus SOC range and calendar age. Although these particular cells exhibited substantial positive electrode active mass loss, this did not affect capacity retention because the cells were anode-limited at full discharge under all the tests carried out here. However, the positive electrode mass loss was strongly coupled to positive electrode swelling and electrolyte “unwetting” that would eventually cause dramatic failure.
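
Where an empirical fade model of this kind is fitted, the usual structure is a calendar-ageing term plus a cycling term. Below is a minimal sketch of such a fit in Python; the functional form, variable names, and data points are illustrative assumptions, not the Deshpande-Bernardi model or the paper's actual SOC-range-dependent model.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hedged sketch: a square-root-of-time (calendar) term plus a per-cycle
# term. The paper's SOC-range-dependent model may use a different form.
def capacity_loss(X, a, b):
    t_days, n_cycles = X
    return a * np.sqrt(t_days) + b * n_cycles

# Illustrative (made-up) data: calendar age, equivalent full cycles, % loss.
t  = np.array([30.0, 90.0, 180.0, 365.0, 730.0, 913.0])
n  = np.array([150.0, 450.0, 900.0, 1800.0, 3600.0, 4500.0])
dq = np.array([1.2, 2.1, 3.0, 4.4, 6.3, 7.1])

(a, b), _ = curve_fit(capacity_loss, (t, n), dq)
print(f"calendar coefficient a = {a:.3f}, cycling coefficient b = {b:.5f}")
```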


Sensors ◽  
2021 ◽  
Vol 21 (20) ◽  
pp. 6821
Author(s):  
Mingyang Song ◽  
Yingpeng Sang

Computing the determinant of a large matrix is a time-consuming task that arises more and more widely in science and engineering problems in the era of big data. Fortunately, cloud computing can provide large storage and computation resources and thus act as an ideal platform for completing computations outsourced from resource-constrained devices. However, cloud computing also raises security issues. For example, a curious cloud may spy on user privacy through the outsourced data, while a malicious cloud that deviates from the computing scripts, or a cloud hardware failure, will lead to incorrect results. Therefore, in this paper we propose a secure outsourcing algorithm to compute the determinant of a large matrix under the malicious cloud model. The algorithm protects the privacy of the original matrix by applying row/column permutations and other transformations to the matrix. To resist malicious cheating on the computation tasks, a new verification method is utilized in our algorithm. Unlike previous algorithms that require multiple rounds of verification, ours requires only one round without trading off cheating detectability, which greatly reduces the local computation burden. Both theoretical and experimental analyses demonstrate that our algorithm is more efficient for local users than previous ones across various matrix dimensions, without sacrificing the security requirements of privacy protection and cheating detectability.
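
As a rough illustration of the masking idea (not the paper's full protocol, which involves additional transformations and the one-round verification step), the client can hide the matrix behind random row/column permutations and diagonal scalings, send the masked matrix to the cloud, and unmask the returned determinant locally with O(n) work:

```python
import numpy as np

def perm_sign(p):
    """Sign of a permutation, computed by counting cycle lengths."""
    seen = np.zeros(len(p), dtype=bool)
    sign = 1
    for i in range(len(p)):
        j, length = i, 0
        while not seen[j]:
            seen[j] = True
            j = p[j]
            length += 1
        if length and length % 2 == 0:
            sign = -sign
    return sign

def mask_matrix(A, rng):
    """Hide A behind random row/column permutations and diagonal scalings.
    det(D1 P A Q D2) = det(D1) det(D2) sign(P) sign(Q) det(A), so the
    client can recover det(A) from the cloud's answer in O(n) time."""
    n = A.shape[0]
    p, q = rng.permutation(n), rng.permutation(n)
    d1 = rng.uniform(1.0, 2.0, n)
    d2 = rng.uniform(1.0, 2.0, n)
    B = (d1[:, None] * A[p][:, q]) * d2[None, :]
    factor = perm_sign(p) * perm_sign(q) * d1.prod() * d2.prod()
    return B, factor

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 50))
B, factor = mask_matrix(A, rng)
det_cloud = np.linalg.det(B)          # computed by the (untrusted) cloud
det_A = det_cloud / factor            # unmasked locally
assert np.isclose(det_A, np.linalg.det(A), rtol=1e-6)
```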


2021 ◽  
Author(s):  
Ali Ebrahimi ◽  
Akshit Goyal ◽  
Otto X Cordero

Microbial foraging in patchy environments, where resources are fragmented into particles or pockets embedded in a large matrix, plays a key role in natural ecosystems. In the oceans and freshwater systems, particle-associated bacteria interact with particle surfaces in different ways: some colonize only during short transients, while others form long-lived, stable colonies. We do not yet understand the ecological mechanisms by which both short-term and long-term colonizers can coexist. Here, we address this problem with a mathematical model that explains how marine populations with different detachment rates from particles can stably coexist. In our model, populations grow only while on particles, but being on particles also carries an increased risk of mortality by predation and sinking. Key to coexistence is the idea that detachment from particles modulates both net growth and mortality, but in opposite directions, creating a trade-off between them. While slow-detaching populations show the highest growth return (i.e., produce more net offspring), they also suffer higher rates of mortality than fast-detaching populations. Surprisingly, fluctuating environments, manifesting as blooms of particles (favoring growth) and predators (favoring mortality), significantly expand the range of conditions under which populations with different detachment rates can coexist. Our study shows how the spatial ecology of microbes in the ocean can lead to a predictable diversification of foraging strategies and the coexistence of multiple taxa on a single growth-limiting resource.
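
A minimal caricature of such a model tracks, for each population, cells attached to particles and free-living cells, with the detachment rate entering both the growth and the mortality balance. All parameter values and the functional form below are illustrative assumptions, not the published model; in the paper, coexistence additionally relies on fluctuating particle and predator abundances, which could be mimicked here by making the growth and mortality rates time-dependent.

```python
import numpy as np
from scipy.integrate import solve_ivp

mu     = 1.0   # growth rate while attached to a particle
m_att  = 0.8   # mortality on particles (predation, sinking)
m_free = 0.1   # mortality while free-living
a      = 0.5   # rate of (re)attachment to particles

def rhs(t, y, detach):
    """Attached (A) and free (F) compartments for each population.
    Detaching trades growth opportunity for reduced mortality."""
    k = len(detach)
    A, F = y[:k], y[k:]
    dA = (mu - m_att - detach) * A + a * F
    dF = detach * A - (a + m_free) * F
    return np.concatenate([dA, dF])

detach = np.array([0.2, 1.0])                 # slow vs fast detacher
y0 = np.array([1.0, 1.0, 0.0, 0.0])
sol = solve_ivp(rhs, (0.0, 50.0), y0, args=(detach,))
print(sol.y[:, -1])                           # final abundances
```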


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Torstein Nesheim

Purpose – The author aims to explore and provide new insights on the resource manager role in a matrix-like project-based organization. What is the content of this role, and what are its challenges as perceived by role incumbents?

Design/methodology/approach – This is a case study of a large project department in an industrial organization. The main source of data is interviews with 12 respondents.

Findings – The author describes and analyzes several mechanisms related to the key tasks of resource allocation, competence development and performance appraisals. Key challenges are the large number of stakeholders, especially the relationship with the project manager. To cope with these challenges, resource managers engage in extensive networking and recurrent dialog with the project manager. In addition, system knowledge and a sociable personality are perceived to enhance coping.

Research limitations/implications – One case; 12 interviews conducted at one point in time. The resource manager is a specific type of line manager, complementing a task (project) manager. Hypotheses and research questions based on the empirical findings are identified.

Practical implications – Organizational structure and the content of managerial roles are important for understanding HRM challenges and activities in project-based organizations. Networking, relationship maintenance, system knowledge, and a sociable and creative mindset are key success factors for resource managers in large matrix-like project-based organizations.

Originality/value – One of the few in-depth studies of the resource manager in a project-based organization; a novel organizational context for the study of roles in HRM; and a number of suggestions for further research.


PLoS ONE ◽  
2021 ◽  
Vol 16 (8) ◽  
pp. e0256584
Author(s):  
Sam Pimentel ◽  
Youssef Qranfal

The process of integrating observations into a numerical model of an evolving dynamical system, known as data assimilation, has become an essential tool in computational science. These methods, however, are computationally expensive, as they typically involve large matrix multiplications and inversions. Furthermore, it is challenging to incorporate constraints into the procedure, such as requiring a positive state vector. Here we introduce an entirely new approach to data assimilation, one built on an information measure: the unnormalized Kullback-Leibler divergence, rather than the standard choice of Euclidean distance. Two sequential data assimilation algorithms are presented within this framework and demonstrated numerically. These new methods are solved iteratively and do not require an adjoint. We find them to be computationally more efficient than Optimal Interpolation (the 3D-Var solution) and the Kalman filter, whilst maintaining similar accuracy. Furthermore, these Kullback-Leibler data assimilation (KL-DA) methods naturally embed constraints, unlike Kalman filter approaches. They are ideally suited to systems that require positive-valued solutions, as the KL-DA guarantees this without the need for transformations, projections, or any additional steps. This Kullback-Leibler framework presents an interesting new direction of development in data assimilation theory. The techniques introduced here could be developed further and may hold potential for applications in the many disciplines that utilize data assimilation, especially where there is a need to evolve variables of large-scale systems that must obey physical constraints.
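
The positivity-preserving character of KL-based updates can be illustrated with a generic multiplicative (SMART-style) iteration for a linear observation operator. This is a sketch of the general idea only, not the paper's two KL-DA algorithms; the operator H, the data y, and the normalization are assumptions for the toy problem.

```python
import numpy as np

def kl_multiplicative_solve(H, y, x0, lam=1.0, n_iter=200):
    """SMART-style multiplicative iteration for H x ≈ y.
    Each update multiplies x by a positive factor, so a positive initial
    state stays positive, with no projections or transformations needed."""
    x = x0.copy()
    for _ in range(n_iter):
        Hx = H @ x
        x = x * np.exp(lam * (H.T @ np.log(y / Hx)))
    return x

rng = np.random.default_rng(1)
m, n = 15, 20                          # underdetermined toy system
H = rng.uniform(0.0, 1.0, (m, n))
H /= H.sum(axis=0, keepdims=True)      # unit column sums keep lam=1 stable
x_true = rng.uniform(0.5, 2.0, n)
y = H @ x_true
x_hat = kl_multiplicative_solve(H, y, x0=np.ones(n))
print(np.abs(H @ x_hat - y).max(), x_hat.min())   # small residual, x_hat > 0
```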


2021 ◽  
Author(s):  
Marcus J. Hamilton ◽  
Robert S. Walker ◽  
Briggs Buchanan ◽  
Damian E. Blasi ◽  
Claire L. Bowern

Estimating the total human population size (i.e., abundance) of the preagricultural planet is important for setting baseline expectations for human-environment interactions when all energy and material requirements to support growth, maintenance, and well-being were foraged from local environments. However, demographic parameters and biogeographic distributions do not preserve directly in the archaeological record. Rather than attempting to estimate human abundance at some specific time in the past, a principled approach to making inferences at this scale is to ask what the human demography and biogeography of a hypothetical planet Earth would look like if populated by ethnographic hunter-gatherer societies. Given that ethnographic hunter-gatherer societies likely include the largest, densest, and most complex foraging societies to have existed, we suggest population inferences drawn from this sample provide an upper bound on demographic estimates in prehistory. Our goal in this paper is to produce principled estimates of hunter-gatherer abundance, diversity, and biogeography. To do this we trained an extreme gradient boosting algorithm (XGBoost) to learn ethnographic hunter-gatherer population densities from a large matrix of climatic, environmental, and geographic data. We used the predictions generated by this model to reconstruct the hunter-gatherer biogeography of the rest of the planet. We find the human abundance of this world to be 6.1±2 million, with an ethnolinguistic diversity of 8,330±2,770 populations, most of whom would have lived near coasts and in the tropics.

Significance Statement: Understanding the abundance of humans on planet Earth prior to the development of agriculture and the industrialized world is essential to understanding human population growth. The problem, however, is that these features of past human populations are unknown and so must be estimated from data. We developed a machine learning approach that uses ethnographic and environmental data to reconstruct the demography and biogeography of planet Earth as if populated by hunter-gatherers. Such a world would house about 6 million people divided into about 8,330 populations, with a particular concentration in the tropics and along coasts.
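
As a sketch of the modeling step, one can fit a gradient-boosted regression of (log) population density on environmental predictors and then project it onto a global grid. The feature set, hyperparameters, and data below are stand-in assumptions, not the paper's.

```python
import numpy as np
from xgboost import XGBRegressor

rng = np.random.default_rng(2)

# Stand-in ethnographic training matrix: rows are societies, columns are
# climatic/environmental/geographic predictors (e.g. temperature, rainfall).
n_sites, n_features = 300, 5
X = rng.normal(size=(n_sites, n_features))
log_density = 0.8 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(0.0, 0.2, n_sites)

# Densities span orders of magnitude, so fitting on the log scale is natural.
model = XGBRegressor(n_estimators=500, learning_rate=0.05,
                     max_depth=4, subsample=0.8)
model.fit(X, log_density)

# Project onto a (stand-in) global raster of the same predictors; summing
# predicted density times habitable cell area would give total abundance.
X_grid = rng.normal(size=(1000, n_features))
density = np.exp(model.predict(X_grid))
print(density.mean())
```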


2021 ◽  
Vol 5 (3) ◽  
pp. 98
Author(s):  
Andang Sunarto ◽  
Praveen Agarwal ◽  
Jumat Sulaiman ◽  
Jackel Vui Lung Chew ◽  
Shaher Momani

Research into recent developments for solving fractional mathematical equations requires accurate and efficient numerical methods. Although many numerical methods based on Caputo’s fractional derivative have been proposed to solve fractional mathematical equations, the efficiency of these methods when dealing with a large matrix requires further study, since the matrix size influences the accuracy of the solution. Therefore, this paper proposes a quarter-sweep finite difference scheme with a preconditioned relaxation-based approximation to efficiently solve the large linear system that arises from discretizing a fractional mathematical equation. The paper presents the formulation of the quarter-sweep finite difference scheme used to approximate the selected fractional mathematical equation, then discusses the derivation of a preconditioned relaxation method based on the quarter-sweep scheme. The design of a C++ algorithm for the proposed quarter-sweep preconditioned relaxation method is shown and, finally, an efficiency analysis comparing the proposed method with several tested methods is presented. The contributions of this paper are a new preconditioned matrix that restructures the developed linear system, and the derivation of an efficient preconditioned relaxation iterative method for solving a fractional mathematical equation. By simulating the solutions of time-fractional diffusion problems with the proposed numerical method, the study found that computing solutions using the quarter-sweep preconditioned relaxation method is more efficient than using the tested methods: the proposed method solves the selected problems with fewer iterations and a faster execution time than the tested existing methods, evaluated over different matrix sizes. Thus, the combination of a quarter-sweep finite difference method, Caputo’s time-fractional derivative, and the preconditioned successive over-relaxation method shows good potential for solving different types of fractional mathematical equations, and provides a future direction for this field of research.
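
For context, the relaxation kernel at the heart of such solvers is successive over-relaxation (SOR). The sketch below shows plain SOR on a generic tridiagonal system; the paper's method additionally applies a quarter-sweep grid ordering, a preconditioner, and the Caputo time-fractional discretization, none of which are reproduced here.

```python
import numpy as np

def sor(A, b, omega=1.8, tol=1e-10, max_iter=20_000):
    """Successive over-relaxation for A x = b. The paper pairs a
    relaxation of this kind with a quarter-sweep ordering and a
    preconditioner; this sketch is only the plain SOR kernel."""
    n = len(b)
    x = np.zeros(n)
    for it in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            sigma = A[i, :i] @ x[:i] + A[i, i+1:] @ x_old[i+1:]
            x[i] = (1.0 - omega) * x_old[i] + omega * (b[i] - sigma) / A[i, i]
        if np.linalg.norm(x - x_old, np.inf) < tol:
            return x, it
    return x, max_iter

# Tridiagonal test system, shaped like a 1D implicit diffusion step.
n = 50
A = (np.diag(2.0 * np.ones(n)) + np.diag(-np.ones(n - 1), 1)
     + np.diag(-np.ones(n - 1), -1))
b = np.ones(n)
x, iters = sor(A, b)
print(iters, np.linalg.norm(A @ x - b))
```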


2021 ◽  
Vol 263 (5) ◽  
pp. 1041-1052
Author(s):  
Martin Richter ◽  
Gregor Tanner ◽  
Bruno Carpentieri ◽  
David J. Chappell

Dynamical energy analysis (DEA) is a computational method for addressing high-frequency vibro-acoustics in terms of ray densities. It has been used to describe wave equations governing structure-borne sound in two-dimensional shell elements as well as three-dimensional electrodynamics. In either case, the wave equation is reformulated as a propagation of boundary densities, and these densities are expressed by finite-dimensional approximations. All of these use cases have in common that the resulting linear problem involves a very large matrix which is block-sparse, often real-valued, but non-symmetric. To use DEA efficiently, it is therefore important to also address the performance of solving the corresponding linear system. We cover three aspects that reduce the computational time: the use of preconditioners, properly chosen initial conditions, and the choice of iterative solver. In particular, the possibility of reusing preconditioners for different input parameters is investigated.
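
A typical setup for such systems pairs a Krylov solver with an incomplete factorization that can be reused across a parameter sweep. The sketch below uses SciPy's GMRES with an ILU preconditioner on a stand-in block-sparse, non-symmetric matrix; the matrix, tolerances, and warm-starting scheme are illustrative assumptions, not the paper's specific setup. (The `rtol` keyword requires SciPy ≥ 1.12.)

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Stand-in sparse, non-symmetric matrix playing the role of a DEA transfer
# operator; the real operator comes from discretized boundary densities.
rng = np.random.default_rng(3)
n = 2000
A = sp.eye(n, format="csc") - 0.5 * sp.random(n, n, density=1e-3,
                                              random_state=3, format="csc")
b = rng.standard_normal(n)

# Build an incomplete-LU preconditioner once ...
ilu = spla.spilu(A, drop_tol=1e-4)
M = spla.LinearOperator((n, n), matvec=ilu.solve)

# ... and reuse it across a sweep of nearby problems, warm-starting each
# GMRES solve from the previous solution.
x = np.zeros(n)
for scale in (1.0, 1.01, 1.02):
    x, info = spla.gmres(A, scale * b, x0=x, M=M, rtol=1e-8)
    assert info == 0
print(np.linalg.norm(A @ x - 1.02 * b))
```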


2021 ◽  
Author(s):  
Erwan Auburtin ◽  
Quentin Delivré ◽  
Jason McConochie ◽  
Jim Brown ◽  
Yuriy Drobyshevski

Abstract The Prelude Floating Liquefied Natural Gas (FLNG) platform is designed to offload liquefied natural and petroleum gas products to carrier vessels moored in a Side-by-Side (SBS) configuration. Prior to the mooring operation, the carrier vessel is escorted and held alongside the FLNG with the assistance of tugs connected to her bow and stern to ensure sufficient control over the vessel in this critical phase. In order to better understand the impact of environmental conditions, to determine the optimum length, strength, material and configuration of the towline stretcher, and to estimate the maximum operable environments, coupled multi-body simulations have been performed in the time domain. The numerical model, which considered both the LNG carrier and the forward tug, was calibrated using full-scale measurements of tug motions and towline tension recorded during a real approach and berthing manoeuvre at Prelude FLNG. The measured environmental conditions were reproduced numerically and the model parameters were adjusted to match the recorded behaviour as closely as possible. Since the actions of the tug master are difficult to model numerically and only the statistical environment parameters are known, a simplified approach has been adopted for modelling the tug propulsion and steering, using a combination of static forces, stiffness, and linear and quadratic damping for the relevant horizontal degrees of freedom. The calibrated numerical model was first subjected to several sensitivity assessments of the modelling level (single- or multi-body, inclusion of second-order wave loads, inclusion of forward speed). Sensitivity studies were then performed to help address operational requirements related to the wave height and direction, and the stretcher length and strength. The conclusions have been taken into consideration in selecting the towline configurations for future operations. Finally, the calibrated coupled LNG carrier and tug model was used to derive Prelude-specific tug operability criteria that may be used for decision-making based on weather forecasts, prior to the SBS offloading operations. A large matrix of swell and wind-driven waves was simulated over a range of wave heights, periods, directions and static towing forces to allow a criterion to be developed based on a stochastic extreme towline tension. Such a criterion considers the relevant wave parameters while remaining simple enough for easy use in operations. This paper describes the assumptions and process used to numerically model the towing configuration and calibrate the different coefficients, discusses the results obtained for the various sensitivities, and explains the operability criteria. Important conclusions and lessons learnt are also shared.
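
The simplified propulsion and steering model described above can be pictured, for a single horizontal degree of freedom, as a static pull plus stiffness plus linear and quadratic damping. The sketch below is only that caricature, with illustrative parameter values; the paper's calibrated multi-body, multi-DOF model is far more detailed.

```python
import numpy as np

# Illustrative values only, per horizontal degree of freedom.
F_static = 2.0e5   # N, steady pull applied by the tug
k        = 5.0e4   # N/m, restoring stiffness toward the station point
c1       = 3.0e4   # N*s/m, linear damping
c2       = 1.0e4   # N*s^2/m^2, quadratic damping

def tug_force(x, v):
    """Net horizontal force at offset x (m) and velocity v (m/s)."""
    return F_static - k * x - c1 * v - c2 * v * np.abs(v)

print(tug_force(1.0, 0.5))
```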


Electronics ◽  
2021 ◽  
Vol 10 (12) ◽  
pp. 1406
Author(s):  
Salih Sarp ◽  
Murat Kuzlu ◽  
Emmanuel Wilson ◽  
Umit Cali ◽  
Ozgur Guler

Artificial Intelligence (AI) has been among the fastest-emerging research and industrial application fields, especially in the healthcare domain, but over the past decades it has operated as a black-box model with limited understanding of its inner workings. AI algorithms are, in large part, built on weights calculated as the result of large matrix multiplications, and these computationally intensive processes are typically hard to interpret and debug. Explainable Artificial Intelligence (XAI) aims to solve black-box and hard-to-debug approaches through the use of various techniques and tools. In this study, XAI techniques are applied to chronic wound classification. The proposed model classifies chronic wounds through the use of transfer learning and fully connected layers. Classified chronic wound images serve as input to the XAI model for explanation. Interpretable results can offer new perspectives to clinicians during the diagnostic phase. The proposed method successfully provides chronic wound classification and its associated explanation, extracting additional knowledge that can also be interpreted by non-data-science experts, such as medical scientists and physicians. This hybrid approach is shown to aid the interpretation and understanding of AI decision-making processes.
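
A minimal sketch of the transfer-learning step, assuming an ImageNet-pretrained ResNet backbone with a small fully connected head; the paper's backbone, class count, and XAI technique are not specified here and may differ.

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 4                          # hypothetical wound categories

net = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in net.parameters():
    p.requires_grad = False              # freeze the pretrained features

net.fc = nn.Sequential(                  # trainable classification head
    nn.Linear(net.fc.in_features, 128),
    nn.ReLU(),
    nn.Dropout(0.3),
    nn.Linear(128, NUM_CLASSES),
)

x = torch.randn(8, 3, 224, 224)          # a batch of wound images
logits = net(x)                          # these predictions then feed an
                                         # XAI method (e.g. a Grad-CAM-style
                                         # saliency map) for explanation
print(logits.shape)                      # torch.Size([8, 4])
```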

