smooth transformation
Recently Published Documents


TOTAL DOCUMENTS: 24 (FIVE YEARS: 10)

H-INDEX: 5 (FIVE YEARS: 1)

2021 ◽  
pp. 108128652110592
Author(s):  
Van Hoi Nguyen ◽  
Guy Casale ◽  
Loïc Le Marrec

This paper introduces tools from fibre geometry for the framework of the mechanics of microstructured continua. The material is modelled by an appropriate bundle whose associated connection and metric are induced from Euclidean space by a smooth transformation, represented by a fibre morphism from the bundle to Euclidean space. Furthermore, the general kinematic structure of the theory includes macroscopic and microscopic fields in a multiscale approach that accommodates large transformations. In this geometric picture, defects appear through the curvature, torsion and non-metricity tensors of the induced geometry. Special attention is given to transport along a finite path, in order to extend the standard infinitesimal analysis of torsion and curvature to a macroscopic point of view. Both theoretical and numerical analyses may be handled without additional difficulty. Accordingly, several examples of transformations involving distributions of material defects are exhibited and analysed.
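For orientation, a minimal sketch of the standard construction such induced geometries rest on (the symbols Θ, g, Γ, T and Q below are illustrative and need not match the paper's notation): the fibre morphism Θ pulls the Euclidean structure back to the material bundle, and the defect content is read off from the induced geometry.

g_{ij} = \Theta^{\alpha}{}_{i}\, \delta_{\alpha\beta}\, \Theta^{\beta}{}_{j}            (induced metric: pull-back of the Euclidean metric \delta)
\Gamma^{k}{}_{ij} = (\Theta^{-1})^{k}{}_{\alpha}\, \partial_{i} \Theta^{\alpha}{}_{j}   (induced material connection)
T^{k}{}_{ij} = \Gamma^{k}{}_{ij} - \Gamma^{k}{}_{ji}                                    (torsion)
Q_{kij} = -\nabla_{k}\, g_{ij}                                                          (non-metricity)

In classical defect mechanics, a non-vanishing torsion of the induced connection measures dislocation density, while curvature and non-metricity play the analogous roles for disclinations and point defects.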


Author(s):  
Maryam Sabzevari ◽  
Gonzalo Martínez-Muñoz ◽  
Alberto Suárez

Heterogeneous ensembles consist of predictors of different types, which are likely to have different biases. If these biases are complementary, the combination of their decisions is beneficial and can be superior to homogeneous ensembles. In this paper, a family of heterogeneous ensembles is built by pooling classifiers from M homogeneous ensembles of different types, each of size T. Depending on the fraction of base classifiers of each type, a particular heterogeneous combination in this family is represented by a point in a regular simplex in M dimensions. The M vertices of this simplex represent the different homogeneous ensembles. A displacement away from one of these vertices effects a smooth transformation of the corresponding homogeneous ensemble into a heterogeneous one. The optimal composition of such a heterogeneous ensemble can be determined using cross-validation or, if bootstrap samples are used to build the individual classifiers, out-of-bag data. The proposed heterogeneous ensemble building strategy, which pools neural networks, SVMs, and random trees (i.e., trees from a standard random forest), is evaluated in a comprehensive empirical analysis and compared to a benchmark of other heterogeneous and homogeneous ensembles. The results illustrate the gains achieved by the proposed ensemble creation method over both homogeneous ensembles and the tested heterogeneous building strategy, at a fraction of the training cost.
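A minimal sketch of the pooling idea, assuming scikit-learn base learners (the coarse grid over the simplex and the held-out validation split below are illustrative simplifications; the paper also allows out-of-bag estimates when the base classifiers are trained on bootstrap samples):

# Pool base classifiers from M = 3 homogeneous ensembles according to a
# composition vector (a point in the simplex), then keep the composition
# with the best held-out accuracy.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

T = 10  # size of each homogeneous ensemble
X, y = make_classification(n_samples=600, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

# Three homogeneous pools: random trees, neural networks, SVMs.
forest = RandomForestClassifier(n_estimators=T, random_state=0).fit(X_tr, y_tr)
pools = {
    "tree": list(forest.estimators_),
    "mlp": [MLPClassifier(max_iter=500, random_state=s).fit(X_tr, y_tr)
            for s in range(T)],
    "svm": [SVC(random_state=s).fit(X_tr, y_tr) for s in range(T)],
}

def vote(classifiers, X):
    """Majority vote of the pooled (possibly heterogeneous) ensemble."""
    preds = np.array([c.predict(X) for c in classifiers])
    return np.round(preds.mean(axis=0))

# Coarse grid on the simplex {(f_tree, f_mlp, f_svm) : fractions sum to 1}.
best = None
for f_tree in np.linspace(0.0, 1.0, 5):
    for f_mlp in np.linspace(0.0, 1.0 - f_tree, 5):
        f_svm = 1.0 - f_tree - f_mlp
        pooled = (pools["tree"][: round(f_tree * T)]
                  + pools["mlp"][: round(f_mlp * T)]
                  + pools["svm"][: round(f_svm * T)])
        if not pooled:
            continue
        acc = (vote(pooled, X_val) == y_val).mean()
        if best is None or acc > best[0]:
            best = (acc, (f_tree, f_mlp, f_svm))

print("best composition (tree, mlp, svm):", best[1], "val accuracy:", best[0])

Walking this composition grid traces exactly the smooth transformation described above: the three corners recover the pure random-forest, pure-MLP and pure-SVM ensembles, and interior points are heterogeneous mixtures.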


2021 ◽  
Vol 26 (6) ◽  
pp. 1-36
Author(s):  
Pushpita Roy ◽  
Ansuman Banerjee

Digital microfluidics is an emerging technology for automating laboratory procedures in biochemistry. With more and more complex biochemical protocols being mapped to biochip devices and microfluidics receiving wide adoption, it is becoming indispensable to develop automated tools and synthesis platforms that enable a smooth transformation from cumbersome benchtop laboratory procedures to biochip execution. Given an informal or semi-formal assay description and a target microfluidic grid architecture on which the assay is to be implemented, a synthesis tool typically translates the high-level assay operations into low-level actuation sequences that drive the assay's realization on the grid. With more complex biochemical assay protocols being taken up for synthesis, and with biochips supporting a wider variety of operations (e.g., MicroElectrode Dot Arrays (MEDAs)), the task of assay synthesis is becoming intricately complex. Errors in the synthesized assay descriptions may have undesirable consequences for assay operations, leading to unacceptable outcomes after execution on the biochip. In this work, we focus on the challenge of examining the correctness of synthesized protocol descriptions before they are taken up for realization on a microfluidic biochip. In particular, we take a protocol description synthesized for a MEDA biochip and adopt a formal analysis method to derive correctness proofs, or a violation thereof pointing to the exact operation in the erroneous translation. We present experimental results on a few bioassay protocols and show the utility of our framework for verifiable protocol synthesis.
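The abstract does not spell out the formal method, so the following is only an illustrative sketch of the kind of check involved (the grid encoding, the adjacency rule, and all function names are hypothetical): abstractly simulate the synthesized actuation sequence and report the first step at which the resulting droplet position diverges from the high-level plan.

# Hypothetical correctness check for a synthesized actuation sequence.
# Encoding: the grid is Z^2, a droplet occupies one cell, and each actuation
# activates one adjacent cell, pulling the droplet onto it.

def simulate(start, actuations):
    """Yield (step, position) while abstractly executing the actuations."""
    pos = start
    for step, cell in enumerate(actuations):
        if abs(cell[0] - pos[0]) + abs(cell[1] - pos[1]) != 1:
            raise ValueError(f"step {step}: cell {cell} is not adjacent to {pos}")
        pos = cell
        yield step, pos

def verify(start, actuations, planned_route):
    """Return None if the synthesis realizes the plan, else the first offending step."""
    for (step, pos), want in zip(simulate(start, actuations), planned_route):
        if pos != want:
            return step, pos, want
    return None

plan = [(0, 1), (0, 2), (1, 2)]      # high-level droplet route
synth = [(0, 1), (1, 1), (1, 2)]     # synthesized actuations with an error
print(verify((0, 0), synth, plan))   # -> (1, (1, 1), (0, 2))

A real tool would track multiple droplets, mixing and splitting operations, and MEDA-specific microelectrode patterns; the point here is only the shape of the check: simulate, compare, and localize the first divergence.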


Author(s):  
Liqing Li ◽  
Mengqi Guan ◽  
Zixuan Liu

In order to study the dynamic mechanism by which population characteristics influence health expenditure, panel data for 2009 to 2016 from 31 provinces, autonomous regions and municipalities directly under the Central Government (excluding Hong Kong, Macao and Taiwan) are used to construct a panel smooth transformation model with population characteristics as the heterogeneous variables. The results show that population size, population structure and population distribution all have significant nonlinear effects on per capita health expenditure, with threshold values of 24.6686 million people, 6.9928% and 40.707%, respectively. Once the threshold is crossed, the effect of population size on per capita health expenditure declines smoothly, while the effects of population structure and population distribution change from inhibition to promotion. Therefore, in view of the relationship between current population characteristics and health expenditure, all regions should optimize the structure of health expenditure and improve the level of health services; in particular, they should expand the education and training of targeted rural medical students and strengthen the training of rural doctors, in order to narrow the gap between urban and rural levels of diagnosis and treatment.
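For context, a generic panel smooth transition specification of the kind described here (the abstract does not give the exact equation; the logistic transition below is the standard textbook choice, not necessarily the authors'):

y_{it} = \mu_i + \beta_0' x_{it} + \beta_1' x_{it} \, g(q_{it}; \gamma, c) + \varepsilon_{it},
\qquad g(q_{it}; \gamma, c) = \big(1 + e^{-\gamma (q_{it} - c)}\big)^{-1}

Here q_{it} is the threshold (heterogeneous) variable, such as population size; c is the location parameter, corresponding to thresholds like the 24.6686 million people reported above; and \gamma controls how smoothly the regression coefficients move from \beta_0 to \beta_0 + \beta_1 as q_{it} crosses c.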


Author(s):  
Hailiang Du

The evaluation of probabilistic forecasts plays a central role both in the interpretation and use of forecast systems and in their development. Probabilistic scores (scoring rules) provide statistical measures for assessing the quality of probabilistic forecasts. Often, many probabilistic forecast systems are available, while evaluations of their performance are not standardized, with different scoring rules being used to measure different aspects of forecast performance. Even when the discussion is restricted to strictly proper scoring rules, considerable variability remains between them; indeed, strictly proper scoring rules need not rank competing forecast systems in the same order when none of these systems is perfect. The locality property is explored to further distinguish scoring rules. The nonlocal strictly proper scoring rules considered are shown to have a property that can produce "unfortunate" evaluations. In particular, the fact that the continuous ranked probability score prefers outcomes close to the median of the forecast distribution, regardless of the probability mass assigned to values at or near the median, raises concerns about its use. The only local strictly proper scoring rule, the logarithmic score, has direct interpretations in terms of probabilities and bits of information. The nonlocal strictly proper scoring rules, on the other hand, lack a meaningful direct interpretation for decision support. The logarithmic score is also shown to be invariant under smooth transformations of the forecast variable, whereas the nonlocal strictly proper scoring rules considered may change their preferences under such transformations. It is therefore suggested that the logarithmic score always be included in the evaluation of probabilistic forecasts.
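A small numerical illustration of the locality issue, assuming Gaussian forecasts (the forecast parameters are made up; the closed-form Gaussian CRPS is the well-known expression of Gneiting et al., 2005):

import numpy as np
from scipy.stats import norm

def log_score(mu, sigma, x):
    # Local: depends only on the forecast density at the verifying value x.
    return -norm.logpdf(x, mu, sigma)

def crps(mu, sigma, x):
    # Nonlocal: integrates the squared difference between the forecast CDF
    # and the step function at x; closed form for a Gaussian forecast.
    z = (x - mu) / sigma
    return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))

# Observation x = 0; a sharp forecast centred off-target versus a diffuse
# forecast centred on the outcome.
for mu, sigma in [(0.5, 0.1), (0.0, 2.0)]:
    print(f"N({mu}, {sigma}^2): log score = {log_score(mu, sigma, 0.0):7.3f}, "
          f"CRPS = {crps(mu, sigma, 0.0):6.3f}")

The sharp but mislocated forecast assigns essentially no mass to the outcome, yet its CRPS (about 0.44) is slightly better than that of the diffuse, well-centred forecast (about 0.47), while the logarithmic score penalizes it heavily (about 11.1 versus 1.6); this is the median-preference behaviour the abstract objects to.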


Author(s):  
Ram Prasad K. ◽  
Nishal M. ◽  
Arunram S. P.

The current era is moving towards digitalization of systems in every aspect, be it in the manufacturing, automobile, or service sectors. This long road to digitalization has required overcoming many barriers along the way. The healthcare sector is no exception and has likewise shifted towards digitalization. This chapter discusses how the healthcare sector is adopting digitalization and the presence of the internet of things (IoT) in the healthcare system. The barriers to implementing IoT in healthcare are identified and clustered into categories, each of which is discussed in detail for a better understanding of how it impedes IoT adoption in the healthcare sector. The drivers for implementing IoT in healthcare are likewise identified and grouped. Through this detailed study of barriers and drivers, the chapter helps identify the key barriers and drivers to be addressed when implementing IoT in healthcare systems for a smooth transformation from conventional medical care to e-healthcare.


2020 ◽  
Author(s):  
Vasil Dinev Penchev

In fact, the first conservation law (that of mass) was found in chemistry and generalized to the conservation of energy in physics by means of Einstein's famous "E = mc²". Energy conservation is implied by the principle of least action from a variational viewpoint, as in Emmy Noether's theorems (1918): any chemical change in a conservative (i.e. "closed") system can be accomplished only in a way that conserves its total energy. Bohr's innovation of founding Mendeleev's periodic table on quantum mechanics implies a certain generalization, referring to quantum leaps as if accomplished along all possible trajectories (according to Feynman's interpretation), thereby generalizing the principle of least action and calling for a corresponding generalization of energy conservation to any quantum change. The transition from Emmy Noether's first theorem to her second represents the necessary generalization well: its chemical meaning is that any chemical reaction is generalized so as to be accomplished as if in any possible course of time, rather than in the standard evenly running time (the latter being equivalent to energy conservation according to the first theorem). The problem: if any quantum change is accomplished in all possible "variations" (i.e. "violations") of energy conservation, each with a different probability, what (if anything) is conserved? An answer: quantum information is what is conserved. Indeed, it can be defined as the counterpart (e.g. in the sense of Emmy Noether's theorems) of the physical quantity of action, just as energy is the counterpart of time. It is valid in any course of time rather than only in the evenly running one. That generalization implies a generalization of the periodic table that includes continuous and smooth transformations between any two chemical elements.


2019 ◽  
Vol 88 (2) ◽  
pp. 55-71
Author(s):  
Andreas Breitenfellner ◽  
Wolfgang Pointner ◽  
Helene Schuberth

Summary: Central banks and financial supervisors approach ‘green finance’ mostly to preserve macroeconomic and financial stability according to their mandates. Obviously, climate change poses severe risks to households, firms and their financial intermediaries. These risks tend to be correlated and their scope goes beyond historical evidence, therefore their impact on the financial system is difficult to model. On the other hand, the planned decarbonization of the global economy creates enormous investment opportunities. Central banks and supervisors play a role in safeguarding the financial system’s smooth transformation from funding old, brown industries to funding a new green economy. The ‘Network for Greening the Financial System’ facilitates an exchange of experience and ideas among central banks and financial supervisors; we present some of their findings. While central banks can and should contribute to making the economy and the financial system more sustainable, they can only complement, but not substitute for, decisive political action by governments.


2019 ◽  
Vol 116 (8) ◽  
pp. 2821-2830 ◽  
Author(s):  
Moritz Lang ◽  
Mikhail Shkolnikov

The abelian sandpile is a cellular automaton which serves as the archetypical model to study self-organized criticality, a phenomenon occurring in various biological, physical, and social processes. Its recurrent configurations form an abelian group, whose identity is a fractal composed of self-similar patches. Here, we analyze the evolution of the sandpile identity under harmonic fields of different orders. We show that this evolution corresponds to periodic cycles through the abelian group characterized by the smooth transformation and apparent conservation of the patches constituting the identity. The dynamics induced by second- and third-order harmonics resemble smooth stretchings and translations, respectively, while the ones induced by fourth-order harmonics resemble magnifications and rotations. Based on an extensive analysis of these sandpile dynamics on domains of different size, we conjecture the existence of several scaling limits for infinite domains. Furthermore, we show that the space of harmonic functions provides a set of universal coordinates identifying configurations between different domains, which directly implies that the sandpile group admits a natural renormalization. Finally, we show that the harmonic fields can be induced by simple Markov processes and that the corresponding stochastic dynamics show remarkable robustness. Our results suggest that harmonic fields might split the sandpile group into subsets showing different critical coefficients and that it might be possible to extend the fractal structure of the identity beyond the boundaries of its domain.
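For readers new to the model, a minimal sketch of the abelian sandpile and of the standard recipe for the group identity (Creutz's construction; the harmonic-field dynamics studied in the paper are beyond this sketch):

import numpy as np

def stabilize(cfg):
    """Topple every site holding >= 4 grains until stable; grains toppled
    over the boundary are lost (dissipative boundary)."""
    cfg = cfg.copy()
    while True:
        t = cfg // 4                 # parallel toppling counts (0 on stable sites)
        if not t.any():
            return cfg
        cfg -= 4 * t
        cfg[1:, :]  += t[:-1, :]     # one grain to each of the four
        cfg[:-1, :] += t[1:, :]      # lattice neighbours
        cfg[:, 1:]  += t[:, :-1]
        cfg[:, :-1] += t[:, 1:]

n = 64
m = np.full((n, n), 3)                           # maximal stable configuration
identity = stabilize(2 * m - stabilize(2 * m))   # Creutz's identity recipe

# Sanity check: m is recurrent, so adding the identity and stabilizing
# returns m unchanged.
assert np.array_equal(stabilize(m + identity), m)

Rendering `identity` as an image reproduces the fractal of self-similar patches whose evolution under harmonic fields the paper studies.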

