theoretical ground — Recently Published Documents

TOTAL DOCUMENTS: 179 (FIVE YEARS: 67)
H-INDEX: 12 (FIVE YEARS: 3)

Author(s):  
Angbo Fang

Abstract Quite recently I have proposed a nonperturbative dynamical effective field model (DEFM) to quantitatively describe the dynamics of interacting ferrofluids. Its predictions compare very well with the results from Brownian dynamics simulations. In this paper I put the DEFM on firm theoretical ground by deriving it within the framework of dynamical density functional theory (DDFT), taking into account nonadiabatic effects. The DEFM is generalized to inhomogeneous finite-size samples for which the macroscopic and mesoscopic scale separation is nontrivial due to the presence of long-range dipole-dipole interactions. The demagnetizing field naturally emerges from microscopic considerations and is consistently accounted for. The resulting mesoscopic dynamics only involves macroscopically local quantities such as local magnetization and Maxwell field. Nevertheless, the local demagnetizing field essentially couples to magnetization at distant macroscopic locations. Thus, a two-scale parallel algorithm, involving information transfer between different macroscopic locations, can be applied to fully solve the dynamics in an inhomogeneous sample. I also derive the DEFM for polydisperse ferrofluids, in which different species can be strongly coupled to each other dynamically. I discuss the underlying assumptions in obtaining a thermodynamically consistent polydisperse magnetization relaxation equation, which is of the same generic form as that for monodisperse ferrofluids. The theoretical advances presented in this paper are important for both qualitative understanding and quantitative modeling of the dynamics of ferrofluids and other dipolar systems.
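The generic magnetization relaxation equation mentioned in the abstract can be illustrated in a far simpler setting than the DEFM itself: a Debye-type relaxation toward a Langevin equilibrium magnetization. The sketch below is textbook relaxation dynamics, not the DEFM; all parameter values are illustrative.

```python
import numpy as np

# Illustrative sketch only: a generic Debye-type relaxation equation
#   dM/dt = -(M - M_eq(H)) / tau
# with a Langevin equilibrium curve. This is the standard textbook form,
# not the nonperturbative DEFM derived in the paper.

def langevin(x):
    # L(x) = coth(x) - 1/x, with the small-x limit handled explicitly
    x = np.asarray(x, dtype=float)
    return np.where(np.abs(x) < 1e-4, x / 3.0, 1.0 / np.tanh(x) - 1.0 / x)

Ms, tau = 1.0, 1.0        # saturation magnetization and relaxation time (units arbitrary)
xi = 2.0                  # Langevin parameter mu*H/(kB*T) of the applied field
M_eq = Ms * langevin(xi)  # equilibrium magnetization for this field

# Forward-Euler integration of the relaxation toward equilibrium
dt, M = 1e-3, 0.0
for _ in range(int(10 * tau / dt)):   # integrate for ten relaxation times
    M += dt * (-(M - M_eq) / tau)

print(M, M_eq)   # M has relaxed to (very nearly) M_eq
```

After ten relaxation times the residual is of order e^(-10), so the integrated magnetization is indistinguishable from the Langevin equilibrium value.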


2021 ◽  
Vol 2021 (12) ◽  
pp. 124002
Author(s):  
Stéphane d’Ascoli ◽  
Levent Sagun ◽  
Giulio Biroli

Abstract A recent line of research has highlighted the existence of a ‘double descent’ phenomenon in deep learning, whereby increasing the number of training examples N causes the generalization error of neural networks (NNs) to peak when N is of the same order as the number of parameters P. In earlier works, a similar phenomenon was shown to exist in simpler models such as linear regression, where the peak instead occurs when N is equal to the input dimension D. Since both peaks coincide with the interpolation threshold, they are often conflated in the literature. In this paper, we show that despite their apparent similarity, these two scenarios are inherently different. In fact, both peaks can co-exist when NNs are applied to noisy regression tasks. The relative size of the peaks is then governed by the degree of nonlinearity of the activation function. Building on recent developments in the analysis of random feature models, we provide a theoretical ground for this sample-wise triple descent. As shown previously, the nonlinear peak at N = P is a true divergence caused by the extreme sensitivity of the output function to both the noise corrupting the labels and the initialization of the random features (or the weights in NNs). This peak survives in the absence of noise, but can be suppressed by regularization. In contrast, the linear peak at N = D is solely due to overfitting the noise in the labels, and forms earlier during training. We show that this peak is implicitly regularized by the nonlinearity, which is why it only becomes salient at high noise and is weakly affected by explicit regularization. Throughout the paper, we compare analytical results obtained in the random feature model with the outcomes of numerical experiments involving deep NNs.


Quanta ◽  
2021 ◽  
Vol 10 (1) ◽  
pp. 42-54
Author(s):  
Tarek Khalil ◽  
Jean Richert

The study of the physical properties of open quantum systems is at the heart of many investigations that aim to describe their dynamical evolution on theoretical ground and through physical realizations. Here, we develop a presentation of the different aspects that characterize these systems and confront the different physical situations that can be realized, leading to systems that experience Markovian, non-Markovian, divisible or non-divisible interactions with the environments to which they are dynamically coupled. We aim to show how different approaches describe the evolution of quantum systems subject to different types of interactions with their environments. Quanta 2021; 10: 42–54.
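As a concrete instance of the Markovian case discussed in the abstract, one can integrate a Lindblad master equation for a single qubit under amplitude damping. This is a standard textbook example, not taken from the paper; the damping rate and evolution time are arbitrary.

```python
import numpy as np

# Minimal sketch: Markovian dynamics of a single qubit under amplitude
# damping, written as a Lindblad master equation
#   drho/dt = gamma * (L rho L† - 1/2 {L†L, rho}),   L = |0><1|.

gamma = 1.0
L = np.array([[0.0, 1.0], [0.0, 0.0]])      # lowering operator |0><1|
LdL = L.conj().T @ L                        # = |1><1|

rho = np.array([[0.0, 0.0], [0.0, 1.0]])    # start in the excited state
dt, T = 1e-4, 2.0
for _ in range(int(T / dt)):                # forward-Euler integration
    drho = gamma * (L @ rho @ L.conj().T - 0.5 * (LdL @ rho + rho @ LdL))
    rho = rho + dt * drho

p_excited = rho[1, 1].real
print(p_excited, np.trace(rho).real)  # excited population ~ exp(-gamma*T); trace stays 1
```

The excited-state population decays exponentially at rate gamma while the trace of the density matrix is preserved, the two defining features of this completely positive, divisible (hence Markovian) evolution.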


2021 ◽  
Vol 13 (22) ◽  
pp. 12656
Author(s):  
Yasheng Chen ◽  
Mohammad Islam Biswas

The COVID-19 pandemic has had severe impacts on global health and on social and economic security. The present study discusses strategies for turning the COVID-19 crisis into opportunities to use artificial intelligence (AI) and big data in business operations. Based on shared experience and theoretical grounding, the researchers identified five major business challenges during the COVID-19 pandemic: production and supply-chain disruption, appropriate business model selection, inventory management, budget planning, and workforce management. These five challenges are illustrated with eight business cases of companies that had already utilized AI and big data in their operations during the COVID-19 pandemic. The outcomes of this study provide valuable insights for contemporary social science research and for business management, with AI and big data applications as a business response to any future crisis.


2021 ◽  
Vol 3 (1) ◽  
pp. 79-93
Author(s):  
Alberto Andres

This article aims to investigate the emergence of the American folk horror revival of the 2010s, focusing on texts such as Ari Aster’s Midsommar (2019) and Robert Eggers’s The VVitch (2015). This survey of the folk horror revival inevitably leads us to the genre’s past, particularly to the so-called Unholy Trinity, comprising three films released in Great Britain during the late 1960s and early 1970s. This temporal and geographical dislocation is situated against a larger background of cultural production, arguing that the appearance of the folk horror revival sheds some light on the debate on nostalgia and pastiche as the predominant artistic modes under late capitalism. The notion of hauntology, as explored by Jacques Derrida, Mark Fisher, and Katy Shaw, is used throughout the essay to provide a firm theoretical ground on which this debate can take place.


2021 ◽  
pp. 1-44
Author(s):  
Marco Bernini

After radical blows inflicted by empiricist or pragmatist thinkers like David Hume or William James, cognitive science today largely rejects Descartes’ view of the self as an internal and unified “thing” or substance. The growing number of books by foremost philosophers of mind (Metzinger 2009) and neuroscientists (Hood 2012; Gazzaniga 2012) on the so-called “illusion of self” (Albahari 2006; see also Siderits et al. 2011) is the most tangible sign that the currently dominant view is, rather, that we are not who we feel or think we are. While hardly anybody questions that our common phenomenological sense of being or having a self is real, the illusion would reside precisely in a misalignment between this phenomenological feeling and a different underlying ontology. This debate has led to a vital new “tradition of disagreements” (Gallagher 2012, 122–127), with a variety of competing or complementary explanatory models attempting to account for what is illusory about the self, for what is not, and for how this illusion is generated. These models will be progressively reviewed throughout this chapter as the theoretical ground against which to understand and analyze Beckett’s own variety of modeling solutions in exploring what he also called, in a letter to George Duthuit on July 27, 1948, “the illusion of the human and the fully realised” (BL II, 86).


Author(s):  
Mohamed Souhassou ◽  
Iurii Kiblin ◽  
Maxime Deutsch ◽  
Ariste Bolivar Voufack ◽  
Claude Lecomte ◽  
...  

MOLLYNX is a new crystallographic tool developed to access a more precise description of the spin-dependent electron density of magnetic crystals, taking advantage of the richness of experimental information from high-resolution X-ray diffraction (XRD), unpolarized neutron diffraction (UND) and polarized neutron diffraction (PND). This new program is based either on the well known Hansen–Coppens multipolar model (MOLLYNX-mult) or on a new expansion over a set of atomic orbitals (MOLLYNX-orb). The main difference between the two models is the basis of the expansion: in MOLLYNX-mult the expansion is over atom-centered real spherical harmonics, whereas in MOLLYNX-orb it is over a set of atomic orbitals from which monocentric and bicentric contributions are calculated. This new MOLLYNX-orb approach can also be applied to nonmagnetic crystals. This paper summarizes the theoretical ground of the two models and describes the first applications to organic, organometallic and inorganic magnetic materials.
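The atom-centered real spherical harmonics that form the angular basis of the multipolar expansion can be checked numerically. The sketch below is illustrative only: it uses the wavefunction normalization (the Hansen–Coppens model uses its own density normalization) and verifies the orthonormality of the l ≤ 1 functions by Monte Carlo integration over the unit sphere.

```python
import numpy as np

# Real spherical harmonics for l <= 1 (wavefunction normalization),
# the angular factors of an atom-centered multipolar expansion.

rng = np.random.default_rng(1)
n = 400_000
v = rng.standard_normal((n, 3))
v /= np.linalg.norm(v, axis=1, keepdims=True)   # uniform points on the sphere
x, y, z = v.T

basis = {
    "y00":  np.full(n, 0.5 / np.sqrt(np.pi)),
    "y10":  np.sqrt(3 / (4 * np.pi)) * z,
    "y11+": np.sqrt(3 / (4 * np.pi)) * x,
    "y11-": np.sqrt(3 / (4 * np.pi)) * y,
}

# Monte Carlo estimate of <y_i | y_j> over the sphere (surface area 4*pi):
# the basis should come out orthonormal up to sampling noise.
names = list(basis)
gram = np.array([[np.mean(basis[a] * basis[b]) * 4 * np.pi
                  for b in names] for a in names])
print(np.round(gram, 2))   # ~ identity matrix
```

Orthonormality of the angular basis is what lets a multipolar refinement assign independent population parameters to each harmonic; the orbital-product basis of an orbital expansion is not orthogonal in this sense, which is one practical difference between the two model families.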


2021 ◽  
pp. 109467052110322
Author(s):  
Ilias Danatzis ◽  
Ingo O. Karpen ◽  
Michael Kleinaltenkamp

Fueled by technological advances, service delivery today is increasingly realized among multiple actors beyond dyadic service encounters. Customers, for example, often collaborate with peers, service employees, platform providers, or other actors in a service ecosystem to realize desired outcomes. Yet such multi-actor settings pose greater demands on both customers and employees given added connectivity, changing roles, and responsibilities. Advancing prior dyadic readiness conceptualizations, this article lays the theoretical ground for an ecosystem-oriented understanding of readiness, which we refer to as actor ecosystem readiness (AER). Grounded in a six-stage systematic synthesis of literature from different disciplines, our AER concept unpacks the cognitive, emotional, interactional, and motivational conditions that enable a customer or an employee to navigate a service ecosystem effectively. Building on human capital resource literature, we propose a multilevel framework around five sets of propositions that theorize AER’s nomological interdependencies across ecosystem levels. In articulating the process of how AER results in higher-level ecosystem outcomes, we demonstrate how AER serves as a microfoundation of service ecosystem effectiveness. By bridging this micro–macro divide, our AER concept and framework advance multilevel theory on human readiness and critically refine the service ecosystem concept itself while providing managerial guidance and an extensive future research agenda.


2021 ◽  
Author(s):  
Miguel Filipe Passos Sério Lourenço ◽  
Miguel Fernández Ruiz ◽  
Stein Atle Haugerud ◽  
Johan Blaauwendraad ◽  
Stathis Bousias ◽  
...  

Following the long-standing tradition of fib in promoting the use of consistent design methods, strut-and-tie models were formally incorporated in Model Code 1990 to serve as the design basis for discontinuity regions. This choice was largely acknowledged as a sound approach for design and was thereafter followed in many national standards. For Model Code 2010, some updating and revision of the previous provisions was performed, but the scope of the method was particularly broadened by introducing its complementary use with the stress field method. Since Model Code 2010, significant new knowledge has been generated on this topic. In particular, software implementing the theoretical ground of the stress field method is becoming increasingly popular and efficient, allowing for design, optimisation and assessment of structures in a simple, transparent and accessible manner. In this Bulletin, the current state-of-the-art of strut-and-tie models (STM) and stress field models (SFM) is presented. Reference is made not only to classical rigid-plastic solutions, but also to solutions considering compatibility of deformations, such as elastic-plastic approaches or models allowing investigation of the serviceability behaviour and deformation capacity of concrete structures. It is shown in the Bulletin that all models share the same ground and fundamental hypotheses. Their results are presented in a unitary and consistent manner by means of compression fields in the concrete and stresses in the reinforcement. The consistency amongst these approaches and their potential use in practice is also explored by means of the Levels-of-Approximation (LoA) approach as described in Model Code 2010. Another effort in this Bulletin has been devoted to comparing strut-and-tie and stress field solutions with test results, in order to discuss their pertinence and limitations.
This perspective is completed with practical examples of structures actually designed with these techniques, where the potential of the methods can be clearly appreciated. Finally, a number of special topics are also covered in the Bulletin, related to numerical optimisation, verifications at serviceability states, minimum reinforcement, and the applicability of the methods under cyclic or reversal actions. This Bulletin not only aims to give state-of-the-art rules and methods to design according to these techniques, but also to provide an outlook of how these methods could be implemented in future standards. This material also serves as the background document for the revision of the current provisions of Model Code 2010 in the new Model Code 2020.
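The kind of hand calculation that strut-and-tie models formalize can be shown with the classic single-panel model for a simply supported deep beam under a central point load: two inclined struts carry the load to the supports and one horizontal tie holds the nodes together. The geometry and load below are invented for illustration; they are not taken from the Bulletin.

```python
import math

# Simply supported deep beam, central point load: the classic two-strut,
# one-tie model. All values are illustrative.
P = 1000.0   # applied load [kN]
span = 4.0   # distance between supports [m]
z = 1.5      # internal lever arm between strut node and tie [m]

a = span / 2                    # shear span: the load sits at midspan
theta = math.atan(z / a)        # strut inclination from the horizontal
R = P / 2                       # each support reaction

# Node equilibrium at a support: the strut's vertical component balances R,
# its horizontal component is anchored by the tie.
C = R / math.sin(theta)         # inclined strut force (compression)
T = R / math.tan(theta)         # horizontal tie force (tension)

print(f"theta = {math.degrees(theta):.1f} deg, T = {T:.0f} kN, C = {C:.0f} kN")
```

A flatter strut (smaller z/a) increases both the tie force and the strut force, which is why code provisions bound the admissible strut inclination; the stress field method generalizes exactly this equilibrium reasoning to continuous compression fields.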


2021 ◽  
Vol 11 (9) ◽  
pp. 872
Author(s):  
Lorena Angela Cattaneo ◽  
Anna Chiara Franquillo ◽  
Alessandro Grecucci ◽  
Laura Beccia ◽  
Vincenzo Caretti ◽  
...  

Several studies over the past two decades have suggested a correlation between heart rate variability (HRV), emotion regulation (ER), psychopathological conditions, and cognitive functions. Specifically, recent data seem to support the hypothesis that low-frequency heart rate variability (LF-HRV), an index of sympathetic cardiac control, correlates with worse executive performance, worse ER, and specific psychopathological dimensions. The present work aims to review the previous findings on these topics and to integrate them through two main cornerstones of this perspective: Porges’ Polyvagal Theory and Thayer and Lane’s Neurovisceral Integration Model, which are necessary to understand these associations better. Based on these two approaches, we point out that low HRV is associated with emotional dysregulation, worse cognitive performance, and transversal psychopathological conditions. We report studies that underline the importance of considering the heart–brain relation in order to shed light on the necessity of implementing psychophysiology into a broader perspective on emotions, mental health, and good cognitive functioning. This integration is beneficial not only as a theoretical ground from which to start further research studies but also as a starting point for new theoretical perspectives useful in clinical practice.
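LF-HRV, the index discussed in the abstract, is conventionally estimated as the spectral power of the RR-interval series in the 0.04–0.15 Hz band. The sketch below follows the standard recipe of resampling the tachogram on a uniform grid and integrating a Welch periodogram; the data are synthetic and all parameters are illustrative, not taken from the review.

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(0)
n_beats = 600
nominal = np.arange(n_beats) * 0.8                    # ~75 bpm baseline beat times
rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.1 * nominal)   # 0.1 Hz (LF-band) rhythm
rr += 0.005 * rng.standard_normal(n_beats)            # beat-to-beat jitter
t = np.cumsum(rr)                                     # actual beat times [s]

fs = 4.0                                              # resample the tachogram at 4 Hz
grid = np.arange(t[0], t[-1], 1.0 / fs)
tachogram = np.interp(grid, t, rr)

f, psd = welch(tachogram - tachogram.mean(), fs=fs, nperseg=256)
df = f[1] - f[0]
lf = psd[(f >= 0.04) & (f < 0.15)].sum() * df         # LF band power
hf = psd[(f >= 0.15) & (f < 0.40)].sum() * df         # HF band power, for comparison
print(f"LF = {lf:.2e} s^2, HF = {hf:.2e} s^2")
```

With a dominant 0.1 Hz oscillation injected into the synthetic RR series, the LF power comes out well above the HF power, mirroring the kind of band-power comparison on which the reviewed correlational findings rest.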

