Signatures of blazar spectra in the electromagnetic and hadronic intergalactic cascade models

2017 ◽  
Vol 81 (4) ◽  
pp. 443-445
Author(s):  
T. A. Dzhatdoev ◽  
A. P. Kircheva ◽  
A. A. Lyukshin ◽  
E. V. Khalikov


Galaxies ◽  
2021 ◽  
Vol 9 (2) ◽  
pp. 36
Author(s):  
Yoshiyuki Inoue ◽  
Dmitry Khangulyan ◽  
Akihiro Doi

To explain the X-ray spectra of active galactic nuclei (AGN), non-thermal activity in AGN coronae, often described by pair cascade models, has been extensively discussed in the literature. Although X-ray and gamma-ray observations in the 1990s disfavored such pair cascade models, recent millimeter-wave observations of nearby Seyferts have established the existence of weak non-thermal coronal activity. In addition, the IceCube collaboration reported NGC 1068, a nearby Seyfert, as the hottest spot in their 10 yr survey. This evidence motivates a renewed, in-depth investigation of the non-thermal nature of AGN coronae. This article summarizes our current observational understanding of AGN coronae and describes how AGN coronae generate high-energy particles. We also provide ways to test the AGN corona model with radio, X-ray, MeV gamma-ray, and high-energy neutrino observations.


2021 ◽  
pp. 49-59
Author(s):  
S. P. Seleznev ◽  
O. B. Tamrazova ◽  
V. Yu. Sergeev ◽  
V. G. Nikitaev ◽  
A. N. Pronichev

This review article provides an overview of the etiology, pathogenesis, clinical presentation, diagnosis, and treatment methods for actinic keratosis, keratoacanthoma, and Bowen’s disease. The provoking factors are described, with primary importance attached to insolation, prior immunosuppression and immunodeficiency, and trauma. The pathogenesis of these diseases is described in the form of cascade models. Various clinical forms and their main dermatoscopic features, suitable for digital processing in automated diagnostic systems, are presented. A stepwise approach to the treatment of these nosologies is described, and a preliminary prognosis is assessed based on the duration of progression and the likelihood of transformation into squamous cell carcinoma. Given that dermato-oncologists have not yet reached a consensus on the classification of the described diseases, this article considers them as borderline conditions, illustrating the fine line of transition from a precancerous state to cancer in situ.


2018 ◽  
Vol 80 (6) ◽  
Author(s):  
Siti Mariam Saad ◽  
Abdul Aziz Jemain ◽  
Noriszura Ismail

This study evaluates the utility and suitability of simple discrete multiplicative random cascade models for temporal rainfall disaggregation. Two such models, namely the log-Poisson and log-Normal models, are applied to simulate hourly rainfall from daily rainfall at seven rain gauge stations in Peninsular Malaysia. The cascade models are evaluated on their capability to simulate data that preserve three important properties of observed rainfall: variability, intermittency, and extreme events. The results show that both cascade models reasonably reproduce the commonly used statistical measures of hourly rainfall variability (e.g. mean and standard deviation). With respect to intermittency, even though both models underestimated the observed dry proportion, the log-Normal model is likely to simulate the number of dry spells better than the log-Poisson model. In terms of rainfall extremes, both models gave satisfactory performance at most of the studied stations, except Dungun and Kuala Krai, which are both located in the eastern part of the Peninsula.
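The disaggregation idea can be sketched as a canonical branching-number-2 cascade with a mean-one log-normal weight generator. This is an illustrative toy, not the study's calibrated models: `sigma` and `p_dry` are hypothetical values, the dry-probability thinning is a crude stand-in for intermittency, and dyadic splitting yields 2**levels sub-intervals rather than exactly 24 hours.

```python
import numpy as np

rng = np.random.default_rng(42)

def lognormal_cascade(total, levels, sigma=0.6, p_dry=0.3):
    """Disaggregate `total` over 2**levels sub-intervals with a
    canonical discrete multiplicative random cascade (branching b = 2).
    Each child weight is log-normal with mean 1; with probability
    p_dry a child is set dry (zero), crudely mimicking intermittency
    (this thins the expected total; calibrated schemes renormalize)."""
    amounts = np.array([total])
    for _ in range(levels):
        # log-normal weights with E[W] = 1: W = exp(sigma*Z - sigma^2/2)
        w = np.exp(sigma * rng.standard_normal(2 * amounts.size) - sigma**2 / 2)
        w[rng.random(w.size) < p_dry] = 0.0   # impose dry sub-intervals
        amounts = np.repeat(amounts, 2) / 2 * w
    return amounts

hourly = lognormal_cascade(total=48.0, levels=5)  # 32 sub-intervals
print(hourly.size, round(float(hourly.sum()), 2))
```

Replacing the log-normal generator by a log-Poisson one only changes the distribution from which the weights `w` are drawn; the branching structure stays the same.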


2019 ◽  
Author(s):  
Marc Schleiss

Abstract. Spatial downscaling of rainfall fields is a challenging mathematical problem for which many different types of methods have been proposed. One popular solution consists in redistributing rainfall amounts over smaller and smaller scales by means of a discrete multiplicative random cascade (DMRC). This works well for slowly varying, homogeneous rainfall fields but often fails in the presence of intermittency (i.e., large amounts of zero rainfall values). The most common workaround in this case is to use two separate cascade models, one for the occurrence and another for the intensity. In this paper, a new and simpler approach based on the notion of equal-volume areas (EVAs) is proposed. Unlike classical cascades where rainfall amounts are redistributed over grid cells of equal size, the EVA cascade splits grid cells into areas of different sizes, each of them containing exactly half of the original amount of water. The relative areas of the sub-grid cells are determined by drawing random values from a logit-normal cascade generator model with scale- and intensity-dependent standard deviation. The process ends when the amount of water in each sub-grid cell is smaller than a fixed bucket capacity, at which point the output of the cascade can be re-sampled over a regular Cartesian mesh. The present paper describes the implementation of the EVA cascade model and gives first results for 100 selected events in the Netherlands. Performance is assessed by comparing the outputs of the EVA model to bilinear interpolation and to a classical DMRC model based on fixed grid cell sizes. Results show that, on average, the EVA cascade outperforms the classical method, producing fields with more realistic distributions, small-scale extremes and spatial structures. Improvements are mostly credited to the higher robustness of the EVA model to the presence of intermittency and to the lower variance of its generator. However, improvements are not systematic and both approaches have their advantages and weaknesses. For example, while the classical cascade tends to overestimate small-scale extremes and variability, the EVA model tends to produce fields that are slightly too smooth and blocky compared with observations.
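In one dimension, the core EVA mechanism (equal water volumes, unequal areas, logit-normal split position, bucket-capacity stopping rule) can be sketched as below. The `bucket` and `sigma` values are hypothetical, and the paper's model is two-dimensional with scale- and intensity-dependent standard deviation; this sketch only shows the exact mass conservation that distinguishes the approach.

```python
import numpy as np

rng = np.random.default_rng(0)

def eva_split(x0, x1, volume, bucket=1.0, sigma=1.0):
    """Recursively split interval [x0, x1] so each child holds exactly
    half the water; the split *position* (relative area of the left
    child) is a logit-normal draw. Recursion stops once volume < bucket,
    returning leaf cells as (left edge, right edge, volume) tuples."""
    if volume < bucket:
        return [(x0, x1, volume)]
    z = sigma * rng.standard_normal()
    frac = 1.0 / (1.0 + np.exp(-z))       # logit-normal value in (0, 1)
    xm = x0 + frac * (x1 - x0)
    return (eva_split(x0, xm, volume / 2, bucket, sigma)
            + eva_split(xm, x1, volume / 2, bucket, sigma))

cells = eva_split(0.0, 1.0, volume=8.0, bucket=1.0)
print(len(cells), sum(v for _, _, v in cells))  # water volume conserved exactly
```

Because every split halves the volume exactly, all leaves end up with the same amount of water while their widths vary, which is the inverse of a classical fixed-grid cascade.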


2019 ◽  
Vol 959 ◽  
pp. 32-45 ◽  
Author(s):  
Max Rehberger ◽  
Michael Hiete

Cascade use, a concept for increasing resource efficiency through the multiple use of resources, is gaining in importance, in particular for bio-based materials. Allocation of environmental burdens and costs along the cascade chain plays a major role in deciding whether to establish a cascade or not. This highlights the need for a methodology for properly assessing different types of cascades. To provide guidance on the choice of allocation procedure available from life cycle assessment (LCA), Monte Carlo analysis is used. Hybrid, individually tailored allocation approaches in particular can be evaluated in this way. The results show a high diversity of possible outcomes in terms of general allocation intensity (how much burden is shifted between steps of the cascade), rank reversals (exchange of positions inside the burden ranking) and variance of the overall results of the cascade allocation. The results are valuable for selecting an allocation procedure for cascade LCA and for further interpreting cascade models using specific allocation procedures.
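The Monte Carlo idea can be sketched for a hypothetical three-step wood cascade: each draw represents one plausible allocation of the total burden across the cascade, and the spread across draws exposes allocation intensity and the frequency of rank reversals. The step names, Dirichlet parameters, and burden figure below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical three-step wood cascade and a total burden to allocate.
total_burden = 100.0                      # e.g. kg CO2-eq for the whole chain
steps = ["solid wood", "particle board", "energy recovery"]

# Each Monte Carlo draw is one plausible allocation procedure:
# a set of allocation factors that sum to 1 across the cascade.
draws = rng.dirichlet(alpha=[4.0, 3.0, 2.0], size=10_000)
burdens = draws * total_burden            # shape (10000, 3), rows sum to 100

means = burdens.mean(axis=0)
# Rank-reversal frequency: how often the nominal ranking
# (first step carries the largest burden) flips in a draw.
reversals = float(np.mean(burdens[:, 0] < burdens[:, 1]))
print(dict(zip(steps, means.round(1))), round(reversals, 3))
```

The per-step variance of `burdens` plays the role of allocation intensity here; a real study would replace the Dirichlet prior with distributions derived from the candidate allocation rules (mass, economic value, hybrid schemes).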


2019 ◽  
Vol 122 (4) ◽  
pp. 1473-1490 ◽  
Author(s):  
Jan Karbowski

Dendritic spines, the carriers of long-term memory, occupy a small fraction of cortical space, and yet they are the major consumers of brain metabolic energy. What fraction of this energy goes for synaptic plasticity, correlated with learning and memory? It is estimated here, based on neurophysiological and proteomic data for rat brain, that, depending on the level of protein phosphorylation, the energy cost of synaptic plasticity constitutes a small fraction of the energy used for fast excitatory synaptic transmission, typically 4.0–11.2%. Next, this study analyzes the metabolic cost of new learning and its memory trace in relation to the cost of prior memories, using a class of cascade models of synaptic plasticity. It is argued that these models must contain bidirectional cyclic motifs, related to protein phosphorylation, to be compatible with basic thermodynamic principles. For most investigated parameters, longer memories generally require proportionally more energy to store. The exceptions are the parameters controlling the speed of molecular transitions (e.g., ATP-driven phosphorylation rate), for which memory lifetime per invested energy can increase progressively for longer memories. Furthermore, in general, a memory trace decouples dynamically from the corresponding synaptic metabolic rate, such that the energy expended on new learning and its memory trace constitutes in most cases only a small fraction of the baseline energy associated with prior memories. Taken together, these empirical and theoretical results suggest a metabolic efficiency of synaptically stored information. NEW & NOTEWORTHY Learning and memory involve a sequence of molecular events in dendritic spines called synaptic plasticity. These events are physical in nature and require energy, which has to be supplied by ATP molecules. However, our knowledge of the energetics of these processes is very poor. This study estimates the empirical energy cost of synaptic plasticity and considers theoretically the metabolic rate of learning and its memory trace in a class of cascade models of synaptic plasticity.
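The flavor of such cascade models can be sketched as a small Markov chain whose transition rates halve at each deeper (more strongly phosphorylated) state, with an assumed fixed energy cost per ATP-driven transition. This is a generic toy in the spirit of cascade synapse models, not the paper's exact formulation; the number of states, rates, and energy unit are all hypothetical.

```python
import numpy as np

# Toy cascade synapse: n metastable states with transition rates that
# halve at each deeper level, so deeper states store longer memories.
n_states = 4
rates = 0.5 ** np.arange(n_states)        # per-unit-time escape rates
e_per_transition = 1.0                    # assumed energy per transition (~ATP cost)

def relax(p, dt=0.1, t_max=20.0):
    """Evolve state occupancies p toward the deepest (slowest) state,
    accumulating the energy spent on transitions along the way.
    Returns the occupancy trace and the total energy."""
    energy, trace = 0.0, []
    for _ in range(int(t_max / dt)):
        flow = rates[:-1] * p[:-1] * dt   # probability flux to the next level
        p[:-1] -= flow
        p[1:] += flow
        energy += e_per_transition * flow.sum()
        trace.append(p.copy())
    return np.array(trace), energy

p0 = np.array([1.0, 0.0, 0.0, 0.0])       # freshly potentiated synapse
trace, energy = relax(p0.copy())
```

In this toy, probability that settles in deeper states represents a longer-lived memory trace, and the accumulated `energy` grows with the number of levels the trace passes through, mirroring the paper's observation that longer memories generally cost more energy to store.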


2020 ◽  
Vol 500 (1) ◽  
pp. 718-735
Author(s):  
Alexander V Krivov ◽  
Mark C Wyatt

ABSTRACT Debris belts on the periphery of planetary systems, encompassing the region occupied by planetary orbits, are massive analogues of the Solar system’s Kuiper belt. They are detected by thermal emission of dust released in collisions amongst directly unobservable larger bodies that carry most of the debris disc mass. We estimate the total mass of the discs by extrapolating up the mass of emitting dust with the help of collisional cascade models. The resulting mass of bright debris discs appears to be unrealistically large, exceeding the mass of solids available in the systems at the preceding protoplanetary stage. We discuss this ‘mass problem’ in detail and investigate possible solutions to it. These include uncertainties in the dust opacity and planetesimal strength, variation of the bulk density with size, steepening of the size distribution by damping processes, the role of the unknown ‘collisional age’ of the discs, and dust production in recent giant impacts. While we cannot rule out the possibility that a combination of these might help, we argue that the easiest solution would be to assume that planetesimals in systems with bright debris discs were ‘born small’, with sizes in the kilometre range, especially at large distances from the stars. This conclusion would necessitate revisions to the existing planetesimal formation models, and may have a range of implications for planet formation. We also discuss potential tests to constrain the largest planetesimal sizes and debris disc masses.
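The extrapolation from emitting dust to total disc mass can be sketched with a single Dohnanyi-like power-law size distribution n(s) ∝ s^(-q) with q = 3.5, for which the mass per size decade grows as s^0.5 toward the largest bodies. The input numbers below are hypothetical, not taken from the paper, but they illustrate how the inferred total mass becomes uncomfortably large.

```python
import numpy as np

def total_disc_mass(m_dust, s_min, s_max_dust, s_max, q=3.5):
    """Extrapolate the observable dust mass (bodies up to s_max_dust)
    to the full collisional cascade up to planetesimals of size s_max,
    assuming a single power-law size distribution n(s) ~ s**(-q)."""
    def mass_integral(s1, s2):
        # mass in (s1, s2): integral of s^3 * s^-q ds = s^(4-q) / (4-q)
        return (s2**(4 - q) - s1**(4 - q)) / (4 - q)
    return m_dust * mass_integral(s_min, s_max) / mass_integral(s_min, s_max_dust)

# Hypothetical example: 0.1 Earth masses of dust in grains up to 1 mm,
# extrapolated up to 100 km planetesimals (all sizes in metres).
m_tot = total_disc_mass(0.1, s_min=1e-6, s_max_dust=1e-3, s_max=1e5)
print(round(m_tot, 1))  # of order a thousand Earth masses
```

Because the extrapolated mass scales roughly as (s_max / s_max_dust)**0.5 for q = 3.5, shrinking the assumed largest planetesimal size ('born small') directly deflates the inferred disc mass, which is the resolution the authors favor.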


Entropy ◽  
2019 ◽  
Vol 21 (4) ◽  
pp. 421 ◽  
Author(s):  
Masashi Kamogawa ◽  
Kazuyoshi Z. Nanjo ◽  
Jun Izutsu ◽  
Yoshiaki Orihara ◽  
Toshiyasu Nagao ◽  
...  

The relation between the size of an earthquake mainshock preparation zone and the magnitude of the forthcoming mainshock differs between the nucleation and domino-like cascade models. The former model implies that magnitude is predictable before a mainshock because the preparation zone is related to the rupture area. In contrast, the latter implies that magnitude is substantially unpredictable because it is practically impossible to predict the size of the final rupture, which likely consists of a sequence of smaller earthquakes. As this question is still controversial, we assess both models statistically by comparing the spatial occurrence rates of foreshocks and aftershocks. Using earthquake catalogs from three regions, California, Japan, and Taiwan, we showed that the spatial occurrence rates of foreshocks and aftershocks displayed similar behavior, and this feature did not vary among the regions. An interpretation of this result, based on statistical analyses, indicates that the nucleation model is dominant.
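The comparison of spatial occurrence rates can be sketched as a two-sample test on the distributions of epicentral distances of foreshocks and aftershocks from the mainshock. The distances below are synthetic, and the Kolmogorov–Smirnov statistic is only one of several measures one might use; it is not necessarily the statistic employed in the study.

```python
import numpy as np

rng = np.random.default_rng(7)

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum gap
    between the empirical CDFs of samples a and b."""
    grid = np.sort(np.concatenate([a, b]))
    cdf_a = np.searchsorted(np.sort(a), grid, side="right") / a.size
    cdf_b = np.searchsorted(np.sort(b), grid, side="right") / b.size
    return float(np.abs(cdf_a - cdf_b).max())

# Synthetic epicentral distances (km) of foreshocks and aftershocks.
# Under the nucleation picture both populations cluster on the eventual
# rupture area, so their spatial distributions should look alike.
foreshocks = np.abs(rng.normal(0.0, 5.0, 300))
aftershocks = np.abs(rng.normal(0.0, 5.0, 2000))
d = ks_statistic(foreshocks, aftershocks)
print(round(d, 3))  # small value -> similar spatial distributions
```

A markedly larger statistic for one catalog region than another would hint at region-dependent behavior; the study's finding is that the foreshock and aftershock distributions look alike in all three regions.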


1989 ◽  
Vol 1 (2) ◽  
pp. 420-425 ◽  
Author(s):  
Vincenzo Carbone ◽  
Pierluigi Veltri
