Metrics of Evolving Ego-Networks with Forgetting Factor

Author(s):  
Rui Portocarrero Sarmento

Nowadays, treating data as a continuous, real-time flux is a necessity driven by the need for immediate responses to events in daily life. We treat the data as an ongoing stream and represent it by streaming egocentric networks (Ego-Networks) of the particular nodes under study. We use a non-standard node forgetting factor in the representation of the network data stream, as previously introduced in the related literature. This way, the representation is sensitive to recent events in users' networks and less sensitive to past node events. We study this method with large-scale Ego-Networks taken from telecommunications social networks with power-law degree distributions. We aim to compare and analyse several reference Ego-Network metrics and their variation with and without the forgetting factor.
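A minimal sketch of the forgetting-factor idea, assuming an exponential decay rule with an illustrative decay rate `alpha` (the abstract does not specify the authors' exact formulation): each neighbour's weight decays with the time elapsed since its last interaction, so metrics such as degree respond to recent activity and gradually forget stale contacts.

```python
import math
from collections import defaultdict

class ForgettingEgoNetwork:
    """Streaming ego-network with an exponential node forgetting factor.

    Illustrative sketch: `alpha` and the decay rule are assumptions,
    not the authors' exact formulation.
    """

    def __init__(self, ego, alpha=0.01):
        self.ego = ego
        self.alpha = alpha                 # assumed decay rate per unit time
        self.weights = defaultdict(float)  # neighbour -> decayed weight
        self.last_seen = {}                # neighbour -> time of last event

    def add_event(self, neighbour, t):
        # Decay the stored weight by how long ago this neighbour was last
        # seen, then add the new interaction at full strength.
        if neighbour in self.last_seen:
            dt = t - self.last_seen[neighbour]
            self.weights[neighbour] *= math.exp(-self.alpha * dt)
        self.weights[neighbour] += 1.0
        self.last_seen[neighbour] = t

    def degree(self, t, threshold=0.05):
        # Effective degree at time t: neighbours whose decayed weight is
        # still above a threshold, so long-inactive contacts drop out.
        return sum(
            1 for n, w in self.weights.items()
            if w * math.exp(-self.alpha * (t - self.last_seen[n])) > threshold
        )
```

Under this scheme, a neighbour not seen for a long time contributes an exponentially smaller weight, which is what makes the stream metrics track recent events rather than the full history.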

2021 ◽  
Vol 11 (12) ◽  
pp. 5320
Author(s):  
Redhwan Al-amri ◽  
Raja Kumar Murugesan ◽  
Mustafa Man ◽  
Alaa Fareed Abdulateef ◽  
Mohammed A. Al-Sharafi ◽  
...  

Anomaly detection has gained considerable attention over the past couple of years. Emerging technologies, such as the Internet of Things (IoT), are known to be among the most critical sources of data streams, producing massive amounts of data continuously from numerous applications. Examining these collected data to detect suspicious events can reduce functional threats and avoid unseen issues that cause downtime in applications. Due to the dynamic nature of data stream characteristics, many unresolved problems persist. In the existing literature, methods have been designed and developed to evaluate certain anomalous behaviors in IoT data stream sources. However, there is a lack of comprehensive studies that discuss all aspects of IoT data processing. This paper attempts to fill that gap by providing a comprehensive picture of state-of-the-art techniques for the major problems and core challenges in IoT data. The nature of the data, anomaly types, learning modes, window models, datasets, and evaluation criteria are also presented. Research challenges related to data evolution, feature evolution, windowing, ensemble approaches, the nature of input data, data complexity and noise, parameter selection, data visualization, data heterogeneity, accuracy, and large-scale, high-dimensional data are investigated. Finally, the challenges that require substantial research effort and future directions are summarized.
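As a toy illustration of the sliding-window model the survey covers, the sketch below flags stream values whose z-score against a recent window exceeds a threshold; the window size and threshold are arbitrary choices, not values from any reviewed method.

```python
from collections import deque
import statistics

def sliding_window_anomalies(stream, window=50, z_threshold=3.0):
    """Yield (index, value) for points that deviate strongly from a
    sliding window of recent values.

    A minimal sketch of window-based stream anomaly detection; real IoT
    detectors handle drift, noise, and multivariate data.
    """
    buffer = deque(maxlen=window)
    for i, x in enumerate(stream):
        if len(buffer) == window:
            mean = statistics.fmean(buffer)
            stdev = statistics.pstdev(buffer)
            if stdev > 0 and abs(x - mean) / stdev > z_threshold:
                yield i, x  # anomalous reading and its position
        buffer.append(x)
```

Because the buffer holds only the last `window` readings, the detector adapts as the stream evolves, which is the core appeal of window models over batch processing.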


2020 ◽  
Author(s):  
Lungwani Muungo

The purpose of this review is to evaluate progress in molecular epidemiology over the past 24 years in cancer etiology and prevention to draw lessons for future research incorporating the new generation of biomarkers. Molecular epidemiology was introduced in the study of cancer in the early 1980s, with the expectation that it would help overcome some major limitations of epidemiology and facilitate cancer prevention. The expectation was that biomarkers would improve exposure assessment, document early changes preceding disease, and identify subgroups in the population with greater susceptibility to cancer, thereby increasing the ability of epidemiologic studies to identify causes and elucidate mechanisms in carcinogenesis. The first generation of biomarkers has indeed contributed to our understanding of risk and susceptibility related largely to genotoxic carcinogens. Consequently, interventions and policy changes have been mounted to reduce risk from several important environmental carcinogens. Several new and promising biomarkers are now becoming available for epidemiologic studies, thanks to the development of high-throughput technologies and theoretical advances in biology. These include toxicogenomics, alterations in gene methylation and gene expression, proteomics, and metabonomics, which allow large-scale studies, including discovery-oriented as well as hypothesis-testing investigations. However, most of these newer biomarkers have not been adequately validated, and their role in the causal paradigm is not clear. There is a need for their systematic validation using principles and criteria established over the past several decades in molecular cancer epidemiology.


Author(s):  
Daiva Milinkevičiūtė

The Age of Enlightenment is defined as the period when the universal ideas of progress, deism, humanism, naturalism and others were materialized, and it became a golden age for freemasons. It would be wrong to assume that old and conservative Christian ideas were rejected; on the contrary, freemasons cast them into new general shapes and expressed them with the help of symbols in their daily routine. Masonic symbols had close ties with the past and gave freemasons, on the one hand, visible instruments, such as rituals and ideas to sense the transcendental, and on the other, intense gnostic aspirations. Freemasons put great effort into improving themselves and creating their identity with the help of myths and symbols. Freemasonry traces its origins to the biblical builders of King Solomon’s Temple, the posterity of the Templar Knights, and the associations of the medieval craft guilds, which were also symbolic and linked freemasons not only to each other but also to the secular world. In this work we analysed the codified Masonic symbols used in rituals. The subject of our research is the universal Masonic idea and its aspects, seen through the symbols in the daily life of the freemasons of Vilnius. Thanks to freemasons’ signets, we could trace the continuity, reception, and transformation of universal Masonic ideas in Lithuanian freemasonry and the national characteristics of its lodges. Taking everything into account, our article shows how the universal idea of freemasonry spread through Lithuanian freemasonry, and which forms and meanings it incorporated into its symbols. The objective of this research is to identify the universal Masonic idea in freemasons’ visual and oral symbols and to see its impact on the daily life of the masons of Vilnius. Keywords: Freemasonry, Bible, lodge, symbols, rituals, freemasons’ signets.


Author(s):  
Aman Ahuja ◽  
Wei Wei ◽  
Kathleen M. Carley

1987 ◽  
Vol 19 (5-6) ◽  
pp. 701-710 ◽  
Author(s):  
B. L. Reidy ◽  
G. W. Samson

A low-cost wastewater disposal system was commissioned in 1959 to treat domestic and industrial wastewaters generated in the Latrobe River valley in the province of Gippsland, within the State of Victoria, Australia (Figure 1). The Latrobe Valley is the centre for large-scale generation of electricity and for the production of pulp and paper. In addition, other industries have utilized the region's brown coal resource, e.g. gasification and char production. Consequently, industrial wastewaters have been dominant in the disposal system for the past twenty-five years. The mixed industrial-domestic wastewaters were to be transported some eighty kilometres to be treated and disposed of by irrigation to land. Several important lessons have been learnt during twenty-five years of operating this system. Firstly, the composition of the mixed waste stream has varied significantly with the passage of time and the development of the industrial base in the Valley, so that what was appropriate treatment in 1959 is not necessarily acceptable in 1985. Secondly, the magnitude of the adverse environmental impacts engendered by this low-cost disposal procedure was not imagined when the proposal was implemented. As a consequence, clean-up procedures that could remedy the adverse effects of twenty-five years of impact are likely to be costly. The question may then be asked: when total costs, including rehabilitation, are considered, is there really a low-cost solution for the environmentally safe disposal of complex wastewater streams?


2019 ◽  
Vol 19 (1) ◽  
pp. 4-16 ◽  
Author(s):  
Qihui Wu ◽  
Hanzhong Ke ◽  
Dongli Li ◽  
Qi Wang ◽  
Jiansong Fang ◽  
...  

Over the past decades, peptides as therapeutic candidates have received increasing attention in drug discovery, especially antimicrobial peptides (AMPs), anticancer peptides (ACPs) and anti-inflammatory peptides (AIPs). Peptides are considered capable of regulating various complex diseases that were previously untreatable. In recent years, the critical problem of antimicrobial resistance has driven the pharmaceutical industry to look for new therapeutic agents. Compared to small-molecule drugs, peptide-based therapy exhibits high specificity and minimal toxicity, so peptides are widely used in the design and discovery of new potent drugs. Currently, large-scale screening of peptide activity with traditional approaches is costly, time-consuming and labor-intensive. Hence, in silico methods, mainly machine learning approaches, have been introduced to predict peptide activity, owing to their accuracy and effectiveness. In this review, we document recent progress in machine learning-based prediction of peptide activity, which will be of great benefit to the discovery of potentially active AMPs, ACPs and AIPs.
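A hedged baseline of what such in silico screening can look like: amino-acid composition features fed to a scikit-learn logistic regression. The sequences and labels below are placeholders, and the reviewed methods generally use richer features and models than this sketch.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def composition_features(sequence):
    """Fraction of each of the 20 standard amino acids in the peptide."""
    counts = np.array([sequence.count(aa) for aa in AMINO_ACIDS], dtype=float)
    return counts / max(len(sequence), 1)

# Toy data: peptide sequences with binary activity labels
# (placeholder values, not a real AMP/ACP/AIP dataset).
peptides = ["GIGKFLHSAKKFGKAFVGEIMNS", "ALWKTMLKKLGTMALHAGKAALGAAADTISQGTQ"]
labels = [1, 0]

X = np.array([composition_features(p) for p in peptides])
model = LogisticRegression().fit(X, labels)
print(model.predict_proba(X)[:, 1])  # predicted probability of activity
```

In practice, such models are trained on thousands of labelled peptides and evaluated by cross-validation, but the pipeline shape (sequence, features, classifier, score) is the same.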


Author(s):  
Jeasik Cho

This book provides the qualitative research community with some insight into how to evaluate the quality of qualitative research. This topic has gained little attention during the past few decades. We, qualitative researchers, read journal articles, serve on master’s and doctoral committees, and also make decisions on whether conference proposals, manuscripts, or large-scale grant proposals should be accepted or rejected. Various perspectives or criteria, depending on the paradigm, theory, or discipline, have presumably been used in assessing the quality of qualitative research. Nonetheless, until now, no textbook has been specifically devoted to exploring the theories, practices, and reflections associated with the evaluation of qualitative research. This book constructs a typology for evaluating qualitative research, examines actual information from websites and qualitative journal editors, and reflects on some challenges currently encountered by the qualitative research community. Many different kinds of journal review guidelines and available assessment tools are collected and analyzed, and the core criteria that stand out among these evaluation tools are presented. Readers are invited to join the author in confidently proclaiming: “Fortunately, there are commonly agreed, bold standards for evaluating the goodness of qualitative research in the academic research community. These standards are a part of what is generally called ‘scientific research.’ ”


Author(s):  
Gianluca Bardaro ◽  
Alessio Antonini ◽  
Enrico Motta

Over the last two decades, several deployments of robots for in-house assistance of older adults have been trialled. However, these solutions are mostly prototypes and remain unused in real-life scenarios. In this work, we review the historical and current landscape of the field to understand why robots have yet to succeed as personal assistants in daily life. Our analysis focuses on two complementary aspects: the capabilities of the physical platform and the logic of the deployment. The former shows regularities in hardware configurations and functionalities, leading to the definition of a set of six application-level capabilities (exploration, identification, remote control, communication, manipulation, and digital situatedness). The latter focuses on the impact of robots on the daily life of users and categorises the deployment of robots for healthcare interventions into three types of services: support, mitigation, and response. Our investigation reveals that the value of healthcare interventions is limited by a stagnation of functionalities and a disconnection between the robotic platform and the design of the intervention. To address this issue, we propose a novel co-design toolkit, which uses an ecological framework for robot interventions in the healthcare domain. Our approach connects robot capabilities with known geriatric factors to create a holistic view encompassing both the physical platform and the logic of the deployment. As a case-study-based validation, we discuss the use of the toolkit in the pre-design of the robotic platform for a pilot intervention, part of the large-scale pilot of the EU H2020 GATEKEEPER project.
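One way to picture the connection between platform capabilities and intervention design is a small data model. The sketch below is an assumption-laden illustration: the factor names, service labels and schema are ours, not the toolkit's published interface; only the six capabilities and three service types come from the review.

```python
from dataclasses import dataclass, field

# The six application-level capabilities identified in the review.
CAPABILITIES = {
    "exploration", "identification", "remote control",
    "communication", "manipulation", "digital situatedness",
}

@dataclass
class Intervention:
    """Toy model linking a healthcare intervention to robot capabilities.

    Hypothetical schema for illustration; the toolkit's actual structure
    is not specified in the abstract.
    """
    name: str
    service_type: str  # "support", "mitigation" or "response"
    geriatric_factors: list[str] = field(default_factory=list)
    required_capabilities: set[str] = field(default_factory=set)

    def is_supported_by(self, platform_capabilities: set[str]) -> bool:
        # An intervention is feasible only if the platform provides
        # every capability it requires.
        return self.required_capabilities <= platform_capabilities

fall_monitoring = Intervention(
    name="fall monitoring",                # hypothetical example
    service_type="response",
    geriatric_factors=["fall risk"],
    required_capabilities={"exploration", "identification", "communication"},
)
print(fall_monitoring.is_supported_by(
    {"exploration", "identification", "communication", "manipulation"}))
```

Making the capability requirements explicit is what lets a co-design process check, before deployment, whether a candidate platform can actually deliver the intended intervention.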


2021 ◽  
Vol 4 (1) ◽  
Author(s):  
Mateusz Taszarek ◽  
John T. Allen ◽  
Mattia Marchio ◽  
Harold E. Brooks

Globally, thunderstorms are responsible for a significant fraction of rainfall, and in the mid-latitudes they often produce extreme weather, including large hail, tornadoes and damaging winds. Despite this importance, how the global frequency of thunderstorms and their accompanying hazards has changed over the past four decades remains unclear. Large-scale diagnostics applied to global climate models have suggested that the frequency and intensity of thunderstorms are likely to increase in the future. Here, we show that, according to ERA5, convective available potential energy (CAPE) and convective precipitation (CP) have decreased over the tropics and subtropics, with simultaneous increases in 0–6 km wind shear (BS06). Conversely, rawinsonde observations paint a different picture across the mid-latitudes, with increasing CAPE and significant decreases in BS06. Differing trends and disagreement between ERA5 and rawinsondes over some regions suggest that the results should be interpreted with caution, especially for CAPE and CP across the tropics, where uncertainty is highest and reliable long-term rawinsonde observations are missing.
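For concreteness, BS06 follows the standard bulk-shear definition: the magnitude of the vector difference between the 6 km AGL wind and the near-surface wind. A minimal sketch in plain NumPy, not tied to the ERA5 variable names used in the paper:

```python
import numpy as np

def bulk_shear_06km(u_sfc, v_sfc, u_6km, v_6km):
    """0-6 km bulk wind shear (BS06) in m/s.

    Magnitude of the vector difference between the 6 km AGL wind and the
    near-surface wind; inputs are scalar values or NumPy arrays of wind
    components.
    """
    return np.hypot(u_6km - u_sfc, v_6km - v_sfc)

# Example: 5 m/s westerly at the surface, 25 m/s westerly at 6 km.
print(bulk_shear_06km(5.0, 0.0, 25.0, 0.0))  # -> 20.0 m/s
```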


2021 ◽  
Vol 33 ◽  
pp. 258-269
Author(s):  
Matilda Holmes ◽  
Richard Thomas ◽  
Helena Hamerow
