Italian Tsunami Effects Database (ITED): The First Database of Tsunami Effects Observed Along the Italian Coasts

2021 ◽  
Vol 9 ◽  
Author(s):  
Alessandra Maramai ◽  
Laura Graziani ◽  
Beatriz Brizuela

Traditional tsunami catalogues are conceived as collections of tsunamis classified by their generating cause, providing a general description of the effects observed for each tsunami. Although such catalogues provide fundamental information, they are not suitable for producing an exhaustive picture of the geographical distribution of tsunami effects. In this paper we introduce the new Italian Tsunami Effects Database (ITED), a collection of evidence documenting tsunami effects along the Italian coasts from historical times to the present. The database stems from the Euro-Mediterranean Tsunami Catalogue (EMTC) and focuses on the effects of tsunamis observed along the Italian coasts, providing descriptive and quantitative information for each observation point (OP). The information reported in ITED concerns not only the effects produced by Italian tsunamis but also the effects of tsunamis that originated outside the Italian territory. ITED contains 318 OPs, related to 73 Italian tsunamis and to four tsunamis that occurred outside Italy. The database can be accessed through a WebApp that displays, for each OP, the description of effects, quantitative data (run-up, inundation, withdrawal, etc.) and tsunami intensity with the corresponding bibliographic references. The database also provides the tsunami intensity distribution over time (tsunami-history) for each site, allowing the end user to know how a place has been affected by tsunamis over time. The information contained in ITED makes this database a useful tool for understanding how tsunamis have affected the Italian territory and emphasizes the importance of studying tsunami hazard along the Italian coasts.
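
As an illustration of the kind of per-OP record the abstract describes, the sketch below defines a record holding the quantities mentioned (run-up, inundation, withdrawal, intensity, references); the field names and example values are assumptions for illustration, not the actual ITED schema or data.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative sketch only: field names are assumptions based on the quantities
# mentioned in the abstract, not the actual ITED schema.
@dataclass
class ObservationPoint:
    place_name: str                       # coastal locality where effects were observed
    event_year: int                       # year of the generating tsunami
    runup_m: Optional[float] = None       # maximum run-up height, metres
    inundation_m: Optional[float] = None  # inland inundation distance, metres
    withdrawal_m: Optional[float] = None  # sea withdrawal distance, metres
    intensity: Optional[int] = None       # tsunami intensity assigned at the OP
    references: List[str] = field(default_factory=list)  # bibliographic sources

# Example record; the values are invented placeholders, not ITED data.
op = ObservationPoint("Example locality", 1900, runup_m=2.5, intensity=3,
                      references=["Hypothetical reference"])
print(op)
```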

2019 ◽  
Author(s):  
Alessandra Maramai ◽  
Laura Graziani ◽  
Beatriz Brizuela

Abstract. The Italian Tsunami Effects Database (ITED) consists of an ensemble of records reporting tsunami effects observed at several Observation Points (OPs) along the Italian coasts from historical times. ITED was compiled starting from the Euro-Mediterranean Tsunami Catalogue (EMTC) and focuses on the propagation effects observed along the Italian coasts, providing information on how each locality was affected by tsunami effects over time. The effects reported in ITED are related to tsunamis that occurred within the Italian territory and are contained in the EMTC; these events were extracted, analyzed and updated according to recent studies published in the literature. The database can be accessed through a web GIS application that displays the location of the OPs, indicating for each of them the description of tsunami effects found in the literature and the corresponding bibliographic references, as well as the metrics related to the observed event. Based on those descriptions, an estimated value of the tsunami intensity was assigned to each OP, according to both the Sieberg-Ambraseys and the Papadopoulos-Imamura scales. All the ITED data, including quantitative data such as run-up, inundation and withdrawal, can be retrieved by accessing the database online through the WebApp that was expressly designed and built for this purpose. ITED contains 300 observations of tsunami effects at 225 OPs, referring to 186 main Italian localities, hereafter called Place Names (PNs), and related to 72 Italian tsunamis. The database also provides the tsunami-history for each PN, allowing the end user to have a complete picture of how prone the PN is to tsunami effects. The realization of ITED was also the occasion to update the Italian tsunamis contained in the EMTC, leading to the release of a new version of the catalogue, named EMTC2.0. ITED was specifically built to meet the needs of the tsunami hazard community, providing useful information that can improve knowledge of how much the national territory is exposed to tsunami risk.
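
The per-PN tsunami-history described above can be pictured as a grouping of OP records by locality, ordered in time. A minimal sketch, with invented records and field names (not the actual ITED data or schema):

```python
from collections import defaultdict

# Invented OP records for illustration only; field names and values are
# assumptions, not ITED data.
ops = [
    {"place": "Place A", "year": 1783, "intensity": 3},
    {"place": "Place A", "year": 1908, "intensity": 5},
    {"place": "Place B", "year": 1908, "intensity": 4},
]

# Build the "tsunami-history" of each Place Name: its observed effects over time.
history = defaultdict(list)
for op in ops:
    history[op["place"]].append((op["year"], op["intensity"]))

for place, events in history.items():
    print(place, sorted(events))
```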


2017 ◽  
Vol 10 (04) ◽  
pp. 745-754
Author(s):  
Mudasir M Kirmani

Data warehouse design requires a radical restructuring of vast amounts of data, frequently of questionable or inconsistent quality, drawn from various heterogeneous sources. Data warehouse design assimilates business knowledge and technological know-how, and it requires a deep and detailed understanding of the business processes. The main aim of this research paper is to study and investigate a conversion model for transforming E-R diagrams into a star schema for developing data warehouses. Dimensional modelling is a logical design technique used for data warehouses. This paper addresses the main differences between the two techniques and highlights the advantages, as well as the disadvantages, of using dimensional modelling. Dimensional modelling is one of the popular techniques for databases that are designed with end-user queries on the data warehouse in mind. The focus here is on the star schema, which comprises a fact table and dimension tables. Each fact table contains foreign keys to the various dimensions, the measures, and degenerate dimensions, if any. We also discuss the possibilities of deploying and accepting a Conversion Model (CM) that provides the details of the fact table and dimension tables according to local needs. The paper also highlights why dimensional modelling is preferred over E-R modelling when creating a data warehouse.
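
As a concrete sketch of the star schema discussed above, the snippet below creates an illustrative fact table with foreign keys to its dimension tables, its measures, and a degenerate dimension; the table and column names are invented for the example and are not taken from the paper.

```python
import sqlite3

# Illustrative star schema: a single fact table surrounded by dimension tables.
# All table and column names are invented for the example.
ddl = """
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, full_date TEXT, month INTEGER, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE dim_store   (store_key INTEGER PRIMARY KEY, city TEXT, region TEXT);

CREATE TABLE fact_sales (
    date_key     INTEGER REFERENCES dim_date(date_key),        -- foreign keys to dimensions
    product_key  INTEGER REFERENCES dim_product(product_key),
    store_key    INTEGER REFERENCES dim_store(store_key),
    order_number TEXT,                                          -- degenerate dimension (no table of its own)
    quantity     INTEGER,                                       -- measures
    amount       REAL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(ddl)
print([row[0] for row in conn.execute("SELECT name FROM sqlite_master WHERE type='table'")])
```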


Author(s):  
B. Tourniaire ◽  
J. M. Seiler ◽  
J. M. Bonnet ◽  
M. Amblard

Corium coolability after a severe PWR accident involving core meltdown and RPV failure is one of the main issues in nuclear safety. The case considered here is a situation in which the corium is assumed to spread over a concrete floor and to be flooded by water. In this context, much research has been performed to study the physical phenomena that may enhance the heat transfer between the corium and the water pool. Among these, melt entrainment above the corium crust by the sparging gas released by concrete ablation appears to be a potentially efficient cooling mechanism. The main objective of the PERCOLA experimental program is to provide qualitative and quantitative information on this entrainment phenomenon. The first part of this paper is devoted to a general description of the experimental program and to the presentation of the main results. The second part focuses on the modelling of the liquid entrainment phenomenon and on the comparison between the experimental data and the calculation results of two different entrainment models.
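
The abstract does not specify the two entrainment models, so the sketch below only illustrates the kind of data-versus-model comparison mentioned: two hypothetical power-law correlations are evaluated against invented measurements and scored by their root-mean-square deviation.

```python
import math

# Invented measurements: superficial gas velocity (m/s) vs. entrained mass flux
# (arbitrary units). These are placeholders, not PERCOLA data.
data = [(0.1, 0.02), (0.3, 0.09), (0.5, 0.18), (0.8, 0.35)]

# Two hypothetical entrainment correlations of power-law form E = a * jg**b.
# Coefficients are illustrative only and do not represent the models in the paper.
def model_1(jg): return 0.40 * jg**1.1
def model_2(jg): return 0.55 * jg**1.4

def rms_deviation(model):
    return math.sqrt(sum((model(jg) - e) ** 2 for jg, e in data) / len(data))

for name, model in [("model 1", model_1), ("model 2", model_2)]:
    print(name, round(rms_deviation(model), 4))
```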


2014 ◽  
Vol 5 (3) ◽  
pp. 51-70
Author(s):  
Maurice Mulvenna ◽  
Suzanne Martin

Living labs, defined as a collection of people, equipment, services, and technology that provides a test platform for research and experiments, offer much promise in engaging with users to create new products and services. However, they are not widely understood outside some of the academic departments in which the concepts underlying them have been developed. The purpose of this study was to provide information about the phenomenon of living labs by asking the labs themselves to provide fundamental information on their position, outlook, and relationships with users and related stakeholders in triple-helix partnerships comprising academia, the public sector, and private business. The approach of the study was to design and conduct a survey using an electronic Internet-based survey tool. The survey was designed to provide quantitative information, for example about the number of users involved in each living lab. However, the survey also probed the labs for more detailed responses to questions exploring qualitative aspects. The survey request was sent to all extant living labs that provided some form of email address as a contact. Fifty-six living labs responded, a response rate of 29%. This study is believed to be the first major survey of living labs undertaken since the European Network of Living Labs was established in Espoo, Finland in 2006. A key value of the study is that it provides a baseline against which future studies can compare results. It also provides very interesting findings about the diversity of living labs, how they engage with users, and how strong the relationships between living labs are.


Machines ◽  
2019 ◽  
Vol 7 (4) ◽  
pp. 70
Author(s):  
Nima Alaei ◽  
Emil Kurvinen ◽  
Aki Mikkola

Digital tools have become indispensable for the testing and modification of prototypes in mobile and industrial machine manufacturing. Data extracted from virtual experimentation and analysis are both affordable and valuable, owing to their repeatability and their closeness to real-world observations. Expert knowledge is a prerequisite for the full deployment of computer-aided engineering tools in the design phase and concomitant stages of product development. Currently, such knowledge is, for the most part, provided by the product development team and the manufacturer. Yet it is important that manufacturers and designers receive end-user feedback throughout the product development process. However, end-users often lack sufficient know-how about the technical and engineering background of product development, and this lack of understanding can become a barrier to user-designer communication. The aim of this article is to present an alternative to traditional design approaches based on customized real-time multibody simulation. This simulation-based approach can be seen as a platform with the potential to improve knowledge management systems for product development. End-user feedback to the designer is given in a systematic manner throughout the design process using a multipurpose XML-based multibody environment.
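
The XML-based multibody environment is not detailed in the abstract; the sketch below shows, under an assumed and purely hypothetical schema, how a minimal body/joint description could be parsed so that model parameters become accessible for end-user feedback.

```python
import xml.etree.ElementTree as ET

# Hypothetical XML description of a two-body model; the element and attribute
# names are assumptions, not the actual format used by the authors.
model_xml = """
<multibody_model name="example_machine">
  <body id="frame" mass="120.0"/>
  <body id="boom"  mass="35.5"/>
  <joint id="j1" type="revolute" parent="frame" child="boom"/>
</multibody_model>
"""

root = ET.fromstring(model_xml.strip())
bodies = {b.get("id"): float(b.get("mass")) for b in root.iter("body")}
joints = [(j.get("id"), j.get("type"), j.get("parent"), j.get("child"))
          for j in root.iter("joint")]

print("bodies:", bodies)
print("joints:", joints)
```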


1918 ◽  
Vol 17 (1) ◽  
pp. 20-33 ◽  
Author(s):  
William G. Savage

Compared with 20, or even 10, years ago our present knowledge of food poisoning outbreaks is extensive and in certain directions fairly complete. In spite of this greatly extended knowledge there are some aspects in regard to which we are yet lacking in fundamental information. This is particularly the case as to the precise sources of infection. It may be accepted as a demonstrated fact that most outbreaks of food poisoning are due to infection of the food eaten with one or other member of the Gaertner group of bacilli. The present paper is only concerned with the outbreaks associated with this group of organisms. A study of the individual outbreaks usually supplies evidence which definitely incriminates a certain article of food, and for most of the recent outbreaks further evidence is forthcoming that this has been infected with one or other member of the Gaertner group of bacilli. Tracing the matter a step further back it is only in a quite small minority of outbreaks that the recorded facts show how the food has become so infected. In a proportion of cases, perhaps more than half for continental recorded outbreaks but in only a small fraction of the British outbreaks, it is true that definite evidence is forthcoming showing that the meat was derived from an animal itself suffering from general or local disease caused by Gaertner group bacilli. Even, however, for these cases our recorded knowledge ceases with this information, and we do not know how these animals became infected or whether they represent isolated cases or are part of widespread epidemics amongst the animals affected.


2011 ◽  
Vol 1 (7) ◽  
pp. 36
Author(s):  
A. Paape

In the past it has been found that serious damage and breaching of seawalls is most frequently caused by overtopping. Hence, for the design of seawalls, data must be available about wave overtopping for the different profiles that might be considered. Naturally, the conditions under which damage is caused to a seawall also depend on the type of construction and the materials used; for example, the stability of grass-covered dikes can be seriously endangered by water flowing over the inner slope. In many designs the necessary height of a seawall has been defined such that no more than 2% of the waves overtop the crest under chosen design conditions. This criterion has been determined on the assumption that the overtopping must remain very small. Some overtopping has to be accepted because no maximum value for wave height and wave run-up can be given, unless of course the wave height is limited by foreshore conditions. Unfortunately, this criterion gives no information about the volume and concentration of water overtopping the crest in each instance. Moreover, it is of interest to know how this overtopping varies with other conditions, such as changes in the significant wave height. Information about wave overtopping was obtained from model investigations on simple plane slopes with inclinations varying from 1:8 to 1:2. The experiments were made in a wind flume where both wind-generated waves and regular waves were employed. Using wind-generated waves, conditions from nature regarding the distribution of wave heights could be reproduced. It appeared that the overtopping depends on the irregularity of the waves and that the same effects cannot be reproduced using regular paddle-generated waves. In this paper a description of the model and the results of these tests are given. Investigations are in progress on composite slopes, including the reproduction of conditions for a seawall which suffered much overtopping but remained practically undamaged during the flood of 1953.
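
As a worked illustration of a 2%-exceedance criterion (not the procedure used in the paper, which relies on irregular-wave model tests), assume run-up levels follow a Rayleigh distribution with exceedance probability exp(-2 (r / r_sig)^2), where r_sig is the significant value; the level exceeded by only 2% of waves is then about 1.4 times r_sig.

```python
import math

# Illustrative only: assumes a Rayleigh distribution of run-up levels with
# exceedance probability P(r) = exp(-2 * (r / r_sig)**2). This assumption is
# not taken from the paper.
def exceedance_level(r_sig, probability):
    """Level exceeded by the given fraction of waves."""
    return r_sig * math.sqrt(-math.log(probability) / 2.0)

r_sig = 2.0  # significant run-up in metres (invented value)
print(exceedance_level(r_sig, 0.02))  # ~2.80 m, i.e. about 1.40 * r_sig
```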


2021 ◽  
Author(s):  
Robert Hoffman ◽  
Gary Klein ◽  
Shane T. Mueller ◽  
Mohammadreza Jalaeian ◽  
Connor Tate

The purpose of the Stakeholder Playbook is to enable system developers to take into account the different ways in which stakeholders need to "look inside" AI/XAI systems. Recent work on Explainable AI has mapped stakeholder categories onto explanation requirements. While most of these mappings seem reasonable, they have been largely speculative. We investigated these matters empirically. We conducted interviews with senior and mid-career professionals possessing post-graduate degrees who had experience with AI and/or autonomous systems, and who had served in a number of roles, including former military personnel, civilian scientists working for the government, scientists working in the private sector, and scientists working as independent consultants. The results show that stakeholders need access to others (e.g., trusted engineers, trusted vendors) to develop satisfying mental models of AI systems, and they need to know "how it fails" and "how it misleads", not just "how it works." In addition, explanations need to support end-users in performing troubleshooting and maintenance activities, especially as operational situations and input data change. End-users need to be able to anticipate when the AI is approaching an edge case. Stakeholders often need to develop an understanding that enables them to explain the AI to someone else, not just satisfy their own sensemaking. We were surprised that only about half of our interviewees said they always needed better explanations. This and other apparently paradoxical findings can be resolved by acknowledging that different stakeholders have different capabilities, different sensemaking requirements, and different immediate goals. In fact, the concept of "stakeholder" is misleading because the people we interviewed served in a variety of roles simultaneously; we recommend referring to these roles rather than trying to pigeonhole people into unitary categories. Different cognitive styles are another formative factor, as suggested by participant comments to the effect that they preferred to dive in and play with the system rather than being spoon-fed an explanation of how it works. These factors combine to determine what, for each given end-user, constitutes satisfactory and actionable understanding.


2017 ◽  
Vol 16 (2) ◽  
pp. 176
Author(s):  
Hanina Halimatussaidiyah Hamsan ◽  
Lee Mei Siah

Research has been conducted on attitudes related to HIV/AIDS, particularly among university students, because students are the group of the younger generation that may become parents advising their children in the future, or may come into contact with people living with HIV/AIDS in their future work. The purpose of the research design is to identify the most economical method of conducting the research. A non-experimental quantitative research design using a questionnaire was applied to obtain results based on hypothesis testing. The population of this study is the undergraduates of the Faculty of Human Ecology (FEM) at UPM, Serdang. The sample size was determined by a calculation formula (Israel, 2009). The minimum sample size for FEM is approximately 271, calculated by the formula for a known population size (N = 835) with a 95% confidence level and a confidence interval of 5%, in order to achieve the objectives of the current study. In addition, three research design techniques, descriptive, correlational and comparative, were chosen for this study. The descriptive design is used to gather the quantitative information and then organize, tabulate, depict, and describe the data collected; it provides a general description of the respondents' personal characteristic (gender), level of knowledge of HIV/AIDS and attitudes toward HIV/AIDS. The correlational design is applied to determine the linear relationship between the personal characteristic (gender), knowledge and attitudes towards HIV/AIDS among FEM undergraduates. The result of a correlation can be positive or negative: if the correlation is positive, changes in the value of one variable are accompanied by changes in the other variable in the same direction, and vice versa.
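
The minimum sample size quoted above is consistent with the simplified finite-population formula commonly cited from Israel (2009), n = N / (1 + N e^2), with N = 835 and e = 0.05; a quick check:

```python
import math

# Simplified finite-population sample-size formula, n = N / (1 + N * e**2),
# as commonly cited from Israel (2009). With N = 835 and e = 0.05 it reproduces
# the minimum sample size of approximately 271 quoted in the abstract.
def minimum_sample_size(population, margin_of_error):
    return math.ceil(population / (1 + population * margin_of_error ** 2))

print(minimum_sample_size(835, 0.05))  # -> 271
```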


2004 ◽  
Vol 38 ◽  
pp. 159-165 ◽  
Author(s):  
Ricard Molina ◽  
Elena Muntán ◽  
Laia Andreu ◽  
Glòria Furdada ◽  
Pere Oller ◽  
...  

Abstract. Avalanche hazard maps of high accuracy are difficult to produce. For land-use planning and management purposes, a good knowledge of extreme run-out zones and frequencies of avalanches is required. In the present work, vegetation recognition (especially focused on Pinus uncinata trees) and dendrochronological techniques are used to characterize avalanches that have occurred in historical times, helping to determine both the extent of large or extreme avalanches and their occurrence in time. Vegetation was studied at the Canal del Roc Roig (eastern Pyrenees, Spain) avalanche path. The avalanches descending this path affect the railway that reaches the Vall de Núria resort and run up the opposite slope. During winter 1996, two important avalanches affecting this path were well documented. These are compared with the results of the vegetation study, consisting of an inventory of flora, the recording of vegetation damage along eight transverse profiles at different altitudes on the path, and a dendrochronological sampling campaign. The data obtained contributed to a characterization of the predominant snow accumulation in the starting zone, the 1996 avalanches and the range of frequencies of large avalanches. In addition, traces of avalanches extending beyond the path mapped in the avalanche-path map published by the Institut Cartogràfic de Catalunya in 2000 were identified, improving the initially available information.

