Tetrahedra of varying density and their applications

Author(s):  
Dennis R. Bukenberger ◽  
Hendrik P. A. Lensch

Abstract We propose concepts to utilize basic mathematical principles for computing the exact mass properties of objects with varying densities. For objects given as 3D triangle meshes, the method is analytically accurate and at the same time faster than any established approximation method. Our concept is based on tetrahedra as underlying primitives, which allows for the object’s actual mesh surface to be incorporated in the computation. The density within a tetrahedron is allowed to vary linearly, i.e., arbitrary density fields can be approximated by specifying the density at all vertices of a tetrahedral mesh. The involved integrals are formulated in closed form and can be evaluated by simple, easily parallelized vector-matrix multiplications. The ability to compute exact masses and centroids for objects of varying density enables novel or more exact solutions to several interesting problems: besides the accurate analysis of objects under given density fields, this includes the synthesis of parameterized density functions for the make-it-stand challenge or the manufacturing of objects with controlled rotational inertia. In addition, based on the tetrahedralization of Voronoi cells, we introduce a precise method to solve $L_{2|\infty}$ Lloyd relaxations by exact integration of the Chebyshev norm. In the context of additive manufacturing research, objects of varying density are a prominent topic. However, current state-of-the-art algorithms are still based on voxelizations, which produce rather crude approximations of the masses and mass centers of 3D objects. Many existing frameworks would benefit from replacing these approximations with fast and exact calculations.
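The closed form behind these claims can be made concrete. For a density field that is linear in barycentric coordinates, integrating over a single tetrahedron with vertices $v_0,\dots,v_3$ and vertex densities $\rho_0,\dots,\rho_3$ gives mass $M = V\,(\rho_0+\rho_1+\rho_2+\rho_3)/4$ and centroid $c = \sum_i v_i(S+\rho_i)/(5S)$ with $S = \sum_i \rho_i$ and $V$ the tetrahedron volume. The Python sketch below applies these standard per-tetrahedron formulas to a whole tetrahedral mesh; it is a minimal illustration of the result the abstract refers to, not the paper's vectorized implementation, and all names are chosen here for exposition.

```python
import numpy as np

def tet_mass_properties(vertices, tets, rho):
    """Exact mass and center of mass for a tetrahedral mesh whose
    density varies linearly inside each tetrahedron (given at the
    vertices). Per tetrahedron:
      mass     = V * (rho_0 + rho_1 + rho_2 + rho_3) / 4
      centroid = sum_i v_i * (S + rho_i) / (5 * S),  S = sum_i rho_i
    """
    total_mass = 0.0
    weighted = np.zeros(3)
    for t in tets:
        v = vertices[t]              # (4, 3) corner positions
        r = rho[t]                   # (4,) corner densities
        # signed volume of the tetrahedron
        V = np.dot(np.cross(v[1] - v[0], v[2] - v[0]), v[3] - v[0]) / 6.0
        S = r.sum()
        m = V * S / 4.0              # mass under the linear density field
        c = (v * (S + r)[:, None]).sum(axis=0) / (5.0 * S)
        total_mass += m
        weighted += m * c            # aggregate first moments
    return total_mass, weighted / total_mass

# Unit tetrahedron, density rising from 1 at the base to 2 at the apex.
verts = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
tets = np.array([[0, 1, 2, 3]])
rho = np.array([1., 1., 1., 2.])
print(tet_mass_properties(verts, tets, rho))
# mass = (1/6) * (5/4) ~ 0.2083; centroid (0.24, 0.24, 0.28),
# shifted toward the denser apex relative to (0.25, 0.25, 0.25).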

2020 ◽  
Vol 6 (3) ◽  
pp. 359-374 ◽  
Author(s):  
Henning Pieper

The article seeks to examine why heavy metal bands have used history and imagery associated with the ‘Schutzstaffel’ (SS). This includes the reasons for the focus on this particular organization as well as the intentions behind it: are the compositions about historical facts or rather fictitious topics; that is, are they an accurate analysis, a provocation, or mere entertainment? The article takes a closer look at the historical background and the content of the songs; it also questions the musicians’ awareness of the criminal character of the SS. The time span covered by the songs in question extends from the mid-1980s to the early 2000s. Thus, it can be asked whether the attitude of the musicians changed over time and whether they incorporated the current state of research into their lyrics.


Author(s):  
Manuel Mazzara ◽  
Luca Biselli ◽  
Pier Paolo Greco ◽  
Nicola Dragoni ◽  
Antonio Marraffa ◽  
...  

Nowadays, the acquisition of trustworthy information is increasingly important in both professional and private contexts. However, establishing what information is trustworthy and what is not is a very challenging task. For example, how can information quality be reliably assessed? How can the credibility of sources be fairly assessed? How can the gatekeeping processes of traditional media be trusted when they filter out news and decide rankings and priorities? An Internet-based solution to this ancient human problem is being studied; it is called Polidoxa, from the Greek “poly” (πολύ), meaning “many” or “several,” and “doxa” (δόξα), meaning “common belief” or “popular opinion.” The old problem is thus addressed by combining ancient philosophies and processes with truly modern tools and technologies. This is why this work required a collaborative and interdisciplinary joint effort from researchers with very different backgrounds and from institutes with significantly different agendas. Polidoxa aims at offering: 1) a trust-based search engine algorithm, which exploits the stigmergic behaviour of the users’ network, 2) a trust-based social network, where the notion of trust derives from network activity, and 3) a holonic system for bottom-up self-protection and social privacy. By presenting the Polidoxa solution, this work also describes the current state of traditional media as well as newer ones, providing an accurate analysis of major search engines such as Google and social networks such as Facebook. The advantages that Polidoxa offers compared to these are also clearly detailed and motivated. Finally, a Twitter application (Polidoxa@twitter), which enables experimentation with basic Polidoxa principles, is presented.
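The abstract does not spell out the trust algorithm, but the stigmergic idea can be sketched: user interactions deposit pheromone-like reinforcement on sources, weighted by the interacting user's own trust, and the trails evaporate over time so stale reputation fades. The Python sketch below is one plausible reading of that principle; the class name, half-life parameter, and update rule are illustrative assumptions, not Polidoxa's actual implementation.

```python
import math
import time
from collections import defaultdict

class StigmergicTrust:
    """Illustrative pheromone-style trust scores: each interaction
    reinforces a source's trail in proportion to the user's own trust,
    and trails evaporate exponentially so stale reputation fades."""

    def __init__(self, half_life_days=30.0):
        # evaporation rate such that an untouched trail halves every half-life
        self.decay = math.log(2) / (half_life_days * 86400.0)
        self.trail = defaultdict(float)   # source -> pheromone level
        self.stamp = defaultdict(float)   # source -> time of last update

    def _evaporate(self, source, now):
        dt = now - self.stamp[source]
        self.trail[source] *= math.exp(-self.decay * dt)
        self.stamp[source] = now

    def reinforce(self, source, user_trust, now=None):
        now = now if now is not None else time.time()
        self._evaporate(source, now)
        self.trail[source] += user_trust

    def rank(self, sources, now=None):
        now = now if now is not None else time.time()
        for s in sources:
            self._evaporate(s, now)
        return sorted(sources, key=lambda s: self.trail[s], reverse=True)

# Usage: two users with different trust weights endorse competing sources.
net = StigmergicTrust()
net.reinforce("source_a", user_trust=0.9)
net.reinforce("source_b", user_trust=0.2)
print(net.rank(["source_a", "source_b"]))  # source_a ranks first
```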


Author(s):  
G.D. Danilatos

Over recent years a new type of electron microscope - the environmental scanning electron microscope (ESEM) - has been developed for the examination of specimen surfaces in the presence of gases. A detailed series of reports on the system has appeared elsewhere. A review summary of the current state and potential of the system is presented here.

The gas composition, temperature and pressure can be varied in the specimen chamber of the ESEM. With air, the pressure can be up to one atmosphere (about 1000 mbar). Environments of fully saturated water vapor alone at room temperature (20-30 mbar) can easily be maintained, whilst liquid water or other solutions, together with uncoated specimens, can be imaged routinely during various applications.


Author(s):  
C. Barry Carter

This paper will review the current state of understanding of interface structure and highlight some of the future needs and problems which must be overcome. The study of this subject can be separated into three different topics: 1) the fundamental electron microscopy aspects, 2) material-specific features of the study and 3) the characteristics of the particular interfaces. The two topics which are relevant to most studies are the choice of imaging techniques and sample preparation. The techniques used to study interfaces in the TEM include high-resolution imaging, conventional diffraction-contrast imaging, and phase-contrast imaging (Fresnel fringe images, diffuse scattering). The material studied affects not only the characteristics of the interfaces (through changes in bonding, etc.) but also the method used for sample preparation, which may in turn have a significant effect on the resulting image. Finally, the actual nature and geometry of the interface must be considered. For example, it has become increasingly clear that the plane of the interface is particularly important whenever at least one of the adjoining grains is crystalline.

A particularly productive approach to the study of interfaces is to combine different imaging techniques, as illustrated in the study of grain boundaries in alumina. In this case, the conventional imaging approach showed that most grain boundaries in ion-thinned samples are grooved at the grain boundary, although the extent of this grooving clearly depends on the crystallography of the surface. The use of diffuse scattering (from amorphous regions) gives invaluable information here, since it can be used to confirm directly that surface grooving does occur and that the grooves can fill with amorphous material during sample preparation (see Fig. 1). Extensive use of image simulation has shown that, although information concerning the interface can be obtained from Fresnel-fringe images, the introduction of artifacts through sample preparation cannot be lightly ignored. The Fresnel-fringe simulation has been carried out using a commercial multislice program (TEMPAS) which was intended for the simulation of high-resolution images.


2005 ◽  
Vol 41 ◽  
pp. 205-218
Author(s):  
Constantine S. Mitsiades ◽  
Nicholas Mitsiades ◽  
Teru Hideshima ◽  
Paul G. Richardson ◽  
Kenneth C. Anderson

The ubiquitin–proteasome pathway is a principal intracellular mechanism for controlled protein degradation and has recently emerged as an attractive target for anticancer therapies, because of the pleiotropic cell-cycle regulators and modulators of apoptosis that are controlled by proteasome function. In this chapter, we review the current state of the field of proteasome inhibitors and their prototypic member, bortezomib, which was recently approved by the U.S. Food and Drug Administration for the treatment of advanced multiple myeloma. Particular emphasis is placed on the pre-clinical research data that became the basis for eventual clinical applications of proteasome inhibitors, an overview of the clinical development of this exciting drug class in multiple myeloma, and an appraisal of possible uses in other haematological malignancies, such as non-Hodgkin's lymphomas.


1995 ◽  
Vol 38 (5) ◽  
pp. 1126-1142 ◽  
Author(s):  
Jeffrey W. Gilger

This paper is an introduction to behavioral genetics for researchers and practitioners in language development and disorders. The specific aims are to illustrate some essential concepts and to show how behavioral genetic research can be applied to the language sciences. Past genetic research on language-related traits has tended to focus on simple etiology (i.e., the heritability or familiality of language skills). The current state of the art, however, suggests that great promise lies in addressing more complex questions through behavioral genetic paradigms. In terms of future goals, it is suggested that: (a) more behavioral genetic work of all types should be done—including replications and expansions of preliminary studies already in print; (b) work should focus on fine-grained, theory-based phenotypes with research designs that can address complex questions in language development; and (c) work in this area should utilize a variety of samples and methods (e.g., twin and family samples, heritability and segregation analyses, linkage and association tests, etc.).
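As one concrete example of the "simple etiology" estimates mentioned above, heritability is classically estimated from twin correlations via Falconer's formula, $h^2 = 2(r_{MZ} - r_{DZ})$. The Python sketch below applies this textbook decomposition to hypothetical correlations; the numbers and function name are illustrative and not drawn from the paper.

```python
def falconer_heritability(r_mz, r_dz):
    """Falconer's formula: decompose trait variance from twin correlations.
    h2 = 2 * (r_MZ - r_DZ)   broad heritability estimate
    c2 = 2 * r_DZ - r_MZ     shared-environment estimate
    e2 = 1 - r_MZ            nonshared environment (plus measurement error)
    """
    h2 = 2.0 * (r_mz - r_dz)
    c2 = 2.0 * r_dz - r_mz
    e2 = 1.0 - r_mz
    return h2, c2, e2

# Hypothetical: a language score correlating .75 in MZ and .45 in DZ pairs.
print(falconer_heritability(0.75, 0.45))  # (0.6, 0.15, 0.25), summing to 1.0
```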


2014 ◽  
Vol 19 (2) ◽  
pp. 11-15
Author(s):  
Steven L. Demeter

Abstract The fourth, fifth, and sixth editions of the AMA Guides to the Evaluation of Permanent Impairment (AMA Guides) use left ventricular hypertrophy (LVH) as a variable to determine impairment caused by hypertensive disease. The issue of LVH, as assessed echocardiographically, is a prime example of medical science being at odds with legal jurisprudence. Some legislatures have allowed any cause of LVH in a hypertensive individual to be an allowed manifestation of hypertensive changes. This situation has arisen because a physician can never state that no component of the LVH was caused by the hypertension, even in an individual with a cardiomyopathy or valvular disorder. This article recommends that evaluators consider three points: if the cause of the LVH is hypertension, is the examinee at maximum medical improvement; is the LVH caused by hypertension or another factor; and, if apportionment is allowed, a careful analysis of the risk factors for other disorders associated with LVH is necessary. The left ventricular mass index should be present in the echocardiogram report and can guide the interpretation of the alleged LVH; if it is not present, it should be requested because it facilitates a more accurate analysis. Further, if the cause of the LVH is more likely independent of the hypertension, careful reasoning and an explanation should be included in the impairment report. If hypertension is only a partial cause, a reasoned analysis and clear explanation of the apportionment are required.
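For readers unfamiliar with the metric, the left ventricular mass index is conventionally derived from linear echocardiographic measurements via the ASE-corrected Devereux formula and indexed to body surface area. The Python sketch below shows that standard calculation with hypothetical inputs; the formula and thresholds come from general echocardiography practice, not from this article.

```python
import math

def lv_mass_devereux(ivsd_cm, lvidd_cm, pwtd_cm):
    """ASE-corrected Devereux formula: LV mass in grams from end-diastolic
    septal thickness, internal diameter, and posterior wall thickness (cm)."""
    return 0.8 * 1.04 * ((ivsd_cm + lvidd_cm + pwtd_cm) ** 3 - lvidd_cm ** 3) + 0.6

def lv_mass_index(lv_mass_g, height_cm, weight_kg):
    """Index LV mass to body surface area (Mosteller formula)."""
    bsa_m2 = math.sqrt(height_cm * weight_kg / 3600.0)
    return lv_mass_g / bsa_m2

# Hypothetical examinee: IVSd 1.3 cm, LVIDd 5.0 cm, PWTd 1.2 cm, 175 cm, 90 kg.
mass = lv_mass_devereux(1.3, 5.0, 1.2)
lvmi = lv_mass_index(mass, 175, 90)
print(round(mass), round(lvmi))  # ~248 g, ~118 g/m^2
# ASE guidelines flag LVH above roughly 95 g/m^2 (women) or 115 g/m^2 (men).
```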


VASA ◽  
2019 ◽  
Vol 48 (1) ◽  
pp. 35-46
Author(s):  
Stephen Hofmeister ◽  
Matthew B. Thomas ◽  
Joseph Paulisin ◽  
Nicolas J. Mouawad

Abstract. The management of vascular emergencies is dependent on rapid identification and confirmation of the diagnosis with concurrent patient stabilization prior to immediate transfer to the operating suite. A variety of technological advances in diagnostic imaging as well as the advent of minimally invasive endovascular interventions have shifted the contemporary treatment algorithms of such pathologies. This review provides a comprehensive discussion on the current state and future trends in the management of ruptured abdominal aortic aneurysms as well as acute aortic dissections.


2016 ◽  
Vol 21 (1) ◽  
pp. 55-64 ◽  
Author(s):  
Silvia Convento ◽  
Cristina Russo ◽  
Luca Zigiotto ◽  
Nadia Bolognini

Abstract. Cognitive rehabilitation is an important area of neurological rehabilitation, which aims at the treatment of cognitive disorders due to acquired brain damage of different etiology, including stroke. Although the importance of cognitive rehabilitation for stroke survivors is well recognized, available cognitive treatments for neuropsychological disorders, such as spatial neglect, hemianopia, apraxia, and working memory deficits, are overall still unsatisfactory. The growing body of evidence supporting the potential of transcranial Electrical Stimulation (tES) as a tool for interacting with neuroplasticity in the human brain, and in turn for enhancing perceptual and cognitive functions, has obvious implications for the translation of this noninvasive brain stimulation technique into clinical settings, in particular for the development of tES as an adjuvant tool for cognitive rehabilitation. The present review aims at presenting the current state of the art concerning the use of tES for the improvement of post-stroke visual and cognitive deficits (except for aphasia and memory disorders), showing the therapeutic promise of this technique and offering some suggestions for the design of future clinical trials. Although this line of research is still in its infancy compared with the progress made in the last years in other neurorehabilitation domains, current findings appear very encouraging, supporting the development of tES for the treatment of post-stroke cognitive impairments.


2002 ◽  
Vol 47 (1) ◽  
pp. 8-11
Author(s):  
Virginia M. Shiller
