Recording Epigraphic Sources as Part of Artworks

Author(s):  
Gabriele Pieke

Art history has its own demands for recording visual representations. Objectivity and authenticity are the twin pillars of recording artistic data. As such, techniques relevant to epigraphic study, such as making line drawings, may not always be the best approach to an art historical study. Such a study addresses, for example, questions about the natural context and materiality of the artwork; the semantic, syntactic, and chronological relations between image and text; work procedures, work zones, and workshop traditions; and interactions with formal structures and beholders. Issues critical to collecting data for an art historical analysis include recording all relevant information without overcrowding the data set, creating neutral (i.e., not subjective) photographic images, collecting accurate color data, and, most critically, studying the original artwork empirically at first hand. A call for greater communication in Egyptology between epigraphy/palaeography and art history is reinforced by drawing attention to images as tools of communication and to the close connection between the written word and figural art in ancient Egypt.

2021 ◽  
pp. 016555152110184
Author(s):  
Gunjan Chandwani ◽  
Anil Ahlawat ◽  
Gaurav Dubey

Document retrieval plays an important role in knowledge management, as it helps users discover relevant information in existing data. This article proposes a cluster-based inverted indexing algorithm for document retrieval. First, pre-processing removes unnecessary and redundant words from the documents. The documents are then indexed by the cluster-based inverted indexing algorithm, which integrates the piecewise fuzzy C-means (piFCM) clustering algorithm with inverted indexing. Once the index is built, user queries are matched against it using the Bhattacharyya distance. Finally, query optimisation is performed with the Pearson correlation coefficient, and the relevant documents are retrieved. The performance of the proposed algorithm is analysed on the WebKB and Twenty Newsgroups data sets. The analysis shows that the proposed algorithm offers high performance, with a precision of 1, a recall of 0.70, and an F-measure of 0.8235. The proposed document retrieval system retrieves the most relevant documents and speeds up the storing and retrieval of information.
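
The cluster-then-index pipeline is easier to see in code. The sketch below is only a rough illustration under simplifying assumptions: TF-IDF vectors stand in for the paper's pre-processing, ordinary k-means stands in for piFCM, the Pearson-based query optimisation step is omitted, and the documents and query are invented examples.

```python
import numpy as np
from collections import defaultdict
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

docs = ["fuzzy clustering of documents", "inverted index for document retrieval",
        "retrieval with document clustering", "unrelated sports news report"]

# Pre-processing and vectorisation (stop-word removal stands in for the cleaning step).
vec = TfidfVectorizer(stop_words="english")
X = vec.fit_transform(docs).toarray()
terms = vec.get_feature_names_out()

# Cluster the documents (k-means as a stand-in for piFCM), then build one inverted index per cluster.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
index = defaultdict(lambda: defaultdict(list))   # cluster id -> term -> [doc ids]
for d, row in enumerate(X):
    for t in row.nonzero()[0]:
        index[km.labels_[d]][terms[t]].append(d)

# Query matching: route the query to the closest cluster, gather candidates from its inverted
# index, and rank them by the Bhattacharyya coefficient between normalised term distributions
# (the Bhattacharyya distance is the negative log of this coefficient; higher coefficient = closer).
def bhattacharyya(p, q):
    p, q = p / (p.sum() + 1e-12), q / (q.sum() + 1e-12)
    return float(np.sum(np.sqrt(p * q)))

query = vec.transform(["document clustering retrieval"]).toarray()[0]
best = int(np.argmin(np.linalg.norm(km.cluster_centers_ - query, axis=1)))
candidates = {d for t in terms[query.nonzero()[0]] for d in index[best].get(t, [])}
ranked = sorted(candidates, key=lambda d: bhattacharyya(query, X[d]), reverse=True)
print([docs[d] for d in ranked])
```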


2021 ◽  
Vol 12 (1) ◽  
Author(s):  
◽  
Elmar Kotter ◽  
Luis Marti-Bonmati ◽  
Adrian P. Brady ◽  
Nandita M. Desouza

Blockchain can be thought of as a distributed database that allows tracing the origin of data and identifying who has manipulated a given data set in the past. Medical applications of blockchain technology are emerging. Blockchain has many potential applications in medical imaging, typically making use of the tracking of radiological or clinical data. Clinical applications of blockchain technology include documenting the contributions of different "authors", including AI algorithms, to multipart reports; documenting the use of AI algorithms towards the diagnosis; enhancing the accessibility of relevant information in electronic medical records; and giving users better control over their personal health records. Applications of blockchain in research include better traceability of image data within clinical trials, better traceability of the contributions of image and annotation data to the training of AI algorithms (thus enhancing privacy and fairness), and the potential to make imaging data for AI available in larger quantities. Blockchain also allows for dynamic consenting and has the potential to empower patients by giving them better control over who has accessed their health data. There are also many potential applications of blockchain technology for administrative purposes, such as keeping track of learning achievements or the surveillance of medical devices. This article gives a brief introduction to the basic technology and terminology of blockchain and concentrates on its potential applications in medical imaging.
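
The provenance-tracing idea at the core of the abstract can be illustrated with a toy hash chain. This is a minimal sketch, not any specific medical blockchain platform; the events and identifiers are invented for illustration, and a real deployment would add distribution and consensus across nodes.

```python
import hashlib, json, time

def make_block(record, prev_hash):
    """Append a block whose hash covers the record, the previous hash, and a timestamp."""
    block = {"record": record, "prev_hash": prev_hash, "timestamp": time.time()}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

chain = [make_block({"event": "genesis"}, "0" * 64)]
chain.append(make_block({"event": "CT study acquired", "accessed_by": "scanner-01"},
                        chain[-1]["hash"]))
chain.append(make_block({"event": "AI algorithm v1.2 produced findings"},
                        chain[-1]["hash"]))

def verify(chain):
    """Recompute every block hash and check each link to the previous block."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["hash"] != hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest():
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

print(verify(chain))   # True; altering any recorded event afterwards makes this False
```

Because each block stores the hash of its predecessor, retroactively editing any earlier record (for example, who accessed a study) breaks verification, which is what makes such an audit trail tamper-evident.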


2004 ◽  
Vol 26 (1) ◽  
pp. 65-80 ◽  
Author(s):  
Martin Reuss

The resolution of international water disputes demands historical analysis. Too often, this analysis is supplied not by professional historians but by policymakers, engineers, and others who may lack the required knowledge and skills. The result inhibits rather than advances sound policy. Fortunately, historians are gaining greater recognition for what they bring to the conference table. The United Nations Educational, Scientific and Cultural Organization (UNESCO), which the United States recently rejoined, is attempting to further sound historical study; and the recently formed International Water History Association (IWHA) provides a forum focused on the history of global water issues. These developments afford historians new and important means to make a difference in resolving some of the most pressing international resource issues.


2020 ◽  
pp. 189-209
Author(s):  
Nataliia Voitovych

The aim of the research is to study the historical preconditions and legal regulation of surveillance in combating crime in the XIX century. At the same time, the author's goal is to compare the peculiarities of the instruments of the systematic fight against crime (the method of operational search actions, hereinafter OSA) and of covert investigative activities in countries with different forms of government and diverse political systems.

The methodology of the research: adherence to the principles of objectivity, scientificity, and historicism contributed to the consistent disclosure of the preconditions, content, and principles of surveillance as a measure and a method of OSA and covert investigative activities in combating and preventing crime. The mutual enrichment of historical and legal methods ensured the systematic character of the research. The historical study of surveillance, combined with the study of regulatory legal acts, created new opportunities for interdisciplinary research. The application of general scientific methods, namely systematization, generalization, and the problem-chronological, comparative-historical, and historical-legal methods, made it possible to trace the influence of the legal component on the history of the introduction and development of surveillance in the "long" XIX century and the peculiarities of its usage in the newly formed states and political systems of the interwar period.

The scientific novelty lies in a detailed historical and legal analysis of the content of regulatory legal acts concerning the legal grounds for surveillance, and in a comprehensive study of its content, gaps, and peculiarities of usage under non-democratic political regimes.

Conclusions. The article provides a historical analysis of the evolution and usage of surveillance in preventing crimes, which has passed through several stages connected with improving the performance of security functions. Attention is focused on the most characteristic features of implementing surveillance as a universal measure of obtaining information and distributing tasks among the state's law enforcement agencies, and as a means of combating representatives of political forces and structures constituting a real or hypothetical threat to the state or regime. The similarity of the functions performed by law enforcement agencies (and the role of surveillance within them) under different state formations, despite fundamental differences in forms of government and the nature of political systems, is proved.


2019 ◽  
Author(s):  
Lisa Schilhan ◽  
Christian Kaier

In times of ever-increasing information overload, Academic Search Engine Optimization (ASEO) supports the findability of relevant information and contributes to the FAIR principles. It enhances efficiency in literature and data search and therefore plays an increasing role in the research lifecycle. ASEO is an important aspect to consider when preparing a scientific manuscript for publication. Authors can increase the visibility of their papers in library catalogues, databases, repositories and search engines with simple measures such as choosing informative author keywords. The more (meta-)data these search algorithms can use, the higher the probability that a data set or paper will show up in a result list. ASEO enables search algorithms and readers to quickly and unambiguously identify relevant content, thus also helping institutions to increase research visibility. In addition, authors and publishers share an interest in describing content in a way that makes it easy to find. Librarians, with their extensive knowledge and wealth of experience in literature research and metadata management, such as keyword assignment, can provide valuable advice on the importance of metadata that is as correct and complete as possible and on suitable keywords for search algorithms. For this reason, the Publication Services at Graz University Library have recently started offering training and workshops for authors. The presentation will provide an introduction to strategies for enhancing the visibility and findability of online content, such as research articles, with some theoretical background as well as practical examples.
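
As a purely illustrative example of the machine-readable metadata such search algorithms rely on, the snippet below renders a bibliographic record as scholarly meta tags for a repository landing page. The tag names follow conventions widely used for scholarly indexing, but the exact set a given catalogue, repository, or search engine honours should be checked against its own documentation; all values are placeholders.

```python
# Placeholder bibliographic record; every value here is an invented example.
paper = {
    "citation_title": "An Example Article Title",
    "citation_author": ["Doe, Jane", "Roe, Richard"],
    "citation_publication_date": "2019",
    "citation_keywords": "academic search engine optimization; findability; metadata",
}

def meta_tags(record):
    """Render the record as HTML <meta> tags for a landing page's <head> section."""
    tags = []
    for name, value in record.items():
        # Repeat the tag for multi-valued fields such as authors.
        for v in (value if isinstance(value, list) else [value]):
            tags.append(f'<meta name="{name}" content="{v}">')
    return "\n".join(tags)

print(meta_tags(paper))
```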


2021 ◽  
Author(s):  
Kevin Bellinguer ◽  
Robin Girard ◽  
Guillaume Bontron ◽  
Georges Kariniotakis

In recent years, the share of photovoltaic (PV) power in Europe has grown: the installed capacity increased from around 10 GW in 2008 to nearly 119 GW in 2018 [1]. Due to the intermittent nature of PV generation, new challenges arise regarding economic profitability and the safe operation of the power network. To overcome these issues, a special effort is being made to develop efficient PV generation forecasting tools.

For short-term PV production forecasting, past production observations are typically the main drivers. In addition, spatio-temporal (ST) inputs such as Satellite-Derived Surface Irradiance (SDSI) provide relevant information about the weather situation in the vicinity of the plant. Moreover, the literature shows that Numerical Weather Predictions (NWPs) provide relevant information about weather trends.

NWPs can be integrated into the forecasting process in two different ways. The most straightforward approach treats NWPs as explanatory input variables to the forecasting models, so that the atmospheric dynamics are carried directly by the NWPs. The alternative treats NWPs as state variables: weather information is used to filter the training data set and obtain a coherent subset of PV production observations measured under weather conditions similar to those of the PV production to be predicted. This approach is based on analog methods, so the weather dynamics are implicitly contained in the selected PV production observations. This conditioned learning approach allows local regressions and is adaptive in the sense that model training is conditioned on the prevailing weather situation (a code sketch of this conditioning idea is given after the references).

The specialised literature focuses on spot NWPs, which makes it possible to find situations that evolve in the same way but does not preserve ST patterns; in this context, additional SDSI features cannot take full advantage of the conditioning process. Ref. [3] proposes using geopotential fields, which are wind drivers, as analog predictors.

In this work, we propose the following contributions to the state of the art:

We investigate the influence of spot NWPs on the performance of an auto-regressive (AR) model and a random forest model under the two above-mentioned approaches: either as additional explanatory features and/or as analog features. The analogy score proposed by [2] is used to find similar weather situations, and the model is then trained on the associated PV production observations. The results highlight that the linear model performs better with the conditioned approach, while the non-linear model performs better when fed with explanatory features.

Then, the similarity score is extended to gridded NWP data through a principal component analysis. This method conditions the learning on large-scale weather information. A comparison between the spot and gridded NWP conditioned approaches applied with the AR model highlights that gridded NWPs improve the contribution of SDSI to forecasting performance.

The proposed approaches are evaluated on 9 PV plants in France over a testing period of 12 months.

References

[1] IRENA - https://www.irena.org/Statistics/Download-Data

[2] Alessandrini, Delle Monache, et al. An analog ensemble for short-term probabilistic solar power forecast. Applied Energy, 2015. https://doi.org/10.1016/j.apenergy.2015.08.011

[3] Bellinguer, Girard, Bontron, Kariniotakis. Short-term Forecasting of Photovoltaic Generation based on Conditioned Learning of Geopotential Fields. 2020, UPEC. https://doi.org/10.1109/UPEC49904.2020.9209858
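
To make the conditioned-learning idea concrete, here is a minimal sketch of analog-based local regression, under simplifying assumptions: a plain Euclidean distance in NWP feature space stands in for the analogy score of [2], a linear model stands in for the AR and random forest models, and all array names are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def analog_conditioned_forecast(nwp_hist, pv_lagged_hist, pv_target_hist,
                                nwp_now, pv_lagged_now, n_analogs=200):
    """Forecast PV production by training a local model on weather analogs.

    nwp_hist       : (n_samples, n_nwp_features) historical NWP features
    pv_lagged_hist : (n_samples, n_lags) past PV observations used as model inputs
    pv_target_hist : (n_samples,) PV production values to be predicted (training targets)
    nwp_now        : (n_nwp_features,) NWP features for the forecast time
    pv_lagged_now  : (n_lags,) latest PV observations for the forecast time
    """
    # Euclidean distance in NWP space stands in for the analogy score of [2].
    dist = np.linalg.norm(nwp_hist - nwp_now, axis=1)
    analogs = np.argsort(dist)[:n_analogs]          # most similar past weather situations

    # Local (conditioned) regression trained only on the analog subset.
    model = LinearRegression()
    model.fit(pv_lagged_hist[analogs], pv_target_hist[analogs])
    return model.predict(pv_lagged_now.reshape(1, -1))[0]
```

The design point is that the weather information never enters the regression directly: it only selects which past PV observations the local model is trained on.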


2021 ◽  
pp. 288-294
Author(s):  
Paul Wormeli ◽  
Jenna Mazreku ◽  
Jeremy Pine ◽  
Mark Damesyn

For central cancer registries to become a more significant public health resource, they must evolve to capture more timely, accurate, and extensive data. Key stakeholders have called for faster delivery of work products, data extensions such as social determinants of health, and more relevant information for cancer control programs at the local level. The proposed model consists of near real-time reporting stages that replace the current time- and labor-intensive effort to populate a complete cancer case abstract on the basis of the 12- and 24-month data submission timelines. The first stage collects a cancer diagnosis minimum data set sufficient to describe population incidence and prevalence; a second stage then captures subsequent case updates and treatment data. A third stage procures targeted information in response to the needs of identified research projects. The model also provides for further supplemental reports as may be defined to gather additional data. All stages leverage the widespread adoption of electronic health records and the many emerging standards for data content, including national healthcare policies and technical standards for interoperability such as the Fast Healthcare Interoperability Resources (FHIR) specification, to automate and accelerate reporting to central cancer registries. The emergence of application programming interfaces that allow for greater interoperability among systems would also be leveraged, leading to more efficient information sharing. Adopting this model will expedite cancer data availability to improve cancer control while supporting data integrity and flexibility in data items. It presents a long-term and feasible solution that addresses the extensive burden and unsustainable manual data collection requirements placed on Certified Tumor Registrars at disease reporting entities nationally.
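
As a rough illustration of what an automated FHIR-based first stage might look like, the sketch below posts a minimum diagnosis record to a hypothetical registry endpoint. It is not the model's actual reporting specification; the endpoint, patient reference, and diagnosis code are placeholders chosen only for illustration.

```python
import requests  # assumed available; the endpoint below is hypothetical

FHIR_BASE = "https://registry.example.org/fhir"   # placeholder central-registry endpoint

# Stage 1: a minimum data set for a new diagnosis, expressed as a FHIR Condition resource.
condition = {
    "resourceType": "Condition",
    "subject": {"reference": "Patient/example"},            # link to the reported patient
    "code": {"coding": [{"system": "http://snomed.info/sct",
                         "code": "254837009",               # example SNOMED CT code
                         "display": "Malignant tumor of breast"}]},
    "onsetDateTime": "2021-03-15",
}

resp = requests.post(f"{FHIR_BASE}/Condition", json=condition,
                     headers={"Content-Type": "application/fhir+json"})
print(resp.status_code)

# Stage 2 (later updates) would submit additional resources, e.g. treatment details,
# referencing the same patient, rather than re-abstracting the full case.
```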


Author(s):  
John Elderfield

This chapter presents the text of a lecture on the role of the visual medium in art-historical study. It addresses the relationship of art history to the existential acts of painting and looking at painting and describes how the so-called story of modern art has been narrated in the history literature. It also considers how modern histories can accommodate the unfamiliar that is normally part of the story.


Author(s):  
Oana I. Craciunescu ◽  
Shiva K. Das ◽  
Terrence Z. Wong ◽  
Thaddeus V. Samulski

Thermal modeling for breast hyperthermia patients can provide relevant information for better understanding the temperatures achieved during treatment. However, the human breast is highly perfused, making knowledge of the perfusion crucial to the accuracy of the temperature computations. It has been shown that the perfusion of blood in tumor tissue can be approximated using the relative perfusion index (RPI) determined from dynamic contrast-enhanced magnetic resonance imaging (DE-MRI). It was also concluded that the 3D reconstruction of tumor perfusion can be performed using fractal interpolation functions (FIF); the technique used was called piecewise hidden variable fractal interpolation (PHVFI). Changes in the protocol parameters for the dynamic MRI sequences in breast patients allowed us to acquire more spatial slices and hence to verify the accuracy of the fractal interpolation. The interpolated slices were compared with the imaged slices in the original set. The accuracy of the interpolation was tested on a post-hyperthermia treatment data set. The difference between the reconstruction and the original slice varied from 2 to 5%. Significantly, the fractal dimension of the interpolated slices is within 2–3% of that of the original images, thus preserving the fractality of the perfusion maps. Such a method becomes crucial when tumor size and imaging restrictions limit the number of spatial slices, requiring interpolation to fill in the data between the slices.
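
To illustrate the underlying idea, here is a minimal sketch of a basic one-dimensional fractal interpolation function built from an iterated function system. It is not the piecewise hidden variable (PHVFI) scheme used in the study, and the sample data and vertical scaling factors are arbitrary.

```python
import numpy as np

def fractal_interpolation(x, y, d, n_iter=6):
    """Basic 1D fractal interpolation through the data points (x_i, y_i).

    d: vertical scaling factors (|d_i| < 1), one per interval; they control the
    fractal roughness between data points. Returns a dense set of points on the
    attractor of the associated iterated function system (IFS).
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x) - 1
    # Affine maps w_i(t, z) = (a_i t + e_i, c_i t + d_i z + f_i) mapping the whole
    # data span onto the i-th subinterval while matching the interval endpoints.
    a = (x[1:] - x[:-1]) / (x[-1] - x[0])
    e = (x[-1] * x[:-1] - x[0] * x[1:]) / (x[-1] - x[0])
    c = (y[1:] - y[:-1] - d * (y[-1] - y[0])) / (x[-1] - x[0])
    f = (x[-1] * y[:-1] - x[0] * y[1:] - d * (x[-1] * y[0] - x[0] * y[-1])) / (x[-1] - x[0])

    pts = np.column_stack([x, y])
    for _ in range(n_iter):                      # iterate the IFS on the point set
        pts = np.vstack([np.column_stack([a[i] * pts[:, 0] + e[i],
                                          c[i] * pts[:, 0] + d[i] * pts[:, 1] + f[i]])
                         for i in range(n)])
    return pts[np.argsort(pts[:, 0])]

# Example: refine a coarse (made-up) perfusion profile with mild fractal roughness.
curve = fractal_interpolation([0, 1, 2, 3], [0.0, 0.8, 0.3, 0.6], d=np.array([0.3, 0.3, 0.3]))
```

The vertical scaling factors d are the roughness knob: d = 0 reduces the construction to piecewise-linear interpolation, while magnitudes closer to 1 give a more irregular curve with higher fractal dimension.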


1985 ◽  
Vol 10 (4) ◽  
pp. 37-45
Author(s):  
Iain Gordon Brown

This paper was read at the Planning Conference for the projected Artists' Papers Index, held at the British Library in September 1985. Dr. Brown discusses some problems inherent in the definition of such an index or register of artists' papers. The author, who is responsible for manuscripts and archives relating to artists and art history in the National Library of Scotland, goes on to outline the resources for art-historical study to be found in one large general manuscript collection: a major collection that is part of an institution with a long-established and very active acquisitions policy in this field.
