Leveraging KOS to Extend Our Reach with Automated Processes

Author(s):  
Chris Oliver
2020 ◽  
Vol 99 (4) ◽  
pp. 344-350
Author(s):  
Evgeny V. Zibarev ◽  
A. S. Afanasev ◽  
O. V. Slusareva ◽  
T. I. Muragimov ◽  
V. A. Stepanets ◽  
...  

In recent years, the Russian Federation has seen rising levels of radiofrequency electromagnetic fields in residential areas, driven in part by the growing number of cellular base stations (BS). The purpose of sanitary and epidemiological surveillance at the placement and commissioning stages of base stations is to prevent their adverse effects on public health. The growing number of base stations, together with the advent of new electronic equipment and antennas, creates opportunities to improve how stations are registered at the placement stage and how radiofrequency electromagnetic field levels are monitored during operation. One such automation tool is a geo-information portal supporting sanitary and epidemiological surveillance of cellular base stations. The prototype portal can calculate the size of sanitary protection zones (SPZ) and building restriction zones (RZ) around a BS online, display the results of the calculations graphically, and issue sanitary and epidemiological conclusions for the placement and operation of base stations. The portal can synchronize with data from the radio frequency center, so the Federal Service for Surveillance on Consumer Rights Protection and Human Wellbeing will be able to receive up-to-date analytical data, and the processes of collecting, processing, and storing information on BS will be fully automated.
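
The abstract does not detail the portal's calculation method, but the core of an online SPZ estimate is a far-field power-density calculation. A minimal sketch, assuming an isotropic far-field model and a 10 µW/cm² (0.1 W/m²) residential exposure limit; the function name and example parameters are illustrative, not the portal's actual method:

```python
import math

def spz_radius_m(tx_power_w: float, antenna_gain: float,
                 limit_w_m2: float = 0.1) -> float:
    """Distance at which far-field power density falls to the exposure limit.

    Far-field estimate: S = P * G / (4 * pi * r^2), solved for r.
    0.1 W/m^2 corresponds to a 10 uW/cm^2 residential limit (an assumption
    here; the portal's actual methodology is not published in the abstract).
    """
    return math.sqrt(tx_power_w * antenna_gain / (4 * math.pi * limit_w_m2))

# Example: a 40 W sector antenna with linear gain 63 (~18 dBi)
print(f"SPZ radius \u2248 {spz_radius_m(40.0, 63.0):.1f} m")
```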


2010 ◽  
Vol 28 (4) ◽  
pp. 435-443 ◽  
Author(s):  
Florian Loffing ◽  
Norbert Hagemann ◽  
Bernd Strauss

Author(s):  
Glenn Vorhes ◽  
Ernest Perry ◽  
Soyoung Ahn

Truck parking is a crucial element of the United States' transportation system, providing truckers with safe places to rest and stage for deliveries. Demand for truck parking spaces exceeds supply, and shortages are especially common in and around urban areas. Freight operations suffer when truck drivers are unable to park in logistically ideal locations, and drivers may resort to unsafe practices such as parking on ramps or in abandoned lots. This report examines the potential of vacant urban parcels to provide parking by establishing a methodology to identify such parcels and assessing whether they are suitable for truck parking. Previous research has demonstrated that affordable, accessible parcels are available to accommodate truck parking; used in conjunction with other policies, adaptation of urban sites could help reduce the severity of truck parking shortages. Geographic information system parcel and roadway data were obtained for one urban area in each of the 10 Mid America Association of Transportation Officials region states. Area and proximity filters were applied, followed by spectral analysis of satellite imagery, to identify candidate parcels for truck parking facilities within urban areas. The automated processes produced a ranked short list of potential parcels from which those best suited for truck parking could be efficiently identified for inspection by satellite imagery. The result was a manageable number of parcels to be evaluated further against local knowledge metrics such as availability and cost, existing infrastructure and municipal connections, and safety.
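
As a rough illustration of the area and proximity filters, the following sketch uses GeoPandas with hypothetical layer names, thresholds, and CRS; the paper's actual filter values and ranking weights are not given in the abstract:

```python
import geopandas as gpd

# Hypothetical file names; any parcel and roadway layers in a projected
# CRS with units of meters would do (EPSG:26916 is just an example).
parcels = gpd.read_file("urban_parcels.shp").to_crs(epsg=26916)
highways = gpd.read_file("highways.shp").to_crs(epsg=26916)

MIN_AREA_M2 = 8_000   # assumed minimum footprint for a truck lot
MAX_DIST_M = 800      # assumed maximum distance to a highway

# Area filter: parcels large enough to stage several trucks.
candidates = parcels[parcels.geometry.area >= MIN_AREA_M2].copy()

# Proximity filter: distance from each parcel to the merged highway network.
hw = highways.geometry.unary_union
candidates["dist_m"] = candidates.geometry.distance(hw)
candidates = candidates[candidates["dist_m"] <= MAX_DIST_M]

# Rank the short list: bigger and closer scores higher; inspection of
# satellite imagery (the paper's spectral step) would follow.
candidates["score"] = (candidates.geometry.area / MIN_AREA_M2
                       - candidates["dist_m"] / MAX_DIST_M)
print(candidates.sort_values("score", ascending=False).head(10))
```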


Author(s):  
Judith Ponnewitz ◽  
Hans-Joachim Bargstaedt

Getting a building permit is a lengthy process involving a series of review and verification phases by the consultants and by the authorities and their agents. Today, these work processes are governed to a large degree by individual work practices.

To facilitate a BIM-based building permit application, which uses the model and its data as its sole base of information, we analyzed traditional processes in the phase of issuing a building permit. This allows the steps of designing a building to be restructured according to all required criteria and, step by step, remodeled for the application of automated processes.

Facilitating authorization processes will lead to checking machines that will already be applied by the consultants. Nevertheless, authorities need a secure way to evaluate the quality of the specific design in every regard.

For this purpose, we show how to combine different algorithms to check the quality criteria for a building permit. There are qualitative criteria, quantitative boundaries, and also some nice-to-have items which can be compensated by alternative measures.
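
As a toy illustration of combining hard quantitative limits, qualitative criteria, and compensable nice-to-have items, here is a minimal rule-checking sketch; all check names, thresholds, and the model structure are invented for the example and are not the authors' algorithms:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Check:
    name: str
    test: Callable[[dict], bool]
    compensable: bool = False   # nice-to-have: a failure may be offset

def run_checks(model: dict, checks: list[Check]) -> str:
    """Apply all checks; hard failures reject, soft failures ask for compensation."""
    hard_fail, soft_fail = [], []
    for c in checks:
        if not c.test(model):
            (soft_fail if c.compensable else hard_fail).append(c.name)
    if hard_fail:
        return f"REJECTED: {', '.join(hard_fail)}"
    if soft_fail:
        return f"NEEDS COMPENSATION: {', '.join(soft_fail)}"
    return "PERMIT CRITERIA MET"

checks = [
    Check("max building height <= 22 m", lambda m: m["height_m"] <= 22),
    Check("fire escape route present",   lambda m: m["has_escape_route"]),
    Check("bicycle parking provided",    lambda m: m["bike_spaces"] >= 10,
          compensable=True),
]
print(run_checks({"height_m": 19.5, "has_escape_route": True,
                  "bike_spaces": 4}, checks))
```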


2018 ◽  
Vol 7 (10) ◽  
pp. 392 ◽  
Author(s):  
Giuseppe Masetti ◽  
Tyanne Faulkes ◽  
Christos Kastrisios

Timely and accurate identification of changes to areas depicted on nautical charts constitutes a key task for marine cartographic agencies in supporting maritime safety. Such a task is usually achieved through manual or semi-automated processes, based on best practices developed over the years, that require a substantial level of human commitment (e.g., visually comparing the chart with newly collected data or analyzing intermediate products). This work describes an algorithm that aims to largely automate the change identification process and to reduce its subjective component. From a set of depth points selectively derived from a nautical chart, a triangulated irregular network is created, and a preliminary tilted-triangle test is applied to all input survey soundings. Given the complexity of a modern nautical chart, a set of feature-specific, point-in-polygon tests are then performed. As output, the algorithm provides danger-to-navigation candidates, chart discrepancies, and a subset of features that require human evaluation. The algorithm has been successfully tested with real-world electronic navigational charts and survey datasets. In parallel with the research development, a prototype application implementing the algorithm was created and made publicly available.
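
A minimal sketch of the TIN-based comparison step follows, using SciPy's Delaunay triangulation; the toy coordinates, the 1 m tolerance, and the omission of the feature-specific point-in-polygon tests are all simplifications of the published algorithm:

```python
import numpy as np
from scipy.spatial import Delaunay

# Toy data: chart soundings (x, y, depth) and new survey soundings.
# Real inputs would be depths extracted from an ENC and a survey dataset.
chart = np.array([[0, 0, 10.0], [100, 0, 12.0], [0, 100, 11.0], [100, 100, 9.0]])
survey = np.array([[40, 40, 7.5], [60, 70, 11.2]])

tin = Delaunay(chart[:, :2])                 # TIN over the chart depth points
simplex = tin.find_simplex(survey[:, :2])    # triangle containing each sounding

for pt, s in zip(survey, simplex):
    if s < 0:
        continue                             # outside the TIN: needs human review
    # Barycentric coordinates of the sounding within its (tilted) triangle
    b = tin.transform[s, :2] @ (pt[:2] - tin.transform[s, 2])
    bary = np.append(b, 1.0 - b.sum())
    chart_depth = bary @ chart[tin.simplices[s], 2]
    # A sounding shoaler than the interpolated chart surface by more than a
    # tolerance is a danger-to-navigation candidate.
    if pt[2] < chart_depth - 1.0:            # 1 m tolerance, assumed
        print(f"DtoN candidate at {pt[:2]}: {pt[2]} m vs charted {chart_depth:.1f} m")
```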


Author(s):  
Jhonny Rodrigues ◽  
Paulo Reinier Gonçalves ◽  
Luís Miguel Pina ◽  
Fernando Gomes de Almeida

As the use of composite materials increases, the search for suitable automated processes gains relevance: automation guarantees production quality by ensuring process uniformity, minimizes the amount of scrap generated, and reduces time and energy consumption. Traditional means of production such as hand lay-up, vacuum bagging, and in-autoclave methods tend to become less efficient as the size and shape complexity of the part being produced increase, motivating the search for alternative processes such as Automated Tape Laying (ATL). This work describes the modelling and simulation of a composite ATL process with in situ consolidation by characterizing the machine elements, using the finite difference method in conjunction with energy balances, in order to create a digital twin of the process for subsequent control design. The implemented modelling approach is able to follow the process dynamics when changes to the heating element are imposed, and to predict the temperature response of the composite material, making it suitable to serve as a digital twin of a production process using an ATL machine.
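
To illustrate the finite-difference, energy-balance style of model, here is a minimal one-dimensional explicit heat-conduction sketch; the material properties, geometry, and boundary temperatures are assumed values, not the identified machine parameters from the paper:

```python
import numpy as np

# 1D explicit finite-difference model of heat flowing from a heater into a
# composite tape; all values below are illustrative assumptions.
alpha = 4.0e-7      # thermal diffusivity of the laminate, m^2/s (assumed)
L, n = 2.0e-3, 41   # tape thickness 2 mm, 41 nodes
dx = L / (n - 1)
dt = 0.4 * dx**2 / alpha   # respects the explicit stability limit dt <= dx^2 / (2*alpha)

T = np.full(n, 20.0)              # initial tape temperature, degC
T_heater, T_tool = 400.0, 60.0    # boundary temperatures (heating element / tooling)

for _ in range(2000):             # march the energy balance forward in time
    T[0], T[-1] = T_heater, T_tool
    # interior nodes: dT/dt = alpha * d2T/dx2, discretized explicitly
    T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])

print(f"near-heater temperature after {2000 * dt:.2f} s: {T[1]:.1f} degC")
```

A step change in `T_heater` mid-simulation would let such a model track the process dynamics the authors describe, which is the basis for using it as a digital twin in control design.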


Author(s):  
Linda Mueller ◽  
Valentin Scherz ◽  
Gilbert Greub ◽  
Katia Jaton ◽  
Onya Opota

Since the beginning of the COVID-19 pandemic, important health and regulatory decisions have relied on SARS-CoV-2 reverse transcription polymerase chain reaction (RT-PCR) results. Our diagnostic laboratory faced a rapid increase in the number of SARS-CoV-2 RT-PCR tests. To maintain a rapid turnaround time, we moved from case-by-case validation of RT-PCR results to automated validation with immediate transmission of results to clinicians. To preserve high quality and to track aberrant results, we developed a quality-monitoring tool based on a homemade algorithm coded in R. We present the results of this quality-monitoring tool applied to 35,137 RT-PCR results. Patients tested several times led to 4,939 pairwise comparisons: 88% concordant and 12% discrepant. The algorithm automatically resolved 428 of the 573 discrepancies. The most likely explanation for these 573 discrepancies was the clinical evolution of the disease in 44.9% of the situations, preanalytical factors in 27.9%, and the stochasticity of the assay in 25.3%. Finally, 11 discrepant results could not be explained, including 8 for which clinical data were not available. For patients tested repeatedly on the same day, the second result confirmed a first negative or positive result in 99.2% and 88.9% of cases, respectively. The implemented quality-monitoring strategy allowed us to: i) assist the investigation of discrepant results, ii) focus the attention of medical microbiologists on results requiring specific expertise, and iii) maintain an acceptable turnaround time. This work highlights the high consistency of RT-PCR for the detection of SARS-CoV-2 and the necessity of automated processes to handle large numbers of microbiological results while preserving quality.
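
The authors' tool is written in R; the following Python sketch only illustrates the pairwise-comparison idea behind it, with a toy result table and hypothetical column names:

```python
import pandas as pd

# Toy result table; real input would be the laboratory's validated results.
results = pd.DataFrame({
    "patient_id": ["P1", "P1", "P2", "P2", "P3"],
    "datetime": pd.to_datetime(["2020-03-01 09:00", "2020-03-01 15:00",
                                "2020-03-02 08:00", "2020-03-10 08:00",
                                "2020-03-03 10:00"]),
    "result": ["neg", "neg", "pos", "neg", "pos"],
})

pairs = []
for pid, grp in results.sort_values("datetime").groupby("patient_id"):
    rows = grp.to_dict("records")
    for a, b in zip(rows, rows[1:]):          # consecutive tests per patient
        pairs.append({
            "patient_id": pid,
            "days_apart": (b["datetime"] - a["datetime"]).days,
            "concordant": a["result"] == b["result"],
        })

pairs = pd.DataFrame(pairs)
# Discrepant pairs far apart in time are plausibly clinical evolution;
# same-day discrepancies point to preanalytical or stochastic causes and
# would be routed to a medical microbiologist for review.
flagged = pairs[~pairs["concordant"]]
print(flagged)
```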


Author(s):  
Salvador Lima ◽  
José Moreira

The Web is a crucial means for the dissemination of touristic information. However, most touristic information resources are stored directly in Web pages or in relational databases accessible only through ad hoc Web applications, so automated processes to search, extract, and interpret that information can hardly be implemented. Semantic Web technologies, which aim at representing background knowledge about Web resources in a computational form, can be an important contribution to the development of such automated processes. This chapter introduces the concept of a touristic object, giving special attention to the representation of temporal, spatial, and thematic knowledge. It also proposes a three-layered architecture for the representation of touristic objects on the Web. The central part is the domain layer, which defines a Semantic Model for Tourism (SeMoT) to describe concepts, relationships, and constraints using ontologies. The data layer supports the mapping of touristic information in relational databases into Resource Description Framework (RDF) virtual graphs following the SeMoT specification. The application layer deals with the integration of information from different data sources into a unified knowledge model, offering a common vocabulary to describe touristic information resources. Finally, we also show how to use this framework for planning touristic itineraries.
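
As an illustration of what the domain and data layers produce, the sketch below builds a small RDF description of one touristic object with rdflib; the SeMoT namespace, class, and property names are placeholders, since the abstract does not spell out the vocabulary:

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, XSD

# Hypothetical namespace standing in for the SeMoT vocabulary.
SEMOT = Namespace("http://example.org/semot#")
g = Graph()
g.bind("semot", SEMOT)

museum = URIRef("http://example.org/poi/serralves")
g.add((museum, RDF.type, SEMOT.TouristicObject))
# Thematic, spatial, and temporal knowledge attached to one touristic object
g.add((museum, SEMOT.theme, SEMOT.ContemporaryArt))
g.add((museum, SEMOT.latitude, Literal(41.1594, datatype=XSD.decimal)))
g.add((museum, SEMOT.longitude, Literal(-8.6597, datatype=XSD.decimal)))
g.add((museum, SEMOT.opensAt, Literal("10:00:00", datatype=XSD.time)))

print(g.serialize(format="turtle"))
```

An itinerary planner in the application layer could then query such graphs with SPARQL, filtering touristic objects by theme, location, and opening times.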

