Roadmap to Early Implementation of Passenger Air Mobility: Findings from a Delphi Study

2021 ◽  
Vol 13 (19) ◽  
pp. 10612
Author(s):  
Kshitija Desai ◽  
Christelle Al Haddad ◽  
Constantinos Antoniou

Urban air mobility (UAM) has recently grown in popularity as an emerging mode of transportation, covering a wide range of applications, from on-demand to scheduled operations of smaller aircraft in and around metropolitan areas. Because UAM is novel and has not yet been implemented, research on it still faces many uncertainties. In particular, there is a need to develop a roadmap for the early implementation of passenger air mobility that identifies the most prominent challenges, opportunities, hazards, and risks, and that highlights the most promising use cases as well as those offering the least benefit relative to the risks or complexity they entail. To address this research gap, this study used a two-round Delphi questionnaire targeting various stakeholder groups (product owners, policymakers, researchers, consultants, investors), reaching a total of 51 experts, of whom 34 also participated in the second round. In the first round, the main challenges, opportunities, and hazards facing the implementation of passenger UAM were identified. Findings on challenges and opportunities that depended only on use cases (as opposed to technology or external factors) were then fed back into the second round, which evaluated the use cases on both their complexity and their associated benefits. Accordingly, medical/emergency was identified as the best use case and intracity transport as the worst (in terms of complexity vs. benefits). Similarly, a risk analysis evaluated the potential hazards associated with the implementation of UAM and their impact on system viability. Community backlash was found to be the most hazardous, while malicious passenger behavior and improperly designed infrastructure were the least. 
Findings from this study can help better understand stakeholders’ opinions, highlighting promising use cases, but also risks to be aware of, constituting therefore a roadmap for future implementation.
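The complexity-vs-benefit evaluation described above can be sketched as a simple scoring exercise. The scores below are illustrative placeholders, not values from the study; only the best/worst ranking of medical/emergency and intracity transport reflects the reported findings.

```python
# Illustrative sketch of ranking UAM use cases by benefit vs. complexity.
# All numeric scores are hypothetical placeholders, not the study's data.
use_cases = {
    "medical/emergency": {"benefit": 9, "complexity": 4},
    "airport shuttle": {"benefit": 6, "complexity": 5},
    "intercity transport": {"benefit": 7, "complexity": 7},
    "intracity transport": {"benefit": 5, "complexity": 9},
}

def net_score(scores):
    # Higher benefit and lower complexity yield a better (larger) score.
    return scores["benefit"] - scores["complexity"]

ranked = sorted(use_cases, key=lambda uc: net_score(use_cases[uc]), reverse=True)
print(ranked[0])   # best use case under these placeholder scores
print(ranked[-1])  # worst use case under these placeholder scores
```

A real Delphi evaluation would of course aggregate expert ratings per criterion rather than use single point scores.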


Sensors ◽  
2021 ◽  
Vol 21 (11) ◽  
pp. 3871
Author(s):  
Jiri Pokorny ◽  
Khanh Ma ◽  
Salwa Saafi ◽  
Jakub Frolka ◽  
Jose Villa ◽  
...  

Automated systems have been seamlessly integrated into several industries as part of their industrial automation processes. Employing automated systems, such as autonomous vehicles, allows industries to increase productivity, benefit from a wide range of technologies and capabilities, and improve workplace safety. So far, most of the existing systems consider utilizing one type of autonomous vehicle. In this work, we propose a collaboration of different types of unmanned vehicles in maritime offshore scenarios. Providing high capacity, extended coverage, and better quality of services, autonomous collaborative systems can enable emerging maritime use cases, such as remote monitoring and navigation assistance. Motivated by these potential benefits, we propose the deployment of an Unmanned Surface Vehicle (USV) and an Unmanned Aerial Vehicle (UAV) in an autonomous collaborative communication system. Specifically, we design high-speed, directional communication links between a terrestrial control station and the two unmanned vehicles. Using measurement and simulation results, we evaluate the performance of the designed links in different communication scenarios and we show the benefits of employing multiple autonomous vehicles in the proposed communication system.


2002 ◽  
Vol 758 ◽  
Author(s):  
Khershed P. Cooper

Layered Manufacturing (LM) refers to computer-aided manufacturing processes in which parts are made in sequential layers relatively quickly. Parts that are produced by LM can be formed from a wide range of materials such as photosensitive polymers, metals and ceramics in sizes from a centimeter to a few meters with sub-millimeter feature resolutions. LM has found use in diverse areas including biomedical engineering, pharmaceuticals, aerospace, defense, electronics and design engineering. The promise of LM is the capability to make customized complex-shaped functional parts without specialized tooling and without assembly. LM is still a few years away from fully realizing its promise but its potential for manufacturing remains high. A few of the fundamental challenges in materials processing confronting the community are improving the quality of the surface finish, eliminating residual stress, controlling local composition and microstructure, achieving fine feature size and dimensional tolerance and accelerating processing speed. Until these challenges are met, the applicability of LM and its commercialization will be restricted. Sustained scientific activity in LM has advanced over the past decade into many different areas of manufacturing and has enabled exploration of novel processes and development of hybrid processes. The research community of today has the opportunity to shape the future direction of science research to realize the full potential of LM.


2017 ◽  
Vol 7 (2) ◽  
pp. 20160151 ◽  
Author(s):  
Angela Logan ◽  
Michael P. Murphy

Our understanding of the role of mitochondria in biomedical sciences has expanded considerably over the past decade. In addition to their well-known metabolic roles, mitochondria are also central to signalling for various processes through the generation of signals such as ROS and metabolites that affect cellular homeostasis, as well as other processes such as cell death and inflammation. Thus, mitochondrial function and dysfunction are central to the health and fate of the cell. Consequently, there is considerable interest in better understanding and assessing the many roles of mitochondria. Furthermore, there is also a growing realization that mitochondria are a promising drug target in a wide range of pathologies. The application of interdisciplinary approaches at the interface between chemistry and biology is opening up new opportunities to understand mitochondrial function and to assess the role of the organelle in biology. This work and the experience thus gained are leading to the development of new classes of therapies. Here, we overview the progress that has been made to date on exploring the chemical biology of the organelle and then focus on future challenges and opportunities that face this rapidly developing field.


2015 ◽  
Vol 1 (1) ◽  
pp. 53-57 ◽  
Author(s):  
Simon Wallace ◽  
Steve Riley

Purpose Tourism 2025 – Growing Value Together/Whakatipu Uara Ngatahi is a framework to unite New Zealand's large and diverse tourism industry and ignite strong, aspirational economic growth. Its goal is to see the tourism industry contribute $41 billion a year to the New Zealand economy by 2025, up from $24 billion now. It provides vital context for collective actions by big or small industry clusters and for thousands of actions individual businesses will take each year. The paper aims to discuss these issues. Design/methodology/approach A wide range of tourism industry stakeholders were consulted over an 18‐month period to ensure the project was being developed on a solid, evidence‐based foundation. There was strong stakeholder support for a framework which the private sector takes ownership of and responsibility for, but which also recognises that public sector support is vital. The project team developed a "straw‐man" growth framework model and then carried out detailed investigations and consultation to test and, where necessary, adjust that model into its final form. Findings There were four major forces shaping the global tourism market: one positive force for New Zealand countered by three tough challenges. The straw‐man growth framework comprised five separate yet inter‐connected "cycle of growth" themes. These themes are relatively consistent with the global national tourism plans that were studied. Used intelligently and in harmony, with the industry fully understanding the inter‐relationships and inter‐dependencies within the "cycle of growth", the key themes enable the tourism industry to successfully come to grips with the challenges and opportunities ahead. Originality/value Tourism 2025 is aimed at aligning the industry on a pathway towards aspirational growth.


2020 ◽  
Author(s):  
Thijs Dhollander ◽  
Adam Clemente ◽  
Mervyn Singh ◽  
Frederique Boonstra ◽  
Oren Civier ◽  
...  

Diffusion MRI has provided the neuroimaging community with a powerful tool to acquire in-vivo data sensitive to microstructural features of white matter, up to 3 orders of magnitude smaller than typical voxel sizes. The key to extracting such valuable information lies in complex modelling techniques, which form the link between the rich diffusion MRI data and various metrics related to the microstructural organisation. Over time, increasingly advanced techniques have been developed, up to the point where some diffusion MRI models can now provide access to properties specific to individual fibre populations in each voxel in the presence of multiple "crossing" fibre pathways. While highly valuable, such fibre-specific information poses unique challenges for typical image processing pipelines and statistical analysis. In this work, we review the "fixel-based analysis" (FBA) framework that implements bespoke solutions to this end, and has recently seen a stark increase in adoption for studies of both typical (healthy) populations as well as a wide range of clinical populations. We describe the main concepts related to fixel-based analyses, as well as the methods and specific steps involved in a state-of-the-art FBA pipeline, with a focus on providing researchers with practical advice on how to interpret results. We also include an overview of the scope of current fixel-based analysis studies (until August 2020), categorised across a broad range of neuroscientific domains, listing key design choices and summarising their main results and conclusions. Finally, we critically discuss several aspects and challenges involved with the fixel-based analysis framework, and outline some directions and future opportunities.


2022 ◽  
Vol 23 (2) ◽  
pp. 938
Author(s):  
Olubodun Michael Lateef ◽  
Michael Olawale Akintubosun ◽  
Olamide Tosin Olaoba ◽  
Sunday Ocholi Samson ◽  
Malgorzata Adamczyk

The evolutionary development of the RNA translation process that leads to protein synthesis based on naturally occurring amino acids continues today through synthetic biology, so-called rational bioengineering. Genetic code expansion (GCE) explores beyond the natural translational processes to further enhance the structural properties and augment the functionality of a wide range of proteins. Prokaryotic and eukaryotic ribosomal machinery have been proven to accept engineered tRNAs from orthogonal organisms to efficiently incorporate noncanonical amino acids (ncAAs) with rationally designed side chains. These side chains can be reactive or functional groups, which can be extensively utilized in biochemical, biophysical, and cellular studies. Genetic code expansion offers the possibility of introducing more than one ncAA into a protein through frameshift suppression and multi-site-specific incorporation of ncAAs, thereby vastly increasing the number of possible applications. However, various mediating factors reduce the yield and efficiency of ncAA incorporation into synthetic proteins. In this review, we comment on the recent advancements in genetic code expansion to signify the relevance of systems biology in improving ncAA incorporation efficiency. We discuss the emerging impact of tRNA modifications and metabolism in protein design. We also provide examples of the latest successful accomplishments in synthetic protein therapeutics and show how codon expansion has been employed in various scientific and biotechnological applications.


Author(s):  
Matt Woodburn ◽  
Gabriele Droege ◽  
Sharon Grant ◽  
Quentin Groom ◽  
Janeen Jones ◽  
...  

The utopian vision is of a future where a digital representation of each object in our collections is accessible through the internet and sustainably linked to other digital resources. This is a long term goal however, and in the meantime there is an urgent need to share data about our collections at a higher level with a range of stakeholders (Woodburn et al. 2020). To sustainably achieve this, and to aggregate this information across all natural science collections, the data need to be standardised (Johnston and Robinson 2002). To this end, the Biodiversity Information Standards (TDWG) Collection Descriptions (CD) Interest Group has developed a data standard for describing collections, which is approaching formal review for ratification as a new TDWG standard. It proposes 20 classes (Suppl. material 1) and over 100 properties that can be used to describe, categorise, quantify, link and track digital representations of natural science collections, from high-level approximations to detailed breakdowns depending on the purpose of a particular implementation. The wide range of use cases identified for representing collection description data means that a flexible approach to the standard and the underlying modelling concepts is essential. These are centered around the ‘ObjectGroup’ (Fig. 1), a class that may represent any group (of any size) of physical collection objects, which have one or more common characteristics. This generic definition of the ‘collection’ in ‘collection descriptions’ is an important factor in making the standard flexible enough to support the breadth of use cases. For any use case or implementation, only a subset of classes and properties within the standard are likely to be relevant. In some cases, this subset may have little overlap with those selected for other use cases. This additional need for flexibility means that very few classes and properties, representing the core concepts, are proposed to be mandatory. 
Metrics, facts and narratives are represented in a normalised structure using an extended MeasurementOrFact class, so that these can be user-defined rather than constrained to a set identified by the standard. Finally, rather than a rigid underlying data model as part of the normative standard, documentation will be developed to provide guidance on how the classes in the standard may be related and quantified according to relational, dimensional and graph-like models. So, in summary, the standard has, by design, been made flexible enough to be used in a number of different ways. The corresponding risk is that it could be used in ways that may not deliver what is needed in terms of outputs, manageability and interoperability with other resources of collection-level or object-level data. To mitigate this, it is key for any new implementer of the standard to establish how it should be used in that particular instance, and define any necessary constraints within the wider scope of the standard and model. This is the concept of the ‘collection description scheme,’ a profile that defines elements such as: which classes and properties should be included, which should be mandatory, and which should be repeatable; which controlled vocabularies and hierarchies should be used to make the data interoperable; how the collections should be broken down into individual ObjectGroups and interlinked, and how the various classes should be related to each other. 
Various factors might influence these decisions, including the types of information that are relevant to the use case, whether quantitative metrics need to be captured and aggregated across collection descriptions, and how many resources can be dedicated to amassing and maintaining the data. This process has particular relevance to the Distributed System of Scientific Collections (DiSSCo) consortium, the design of which incorporates use cases for storing, interlinking and reporting on the collections of its member institutions. These include helping users of the European Loans and Visits System (ELViS) (Islam 2020) to discover specimens for physical and digital loans by providing descriptions and breakdowns of the collections of holding institutions, and monitoring digitisation progress across European collections through a dynamic Collections Digitisation Dashboard. In addition, DiSSCo will be part of a global collections data ecosystem requiring interoperation with other infrastructures such as the GBIF (Global Biodiversity Information Facility) Registry of Scientific Collections, the CETAF (Consortium of European Taxonomic Facilities) Registry of Collections and Index Herbariorum. In this presentation, we will introduce the draft standard and discuss the process of defining new collection description schemes using the standard and data model, and focus on DiSSCo requirements as examples of real-world collection descriptions use cases.
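To make the ObjectGroup and MeasurementOrFact concepts concrete, here is a minimal sketch of how such a record might look. The property names and values are illustrative assumptions; the normative class and term names are defined by the TDWG Collection Descriptions standard itself.

```python
# Hypothetical sketch of an ObjectGroup-style collection description record.
# Property names are illustrative only; the TDWG CD standard defines the
# normative classes and terms.
object_group = {
    "type": "ObjectGroup",
    "name": "Pinned insect collection",
    "discipline": "Entomology",
    # Metrics and facts sit in a normalised MeasurementOrFact-style list,
    # so implementers can define their own metric types.
    "measurementsOrFacts": [
        {"type": "objectCount", "value": 120000, "unit": "specimens"},
        {"type": "digitisedPercentage", "value": 15, "unit": "%"},
    ],
}

def get_metric(group, metric_type):
    """Return the value of a named metric from the MeasurementOrFact list."""
    for m in group["measurementsOrFacts"]:
        if m["type"] == metric_type:
            return m["value"]
    return None

print(get_metric(object_group, "objectCount"))  # 120000
```

The normalised metric list is what lets quantitative values (counts, digitisation progress) be aggregated across many ObjectGroups, as in the Collections Digitisation Dashboard use case.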


2018 ◽  
Vol 22 (2) ◽  
pp. 16 ◽  
Author(s):  
Luis Miguel Fonseca ◽  
José Pedro Domingues

Purpose: With the transition period for ISO 9001 certified organisations to migrate to the 2015 edition ending 15th September 2018, this investigation aims to evaluate the status of the ISO 9001:2015 transition process and provide useful knowledge on the corresponding motivations, benefits, and success factors.
Methodology/Approach: An empirical study of more than 300 Portuguese organisations, ISO 9001 certified or in the certification process and encompassing a wide range of activity sectors, was carried out.
Findings: As of May 2017, 19% of the respondents already had ISO 9001:2015 certification and all the remaining ones planned to complete the process in time. The principal reported benefits are risk-based thinking, mapping of the organisational context, and stakeholder identification. Simultaneously, these were the issues that required the most attention and effort to master and implement. Additionally, there is evidence that ISO 9001:2015 enhances both internal and external organisational issues and generates benefits across all the researched dimensions. Organisations that claimed external motivations were the primary drivers of ISO 9001:2015 implementation systematically rated all benefits higher than organisations that claimed internal motivations. Moreover, the perceived benefits of ISO 9001:2015 implementation and certification seem to be strongly influenced by two primary dimensions: (smaller) organisation size and (lesser) international presence.
Research Limitation/Implication: Due to the novelty of ISO 9001:2015, the results of this investigation should be confirmed by future work and replicated in other countries to allow generalisation of the conclusions. Since the survey is based on the perceptions of the organisations' managers, a potential response-bias risk should be acknowledged.
Originality/Value of paper: With more than 1.2 million ISO 9001 certified organisations worldwide, this is a highly relevant issue for organisations, practitioners, and academics. Due to the novelty of ISO 9001:2015, this investigation aims to fill this research gap.


2019 ◽  
Author(s):  
Helmut Spengler ◽  
Claudia Lang ◽  
Tanmaya Mahapatra ◽  
Ingrid Gatz ◽  
Klaus A Kuhn ◽  
...  

BACKGROUND Modern data-driven medical research provides new insights into the development and course of diseases and enables novel methods of clinical decision support. Clinical and translational data warehouses, such as Informatics for Integrating Biology and the Bedside (i2b2) and tranSMART, are important infrastructure components that provide users with unified access to the large heterogeneous data sets needed to realize this and support use cases such as cohort selection, hypothesis generation, and ad hoc data analysis. OBJECTIVE Often, different warehousing platforms are needed to support different use cases and different types of data. Moreover, to achieve an optimal data representation within the target systems, specific domain knowledge is needed when designing data-loading processes. Consequently, informaticians need to work closely with clinicians and researchers in short iterations. This is a challenging task as installing and maintaining warehousing platforms can be complex and time consuming. Furthermore, data loading typically requires significant effort in terms of data preprocessing, cleansing, and restructuring. The platform described in this study aims to address these challenges. METHODS We formulated system requirements to achieve agility in terms of platform management and data loading. The derived system architecture includes a cloud infrastructure with unified management interfaces for multiple warehouse platforms and a data-loading pipeline with a declarative configuration paradigm and meta-loading approach. The latter compiles data and configuration files into forms required by existing loading tools, thereby automating a wide range of data restructuring and cleansing tasks. We demonstrated the fulfillment of the requirements and the originality of our approach by an experimental evaluation and a comparison with previous work. RESULTS The platform supports both i2b2 and tranSMART with built-in security. 
Our experiments showed that the loading pipeline accepts input data that cannot be loaded with existing tools without preprocessing. Moreover, it lowered efforts significantly, reducing the size of configuration files required by factors of up to 22 for tranSMART and 1135 for i2b2. The time required to perform the compilation process was roughly equivalent to the time required for actual data loading. Comparison with other tools showed that our solution was the only tool fulfilling all requirements. CONCLUSIONS Our platform significantly reduces the efforts required for managing clinical and translational warehouses and for loading data in various formats and structures, such as complex entity-attribute-value structures often found in laboratory data. Moreover, it facilitates the iterative refinement of data representations in the target platforms, as the required configuration files are very compact. The quantitative measurements presented are consistent with our experiences of significantly reduced efforts for building warehousing platforms in close cooperation with medical researchers. Both the cloud-based hosting infrastructure and the data-loading pipeline are available to the community as open source software with comprehensive documentation.
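The meta-loading idea (compiling a compact declarative configuration into the fuller form an existing loading tool expects) can be sketched roughly as follows. The keys, defaults, and expansion rules here are invented for illustration and do not reflect the platform's actual file formats.

```python
# Rough sketch of a meta-loading compilation step: a compact declarative
# mapping is expanded into the verbose per-column configuration a loading
# tool might require. All keys and defaults are hypothetical.
compact_config = {
    "source": "labs.csv",
    "columns": {"glucose": "numeric", "diagnosis": "code"},
}

# Per-kind defaults stand in for the domain knowledge the compiler injects,
# saving the user from repeating it for every column.
DEFAULTS = {
    "numeric": {"datatype": "float", "transform": "strip_units"},
    "code": {"datatype": "string", "transform": "normalise_code"},
}

def compile_config(compact):
    """Expand each compact column spec into a full loader entry."""
    entries = []
    for name, kind in compact["columns"].items():
        entry = {"source_file": compact["source"], "column": name}
        entry.update(DEFAULTS[kind])
        entries.append(entry)
    return entries

full = compile_config(compact_config)
print(len(full))  # 2: one verbose entry per compact column
```

The reported size reductions (up to 22x for tranSMART, 1135x for i2b2) come from exactly this kind of expansion: the user writes only the compact form, and the compiler emits the verbose configuration.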

