Innovative software solutions for subsea positioning

Author(s):  
Pierre-Yves Morvan ◽  
Gary Bagot

Improving operational efficiency is a recurring challenge for subsea operations. Throughout the life of a field, from construction to decommissioning, several subsea vehicles will be deployed to perform various underwater observation tasks. An ROV or AUV assigned to a specific task will require multiple positioning sensors (LBL, USBL, INS…) to complete its mission. Defining a "good enough" subsea positioning strategy, i.e. one that ensures a minimum accuracy without compromising safety, can be a complex exercise. For instance, overestimating the number of LBL transponders required directly increases vessel time and ultimately the cost of operations. On the other hand, a certain level of positioning redundancy may be required for a vehicle operating close to a subsea asset in production.

To ease the design and monitoring of subsea vehicle navigation, iXblue has developed an integrated solution. Not only has the company broadened its product range with the new intelligent Canopus LBL Transponder and the new-generation Ramses transceiver, but with Delph Subsea Positioning Software, iXblue now provides a complete integrated solution for subsea positioning that goes a step further by bringing significant efficiency gains. Divided into four modules (LBL Array Planning, Navigation Simulation, Operations, DelphINS) with an intuitive user interface, Delph Subsea Positioning (DSP) is an integrated software suite for the preparation, operation, and post-processing of iXblue positioning devices (USBL, LBL, and INS).
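How an LBL array delivers a position, and why each extra transponder trades vessel time against accuracy and redundancy, comes down to a range-based least-squares fix. The sketch below is a generic Python illustration with hypothetical coordinates, not iXblue's implementation:

```python
import numpy as np

def lbl_fix(beacons, ranges, x0, iters=20):
    """Least-squares LBL position fix via Gauss-Newton.

    beacons : (N, 3) array of known transponder positions [m]
    ranges  : (N,) array of measured acoustic ranges [m]
    x0      : (3,) initial guess for the vehicle position [m]
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        diff = x - beacons                   # (N, 3) vectors beacon -> vehicle
        pred = np.linalg.norm(diff, axis=1)  # predicted ranges
        J = diff / pred[:, None]             # Jacobian of range w.r.t. position
        r = ranges - pred                    # range residuals
        dx, *_ = np.linalg.lstsq(J, r, rcond=None)
        x += dx
        if np.linalg.norm(dx) < 1e-6:
            break
    return x

# Three transponders give a just-determined 3D fix; a fourth supplies the
# redundancy discussed above. Positions and ranges here are hypothetical.
beacons = np.array([[0.0, 0.0, -1500.0],
                    [800.0, 0.0, -1500.0],
                    [0.0, 800.0, -1500.0],
                    [800.0, 800.0, -1500.0]])
true_pos = np.array([350.0, 420.0, -1450.0])
ranges = np.linalg.norm(beacons - true_pos, axis=1)
print(lbl_fix(beacons, ranges, x0=[400.0, 400.0, -1400.0]))
```

In essence, array-planning tools optimize the beacon geometry so that the Jacobian above stays well conditioned across the work site.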

Author(s):  
Juliane Scheil ◽  
Thomas Kleinsorge

A common marker for inhibition processes in task switching is the n − 2 repetition cost. The present study aimed at elucidating the effects of no-go trials on n − 2 repetition costs. In contrast to previous studies, no-go trials were associated with only one of the three tasks in the present two experiments. High n − 2 repetition costs occurred if the no-go task had to be executed in trial n − 2, irrespective of whether a response had to be withheld or not. In contrast, no n − 2 repetition costs were visible if either of the other two tasks was relevant in n − 2. Whereas this n − 2 effect was unaffected by whether participants could reliably exclude a no-go trial, effects of no-gos in trial n were determined by this knowledge. The results differ from the effects of no-go trials that are not bound to a specific task. It is assumed that the present no-go variation exerted its effect not on the response level but on the level of task sets, resulting in an enhanced salience of the no-go task that leads to higher activation and, as a consequence, to stronger inhibition. The dissociation of the effects of no-gos in trials n − 2 and n as a function of foreknowledge suggests that the balance between activation and inhibition is shifted not only for single trials and tasks, but for the whole task space.
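For readers unfamiliar with the measure: n − 2 repetition costs are typically computed as the difference in mean reaction time between ABA task sequences (trial n repeats the task from trial n − 2) and CBA sequences (all three tasks differ). A minimal sketch with hypothetical data, not the authors' analysis code:

```python
import numpy as np

def n2_repetition_cost(tasks, rts):
    """Mean RT on ABA sequences minus mean RT on CBA sequences.

    tasks : sequence of task labels, one per trial
    rts   : reaction times [ms], aligned with `tasks`
    """
    aba, cba = [], []
    for n in range(2, len(tasks)):
        # Consider only genuine switch trials: task n differs from task n-1.
        if tasks[n] == tasks[n - 1]:
            continue
        if tasks[n] == tasks[n - 2]:        # ABA: n-2 repetition
            aba.append(rts[n])
        elif tasks[n - 1] != tasks[n - 2]:  # CBA: all three tasks differ
            cba.append(rts[n])
    return np.mean(aba) - np.mean(cba)

# Hypothetical trial sequence over three tasks A, B, C
tasks = list("ABACBACBABCA")
rts = [650, 700, 780, 720, 710, 760, 705, 715, 790, 730, 720, 770]
print(f"n-2 repetition cost: {n2_repetition_cost(tasks, rts):.1f} ms")
```

A positive cost (slower ABA than CBA trials) is the signature of persisting inhibition of the task abandoned in trial n − 2.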


Author(s):  
Carlo Cravero ◽  
Martino Marini

The authors decided to organize their design/analysis computational tools into an integrated software suite to help teach radial turbine design, taking advantage of their research background and a set of previously developed codes. The software is intended for use during class work: the student can either use a single design/analysis tool or work through a complete design loop consisting of iterations between the design and analysis tools. The intended users are final-year students in mechanical engineering. The code outputs are discussed with two practical examples in order to highlight turbomachinery performance at design and off-design conditions. The suite gives students the opportunity to become familiar with different concepts (choking, blade loading, performance maps, …) that are encountered in turbomachinery design and to understand the effects of the main design parameters.
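As an illustration of the kind of design-point calculation such a suite automates, the sketch below (simplified, with hypothetical numbers; not the authors' codes) evaluates the Euler work and total-to-static efficiency of a radial inflow turbine, assuming zero exit swirl:

```python
import math

# Simplified radial-inflow turbine design-point check (hypothetical numbers),
# based on the Euler turbomachinery work: w = U2*Ctheta2 - U3*Ctheta3,
# with zero exit swirl (Ctheta3 = 0).
gamma, cp = 1.4, 1005.0                   # air, J/(kg K)
T01, p01, p3 = 1100.0, 2.5e5, 1.0e5       # inlet total state, exit static pressure
rpm, r2 = 60000.0, 0.08                   # shaft speed, rotor inlet radius [m]
alpha2, C2 = math.radians(70.0), 450.0    # rotor inlet flow angle and speed

U2 = 2.0 * math.pi * rpm / 60.0 * r2      # blade speed at rotor inlet [m/s]
Ctheta2 = C2 * math.sin(alpha2)           # tangential velocity component
w = U2 * Ctheta2                          # specific work [J/kg]

# Ideal (isentropic) total-to-static enthalpy drop for the same pressure ratio
dh_ideal = cp * T01 * (1.0 - (p3 / p01) ** ((gamma - 1.0) / gamma))
eta_ts = w / dh_ideal

print(f"U2 = {U2:.1f} m/s, w = {w/1000:.1f} kJ/kg, eta_ts = {eta_ts:.3f}")
```

With these numbers the design point lands near eta_ts ≈ 0.83; sweeping the speed or pressure ratio instead of fixing them is what produces the off-design performance maps mentioned above.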


2012 ◽  
pp. 327-349
Author(s):  
J.G. Alcázar ◽  
M. Marvá ◽  
D. Orden ◽  
F. San Segundo

We describe our experience of using the following mathematical tools: an e-learning platform (Moodle), several components of the WIRIS software suite for mathematics education (the formula editor, WIRIS CAS, and WIRIS-Quizzes), the dynamic geometry package GeoGebra, the computational knowledge engine Wolfram Alpha, and the mathematics software system SAGE. Our aim in this chapter is twofold: on the one hand, we report on the use of these tools in Math refresher courses. On the other, we provide sufficient information about them for readers to decide on the usefulness of these tools in their own particular context (maybe different from that of a refresher course). More specifically, for each tool we give a general description, some comments on its use in Math refresher courses, and a list of (general) advantages and drawbacks.


2018 ◽  
pp. 7-10
Author(s):  
Evert Jan van Leeuwen

This introductory chapter provides an overview of House of Usher (1960), which was a part of American International Pictures' TV series The Curse of Corman. This TV series introduced American International's Poe pictures to a new generation. It is the emotional intensity conveyed through the mise-en-scène that sets the Poe pictures apart from their immediate rivals. The Poe pictures appealed to AIP's target audience — teenagers — because their aesthetics were also akin to the look and feel of EC horror comics. More than any of the other Poe pictures, House of Usher is a work of pulp expressionism that appeals to the angst holed up inside the minds of many a teenage audience member. Like a magic lantern, the film projector reveals a series of beautifully crafted, colourful tableaux that in sequence give expression to Edgar Allan Poe's vision of human frailty and corruption, and the void that awaits beyond the threshold of life. This book explains why House of Usher has attracted a cult audience for nearly 60 years.


2020 ◽  
Vol 42 (2) ◽  
pp. 177-198
Author(s):  
Yann Giraud

Historians of economics rarely consider textbooks as more than passive receptacles of previously validated knowledge. Therefore, their active role in shaping the discipline and its image is seldom addressed. In this paper, I study the making of Paul Samuelson's successive editions of Economics from 1967 to 1973 as an instance of how textbooks stand at the crossroads between disciplinary knowledge, pedagogy, and larger political and societal concerns. In the mid-1960s, Economics, then in its sixth edition, was at the height of its success. Considered a cornerstone of modern economics, it was also at the center of a number of criticisms dealing with the current state of the economic discipline and its teaching in the universities. While the profession expressed its concern over the lack of relevance of economics to address the pressing issues of the day and pleaded for a new "problem-solving" approach to economic education, the late 1960s witnessed the emergence of a new generation of "radical" economists criticizing the economics orthodoxy. Their contention that mainstream theory had neglected the issues of class struggle and capitalist exploitation found a favorable echo among an increasingly politicized population. Using archival materials, I show how Samuelson, helped by his editorial team at McGraw-Hill, attempted to take these changes into account in order to ensure the continuing success of subsequent editions of his text in an increasingly competitive market. This study emphasizes Samuelson's ambiguous attitude toward his contenders, revealing, on the one hand, his apparent openness to discussion and outsiders' suggestions, and, on the other hand, his firm attachment to mildly liberal politics and aversion to Marxism, unchanged through revisions. It also helps refine a notion that is often invoked but never fully expounded in textbook studies: that of the audience.


Author(s):  
Toru Higuchi ◽  
Marvin Troutt

This chapter explains the advancement and the price decline of products, based on the case of the VCR. After a dominant design emerges, the product advances incrementally or cumulatively, because the dominant design sets a standard design for the product and a framework for competition. Many new-generation products appeared in the market with innovative functions to spur sales. Some of them became popular and others did not. In the VCR case, most consumers bought a monaural VHS machine and then, later, a HiFi VHS machine. On the other hand, most consumers did not purchase S-VHS, D-VHS, and other advanced machines because they were too expensive in comparison with their performance. As a result, the alternation of generations of the VCR occurred only once, from the monaural to the HiFi machine.


2006 ◽  
Vol 3 (2) ◽  
pp. 279-282 ◽  
Author(s):  
Steven H. Stumpf ◽  
Simon J. Shapiro

Unstated and unacknowledged bias has a profound impact on the nature and implementation of integrative education models. Integrative education is the process of training conventional biomedical and traditional Chinese medicine practitioners in each tradition such that patient care may be effectively coordinated. A bilateral education model ensures that students in each tradition are cross-taught by experts from the ‘other’ tradition, imparting knowledge and values in unison. Acculturation is foundational to bilateral integrative medical education and practice. Principles are discussed for an open-minded bilateral educational model that can result in a new generation of integrative medicine teachers.


2015 ◽  
Vol 77 (22) ◽  
Author(s):  
Sayed Muchallil ◽  
Fitri Arnia ◽  
Khairul Munadi ◽  
Fardian Fardian

Image denoising plays an important role in image processing. It is also part of the pre-processing stage in a complete binarization procedure, which consists of pre-processing, thresholding, and post-processing. Our previous research confirmed that Discrete Cosine Transform (DCT)-based filtering, used as a new pre-processing step, improved the performance of binarization output in terms of recall and precision. This research compares three classical denoising methods, Gaussian, mean, and median filtering, with DCT-based filtering. The noisy ancient document images are filtered using these classical methods, and the outputs are used as input for the Otsu, Niblack, Sauvola, and NICK binarization methods. The resulting binary images of the three classical methods are then compared with those of DCT-based filtering. The performance of all denoising algorithms is evaluated by calculating the recall and precision of the resulting binary images. The result of this research is that DCT-based filtering yielded the highest recall and precision compared to the other methods.
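A minimal sketch of the comparison pipeline described above, using OpenCV for the three classical filters and Otsu thresholding (file names and kernel sizes are illustrative assumptions; the DCT-based filter itself is omitted, and Niblack/Sauvola/NICK would slot in beside Otsu):

```python
import cv2
import numpy as np

def recall_precision(binary, truth):
    """Recall and precision of detected text pixels (foreground = 0)."""
    pred = binary == 0
    gt = truth == 0
    tp = np.logical_and(pred, gt).sum()
    recall = tp / max(gt.sum(), 1)
    precision = tp / max(pred.sum(), 1)
    return recall, precision

# Noisy ancient-document image and its ground-truth binary image (grayscale);
# the file names here are placeholders.
noisy = cv2.imread("document_noisy.png", cv2.IMREAD_GRAYSCALE)
truth = cv2.imread("document_gt.png", cv2.IMREAD_GRAYSCALE)

filters = {
    "gaussian": cv2.GaussianBlur(noisy, (5, 5), 0),
    "mean":     cv2.blur(noisy, (5, 5)),
    "median":   cv2.medianBlur(noisy, 5),
}

for name, filtered in filters.items():
    # Global Otsu thresholding on the denoised image.
    _, binary = cv2.threshold(filtered, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    r, p = recall_precision(binary, truth)
    print(f"{name:8s} recall={r:.3f} precision={p:.3f}")
```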


Water ◽  
2020 ◽  
Vol 12 (4) ◽  
pp. 1150 ◽  
Author(s):  
Mehrnoosh Ghadimi ◽  
Sasan Zangenehtabar ◽  
Shahin Homaeigohar

Nanomaterials, i.e., those materials which have at least one dimension in the 1–100 nm size range, have produced a new generation of technologies for water purification. This includes nanosized adsorbents, nanomembranes, photocatalysts, etc. On the other hand, their uncontrolled release can potentially endanger biota in various environmental domains such as soil and water systems. In this review, we point out the opportunities created by the use of nanomaterials for water remediation and also the adverse effects of such small potential pollutants on the environment. While there is still a large need to further identify the potential hazards of nanomaterials through extensive lab or even field studies, an overview on the current knowledge about the pros and cons of such systems should be helpful for their better implementation.


Proceedings ◽  
2019 ◽  
Vol 14 (1) ◽  
pp. 49 ◽  
Author(s):  
Carsten Jaeschke ◽  
Oriol Gonzalez ◽  
Marta Padilla ◽  
Kaylen Richardson ◽  
Johannes Glöckler ◽  
...  

In this work, a new generation of gas sensing systems specially designed for breath analysis is presented. The developed system comprises a compact, modular, low-volume, temperature-controlled sensing chamber with three compartments that can host different sensor types. In the presented system, one compartment contains an array of 8 analog MOX sensors, and the other two compartments contain 10 digital MOX sensors each. Here, we test the system for the detection of low concentrations of several compounds.
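A conceptual sketch of how samples from such a three-compartment chamber might be assembled into a feature matrix for analysis (the acquisition calls and signal values are placeholders; only the 8 + 10 + 10 sensor layout is taken from the text):

```python
import numpy as np

N_ANALOG, N_DIGITAL = 8, 10  # per-compartment sensor counts from the paper

def read_chamber(sample_analog, sample_digital):
    """One synchronized sample across the three compartments.

    sample_analog / sample_digital stand in for the actual acquisition
    calls (ADC readout for the analog array, bus polling for the digital).
    """
    analog = [sample_analog(ch) for ch in range(N_ANALOG)]
    digital_1 = [sample_digital(1, ch) for ch in range(N_DIGITAL)]
    digital_2 = [sample_digital(2, ch) for ch in range(N_DIGITAL)]
    return np.array(analog + digital_1 + digital_2)  # 28-element vector

# Simulated acquisition: record a breath-sample exposure as a time series,
# one 28-channel row per second, for later feature extraction.
rng = np.random.default_rng(0)
fake_analog = lambda ch: 2.0 + 0.1 * rng.standard_normal()
fake_digital = lambda comp, ch: 50.0 + rng.standard_normal()
recording = np.stack([read_chamber(fake_analog, fake_digital)
                      for _ in range(60)])
print(recording.shape)  # (60, 28): 60 s x (8 analog + 2 x 10 digital) sensors
```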

