individual specimen — Recently Published Documents

TOTAL DOCUMENTS: 32 (last five years: 12)
H-INDEX: 8 (last five years: 2)

2021 ◽  
Vol 7 (2) ◽  
pp. 105-108
Author(s):  
Thomas S. Rau ◽  
Jakob Cramer ◽  
M. Geraldine Zuniga ◽  
Georg Böttcher ◽  
Thomas Lenarz

Abstract. Cochlear implants include an electrode array (EA) that needs to be inserted into the cochlea. Insertion tests using artificial cochlear models (ACMs) or ex vivo specimens are widely used during EA development to characterize EA design properties, including insertion forces. The measured forces are directly linked to the orientation of the cochlear lumen with respect to the insertion axis of the test bench. While the desired insertion direction in ACM experiments can be predefined by design, specimens are individually shaped and the cochlear lumen is embedded invisibly. Therefore, a new method for accurate, individual specimen positioning is required. A key element of the proposed method is a customizable pose setting adapter (PSA) used for fine adjustment of the specimen's position. After rigid fixation of the specimen to a holder featuring spherical registration markers, and subsequent cone beam computed tomography, the desired insertion direction is planned. The planning data are used to calculate the individual shape of the PSA. Finally, the PSA is 3D printed and mounted between the force sensor and the specimen holder to align the specimen correctly with the test bench's insertion axis. All necessary hardware and software have been developed, including the specimen holder, software for registration and trajectory planning, and a custom Matlab script whose output drives a parametric CAD file of the PSA. Positioning accuracy, determined in a first trial using 10 virtual trajectories, was 0.23 ± 0.12 mm and 0.38 ± 0.17°. The presented stereotactic positioning procedure enables high repeatability in future ex vivo insertion experiments thanks to accurate, image-guided control of the insertion direction.
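The alignment step described above, bringing a planned insertion direction onto the test bench's insertion axis, amounts to computing a rotation between two 3D vectors. The sketch below is only an illustration of that geometric step (the authors' Matlab/CAD pipeline is not reproduced here, and the vectors are hypothetical); Rodrigues' formula gives the required rotation matrix:

```python
import numpy as np

def rotation_aligning(a, b):
    """Return the 3x3 rotation matrix that rotates vector a onto vector b
    (Rodrigues' formula)."""
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    v = np.cross(a, b)                 # rotation axis (unnormalized)
    c = np.dot(a, b)                   # cosine of the rotation angle
    if np.isclose(c, -1.0):            # opposite vectors: 180° about any orthogonal axis
        axis = np.cross(a, [1.0, 0.0, 0.0])
        if np.linalg.norm(axis) < 1e-8:
            axis = np.cross(a, [0.0, 1.0, 0.0])
        axis /= np.linalg.norm(axis)
        return 2.0 * np.outer(axis, axis) - np.eye(3)
    K = np.array([[0.0, -v[2], v[1]],
                  [v[2], 0.0, -v[0]],
                  [-v[1], v[0], 0.0]])  # skew-symmetric cross-product matrix of v
    return np.eye(3) + K + K @ K / (1.0 + c)

# Hypothetical planned insertion direction (from CT registration) and bench axis
planned = np.array([0.2, -0.1, 0.97])
bench_axis = np.array([0.0, 0.0, 1.0])
R = rotation_aligning(planned, bench_axis)   # R @ planned is parallel to bench_axis
```

In the actual workflow this rotation would be realized physically by the 3D-printed PSA rather than in software.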


Author(s):  
Jutta Buschbom ◽  
Breda Zimkus ◽  
Andrew Bentley ◽  
Mariko Kageyama ◽  
Christopher Lyal ◽  
...  

Transdisciplinary and cross-cultural cooperation and collaboration are needed to build extended, densely interconnected information resources. These are the prerequisites for the successful implementation and execution of, for example, an ambitious monitoring framework accompanying the post-2020 Global Biodiversity Framework (GBF) of the Convention on Biological Diversity (CBD; SCBD 2021). Data infrastructures that meet the requirements and preferences of concerned communities can focus and attract community involvement, thereby promoting participatory decision making and the sharing of benefits. Community acceptance, in turn, drives the development of the data resources and data use. Earlier this year, the alliance for biodiversity knowledge (2021a) conducted forum-based consultations seeking community input on designing the next generation of digital specimen representations and consequently enhanced infrastructures. The multitudes of connections that arise from extending the digital specimen representations through linkages in all “directions” will form a powerful network of information for research and application. Yet, with the power of an extended, accessible data network comes the responsibility to protect sensitive information (e.g., the locations of threatened populations, culturally context-sensitive traditional knowledge, or businesses’ fundamental data and infrastructure assets). In addition, existing legislation regulates access and the fair and equitable sharing of benefits. Current negotiations on ‘Digital Sequence Information’ under the CBD suggest such obligations might increase and become more complex in the context of extensible information networks. For example, in the case of data and resources funded by taxpayers in the EU, such access should follow the general principle of being “as open as possible; as closed as is legally necessary” (cp. EC 2016). 
At the same time, the international regulations of the CBD Nagoya Protocol (SCBD 2011) need to be taken into account. Summarizing the main outcomes from the consultation discussions in the forum thread “Meeting legal/regulatory, ethical and sensitive data obligations” (alliance for biodiversity knowledge 2021b), we propose a framework of ten guidelines and functionalities to achieve community building and drive application:
1. Substantially contribute to the conservation and protection of biodiversity (cp. EC 2020).
2. Use language that is CBD conformant.
3. Show the importance of the digital and extensible specimen infrastructure for the continuing design and implementation of the post-2020 GBF, as well as the mobilisation and aggregation of data for its monitoring elements and indicators.
4. Strive to openly publish as much data and metadata as possible online.
5. Establish a powerful and well-thought-out layer of user and data access management, ensuring the security of ‘sensitive data’.
6. Encrypt data and metadata where necessary at the level of an individual specimen or digital object; provide access via digital cryptographic keys.
7. Link obligations, rights and cultural information regarding use to the digital key (e.g. CARE principles (Carroll et al. 2020), Local Contexts labels (Local Contexts 2021), licenses, permits, use and loan agreements).
8. Implement a transactional system that records every transaction.
9. Amplify workforce capacity across the digital realm, its work areas and workflows.
10. Do no harm (EC 2020): reduce the social and ecological footprint of the implementation, aiming for a long-term sustainable infrastructure across its life cycle, including the development, implementation and management stages.
Balancing the needs for open access with protection, accountability and sustainability, the framework is designed to function as a robust interface between the (research) infrastructure implementing the extensible network of digital specimen representations and the myriad applications and operations in the real world. With the legal, ethical and data protection layers of the framework in place, the infrastructure will provide legal clarity and security for data providers and users, specifically in the context of access and benefit sharing under the CBD and its Nagoya Protocol.
Forming layers of protection, the characteristics and functionalities of the framework are envisioned to be flexible and fine-grained, adjustable to the needs and preferences of a wide range of stakeholders and communities, while remaining focused on the protection and rights of the natural world. Respecting different value systems and national policies, the framework is expected to allow divergent views to coexist and to balance differing interests. The infrastructure of the digital extensible specimen network is thus fair and equitable to many providers and users. This foundation has the capacity and potential to bring together the diverse global communities using, managing and protecting biodiversity.


2021 ◽  
Vol 40 (2) ◽  
pp. 101-144
Author(s):  
Francesco Miniati ◽  
Carlotta Cappelli ◽  
Simonetta Monechi

Abstract. We present a taxonomic revision of the family Fasciculithaceae focused on the forms that characterize the early evolution of this family group, currently included in the genera Gomphiolithus, Diantholitha, Lithoptychius and Fasciculithus. The investigation is based on combined light microscope (LM) and scanning electron microscope (SEM) analysis of specimens from well-preserved ODP–DSDP material (ODP Sites 1209, 1262 and 1267; DSDP Sites 356 and 119) and outcrops (Bottaccione and Contessa, Italy; Qreiya, Egypt) across the Danian–Selandian transition. Direct LM–SEM comparison of the same individual specimen clarifies several taxa that were previously described only with the LM. One new genus (Tectulithus), five new combinations (Tectulithus janii, Tectulithus merloti, Tectulithus pileatus, Tectulithus stegastos and Tectulithus stonehengei) and six new species (Diantholitha pilula, Diantholitha toquea, Lithoptychius galeottii, Lithoptychius maioranoae, Tectulithus pagodiformis and Fasciculithus realeae) are defined. The main characteristics useful for identifying fasciculiths with the LM are provided, together with 3D–2D drawings showing the main structural features. This accurate taxonomic characterization enables the development of an evolutionary lineage documenting a major fasciculith diversification during the late Danian and early Selandian. Six well-constrained events have been documented: the lowest occurrence (LO) of Gomphiolithus, the paracme of Fasciculithaceae at the top of Chron C27r (PTC27r), the radiation of Diantholitha (LO of Diantholitha), the paracme of Fasciculithaceae at the base of Chron C26r (PBC26r), the radiation of Lithoptychius (LO of Lithoptychius) and the radiation of Tectulithus (lowest common occurrence of Tectulithus), which shows the biostratigraphic relevance of this group across the Danian–Selandian transition.


2021 ◽  
Vol 13 (4) ◽  
pp. 224-237
Author(s):  
Vyacheslav B. Ivanov ◽  
Arkady V. Shcherbakov

Sufficient evidence has accumulated that alternative biological and ecological processes may occur in individual plant specimens dwelling in environmentally equivalent habitats. Environmental stress triggers an individual, specimen-specific adaptive response. This paper shows how fractal analysis can be used to study the degree of stress to which plants in different habitats, and under different combinations of environmental factors, are exposed.


Toxins ◽  
2021 ◽  
Vol 13 (3) ◽  
pp. 218
Author(s):  
Naomasa Oshiro ◽  
Hiroya Nagasawa ◽  
Kyoko Kuniyoshi ◽  
Naoki Kobayashi ◽  
Yoshiko Sugita-Konishi ◽  
...  

Ciguatera fish poisoning (CFP) is one of the most frequently encountered seafood poisoning syndromes; it is caused by the consumption of marine finfish contaminated with ciguatoxins (CTXs). The majority of CFP cases result from eating fish flesh, but a traditional belief holds that the head and viscera are more toxic and should be avoided. Unlike for the viscera, scientific data supporting the reputedly high toxicity of the head are scarce. We prepared tissue samples from the fillet, head, and eyes of five yellow-edged lyretail (Variola louti) individuals sourced from Okinawa, Japan, and analyzed the CTXs by LC-MS/MS. Three CTXs, namely CTX1B, 52-epi-54-deoxyCTX1B, and 54-deoxyCTX1B, were confirmed in similar proportions. The toxins were distributed nearly evenly in the flesh: within the same individual specimen, flesh from the fillet and flesh from the head, tested separately, had the same level and composition of toxins. We therefore conclude that flesh samples for LC-MS/MS analysis can be taken from any part of the body. However, the tissue surrounding the eyeball displayed CTX levels two to four times higher than those of the flesh. The present study is the first to provide scientific data demonstrating the high toxicity of the eyes.


Author(s):  
L.A. Bugaev ◽  
A.V. Voykina ◽  
S.G. Sergeeva

The reproductive system of so-iuy mullet Planiliza haematocheila (Temminck & Schlegel, 1845) females from the Azov and Black Sea Basin at the end of the 2019 winter season has been analysed on the basis of oocyte size. Individual differences in the distribution of oocyte sizes during the period of trophoplasmic growth have been identified. From the ordered series of oocyte sizes during this period, median and percentile values have been calculated; these can serve as reference values for the qualitative characterization of the ordered series of oocyte diameters in an individual specimen, using the empirical median calculated for that specimen as a basis. It has been found that the sizes of the trophoplasmic-growth oocytes that will be utilized during the current year's spawning period, and therefore the degree of gonad maturity, show individual variation independent of a fish's age, length and weight, and of the content of reserve and bioactive substances in its tissues and blood.
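The reference-value idea can be shown with a small numeric sketch. The diameters below are hypothetical, not the study's data; the point is only that the median and percentiles of an ordered series of oocyte diameters are computed directly:

```python
import numpy as np

# Hypothetical oocyte diameters (micrometres) from one female, ordered series
diameters = np.array([310, 325, 340, 352, 360, 371, 385, 398, 410, 425], dtype=float)

median = np.percentile(diameters, 50)          # empirical median of the series
p25, p75 = np.percentile(diameters, [25, 75])  # percentile reference values
```

A new specimen's ordered series could then be characterized by comparing its empirical median against such reference percentiles.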


Sexual Health ◽  
2020 ◽  
Vol 17 (1) ◽  
pp. 15 ◽  
Author(s):  
Steven G. Badman ◽  
Sara F. E. Bell ◽  
Judith A. Dean ◽  
Jime Lemoire ◽  
Luke Coffey ◽  
...  

Background: The aim of this study was to compare the diagnostic sensitivity of pooled self-collected urogenital, pharyngeal and anorectal specimens against individual specimen results for the molecular detection of Chlamydia trachomatis (CT) and Neisseria gonorrhoeae (NG) near the point of care (POC).
Methods: Clients (mostly men who have sex with men) attending an urban community testing service and three sex-on-premises venues in Brisbane, Australia, were offered CT and NG testing by trained lay providers. Participants provided three self-collected specimens (urine, pharyngeal and rectal) for testing by GeneXpert (Cepheid, Sunnyvale, CA, USA). If any of a participant's individual specimens was positive, all three specimens were pooled and retested.
Results: Of the 388 participants who provided three individual anatomical specimens, 76 (19.6%) were positive for CT and/or NG at one or more sites. The pooling approach failed to detect five CT rectal and four NG pharyngeal infections. The overall sensitivity of the pooling approach compared with individual specimen testing, and Cohen's κ, were 90.0% and 0.86 respectively for CT, and 89.7% and 0.89 respectively for NG.
Conclusions: Reduced sensitivity was observed when using pooled specimens for the detection of CT and NG with GeneXpert near the POC, similar to results reported in laboratory-based CT and NG pooling studies. These data suggest that specimen pooling is feasible near the POC, potentially saving time and costs when screening at-risk populations for CT and NG. Our data also suggest that reducing the urine volume in the pool could improve overall test sensitivity.
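The two headline statistics, sensitivity and Cohen's κ, can be computed from a 2×2 agreement table of pooled versus individual results. The counts below are hypothetical, chosen only to demonstrate the computation, and are not taken from the study:

```python
def sensitivity_and_kappa(tp, fn, fp, tn):
    """Sensitivity of the pooled test against the individual (reference) result,
    plus Cohen's kappa for overall agreement between the two testing approaches."""
    n = tp + fn + fp + tn
    sensitivity = tp / (tp + fn)       # infections detected / truly positive
    p_observed = (tp + tn) / n         # raw agreement proportion
    # chance agreement from the marginal totals of both "raters"
    p_expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    kappa = (p_observed - p_expected) / (1 - p_expected)
    return sensitivity, kappa

# Hypothetical counts: 50 reference positives, 5 missed by pooling, no false positives
sens, kappa = sensitivity_and_kappa(tp=45, fn=5, fp=0, tn=338)
```

With these illustrative counts the sensitivity is 0.90; κ depends on the marginals, so it will not in general match the study's reported values.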


Author(s):  
Zhengzhe Wu ◽  
Jere Kahanpää ◽  
Pasi Sihvonen ◽  
Anne Koivunen ◽  
Hannu Saarenmaa

Digitisation of natural history collections is drawing increasing attention. Digitised specimens not only facilitate the long-term preservation of biodiversity information but also make that information easier to access and share. There are more than two billion specimens in the world's natural history collections, and pinned insect specimens make up more than half of them (Tegelberg et al. 2014, Tegelberg et al. 2017). However, it is still a challenge to digitise pinned insect specimens with current state-of-the-art systems. Imaging pinned insects is slow because they are essentially 3D objects and the associated labels are pinned under the specimen. During imaging, the labels are often removed manually, which slows down the whole process. How can we avoid handling the labels pinned under often fragile and valuable specimens in order to increase the speed of digitisation? In our work (Saarenmaa et al. 2019) for task T3.1.2 of the ICEDIG project (https://www.icedig.eu), we first briefly reviewed state-of-the-art approaches to small-insect digitisation. We then surveyed recent promising technological advances in imaging, some of which have not yet been used for insect digitisation. It seems that no single approach will be enough to digitise all insect collections efficiently; the approach has to be optimised for the features of the specimens and their associated labels. To obtain a breakthrough in insect digitisation, it is necessary to combine existing and new technologies in novel workflows. To explore the options, we identified six approaches for digitising pinned insects with minimal manipulation of labels:
1. Minimal labels: image selected individual specimens without removing labels from the pin, using two cameras. This method suits small insects with only one or a few well-spaced labels.
2. Multiple webcams: similar to the minimal-labels approach, but with multiple webcams at different positions. This has been implemented in a prototype system with 12 cameras (Hereld et al. 2017) and in the ALICE system with six DSLR cameras (Price et al. 2018).
3. Imaging of units: similar to the multiple-webcams approach, but imaging the entire unit ("units" are the small boxes or trays contained in the drawers of collection cabinets, used in most major insect collections).
4. Camera on a robot arm: image the individual specimen or the unit with a camera mounted on a robot arm, capturing a large number of images from different views.
5. Camera on rails: similar to the robot-arm approach, but with the camera mounted on rails to capture the unit. A 3D model of the insects and/or units can be created, from which the labels are then extracted. This is being prototyped by the ENTODIG-3D system (Ylinampa and Saarenmaa 2019).
6. Terahertz time-gated multispectral imaging: image the individual specimen with terahertz time-gated multispectral imaging devices.
Experiments on selected approaches 2 and 5 are in progress, and preliminary results will be presented.


Author(s):  
Tilo Henning ◽  
Patrick Plitzner ◽  
Andreas Müller ◽  
Anton Güntsch ◽  
Walter G. Berendsohn ◽  
...  

Herbarium specimens are central to botanical science and of rising importance thanks to increasing accessibility and broadened usability. Alongside the many new uses of specimen data sits a range of traditional uses supporting the collection of morphological data and their application to taxonomy and systematics (Henning et al. 2018). Technical workflows are needed to support the sustainable collection of this traditional information and to maintain the high quality of the morphological data. Data exchange and re-usability require accepted controlled vocabularies (community approved) that are accessible (web-based ontologies and term vocabularies) and reliable (long-term availability, unique identifiers). The same applies to the datasets themselves, which must be stored accessibly and sustainably, maintaining all data relationships so as to facilitate convenient re-use. This project aims to construct a comprehensive workflow to optimise the delimitation and characterisation ("descriptions") of taxa (see the complementary talk by Plitzner et al.). It is implemented on the open-source software framework of the EDIT Platform for Cybertaxonomy (http://www.cybertaxonomy.org, Ciardelli et al. 2009), extending the workflow for sample data processing developed in a preceding project (Kilian et al. 2015).
The principal goals of this new software component are:
- specimen-level recording and storage of character data in structured character matrices;
- generating taxon characterisations by aggregating the individual specimen-based datasets;
- using and developing community-coordinated, ontology-based exemplar vocabularies;
- persistently linking character datasets with source specimens for high visibility and re-usability.
The angiosperm order Caryophyllales provides an exemplar use case through cooperation with the Global Caryophyllales Initiative (Borsch et al. 2015). A basic set of morphological terms and vocabularies has been obtained from various online sources (ontologies, glossaries) and can be used, searched and expanded in the EDIT Platform. The terms are categorised into structures, properties and states. Several editors have been developed to combine structure and property terms into characters and to assign them a customised state vocabulary (categorical) or suitable values and units (numerical). The workflow is built around a data set defining the taxonomic environment of an individual use case. A data set is specified by its characters and a taxonomic group, which can be filtered by area or rank. The data set can be opened in a tabular representation (character matrix) to enter preselected state terms or values for individual specimens. The matrix provides several features for basic comparison and analysis and allows the entry of alternative datasets (e.g. from the literature).
Finally, the aggregation of data subsets to potential taxonomic units, by adding up values and summarising character states, allows convenient testing of taxonomic hypotheses. The term "additivity" is used here to describe this set of workflows and processes, which add value to herbarium specimens and accumulate specimen data into a taxon description.
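The aggregation step, summarising specimen-level character data into a taxon characterisation, can be sketched as follows. This is a simplified illustration, not the EDIT Platform's actual data model, and the character names are invented: categorical characters accumulate the union of observed states, while numerical characters accumulate a min-max range.

```python
def aggregate(records):
    """Aggregate specimen-level character data into a taxon characterisation:
    categorical characters -> set of observed states,
    numerical characters  -> (min, max) range across specimens."""
    taxon = {}
    for record in records:
        for character, value in record.items():
            if isinstance(value, (int, float)):
                lo, hi = taxon.get(character, (value, value))
                taxon[character] = (min(lo, value), max(hi, value))
            else:
                taxon.setdefault(character, set()).add(value)
    return taxon

# Three hypothetical specimen records of one taxon
specimens = [
    {"petal colour": "white", "petal length mm": 4.1},
    {"petal colour": "pink",  "petal length mm": 5.3},
    {"petal colour": "white", "petal length mm": 4.8},
]
characterisation = aggregate(specimens)
# characterisation["petal colour"]    -> {"white", "pink"}
# characterisation["petal length mm"] -> (4.1, 5.3)
```

Adding a further specimen record simply widens the ranges and state sets, which is the "additivity" the abstract describes.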


2019 ◽  
Vol 11 (1) ◽  
Author(s):  
Christopher Bilder ◽  
Joshua Tebbs ◽  
Christopher McMahan

Objective: To develop specimen pooling algorithms that reduce the number of tests needed to screen individuals for infectious diseases with multiplex assays.
Introduction: An essential tool for infectious disease surveillance is a timely and cost-effective testing method. For this purpose, laboratories frequently use specimen pooling to assay high volumes of clinical specimens. The simplest pooling algorithm employs a two-stage process. In the first stage, a set number of specimens are amalgamated to form a "group" that is tested as if it were one specimen. If this group tests negative, all individuals within the group are declared disease free. If the group tests positive, a second stage is implemented, with retests performed on each individual. This testing algorithm is repeated across all individuals to be tested. In comparison to testing each individual specimen, large reductions in the number of tests occur when overall disease prevalence is small, because most groups will test negative. Most pooling algorithms have been developed in the context of single-disease assays. Here, new pooling algorithms are developed in the context of multiplex (multiple-disease) assays applied over two or three hierarchical stages. Individual risk information can be employed by these algorithms to increase testing efficiency.
Methods: Monte Carlo simulations are used to emulate the pooling and testing processes. These simulations are based on retrospective chlamydia and gonorrhea testing data collected over a two-year period in Idaho, Iowa, and Oregon. For each simulation, the number of tests and measures of accuracy are recorded. All tests were originally performed with the Aptima Combo 2 Assay; sensitivities and specificities for this assay are included in the simulation process. The R statistical software package is used to perform all simulations. For reproducibility, the programs implementing the simulations are available at www.chrisbilder.com/grouptesting.
Results: Reductions in the number of tests were obtained for all states when compared to individual specimen testing. For example, pooling Idaho female specimens without taking individual risk information into account resulted in a 47% and a 51% reduction in tests when using two and three stages, respectively. With the addition of individual risk information, further reductions occurred: pooling Idaho female specimens resulted in an additional 5% reduction in tests compared directly to not using individual risk information. These reductions were found to be related to the type of risk information available and the variability in risk levels. For example, males were found to have much more variability than females; for Idaho, this resulted in a further 15% reduction in tests relative to not using the risk information.
Conclusions: Significant reductions in the number of tests occur through pooling. These reductions are most significant when individual risk information is taken into account by the pooling algorithm.
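The two-stage algorithm described in the Introduction can be sketched in a few lines. This is a toy Monte Carlo with a perfect single-disease assay and a hypothetical prevalence, not the authors' R programs:

```python
import random

def two_stage_tests(statuses, pool_size):
    """Count the tests used by two-stage (Dorfman) pooling over a list of
    True/False infection statuses, assuming a perfectly accurate assay."""
    tests = 0
    for i in range(0, len(statuses), pool_size):
        group = statuses[i:i + pool_size]
        tests += 1              # stage 1: one test on the pooled group
        if any(group):          # stage 2: positive pool -> retest each member
            tests += len(group)
    return tests

random.seed(42)
n, prevalence, pool_size = 10_000, 0.02, 5
statuses = [random.random() < prevalence for _ in range(n)]
used = two_stage_tests(statuses, pool_size)
# At 2% prevalence with pools of 5, the expected cost per person is about
# 1/5 + (1 - 0.98**5) ≈ 0.30 tests, versus 1.0 for individual testing.
```

The multiplex, risk-informed algorithms of the paper extend this basic scheme with additional stages and with pool compositions ordered by individual risk.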

