The fossil record of whip spiders: the past of Amblypygi

PalZ ◽  
2021 ◽  
Author(s):  
Carolin Haug ◽  
Joachim T. Haug

Whip spiders (Amblypygi), as their name suggests, resemble spiders (Araneae) in some aspects, but differ from them by their heart-shaped (prosomal) dorsal shield, their prominent grasping pedipalps, and their subsequent elongate pair of feeler appendages. The oldest possible occurrences of whip spiders, represented by cuticle fragments, date back to the Devonian (c. 385 mya), but (almost) complete fossils are known from the Carboniferous (c. 300 mya) onwards. The fossils include specimens preserved on slabs or in nodules (Carboniferous, Cretaceous) as well as specimens preserved in amber (Cretaceous, Eocene, Miocene). We review here all fossil whip spider specimens and figure most of them as interpretative drawings or with high-quality photographs, including 3D imaging (stereo images) to make the three-dimensional relief of the specimens visible. Furthermore, we add two new specimens to the list (resulting in 37 in total). The fossil specimens as well as modern whip spiders were measured to analyse possible changes in morphology over time. In general, the shield appears to have become relatively broader and the pedipalps and walking appendages more elongate over geological time. The morphological details are discussed in an evolutionary framework and in comparison with results from earlier studies.
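As a hedged illustration of the kind of trend analysis summarised above, the sketch below regresses a relative shield-width measure against specimen age; the function of the fit and all numbers are invented placeholders, not measurements from the paper.

```python
# Hypothetical illustration: regress relative shield width against specimen age.
# Ages and ratios below are invented placeholders, not data from the paper.
import numpy as np

age_ma = np.array([300, 300, 100, 100, 45, 20, 0])                      # specimen ages (Ma)
width_to_length = np.array([0.90, 0.95, 1.00, 1.05, 1.10, 1.12, 1.15])  # shield width / length

# Least-squares slope of the ratio towards the present (change per million years)
slope, intercept = np.polyfit(-age_ma, width_to_length, 1)
print(f"ratio change per Myr: {slope:+.4f}; extrapolated modern ratio: {intercept:.2f}")
```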

2019 ◽  
Vol 374 (1788) ◽  
pp. 20190392 ◽  
Author(s):  
Peter Smits ◽  
Seth Finnegan

A tenet of conservation palaeobiology is that knowledge of past extinction patterns can help us to better predict future extinctions. Although the future is unobservable, we can test the strength of this proposition by asking how well models conditioned on past observations would have predicted subsequent extinction events at different points in the geological past. To answer this question, we analyse the well-sampled fossil record of Cenozoic planktonic microfossil taxa (Foraminifera, Radiolaria, diatoms and calcareous nannoplankton). We examine how extinction probability varies over time as a function of species age, time of observation, current geographical range, change in geographical range, climate state and change in climate state. Our models have a 70–80% probability of correctly forecasting the rank order of extinction risk for a random out-of-sample species pair, implying that determinants of extinction risk have varied only modestly through time. We find that models which include either historical covariates or account for variation in covariate effects over time yield equivalent forecasts, but a model including both is overfit and yields biased forecasts. An important caveat is that human impacts may substantially disrupt range-risk dynamics, so that the future will be less predictable than it has been in the past. This article is part of a discussion meeting issue ‘The past is a foreign country: how much can the fossil record actually inform conservation?’
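A minimal sketch of the pairwise rank-order skill quoted above (the 70–80% figure, equivalent to the area under the ROC curve), using randomly generated placeholder risks and outcomes rather than the authors' data or model:

```python
# Pairwise rank-order accuracy: probability that, for a random (extinct, surviving) pair,
# the model assigns the higher risk to the species that actually went extinct.
# Predicted risks and outcomes here are illustrative placeholders.
import itertools
import random

random.seed(0)
n = 200
predicted_risk = [random.random() for _ in range(n)]                      # model score per species
went_extinct = [r + random.gauss(0, 0.3) > 0.6 for r in predicted_risk]   # fake outcomes

pairs = [(i, j) for i, j in itertools.product(range(n), repeat=2)
         if went_extinct[i] and not went_extinct[j]]
correct = sum(predicted_risk[i] > predicted_risk[j] for i, j in pairs)
print(f"pairwise rank-order accuracy: {correct / len(pairs):.2f}")
```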


2003 ◽  
Vol 94 (3) ◽  
pp. 275-281 ◽  
Author(s):  
David Penney

The currently accepted cladogram of spider phylogeny and palaeontological data are used to evaluate spider family richness through geological time. A significantly more diverse spider fossil record is predicted than observed. The predicted rate of spider family diversification is considered more accurate because of its close similarity at 0 Ma to the number of extant families. Predicted spider family palaeodiversity is compared with insect family palaeodiversity to investigate whether spiders track insects through geological time. At the family level, the insects, and observed and predicted spider fossil records show an exponential increase over time, the pattern typical of a radiating group. No significant differences are observed in the rates of change in the slopes, and hence rate of diversification of spiders and insects over time. This suggests that spiders probably co-radiated alongside the insects, with the major radiations of both groups occurring at least 100 Ma before the origin of angiosperms.
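As a hedged sketch of the comparison described above, exponential diversification can be fitted as a straight line in log space and the slopes of the two groups compared; the family counts below are invented placeholders, not the palaeontological data used in the study:

```python
# Illustrative comparison of diversification rates: exponential growth is linear in log space,
# so the fitted slope is the per-Myr rate. Counts below are placeholders, not the paper's data.
import numpy as np

time_ma = np.array([400, 350, 300, 250, 200, 150, 100, 50, 0])   # Ma before present
spider_families = np.array([1, 2, 4, 7, 12, 22, 40, 70, 120])
insect_families = np.array([5, 12, 25, 50, 90, 170, 320, 600, 1000])

spider_rate, _ = np.polyfit(-time_ma, np.log(spider_families), 1)
insect_rate, _ = np.polyfit(-time_ma, np.log(insect_families), 1)
print(f"diversification rate per Myr: spiders {spider_rate:.4f}, insects {insect_rate:.4f}")
```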


2020 ◽  
Vol 12 (10) ◽  
pp. 1638
Author(s):  
Yanghai Yu ◽  
Mauro Mariotti d’Alessandro ◽  
Stefano Tebaldini ◽  
Mingsheng Liao

Synthetic Aperture Radar (SAR) Tomography is a technique to provide direct three-dimensional (3D) imaging of the illuminated targets by processing SAR data acquired from different trajectories. In a large part of the literature, 3D imaging is achieved by assuming mono-dimensional (1D) approaches derived from SAR Interferometry, where a vector of pixels from multiple SAR images is transformed into a new vector of pixels representing the vertical profile of scene reflectivity at a given range-azimuth location. However, mono-dimensional approaches are only suited for data acquired from very closely-spaced trajectories, resulting in coarse vertical resolution. In the case of continuous media, such as forests, snow, ice sheets and glaciers, achieving fine vertical resolution is only possible in the presence of largely-spaced trajectories, which involves significant complications concerning the formation of 3D images. The situation gets even more complicated in the presence of irregular trajectories with variable headings, for which the only theoretically exact approach consists of going back to the raw SAR data and resolving the targets by 3D back-projection, resulting in a computational burden beyond the capabilities of standard computers. The first aim of this paper is to provide an exhaustive discussion of the conditions under which high-quality tomographic processing can be carried out by assuming a 1D, 2D, or 3D approach to image formation. The case of 3D processing is then further analyzed, and a new processing method is proposed to produce high-quality imaging while largely reducing the computational burden, and without having to process the original raw data. Furthermore, the new method is shown to be easily parallelized and implemented using GPU processing. The analysis is supported by results from numerical simulations as well as from real airborne data from the ESA campaign AlpTomoSAR.
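A minimal sketch of the 1D tomographic focusing discussed above, using plain beamforming on a single range-azimuth pixel stack; the wavelength, geometry and baselines are assumed values for illustration, not parameters of the AlpTomoSAR campaign:

```python
# 1D TomoSAR focusing by beamforming: a stack of co-registered pixels from N passes is
# projected onto steering vectors to estimate the vertical reflectivity profile at one
# range-azimuth location. All acquisition parameters below are illustrative assumptions.
import numpy as np

wavelength, slant_range, incidence = 0.21, 5000.0, np.deg2rad(35.0)   # L-band-like example
baselines = np.linspace(-60, 60, 25)                                  # orthogonal baselines (m)
kz = 4 * np.pi * baselines / (wavelength * slant_range * np.sin(incidence))

# Simulate one pixel stack: two scatterers at 5 m and 22 m with different amplitudes
heights_true = np.array([5.0, 22.0])
amplitudes = np.array([1.0, 0.6])
pixels = (amplitudes * np.exp(1j * np.outer(kz, heights_true))).sum(axis=1)

# Scan candidate heights and correlate the pixel stack with the steering vectors
z_axis = np.linspace(-10, 40, 501)
steering = np.exp(1j * np.outer(kz, z_axis))                          # N x Nz
profile = np.abs(steering.conj().T @ pixels) / len(baselines)
print("estimated peak height (m):", z_axis[np.argmax(profile)])
```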


Geophysics ◽  
2021 ◽  
pp. 1-96
Author(s):  
Alain Bonneville ◽  
Andrew J. Black ◽  
Jennifer L. Hare ◽  
Mark E. Kelley ◽  
Mathew Place ◽  
...  

Three borehole gravity (BHG) surveys were performed in 2013, 2016, and 2018 to monitor the changes in gravity/density as a result of the injection and withdrawal of carbon dioxide (CO2) into and out of the Dover 33 carbonate reservoir reef in Northern Michigan. The observed gravity changes and inferred density changes have been modeled to determine the flow and storage zones of the injected CO2 in the reef. The high quality and low level of uncertainty of the data collected make them useful for delineating the CO2 plume position over time and for identifying the oil sweeping extent and mechanisms in the Dover 33 reef. The time-lapse gravity results indicate the effects of the changing CO2 mass within the reservoir, consistent with increasing mass from 2013 to 2016 (following CO2 injection) and a decreasing mass from 2016 to 2018 (after CO2 withdrawal). Three-dimensional imaging of fluid migrations in the reef has been obtained by coupling the time-lapse BHG results to a 3D porosity and permeability model. This coupled approach allows the evaluation of the volume of the reef affected by the injection of CO2 between 2013 and 2016, the efficiency of the oil sweeping between 2016 and 2018, and the location of the residual CO2 plume in the reef after 2018.
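The textbook borehole-gravity relation that underlies this style of density interpretation links the vertical gravity difference between two stations to the interval density; the sketch below uses that standard formula with placeholder numbers and is not the authors' coupled 3D model:

```python
# Standard borehole-gravity interval-density relation: rho = (F - dg/dz) / (4*pi*G).
# Constants are approximate; the survey numbers below are placeholders, not Dover 33 data.
FREE_AIR = 0.3086          # free-air gradient, mGal per metre
FOUR_PI_G = 0.08385        # 4*pi*G, mGal per metre per (g/cm^3)

def interval_density(delta_g_mgal: float, delta_z_m: float) -> float:
    """Apparent density (g/cm^3) of the interval between two gravity stations in a borehole."""
    return (FREE_AIR - delta_g_mgal / delta_z_m) / FOUR_PI_G

# Time-lapse use: the change in apparent density between two surveys reflects the change
# in fluid content (e.g. CO2 replacing oil or brine) over that depth interval.
rho_2013 = interval_density(delta_g_mgal=1.074, delta_z_m=10.0)
rho_2016 = interval_density(delta_g_mgal=1.090, delta_z_m=10.0)
print(f"apparent density change 2013->2016: {rho_2016 - rho_2013:+.3f} g/cm^3")
```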


Author(s):  
Jerome J. Paulin

Within the past decade it has become apparent that HVEM offers the biologist a means to explore the three-dimensional structure of cells and/or organelles. Stereo-imaging of thick sections (e.g. 0.25–10 μm) not only reveals anatomical features of cellular components, but also reduces errors of interpretation associated with overlap of structures seen in thick sections. Concomitant with stereo-imaging techniques, conventional serial sectioning methods developed for thin sections have been adapted to serial thick sections (≥ 0.25 μm). Three-dimensional reconstructions of the chondriome of several species of trypanosomatid flagellates have been made from tracings of mitochondrial profiles on cellulose acetate sheets. The sheets are flooded with acetone to glue them together, and the model is sawed from the composite and redrawn. The extensive mitochondrial reticulum can be seen in consecutive thick sections (0.25 μm thick) of Crithidia fasciculata (Figs. 1-2). Profiles of the mitochondrion are distinguishable from the anterior apex of the cell (small arrow, Fig. 1) to the posterior pole (small arrow, Fig. 2).


Author(s):  
Neil Rowlands ◽  
Jeff Price ◽  
Michael Kersker ◽  
Seichi Suzuki ◽  
Steve Young ◽  
...  

Three-dimensional (3D) microstructure visualization on the electron microscope requires that the sample be tilted to different positions to collect a series of projections. This tilting should be performed rapidly for on-line stereo viewing and precisely for off-line tomographic reconstruction. Usually a projection series is collected using mechanical stage tilt alone. The stereo pairs must be viewed off-line, and the 60 to 120 tomographic projections must be aligned with fiducial markers or digital correlation methods. The delay in viewing stereo pairs and the alignment problems in tomographic reconstruction could be eliminated or improved by tilting the beam, if such tilt could be accomplished without image translation. A microscope capable of beam tilt with simultaneous image shift to eliminate tilt-induced translation has been investigated for 3D imaging of thick (1 μm) biologic specimens. By tilting the beam above and through the specimen and bringing it back below the specimen, a brightfield image with a projection angle corresponding to the beam tilt angle can be recorded (Fig. 1a).
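For the on-line stereo viewing mentioned above, the depth separation of two features can be recovered from their parallax in a tilt pair using the standard stereo relation; the function and example values below are an illustrative sketch, not taken from the paper:

```python
# Standard stereo-parallax relation for a tilt pair:
# depth separation = parallax / (2 * magnification * sin(tilt half-angle)).
# The numbers in the example call are illustrative placeholders.
import math

def depth_from_parallax(parallax_um: float, magnification: float, half_tilt_deg: float) -> float:
    """Depth separation (specimen micrometres) of two features measured on a tilt stereo pair."""
    return parallax_um / (2.0 * magnification * math.sin(math.radians(half_tilt_deg)))

# Example: 350 um of parallax measured on micrographs at 10,000x taken at +/- 5 degrees
print(f"depth separation: {depth_from_parallax(350.0, 10_000, 5.0):.3f} um")
```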


Author(s):  
S. P. Eron’ko ◽  
M. Yu. Tkachev ◽  
E. V. Oshovskaya ◽  
B. I. Starodubtsev ◽  
S. V. Mechik

Effective application of slag-forming mixtures (SFM) fed into continuous casting machine (CCM) moulds depends on their even distribution over the melt surface. Manual feeding of the SFM, which is still widely used, does not provide this condition, making it necessary to develop systems for mechanized SFM feeding into the moulds of various types of CCM. A concept is presented for designing a system that feeds SFM into CCM moulds at a rate strictly corresponding to the casting speed and forms an even layer of fine material of a given thickness over the whole surface of the liquid steel. The proposed methods for designing mechanized SFM feeding systems are based on three-dimensional computer simulation, with subsequent verification of the adopted technical solutions on field samples. Information is presented on the design features of the developed facilities intended for the continuous supply of finely granulated and powdered mixtures onto the metal mirror in moulds during the production of high-quality billets, blooms and slabs. Variants of mechanical and pneumo-mechanical SFM supply have been elaborated. In the mechanical variant, the fine material is moved from the feeding hopper over an adjustable distance by a rigid, horizontally mounted screw. In the pneumo-mechanical variant, a metered dose of the granular mixture is delivered by a short vertical screw whose lower part is located in a mixing chamber attached beneath the hopper and equipped with an ejector for pneumatic supply of the SFM in a stream of transporting gas. It is proposed to use flexible spiral screws in future mechanical SFM feeding facilities; this will eliminate the restrictions imposed by the lack of free surface for locating the facility in the working zone of the tundish, and will significantly decrease the mass of the movable part and the required power of the carriage drive. The novelty of the proposed technical solutions is protected by three patents. A reduction of 10–15% in the consumption of slag-forming mixtures during the transition from manual to mechanized feeding has been confirmed. The resulting economic effect of the technical development recoups the incurred costs within 8–10 months.
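As a hedged illustration of feeding "at a rate strictly corresponding to the casting speed", the sketch below balances the SFM feed rate against the strand surface created per minute; the specific consumption value and mould dimensions are assumed, not figures from the article:

```python
# Illustrative mass balance for mechanized SFM feeding (assumed values, not from the article):
# the feed rate must replace the mixture consumed as the strand is withdrawn from the mould.
def sfm_feed_rate_kg_min(casting_speed_m_min: float,
                         slab_width_m: float,
                         slab_thickness_m: float,
                         specific_consumption_kg_m2: float = 0.5) -> float:
    """Required SFM feed rate, assuming consumption proportional to the strand surface created."""
    perimeter_m = 2.0 * (slab_width_m + slab_thickness_m)
    return specific_consumption_kg_m2 * perimeter_m * casting_speed_m_min

# Example: a 1.5 m x 0.25 m slab cast at 1.2 m/min
print(f"feed rate: {sfm_feed_rate_kg_min(1.2, 1.5, 0.25):.2f} kg/min")
```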


2020 ◽  
Vol 2020 (1) ◽  
pp. 105-108
Author(s):  
Ali Alsam

Vision is the science that informs us about the biological and evolutionary algorithms that our eyes, optic nerves and brains have chosen over time to see. This article is an attempt to solve the problem of colour-to-grey conversion by borrowing ideas from vision science. We introduce an algorithm that measures contrast along the opponent colour directions and uses the results to collapse the three-dimensional colour space into a single grey channel. The results indicate that the proposed algorithm competes with state-of-the-art algorithms.
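A minimal sketch of the kind of algorithm described, assuming a simple opponent decomposition and a gradient-based contrast measure; it illustrates the idea rather than reproducing the author's exact method:

```python
# Contrast-weighted colour-to-grey conversion along opponent directions (illustrative sketch).
import numpy as np

def colour_to_grey(rgb: np.ndarray) -> np.ndarray:
    """rgb: H x W x 3 array with values in [0, 1]; returns an H x W grey image in [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # Simple opponent decomposition: achromatic, red-green, yellow-blue
    opponents = np.stack([(r + g + b) / 3.0, r - g, (r + g) / 2.0 - b], axis=-1)

    # Contrast of each opponent channel, taken here as the mean gradient magnitude
    weights = []
    for k in range(3):
        gy, gx = np.gradient(opponents[..., k])
        weights.append(np.hypot(gx, gy).mean())
    weights = np.array(weights) / (np.sum(weights) + 1e-12)

    # Weight each opponent channel by its contrast and collapse to a single grey plane
    grey = np.tensordot(opponents, weights, axes=([-1], [0]))
    return (grey - grey.min()) / (np.ptp(grey) + 1e-12)

print(colour_to_grey(np.random.rand(64, 64, 3)).shape)   # (64, 64)
```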


Author(s):  
Halit Dogan ◽  
Md Mahbub Alam ◽  
Navid Asadizanjani ◽  
Sina Shahbazmohamadi ◽  
Domenic Forte ◽  
...  

X-ray tomography is a promising technique that can provide micron-level, three-dimensional (3D) information about the internal structure of an integrated circuit (IC) component without the need for serial sectioning or decapsulation. This is especially useful for counterfeit IC detection, as demonstrated by recent work. Although the components remain physically intact during tomography, the effect of the radiation on their electrical functionality has not yet been fully investigated. In this paper we analyze the impact of X-ray tomography on the reliability of ICs fabricated with different technologies. We perform 3D imaging using an advanced X-ray machine on Intel flash memories, Macronix flash memories, and Xilinx Spartan 3 and Spartan 6 FPGAs. Electrical functionality is then tested in a systematic procedure after each round of tomography to estimate the impact of X-rays on flash erase time, read margin, and program operation, and on the frequencies of ring oscillators in the FPGAs. A major finding is that erase times for flash memories of older technology are significantly degraded when exposed to tomography, eventually resulting in failure. However, the flash memories and Xilinx FPGAs of newer technologies appear less sensitive to tomography, as only minor degradations are observed. Further, we did not identify permanent failures for any chips within the time needed to perform tomography for counterfeit detection (approximately 2 hours).


Author(s):  
Vladislav Sh. Shagapov ◽  
Ismagilyan G. Khusainov ◽  
Emiliya V. Galiakbarova ◽  
Zulfya R. Khakimova

This article studies the relaxation of pressure in a tank with a damaged area of the wall after pressure-testing. Various methods are used to diagnose the technical condition of petroleum-product storage facilities; pressure testing is one of the nondestructive methods. The rate of pressure decrease characterizes the tightness of the system. This article studies the cases of ground and underground location of the tank. Pressure testing involves creating excess pressure inside the tank and observing its decrease over time, from which the integrity of the system can be assessed. This requires mathematical models that account for the filtration of the liquid, depending on the location of the tank. The results include an analytical solution of the problem and formulas describing the dependence of the pressure relaxation time in the tank on the liquid and soil parameters, the geometry of the tank, and the damaged portion of the wall. Two- and three-dimensional cases of liquid filtration were considered for the underground location of the tank. Numerical calculations are presented showing how the pressure-reduction time and the pressure half-life depend on the area of the damaged portion of the wall. The obtained solutions allow the extent of the damaged area to be assessed from pressure testing when the tank, liquid, and soil parameters are known.
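As a hedged sketch of the lumped behaviour described above, the excess pressure in a nearly tight tank with a small damaged area can be taken to relax roughly exponentially, with a characteristic time that bundles the damaged-area size and the liquid/soil filtration properties; the functional form and numbers below are illustrative, not the authors' derived formulas:

```python
# Illustrative exponential pressure relaxation after pressure-testing. The relaxation time
# tau lumps the damaged-area size and the filtration properties; values are placeholders.
import math

def excess_pressure(t_s: float, p0_excess_pa: float, tau_s: float) -> float:
    """Excess pressure (above ambient) at time t, assuming a single relaxation time tau."""
    return p0_excess_pa * math.exp(-t_s / tau_s)

tau = 1800.0                                    # assumed relaxation time, s
half_life = tau * math.log(2.0)                 # time for the excess pressure to halve
print(f"half-life: {half_life / 60:.1f} min, "
      f"excess after 1 h: {excess_pressure(3600, 2.0e5, tau):.0f} Pa")
```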

