A novel approach to an old problem: Analysis of systematic errors in two models of recognition memory

2014 ◽  
Vol 52 ◽  
pp. 51-56 ◽  
Author(s):  
Adam J.O. Dede ◽  
Larry R. Squire ◽  
John T. Wixted
2020 ◽  
Author(s):  
Vincenza Luceri ◽  
Erricos C. Pavlis ◽  
Antonio Basoni ◽  
David Sarrocco ◽  
Magdalena Kuzmicz-Cieslak ◽  
...  

The International Laser Ranging Service (ILRS) Analysis Standing Committee (ASC) plans to complete, by early 2021, the re-analysis of the SLR data from 1983 through the end of this year. This will ensure that the ILRS contribution to ITRF2020 is available to the ITRS by February 2021, as agreed by all space geodetic techniques answering its call. In preparation for the development of this contribution, the ILRS completed the re-analysis of all data (1983 to present), based on improved modeling of the data and a novel approach that ensures the results are free of systematic errors in the underlying data. The new approach was developed after the completion of ITRF2014, with the ILRS ASC devoting almost all of its effort to this task. A Pilot Project initially demonstrated the robust estimation of persistent systematic errors at the millimeter level, leading us to adopt a consistent set of a priori corrections for data collected in past years. The initial reanalysis used these corrections, leading to improved results for the TRF attributes, reflected in the resulting new time series of the TRF origin and scale. The ILRS ASC will now use the new approach in the development of its operational products and as a tool to monitor station performance, extending the history of systematics for each system to be used in future re-analyses. The new operational products form a seamless extension of the re-analysis series, providing a continuous product based on our best knowledge of ground system behavior and performance, without any dependence on a priori knowledge of systematic errors (although information provided by the stations from their own engineering investigations is always welcome and taken into consideration). The presentation will demonstrate the level of improvement with respect to the previous ILRS product series and give a glimpse of what is to be expected from the development of a preliminary version of ITRF2020.
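The core of the approach described above is the estimation of persistent per-station systematic errors (range biases) and their adoption as long-term a priori corrections. As a minimal illustration only, not the ILRS ASC software, the sketch below assumes a hypothetical list of weekly range-bias estimates with formal errors for one station, forms a weighted long-term mean, and subtracts it from new observations; all function and variable names are invented for the example.

```python
import numpy as np

def long_term_bias_mm(weekly_bias_mm, weekly_sigma_mm):
    """Weighted long-term mean range bias for one station (mm).

    Inputs are hypothetical weekly bias estimates and their formal
    errors, as would come out of a re-analysis time series.
    """
    b = np.asarray(weekly_bias_mm, dtype=float)
    w = 1.0 / np.asarray(weekly_sigma_mm, dtype=float) ** 2
    return np.sum(w * b) / np.sum(w)

def apply_a_priori_correction(range_m, bias_mm):
    """Remove the adopted long-term bias from an observed SLR range (m)."""
    return range_m - bias_mm * 1e-3

# Toy example: a station with a persistent ~+7 mm bias.
bias = long_term_bias_mm([6.8, 7.3, 7.1, 6.9], [1.0, 1.2, 0.9, 1.1])
print(f"adopted a priori correction: {bias:.2f} mm")
print(f"corrected range: {apply_a_priori_correction(2_500_000.123456, bias):.6f} m")
```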


2021 ◽  
Author(s):  
Vincenza Luceri ◽  
Erricos C. Pavlis ◽  
Antonio Basoni ◽  
David Sarrocco ◽  
Magdalena Kuzmicz-Cieslak ◽  
...  

The International Laser Ranging Service (ILRS) contribution to ITRF2020 has been prepared after the re-analysis of the data from 1993 to 2020, based on improved modeling of the data and a novel approach that ensures the results are free of systematic errors in the underlying data. This reanalysis incorporates an improved “target signature” model (CoM) that allows better separation of the true systematic error of each tracking system from errors in the model describing the target’s signature. The new approach was developed after the completion of ITRF2014, with the ILRS Analysis Standing Committee (ASC) devoting almost all of its effort to this task. The robust estimation of persistent systematic errors at the millimeter level permitted the adoption of a consistent set of long-term mean corrections for data collected in past years, which are now applied a priori (information provided by the stations from their own engineering investigations is still taken into consideration). The reanalysis used these corrections, leading to improved results for the TRF attributes, reflected in the resulting new time series of the TRF origin and, especially, the scale. Seven official ILRS Analysis Centers computed time series of weekly solutions, according to the guidelines defined by the ILRS ASC. These series were combined by the ILRS Combination Center to obtain the official ILRS product contribution to ITRF2020.

The presentation will provide an overview of the analysis procedures and models, and it will demonstrate the level of improvement with respect to the previous ILRS product series; the stability and consistency of the solution are discussed for the individual AC contributions and the combined SLR time series.
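The abstract notes that weekly solutions from seven Analysis Centers were combined by the ILRS Combination Center. The operational combination works on full solutions with covariance information and alignment steps; as a much-simplified, hypothetical sketch of the underlying idea, the snippet below forms a variance-weighted mean of one coordinate component estimated by several centers for the same week.

```python
import numpy as np

def combine_component(estimates_m, sigmas_m):
    """Variance-weighted combination of one coordinate component.

    Schematic only: the real ILRS combination uses full solutions,
    covariances and alignment, not a per-component weighted mean.
    """
    x = np.asarray(estimates_m, dtype=float)
    w = 1.0 / np.asarray(sigmas_m, dtype=float) ** 2
    combined = np.sum(w * x) / np.sum(w)
    return combined, np.sqrt(1.0 / np.sum(w))

# Toy example: seven AC estimates of a station X coordinate (m).
estimates = [4075580.1231, 4075580.1228, 4075580.1235, 4075580.1230,
             4075580.1233, 4075580.1229, 4075580.1232]
sigmas = [0.0012, 0.0015, 0.0011, 0.0013, 0.0014, 0.0012, 0.0013]
value, sigma = combine_component(estimates, sigmas)
print(f"combined: {value:.4f} m +/- {sigma * 1000:.2f} mm")
```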


Author(s):  
Mohamed Galaleldin ◽  
Francois Bouchard ◽  
Hanan Anis ◽  
Claude Lague

Makerspaces are gaining ground in universities and other educational institutions as a novel approach to boosting creativity and innovation and to providing more opportunities for experiential, hands-on learning. Although multidisciplinary and open in nature, Makerspaces still lack integration into the curricula of engineering schools. With increasingly competitive markets, there is a need to equip future engineers with the skills to be more creative and to compete in today's global market. A two-phase study was developed to examine the integration of the Makerspace concept in engineering schools. The first phase was based on interviews with five North American university Makerspaces that vary in size, objective, business model, and management structure, to identify best Makerspace practices in preparation for the establishment of the University of Ottawa's Richard L'Abbé Makerspace. The second phase was a survey administered to engineering students who have used the Richard L'Abbé Makerspace since its opening in the fall of 2014 to assess its impact on their engineering competencies, in particular design skills, problem analysis, communication and teamwork skills, investigation skills, and entrepreneurial skills. This paper aims to study best practices of Makerspaces on campus and their impact on engineering education and on the development of desired skills and competencies for engineering students.


2021 ◽  
Vol 36 (6) ◽  
pp. 684-696
Author(s):  
Xianqiang Li ◽  
Kedan Mao ◽  
Ao Wang ◽  
Ji Tian ◽  
Wenchuang Zhou

When a high-power very low frequency (VLF) communication system is in operation, the end of the antenna sits in a strong alternating electric field. Due to dielectric loss, an abnormal temperature rise may occur at the end of the antenna. To address this problem, this paper first analyzes the electric field distribution and the temperature-rise effect at the end of the antenna. The factors that affect the electric field distribution and the temperature rise, including the amplitude and frequency of the excitation voltage, the diameter of the antenna conductor, and the material properties of the antenna's outer sheath, are studied in detail. A novel approach to improving the electric field distribution and suppressing the temperature rise is proposed in the form of a dielectric loss eliminator, and the effectiveness of the designed device is verified by simulation.
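The dependence on excitation amplitude, frequency and sheath material studied in the abstract follows directly from the textbook expression for volumetric dielectric heating, P = ω ε0 εr tanδ E². The snippet below evaluates it for illustrative values (not taken from the paper) to show how the heat load driving the temperature rise scales with field strength and frequency.

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def dielectric_heating_w_per_m3(freq_hz, e_rms_v_per_m, eps_r, tan_delta):
    """Volumetric heat generation from dielectric loss:
    P = omega * eps0 * eps_r * tan(delta) * E_rms**2   [W/m^3]
    """
    omega = 2.0 * math.pi * freq_hz
    return omega * EPS0 * eps_r * tan_delta * e_rms_v_per_m ** 2

# Illustrative values only: 20 kHz VLF drive, 1 MV/m local field at the
# antenna end, a sheath with eps_r = 2.3 and tan(delta) = 0.02.
p = dielectric_heating_w_per_m3(20e3, 1e6, 2.3, 0.02)
print(f"{p / 1e3:.1f} kW/m^3")  # heat load that drives the temperature rise
```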


2020 ◽  
Vol 2 (3) ◽  
pp. 1-13
Author(s):  
Aiqing Wang

In this paper, I investigate Chinese neologisms in the field of fandom from a rhetorical perspective. Chinese fans either borrow existing expressions, sometimes Internet neologisms, and employ them in a novel way, or create new expressions. Fandom neologisms may involve conceptual metaphor and conceptual metonymy. Metaphor can be categorised into playful metaphors and visual metaphors, the former of which may concern war, food or sex. Sex-related metaphors in fan neologisms are expressed via euphemism by means of alphabetic words, homophones and altered characters, owing to social taboo and Internet language usage regulation. Fandom neologisms involving metonymy may be accompanied by nominalisation, verbification and hyperbole. Moreover, my observation indicates that Chinese fandom neologisms normally demonstrate semantic opaqueness, which I presume might be correlated with recognition memory. As a subcategory of Internet neologisms generated from networked grassroots communication, fandom neologisms demonstrate an upward direction of transmission, as well as the potential to enter the mainstream lexicon by being cited by traditional media.


2019 ◽  
Vol 42 ◽  
Author(s):  
Olya Hakobyan ◽  
Sen Cheng

We fully support dissociating the subjective experience from the memory contents in recognition memory, as Bastin et al. posit in the target article. However, having two generic memory modules with qualitatively different functions is not mandatory and is in fact inconsistent with experimental evidence. We propose that quantitative differences in the properties of the memory modules can account for the apparent dissociation of recollection and familiarity along anatomical lines.


1978 ◽  
Vol 48 ◽  
pp. 7-29
Author(s):  
T. E. Lutz

This review paper deals with the use of statistical methods to evaluate systematic and random errors associated with trigonometric parallaxes. First, systematic errors which arise when using trigonometric parallaxes to calibrate luminosity systems are discussed. Next, determination of the external errors of parallax measurement is reviewed. Observatory corrections are discussed. Schilt’s point that, because the causes of these systematic differences between observatories are not known, the computed corrections cannot be applied appropriately is emphasized. However, modern parallax work is sufficiently accurate that it is necessary to determine observatory corrections if full use is to be made of the potential precision of the data. To this end, it is suggested that a prior experimental design is required. Past experience has shown that accidental overlap of observing programs will not suffice to determine meaningful observatory corrections.
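As a toy illustration of the kind of designed overlap the review argues for, the sketch below (not from the paper) models each measured parallax as the star's true parallax plus an observatory zero-point, and recovers the per-observatory corrections by least squares once the programs overlap and a zero-sum constraint is imposed.

```python
import numpy as np

# Schematic model:  p_obs = true_parallax(star) + zero_point(observatory) + noise
rng = np.random.default_rng(0)
n_obs, n_star = 3, 6
true_pi = rng.uniform(5, 50, n_star)      # true parallaxes, mas
true_zp = np.array([2.0, -1.0, -1.0])     # observatory zero points, mas

rows, b = [], []
for i in range(n_obs):
    for j in range(n_star):               # full overlap, for simplicity
        row = np.zeros(n_obs + n_star)
        row[i] = 1.0                       # observatory zero-point column
        row[n_obs + j] = 1.0               # star parallax column
        rows.append(row)
        b.append(true_zp[i] + true_pi[j] + rng.normal(0, 0.5))

# Constraint: zero points sum to zero (removes the rank deficiency).
constraint = np.zeros(n_obs + n_star)
constraint[:n_obs] = 1.0
A = np.vstack(rows + [constraint])
b = np.append(b, 0.0)

x, *_ = np.linalg.lstsq(A, b, rcond=None)
print("estimated observatory corrections (mas):", np.round(x[:n_obs], 2))
```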


1988 ◽  
Vol 102 ◽  
pp. 215
Author(s):  
R.M. More ◽  
G.B. Zimmerman ◽  
Z. Zinamon

Autoionization and dielectronic attachment are usually omitted from rate equations for the non–LTE average–atom model, causing systematic errors in predicted ionization states and electronic populations for atoms in hot dense plasmas produced by laser irradiation of solid targets. We formulate a method by which dielectronic recombination can be included in average–atom calculations without conflict with the principle of detailed balance. The essential new feature in this extended average atom model is a treatment of strong correlations of electron populations induced by the dielectronic attachment process.
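For context, the link that detailed balance enforces between dielectronic attachment (capture) and autoionization can be written with the standard Saha-type relation; the snippet below evaluates that textbook relation for made-up level data, purely as an illustration and not as the extended average-atom treatment developed in the paper.

```python
import math

H  = 6.62607015e-34    # Planck constant, J s
ME = 9.1093837015e-31  # electron mass, kg
EV = 1.602176634e-19   # 1 eV in J

def dielectronic_capture_rate(a_auto, g_d, g_i, e_c_ev, t_e_ev):
    """Dielectronic capture rate coefficient (m^3/s) from the
    autoionization rate a_auto (1/s) via detailed balance for a
    Maxwellian plasma at temperature t_e_ev (eV):

        C = (g_d / 2 g_i) * (h^2 / 2 pi m_e kT)^(3/2) * A_a * exp(-E_c / kT)
    """
    kt = t_e_ev * EV
    lam3 = (H ** 2 / (2.0 * math.pi * ME * kt)) ** 1.5
    return (g_d / (2.0 * g_i)) * lam3 * a_auto * math.exp(-e_c_ev * EV / kt)

# Illustrative numbers only (not from the paper).
print(dielectronic_capture_rate(a_auto=1e13, g_d=12, g_i=2,
                                e_c_ev=50.0, t_e_ev=200.0))
```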


Author(s):  
W.J. de Ruijter ◽  
Sharma Renu

Established methods for measurement of lattice spacings and angles of crystalline materials include x-ray diffraction, microdiffraction and HREM imaging. Structural information from HREM images is normally obtained off-line with the traveling table microscope or by the optical diffractogram technique. We present a new method for precise measurement of lattice vectors from HREM images using an on-line computer connected to the electron microscope. It has already been established that an image of crystalline material can be represented by a finite number of sinusoids. The amplitude and phase of these sinusoids are affected by the microscope transfer characteristics, which are strongly influenced by the settings of defocus, astigmatism and beam alignment. However, the frequency of each sinusoid is solely a function of the overall magnification and the periodicities present in the specimen. After proper calibration of the overall magnification, lattice vectors can therefore be measured unambiguously from HREM images.

Measurement of lattice vectors is a statistical parameter estimation problem similar to amplitude, phase and frequency estimation of sinusoids in one-dimensional signals as encountered, for example, in radar, sonar and telecommunications. It is important to properly model the observations, the systematic errors and the non-systematic errors. The observations are modelled as a sum of (two-dimensional) sinusoids. In the present study the components of the frequency vector of the sinusoids are the only parameters of interest. Non-systematic errors in recorded electron images are described as white Gaussian noise. The most important systematic error is geometric distortion.

Lattice vectors are measured using a two-step procedure. First, a coarse search is carried out using a fast Fourier transform on an image section of interest. Prior to Fourier transformation, the image section is multiplied by a window which gradually falls off to zero at the edges. The user interactively indicates the periodicities of interest by selecting spots in the digital diffractogram. A fine search for each selected frequency is then implemented using a bilinear interpolation that depends on the window function. The estimate can be refined even further using non-linear least squares; the first two steps provide proper starting values for the numerical minimization (e.g. Gauss-Newton). This third step improves the precision by 30%, to the highest theoretically attainable (the Cramér-Rao lower bound).

In the present studies we use a Gatan 622 TV camera attached to the JEM 4000EX electron microscope. Image analysis is implemented on a MicroVAX II computer equipped with a powerful array processor and real-time image processing hardware. The typical precision, as defined by the standard deviation of the distribution of measurement errors, is found to be <0.003 Å measured on single-crystal silicon and <0.02 Å measured on small (10-30 Å) specimen areas. These values are about ten times larger than predicted by theory. Furthermore, the measured precision is observed to be independent of the signal-to-noise ratio (determined by the number of averaged TV frames). Evidently, the precision is limited by geometric distortion, mainly caused by the TV camera. For this reason, we are replacing the Gatan 622 TV camera with a modern high-grade CCD-based camera system. Such a system not only has negligible geometric distortion, but also high dynamic range (>10,000) and high resolution (1024x1024 pixels). The geometric distortion of the projector lenses can be measured and corrected through re-sampling of the digitized image.
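The two-step coarse/fine frequency search described above can be illustrated with a short, self-contained re-implementation. This sketch is not the authors' code: it uses a Hann window, takes the strongest diffractogram spot instead of an interactive selection, and substitutes a simple parabolic peak interpolation for the window-dependent bilinear interpolation and the Gauss-Newton refinement.

```python
import numpy as np

def estimate_lattice_frequency(image, peak_index=None):
    """Coarse-then-fine estimate of one 2-D sinusoid frequency
    (cycles per pixel) in an HREM image patch."""
    n0, n1 = image.shape
    # Window that falls off smoothly to zero at the patch edges.
    win = np.outer(np.hanning(n0), np.hanning(n1))
    spec = np.fft.fftshift(np.fft.fft2((image - image.mean()) * win))
    power = np.abs(spec) ** 2

    # Coarse search: strongest reflection (interactive selection in practice).
    if peak_index is None:
        power[n0 // 2, n1 // 2] = 0.0        # suppress any residual DC term
        peak_index = np.unravel_index(np.argmax(power), power.shape)
    i, j = peak_index

    # Fine search: parabolic interpolation of log-power around the peak.
    def offset(p_minus, p_center, p_plus):
        return 0.5 * (p_minus - p_plus) / (p_minus - 2.0 * p_center + p_plus)

    di = offset(*np.log(power[i - 1:i + 2, j]))
    dj = offset(*np.log(power[i, j - 1:j + 2]))

    # Convert FFT bin (plus sub-bin offset) to cycles per pixel.
    return (i + di - n0 // 2) / n0, (j + dj - n1 // 2) / n1

# Synthetic test: one set of lattice fringes plus white Gaussian noise.
rng = np.random.default_rng(1)
y, x = np.mgrid[0:256, 0:256]
img = np.cos(2 * np.pi * (0.061 * y + 0.173 * x)) + rng.normal(0, 0.5, (256, 256))
print(estimate_lattice_frequency(img))  # ~ +/-(0.061, 0.173); sign ambiguity from the real image
```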

