The Structure of Prediction Market Research: Important Publications and Research Clusters

2017 ◽  
Vol 11 (1) ◽  
pp. 51-65
Author(s):  
Frank M. A. Klingert

This research paper identifies the most important publications and research clusters in the field of prediction markets. Two literature reviews, in 2007 and 2014, already documented a rising number of publications and classified them into several classes; however, the a priori selection of classes limited those analyses. Moreover, it had not yet been measured quantitatively which publications have influenced prediction market research most. This paper extends the existing literature with an analysis of more than 18,000 citations, identifying the most important publications and the relevant research topics. It indicates that prediction market research relies primarily on publications within its own field. The paper concludes that some publications have already become “classic” and that four main research clusters have emerged: efficient information aggregation, manipulation, innovation markets, and forecasting elections.
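As a rough illustration of how research clusters can be identified from citation data, the sketch below applies community detection to a small, entirely hypothetical co-citation network; it is not the paper's dataset or exact procedure.

```python
# Minimal sketch of recovering research clusters from citation data via
# co-citation community detection. The edge list is hypothetical.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# (publication_a, publication_b, number of papers citing both)
co_citations = [
    ("P1", "P2", 42), ("P1", "P3", 31), ("P2", "P3", 18),
    ("P4", "P5", 27), ("P4", "P6", 15), ("P5", "P6", 22),
    ("P3", "P4", 2),   # weak link between the two groups
]

G = nx.Graph()
for a, b, w in co_citations:
    G.add_edge(a, b, weight=w)

# Modularity-based community detection groups strongly co-cited publications,
# one standard way to surface research clusters in bibliometrics.
for i, cluster in enumerate(greedy_modularity_communities(G, weight="weight"), 1):
    print(f"Cluster {i}: {sorted(cluster)}")
```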

Author(s):  
Maria A. Milkova

Information now accumulates so rapidly that the familiar notion of iterative search needs revision. In a world oversaturated with information, covering and analyzing a problem comprehensively places high demands on search methods. An innovative approach to search should flexibly account for the large body of already accumulated knowledge and for a priori requirements on the results; the results, in turn, should immediately provide a roadmap of the field under study, with as much detail available as possible. Search based on topic modeling, so-called topic search, satisfies these requirements and thereby streamlines work with information, increases the efficiency of knowledge production, and helps avoid cognitive biases in the perception of information, which matters at both the micro and the macro level. To demonstrate topic search in practice, the article considers the task of analyzing an import substitution program on the basis of patent data. The program includes plans for 22 industries and covers more than 1,500 products and technologies proposed for import substitution. Patent search based on topic modeling makes it possible to query directly with blocks of a priori information, namely the terms of the industrial import substitution plans, and to obtain a selection of relevant documents for each industry. This approach not only gives a comprehensive picture of the effectiveness of the program as a whole but also yields more detailed information about which groups of products and technologies have been patented.
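For illustration, the following sketch shows the general idea of topic search: fit a topic model to a small, hypothetical patent corpus and rank documents by how closely their topic mixture matches a block of a priori query terms. The corpus, query, and model settings are assumptions, not the article's pipeline.

```python
# Minimal sketch of topic-model-based ("topic") search over patent texts,
# assuming a hypothetical in-memory corpus.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

patents = [
    "gas turbine blade cooling channel design",
    "machine tool spindle bearing assembly",
    "pharmaceutical composition for antiviral treatment",
    "catalyst for petrochemical cracking process",
]
query_terms = "turbine blade cooling"  # a priori terms from an industry plan

vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(patents)

lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(X)                               # document-topic mixtures
query_topics = lda.transform(vectorizer.transform([query_terms]))

# Rank patents by similarity of their topic mixture to the query's mixture.
scores = doc_topics @ query_topics.T
ranking = np.argsort(scores.ravel())[::-1]
print([patents[i] for i in ranking])
```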


Author(s):  
Laure Fournier ◽  
Lena Costaridou ◽  
Luc Bidaut ◽  
Nicolas Michoux ◽  
Frederic E. Lecouvet ◽  
...  

Existing quantitative imaging biomarkers (QIBs) are associated with known biological tissue characteristics and follow a well-understood path of technical, biological and clinical validation before incorporation into clinical trials. In radiomics, novel data-driven processes extract numerous visually imperceptible statistical features from the imaging data with no a priori assumptions on their correlation with biological processes. The selection of relevant features (radiomic signature) and incorporation into clinical trials therefore requires additional considerations to ensure meaningful imaging endpoints. Also, the number of radiomic features tested means that power calculations would result in sample sizes impossible to achieve within clinical trials. This article examines how the process of standardising and validating data-driven imaging biomarkers differs from those based on biological associations. Radiomic signatures are best developed initially on datasets that represent diversity of acquisition protocols as well as diversity of disease and of normal findings, rather than within clinical trials with standardised and optimised protocols, as this would risk the selection of radiomic features being linked to the imaging process rather than the pathology. Normalisation through discretisation and feature harmonisation are essential pre-processing steps. Biological correlation may be performed after the technical and clinical validity of a radiomic signature is established, but is not mandatory. Feature selection may be part of discovery within a radiomics-specific trial or represent exploratory endpoints within an established trial; a previously validated radiomic signature may even be used as a primary/secondary endpoint, particularly if associations are demonstrated with specific biological processes and pathways being targeted within clinical trials.
Key Points
• Data-driven processes like radiomics risk false discoveries due to high-dimensionality of the dataset compared to sample size, making adequate diversity of the data, cross-validation and external validation essential to mitigate the risks of spurious associations and overfitting.
• Use of radiomic signatures within clinical trials requires multistep standardisation of image acquisition, image analysis and data mining processes.
• Biological correlation may be established after clinical validation but is not mandatory.
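As a minimal illustration of two steps named above, the sketch below discretises voxel intensities with a fixed bin width and keeps feature selection inside a cross-validation loop; the array shapes, bin width and toy data are assumptions, not the authors' protocol.

```python
# Minimal sketch: fixed-bin-width intensity discretisation and
# cross-validated feature selection on toy radiomic data.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

def discretise(roi, bin_width=25.0):
    """Fixed-bin-width discretisation of voxel intensities in a region."""
    return np.floor(roi / bin_width).astype(int)

rng = np.random.default_rng(0)
roi = rng.normal(loc=100.0, scale=20.0, size=(8, 8))   # toy intensity patch
print("discretised grey levels:", np.unique(discretise(roi)).size)

X = rng.normal(size=(60, 200))   # 60 patients x 200 radiomic features (toy)
y = rng.integers(0, 2, size=60)  # binary outcome (toy)

# Keeping feature selection inside the cross-validation loop avoids the
# optimistic bias (overfitting) that the Key Points warn about.
model = make_pipeline(SelectKBest(f_classif, k=10),
                      LogisticRegression(max_iter=1000))
print("mean CV accuracy:", cross_val_score(model, X, y, cv=5).mean())
```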


1988 ◽  
Vol 32 (17) ◽  
pp. 1179-1182 ◽  
Author(s):  
P. Jay Merkle ◽  
Douglas B. Beaudet ◽  
Robert C. Williges ◽  
David W. Herlong ◽  
Beverly H. Williges

This paper describes a systematic methodology for selecting independent variables to be considered in large-scale research problems. Five specific procedures (brainstorming, prototype interface representation, feasibility/relevance analyses, structured literature reviews, and user subjective ratings) are evaluated and incorporated into an integrated strategy. The methodology is demonstrated in the context of designing the user interface for a telephone-based information inquiry system. The procedure successfully reduced an initial set of 95 independent variables to a subset of 19 factors that warrant subsequent detailed analysis. These results are discussed in terms of a comprehensive sequential research methodology useful for investigating human factors problems.
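A minimal sketch of the screening idea, combining hypothetical feasibility and relevance ratings to prune a candidate variable set, is shown below; the variables, ratings and threshold are invented for illustration only.

```python
# Minimal sketch: prune candidate independent variables by combining
# hypothetical feasibility and relevance ratings.
candidates = {
    "menu depth":        {"feasibility": 5, "relevance": 4},
    "voice prompt rate": {"feasibility": 2, "relevance": 5},
    "timeout length":    {"feasibility": 4, "relevance": 2},
    "keypad layout":     {"feasibility": 5, "relevance": 5},
}

THRESHOLD = 8  # keep variables whose combined rating clears this cut-off

retained = [name for name, r in candidates.items()
            if r["feasibility"] + r["relevance"] >= THRESHOLD]
print(retained)  # ['menu depth', 'keypad layout']
```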


PLoS ONE ◽  
2021 ◽  
Vol 16 (8) ◽  
pp. e0255849
Author(s):  
Can Dai ◽  
Quan Chen ◽  
Tao Wan ◽  
Fan Liu ◽  
Yanbing Gong ◽  
...  

References are employed in most academic research papers to give credit and to reflect scholarliness. Given the upsurge in academic publications in recent decades, we asked how the number of references cited per research article has changed across disciplines over that time. Our results show significant linear growth in reference density in eight disciplinary categories indexed in Web of Science between 1980 and 2019; reference saturation does not yet appear to be in sight. Overall, the general increase in the number of publications and the improved accessibility of the Internet and digitized documents may have promoted the growth in references in certain fields. However, this seemingly runaway tendency should be recognised and objectively assessed. We suggest that authors focus on the research itself rather than on political considerations during writing, especially when selecting which references to cite.
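As an illustration of how such a trend can be quantified, the sketch below fits a linear model to references-per-article values over publication years; the numbers are toy values, not the study's Web of Science data.

```python
# Minimal sketch of estimating a linear trend in reference density.
import numpy as np

years = np.array([1980, 1990, 2000, 2010, 2019])
refs_per_article = np.array([18.0, 24.0, 31.0, 40.0, 48.0])  # toy values

slope, intercept = np.polyfit(years, refs_per_article, deg=1)
print(f"~{slope:.2f} additional references per article per year")
```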


2021 ◽  
Vol 13 (22) ◽  
pp. 4509
Author(s):  
Gaspare Galati ◽  
Gabriele Pavan ◽  
Kubilay Savci ◽  
Christoph Wasserzier

In defense applications, the key features of a radar are its Low Probability of Intercept (LPI) and Low Probability of Exploitation (LPE). Thanks to ongoing technological progress, the counterpart deploys increasingly capable intercept receivers and signal processors. Noise Radar Technology (NRT) is probably a very effective answer to the increasing demand for operational LPI/LPE radars. The design and selection of the radiated waveforms, while respecting the prescribed spectrum occupancy, must reconcile the conflicting requirements of LPI/LPE with a favorable shape of the ambiguity function. Information theory appears to be a “technologically agnostic” tool for quantifying the LPI/LPE capability of noise waveforms with little or no a priori knowledge of the means and strategies used by the counterpart. Such an information-theoretic analysis can lead to practical results in the design and selection of NRT waveforms.
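As a toy illustration of an information-theoretic view of waveform unpredictability, the sketch below compares the spectral entropy of a noise waveform with that of a deterministic chirp; the metric and signals are assumptions for illustration, not the authors' analysis.

```python
# Minimal sketch: spectral entropy of a noise waveform vs. a deterministic
# linear-FM chirp, as a crude proxy for how structured (exploitable) each is.
import numpy as np

rng = np.random.default_rng(1)
n = 4096
t = np.arange(n) / n

noise = rng.standard_normal(n)                      # noise-radar-like waveform (toy)
chirp = np.cos(2 * np.pi * (50 * t + 200 * t**2))   # deterministic LFM chirp (toy)

def spectral_entropy(x):
    """Shannon entropy of the normalised power spectrum, in bits."""
    p = np.abs(np.fft.rfft(x)) ** 2
    p = p / p.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

print("noise:", spectral_entropy(noise))   # higher entropy: spectrum hard to predict
print("chirp:", spectral_entropy(chirp))   # lower entropy: more structure to exploit
```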


2019 ◽  
Vol 4 (1) ◽  
pp. 64-67
Author(s):  
Pavel Kim

One of the fundamental tasks of cluster analysis is partitioning a multidimensional data sample into groups of clusters: objects that are close in the sense of some given measure of similarity. In some problems the number of clusters is set a priori, but more often it must be determined in the course of the clustering itself. With a large number of clusters, especially if the data are “noisy,” the result becomes difficult for experts to analyze, so the number of clusters under consideration is reduced artificially. Formal means of merging “neighboring” clusters are considered, creating the basis for parameterizing the number of significant clusters in the “natural” clustering model [1].
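A minimal sketch of merging “neighboring” clusters with standard hierarchical clustering is given below; the synthetic data and distance thresholds are assumptions, and the paper's formal merging criterion is not reproduced.

```python
# Minimal sketch: reduce the number of clusters shown to an expert by
# cutting a hierarchical clustering at a coarser distance threshold.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)
# Three noisy groups in 2-D; in practice the data are multidimensional.
data = np.vstack([rng.normal(loc=c, scale=0.3, size=(30, 2))
                  for c in ([0, 0], [0.8, 0.8], [5, 5])])

Z = linkage(data, method="average")

fine   = fcluster(Z, t=0.3, criterion="distance")  # low threshold: many small clusters
coarse = fcluster(Z, t=2.0, criterion="distance")  # higher threshold: neighbors merged

print("fine clusters:  ", len(np.unique(fine)))
print("coarse clusters:", len(np.unique(coarse)))
```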


2004 ◽  
Vol 31 (1) ◽  
pp. 27-32 ◽  
Author(s):  
Abdellatif Khamlichi ◽  
Mohammed Bezzazi ◽  
Larbi Elbakkali ◽  
Ali Limam

The effects of geometrical imperfections on the critical load of elastic cylindrical shells under axial compression are studied through analytical modelling. In addition to distributed defects of axisymmetric or asymmetric form, emphasis is put on the more severe case of localized defects satisfying axial symmetry. The Von Kármán-Donnell shell equations were used. The results show that the shell strength at buckling varies considerably with the defect amplitude. These variations are not monotonic in general; they indicate, however, a clear reduction of the shell critical load for certain defects revealed to be the most dangerous. The proposed method does not consider the fully coupled situation that may arise from interactions between several localized defects. It nevertheless facilitates straightforward initialization of closer analyses if such couplings are to be taken into account by special numerical approaches, because it enables fast a priori selection of the most hazardous isolated defects.
Key words: stability, buckling, imperfections, thin shells, silos, localized defects.
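For orientation, the sketch below evaluates the classical critical axial stress of a perfect cylinder and applies an illustrative knockdown factor for an imperfection; the material, geometry and knockdown value are assumptions, not results of the paper's Von Kármán-Donnell analysis.

```python
# Minimal sketch: classical critical axial stress of a perfect cylindrical
# shell, reduced by an illustrative imperfection "knockdown" factor.
import math

E, nu = 210e9, 0.3   # steel-like elastic constants (assumed)
R, h = 2.0, 0.01     # shell radius and wall thickness in metres (assumed)

# Classical result: sigma_cr = E*h / (R*sqrt(3*(1 - nu^2))) ~ 0.605*E*h/R for nu = 0.3
sigma_classical = E * h / (R * math.sqrt(3.0 * (1.0 - nu**2)))

knockdown = 0.4      # illustrative reduction for a severe localized defect
sigma_with_defect = knockdown * sigma_classical

print(f"classical critical stress:  {sigma_classical / 1e6:.1f} MPa")
print(f"with imperfection knockdown: {sigma_with_defect / 1e6:.1f} MPa")
```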

