Diverse Populations are Conflated with Heterogeneous Collectives

2021 ◽  
Vol 118 (10) ◽  
pp. 525-548
Author(s):  
Ayelet Shavit ◽  
Aaron M. Ellison

The concept of difference has a long and important research tradition. We identify and explicate a heretofore overlooked distinction between two meanings of 'difference': 'diversity' and 'heterogeneity', which differ in both meaning and measurement. We argue that 'diversity' describes a population well enough but does not describe a collective well. In contrast, 'heterogeneity' describes a collective better than it describes a population and therefore ought to be the term applied to collectives. Ignoring this distinction can lead to a surprising and disturbing conflict between diversity and heterogeneity. In particular, focusing on the 'diversity' of human communities can be self-defeating for those who truly care about diversity, inclusion, and belonging.

Complexity ◽  
2020 ◽  
Vol 2020 ◽  
pp. 1-17
Author(s):  
Ying Jin ◽  
Guangming Cui ◽  
Yiwen Zhang

Service-oriented architecture (SOA) is widely used, which has fueled the rapid growth and deployment of an enormous number of Web services over the last decades. Finding the right Web service has therefore become challenging but crucial. However, inspecting every Web service to determine its quality values is infeasible, since doing so would consume too many resources. Developing effective and efficient approaches for predicting the quality values of Web services has thus become an important research issue. In this paper, we propose UIQPCA, a novel approach for hybrid User- and Item-based Quality Prediction with a Covering Algorithm. UIQPCA integrates information about both users and Web services on the basis of users' assessments of the quality of co-invoked Web services. After this integration, users and Web services that are similar to the target user and the target Web service are selected. UIQPCA then predicts how the target user will appraise the target Web service. Extensive experiments on WS-Dream, a widely used real-world Web service dataset, are conducted to evaluate the reliability of UIQPCA. The results show that UIQPCA clearly outperforms previous approaches, including item-based, user-based, hybrid, and cluster-based ones.
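
For orientation, the sketch below illustrates the generic hybrid user- and item-based prediction idea that UIQPCA builds on. It is a minimal collaborative-filtering example, not the authors' covering-algorithm implementation; the QoS matrix layout, Pearson similarity, and the blending weight `alpha` are all assumptions made for illustration.

```python
import numpy as np

def hybrid_qos_prediction(Q, u, i, alpha=0.5):
    """Blend user-based and item-based estimates of the QoS value Q[u, i].

    Q     : users x services matrix of observed quality values (np.nan = missing)
    u, i  : target user index and target Web service index
    alpha : weight of the user-based estimate (illustrative, tuned in practice)
    """
    observed = ~np.isnan(Q)

    def pearson(a, b, mask):
        # Similarity over the entries both vectors have observed.
        common = mask & ~np.isnan(a) & ~np.isnan(b)
        if common.sum() < 2:
            return 0.0
        x, y = a[common], b[common]
        if x.std() == 0 or y.std() == 0:
            return 0.0
        return float(np.corrcoef(x, y)[0, 1])

    # User-based estimate: weight other users' values for service i
    # by their similarity to the target user.
    num = den = 0.0
    for v in range(Q.shape[0]):
        if v != u and observed[v, i]:
            s = pearson(Q[u], Q[v], observed[u] & observed[v])
            if s > 0:
                num += s * Q[v, i]
                den += s
    user_est = num / den if den > 0 else np.nanmean(Q[:, i])

    # Item-based estimate: weight the target user's values for other
    # services by those services' similarity to service i.
    num = den = 0.0
    for j in range(Q.shape[1]):
        if j != i and observed[u, j]:
            s = pearson(Q[:, i], Q[:, j], observed[:, i] & observed[:, j])
            if s > 0:
                num += s * Q[u, j]
                den += s
    item_est = num / den if den > 0 else np.nanmean(Q[u, :])

    return alpha * user_est + (1 - alpha) * item_est
```

The hybrid blend is the point: when the target user has invoked few services, the item-based term carries the estimate, and vice versa.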


Author(s):  
Saeed Q. Al-Khalidi Al-Maliki

For the last five years, the Kingdom of Saudi Arabia has taken a leading role in the Middle East in providing effective electronic government (e-government) services and encouraging their use. The global average for government website usage by citizens is about 30%, and the vast majority of Saudi citizens visit government offices to obtain information rather than carry out transactions through government portals. It is apparent that the rate of global e-government adoption has fallen below expectations, although some countries are doing better than others. Clearly, a better understanding of why and how citizens use government websites, as well as their general disposition towards e-governance, is an important research issue. This paper advances the discussion on this issue by proposing a conceptual model of e-government adoption that places users at the focal point of e-government adoption strategies.


2021 ◽  
Vol 2021 ◽  
pp. 1-12
Author(s):  
Anfernee Joan B. Ng ◽  
Kun-Hong Liu

Speech emotion recognition (SER) is an important research topic. Image-like features such as spectrograms are a common way of extracting information from speech. In the area of image recognition, a relatively novel type of network, the capsule network, has shown promising results. This study uses capsule networks to encode spatial information from spectrograms and analyses their performance when paired with different loss functions. Experiments comparing the capsule network with models from previous works show that it outperforms them.
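
As a minimal sketch of the two ingredients mentioned here, the snippet below extracts a log-mel spectrogram "image" from an utterance and applies the capsule "squash" nonlinearity. The feature parameters, the file name, and the toy pairing of frames with capsules are assumptions; this is not the authors' architecture or their loss functions.

```python
import numpy as np
import librosa

def mel_spectrogram(path, sr=16000, n_mels=64):
    """Load an utterance and convert it to a log-mel spectrogram 'image'."""
    y, sr = librosa.load(path, sr=sr)
    spec = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=n_mels)
    return librosa.power_to_db(spec, ref=np.max)   # shape: (n_mels, frames)

def squash(s, axis=-1, eps=1e-8):
    """Capsule-network squash: rescale vector s so its norm lies in [0, 1)
    while preserving its direction."""
    norm_sq = np.sum(s ** 2, axis=axis, keepdims=True)
    return (norm_sq / (1.0 + norm_sq)) * s / np.sqrt(norm_sq + eps)

# Toy usage: treat each time frame's mel vector as a primary capsule.
# spec = mel_spectrogram("utterance.wav")   # hypothetical file path
# capsules = squash(spec.T)                 # shape: (frames, n_mels)
```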


Author(s):  
G. A. Richardson ◽  
A. Anderton ◽  
R. J. Chalmers ◽  
S. Cochran ◽  
P. Letton ◽  
...  

Synopsis: This paper looks at the contribution made by the Central Institutions sector in general, and The Queen's College, Glasgow in particular, to the teaching of food studies and the development of related research in Scotland. It traces briefly the developments in food studies over the last 100 years in home economics, dietetics and catering and discusses current concepts in food studies in these disciplines. The food studies teaching that now forms an integral part of the work of students on higher diploma, degree and postgraduate courses continues to develop due to the growth of staff research investigations and student project work. An important research tradition is being established in the Central Institutions and of particular note is the impressive number of research projects related to food studies. Course syllabuses are constantly being reviewed and updated in the light of recent developments and changes in society, industry, technology and research. The work at The Queen's College, Glasgow is an example of the developments in food studies at other Central Institutions and illustrates the commitment to the continued development of vocational courses by this sector of education.


Author(s):  
Lihong Chen ◽  
Jianzhuo Yan ◽  
Jianhui Chen ◽  
Ying Sheng ◽  
Zhe Xu ◽  
...  

Abstract: Neuroimaging text mining extracts knowledge from neuroimaging texts and has received widespread attention. Topic learning is an important research focus of neuroimaging text mining. However, current neuroimaging topic learning research mainly uses traditional probabilistic topic models to extract topics from the literature and cannot obtain high-quality neuroimaging topics. Existing topic learning methods also cannot meet the requirements of topic learning on full-text neuroimaging literature. In this paper, three types of neuroimaging research topic events are defined to describe the process and results of neuroimaging research. An event-based topic learning pipeline, called neuroimaging Event-BTM, is proposed to realize knowledge extraction from full-text neuroimaging literature. Experimental results on the PLoS One data set show that the accuracy and completeness of the proposed method are significantly better than those of existing mainstream topic learning methods.


2020 ◽  
Vol 7 (1) ◽  
Author(s):  
Lihong Chen ◽  
Jianzhuo Yan ◽  
Jianhui Chen ◽  
Ying Sheng ◽  
Zhe Xu ◽  
...  

Abstract: Neuroimaging text mining extracts knowledge from neuroimaging texts and has received widespread attention. Topic learning is an important research focus of neuroimaging text mining. However, current neuroimaging topic learning research has mainly used traditional probabilistic topic models to extract topics from the literature and cannot obtain high-quality neuroimaging topics. Existing topic learning methods also cannot meet the requirements of topic learning on full-text neuroimaging literature. In this paper, three types of neuroimaging research topic events are defined to describe the process and results of neuroimaging research. An event-based topic learning pipeline, called neuroimaging Event-BTM, is proposed to realize topic learning from full-text neuroimaging literature. Experimental results on the PLoS One data set show that the accuracy and completeness of the proposed method are significantly better than those of existing mainstream topic learning methods.
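
The "BTM" in Event-BTM refers to the biterm topic model, whose basic statistic is the unordered word pair (biterm) co-occurring in a short text. The sketch below shows plain biterm extraction only; the naive whitespace tokenization and the toy documents are assumptions, and the paper's event-level modelling is not reproduced.

```python
from itertools import combinations
from collections import Counter

def extract_biterms(short_texts):
    """Count unordered word pairs (biterms) co-occurring in each short text,
    the basic statistic a biterm topic model is trained on."""
    counts = Counter()
    for text in short_texts:
        tokens = sorted(set(text.lower().split()))  # naive tokenization (assumption)
        for w1, w2 in combinations(tokens, 2):
            counts[(w1, w2)] += 1
    return counts

# Toy usage on two sentence-sized "documents":
docs = ["fmri study of working memory", "eeg study of memory encoding"]
print(extract_biterms(docs).most_common(3))
```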


2021 ◽  
pp. 014616722098536
Author(s):  
Michael J. Gill ◽  
Stephanie C. Cerce

Blame permeates our social lives. When done properly, blame can facilitate the upholding of moral norms. When done with excessive intensity or harshness, however, blame can have significant negative impacts. Here, we develop and validate a scale—the Blame Intensity Inventory—to measure individual differences in the propensity for intense blame responses. First, we present evidence for its convergent and divergent validity by examining relations with existing scales. In addition, in two studies, we show that the Blame Intensity Inventory—rooted in an affective conception of blame—predicts hostile responses to offenders better than do measures focused on blame-related cognitive appraisals (e.g., free will, intentionality). Finally, in three studies, we show that Blame Intensity uniquely predicts malicious satisfaction, or gratification upon learning that an offender has suffered gratuitous harm. Results are discussed in terms of important research questions that could be addressed using the Blame Intensity Inventory.


1972 ◽  
Vol 1 ◽  
pp. 27-38
Author(s):  
J. Hers

In South Africa the modern outlook towards time may be said to have started in 1948. Both major observatories, the Royal Observatory in Cape Town and the Union Observatory (now known as the Republic Observatory) in Johannesburg, had, of course, been involved in the astronomical determination of time almost from their inception, and the Johannesburg Observatory has been responsible for the official time of South Africa since 1908. However, the pendulum clocks then in use could not be relied on to provide an accuracy better than about 1/10 second, which was of the same order as that of the astronomical observations. It is doubtful if much use was made of even this limited accuracy outside the two observatories, and although there may occasionally have been a demand for more accurate time, it was certainly not voiced.


Author(s):  
J. Frank ◽  
P.-Y. Sizaret ◽  
A. Verschoor ◽  
J. Lamy

The accuracy with which the attachment site of immunolabels bound to macromolecules may be localized in electron microscopic images can be considerably improved by using single particle averaging. The example studied in this work showed that the accuracy may be better than the resolution limit imposed by negative staining (∼2 nm). The structure used for this demonstration was a half-molecule of Limulus polyphemus (LP) hemocyanin, consisting of 24 subunits grouped into four hexamers. The top view of this structure was previously studied by image averaging and correspondence analysis. It was found to vary according to the flip or flop position of the molecule, and to the stain imbalance between diagonally opposed hexamers ("rocking effect"). These findings have recently been incorporated into a model of the full 8 × 6 molecule. LP hemocyanin contains eight different polypeptides, and antibodies specific for one of them, LP II, were used. Uranyl acetate was used as stain. A total of 58 molecule images (29 unlabelled, 29 labelled with anti-LP II Fab) showing the top view were digitized in the microdensitometer with a sampling distance of 50 μm, corresponding to 6.25 nm.
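
For orientation, the quoted sampling distance implies the nominal magnification at which the micrographs were digitized, assuming the 50 μm step is measured on the micrograph and the 6.25 nm on the specimen scale:

$$ M = \frac{50\ \mu\mathrm{m}}{6.25\ \mathrm{nm}} = \frac{50\,000\ \mathrm{nm}}{6.25\ \mathrm{nm}} = 8000\times $$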


Author(s):  
A. V. Crewe

We have become accustomed to differentiating between the scanning microscope and the conventional transmission microscope according to the resolving power which the two instruments offer. The conventional microscope is capable of a point resolution of a few angstroms and line resolutions of periodic objects of about 1Å. On the other hand, the scanning microscope, in its normal form, is not ordinarily capable of a point resolution better than 100Å. Upon examining reasons for the 100Å limitation, it becomes clear that this is based more on tradition than reason, and in particular, it is a condition imposed upon the microscope by adherence to thermal sources of electrons.

