A networked voting rule for democratic representation

2018 ◽  
Vol 5 (3) ◽  
pp. 172265 ◽  
Author(s):  
Alexis R. Hernández ◽  
Carlos Gracia-Lázaro ◽  
Edgardo Brigatti ◽  
Yamir Moreno

We introduce a general framework for exploring the problem of selecting a committee of representatives, with the aim of studying a networked voting rule based on a decentralized large-scale platform that can ensure strong accountability of those elected. The results of our simulations suggest that this algorithm-based approach can achieve high representativeness for relatively small committees, performing even better than a classical voting rule based on a closed list of candidates. We show that a general relation between committee size and representativeness exists in the form of an inverse square root law, and that the normalized committee size approximately scales with the inverse of the community size, allowing scalability to very large populations. These findings are not strongly influenced by the different networks used to describe the individuals’ interactions, except for the presence of a few individuals with very high connectivity, which can have a marginally negative effect on the committee selection process.
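As a rough illustration of the inverse-square-root relation reported above, the sketch below inverts R(k) ≈ 1 − c/√k to estimate the committee size k needed for a target representativeness R. The constant c is a made-up placeholder, since the abstract does not report one.

```python
import math

# Illustrative sketch of the inverse-square-root law from the abstract:
# representativeness grows with committee size k roughly as R(k) ~ 1 - c/sqrt(k).
# The constant c below is an arbitrary assumption, not a value from the paper.
C = 0.5

def representativeness(k, c=C):
    return 1.0 - c / math.sqrt(k)

def committee_size_for(target_r, c=C):
    # Invert R = 1 - c/sqrt(k)  =>  k = (c / (1 - R))**2
    return math.ceil((c / (1.0 - target_r)) ** 2)

for r in (0.90, 0.95, 0.99):
    print(f"target R = {r}: committee size ~ {committee_size_for(r)}")
```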

Author(s):  
PAPIYA BHOWMIK ◽  
SUJIT KUMAR PATTANAYAK ◽  
SHIBAMAY MITRA

Much research has shown that approximately 70% of all medium- to large-scale industries have some type of quality improvement (QI) program. Drawing on various independent studies, researchers have concluded that only one-fifth of all QI projects show attractive results. The reason for this disappointing outcome is that most QI programs are not result-oriented. The main aim of this paper is to demonstrate the value of using the Theory of Constraints (TOC), so that a result-oriented QI program with better bottom-line impact can be achieved, improving on the traditional cost-based selection process.
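A minimal sketch of the kind of result-oriented scoring TOC implies: rank candidate QI projects by their net effect on system throughput and operating expense rather than by local cost savings alone. The project names and figures below are invented for illustration and are not from the paper.

```python
# Hedged sketch of TOC-style, bottom-line-oriented project selection.
# dT  = weekly throughput gain ($), dOE = weekly operating-expense change ($).
# All figures are fabricated; the paper's actual procedure may differ.
projects = {
    "polish a non-bottleneck step":   {"dT": 0,      "dOE": -2_000},
    "reduce scrap at the constraint": {"dT": 15_000, "dOE": -1_000},
    "faster constraint changeovers":  {"dT": 9_000,  "dOE": 500},
}

def bottom_line_impact(p):
    # Weekly net-profit change = throughput gain minus operating-expense change
    return p["dT"] - p["dOE"]

for name in sorted(projects, key=lambda k: -bottom_line_impact(projects[k])):
    print(f"{name}: {bottom_line_impact(projects[name]):+,} $/week")
```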


2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Marco Alberto De Benedetto ◽  
Elena D’Agostino ◽  
Giuseppe Sobbrio

Abstract We study the effect of the electoral system (single-ballot vs runoff) on the quality of politicians, measured by average educational attainment, at the local level in Italy over the period 1994–2017. By exploiting the discontinuous voting-rule shift at the 15,000-inhabitant population cut-off, we implement a regression discontinuity design (RDD) and find that the change in the electoral scheme leads to an overall downward shift in the educational attainment of local politicians of about 2% relative to the years of schooling of politicians in municipalities just below the cut-off. Findings are similar when we separately focus on the educational attainment of mayors and councilors, and when we use alternative measures of the quality of politicians based on previous occupation and previous political experience. However, other potentially confounding policies also change at the cut-off alongside the voting scheme. We show that the negative effect is not directly related to the way politicians are elected (runoff vs single-ballot scheme) but to the number of lists supporting the mayoral candidates: in municipalities below 15,000 inhabitants, candidates running for mayor are supported by one single list, whereas above the cut-off mayoral candidates may be supported by multiple lists. Overall, we speculate that the negative impact of the treatment on the educational attainment of local politicians is explained by the different candidate selection processes adopted by political parties, rather than by voters’ preferences for low-skilled politicians.
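For readers unfamiliar with a sharp RDD, a minimal sketch: fit separate local linear regressions on each side of the cut-off within a bandwidth and take the difference of the fitted values at the cut-off. The data below are simulated with an assumed effect; nothing here reproduces the paper's estimates.

```python
import numpy as np

# Sharp RDD sketch at a 15,000-inhabitant cutoff on simulated data.
rng = np.random.default_rng(0)
cutoff, h, n = 15_000, 3_000, 2_000
pop = rng.uniform(cutoff - h, cutoff + h, n)           # running variable
treated = (pop >= cutoff).astype(float)
# Simulated outcome: mild trend in population plus an assumed -0.3 effect.
years_school = 12 + 1e-4 * (pop - cutoff) - 0.3 * treated + rng.normal(0, 1, n)

def fitted_at_cutoff(x, y):
    # Local linear fit; return predicted outcome at the cutoff (the intercept).
    X = np.column_stack([np.ones_like(x), x - cutoff])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[0]

left = pop < cutoff
tau = fitted_at_cutoff(pop[~left], years_school[~left]) - \
      fitted_at_cutoff(pop[left], years_school[left])
print(f"RDD estimate at the cutoff: {tau:.3f} years of schooling")
```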


1969 ◽  
Vol 08 (01) ◽  
pp. 07-11 ◽  
Author(s):  
H. B. Newcombe

Methods are described for deriving personal and family histories of birth, marriage, procreation, ill health and death, for large populations, from existing civil registrations of vital events and the routine records of ill health. Computers have been used to group together and "link" the separately derived records pertaining to successive events in the lives of the same individuals and families, rapidly and on a large scale. Most of the records employed are already available as machine-readable punch cards and magnetic tapes, for statistical and administrative purposes, and only minor modifications have been made to the manner in which these are produced. As applied to the population of the Canadian province of British Columbia (currently about 2 million people), these methods have already yielded substantial information on the risks of disease: a) in the population, b) in relation to various parental characteristics, and c) as correlated with previous occurrences in the family histories.
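The abstract does not spell out the linkage rule, but record linkage in this tradition is commonly formalized with log-odds field weights; below is a minimal sketch under that assumption. The agreement (m) and chance-agreement (u) probabilities and the records are invented.

```python
import math

# Probabilistic record-linkage sketch: each field comparison contributes
# log2(m/u) on agreement and log2((1-m)/(1-u)) on disagreement; pairs whose
# total weight exceeds a chosen threshold are "linked". m/u values assumed.
FIELDS = {"surname": (0.95, 0.01), "birth_year": (0.90, 0.05), "birthplace": (0.85, 0.10)}

def match_weight(rec_a, rec_b):
    w = 0.0
    for field, (m, u) in FIELDS.items():
        if rec_a.get(field) == rec_b.get(field):
            w += math.log2(m / u)          # agreement evidence
        else:
            w += math.log2((1 - m) / (1 - u))  # disagreement evidence
    return w

a = {"surname": "SMITH", "birth_year": 1917, "birthplace": "BC"}
b = {"surname": "SMITH", "birth_year": 1917, "birthplace": "ON"}
print(f"weight = {match_weight(a, b):.2f}")  # link if above threshold
```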


2018 ◽  
Vol 16 (1) ◽  
pp. 67-76
Author(s):  
Disyacitta Neolia Firdana ◽  
Trimurtini Trimurtini

This research aimed to determine the validity and effectiveness of big book media for teaching equivalent fractions to fourth-grade students. The research method is Research and Development (R&D). The study was conducted in the fourth grade of SDN Karanganyar 02 Kota Semarang. Data sources were media validation, material validation, learning outcomes, and teacher and student responses to the developed media. A pre-experimental design with a one-group pretest-posttest design was used. The big book developed consists of equivalent-fractions material, student learning activity sheets with rectangle- and circle-shaped pictures, and questions about equivalent fractions. The big book was developed based on student and teacher needs. It achieved a media validity score of 3.75 (very good criteria) and was scored 3 by material experts (good criteria). In the large-scale trial, the students' posttest results showed a learning-outcome completeness of 82.14%. The N-gain calculation yielded 0.55, which indicates the "medium" criterion. The t-test result of 9.6320 > 2.0484 means that the average posttest outcome is better than the average pretest outcome. Based on these data, this study has produced big book media that are valid and effective for learning equivalent fractions in the fourth grade of elementary school.
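For reference, the two statistics quoted above can be computed as follows; the scores here are fabricated, and "medium" follows the common 0.3 ≤ g < 0.7 band for normalized gain.

```python
import numpy as np
from scipy import stats

# Normalized gain <g> = (post - pre) / (max - pre) and a paired t-test on
# pretest vs posttest scores. Scores below are made up for illustration.
pre = np.array([55, 60, 48, 70, 62, 58], dtype=float)
post = np.array([80, 85, 72, 90, 84, 78], dtype=float)
max_score = 100.0

n_gain = (post.mean() - pre.mean()) / (max_score - pre.mean())
t_stat, p_val = stats.ttest_rel(post, pre)
print(f"N-gain = {n_gain:.2f}  (0.3 <= g < 0.7 -> 'medium')")
print(f"t = {t_stat:.4f}, p = {p_val:.4f}")
```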


Author(s):  
Na Li ◽  
Baofeng Jiao ◽  
Lingkun Ran ◽  
Zongting Gao ◽  
Shouting Gao

Abstract We investigated the influence of upstream terrain on the formation of a cold frontal snowband in Northeast China. We conducted numerical sensitivity experiments that gradually removed the upstream terrain and compared the results with a control experiment. Our results indicate a clear negative effect of upstream terrain on the formation of snowbands, especially over large-scale terrain. By thoroughly examining the ingredients necessary for snowfall (instability, lifting and moisture), we found that the release of mid-level conditional instability, followed by the release of low-level or near-surface instabilities (inertial instability, conditional instability or conditional symmetric instability), contributed to the formation of the snowband in both experiments. The lifting required for the release of these instabilities was mainly a result of frontogenetic forcing and upper-level gravity waves. However, the snowband in the control experiment developed later and was weaker than that in the experiment without upstream terrain. Two factors contributed to this negative topographic effect: (1) the mountain gravity waves over the upstream terrain, which perturbed the frontogenetic circulation by rapidly changing the vertical motion and therefore did not favor the release of instabilities in the absence of persistent ascending motion; and (2) the decrease in the supply of moisture as a result of blocking by the upstream terrain, which changed both the moisture and instability structures leeward of the mountains. A conceptual model is presented that shows the effects of the instabilities and lifting on the development of cold frontal snowbands in downstream mountains.


2021 ◽  
Vol 9 (3) ◽  
pp. 264
Author(s):  
Shanti Bhushan ◽  
Oumnia El Fajri ◽  
Graham Hubbard ◽  
Bradley Chambers ◽  
Christopher Kees

This study evaluates the capability of Navier–Stokes solvers in predicting forward and backward plunging breaking, including an assessment of the effects of grid resolution, turbulence model, and volume of fluid (VoF) and coupled level-set volume of fluid (CLSVoF) interface models on the predictions. For this purpose, 2D simulations are performed for four test cases: dam break, solitary wave run-up on a slope, flow over a submerged bump, and solitary wave over a submerged rectangular obstacle. Plunging wave breaking involves a high wave crest, plunger formation and splash-up, followed by a second plunger and chaotic water motions. Coarser grids reasonably predict the wave-breaking features, but finer grids are required for accurate prediction of the splash-up events. However, instabilities are triggered at the air–water interface (primarily for the air flow) on very fine grids, which induces surface peel-off or kinks and roll-up of the plunger tips. Reynolds-averaged Navier–Stokes (RANS) turbulence models result in high eddy viscosity in the air–water region, which decays the fluid momentum and adversely affects the predictions. Both the VoF and CLSVoF methods predict the large-scale plunging-breaking characteristics well; however, they differ in the prediction of the finer details. The CLSVoF solver predicts the splash-up event and secondary plunger better than the VoF solver; however, the latter predicts the plunger shape better than the former for the solitary-wave run-up on a slope case.
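To make the interface-capturing idea concrete, here is a deliberately tiny 1D sketch of advecting a VoF color function with first-order upwinding. It is not the paper's solver; it only illustrates why the sharp air–water interface smears over a few cells without interface-sharpening schemes. All parameters are arbitrary.

```python
import numpy as np

# 1D advection of a volume fraction (color function): water = 1, air = 0.
# First-order upwinding is diffusive, so the initially sharp interface
# spreads over several cells as it moves downstream.
nx, L, u, cfl = 200, 1.0, 1.0, 0.5
dx = L / nx
dt = cfl * dx / u
alpha = np.where(np.linspace(0, L, nx) < 0.3, 1.0, 0.0)  # sharp interface at x=0.3

for _ in range(100):
    flux = u * alpha                               # upwind flux (u > 0)
    alpha[1:] -= dt / dx * (flux[1:] - flux[:-1])  # inflow cell alpha[0] stays 1

smeared = np.sum((alpha > 0.05) & (alpha < 0.95))
print(f"interface thickness after 100 steps: {smeared} cells")
```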


1965 ◽  
Vol 11 (11) ◽  
pp. 1023-1035 ◽  
Author(s):  
Alan Mather ◽  
Angel Assimos

Abstract A simple screening by gas-liquid chromatography (GLC) can provide definitive answers in the detection and identification of a number of volatile substances, including acetone and the common alcohols. After identification, quantitative assay by an internal-reference technic yields highly specific values for ethyl alcohol concentration, with a precision at least equal to (and, for low levels, better than) that of conventional assays. The unique advantage of GLC is its simultaneous quantitative assay of mixtures, some of which cannot be satisfactorily assayed, or even recognized, in any other way. The combination of speed and negligible sample volumes renders the technic valuable for sequential studies on capillary blood samples and, potentially, for mass screening of large populations.
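The internal-reference technic mentioned above is commonly implemented as a response-factor calculation; a minimal sketch under that assumption, with invented peak areas and an assumed internal standard (the paper does not name one here).

```python
# Internal-standard quantitation sketch: analyte concentration is read off
# the ratio of analyte to internal-standard (IS) peak areas, scaled by a
# response factor (RF) from a calibration run. All numbers are illustrative.

def response_factor(area_analyte, conc_analyte, area_is, conc_is):
    # RF from a single calibration standard of known concentrations
    return (area_analyte / area_is) / (conc_analyte / conc_is)

def quantify(area_analyte, area_is, conc_is, rf):
    # Unknown concentration from measured peak areas and the spiked IS level
    return (area_analyte / area_is) * conc_is / rf

rf = response_factor(area_analyte=1500, conc_analyte=0.10,  # 0.10 g/dL ethanol
                     area_is=1000, conc_is=0.05)            # assumed IS level
print(f"ethanol ~ {quantify(1200, 980, 0.05, rf):.4f} g/dL")
```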


2014 ◽  
Vol 952 ◽  
pp. 20-24 ◽  
Author(s):  
Xue Jun Xie

The selection of an optimal material is an important aspect of design for mechanical, electrical, thermal, chemical or other applications. Many factors (attributes) need to be considered in the material selection process, which makes material selection a multi-attribute decision making (MADM) problem. This paper proposes a new MADM method for the material selection problem. Unlike AHP, the G1 method does not need a consistency test of the judgment matrix, which makes it preferable. In this paper, we first use the G1 method to determine the attribute weights. The TOPSIS method is then used to calculate the closeness of the candidate materials to the positive ideal solution. A practical material selection case is used to demonstrate the effectiveness and feasibility of the proposed method.
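The TOPSIS step described above is straightforward to sketch. The decision matrix, weights and benefit/cost flags below are invented (the paper derives its weights with the G1 method).

```python
import numpy as np

# TOPSIS sketch: normalize the decision matrix, apply attribute weights,
# then rank alternatives by closeness to the positive ideal solution.
X = np.array([[250., 6., 2.1],   # rows: candidate materials
              [200., 7., 1.8],   # cols: attributes (e.g. strength, ..., cost)
              [300., 5., 2.5]])
w = np.array([0.5, 0.3, 0.2])            # weights (e.g. from the G1 method)
benefit = np.array([True, True, False])  # higher-is-better flags per attribute

V = w * X / np.linalg.norm(X, axis=0)           # weighted normalized matrix
ideal = np.where(benefit, V.max(0), V.min(0))   # positive ideal solution
anti  = np.where(benefit, V.min(0), V.max(0))   # negative ideal solution
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)
print("closeness:", closeness.round(3), "-> best material:", int(closeness.argmax()))
```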


2012 ◽  
Vol 8 (S291) ◽  
pp. 375-377 ◽  
Author(s):  
Gregory Desvignes ◽  
Ismaël Cognard ◽  
David Champion ◽  
Patrick Lazarus ◽  
Patrice Lespagnol ◽  
...  

Abstract We present an ongoing survey with the Nançay Radio Telescope at L-band. The targeted area is 74° ≲ l < 150° and 3.5° < |b| < 5°. This survey is characterized by a long integration time (18 min), a large bandwidth (512 MHz) and high time and frequency resolution (64 μs and 0.5 MHz), giving a nominal sensitivity limit of 0.055 mJy for long-period pulsars. This is about two times better than the mid-latitude HTRU survey, and the survey is designed to be complementary to current large-scale surveys. It will be more sensitive to transients (RRATs, intermittent pulsars), distant and faint millisecond pulsars, and scintillating sources (or any other kind of radio-faint source) than all previous short-integration surveys.
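Nominal sensitivities like the 0.055 mJy quoted above typically come from the pulsar radiometer equation. The sketch below reproduces the order of magnitude using the abstract's integration time and bandwidth, but the gain, system temperature, S/N threshold and duty cycle are assumptions, not values from the survey.

```python
import math

# Pulsar radiometer equation: minimum detectable mean flux density.
# Only t_int and bw come from the abstract; everything else is assumed.
def s_min_mJy(snr=8.0, beta=1.1, t_sys_K=35.0, gain_K_per_Jy=1.4,
              n_pol=2, t_int_s=18 * 60, bw_Hz=512e6, duty=0.05):
    denom = gain_K_per_Jy * math.sqrt(n_pol * t_int_s * bw_Hz)
    return 1e3 * snr * beta * t_sys_K / denom * math.sqrt(duty / (1 - duty))

print(f"S_min ~ {s_min_mJy():.3f} mJy")  # lands near the quoted 0.055 mJy
```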


2021 ◽  
pp. 1-64
Author(s):  
Gian Paolo Barbetta ◽  
Paolo Canino ◽  
Stefano Cima

Abstract The availability of cheap Wi-Fi internet connections has encouraged schools to adopt Web 2.0 platforms for teaching, with the intention of stimulating students’ academic achievement and participation in school. Moreover, during the recent explosion of the SARS-CoV-2 crisis that forced many countries to close schools (as well as offices and factories), the widespread diffusion of these applications kept school systems going. Despite their widespread use as teaching tools, the effect of adopting Web 2.0 platforms on students’ performance has never been rigorously tested. We fill this gap in the literature by analyzing the impact of using Twitter as a teaching tool on high school students’ literature skills. Based on a large-scale, randomized controlled trial that involved 70 schools and about 1,500 students, we find that using Twitter to teach literature has an overall negative effect on students’ average achievement, reducing standardized test scores by about 25 percent of a standard deviation. The negative effect is stronger on students who usually perform better.
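For readers unfamiliar with effects quoted in standard-deviation units, the sketch below computes such a standardized effect from simulated treatment and control scores; the -0.25 SD shift is only illustrative of the abstract's headline figure, not the study's data.

```python
import numpy as np

# Standardized effect sketch: difference in mean test scores between
# treated (Twitter-taught) and control groups, in pooled-SD units.
rng = np.random.default_rng(1)
control = rng.normal(0.0, 1.0, 750)    # simulated standardized scores
treated = rng.normal(-0.25, 1.0, 750)  # assumed -0.25 SD shift

pooled_sd = np.sqrt((control.var(ddof=1) + treated.var(ddof=1)) / 2)
effect = (treated.mean() - control.mean()) / pooled_sd
print(f"effect ~ {effect:.2f} SD")
```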

