Design and Test of an Extremely Wide Flow Range Compressor

Author(s):  
P. F. Flynn ◽  
H. G. Weber

A fundamental theoretical study of the flow within a compressor wheel suggests that: (a) a necessary (but not sufficient) criterion for the prevention of surge can be obtained from the inviscid solution of the internal compressor flow problem; (b) the use of backward-leaning blades is not the only blade-passage modification with the potential to increase the usable flow range of the compressor; and (c) a usable flow range much broader than previously thought possible can be obtained. One alternative approach suggested by this study has been experimentally tested at compressor pressure ratios in excess of 3.0. The new approach allowed a 50 percent reduction in the surge mass flow at design impeller speed while keeping the inducer geometry identical to that of a conventional radial-bladed impeller used as a comparison standard. This new approach indicates the potential for a priori prediction of the surge flow characteristics of radial turbomachinery. Conversely, hardware may be designed to a prespecified surge-to-choke flow ratio through predefined blade geometry. It appears that the usable flow range for centrifugal compressors could extend down to 15 to 20 percent of the choke flow capability without sacrificing maximum component efficiencies.
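The flow-range claim above can be made concrete with a small sketch. This is not from the paper; the numbers are hypothetical placeholders used only to illustrate how a surge-to-choke flow ratio quantifies usable flow range.

```python
# Illustrative sketch (assumed numbers, not the paper's data): the usable flow
# range of a centrifugal compressor expressed as the surge-to-choke mass flow
# ratio, where a lower ratio means a wider usable range.

def flow_range_ratio(surge_flow: float, choke_flow: float) -> float:
    """Fraction of choke flow at which surge occurs (lower = wider range)."""
    if choke_flow <= 0 or not 0 < surge_flow <= choke_flow:
        raise ValueError("flows must satisfy 0 < surge_flow <= choke_flow")
    return surge_flow / choke_flow

# Hypothetical conventional radial-bladed impeller: surge at 40% of choke flow.
conventional = flow_range_ratio(0.40, 1.0)
# The abstract reports a 50% reduction in surge mass flow at design speed.
new_design = flow_range_ratio(0.40 * 0.5, 1.0)
print(conventional, new_design)  # 0.4 0.2
```

Under these assumed numbers, halving the surge mass flow moves the ratio from 40% to 20% of choke flow, the lower end of the 15 to 20 percent range the abstract mentions.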

Author(s):  
S.J. Selcon ◽  
A.D. Andre ◽  
S.J. Banbury ◽  
C.S. Jordan ◽  
M. Tlauka ◽  
...  

Although the design of the interface in aircraft cockpits has long been recognised as an area requiring the application of Human Factors knowledge, relatively little progress has been made in the development of formal methods to support such a design process. The design cycle has tended to be based around the iterative prototyping and testing of candidate solutions, rather than the application of a priori design rules. Such an approach is sub-optimal, since it relies on the initial candidate designs being broadly correct. The approach can be likened to chamfering the corners of a square wheel, rather than specifying in advance that wheels should be round. It is also a relatively expensive and time-consuming method of design, since the whole process needs to be repeated for each new display, with little transfer from previous displays. This symposium presents an alternative approach to interface design through the generation of a priori design rules that specify the cognitive compatibility (i.e. the compatibility of display representations and organisations with the inherent cognitive abilities and skills of the user) of display solutions. The papers in the symposium describe a number of studies whose results were used to generate empirical design rules relevant to aviation displays. The viability and utility of integrating such rules into a tool to support the design of aircraft interfaces will be discussed.


Author(s):  
José Ferreirós

This book presents a new approach to the epistemology of mathematics by viewing mathematics as a human activity whose knowledge is intimately linked with practice. Charting an exciting new direction in the philosophy of mathematics, the book uses the crucial idea of a continuum to provide an account of the development of mathematical knowledge that reflects the actual experience of doing math and makes sense of the perceived objectivity of mathematical results. Describing a historically oriented, agent-based philosophy of mathematics, the book shows how the mathematical tradition evolved from Euclidean geometry to the real numbers and set-theoretic structures. It argues for the need to take into account a whole web of mathematical and other practices that are learned and linked by agents, and whose interplay acts as a constraint. It demonstrates how advanced mathematics, far from being a priori, is based on hypotheses, in contrast to elementary math, which has strong cognitive and practical roots and therefore enjoys certainty. Offering a wealth of philosophical and historical insights, the book challenges us to rethink some of our most basic assumptions about mathematics, its objectivity, and its relationship to culture and science.


2021 ◽  
pp. 000276422110216
Author(s):  
Kazimierz M. Slomczynski ◽  
Irina Tomescu-Dubrow ◽  
Ilona Wysmulek

This article proposes a new approach to analyzing protest participation measured in surveys of uneven quality. Because single international survey projects cover only a fraction of the world's nations in specific periods, researchers increasingly turn to ex-post harmonization of different survey data sets not a priori designed as comparable. However, very few scholars systematically examine the impact of survey data quality on substantive results. We argue that variation in the source data, especially deviations from the standards of survey documentation, data processing, and computer files proposed by methodologists of Total Survey Error, Survey Quality Monitoring, and Fitness for Intended Use, is important for analyzing protest behavior. In particular, we apply the Survey Data Recycling framework to investigate the extent to which indicators of attending demonstrations and signing petitions in 1,184 national survey projects are associated with measures of data quality, controlling for variability in the questionnaire items. We demonstrate that the null hypothesis of no impact of survey quality measures on indicators of protest participation must be rejected. Measures of survey documentation, data processing, and computer records, taken together, explain over 5% of the intersurvey variance in the proportions of the populations attending demonstrations or signing petitions.
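The kind of variance decomposition described above can be sketched as follows. This is not the authors' code or data: the quality indicators and participation proportions below are synthetic, and the regression simply shows how a share of intersurvey variance attributable to quality measures (an R-squared) would be computed.

```python
# Hedged sketch with synthetic data: regress survey-level protest participation
# proportions on hypothetical quality indicators (documentation, processing,
# computer-records scores) and report the share of variance explained.
import numpy as np

rng = np.random.default_rng(0)
n_surveys = 1184  # number of national survey projects cited in the article

# Hypothetical standardized quality indicators, one row per survey.
quality = rng.normal(size=(n_surveys, 3))
# Simulate proportions where quality explains a small share of the variance.
prop_demonstrations = (0.15 + 0.01 * quality @ np.array([1.0, 0.5, 0.5])
                       + rng.normal(scale=0.05, size=n_surveys))

# Ordinary least squares with an intercept; R^2 is the explained variance share.
X = np.column_stack([np.ones(n_surveys), quality])
beta, *_ = np.linalg.lstsq(X, prop_demonstrations, rcond=None)
resid = prop_demonstrations - X @ beta
r2 = 1 - resid.var() / prop_demonstrations.var()
print(f"share of intersurvey variance explained: {r2:.3f}")
```

With the simulated effect sizes chosen here, the explained share comes out in the vicinity of the "over 5%" figure the abstract reports, but that agreement is by construction, not a replication.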


Author(s):  
Jian Pu ◽  
Zhaoqing Ke ◽  
Jianhua Wang ◽  
Lei Wang ◽  
Hongde You

This paper presents an experimental investigation of the fluid flow within an entire coolant channel of a low-pressure (LP) turbine blade. The serpentine channel, which retains realistic blade geometry, consists of three passes connected by a 180° sharp bend and a semi-round bend, two tip exits, and 25 trailing-edge exits. The mean velocity fields within several typical cross sections were captured using a particle image velocimetry (PIV) system. Pressure and flow rate at each exit were determined through measurements of local static pressure and volume flow rate. To optimize the design of LP turbine blade coolant channels, the effect of the tip ejection ratio (ER) from the 180° sharp bend on the flow characteristics in the coolant channel was experimentally investigated at a series of inlet Reynolds numbers from 25,000 to 50,000. The PIV results exhibit a complex flow pattern that differs from previous investigations conducted with simplified square or rectangular two-pass U-channels. This experimental investigation indicated that: a) in the main flow direction, the regions of separation bubble and flow impingement increase in size as the ER decreases; b) the shape, intensity, and position of the secondary vortices are changed by the ER; c) the mass flow ratio of each exit to the inlet is not sensitive to the inlet Reynolds number; d) with the tip exit at the 180° bend fully open, increasing the ER reduces the mass flow ratio through each trailing-edge exit by about 23–28% relative to the ER = 0 reference; and e) the pressure drop through the entire coolant channel decreases with increasing ER and inlet Reynolds number, with a reduction of about 35–40% in the non-dimensional pressure drop observed at the different inlet Reynolds numbers when the tip exit at the 180° bend is fully open.
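The two quantities reported in findings c) through e) can be sketched numerically. The values below are hypothetical examples, not the measured data, and the non-dimensionalization by inlet dynamic head is an assumption the abstract does not spell out.

```python
# Illustrative sketch (assumed numbers): the exit-to-inlet mass flow ratios and
# a non-dimensional pressure drop of the kind reported for the coolant channel.

def mass_flow_ratios(exit_flows, inlet_flow):
    """Ratio of each exit's mass flow to the channel inlet mass flow."""
    return [q / inlet_flow for q in exit_flows]

def pressure_drop_coefficient(dp, rho, v_inlet):
    """Non-dimensionalize the channel pressure drop by the inlet dynamic head
    0.5 * rho * v_inlet**2 (an assumed convention, not stated in the abstract)."""
    return dp / (0.5 * rho * v_inlet ** 2)

# Hypothetical example: exits grouped as tip, bend, and trailing-edge flows
# summing to the 0.050 kg/s inlet flow.
ratios = mass_flow_ratios([0.004, 0.006, 0.040], 0.050)
k = pressure_drop_coefficient(dp=1800.0, rho=1.2, v_inlet=20.0)
print(ratios, k)
```

Finding c) says the `ratios` list stays essentially unchanged across the 25,000 to 50,000 inlet Reynolds number range, while finding e) says a coefficient like `k` drops by roughly 35–40% as the ER increases with the tip exit fully open.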


2006 ◽  
Vol 6 (7) ◽  
pp. 561-582
Author(s):  
H.P. Yuen ◽  
R. Nair ◽  
E. Corndorf ◽  
G.S. Kanter ◽  
P. Kumar

Lo and Ko have developed some attacks on the cryptosystem called $\alpha\eta$, claiming that these attacks undermine the security of $\alpha\eta$ for both direct encryption and key generation. In this paper, we show that their arguments fail in many different ways. In particular, the first attack in [1] requires channel loss or a length of known-plaintext that is exponential in the key length, and is unrealistic even for moderate key lengths. The second attack is a Grover search attack based on 'asymptotic orthogonality' and was not analyzed quantitatively in [1]. We explain why it is not logically possible to 'pull back' an argument valid only at $n=\infty$ into a limit statement, let alone one valid for a finite number of transmissions $n$. We illustrate this with a 'proof', using a similar asymptotic-orthogonality argument, that coherent-state BB84 is insecure for any value of loss. Even if a limit statement is true, this attack is a priori irrelevant, as it requires an indefinitely large amount of known-plaintext, resources, and processing. We also explain why the attacks in [1] on $\alpha\eta$ as a key-generation system are based on misinterpretations of [2]. Some misunderstandings in [1] regarding certain issues in cryptography and optical communications are also pointed out. Short of providing a security proof for $\alpha\eta$, we provide a description of relevant results in standard cryptography and in the design of $\alpha\eta$ to put the above issues in the proper framework and to elucidate some security features of this new approach to quantum cryptography.


2021 ◽  
Vol 2021 (3-4) ◽  
pp. 25-30
Author(s):  
Kirill Tkachenko

The article proposes a new approach to adjusting the parameters of computing nodes that form part of a data processing system, based on analytical simulation of a queuing system with subsequent estimation of the probabilities of hypotheses about the computing node's state. Methods of analytical modeling of queuing systems and mathematical statistics are used. The result of the study is a mathematical model for assessing the information situation at a computing node, which differs from previously published system models. Estimating the conditional probabilities of hypotheses about whether a computing node is processing data adequately allows a decision to be made on whether the node's parameters need adjustment. This adjustment makes it possible to improve the efficiency of task handling on the computing node of the data processing system. Implementing the proposed model for adjusting the parameters of a computing node increases both the efficiency of the applications running on the node and, more generally, the efficiency of its operation. Applying the approach to all computing nodes of the data processing system increases the dependability of the system as a whole.
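A minimal sketch of this style of decision rule follows, under assumptions the abstract does not specify: each node is modeled as an M/M/1 queue, the hypotheses are three assumed utilization levels, and the observation is a single queue length. The hypothesis names, utilizations, and observed value are all hypothetical.

```python
# Hedged sketch: Bayesian update over hypotheses about a computing node's state,
# using an assumed M/M/1 queue model (not necessarily the article's model).

def mm1_queue_length_pmf(rho: float, n: int) -> float:
    """P(N = n) for a stable M/M/1 queue with utilization rho < 1."""
    return (1 - rho) * rho ** n

# Hypotheses about the node state, each with an assumed utilization.
hypotheses = {"adequate": 0.3, "loaded": 0.6, "overloaded": 0.9}
prior = {h: 1 / 3 for h in hypotheses}

observed_queue_length = 7  # hypothetical observed number of tasks at the node

# Bayes' rule: posterior proportional to prior times likelihood.
likelihood = {h: mm1_queue_length_pmf(rho, observed_queue_length)
              for h, rho in hypotheses.items()}
evidence = sum(prior[h] * likelihood[h] for h in hypotheses)
posterior = {h: prior[h] * likelihood[h] / evidence for h in hypotheses}

# Adjust node parameters when "adequate" is no longer the most probable state.
needs_adjustment = max(posterior, key=posterior.get) != "adequate"
print(posterior, needs_adjustment)
```

With a queue of 7 tasks, the posterior mass shifts toward the high-utilization hypothesis, triggering the parameter adjustment the article describes.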


2013 ◽  
Vol 4 (1) ◽  
pp. 1-17 ◽  
Author(s):  
R. Amala ◽  
R. Vishnu Vardhan

In recent years ROC curve analysis has gained attention in almost all diversified fields. Depending on the data pattern and its distribution, various forms of ROC models have been derived. In this paper, the authors assume that the data of the two populations (healthy and diseased) follow normal distributions, one of the most commonly used assumptions under the parametric approach. The paper focuses on providing an alternative approach to the tradeoff plot of the ROC curve and the computation of the AUC using a special sigmoid-shaped function called the error function. It is assumed that the test scores of a particular biomarker are normally distributed. The work provides a new approach to the construction of the binormal ROC curve that makes use of the error function, which may be called the ErROC curve. The summary measure AUC of the resulting ErROC curve has been estimated and is termed the ErAUC. The authors also derive the expression for obtaining the optimal cut-off point. Unlike the conventional binormal ROC model, the new ErROC curve model provides the true positive rate at every value of the false positive rate.
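The error-function construction can be sketched directly from the binormal assumption stated above. This is a sketch of the standard binormal relations written via `erf`, not the authors' derivation; the biomarker parameters below are hypothetical.

```python
# Sketch of an error-function ("ErROC"-style) ROC construction under the
# binormal assumption: healthy scores ~ N(mu0, sd0), diseased ~ N(mu1, sd1).
import math

def phi(x: float) -> float:
    """Standard normal CDF expressed through the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def erroc_point(c, mu0, sd0, mu1, sd1):
    """(FPR, TPR) at decision threshold c, classifying scores above c as diseased."""
    fpr = 1.0 - phi((c - mu0) / sd0)
    tpr = 1.0 - phi((c - mu1) / sd1)
    return fpr, tpr

def er_auc(mu0, sd0, mu1, sd1):
    """Area under the binormal ROC curve, again via the error function."""
    return phi((mu1 - mu0) / math.sqrt(sd0 ** 2 + sd1 ** 2))

# Hypothetical biomarker: healthy N(0, 1), diseased N(1.5, 1).
fpr, tpr = erroc_point(c=1.0, mu0=0.0, sd0=1.0, mu1=1.5, sd1=1.0)
auc = er_auc(0.0, 1.0, 1.5, 1.0)
print(fpr, tpr, auc)
```

Because `phi` is continuous and strictly increasing, sweeping the threshold `c` yields a true positive rate at every false positive rate in (0, 1), which is the property the abstract emphasizes for the ErROC curve.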

