Software Tools for Engineering Dark Skies

1991 ◽  
Vol 112 ◽  
pp. 71-76
Author(s):  
A. M. Smith ◽  
D. L. Dilaura

Atmospheric scattering of outdoor nighttime electric illumination produces the principal component of background sky luminance that seriously affects ground-based optical astronomy. The sources of this scattering are light emitted skyward directly from luminaires, and light reflected off the ground and other illuminated objects. Careful illumination engineering can thus significantly reduce background sky luminance in two ways: 1) by providing outdoor electric lighting equipment that controls the directions in which light is emitted, and 2) by proper design of outdoor lighting systems that make efficient use of the least amount of light. Recent developments in applied mathematics and computer software have produced computational tools that are being used to design lighting equipment and lighting systems. The software system for luminaire design significantly reduces the cost of this process by eliminating the need for extensive prototyping and provides for inexpensive experimentation with new designs. The system for outdoor lighting calculations permits the design of highly controlled lighting systems that eliminate glare and upward-directed light while providing light appropriate for the visual task. These two software systems are described, along with examples of their use in areas that directly affect astronomical observations.

Author(s):  
P. K. KAPUR ◽  
ANU. G. AGGARWAL ◽  
KANICA KAPOOR ◽  
GURJEET KAUR

The demand for complex and large-scale software systems is increasing rapidly. Therefore, the development of high-quality, reliable, and low-cost computer software has become a critical issue in the enormous worldwide computer technology market. To develop such large and complex software, small independent modules are integrated, and these are tested independently during the module testing phase of software development. In the process, testing resources such as time and testing personnel are consumed, and these resources are not infinitely large. Consequently, it is important for the project manager to allocate these limited resources among the modules optimally during the testing process. Another major concern in software development is cost: it is, in fact, a benefit to management if the cost of the software is lower while customer requirements are still met. In this paper, we investigate an optimal resource allocation problem of minimizing the cost of software testing under a limited amount of available resources, given a reliability constraint. To solve the optimization problem we present a genetic algorithm, which stands as a powerful tool for solving search and optimization problems. The key advantage of using a genetic algorithm in the field of software reliability is its capability to yield optimal results by learning from historical data. A numerical example is discussed to illustrate the applicability of the approach.
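As a hedged illustration of this kind of formulation (the paper's exact cost and reliability models are not reproduced here), the sketch below assumes an exponential reliability-growth model per module, a linear testing cost, and a penalty-based genetic algorithm; all module data and hyperparameters are made-up placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical module data (not from the paper): reliability-growth rates b_i,
# per-unit testing cost c_i, the total resource budget, and a target system reliability.
b = np.array([0.020, 0.015, 0.030, 0.010])   # growth rate per unit of testing effort
c = np.array([1.0, 1.5, 0.8, 2.0])           # cost per unit of testing effort
BUDGET, R_TARGET = 400.0, 0.90

def reliability(x):
    # Exponential reliability-growth model per module; system reliability = series product.
    return np.prod(1.0 - np.exp(-b * x))

def fitness(x):
    # Minimize cost; penalize violations of the budget and reliability constraints.
    cost = np.dot(c, x)
    penalty = 1e4 * max(0.0, x.sum() - BUDGET) + 1e4 * max(0.0, R_TARGET - reliability(x))
    return cost + penalty

def genetic_algorithm(pop_size=60, generations=300, mutation=0.1):
    pop = rng.uniform(0, BUDGET / len(b), size=(pop_size, len(b)))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[: pop_size // 2]]   # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = parents[rng.integers(len(parents), size=2)]
            alpha = rng.random(len(b))
            child = alpha * p1 + (1 - alpha) * p2            # blend crossover
            child += rng.normal(0, mutation * BUDGET / len(b), len(b))  # Gaussian mutation
            children.append(np.clip(child, 0, None))
        pop = np.vstack([parents, children])
    return pop[np.argmin([fitness(ind) for ind in pop])]

x = genetic_algorithm()
print("allocation:", np.round(x, 1), "cost:", round(float(np.dot(c, x)), 1),
      "reliability:", round(float(reliability(x)), 3))
```

A blend crossover with Gaussian mutation is only one simple choice here; practical implementations typically add elitism, tournament selection, and a repair step that projects allocations back onto the budget.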


Author(s):  
G.Y. Fan ◽  
J.M. Cowley

In recent developments, the ASU HB5 has been modified so that the timing, positioning, and scanning of the finely focused electron probe can be entirely controlled by a host computer. This makes possible an asynchronous handshake between the HB5 STEM and the image processing system, which consists of a host computer (PDP 11/34), a DeAnza image processor (IP 5000) interfaced with a low-light-level TV camera, an array processor (AP 400), and various peripheral devices. This greatly facilitates the pattern recognition technique initiated by Monosmith and Cowley. Software called NANHB5 is under development which, instead of employing a set of photodiodes to detect strong spots on a TV screen, uses various software techniques, including on-line fast Fourier transform (FFT), to recognize patterns of greater complexity, taking advantage of the sophistication of our image processing system and the flexibility of computer software.
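The abstract does not spell out which FFT-based technique NANHB5 applies; one common way to locate a known feature or spot pattern in a digitized frame is FFT-based phase correlation, sketched below with NumPy on synthetic data (the function name and test arrays are illustrative assumptions, not the NANHB5 code).

```python
import numpy as np

def phase_correlation(frame, template):
    """Locate `template` inside `frame` via FFT-based phase correlation.

    Both inputs are 2-D arrays of the same shape (pad the template with zeros
    if needed). Returns the (row, col) shift of the best match.
    """
    F = np.fft.fft2(frame)
    T = np.fft.fft2(template)
    cross_power = F * np.conj(T)
    cross_power /= np.abs(cross_power) + 1e-12      # keep only the phase
    correlation = np.fft.ifft2(cross_power).real
    return np.unravel_index(np.argmax(correlation), correlation.shape)

# Illustrative use: a synthetic 256x256 "frame" with a bright spot, and a
# template containing the same spot at the origin.
frame = np.zeros((256, 256))
frame[100:104, 180:184] = 1.0
template = np.zeros((256, 256))
template[0:4, 0:4] = 1.0
print(phase_correlation(frame, template))           # -> approximately (100, 180)
```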


Processes ◽  
2021 ◽  
Vol 9 (2) ◽  
pp. 275
Author(s):  
Chung Yiin Wong ◽  
Kunlanan Kiatkittipong ◽  
Worapon Kiatkittipong ◽  
Seteno K. O. Ntwampe ◽  
Man Kee Lam ◽  
...  

Oftentimes, the employment of entomoremediation to reduce organic wastes encounters a ubiquitous shortcoming, namely ineffectiveness in valorizing recalcitrant organics in the wastes. Considering its cost-favorability, a fermentation process can be employed to facilitate the degradation of biopolymers into smaller organics, easing the subsequent entomoremediation process. However, in the current study, in situ fermentation was found to impede the efficacy of black soldier fly larvae (BSFL) in reducing coconut endosperm waste (CEW). Indeed, by changing to ex situ fermentation, in which the fungus Rhizopus oligosporus was permitted to ferment the CEW prior to larval feeding, the reduction of CEW was significantly enhanced. In this regard, the waste reduction index of CEW by BSFL almost doubled relative to in situ fermentation, even with the inoculation of merely 0.5 wt % of Rhizopus oligosporus. Moreover, a fungal inoculation size of only 0.02 wt % for the ex situ fermentation of CEW could spur BSFL growth by about 50%. Finally, from the statistical correlation study using principal component analysis, the presence of Rhizopus oligosporus in the range of 0.5–1.0 wt % was regarded as optimum for fermenting CEW in ex situ mode prior to its valorization by BSFL.


2021 ◽  
Vol 13 (4) ◽  
pp. 2177
Author(s):  
Edson Kogachi ◽  
Adonias Ferreira ◽  
Carlos Cavalcante ◽  
Marcelo Embiruçu

In order to improve the process management of table grape packaging, its performance should be evaluated. However, the literature on performance evaluation indicators is scarce. To address this research gap, we propose a method for developing performance evaluation indicators for table grape packaging units, which are characterized by labor-intensive and highly seasonal production processes in the agro-economic sector. The stages include the following: contextualizing table grape packaging units, selecting the performance objectives, selecting techniques to be used in developing the indicators, and applying the method to table grape packaging units. The techniques adopted to develop the indicators for the cost, quality, flexibility, reliability, and speed performance objectives were data envelopment analysis, principal component analysis, quantification of the batch, compliance with the program within the established deadline, and measurement of the batch execution time, respectively. The results obtained in the case study demonstrate that the correlations between the performance indicators do not indicate the need to disregard any of them. Furthermore, the standard deviation values of the indicators are similar. Thus, both the correlation and standard deviation results confirm the importance of the indicators chosen for the performance evaluation of table grape packaging.
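The correlation and dispersion checks described in the results can be reproduced in a few lines of pandas; the indicator names and values below are illustrative placeholders, not the study's data.

```python
import pandas as pd

# Hypothetical monthly indicator values for one packaging unit (placeholder data,
# not taken from the study): cost (DEA efficiency), quality (PCA score),
# flexibility (batch quantity), reliability (on-time compliance), and
# speed (batch execution time in hours).
indicators = pd.DataFrame({
    "cost":        [0.82, 0.79, 0.88, 0.91, 0.85],
    "quality":     [0.64, 0.70, 0.61, 0.75, 0.68],
    "flexibility": [120,  135,  110,  142,  128],
    "reliability": [0.95, 0.92, 0.97, 0.90, 0.94],
    "speed":       [3.1,  2.8,  3.4,  2.6,  3.0],
})

# Pairwise correlations: values close to |1| would suggest a redundant indicator
# that could be disregarded.
print(indicators.corr().round(2))

# Dispersion on a common scale: min-max normalize, then compare standard deviations.
normalized = (indicators - indicators.min()) / (indicators.max() - indicators.min())
print(normalized.std().round(2))
```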


2018 ◽  
Vol 27 (07) ◽  
pp. 1860013 ◽  
Author(s):  
Swair Shah ◽  
Baokun He ◽  
Crystal Maung ◽  
Haim Schweitzer

Principal Component Analysis (PCA) is a classical dimensionality reduction technique that computes a low-rank representation of the data. Recent studies have shown how to compute this low-rank representation from most of the data, excluding a small amount of outlier data. We show how to convert this problem into a graph search, and describe an algorithm that solves it optimally by applying a variant of the A* algorithm to search for the outliers. The results obtained by our algorithm are optimal in terms of accuracy, and are shown to be more accurate than those obtained by the current state-of-the-art algorithms, which are shown not to be optimal. This comes at the cost of running time, which is typically slower than the current state of the art. We also describe a related variant of the A* algorithm that runs much faster than the optimal variant and produces a solution that is guaranteed to be near optimal. This variant is shown experimentally to be more accurate than the current state of the art and has a comparable running time.
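The paper's A* formulation and heuristic are not reproduced here; the sketch below only conveys the underlying idea, posed as a search over which k points to exclude so that the remaining data is best captured by a rank-r PCA. A greedy best-first removal stands in for the exact A* search, and the data are synthetic.

```python
import numpy as np

def pca_residual(X, rank):
    """Sum of squared residuals after projecting X onto its top-`rank` principal components."""
    Xc = X - X.mean(axis=0)
    s = np.linalg.svd(Xc, full_matrices=False, compute_uv=False)
    return float(np.sum(s[rank:] ** 2))

def greedy_outlier_removal(X, rank, n_outliers):
    """Greedy stand-in for the search over outlier sets.

    At each step, remove the single point whose exclusion most reduces the
    rank-`rank` PCA residual. (The paper searches this space optimally with a
    variant of A*; greedy removal is a simplified illustration only.)
    """
    keep, removed = list(range(len(X))), []
    for _ in range(n_outliers):
        best = min(keep, key=lambda i: pca_residual(X[[j for j in keep if j != i]], rank))
        keep.remove(best)
        removed.append(best)
    return removed, keep

# Synthetic data: 50 points near a 1-D subspace in 3-D, plus 3 gross outliers.
rng = np.random.default_rng(1)
inliers = np.outer(rng.normal(size=50), [1.0, 2.0, -1.0]) + 0.05 * rng.normal(size=(50, 3))
outliers = 10.0 * rng.normal(size=(3, 3))
X = np.vstack([inliers, outliers])

removed, _ = greedy_outlier_removal(X, rank=1, n_outliers=3)
print(sorted(removed))   # expected to be the indices of the appended outliers: [50, 51, 52]
```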


2015 ◽  
Vol 2015 ◽  
pp. 1-13 ◽  
Author(s):  
Jiuwen Cao ◽  
Zhiping Lin

Extreme learning machine (ELM) has been developed for single-hidden-layer feedforward neural networks (SLFNs). In the ELM algorithm, the connections between the input layer and the hidden neurons are randomly assigned and remain unchanged during the learning process. The output connections are then tuned by minimizing the cost function through a linear system. The computational burden of ELM is thus significantly reduced, as the only cost is solving a linear system. This low computational complexity has attracted a great deal of attention from the research community, especially for high-dimensional and large-data applications. This paper provides an up-to-date survey of recent developments of ELM and its applications to high-dimensional and large data. Comprehensive reviews of image processing, video processing, medical signal processing, and other popular large-data applications with ELM are presented in the paper.
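As a concrete illustration of the training procedure just described, here is a minimal single-hidden-layer ELM for regression in NumPy: the hidden-layer weights are drawn at random and left fixed, and only the output weights are obtained by solving a regularized linear least-squares problem. The data, activation, and hyperparameters are illustrative assumptions, not taken from the survey.

```python
import numpy as np

class ELM:
    """Minimal extreme learning machine for regression (single hidden layer)."""

    def __init__(self, n_hidden=200, reg=1e-3, seed=0):
        self.n_hidden, self.reg, self.rng = n_hidden, reg, np.random.default_rng(seed)

    def _hidden(self, X):
        # Random input weights and biases stay fixed after initialization.
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        # Output weights: solve the regularized linear system (ridge regression).
        A = H.T @ H + self.reg * np.eye(self.n_hidden)
        self.beta = np.linalg.solve(A, H.T @ y)
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta

# Illustrative use on a toy 1-D regression task.
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)
model = ELM(n_hidden=100).fit(X, y)
print(np.mean((model.predict(X) - y) ** 2))   # training MSE, should be small
```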


2021 ◽  
Vol 2021 ◽  
pp. 1-11
Author(s):  
Bangtong Huang ◽  
Hongquan Zhang ◽  
Zihong Chen ◽  
Lingling Li ◽  
Lihua Shi

Deep learning algorithms face limitations in virtual reality applications due to the cost of memory, the cost of computation, and real-time computation requirements. Models with rigorous performance may suffer from enormous numbers of parameters and large-scale structure, making them hard to deploy on embedded devices. In this paper, inspired by GhostNet, we propose an efficient structure, ShuffleGhost, that makes use of the redundancy in feature maps to alleviate the cost of computation, while also tackling some drawbacks of GhostNet. GhostNet suffers from the high computational cost of the convolutions in the Ghost module and the shortcut, and its downsampling restriction makes it difficult to apply the Ghost module and Ghost bottleneck to other backbones; this paper therefore proposes three new kinds of ShuffleGhost structure to tackle these drawbacks. The ShuffleGhost module and ShuffleGhost bottlenecks employ the shuffle layer and group convolution from ShuffleNet, and they are designed to redistribute the feature maps concatenated from the Ghost feature maps and the primary feature maps, eliminating the gap between them while extracting features. Then, an SENet layer is adopted to reduce the computational cost of the group convolution, as well as to evaluate the importance of the concatenated feature maps and assign them proper weights. Experiments show that ShuffleGhostV3 has fewer trainable parameters and FLOPs while preserving accuracy, and that with proper design it can be more efficient on both the GPU and CPU side.
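The paper's exact ShuffleGhost blocks are not reproduced here; the sketch below only combines the two ingredients the abstract builds on, a GhostNet-style module (an ordinary convolution plus cheap depthwise convolutions that generate "ghost" feature maps) and a ShuffleNet-style channel shuffle applied to the concatenated maps. Channel counts and layer choices are illustrative assumptions.

```python
import torch
import torch.nn as nn

def channel_shuffle(x, groups):
    # ShuffleNet-style shuffle: interleave channels across groups.
    b, c, h, w = x.shape
    return x.view(b, groups, c // groups, h, w).transpose(1, 2).reshape(b, c, h, w)

class GhostShuffleModule(nn.Module):
    """Ghost module followed by a channel shuffle (illustrative combination only)."""

    def __init__(self, in_ch, out_ch, ratio=2, groups=2):
        super().__init__()
        primary_ch = out_ch // ratio                       # channels from the "expensive" conv
        ghost_ch = out_ch - primary_ch                     # channels from the cheap conv
        self.primary = nn.Sequential(
            nn.Conv2d(in_ch, primary_ch, 1, bias=False),
            nn.BatchNorm2d(primary_ch), nn.ReLU(inplace=True))
        self.cheap = nn.Sequential(                        # depthwise conv generates ghost maps
            nn.Conv2d(primary_ch, ghost_ch, 3, padding=1, groups=primary_ch, bias=False),
            nn.BatchNorm2d(ghost_ch), nn.ReLU(inplace=True))
        self.groups = groups

    def forward(self, x):
        primary = self.primary(x)
        ghost = self.cheap(primary)
        out = torch.cat([primary, ghost], dim=1)           # primary + ghost feature maps
        return channel_shuffle(out, self.groups)           # redistribute before the next group conv

x = torch.randn(1, 16, 32, 32)
print(GhostShuffleModule(16, 64)(x).shape)                 # -> torch.Size([1, 64, 32, 32])
```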


2021 ◽  
Vol 2083 (3) ◽  
pp. 032095
Author(s):  
Zhimin Ni ◽  
Fan Zhao

Existing service-oriented software is typically single-purpose and oriented toward business processing, and it cannot guarantee that business-processing concerns carry through into the software's development. When operators encounter operational problems, software failures, and other issues related to running the software, software development technicians must provide technical support to keep the software's business-processing functions working. This study aims to move away from dependence on other software and to provide technical support to business software operators accurately and in a timely manner, so as to effectively solve the problems that operators may encounter.


Symmetry ◽  
2021 ◽  
Vol 13 (12) ◽  
pp. 2294
Author(s):  
Hari Mohan Srivastava

Often referred to as special functions or mathematical functions, the origin of many members of the remarkably vast family of higher transcendental functions can be traced back to such widespread areas as (for example) mathematical physics, analytic number theory and applied mathematical sciences. Here, in this survey-cum-expository review article, we aim at presenting a brief introductory overview and survey of some of the recent developments in the theory of several extensively studied higher transcendental functions and their potential applications. For further reading and researching by those who are interested in pursuing this subject, we have chosen to provide references to various useful monographs and textbooks on the theory and applications of higher transcendental functions. Some operators of fractional calculus, which are associated with higher transcendental functions, together with their applications, have also been considered. Many of the higher transcendental functions, especially those of the hypergeometric type, which we have investigated in this survey-cum-expository review article, are known to display a kind of symmetry in the sense that they remain invariant when the order of the numerator parameters or when the order of the denominator parameters is arbitrarily changed.
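The parameter symmetry mentioned at the end can be stated explicitly for the generalized hypergeometric function; the definition and invariance below are standard facts rather than results specific to this article.

```latex
% Generalized hypergeometric series with p numerator and q denominator parameters
{}_pF_q\!\left[\begin{matrix} a_1, \ldots, a_p \\ b_1, \ldots, b_q \end{matrix}; z\right]
  = \sum_{n=0}^{\infty} \frac{(a_1)_n \cdots (a_p)_n}{(b_1)_n \cdots (b_q)_n}\,\frac{z^n}{n!},
\qquad (\lambda)_n = \lambda(\lambda+1)\cdots(\lambda+n-1).

% Symmetry: the series is invariant under any permutation \sigma of the numerator
% parameters and any permutation \tau of the denominator parameters
{}_pF_q\!\left[\begin{matrix} a_{\sigma(1)}, \ldots, a_{\sigma(p)} \\ b_{\tau(1)}, \ldots, b_{\tau(q)} \end{matrix}; z\right]
  = {}_pF_q\!\left[\begin{matrix} a_1, \ldots, a_p \\ b_1, \ldots, b_q \end{matrix}; z\right].
```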

