Random environmental processes for complex computer systems: a theoretical approach

1999 ◽  
Vol 02 (02) ◽  
pp. 117-135
Author(s):  
Nikitas A. Assimakopoulos

In this paper, we consider various computer models of inventory, queueing, and reliability in which complexity due to interacting components of subsystems is apparent. In particular, our analysis focuses on a multi-item inventory model with stochastically dependent demands, a queueing network with dependent arrival and service processes, and a reliability model with stochastically dependent component lifetimes. We discuss cases where this dependence is induced solely by a random environmental process in which the system operates. This process represents the sources of variation that affect all deterministic and stochastic parameters of the model. Thus, not only are the parameters of the model now stochastic processes, but they are all dependent through the common environment to which they are subject. Our objective is to provide a convincing argument that, under fairly reasonable conditions, the analytical techniques used in these models, as well as their solutions, are not much more complicated than those for models with no environmental variation.
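To make the shared-environment dependence concrete, here is a minimal simulation sketch, not the paper's model: a single queue whose arrival rate and service rate are both read from one two-state environment that switches at exponential times. The rate values (ENV_RATES, ENV_SWITCH_RATE) are made-up illustrative numbers; because both rates are driven by the same environment state, the otherwise standard arrival and service processes become dependent in exactly the sense described above.

```python
import random

# Hypothetical two-state environment; each state fixes both the arrival
# rate lambda and the service rate mu. Numbers are illustrative only.
ENV_RATES = {0: (0.5, 1.0), 1: (1.5, 2.0)}  # state -> (lambda, mu)
ENV_SWITCH_RATE = 0.1                        # rate of environment switches

def simulate(horizon=10_000.0, seed=42):
    """Event-driven simulation of a single queue whose arrival and
    service rates are both modulated by one shared environment."""
    rng = random.Random(seed)
    t, env, queue, area = 0.0, 0, 0, 0.0
    while t < horizon:
        lam, mu = ENV_RATES[env]
        rates = [lam, mu if queue > 0 else 0.0, ENV_SWITCH_RATE]
        dt = rng.expovariate(sum(rates))      # time to the next event
        area += queue * min(dt, horizon - t)  # time-weighted queue length
        t += dt
        u = rng.uniform(0.0, sum(rates))
        if u < rates[0]:                      # arrival
            queue += 1
        elif u < rates[0] + rates[1]:         # service completion
            queue -= 1
        else:                                 # environment changes state
            env = 1 - env
    return area / horizon                     # time-average queue length

if __name__ == "__main__":
    print(f"mean queue length: {simulate():.3f}")
```

Conditioned on the environment path, the queue behaves as an ordinary birth-death process, which is the intuition behind the claim that the analysis is not much harder than in the environment-free case.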

2001 ◽  
Vol 50 (7) ◽  
pp. 1221
Author(s):  
Yuan Jian ◽  
Ren Yong ◽  
Liu Feng ◽  
Shan Xiu-Ming

2020 ◽  
Author(s):  
Ruonan Xu

Summary When a sample is drawn from or coincides with a finite population, the uncertainty of the coefficient estimators is often reported as if the population were effectively infinite. The recent literature on finite-population inference instead derives an alternative asymptotic variance of the ordinary least squares estimator. Here, I extend these results to the more general setting of M-estimators and find that the usual robust ‘sandwich’ estimator is conservative. The proposed asymptotic variance of M-estimators accounts for two sources of variation: the usual sampling-based uncertainty, arising from (possibly) not observing the entire population, and design-based uncertainty, resulting from lack of knowledge of the counterfactuals, which the common inference method usually ignores. Under this alternative framework, we obtain smaller standard errors for M-estimators when the population is treated as finite.
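As a back-of-the-envelope companion to this result, the sketch below applies the classical finite-population correction to the standard error of a sample mean. This is the textbook correction, not the paper's M-estimator variance; the population, sample size, and seed are arbitrary illustrative choices.

```python
import numpy as np

def mean_se(sample, pop_size=None):
    """Standard error of the sample mean; if the population size N is
    given, apply the classical finite-population correction (1 - n/N)."""
    n = len(sample)
    var = np.var(sample, ddof=1) / n  # infinite-population variance of the mean
    if pop_size is not None:
        var *= 1.0 - n / pop_size     # shrinks as the sample exhausts the population
    return np.sqrt(var)

rng = np.random.default_rng(0)
population = rng.normal(size=1_000)   # hypothetical finite population
sample = rng.choice(population, size=400, replace=False)

print("usual SE:            ", mean_se(sample))                  # conservative
print("finite-population SE:", mean_se(sample, pop_size=1_000))  # smaller
```

Note one place where the analogy breaks down: this simple correction drives the variance to zero at n = N, whereas in the paper's framework design-based uncertainty about the counterfactuals remains even when the entire population is observed.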


1996 ◽  
Vol 42 (1) ◽  
pp. 96-101 ◽  

Abstract Given the omnipresent cost-containment environment in which clinical chemists now work, they must adapt to a host of changed conditions and new pressures. Much of the onus of adapting falls on the individual, who must assume a different attitude to his or her work. The American Association for Clinical Chemistry can, and should, take a leadership role in developing a new type of laboratory director by working with other professional organizations in the clinical laboratory field to create training and retraining programs for existing clinical laboratory scientists, equipping them for broader scientific and managerial responsibilities than hitherto. AACC needs to develop alliances with its sister organizations so that common issues are addressed collectively rather than competitively. The scope of clinical chemistry must expand into areas beyond traditional clinical chemistry, e.g., microbiology, immunology, certain aspects of hematology (including coagulation), and even aspects of blood banking. The former clinical chemist needs to become a clinical laboratory scientist and promote him- or herself as having cross-disciplinary expertise in analytical techniques and automation, which are the common threads linking all branches of clinical laboratory science.


2007 ◽  
Vol 56 (9) ◽  
pp. 147-155 ◽  
Author(s):  
B. Jiménez

Sludge reuse for agricultural production or soil reclamation is a common practice in several countries, but it entails risks if not properly performed. One such risk is the dissemination of helminthiases. As a consequence, international criteria and national standards set limits on helminth ova content in biosolids. However, little information is available on how to inactivate helminth ova in sludge, particularly when a high content is involved, as is the case in the developing world. Moreover, treatment criteria are based on a limited number of studies dealing with local characteristics that, when applied to the conditions of developing countries, produce poor results. This is because design criteria were developed for Ascaris (one helminth genus) while sludge contains a variety of genera. In addition, much of the information on helminth ova was produced long ago using inaccurate analytical techniques. This paper summarizes research and recent technical information from the literature concerning: (a) the general characteristics of helminth ova; (b) the common helminth ova genera found in sludge; (c) the main removal and inactivation mechanisms; (d) the processes that have proven effective at inactivating helminth ova under practical conditions; and (e) the analytical techniques used to enumerate these pathogens.


2012 ◽  
Vol 446-449 ◽  
pp. 3803-3809
Author(s):  
Hooman Hoornahad ◽  
Eduard A. B. Koenders

The common approach to describing the rheological behavior of a granular-paste material relies on a description of the motion within the framework of continuum mechanics. However, since a granular-paste system cannot be considered a homogeneous continuous fluid, its behavior should not be estimated with common fluid models such as the Bingham or Herschel-Bulkley models. A continuum approach is therefore not the best option for studying the phase effects of a multi-phase material and its corresponding rheological behavior; in this case, analytical techniques based on multi-phase models are required. A more appropriate approach is to describe the granular-paste material with a two-phase model that accounts for the effect on the rheological behavior of gradually decreasing the volume fraction of the paste phase down to zero. In this investigation, a cone test is used to evaluate the rheological behavior of a granular mix, with a discrete element method (DEM) as the basis of the numerical simulation.
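For reference, the two continuum models the authors argue against are simple one-dimensional constitutive laws, evaluated in the sketch below. The yield stress, consistency, and flow-index values are arbitrary illustrative numbers, not fitted to any material.

```python
import numpy as np

def herschel_bulkley(gamma_dot, tau_y, k, n):
    """Shear stress tau = tau_y + k * gamma_dot**n for a flowing
    material (gamma_dot > 0); the Bingham model is the case n = 1."""
    return tau_y + k * np.power(gamma_dot, n)

shear_rates = np.linspace(0.1, 50.0, 5)  # 1/s, illustrative range

tau_bingham = herschel_bulkley(shear_rates, tau_y=20.0, k=1.5, n=1.0)  # Pa
tau_hb = herschel_bulkley(shear_rates, tau_y=20.0, k=3.0, n=0.5)       # Pa, shear-thinning

for g, tb, th in zip(shear_rates, tau_bingham, tau_hb):
    print(f"gamma_dot={g:6.2f}  Bingham={tb:7.2f} Pa  Herschel-Bulkley={th:7.2f} Pa")
```

Both laws treat the mix as a single homogeneous fluid with one stress versus strain-rate curve, which is precisely the assumption the two-phase DEM approach drops.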


1969 ◽  
pp. 89
Author(s):  
E. R. Alexander

In view of proposed reform of the law of occupiers' liability in Alberta, the common law approach to this area of law is examined by way of introduction. Professor Alexander adumbrates the categories of visitors and the duty of care owed to each, within the framework of the modern tort tendency to generalize. An examination in some detail is also made of the judicial techniques used in recent years to evolve the law of occupiers' liability. As reform results from criticism, an examination of the criticisms of the present law is undertaken, specifically of the judicial interpretation of the categories, as well as of the categories themselves, their origin, compass and applicability to modern society. Based on these criticisms, law reform has occurred. With a view to evaluating whether the reform has answered the criticisms of the common law approach, the author examines the actual and proposed reform of England, Scotland, New Zealand, New South Wales, and Alberta. Particular attention is given to the Alberta proposals regarding the common duty of care, the trespasser, the child trespasser and the ability to exclude liability. Concluding that a convincing argument can be advanced for judicial reform in the area of private law, and that stare decisis has no justification in the law of tort, Professor Alexander proposes that, while reform can be valuable as a method of evolution, judicial history evidences that the courts are able to adapt the law to meet changing social needs. The author also concludes that the common law today is preferable to the proposed Alberta reform.


2017 ◽  
Author(s):  
Andysah Putera Utama Siahaan

Common incidents on a computer network arise from weak points in its security. Network forensics is the process of analyzing, recording, and identifying network activity in order to find digital evidence of a computer crime. Since the Internet became a global communication tool, it has also opened gaps through which crime frequently occurs. Network forensics and lawful interception are therefore important tasks for many organizations, including small and medium businesses, enterprises, and the banking and finance industry. Archiving and restoring Internet data in this way can serve as legal evidence in case of a dispute. Government and intelligence agencies use the same technology to protect and defend national security. In general, computer forensics is simply the application of computer investigation and analysis techniques to establish possible legal evidence. There are several ways to detect a crime on a computer network, and the use of several supporting applications improves the success of the network forensic process in these common cases.
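As a concrete, minimal example of the kind of supporting application alluded to here, the sketch below reads a packet capture and tallies the busiest source-destination conversations, a common first pass when reconstructing activity from recorded traffic. The scapy library and the capture.pcap path are assumptions for illustration; the paper does not name specific tools.

```python
from collections import Counter

from scapy.all import IP, rdpcap  # pip install scapy

def top_talkers(pcap_path, limit=10):
    """Count packets per (source, destination) IP pair in a capture --
    a basic first step when hunting for anomalous network activity."""
    pairs = Counter()
    for pkt in rdpcap(pcap_path):  # load recorded traffic from disk
        if IP in pkt:
            pairs[(pkt[IP].src, pkt[IP].dst)] += 1
    return pairs.most_common(limit)

if __name__ == "__main__":
    # "capture.pcap" is a placeholder; point it at real recorded traffic.
    for (src, dst), count in top_talkers("capture.pcap"):
        print(f"{src:>15} -> {dst:<15} {count} packets")
```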


2020 ◽  
Vol 11 (4) ◽  
pp. 262-274
Author(s):  
Dan L. Milici ◽  
Mihaela Paval

Involvement is essential: as a member of a community, you cannot stand aside. Real involvement presupposes, first, a connection at the level of understanding the phenomenon, an empathic relationship with the studied phenomena and processes, a relationship that generates intuitions, leaps in understanding and knowledge, discoveries, revelations. The paper is based on the idea that information and communication technology today generates a global interaction built on a complex computer network, which forms a sphere covering the planet, assimilated to a global intelligence, to which we can consciously adhere, becoming a synapse in a global neural network.


Author(s):  
Robert Hassan

The rise of the network society has often been hailed as the bringer of many positive things, and has been damned in equal measure. This essay discusses the network society in terms of its effects upon the theory and practice of bourgeois and socialist democracy. Through the theoretical prism of social and technologically created time, the essay argues that the network society has created a neoliberal ‘networked time’. This is a logic that functions at the global level and operates at computer-network speeds, sweeping up in its wake not only the polity but the economy and society, too. What the temporal analysis reveals in this process is that ‘networked time’, as a primarily digital form, is unable to synchronise with the temporal rhythms of the forms of democracy that came to us from the age of Enlightenment—a slower time, with slower technologically based social rhythms that stemmed from print and machine culture. This means that the Enlightenment-based politics of bourgeois and socialist democracy, and their future-oriented logics of progress, are no longer tenable in our digital age. Accordingly, the much-neglected passage in the Manifesto of the Communist Party that envisions the ‘common ruin of the contending classes’ is coming to pass—and with it a seriously reduced scope for the resurrection of any form of democratic functioning based on Enlightenment politics and its temporal rhythms.

