An On-Path Caching Scheme Based on the Expected Number of Copies in Information-Centric Networks

Electronics ◽  
2020 ◽  
Vol 9 (10) ◽  
pp. 1705 ◽
Author(s):  
Yuanhang Li ◽  
Jinlin Wang ◽  
Rui Han

The Information-Centric Network (ICN) is one of the most influential future network architectures, and in-network caching in ICN brings helpful features such as low latency and mobility support. How cache capacity is allocated and content is placed greatly influences the performance of ICN. This paper focuses on the cache allocation problem and the content placement problem under a given cache space budget. First, a lightweight allocation method utilizing information on both topology and content popularity is proposed to allocate cache space and obtain the expected number of copies of popular content, where the expected number of copies is the number of copies of a content placed in the topology. Then, an on-path caching scheme based on the expected number of copies is proposed to handle the content placement problem. In the cache allocation scenario, the lightweight allocation method performs better than other baseline methods. In the content placement scenario, Leave Copy Down (LCD) based on the expected number of copies performs second-best and comes very close to Optimal Content Placement (OCP).
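
The abstract does not spell out the caching logic, but the Leave Copy Down (LCD) scheme it builds on is standard: on a cache hit, the content is copied only one hop closer to the requester rather than to every node on the path. Below is a minimal Python sketch of that idea under illustrative assumptions (LRU eviction, a simple linear path of caches); it is not the authors' expected-number-of-copies algorithm.

```python
from collections import OrderedDict

class LRUCache:
    """Per-node content store with least-recently-used eviction."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()

    def has(self, item):
        if item in self.store:
            self.store.move_to_end(item)      # refresh recency on a hit
            return True
        return False

    def put(self, item):
        if item not in self.store and len(self.store) >= self.capacity:
            self.store.popitem(last=False)    # evict the LRU entry
        self.store[item] = True
        self.store.move_to_end(item)

def request_lcd(path, content):
    """LCD on a path of caches ordered from requester to origin server:
    on a hit at hop i, leave one copy at hop i - 1 (one level down)."""
    for i, node in enumerate(path):
        if node.has(content):
            if i > 0:
                path[i - 1].put(content)
            return i                          # hops travelled to the hit
    path[-1].put(content)                     # total miss: origin serves,
    return len(path)                          # cache at the topmost node

# Toy run: repeated requests pull a popular content toward the requester.
path = [LRUCache(2) for _ in range(3)]
print([request_lcd(path, "video.mp4") for _ in range(4)])   # hop counts shrink
```

Because each hit advances the content by only one hop, LCD naturally filters out one-off requests while popular content migrates toward the network edge, which is why it pairs well with a per-content copy budget.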

1967 ◽  
Vol 4 (2) ◽  
pp. 170-174 ◽  
Author(s):  
Fredrik Esscher

When experience is insufficient to permit a direct empirical determination of the premium rates of a Stop Loss Cover, we have to fall back upon mathematical models from the theory of probability, especially the collective theory of risk, and upon such assumptions as may be considered reasonable.

The paper deals with some problems connected with such calculations of Stop Loss premiums for a portfolio consisting of non-life insurances. The portfolio was so large that the values of the premium rates and other quantities required could be approximated by their limit values, obtained according to theory when the expected number of claims tends to infinity.

The calculations were based on the following assumptions. Let F(x, t) denote the probability that the total amount of claims paid during a given period of time is ≤ x, when the expected number of claims during the same period increases from 0 to t. The net premium Π(x, t) for a Stop Loss reinsurance covering the amount by which the total amount of claims paid during this period may exceed x is defined by the formula

\Pi(x, t) = \int_x^\infty (z - x) \, dF(z, t),

and the variance of the amount (z - x) to be paid on account of the Stop Loss Cover by the formula

\int_x^\infty (z - x)^2 \, dF(z, t) - \Pi(x, t)^2.

As to the distribution function F(x, t), it is assumed that

F(x, t) = \sum_{n=0}^\infty P_n(t) \, V^{n*}(x),

where P_n(t) is the probability that n claims have occurred during the given period, when the expected number of claims increases from 0 to t; V(x) is the distribution function of the claims, giving the conditional probability that the amount of a claim is ≤ x when it is known that a claim has occurred; and V^{n*}(x) is the nth convolution of the function V(x) with itself. V(x) is supposed to be normalized so that the mean = 1.
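
As a numerical companion to these definitions, the net premium Π(x, t) = E[(Z − x)+] and the payout variance can be estimated by simulating the compound distribution directly. The sketch below assumes Poisson claim counts and an exponential claim-size law with mean 1 (matching the normalization of V(x)); both are illustrative choices, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

def stop_loss(x, t, claim_sampler, n_sims=100_000):
    """Monte Carlo estimate of the net Stop Loss premium
    Pi(x, t) = E[(Z - x)+] and of the variance of the payout,
    where Z is the sum of N claims and N ~ Poisson(t)."""
    counts = rng.poisson(t, size=n_sims)                 # claims per period
    totals = np.fromiter((claim_sampler(n).sum() for n in counts),
                         dtype=float, count=n_sims)
    payout = np.maximum(totals - x, 0.0)                 # (z - x)+ per run
    return payout.mean(), payout.var()

# Claim-size law V(x): exponential, normalized so that the mean = 1.
claims = lambda n: rng.exponential(1.0, size=n)

premium, variance = stop_loss(x=120.0, t=100.0, claim_sampler=claims)
print(f"Pi(120, 100) ~= {premium:.3f}, Var ~= {variance:.3f}")
```

The paper works in the regime where t is large enough that such quantities can be replaced by their theoretical limit values; the simulation is only a way to make the two formulas concrete for finite t.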


Sensors ◽  
2019 ◽  
Vol 19 (11) ◽  
pp. 2449 ◽  
Author(s):  
Wenpeng Jing ◽  
Xiangming Wen ◽  
Zhaoming Lu ◽  
Haijun Zhang

Mobile edge caching is regarded as a promising way to reduce the backhaul load of base stations (BSs). However, the capacity of a BS's cache tends to be small, while mobile users' content preferences are diverse. Furthermore, both the locations of users and the user-BS association are uncertain in wireless networks. All of these pose great challenges to content caching and content delivery. This paper studies the joint optimization of the content placement and content delivery schemes in the cache-enabled ultra-dense small-cell network (UDN) with a constrained backhaul link. Considering the differences in decision time-scales, the content placement and content delivery are investigated separately, but their interplay is taken into consideration. Firstly, a content placement problem is formulated in which the uncertainty of the user-BS association is considered. Specifically, unlike existing works, a multi-location request pattern is considered, in which users tend to send content requests from more than one, but a limited number of, locations during a day. Secondly, a user-BS association and wireless resource allocation problem is formulated, with the objective of maximizing users' data rates under the backhaul bandwidth constraint. Due to the non-convex nature of these two problems, problem transformation and variable relaxation are adopted, which convert the original problems into more tractable forms. Then, based on convex optimization methods, a content placement algorithm and a cache-aware user association and resource allocation algorithm are proposed, respectively. Finally, simulation results are given, which validate that the proposed algorithms have clear performance advantages in terms of network utility, cache hit ratio, and quality-of-service guarantees, and are suitable for the cache-enabled UDN with a constrained backhaul link.
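
The paper's convex-relaxation formulation is not reproduced in the abstract, but the core coupling of content popularity with uncertain user-BS association can be illustrated with a much simpler stand-in: weight global popularity by each BS's expected user load and cache the top contents per BS. The greedy sketch below is an illustrative assumption, not the authors' algorithm; all names and parameters are hypothetical.

```python
import numpy as np

def place_contents(pop, assoc_prob, cache_size):
    """Greedy placement: each BS caches the contents with the highest
    expected local demand, where demand weights global popularity by
    the probability that users associate with that BS.

    pop        : (F,) content popularity (e.g., Zipf), sums to 1
    assoc_prob : (U, B) probability that user u associates with BS b
    cache_size : number of contents each BS can hold
    returns    : (B, F) boolean placement matrix
    """
    # Expected requests for file f at BS b, aggregated over users.
    demand = assoc_prob.sum(axis=0)[:, None] * pop[None, :]   # (B, F)
    placement = np.zeros(demand.shape, dtype=bool)
    for b in range(demand.shape[0]):
        top = np.argsort(demand[b])[::-1][:cache_size]
        placement[b, top] = True
    return placement

# Toy example: 4 users, 3 BSs, 10 files with Zipf(0.8) popularity.
rng = np.random.default_rng(1)
ranks = np.arange(1, 11)
pop = ranks**-0.8 / (ranks**-0.8).sum()
assoc = rng.dirichlet(np.ones(3), size=4)       # each user's BS distribution
print(place_contents(pop, assoc, cache_size=3).astype(int))
```

The multi-location pattern in the paper enters precisely through a matrix like `assoc_prob`: a user who requests from several locations contributes probability mass to several BSs instead of one.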


VLSI Design ◽  
1994 ◽  
Vol 2 (3) ◽  
pp. 241-257 ◽  
Author(s):  
Chi-Yu Mao ◽  
Yu Hen Hu

In this paper, we present a Simulated Evolution Gate Matrix layout Algorithm (SEGMA) for synthesizing CMOS random logic modules. The gate-matrix layout problem is solved as a one-dimensional transistor-gate placement problem. Given a placement of all the transistor gates, simulated evolution offers a systematic method to improve the quality of the layout, which is measured by the number of tracks needed for the given netlist. This is accomplished by identifying a subset of gates whose relative placements are deemed "poor quality" according to a heuristic criterion. By rearranging the placement of this identified subset of gates, it is hoped that a gate placement of better quality, meaning fewer tracks, may emerge. Since this method lets the current "generation" of gate placement evolve into a more advanced one, in a way similar to the biological evolution process, it is called simulated evolution. To apply simulated evolution to the gate-matrix layout problem, we propose a novel heuristic criterion, called the randomized quality factor, which facilitates the judicious selection of the subset of poor-quality gates. Several carefully devised and tested strategies are also implemented. Extensive simulation results indicate that SEGMA produces very compact gate-matrix layouts.
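
To make the simulated-evolution loop concrete, here is a generic Python sketch: gates judged poor by a quality score are ripped up with higher probability and reinserted at random, and the new placement survives only if it needs no more tracks. The track counter models each net as one horizontal interval from its leftmost to its rightmost gate; the flat placeholder quality function merely stands in for the paper's randomized quality factor.

```python
import random

def track_count(order, nets):
    """Tracks needed: the maximum number of net intervals that cross any
    gate slot (each net is modeled as one contiguous horizontal wire)."""
    pos = {g: i for i, g in enumerate(order)}
    spans = [(min(pos[g] for g in net), max(pos[g] for g in net))
             for net in nets]
    return max(sum(lo <= i <= hi for lo, hi in spans)
               for i in range(len(order)))

def simulated_evolution(order, cost, quality, steps=500, seed=0):
    """Generic simulated-evolution loop: rip up low-quality gates,
    reinsert them at random slots, keep non-worsening placements."""
    rng = random.Random(seed)
    best, best_cost = order[:], cost(order)
    for _ in range(steps):
        trial = best[:]
        ripped = [g for g in trial if rng.random() > quality(g, trial)]
        for g in ripped:
            trial.remove(g)                               # rip-up phase
        for g in ripped:
            trial.insert(rng.randrange(len(trial) + 1), g)  # reallocation
        if (c := cost(trial)) <= best_cost:
            best, best_cost = trial, c    # evolve to the next generation
    return best, best_cost

# Toy netlist: 3 nets over 4 gates; quality is a flat placeholder score.
nets = [{"a", "b"}, {"b", "c", "d"}, {"a", "d"}]
best, tracks = simulated_evolution(
    ["a", "b", "c", "d"], lambda o: track_count(o, nets),
    quality=lambda g, o: 0.5, steps=200)
print(best, tracks)
```

SEGMA's contribution lies in how `quality` is computed: randomizing the quality factor keeps the selection from repeatedly ripping up the same gates and helps the search escape local minima.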


Sensors ◽  
2021 ◽  
Vol 21 (19) ◽  
pp. 6679 ◽
Author(s):  
Isack Thomas Nicholaus ◽  
Jun Ryeol Park ◽  
Kyuil Jung ◽  
Jun Seoung Lee ◽  
Dae-Ki Kang

Anomaly detection is one of the crucial tasks in daily infrastructure operations, as it can prevent massive damage to devices or resources, which may then lead to catastrophic outcomes. To address this challenge, we propose an automated solution to detect anomalous patterns in water levels and report the analysis and the time/point(s) of abnormality. This research is motivated by how difficult and time-consuming it is to manage the facilities responsible for controlling water levels, owing to the rare occurrence of abnormal patterns. Consequently, we employed a deep autoencoder, one type of artificial neural network architecture, to learn different patterns from the given sequences of data points and reconstruct them. We then use the patterns reconstructed by the deep autoencoder, together with a threshold, to separate abnormal patterns from normal ones. We used a stream of time-series data collected from sensors to train the model and then evaluate it, ready for deployment as the anomaly detection system framework. We ran extensive experiments on sensor data from water tanks. Our analysis shows why we conclude that a vanilla deep autoencoder is the most effective solution in this scenario.
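
The reconstruction-error idea can be sketched in a few dozen lines: train a small dense autoencoder on windows of normal water-level readings, then flag test windows whose reconstruction error exceeds a threshold derived from the training error. Everything below (window length, layer sizes, the mean-plus-3-sigma rule) is an illustrative assumption, not the authors' configuration.

```python
import numpy as np
import torch
from torch import nn

def windows(series, width=32):
    """Slice a univariate series into overlapping fixed-width windows."""
    return np.stack([series[i:i + width]
                     for i in range(len(series) - width)])

class AutoEncoder(nn.Module):
    """Vanilla deep autoencoder: compress each window, then reconstruct it."""
    def __init__(self, width=32, code=4):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(width, 16), nn.ReLU(),
                                 nn.Linear(16, code))
        self.dec = nn.Sequential(nn.Linear(code, 16), nn.ReLU(),
                                 nn.Linear(16, width))

    def forward(self, x):
        return self.dec(self.enc(x))

def detect(train, test, epochs=300):
    """Train on (mostly) normal windows; flag test windows whose
    reconstruction error exceeds mean + 3 std of the training error."""
    model = AutoEncoder(train.shape[1])
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    x = torch.tensor(train, dtype=torch.float32)
    for _ in range(epochs):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(x), x)
        loss.backward()
        opt.step()
    with torch.no_grad():
        base = ((model(x) - x) ** 2).mean(dim=1).numpy()
        y = torch.tensor(test, dtype=torch.float32)
        err = ((model(y) - y) ** 2).mean(dim=1).numpy()
    return err > base.mean() + 3 * base.std()

# Toy run: a sine-like level signal with one injected spike.
t = np.linspace(0, 50, 2000)
level = np.sin(t) + 0.05 * np.random.randn(t.size)
level[1500:1510] += 3.0                           # the anomaly to find
W = windows(level)
flags = detect(W[:1000], W[1000:])
print("anomalous windows start near:", np.flatnonzero(flags)[:5] + 1000)
```

Because abnormal patterns are rare, the model only ever learns to reconstruct normal behavior, so an unusually large reconstruction error is itself the anomaly signal.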


2021 ◽  
Vol 2021 ◽  
pp. 1-17 ◽
Author(s):  
Jiajie Ren ◽  
Demin Li ◽  
Lei Zhang ◽  
Guanglin Zhang

Content-centric networks (CCNs) have become a promising technology for relieving increasing wireless traffic demands. In this paper, we explore the scaling performance of mobile content-centric networks based on a nonuniform spatial distribution of nodes, where each node moves around its own home point and requests desired content according to a Zipf distribution. We assume each mobile node is equipped with a finite local cache, which caches contents following a static cache allocation scheme. According to the nonuniform spatial distribution of cache-enabled nodes, we introduce two kinds of clustered models, i.e., the clustered grid model and the clustered random model. In each clustered model, we analyze throughput and delay performance as the number of nodes goes to infinity, by means of the proposed cell-partition scheduling scheme and the distributed multihop routing scheme. We show that the degree of node mobility and the clustering behavior play fundamental roles in the aforementioned asymptotic performance. Finally, we study the optimal cache allocation problem in the two kinds of clustered models. Our findings provide guidance for developing the optimal caching scheme. We further perform numerical simulations to validate the theoretical scaling laws.
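
As a small companion to the model, the Zipf request law and one possible static cache allocation can be written down directly. The proportional allocation below is only an illustrative baseline for a "static cache allocation scheme"; it is not the optimal allocation derived in the paper.

```python
import numpy as np

def zipf_popularity(num_contents, alpha):
    """Zipf law: the rank-k content is requested with probability
    proportional to k**(-alpha)."""
    ranks = np.arange(1, num_contents + 1)
    weights = ranks ** -alpha
    return weights / weights.sum()

def static_allocation(pop, total_cache):
    """Baseline static scheme: cache copies in proportion to popularity,
    handing any leftover slots to the most popular contents."""
    copies = np.floor(pop * total_cache).astype(int)
    leftover = total_cache - copies.sum()
    copies[np.argsort(pop)[::-1][:leftover]] += 1
    return copies

pop = zipf_popularity(num_contents=100, alpha=0.9)
print(static_allocation(pop, total_cache=500)[:10])  # copies of the 10 hottest
```

In the paper's setting, the interesting question is how such a copy profile should be reshaped by the clustering of home points, since a copy inside a dense cluster serves far more requests than one on the periphery.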


1990 ◽  
Vol 27 (2) ◽  
pp. 351-364 ◽  
Author(s):  
Rhonda Righter

In the classical sequential assignment problem as introduced by Derman et al. (1972) there are n workers who are to be assigned a finite number of sequentially arriving jobs. If a worker of value p is assigned a job of value x, the return is px, where we interpret the return as the probability that the given worker correctly completes the given job. The job value is a random value that is observed upon arrival, and jobs must be assigned or rejected when they arrive. Each worker can only do one job. Derman et al. showed that when the objective is to maximize the expected return, i.e., the expected number of correctly completed jobs, the optimal policy is a simple threshold policy, which does not depend on the worker values. Their result was extended by Albright (1974) to allow job arrivals according to a Poisson process and a single random deadline for job completion (which is equivalent to discounting). Righter (1987) further extended the result to permit workers to have independent random deadlines for job completion. Here we show that when there are independent deadlines, a simple threshold policy that is independent of the worker values stochastically maximizes the number of correctly completed jobs, and therefore maximizes the expected number of correctly completed jobs. We also show that there is no policy that stochastically maximizes the number of correctly completed jobs when there is a single deadline. However, when there is a single deadline and the objective is to maximize the probability that n jobs are done correctly by n workers, the optimal policy is determined by a single threshold that is independent of n and of the worker values.
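
For the classical case of Derman et al. that this paper extends, the thresholds themselves are easy to compute: with k jobs remaining they satisfy a[i, k+1] = E[clip(X, a[i-1, k], a[i, k])], independently of the worker values. Below is a Monte Carlo sketch of this recursion for i.i.d. job values; the uniform example and the sample size are illustrative assumptions.

```python
import numpy as np

def dlr_thresholds(n, sampler, n_sims=200_000, seed=0):
    """Monte Carlo estimate of the Derman-Lieberman-Ross thresholds.
    With k jobs to go, a job whose value lands in the i-th interval
    is given to the worker with the i-th smallest value; the recursion
    a[i, k+1] = E[clip(X, a[i-1, k], a[i, k])] never uses the p's."""
    rng = np.random.default_rng(seed)
    x = sampler(rng, n_sims)                 # i.i.d. draws of the job value X
    a = np.array([-np.inf, np.inf])          # one job left: accept anything
    for _ in range(1, n):
        inner = np.clip(x[None, :], a[:-1, None], a[1:, None]).mean(axis=1)
        a = np.concatenate(([-np.inf], inner, [np.inf]))
    return a

# Uniform(0, 1) job values, 4 jobs to go: thresholds split [0, 1] into
# 4 intervals, roughly (-inf, 0.305, 0.5, 0.695, inf).
print(dlr_thresholds(4, lambda rng, m: rng.uniform(0, 1, m)))
```

The paper's point is that with independent worker deadlines this value-independent threshold structure survives in a much stronger, stochastic sense, while with a single common deadline it does not.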

