Scheduling with Discretely Compressible Processing Times to Minimize Makespan

2014 ◽  
Vol 575 ◽  
pp. 926-930
Author(s):  
Shu Xia Zhang ◽  
Yu Zhong Zhang

In this paper, we address the scheduling model with discretely compressible processing times, where processing a job at a compressed processing time incurs a corresponding compression cost. We consider the following problem: scheduling with discretely compressible processing times on identical parallel machines to minimize the makespan, subject to a bound on the total compression cost. Jobs are released simultaneously. We design a pseudo-polynomial-time algorithm based on dynamic programming, as well as an FPTAS.
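As an illustration of the dynamic-programming idea (not the algorithm from the paper), the following Python sketch handles the special case of two identical machines: each job offers a discrete set of (processing time, compression cost) options, and reachable load pairs are stored together with the cheapest compression cost that attains them. The job data and the budget are made up for the example.

```python
# Hypothetical sketch of a pseudo-polynomial DP for two identical parallel
# machines: pick one (time, cost) option per job and a machine for the job,
# so that the total compression cost stays within the budget and the makespan
# (the larger of the two machine loads) is as small as possible.

def min_makespan_two_machines(jobs, budget):
    """jobs: list of option lists, one per job; each option is (time, cost)."""
    # states maps (load_machine_1, load_machine_2) -> minimum compression cost
    states = {(0, 0): 0}
    for options in jobs:
        new_states = {}
        for (l1, l2), cost in states.items():
            for time, extra in options:
                for loads in ((l1 + time, l2), (l1, l2 + time)):
                    key = (min(loads), max(loads))      # machines are identical
                    c = cost + extra
                    if c <= budget and c < new_states.get(key, float("inf")):
                        new_states[key] = c
        states = new_states
    return min(max(l1, l2) for (l1, l2) in states) if states else None


# Toy instance: each job can run uncompressed at zero cost or compressed at a
# shorter time for some cost (numbers are made up).
jobs = [
    [(5, 0), (3, 2)],
    [(4, 0), (2, 3)],
    [(6, 0), (4, 1)],
]
print(min_makespan_two_machines(jobs, budget=3))
```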

2013 ◽  
Vol 787 ◽  
pp. 1020-1024
Author(s):  
Shu Xia Zhang ◽  
Yu Zhong Zhang

In this paper, we address the single-machine scheduling problem with discretely compressible processing times, where processing a job at a compressed processing time incurs a corresponding compression cost. We consider the following problem: scheduling with discretely compressible processing times to minimize the makespan, subject to a bound on the total compression cost. Jobs may have different release times. We design a pseudo-polynomial-time algorithm based on dynamic programming, as well as an FPTAS.
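The sketch below illustrates one way such a dynamic program can be organized for the single-machine case; it is not taken from the paper. It assumes, as is standard for single-machine makespan minimization with release dates, that jobs are sequenced in nondecreasing release-time order, and it stores for each budget level the earliest completion time of the prefix of jobs.

```python
# Illustrative sketch (not the paper's algorithm): a pseudo-polynomial DP for
# one machine.  f[b] after handling job j is the earliest completion time of
# the first j jobs (in release-time order) spending at most b compression cost.

def min_makespan_single_machine(jobs, budget):
    """jobs: list of (release_time, [(time, cost), ...])."""
    jobs = sorted(jobs, key=lambda j: j[0])
    INF = float("inf")
    f = [0] * (budget + 1)                  # no jobs scheduled yet
    for release, options in jobs:
        g = [INF] * (budget + 1)
        for b in range(budget + 1):
            if f[b] == INF:
                continue
            start = max(f[b], release)      # wait for the job to be released
            for time, cost in options:
                nb = b + cost
                if nb <= budget:
                    g[nb] = min(g[nb], start + time)
        f = g
    return min(f)


# Toy instance with made-up numbers: (release, options).
jobs = [(0, [(4, 0), (2, 1)]), (3, [(5, 0), (3, 2)])]
print(min_makespan_single_machine(jobs, budget=2))
```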


2001 ◽  
Vol 15 (4) ◽  
pp. 465-479 ◽  
Author(s):  
Ger Koole ◽  
Rhonda Righter

We consider a batch scheduling problem in which the processing time of a batch of jobs equals the maximum of the processing times of all jobs in the batch. This is the case, for example, for burn-in operations in semiconductor manufacturing and other testing operations. Processing times are assumed to be random, and we consider minimizing the makespan and the flow time. The problem is much more difficult than the corresponding deterministic problem, and the optimal policy may have many counterintuitive properties. We prove various structural properties of the optimal policy and use these to develop a polynomial-time algorithm to compute the optimal policy.


2012 ◽  
Vol 2012 ◽  
pp. 1-10
Author(s):  
Romeo Rizzi ◽  
Luca Nardin

The Interactive Knapsacks Heuristic Optimization (IKHO) problem is a particular knapsack model in which, given an array of knapsacks, every insertion into a knapsack also affects the other knapsacks in terms of weight and profit. The IKHO model was introduced by Isto Aho to model instances of the load clipping problem. The IKHO problem is known to be APX-hard and, motivated by this negative fact, Aho exhibited a few classes of polynomial instances for the IKHO problem. These instances were obtained by limiting the ranges of two structural parameters, c and u, which describe the extent to which an insertion into a knapsack influences the nearby knapsacks. We identify a new and broad class of instances allowing for a polynomial-time algorithm. More precisely, we show that the restriction of IKHO to instances in which a certain structural parameter is bounded by a constant can be solved in polynomial time, using dynamic programming.


2017 ◽  
Vol 34 (04) ◽  
pp. 1750015 ◽  
Author(s):  
Shi-Sheng Li ◽  
De-Liang Qian ◽  
Ren-Xia Chen

We consider the problem of scheduling n jobs with rejection on a set of m machines in a proportionate flow shop system where the job processing times are machine-independent. The goal is to find a schedule that minimizes the scheduling cost of all accepted jobs plus the total penalty of all rejected jobs. Two variants of the scheduling cost are considered: the first is the maximum tardiness and the second is the total weighted completion time. For the first problem, we first show that it is NP-hard, then we construct a pseudo-polynomial-time algorithm to solve it and an [Formula: see text]-time algorithm for the case where the jobs have the same processing time. For the second problem, we first show that it is NP-hard, then we design [Formula: see text]-time algorithms for the case where the jobs have the same weight and for the case where the jobs have the same processing time.
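As a rough illustration of the max-tardiness variant with rejection (not the authors' algorithm, and restricted to the single-machine special case m = 1), the following sketch guesses an integer max-tardiness value T, schedules accepted jobs in earliest-due-date order so that each must finish by d_j + T, and uses a knapsack-style DP to minimize the total rejection penalty; the best value of T plus that penalty over all T is the optimum for integer data.

```python
# Illustrative sketch for the single-machine special case of minimizing
# (max tardiness of accepted jobs) + (total rejection penalty).

def min_cost_with_rejection(jobs):
    """jobs: list of (processing_time, due_date, rejection_penalty), integers."""
    jobs = sorted(jobs, key=lambda j: j[1])          # EDD order
    total_p = sum(p for p, _, _ in jobs)
    INF = float("inf")
    best = INF
    for T in range(total_p + 1):                     # guessed max tardiness
        # g[P] = min rejection penalty with accepted processing time exactly P
        g = [INF] * (total_p + 1)
        g[0] = 0
        for p, d, e in jobs:
            h = [INF] * (total_p + 1)
            for P in range(total_p + 1):
                if g[P] == INF:
                    continue
                h[P] = min(h[P], g[P] + e)           # reject the job
                if P + p <= d + T:                   # accept: finishes by d + T
                    h[P + p] = min(h[P + p], g[P])
            g = h
        best = min(best, T + min(g))
    return best


# Toy instance with made-up numbers: (p_j, d_j, penalty_j).
print(min_cost_with_rejection([(3, 2, 4), (2, 5, 1), (4, 6, 6)]))
```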


Author(s):  
Yangjun Chen ◽  
Dunren Che

In this paper, we present a polynomial-time algorithm for TPQ (tree pattern query) minimization without XML constraints involved. The main idea of the algorithm is a dynamic programming strategy to find all the matching subtrees within a TPQ. A matching subtree implies a redundancy and should be removed in such a way that the semantics of the original TPQ is preserved. Our algorithm consists of two parts: one for subtree recognition and the other for subtree deletion. Both need only O(n²) time, where n is the number of nodes in the TPQ.
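The following is a much-simplified illustration of the matching-subtree idea, not the authors' O(n²) algorithm: it handles only parent-child edges (no descendant edges and no XML constraints) and prunes a child subtree whenever a sibling subtree already asks for everything the pruned subtree asks for.

```python
# Toy sketch of TPQ minimization for child-edge-only patterns: a subtree is
# redundant when every branch of it can be mapped into a sibling subtree with
# matching labels, so removing it does not change the query semantics.

class Node:
    def __init__(self, label, children=None):
        self.label = label
        self.children = children or []

def covers(a, b):
    """True if pattern subtree `a` asks for everything subtree `b` asks for."""
    if a.label != b.label:
        return False
    return all(any(covers(ca, cb) for ca in a.children) for cb in b.children)

def minimize(node):
    for child in node.children:
        minimize(child)
    kept = []
    for i, child in enumerate(node.children):
        redundant = (
            any(covers(other, child) for other in kept) or
            any(covers(other, child) for other in node.children[i + 1:])
        )
        if not redundant:
            kept.append(child)
    node.children = kept
    return node


# a[b][b[c]]  ->  a[b[c]]
q = minimize(Node("a", [Node("b"), Node("b", [Node("c")])]))
print([c.label for c in q.children])
```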


2007 ◽  
Vol 24 (01) ◽  
pp. 45-56 ◽  
Author(s):  
Longmin He ◽  
Shijie Sun ◽  
Runzi Luo

This paper considers a batch scheduling problem in a two-stage hybrid flow shop that consists of m dedicated parallel machines in stage 1 and a batch processor in stage 2. The processing time of a batch is defined as the largest processing time of the jobs contained in that batch. The criterion is to minimize the makespan, i.e., the time by which all operations of all jobs have been completed. For such a problem, we present a polynomial-time algorithm for the case in which all jobs have the same processing time on the batch processor. An approximation algorithm with a worst-case performance ratio of 2 for the general case is also presented.
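To make the model concrete, the small sketch below only evaluates the makespan of a given batching in this two-stage setting; it is not the paper's algorithm. It assumes jobs run on their dedicated stage-1 machine in the order listed, and that a batch on the stage-2 processor starts once the processor is free and all of its jobs have finished stage 1.

```python
# Makespan evaluation for the two-stage hybrid flow shop with a batch
# processor in stage 2 (illustrative model only, made-up data below).

def makespan(jobs, batches):
    """jobs: {job: (stage1_machine, p1, p2)}; batches: ordered list of job lists."""
    machine_time = {}
    ready = {}
    for job, (machine, p1, _) in jobs.items():
        machine_time[machine] = machine_time.get(machine, 0) + p1
        ready[job] = machine_time[machine]          # stage-1 completion time
    t = 0                                           # when the batch processor is free
    for batch in batches:
        start = max(t, max(ready[j] for j in batch))
        t = start + max(jobs[j][2] for j in batch)  # batch runs for its longest p2
    return t


# Toy instance: job -> (dedicated stage-1 machine, p1, p2).
jobs = {"a": (0, 3, 4), "b": (0, 2, 2), "c": (1, 5, 3)}
print(makespan(jobs, batches=[["b", "a"], ["c"]]))
```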


2015 ◽  
Vol 2015 ◽  
pp. 1-9
Author(s):  
Shanlin Li ◽  
Maoqin Li ◽  
Hong Yan

In the real world, many supply chains involve products with a short lifespan. In this paper, we consider an integrated production and distribution batch scheduling problem on a single machine for orders with a short lifespan, since it may be cheaper or faster to process and distribute orders in a batch than to process and distribute them individually. We assume that the orders have identical processing times and come from the same location, and that the batch setup time is a constant. The problem is to choose the number of batches and the batch sizes so as to minimize the total delivery time without violating the order lifespan. We first give a backward dynamic programming algorithm, but it is not actually a polynomial-time algorithm. Then, by analyzing the recursion formula further, we propose a constant-time partial dynamic programming algorithm. Finally, using difference properties of the optimal value function, we obtain an explicit formula for the case where the setup time is an integer multiple of the processing time.
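The sketch below shows a plain dynamic program for a simplified reading of this model, not the constant-time algorithm of the paper: n orders all released at time 0, identical processing time p, constant setup time s, and a common lifespan L by which every order must be delivered; a batch delivers all of its orders at the batch completion time. Each batch of size k charges (s + k·p) to every order not yet delivered when it starts.

```python
# Illustrative DP for the simplified batching model described above.

def min_total_delivery(n, p, s, L):
    INF = float("inf")
    # f[packed][b] = min total delivery time after packing `packed` orders
    # into `b` batches.
    f = [[INF] * (n + 1) for _ in range(n + 1)]
    f[0][0] = 0
    for packed in range(n):
        for b in range(n):
            if f[packed][b] == INF:
                continue
            for k in range(1, n - packed + 1):        # size of the next batch
                cost = (s + k * p) * (n - packed)     # charged to all later orders
                f[packed + k][b + 1] = min(f[packed + k][b + 1], f[packed][b] + cost)
    # the last batch completes at b*s + n*p, which must respect the lifespan
    return min((f[n][b] for b in range(1, n + 1) if b * s + n * p <= L), default=None)


# Made-up numbers for illustration.
print(min_total_delivery(n=6, p=2, s=3, L=30))
```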


2014 ◽  
Vol 31 (05) ◽  
pp. 1450036 ◽  
Author(s):  
Ji-Bo Wang ◽  
Ming-Zheng Wang

We consider a single-machine common due-window assignment scheduling problem in which the processing time of a job is a function of its position in the sequence and of its resource allocation. The window location and size, along with the associated job schedule that minimizes a certain cost function, are to be determined. This function comprises costs associated with the window location, window size, earliness, and tardiness. For two different processing-time functions, we provide polynomial-time algorithms to find the optimal job sequence and resource allocation.


Mathematics ◽  
2021 ◽  
Vol 9 (6) ◽  
pp. 633
Author(s):  
Nodari Vakhania ◽  
Frank Werner

We consider the problem of scheduling n jobs with identical processing times and given release as well as delivery times on m uniform machines. The goal is to minimize the makespan, i.e., the maximum full completion time of any job. This problem is well known to have an open complexity status even if the number of machines is fixed. We present a polynomial-time algorithm for the problem which is based on the earlier introduced algorithmic framework blesscmore (“branch less and cut more”). We extend the analysis of the so-called behavior alternatives developed earlier for the version of the problem with identical parallel machines and show how the technique used earlier for identical machines can be extended to the uniform machine environment if a special condition on the job parameters is imposed. The time complexity of the proposed algorithm is O(γm²n log n), where γ can be either n or the maximum job delivery time q_max. This complexity can even be reduced further by using a smaller number κ < n in the estimation describing the number of jobs of particular types. However, this number κ becomes known only after the algorithm has terminated.
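To illustrate the objective (the maximum full completion time C_j + q_j) and the uniform-machine environment, the following sketch applies a simple largest-delivery-time-first list-scheduling heuristic; it is only an illustration of the model, not the blesscmore-based algorithm of the paper, and the data are made up.

```python
# Greedy heuristic for n jobs with common processing time p, release times
# r_j, delivery times q_j, and uniform machines with speeds s_i (a job takes
# p / s_i time on machine i).  Returns a heuristic value of max_j (C_j + q_j).

def greedy_full_completion(p, jobs, speeds):
    """jobs: list of (release, delivery)."""
    free = [0.0] * len(speeds)                  # when each machine becomes free
    objective = 0.0
    for release, delivery in sorted(jobs, key=lambda j: -j[1]):
        finishes = [max(free[i], release) + p / speeds[i] for i in range(len(speeds))]
        i = min(range(len(speeds)), key=lambda i: finishes[i])
        free[i] = finishes[i]
        objective = max(objective, finishes[i] + delivery)
    return objective


# Toy data: p = 4, two machines with speeds 1 and 2, jobs given as (r_j, q_j).
print(greedy_full_completion(4, [(0, 5), (1, 9), (2, 0)], speeds=[1.0, 2.0]))
```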


2021 ◽  
Vol 13 (4) ◽  
pp. 1-24
Author(s):  
Jessica Chen ◽  
Henry Milner ◽  
Ion Stoica ◽  
Jibin Zhan

The HTTP adaptive streaming technique opened the door to coping with fluctuating network conditions during the streaming process by dynamically adjusting the size (bitrate) of the future chunks to be downloaded. The bitrate selection in this adjustment inevitably involves predicting the future throughput of a video session, for which various heuristic solutions have been explored. The ultimate goal of the present work is to explore the theoretical upper bounds of the QoE that any ABR algorithm can possibly reach, thereby providing an essential step toward benchmarking the performance of ABR algorithms. In our setting, the QoE is defined as a linear combination of the average perceptual quality and the buffering ratio. The optimization problem is proven to be NP-hard when the perceptual quality is defined by chunk size, and conditions are given under which the problem becomes polynomially solvable. Enriched by a global lower bound, a pseudo-polynomial-time algorithm based on dynamic programming is presented. When minimum buffering is given higher priority than perceptual quality, the problem is shown to be NP-hard as well, and the above algorithm is simplified and enhanced by a sequence of lower bounds on the completion time of chunk downloading, which, according to our experiments, brings a 36.0% improvement in computation time. To handle large amounts of data more efficiently, a polynomial-time algorithm is also introduced to approximate the optimal values when minimum buffering is prioritized. Besides its performance guarantee, this algorithm is shown to reach 99.938% of the optimal results while taking only 0.024% of the computation time of the exact dynamic programming algorithm.
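The toy sketch below conveys the flavor of such a dynamic program under simplifying assumptions that are not taken from the paper: the throughput for each chunk download is known in advance (an offline upper bound), QoE is the sum of per-chunk perceptual qualities minus a penalty per second of rebuffering (startup delay counted as rebuffering), and the buffer level is rounded to keep the state space small. The names `chunk_dur` and `rebuf_penalty` and all numbers are illustrative.

```python
# Offline DP sketch: choose one encoding per chunk to maximize
# sum(quality) - rebuf_penalty * total_rebuffering_time.

def best_qoe(chunks, throughput, chunk_dur, rebuf_penalty):
    """chunks: list of option lists [(size, quality), ...]; throughput: rate per chunk slot."""
    # state: buffer level (seconds of downloaded but unplayed video) -> best QoE
    states = {0.0: 0.0}
    for options, rate in zip(chunks, throughput):
        new_states = {}
        for buf, qoe in states.items():
            for size, quality in options:
                dl = size / rate                          # download time of this chunk
                rebuf = max(0.0, dl - buf)                # playback stalls if buffer empties
                new_buf = max(0.0, buf - dl) + chunk_dur  # buffer after the chunk arrives
                key = round(new_buf, 1)                   # discretize the buffer level
                val = qoe + quality - rebuf_penalty * rebuf
                if val > new_states.get(key, float("-inf")):
                    new_states[key] = val
        states = new_states
    return max(states.values())


# Made-up instance: 3 chunks, each available in two encodings (size, quality).
chunks = [[(4.0, 1), (8.0, 3)]] * 3
print(best_qoe(chunks, throughput=[2.0, 4.0, 1.0], chunk_dur=4.0, rebuf_penalty=2.0))
```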

