Local Branching Strategy-Based Method for the Knapsack Problem with Setup

2020
Author(s):
Samah Boukhari
Isma Dahmani
Mhand Hifi

In this paper, we propose to solve the knapsack problem with setups by combining a mixed linear relaxation with local branching. The problem with setups can be seen as a generalization of the 0–1 knapsack problem, where items belong to disjoint classes (or families) and can be selected only if the corresponding class is activated. The selection of a class involves setup costs and resource consumption, thus affecting both the objective function and the capacity constraint. The mixed linear relaxation serves as the driving problem and is solved with a black-box solver, while local branching tries to enhance the solutions provided by adding a series of valid/invalid constraints. The performance of the proposed method is evaluated on benchmark instances from the literature and on new large-scale instances. Its results are compared with those reached by the CPLEX solver and by the best methods available in the literature. New results have been obtained.
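
A minimal sketch of the underlying model (not the authors' code): the knapsack problem with setups formulated in PuLP, plus one local-branching constraint built around an incumbent solution. All data values and the neighborhood size k are illustrative; in the full method the incumbent would come from the relaxation rather than from a first exact solve.

```python
import pulp

profits    = {0: 10, 1: 7, 2: 8, 3: 4}   # item profits
weights    = {0: 4, 1: 3, 2: 5, 3: 2}    # item weights
item_cls   = {0: 0, 1: 0, 2: 1, 3: 1}    # item -> class (family)
setup_cost = {0: 3, 1: 2}                # class setup costs (objective)
setup_load = {0: 1, 1: 1}                # class setup consumption (capacity)
capacity = 9

prob = pulp.LpProblem("knapsack_with_setups", pulp.LpMaximize)
x = pulp.LpVariable.dicts("x", profits, cat="Binary")     # item selected
y = pulp.LpVariable.dicts("y", setup_cost, cat="Binary")  # class activated

# Profit of selected items minus setup costs of activated classes.
prob += pulp.lpSum(profits[i] * x[i] for i in profits) \
      - pulp.lpSum(setup_cost[j] * y[j] for j in setup_cost)
# Capacity is consumed by items and by class setups.
prob += pulp.lpSum(weights[i] * x[i] for i in profits) \
      + pulp.lpSum(setup_load[j] * y[j] for j in setup_cost) <= capacity
# An item may be selected only if its class is activated.
for i in profits:
    prob += x[i] <= y[item_cls[i]]

prob.solve(pulp.PULP_CBC_CMD(msg=False))
incumbent = {i: int(x[i].value()) for i in profits}

# Local branching: restrict the search to solutions within Hamming
# distance k of the incumbent, then re-solve the smaller neighborhood.
k = 2
prob += pulp.lpSum(1 - x[i] for i in profits if incumbent[i] == 1) \
      + pulp.lpSum(x[i] for i in profits if incumbent[i] == 0) <= k
prob.solve(pulp.PULP_CBC_CMD(msg=False))
```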

2015
Vol 2015
pp. 1-11
Author(s):
Telmo Pinto
Cláudio Alves
Raïd Mansi
José Valério de Carvalho

In contrast to other variants of the standard knapsack problem, very few solution approaches have been devised for the multiscenario max-min knapsack problem. The problem consists of finding the subset of items whose total profit is maximized under the worst possible scenario. In this paper, we describe an exact solution method based on column generation and branch-and-bound for this problem. Our approach relies on a reformulation of the standard compact integer programming model based on the Dantzig-Wolfe decomposition principle. The resulting model is potentially stronger than the original one, since the corresponding pricing subproblem does not have the integrality property. The details of the reformulation are presented and analysed, together with those concerning the column generation and branch-and-bound procedures. To evaluate the performance of our algorithm, we conducted extensive computational experiments on large-scale benchmark instances and compared our results with other state-of-the-art approaches under similar circumstances. We focused in particular on the different relevant aspects that allow an objective evaluation of the efficacy of our approach. From these standpoints, the branch-and-price algorithm proved to outperform the other state-of-the-art methods described so far in the literature.
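
For reference, the compact model that the Dantzig-Wolfe reformulation starts from is easy to state: maximize the worst-case profit z, with z bounded above by every scenario's total profit. Below is a minimal PuLP sketch with toy data, not the branch-and-price implementation itself.

```python
import pulp

weights = [4, 3, 5, 2]
scenario_profits = [[10, 7, 8, 4],   # scenario 0
                    [6, 9, 5, 7]]    # scenario 1
capacity = 9
n = len(weights)

prob = pulp.LpProblem("max_min_knapsack", pulp.LpMaximize)
x = [pulp.LpVariable(f"x{i}", cat="Binary") for i in range(n)]
z = pulp.LpVariable("z")             # worst-case (minimum) scenario profit

prob += z                            # maximize the worst-case profit
for p in scenario_profits:           # z is bounded by every scenario's profit
    prob += z <= pulp.lpSum(p[i] * x[i] for i in range(n))
prob += pulp.lpSum(weights[i] * x[i] for i in range(n)) <= capacity

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print([int(v.value()) for v in x], z.value())
```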


1996
Vol 76 (06)
pp. 0939-0943
Author(s):
B Boneu
G Destelle

Summary: The anti-aggregating activity of five ascending doses of clopidogrel has been compared to that of ticlopidine in atherosclerotic patients. The aim of this study was to determine the dose of clopidogrel to be tested in a large-scale clinical trial of secondary prevention of ischemic events in patients suffering from vascular manifestations of atherosclerosis [the CAPRIE (Clopidogrel vs Aspirin in Patients at Risk of Ischemic Events) trial]. A multicenter study involving 9 haematological laboratories and 29 clinical centers was set up. One hundred and fifty ambulatory patients were randomized into one of the following seven groups: clopidogrel at doses of 10, 25, 50, 75 or 100 mg OD, ticlopidine 250 mg BID, or placebo. ADP- and collagen-induced platelet aggregation tests were performed before starting treatment and after 7 and 28 days. Bleeding time was measured on days 0 and 28. Patients were seen on days 0, 7 and 28 to check the clinical and biological tolerability of the treatment. Clopidogrel exerted a dose-related inhibition of ADP-induced platelet aggregation and a dose-related prolongation of the bleeding time. In the presence of ADP (5 µM), this inhibition ranged between 29% and 44% in comparison to pretreatment values. Bleeding times were prolonged by 1.5 to 1.7 times. These effects were not significantly different from those produced by ticlopidine. Clinical tolerability was good or fair in 97.5% of the patients. No haematological adverse events were recorded. These results led to the selection of 75 mg once a day to evaluate and compare the antithrombotic activity of clopidogrel with that of aspirin in the CAPRIE trial.


2021
Vol 13 (6)
pp. 3571
Author(s):
Bogusz Wiśnicki
Dorota Dybkowska-Stefek
Justyna Relisko-Rybak
Łukasz Kolanda

The paper responds to research problems related to the implementation of large-scale investment projects in waterways in Europe. As part of design and construction works, it is necessary to identify the river ports that play a major role within the European transport network as intermodal nodes. This entails a number of challenges, the cardinal one being the optimal selection of port locations, taking into account the new transport, economic, and geopolitical situation that will be brought about by modernized waterways. The aim of the paper was to present an original methodology for determining port locations along modernized waterways based on non-cost criteria, formulated as an extended multicriteria decision-making (MCDM) method and employing GIS (Geographic Information System)-based tools for spatial analysis. The methodology was designed to be applicable to the varying conditions of a river's hydroengineering structures (free-flowing river, canalized river, and canals) and adjustable to the requirements posed by intermodal supply chains. The method was applied to the Odra River Waterway, which allowed recommendations to be formulated regarding its application to different river sections at every stage of the research process.
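
The abstract does not list the paper's criteria or weights, so the following is only a generic weighted-sum MCDM scoring sketch for ranking candidate port locations, assuming criterion values have already been normalized to [0, 1]. All site names, criteria, and numbers are hypothetical.

```python
# Candidate port locations scored on normalized non-cost criteria.
candidates = {
    "site_A": {"intermodal_access": 0.9, "hinterland_demand": 0.6, "nav_depth": 0.7},
    "site_B": {"intermodal_access": 0.5, "hinterland_demand": 0.9, "nav_depth": 0.8},
    "site_C": {"intermodal_access": 0.7, "hinterland_demand": 0.7, "nav_depth": 0.4},
}
# Criterion weights (would come from expert judgment); they sum to 1.
weights = {"intermodal_access": 0.4, "hinterland_demand": 0.35, "nav_depth": 0.25}

# Weighted-sum score per site, then rank best-first.
scores = {site: sum(weights[c] * v for c, v in crits.items())
          for site, crits in candidates.items()}
for site, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{site}: {score:.3f}")
```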


2021
Vol 22 (15)
pp. 7773
Author(s):
Neann Mathai
Conrad Stork
Johannes Kirchmair

Experimental screening of large sets of compounds against macromolecular targets is a key strategy to identify novel bioactivities. However, large-scale screening requires substantial experimental resources and is time-consuming and challenging. Therefore, small- to medium-sized compound libraries with a high chance of producing genuine hits on an arbitrary protein of interest would be of great value to fields related to early drug discovery, in particular biochemical and cell research. Here, we present a computational approach that incorporates drug-likeness, predicted bioactivities, biological space coverage, and target novelty to generate optimized compound libraries with maximized chances of producing genuine hits for a wide range of proteins. The computational approach evaluates drug-likeness with a set of established rules, predicts bioactivities with a validated, similarity-based approach, and optimizes the composition of small sets of compounds towards maximum target coverage and novelty. We found that, in comparison to the random selection of compounds for a library, our approach generates substantially improved compound sets. Quantified as the “fitness” of compound libraries, the calculated improvements ranged from +60% (for a library of 15,000 compounds) to +184% (for a library of 1000 compounds). The best of the optimized compound libraries prepared in this work are available for download as a dataset bundle (“BonMOLière”).
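
As a rough illustration of the coverage-maximization step, the sketch below applies a plain greedy set-cover heuristic to candidate compounds annotated with predicted targets. It is not the authors' fitness function, and the compound/target data are hypothetical.

```python
def greedy_library(candidates, library_size):
    """candidates: dict mapping compound id -> set of predicted targets."""
    covered, library = set(), []
    pool = dict(candidates)
    while pool and len(library) < library_size:
        # Pick the compound contributing the most not-yet-covered targets.
        best = max(pool, key=lambda c: len(pool[c] - covered))
        if not pool[best] - covered:
            break  # no remaining compound adds a new target
        covered |= pool.pop(best)
        library.append(best)
    return library, covered

cands = {"c1": {"t1", "t2"}, "c2": {"t2", "t3", "t4"}, "c3": {"t1"}}
lib, cov = greedy_library(cands, library_size=2)
print(lib, sorted(cov))   # ['c2', 'c1'] ['t1', 't2', 't3', 't4']
```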


2021
Vol 16 (2)
pp. 1-34
Author(s):
Rediet Abebe
T.-H. Hubert Chan
Jon Kleinberg
Zhibin Liang
David Parkes
...  

A long line of work in social psychology has studied variations in people's susceptibility to persuasion, i.e., the extent to which they are willing to modify their opinions on a topic. This body of literature suggests an interesting perspective on theoretical models of opinion formation by interacting parties in a network: in addition to considering interventions that directly modify people's intrinsic opinions, it is also natural to consider interventions that modify people's susceptibility to persuasion. Motivated by this perspective, we propose an influence optimization problem. Specifically, we adopt a popular model for social opinion dynamics, in which each agent has a fixed innate opinion and a resistance that measures the importance it places on that innate opinion; agents influence one another's opinions through an iterative process. Under certain conditions, this iterative process converges to an equilibrium opinion vector. In the unbudgeted variant of the problem, the goal is to modify the resistance of any number of agents (within some given range) such that the sum of the equilibrium opinions is minimized; in the budgeted variant, the algorithm is additionally given an upfront restriction on the number of agents whose resistance may be modified. We prove that the objective function is in general non-convex; hence, formulating the problem as a convex program, as in an early version of this work (Abebe et al., KDD'18), raises potential correctness issues. We instead analyze the structure of the objective function and show that any local optimum is also a global optimum, which is somewhat surprising given that the objective function might not be convex. Furthermore, we combine the iterative process and the local search paradigm to design very efficient algorithms that solve the unbudgeted variant of the problem optimally on large-scale graphs containing millions of nodes. Finally, we propose and experimentally evaluate a family of heuristics for the budgeted variant of the problem.
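
The iterative process described above can be sketched directly. The following assumes a Friedkin-Johnsen-style update in which each agent mixes its innate opinion, weighted by its resistance, with the weighted average of its neighbors' opinions; the graph, opinions, and resistances are toy values.

```python
import numpy as np

W = np.array([[0.0, 0.5, 0.5],       # row-stochastic interaction weights
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])
s = np.array([0.9, 0.2, 0.6])        # innate opinions
alpha = np.array([0.7, 0.3, 0.5])    # resistances in (0, 1]

z = s.copy()
for _ in range(10_000):              # iterate to (near) equilibrium
    z_next = alpha * s + (1 - alpha) * (W @ z)
    if np.max(np.abs(z_next - z)) < 1e-12:
        break
    z = z_next

# The optimization problem tunes alpha to minimize this sum.
print("equilibrium opinions:", z, "objective (sum):", z.sum())
```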


Computation
2021
Vol 9 (7)
pp. 80
Author(s):
John Fernando Martínez-Gil
Nicolas Alejandro Moyano-García
Oscar Danilo Montoya
Jorge Alexander Alarcon-Villamil

In this study, a new methodology is proposed to perform the optimal selection of conductors in three-phase distribution networks through a discrete version of the vortex search metaheuristic. To represent the problem, a single-objective mathematical model with a mixed-integer nonlinear programming (MINLP) structure is used. The objective function is the minimization of the investment costs in conductors together with the technical losses of the network over a study period of one year. Additionally, the model is implemented in balanced and unbalanced test systems and with variations in the connection of their loads, i.e., Δ- and Y-connections. To evaluate the costs of the energy losses, a classical backward/forward three-phase power-flow method is implemented. Two test systems from the specialized literature were employed, comprising 8 and 27 nodes with radial structures at medium-voltage levels. All computational implementations were developed in the MATLAB programming environment, and all results were checked in DIgSILENT software to verify the effectiveness of the proposed three-phase unbalanced power-flow method. Comparative analyses with classical and Chu & Beasley genetic algorithms, a tabu search algorithm, and exact MINLP approaches demonstrate the efficiency of the proposed optimization approach with regard to the final value of the objective function.
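
For intuition, here is a minimal single-phase sketch of the classical backward/forward sweep on a small radial feeder (the paper implements a full three-phase, unbalanced version). The topology, branch impedances, and constant-power loads are illustrative per-unit values.

```python
import numpy as np

parent = [None, 0, 1, 1]                 # radial topology: bus -> parent bus
z_branch = [0, 0.02 + 0.04j, 0.03 + 0.05j, 0.02 + 0.03j]  # branch impedances (pu)
s_load = [0, 0.5 + 0.2j, 0.4 + 0.1j, 0.3 + 0.1j]          # bus loads (pu)
v = np.ones(4, dtype=complex)            # flat start; bus 0 is the slack (1.0 pu)

for _ in range(50):
    # Backward sweep: accumulate branch currents from the leaves to the root.
    i_branch = [np.conj(s_load[b] / v[b]) for b in range(4)]
    for b in range(3, 0, -1):            # children have higher indices here
        i_branch[parent[b]] += i_branch[b]
    # Forward sweep: update voltages from the root to the leaves.
    v_new = v.copy()
    for b in range(1, 4):
        v_new[b] = v_new[parent[b]] - z_branch[b] * i_branch[b]
    if np.max(np.abs(v_new - v)) < 1e-10:
        break
    v = v_new

print("bus voltage magnitudes (pu):", np.round(np.abs(v), 5))
```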


Land
2021
Vol 10 (3)
pp. 295
Author(s):
Yuan Gao
Anyu Zhang
Yaojie Yue
Jing’ai Wang
Peng Su

Suitable land is an important prerequisite for crop cultivation and, given the prospect of climate change, it is essential to assess such suitability to minimize crop production risks and to ensure food security. Although a variety of methods to assess suitability are available, a comprehensive, objective, and large-scale screening of the environmental variables that influence the results of these methods, and therefore their accuracy, has rarely been explored. Here, an approach to the selection of such variables is proposed, and criteria are established for the large-scale, big-data-based assessment of land suitability, with maize (Zea mays L.) cultivation as a case study. The predicted suitability matched the past distribution of maize with an overall accuracy of 79% and a Kappa coefficient of 0.72. Land suitability for maize is likely to decrease markedly at low latitudes and even at mid latitudes. The total area suitable for maize, globally and in most major maize-producing countries, will decrease, the decrease being particularly steep in regions optimally suited for maize at present. Compared with earlier research, the method proposed here is simple yet objective, comprehensive, and reliable for large-scale assessment. The findings highlight the necessity of adopting relevant strategies to cope with the adverse impacts of climate change.
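
The reported 79% overall accuracy and Kappa coefficient of 0.72 are standard agreement statistics. The sketch below shows how both are computed from a confusion matrix of predicted versus observed suitability classes, using illustrative counts rather than the paper's data.

```python
import numpy as np

cm = np.array([[120, 20],            # rows: observed, columns: predicted
               [ 15, 95]])           # (suitable / unsuitable), toy counts

total = cm.sum()
observed_acc = np.trace(cm) / total  # overall accuracy: agreement on the diagonal
# Chance agreement expected from the row and column marginals.
expected_acc = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / total**2
kappa = (observed_acc - expected_acc) / (1 - expected_acc)
print(f"accuracy = {observed_acc:.2f}, kappa = {kappa:.2f}")
```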


1988
Vol 32 (17)
pp. 1179-1182
Author(s):
P. Jay Merkle
Douglas B. Beaudet
Robert C. Williges
David W. Herlong
Beverly H. Williges

This paper describes a systematic methodology for selecting the independent variables to be considered in large-scale research problems. Five specific procedures (brainstorming, prototype interface representation, feasibility/relevance analyses, structured literature reviews, and user subjective ratings) are evaluated and incorporated into an integrated strategy. The methodology is demonstrated in the context of designing the user interface for a telephone-based information inquiry system. The procedure was successful in reducing an initial set of 95 independent variables to a subset of 19 factors that warrant subsequent detailed analysis. These results are discussed in terms of a comprehensive sequential research methodology useful for investigating human factors problems.
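
As a toy illustration of the screening idea, the sketch below filters candidate variables by feasibility and relevance ratings (e.g., averaged expert scores on a 1-5 scale). The variable names, ratings, and cutoffs are hypothetical, not from the study.

```python
# Candidate independent variables with averaged expert ratings (1-5 scale).
variables = {
    "menu_depth":    {"feasibility": 4.2, "relevance": 4.6},
    "voice_gender":  {"feasibility": 4.8, "relevance": 2.1},
    "prompt_length": {"feasibility": 3.9, "relevance": 4.1},
    "ambient_noise": {"feasibility": 1.8, "relevance": 4.4},
}
FEAS_MIN, REL_MIN = 3.0, 4.0   # retain only variables above both cutoffs

retained = [v for v, r in variables.items()
            if r["feasibility"] >= FEAS_MIN and r["relevance"] >= REL_MIN]
print(retained)   # ['menu_depth', 'prompt_length']
```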

