The State of Empirical Evaluation in Static Feature Location

2019 ◽  
Vol 28 (1) ◽  
pp. 1-58 ◽  
Author(s):  
Abdul Razzaq ◽  
Asanka Wasala ◽  
Chris Exton ◽  
Jim Buckley

2021 ◽  
Author(s):  
Muhammad Shahroz Nadeem ◽  
Sibt Hussain ◽  
Fatih Kurugollu

This paper addresses the textual deblurring problem. We propose a new loss function and provide an empirical evaluation of the design choices, based on which we propose a memory-friendly CNN model that performs better than the state-of-the-art CNN method.
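
The abstract names a new loss function and a memory-friendly CNN but gives no details, so the following is a purely hypothetical PyTorch sketch of the general shape such a setup can take: a small residual CNN trained with a composite pixel-plus-gradient loss, a common choice for text deblurring. The architecture, the loss terms, and every parameter here are assumptions, not the authors' design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallDeblurCNN(nn.Module):
    """A deliberately small model: three thin conv layers, no dense layers."""
    def __init__(self, channels=16):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, 1, 3, padding=1),
        )

    def forward(self, x):
        return x + self.body(x)  # predict the residual (sharp minus blurred)

def composite_loss(pred, target, alpha=0.1):
    """Pixel MSE plus horizontal/vertical gradient terms that reward the
    sharp edges text is made of; an assumed loss, not the paper's."""
    mse = F.mse_loss(pred, target)
    gx = F.mse_loss(pred[..., 1:] - pred[..., :-1],
                    target[..., 1:] - target[..., :-1])
    gy = F.mse_loss(pred[..., 1:, :] - pred[..., :-1, :],
                    target[..., 1:, :] - target[..., :-1, :])
    return mse + alpha * (gx + gy)

# Dummy batch of grayscale 32x32 crops, just to show the shapes line up.
blurred = torch.rand(4, 1, 32, 32)
sharp = torch.rand(4, 1, 32, 32)
loss = composite_loss(SmallDeblurCNN()(blurred), sharp)
loss.backward()
```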



Author(s):  
Yanchen Deng ◽  
Ziyu Chen ◽  
Dingding Chen ◽  
Wenxin Zhang ◽  
Xingqiong Jiang

Asymmetric distributed constraint optimization problems (ADCOPs) are an emerging model for coordinating agents with personal preferences. However, existing inference-based complete algorithms that use local eliminations cannot be applied to ADCOPs, as the parent agents are required to transfer their private functions to their children. Rather than disclosing private functions explicitly to facilitate local eliminations, we solve the problem by enforcing delayed eliminations and propose AsymDPOP, the first inference-based complete algorithm for ADCOPs. To solve the severe scalability problems incurred by delayed eliminations, we propose to reduce memory consumption by propagating a set of smaller utility tables instead of a joint utility table, and to reduce computational effort by performing sequential optimizations instead of joint optimizations. The empirical evaluation indicates that AsymDPOP significantly outperforms the state of the art, as well as the vanilla DPOP with PEAV formulation.
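
As an illustration of the memory-saving idea the abstract describes (propagating several small utility tables and optimizing variables sequentially instead of materializing one joint table), here is a minimal Python sketch. It is a generic bucket-elimination-style stand-in, not AsymDPOP itself; the domains and utility values are made up.

```python
from itertools import product

# Hypothetical binary domains and two small utility tables, each keyed by
# the assignment of the variables in its scope.
domains = {"x1": [0, 1], "x2": [0, 1], "x3": [0, 1]}
tables = [
    {"scope": ("x1", "x2"), "u": {(0, 0): 3, (0, 1): 1, (1, 0): 2, (1, 1): 4}},
    {"scope": ("x2", "x3"), "u": {(0, 0): 1, (0, 1): 5, (1, 0): 2, (1, 1): 0}},
]

def joint_optimize(tables, domains):
    """Naive approach: enumerate the joint space over the union of all
    scopes (exponential in its size) and maximise the summed utility."""
    variables = sorted({v for t in tables for v in t["scope"]})
    best = None
    for values in product(*(domains[v] for v in variables)):
        assign = dict(zip(variables, values))
        total = sum(t["u"][tuple(assign[v] for v in t["scope"])] for t in tables)
        if best is None or total > best[0]:
            best = (total, assign)
    return best

def eliminate(tables, var, domains):
    """Sequential alternative: join only the tables that mention `var`,
    optimise `var` out, and leave all other tables untouched."""
    touching = [t for t in tables if var in t["scope"]]
    rest = [t for t in tables if var not in t["scope"]]
    new_scope = tuple(sorted({v for t in touching for v in t["scope"]} - {var}))
    new_u = {}
    for values in product(*(domains[v] for v in new_scope)):
        assign = dict(zip(new_scope, values))
        new_u[values] = max(
            sum(t["u"][tuple({**assign, var: d}[v] for v in t["scope"])]
                for t in touching)
            for d in domains[var]
        )
    return rest + [{"scope": new_scope, "u": new_u}]

print(joint_optimize(tables, domains))   # (8, {'x1': 0, 'x2': 0, 'x3': 1})
print(eliminate(tables, "x2", domains))  # both tables collapse into one over (x1, x3)
```

In this toy instance both tables mention x2, so the elimination joins them anyway; the memory saving shows up when each variable appears in only a few of many propagated tables.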



2014 ◽  
Vol 49 ◽  
pp. 733-773 ◽  
Author(s):  
N. Fernandez Garcia ◽  
J. Arias Fisteus ◽  
L. Sanchez Fernandez

In recent years, the task of automatically linking pieces of text (anchors) mentioned in a document to Wikipedia articles that represent the meaning of these anchors has received extensive research attention. Typically, link-to-Wikipedia systems try to find a set of Wikipedia articles that are candidates to represent the meaning of the anchor and, later, rank these candidates to select the most appropriate one. In this ranking process the systems rely on context information obtained from the document where the anchor is mentioned and/or from Wikipedia. In this paper we focus on the use of Wikipedia links as context information. In particular, we review several state-of-the-art candidate ranking approaches that rely on Wikipedia link information. In addition, we provide a comparative empirical evaluation of the different approaches on five different corpora: the TAC 2010 corpus and four corpora built from actual Wikipedia articles and news items.
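
As a concrete instance of the kind of link-based ranking the paper reviews, the sketch below scores candidate senses with the Milne-Witten relatedness measure over article in-links, one of the standard Wikipedia-link-based measures in this literature. The in-link sets, the anchor, and the Wikipedia size constant are made up for illustration.

```python
import math

WIKI_SIZE = 6_000_000  # rough count of Wikipedia articles; an assumption

# Hypothetical in-link sets: article -> set of articles linking to it.
inlinks = {
    "Java_(programming_language)": {"OOP", "JVM", "Sun_Microsystems", "C++"},
    "Java_(island)": {"Indonesia", "Jakarta", "Volcano"},
    "JVM": {"OOP", "Sun_Microsystems", "Java_bytecode"},
}

def relatedness(a, b):
    """Milne-Witten link-based semantic relatedness between two articles."""
    A, B = inlinks[a], inlinks[b]
    common = A & B
    if not common:
        return 0.0
    num = math.log(max(len(A), len(B))) - math.log(len(common))
    den = math.log(WIKI_SIZE) - math.log(min(len(A), len(B)))
    return max(0.0, 1.0 - num / den)

def rank_candidates(candidates, context_articles):
    """Score each candidate sense by its mean relatedness to the unambiguous
    context articles; best candidate first."""
    scored = [
        (sum(relatedness(c, ctx) for ctx in context_articles) / len(context_articles), c)
        for c in candidates
    ]
    return sorted(scored, reverse=True)

# Disambiguating the anchor "Java" in a programming-related context:
print(rank_candidates(
    ["Java_(programming_language)", "Java_(island)"],
    ["JVM"],
))
```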



2014 ◽  
Vol 2014 ◽  
pp. 1-12 ◽  
Author(s):  
Meng Fanrong ◽  
Zhu Mu ◽  
Zhou Yong ◽  
Zhou Ranran

Detecting local community structure in complex networks is an appealing problem that has attracted increasing attention in various domains. However, most of the current local community detection algorithms, on one hand, are influenced by the state of the source node and, on the other hand, cannot effectively identify the multiple communities linked with the overlapping nodes. We propose a novel local community detection algorithm based on maximum clique extension, called LCD-MC. The proposed method first finds the set of all the maximum cliques containing the source node and initializes them as the starting local communities; then, it extends each unclassified local community by greedy optimization until a certain objective is satisfied; finally, the expected local communities are obtained once all maximum cliques have been assigned to a community. An empirical evaluation using both synthetic and real datasets demonstrates that our algorithm performs better than some of the state-of-the-art approaches.
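
A minimal sketch of the clique-seeded greedy expansion described above: seed with the maximal cliques that contain the source node, then grow each seed while a local fitness score improves. The fitness function used here (internal degree over total degree) is a common stand-in, not necessarily the paper's exact objective, and the toy graph is made up.

```python
from itertools import combinations
import networkx as nx

def fitness(G, community):
    """Internal degree over total degree of the community."""
    k_in = k_out = 0
    for u in community:
        for v in G.neighbors(u):
            if v in community:
                k_in += 1
            else:
                k_out += 1
    return k_in / (k_in + k_out) if (k_in + k_out) else 0.0

def expand(G, seed):
    """Greedily add the neighbour that most improves fitness; stop when none does."""
    community = set(seed)
    while True:
        frontier = {v for u in community for v in G.neighbors(u)} - community
        base = fitness(G, community)
        best, best_gain = None, 0.0
        for v in frontier:
            gain = fitness(G, community | {v}) - base
            if gain > best_gain:
                best, best_gain = v, gain
        if best is None:
            return community
        community.add(best)

def local_communities(G, source):
    """One candidate local community per maximal clique containing the source."""
    seeds = [c for c in nx.find_cliques(G) if source in c]
    return [expand(G, seed) for seed in seeds]

# Toy graph: two 4-cliques that overlap only in node 0.
G = nx.Graph()
for clique in ({0, 1, 2, 3}, {0, 4, 5, 6}):
    G.add_edges_from(combinations(clique, 2))
print(local_communities(G, 0))  # e.g. [{0, 1, 2, 3}, {0, 4, 5, 6}]
```

Note how the shared node 0 ends up in both communities, which is exactly the overlapping-node case the algorithm targets.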



1970 ◽  
Vol 15 (3) ◽  
Author(s):  
Rosemary Du Plessis Novitz ◽  
Nabila Jaber

A number of publications issued by the Business Round Table, the Employers' Federation and the Centre for Independent Studies between 1988 and 1990 have opposed pay equity. The ideas in these publications are shared by many neo-classical economists who advocate the operation of a "free" labour market with minimal intervention by the State or unions. These arguments are currently being used to justify the repeal of pay equity legislation. This article provides an empirical evaluation of claims that a deregulated labour market will be to the advantage of women workers and that pay equity policies benefit only the most skilled women in paid work. It demonstrates that state intervention in the labour market and the unionisation of women in employment are associated with reductions in the earnings gap between women and men.



2021 ◽  
Vol 52 (3) ◽  
pp. 541-561
Author(s):  
Iris Reus ◽  
Tim-Benedikt Attow ◽  
Nico Fenske

How did the state parliament of Saxony deal with the topic of digitization at schools and how did the Covid-19 pandemic increase the problem’s immediacy? This question was investigated by means of quantitative and qualitative analyses of parliamentary activities between 2017 and 2021. The empirical evaluation for the period before the pandemic yields only a few activities in the Landtag, with the majority of initiatives coming from a cross-bencher. Thematically, the processes in 2017 and 2018 concentrated on digitization-related measures of the state government in general (or demands in this regard) as well as details of technical implementation. Despite the DigitalPakt going into effect in 2019, a period of inactivity followed for almost two years. Acute pressure to act set in when children had to be taught at home due to the Covid-19 pandemic beginning in March 2020. As the media analysis shows, the issue received constant and partly even high attention, with critical reports predominating especially during the period of the second school closure. However, this problem pressure was not clearly reflected in the activities of the state parliament, either in terms of time or content, even though parliamentary interest increased noticeably during the pandemic. Thematically, the parliamentary groups turned their attention to the DigitalPakt and the design of media education concepts.



2019 ◽  
Author(s):  
Gabriel O. Ramos ◽  
Ana L. C. Bazzan ◽  
Bruno C. Da Silva

Traffic congestion presents a major challenge in large cities. Considering the distributed, self-interested nature of traffic, we tackle congestion using multiagent reinforcement learning (MARL). In this thesis, we advance the state of the art by delivering the first MARL convergence guarantees in congestion-like problems. We introduce an algorithm through which drivers can learn optimal routes by locally estimating the regret associated with their decisions, which we prove to converge to an equilibrium. In order to mitigate the effects of selfishness, we also devise a decentralised tolling scheme, which we prove to minimise traffic congestion levels. Our theoretical results are supported by an extensive empirical evaluation on realistic traffic networks.
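
A minimal sketch (not the thesis' exact algorithm) of regret-based route choice in a congestion game: each driver accumulates, per route, the cost it would have saved by playing that route instead, and plays routes with probability proportional to positive regret (regret matching). The two-route network and its latency functions are made up.

```python
import random

N_DRIVERS, ROUNDS = 100, 300
ROUTES = [0, 1]

def travel_time(route, load):
    """Hypothetical latencies: route 0 is short but congestible, route 1 fixed."""
    return 1.0 + 2.0 * load / N_DRIVERS if route == 0 else 2.2

# Cumulative regret per driver and route: total cost the driver would
# have saved by always playing that route instead of its actual choices.
regret = [[0.0, 0.0] for _ in range(N_DRIVERS)]

def pick(reg):
    """Regret matching: play proportionally to positive cumulative regret."""
    pos = [max(r, 0.0) for r in reg]
    total = sum(pos)
    if total == 0.0:
        return random.choice(ROUTES)
    x = random.uniform(0.0, total)
    return 0 if x < pos[0] else 1

for _ in range(ROUNDS):
    choices = [pick(regret[i]) for i in range(N_DRIVERS)]
    loads = [choices.count(r) for r in ROUTES]
    for i, chosen in enumerate(choices):
        actual = travel_time(chosen, loads[chosen])
        for a in ROUTES:
            # Counterfactual: had driver i switched to a, a would carry one more car.
            cf_load = loads[a] if a == chosen else loads[a] + 1
            regret[i][a] += actual - travel_time(a, cf_load)

print("final loads:", [choices.count(r) for r in ROUTES])
```

With these latencies the route costs equalize at a roughly 60/40 split, which is where the empirical play settles.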



2010 ◽  
Vol 39 ◽  
pp. 51-126 ◽  
Author(s):  
M. Katz ◽  
C. Domshlak

State-space search with explicit abstraction heuristics is at the state of the art of cost-optimal planning. These heuristics are inherently limited, nonetheless, because the size of the abstract space must be bounded by some, even if a very large, constant. Targeting this shortcoming, we introduce the notion of (additive) implicit abstractions, in which the planning task is abstracted by instances of tractable fragments of optimal planning. We then introduce a concrete setting of this framework, called fork-decomposition, that is based on two novel fragments of tractable cost-optimal planning. The induced admissible heuristics are then studied formally and empirically. This study testifies to the accuracy of the fork-decomposition heuristics, yet our empirical evaluation also stresses the tradeoff between their accuracy and the runtime complexity of computing them. Indeed, some of the power of the explicit abstraction heuristics comes from precomputing the heuristic function offline and then determining h(s) for each evaluated state s by a very fast lookup in a "database." By contrast, while fork-decomposition heuristics can be calculated in polynomial time, computing them is far from being fast. To address this problem, we show that the time-per-node complexity bottleneck of the fork-decomposition heuristics can be successfully overcome. We demonstrate that an equivalent of the explicit abstraction notion of a "database" exists for the fork-decomposition abstractions as well, despite their exponential-size abstract spaces. We then verify empirically that heuristic search with the "databased" fork-decomposition heuristics favorably competes with the state of the art of cost-optimal planning.
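
The "database" idea the paper builds on can be made concrete in a few lines: precompute, offline, the exact goal distances of an abstract (projected) state space, then evaluate h(s) online with a single dictionary lookup. The toy domain below (three binary variables, flip-one-bit moves, an abstraction keeping the first two variables) is invented for illustration; it mirrors explicit pattern databases rather than the paper's implicit fork-decomposition abstractions.

```python
from collections import deque

GOAL = (1, 1, 1)
PATTERN = (0, 1)  # indices of the variables the abstraction keeps

def successors(state):
    """Flip any single bit: a trivial stand-in for a planning task's moves."""
    for i in range(len(state)):
        yield tuple(b ^ 1 if j == i else b for j, b in enumerate(state))

def project(state):
    """Abstraction: drop every variable outside the pattern."""
    return tuple(state[i] for i in PATTERN)

def build_database():
    """Offline phase: backward BFS from the projected goal over the abstract
    space, recording exact abstract goal distances."""
    start = project(GOAL)
    dist = {start: 0}
    queue = deque([start])
    while queue:
        s = queue.popleft()
        for t in successors(s):  # abstract moves flip one retained bit
            if t not in dist:
                dist[t] = dist[s] + 1
                queue.append(t)
    return dist

PDB = build_database()  # computed once, before search starts

def h(state):
    """Online phase: an admissible heuristic via an O(1) lookup."""
    return PDB[project(state)]

print(h((0, 0, 1)))  # 2: both retained bits differ from the goal
```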


