e-Government Interoperability Framework in Lithuania

Author(s):  
Rimantas Gatautis ◽  
Elena Vitkauskaite ◽  
Genadijus Kulvietis ◽  
Demetrios Sarantis

An e-Government Interoperability Framework (eGIF) is one way to achieve e-Government interoperability. An eGIF is a set of standards and guidelines that a government uses to specify the preferred way for its agencies, citizens and partners to interact with each other. To meet the expectations of their stakeholders and to achieve real resolution of evolving interoperability problems, the scope of eGIFs needs to be extended to include service composition and discovery, development and management of semantic schemas for governmental documents, certification mechanisms and authentication standards. Moreover, a shift from a paper-based specification towards a repository of services, data schemas and process models is needed, in order to serve the ever-changing nature of governments under transformation. Upon conducting a state-of-the-art analysis of relevant frameworks at the pan-European and national levels, lessons learnt from the pioneering UK eGIF, German SAGA and Greek eGIF are presented. The proposed Lithuanian eGIF model describes a new approach, outlines the technical, semantic and organisational dimensions, and stresses the importance of political interoperability. It also provides a three-layer model, moving from an approach based solely on standards and specifications towards one that includes systems and coordination support elements. Finally, the chapter tackles the issues that arose within the stakeholder community in the e-Government interoperability context.

2017 ◽  
Vol 50 (4) ◽  
pp. 1-33 ◽  
Author(s):  
Andreas Schoknecht ◽  
Tom Thaler ◽  
Peter Fettke ◽  
Andreas Oberweis ◽  
Ralf Laue


2013 ◽  
pp. 923-946
Author(s):  
Fenareti Lampathaki ◽  
Christos Tsiakaliaris ◽  
Antonis Stasis ◽  
Yannis Charalabidis

National Interoperability Frameworks (NIFs) have been established in recent years as the governmental policy cornerstones for deploying joined-up information systems and providing one-stop services to citizens and businesses all over the world. In order to meet the rising expectations of their stakeholders, to cope with technological evolution in the future internet era, and to achieve efficient resolution of evolving interoperability problems, NIF developers face new challenges: the scope of the frameworks needs to be extended to include service composition and discovery, development and management of semantic elements, certification mechanisms and authentication standards. Moreover, a shift from a paper-based specification towards a repository of services, data schemas, process models and standards is needed, in order to serve the ever-changing requirements of governments under transformation. Going beyond an analysis of relevant frameworks at an international level, this chapter illustrates best practices and future directions for NIFs, proposing an infrastructure that can meet the demands of modelling, storing, managing and transforming vast numbers of service descriptions and XML hierarchies, as well as specific technical, semantic, organisational and legal interoperability standards.
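To make the repository idea concrete, the following is a minimal sketch of how one entry in such a service repository might be modelled, with one slot per interoperability layer named in the chapter. All field names and example values are hypothetical illustrations, not an actual NIF schema.

```python
# Hypothetical NIF repository entry for a one-stop public service.
# Field names and example values are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class ServiceDescription:
    name: str
    provider: str                                        # responsible agency
    technical: list = field(default_factory=list)        # transport/format standards
    semantic: list = field(default_factory=list)         # XML schemas, code lists
    organisational: list = field(default_factory=list)   # process models
    legal: list = field(default_factory=list)            # governing legislation

entry = ServiceDescription(
    name="Birth certificate issuance",
    provider="Civil Registry",
    technical=["HTTPS", "SOAP 1.2"],
    semantic=["BirthRecord XML schema v2 (placeholder)"],
    organisational=["BPMN process model: registry lookup (placeholder)"],
    legal=["Civil Code, relevant article (placeholder)"],
)
print(entry.name, "->", entry.semantic)
```

Storing entries in this structured form, rather than in a paper document, is what allows the repository to be queried, validated and transformed as services evolve.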


2021 ◽  
Vol 3 (1) ◽  
Author(s):  
Brett J. M. Petzer ◽  
Anna J. Wieczorek ◽  
Geert P. J. Verbong

An urban mobility transition requires a transition in space allocation, since most mobility modes depend on urban open space for circulation and the storage of vehicles. Despite increasing attention to space and spatiality in transitions research, the finite, physical aspects of urban space, and the means by which it is allocated, have not been adequately acknowledged as an influence on mobility transitions. A conceptual framework is introduced to support comparison between cities in terms of the processes by which open space is (re-)distributed between car and bicycle circulatory and regulatory space. This framework distinguishes between regulatory allocation mechanisms and the appropriation practices of actors. Application to cases in Amsterdam, Brussels and Birmingham reveals unique relationships, created by the zero-sum nature of urban open space, between the dominant automobility mode and the subordinate cycling mode. These relationships open up a new approach to forms of lock-in that work in favour of particular mobility modes within the relatively obdurate urban built environment. Empirically, allocation mechanisms that routinise the production of car space at the national level within the EU are shown to be far more prevalent than those for bicycle space, highlighting the constraints faced by radical city-level policies aimed at space reallocation.


Energies ◽  
2021 ◽  
Vol 14 (14) ◽  
pp. 4312
Author(s):  
Marzena Smol

Circular economy (CE) is an economic model in which raw materials remain in circulation as long as possible and the generation of waste is minimized. In the fertilizer sector, waste rich in nutrients should be directed to agricultural purposes. This paper presents an analysis of recommended directions for the use of nutrient-rich waste in the fertilizer sector and an evaluation of possible interest in this kind of fertilizer by a selected group of end-users (nurseries). The scope of the research includes a state-of-the-art analysis of circular aspects and recommended directions in CE implementation in the fertilizer sector (with a focus on sewage-based waste), and a survey analysis of the potential interest of nurseries in the use of waste-based fertilizers in Poland. There are more and more recommendations for the use of waste for agricultural purposes at the European and national levels, and waste-based products have to meet certain requirements in order to be put on the market. Nurserymen are interested in contributing to the process of transformation towards the CE model in Poland; however, they are not fully convinced, due to a lack of experience in the use of waste-based products, a lack of social acceptance, and perceived health risks. Further actions to build social acceptance of waste-based fertilizers, and to educate end-users in their application, are required.


2020 ◽  
pp. 1-16
Author(s):  
Meriem Khelifa ◽  
Dalila Boughaci ◽  
Esma Aïmeur

The Traveling Tournament Problem (TTP) is concerned with finding a double round-robin tournament schedule that minimizes the total distance traveled by the teams. It has attracted significant interest recently, since a favorable TTP schedule can result in significant savings for the league. This paper proposes an original evolutionary algorithm for the TTP. We first propose a quick and effective constructive algorithm to build a Double Round-Robin Tournament (DRRT) schedule with low travel cost. We then describe an enhanced genetic algorithm with a new crossover operator to improve the travel cost of the generated schedules. A new heuristic for efficiently ordering the scheduled rounds is also proposed, which leads to a significant enhancement in the quality of the schedules. The overall method is evaluated on publicly available standard benchmarks and compared with other techniques for the TTP and the Unconstrained Traveling Tournament Problem (UTTP). The computational experiments show that the proposed approach builds very good solutions, comparable to other state-of-the-art approaches or better than the current best solutions on the UTTP. Further, our method provides new valuable solutions to some unsolved UTTP instances and outperforms prior methods on all US National League (NL) instances.
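For readers unfamiliar with the schedule structure being optimized, the following is a minimal sketch of a double round-robin constructor using the classic circle method. This is not the paper's constructive algorithm (which specifically targets low travel cost); it only illustrates the DRRT structure that the evolutionary algorithm then improves.

```python
# Minimal double round-robin (DRRT) constructor via the circle method.
# Illustrative only; the paper's constructive algorithm optimizes travel cost.

def double_round_robin(n_teams):
    """Return a list of rounds; each round is a list of (home, away) pairs."""
    assert n_teams % 2 == 0, "TTP instances use an even number of teams"
    teams = list(range(n_teams))
    first_half = []
    for r in range(n_teams - 1):
        pairs = []
        for i in range(n_teams // 2):
            a, b = teams[i], teams[n_teams - 1 - i]
            # Alternate home/away across rounds so no team hosts every round.
            pairs.append((a, b) if r % 2 == 0 else (b, a))
        first_half.append(pairs)
        # Rotate all teams except the first (circle method).
        teams = [teams[0]] + [teams[-1]] + teams[1:-1]
    # Mirror the first half with home/away swapped -> double round robin.
    second_half = [[(away, home) for (home, away) in rnd] for rnd in first_half]
    return first_half + second_half

if __name__ == "__main__":
    for i, rnd in enumerate(double_round_robin(4), 1):
        print(f"round {i}: {rnd}")
```

Every pair of teams meets exactly twice (once at each venue); the optimization problem is then to choose, among all such schedules, one with minimal total travel.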


2021 ◽  
Vol 13 (2) ◽  
pp. 50
Author(s):  
Hamed Z. Jahromi ◽  
Declan Delaney ◽  
Andrew Hines

Content is a key influencing factor in Web Quality of Experience (QoE) estimation. A web user's satisfaction can be influenced by how long it takes to render and visualize the visible parts of the web page in the browser, referred to as the Above-the-Fold (ATF) time. SpeedIndex (SI) has been widely used to estimate the perceived loading speed of ATF content and as a proxy metric for Web QoE estimation. Web application developers have been actively introducing innovative interactive features, such as animated and multimedia content, aiming to capture users' attention and improve the functionality and utility of web applications. However, the literature shows that, for websites with animated content, the ATF time estimated using state-of-the-art metrics may not accurately match the completed ATF time as perceived by users. This study introduces a new metric, Plausibly Complete Time (PCT), that estimates ATF time for a user's perception of websites with and without animations. PCT can be integrated with SI and web QoE models. The accuracy of the proposed metric is evaluated on two publicly available datasets. The proposed metric shows a high positive Spearman's correlation (rs = 0.89) with the perceived ATF reported by users for websites with and without animated content. This study demonstrates that using PCT as a KPI in QoE estimation models can improve the robustness of QoE estimation in comparison to the state-of-the-art ATF time metric. Furthermore, experimental results showed that estimating SI using PCT improves the robustness of SI for websites with animated content. PCT estimation allows web application designers to identify where poor design has significantly increased ATF time and to refactor their implementation before it impacts the end-user experience.
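As background, the following is a minimal sketch of the standard SpeedIndex computation from visual-progress samples, as used in common web performance tooling. The PCT metric itself is not reproduced here; this only shows the integral that an ATF-aware completeness signal would feed into.

```python
# Minimal SpeedIndex (SI) sketch: SI = integral of (1 - visual completeness) dt.
# Frame data below is a made-up example, not from the paper's datasets.

def speed_index(samples):
    """samples: list of (time_ms, visual_completeness in [0, 1]), sorted by time."""
    si = 0.0
    for (t0, c0), (t1, _) in zip(samples, samples[1:]):
        # Step integration: completeness holds at c0 until the next frame.
        si += (t1 - t0) * (1.0 - c0)
    return si

# Example: 50% visually complete at 500 ms, 100% at 1500 ms:
# SI = 500 * 1.0 + 1000 * 0.5 = 1000 ms.
frames = [(0, 0.0), (500, 0.5), (1500, 1.0)]
print(speed_index(frames))  # 1000.0
```

The difficulty the paper addresses is visible in this formulation: an animation keeps the pixels changing after the page is plausibly complete, so a naive completeness signal never settles and inflates the integral.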


Cybersecurity ◽  
2021 ◽  
Vol 4 (1) ◽  
Author(s):  
Shushan Arakelyan ◽  
Sima Arasteh ◽  
Christophe Hauser ◽  
Erik Kline ◽  
Aram Galstyan

Tackling binary program analysis problems has traditionally implied manually defining rules and heuristics, a tedious and time-consuming task for human analysts. In order to improve automation and scalability, we propose an alternative direction based on distributed representations of binary programs, applicable to a number of downstream tasks. We introduce Bin2vec, a new approach leveraging Graph Convolutional Networks (GCN) along with computational program graphs in order to learn a high-dimensional representation of binary executable programs. We demonstrate the versatility of this approach by using our representations to solve two semantically different binary analysis tasks: functional algorithm classification and vulnerability discovery. We compare the proposed approach to our own strong baseline as well as published results, and demonstrate improvement over state-of-the-art methods on both tasks. We evaluated Bin2vec on 49,191 binaries for the functional algorithm classification task, and on 30 different CWE-IDs, each with at least 100 CVE entries, for the vulnerability discovery task. We set a new state-of-the-art result by reducing the classification error by 40% compared to the source-code-based inst2vec approach, while working on binary code. For almost every vulnerability class in our dataset, our prediction accuracy is over 80% (and over 90% for multiple classes).
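For intuition, the following is a minimal NumPy sketch of one GCN layer (the Kipf-Welling propagation rule), the generic building block such an approach applies to program graphs. The toy adjacency matrix below is a hypothetical four-node graph, not an actual binary's program graph, and the weights are random rather than learned.

```python
# Minimal single GCN layer: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W).
# Toy graph and random weights for illustration only.

import numpy as np

def gcn_layer(A, H, W):
    A_hat = A + np.eye(A.shape[0])                      # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))       # D^-1/2 from degrees
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ H @ W, 0.0)              # ReLU activation

# Hypothetical 4-node graph (e.g., basic blocks), 3 input features per node.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 0],
              [0, 1, 0, 0]], dtype=float)
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))      # initial node features
W = rng.normal(size=(3, 8))      # layer weights (would be learned)
print(gcn_layer(A, H, W).shape)  # (4, 8): one 8-dim embedding per node
```

Stacking such layers and pooling the node embeddings yields a fixed-size program representation that downstream classifiers can consume.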


Sensors ◽  
2019 ◽  
Vol 19 (2) ◽  
pp. 230 ◽  
Author(s):  
Slavisa Tomic ◽  
Marko Beko

This work addresses the problem of target localization in adverse non-line-of-sight (NLOS) environments using received signal strength (RSS) and time-of-arrival (TOA) measurements. It is inspired by recently published work in which the authors discuss a critical distance below which employing combined RSS-TOA measurements is inferior to employing RSS-only measurements, and above which it is inferior to employing TOA-only measurements. Here, we revisit state-of-the-art estimators for the considered target localization problem and study their performance against counterparts that employ each individual measurement exclusively. It is shown that the hybrid approach is not the best one by default. Thus, we propose a simple heuristic approach that chooses the best measurement for each link, and we show that it can enhance the performance of an estimator. The new approach implicitly relies on the concept of the critical distance, but does not assume certain link parameters as given. Our simulations corroborate findings available in the literature for line-of-sight (LOS) settings to a certain extent, but indicate that more work is required for NLOS environments. Moreover, they show that the heuristic approach works well, matching or even improving on the performance of the best fixed choice in all considered scenarios.
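To illustrate the per-link idea, the following is a minimal sketch of distance estimation from RSS (log-distance path-loss model) and TOA (time-of-flight), with a critical-distance rule for picking one per link. All parameter values (reference power, path-loss exponent, critical distance) are illustrative assumptions, and this is not the paper's estimator, which avoids assuming such link parameters as given.

```python
# Per-link RSS vs. TOA distance estimation with a critical-distance rule.
# P0, GAMMA and D_CRITICAL are assumed values for illustration only.

SPEED_OF_LIGHT = 3e8   # m/s
P0 = -40.0             # RSS at reference distance d0 = 1 m (dBm), assumed
GAMMA = 3.0            # path-loss exponent, assumed
D_CRITICAL = 20.0      # critical distance in metres, assumed

def dist_from_rss(rss_dbm):
    """Invert the log-distance model: RSS = P0 - 10 * gamma * log10(d)."""
    return 10 ** ((P0 - rss_dbm) / (10.0 * GAMMA))

def dist_from_toa(toa_s):
    """TOA gives range directly: d = c * t."""
    return SPEED_OF_LIGHT * toa_s

def link_distance(rss_dbm, toa_s):
    # Heuristic: RSS is more informative at short range, TOA at long range.
    d_rss = dist_from_rss(rss_dbm)
    return d_rss if d_rss < D_CRITICAL else dist_from_toa(toa_s)

print(link_distance(-55.0, 1.2e-7))  # short link -> RSS estimate (~3.2 m)
print(link_distance(-85.0, 3.5e-7))  # long link  -> TOA estimate (105 m)
```

A full localizer would then fuse the per-link range estimates from all anchors, for example by least squares, to recover the target position.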

