MAP-OPT: A software for supporting decision-making in the field of modified atmosphere packaging of fresh non respiring foods

2017 ◽  
Vol 2 (1) ◽  
pp. 28-47 ◽  
Author(s):  
Valérie Guillard ◽  
Olivier Couvert ◽  
Valérie Stahl ◽  
Patrice Buche ◽  
Aurélie Hanin ◽  
...  

Abstract In this paper, we present the implementation of a dedicated software tool, MAP-OPT, for optimising the design of Modified Atmosphere Packaging (MAP) of refrigerated fresh, non-respiring food products. The core principle of this software is to simulate the impact of gas (O2/CO2) exchanges on the growth of gas-sensitive microorganisms in the packed food system. In its simplest use, this tool, associated with a data warehouse storing food, bacteria and packaging properties, allows users to explore their system in a user-friendly manner by adjusting the pack geometry, packaging material and gas composition (mixture of O2/CO2/N2). Via the @Web application, the data warehouse associated with MAP-OPT is structured by an ontology, which allows data to be collected and stored in a standardized format and vocabulary so that they can be easily retrieved using a standard querying methodology. In an optimisation approach, the MAP-OPT software makes it possible to determine the packaging characteristics (e.g. gas permeability) suitable for a target application (e.g. maximal bacterial population at the best-before date). These target permeabilities are then used to query the packaging data warehouse through the @Web application, which proposes a ranking of the most satisfactory materials for the target application (i.e. the packaging materials whose characteristics are closest to the targets identified by the MAP-OPT software). This approach allows a more rational dimensioning of MAP for non-respiring food products by selecting the packaging material fitted to the "just necessary" (rather than, by default, the one with the greatest barrier properties). A worked example of MAP dimensioning for a strictly aerobic, CO2-sensitive microorganism, Pseudomonas fluorescens, is given to highlight the usefulness of the software.
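The closest-to-target ranking described above is only summarised in the abstract; the following sketch is an illustration under assumed field names, units and catalogue values (none of which come from the paper or the @Web data warehouse), showing how candidate materials could be ordered by distance to the target O2/CO2 permeabilities computed by the optimiser.

```python
from dataclasses import dataclass
from math import sqrt

@dataclass
class Material:
    name: str
    o2_perm: float   # O2 permeability (hypothetical units, e.g. cm3/m2/day/atm)
    co2_perm: float  # CO2 permeability (same hypothetical units)

def rank_materials(candidates, target_o2, target_co2):
    """Rank candidate materials by Euclidean distance between their gas
    permeabilities and the target permeabilities from the optimisation step."""
    def distance(m):
        return sqrt((m.o2_perm - target_o2) ** 2 + (m.co2_perm - target_co2) ** 2)
    return sorted(candidates, key=distance)

# Made-up catalogue entries and targets, purely for illustration
catalogue = [
    Material("PET/PE laminate", 30.0, 150.0),
    Material("OPP film", 1500.0, 5000.0),
    Material("EVOH multilayer", 1.0, 4.0),
]
for m in rank_materials(catalogue, target_o2=25.0, target_co2=120.0):
    print(m.name)
```

In practice the two permeability axes would typically be normalised before computing distances, and the real query would go through the ontology-based @Web interface rather than an in-memory list.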

Author(s):  
Charles Greenidge ◽  
Hadrian Peter

Data warehouses have established themselves as necessary components of an effective Information Technology (IT) strategy for large businesses. In addition to utilizing operational databases, data warehouses must also integrate increasing amounts of external data to assist in decision support. An important source of such external data is the Web. In an effort to ensure the availability and quality of Web data for the data warehouse, we propose an intermediate data-staging layer called the Meta-Data Engine (M-DE). A major challenge, however, is the conversion of data originating on the Web, and brought in by robust search engines, into data suitable for the data warehouse. The authors therefore also propose a framework, the Semantic Web Application (SEMWAP) framework, which facilitates semi-automatic matching of instance data from opaque web databases using ontology terms. Their framework combines Information Retrieval (IR), Information Extraction (IE), Natural Language Processing (NLP), and ontology techniques to produce a matching and thus provide a viable building block for Semantic Web (SW) applications.
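The SEMWAP matching step is described only at a high level; as a rough illustration, the sketch below pairs instance strings from a web source with ontology labels using a simple token-overlap (Jaccard) score. The instance strings, labels and threshold are invented, and the real framework combines IR, IE and NLP components far richer than this.

```python
import re

def tokens(text):
    """Lower-case word tokens; a stand-in for the IR/NLP preprocessing steps."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def jaccard(a, b):
    return len(a & b) / len(a | b) if a | b else 0.0

def match_instances(instances, ontology_terms, threshold=0.2):
    """Pair each web instance string with its best-scoring ontology term,
    keeping only matches above a similarity threshold."""
    matches = {}
    for inst in instances:
        score, best = max((jaccard(tokens(inst), tokens(term)), term)
                          for term in ontology_terms)
        if score >= threshold:
            matches[inst] = (best, round(score, 2))
    return matches

# Hypothetical instance strings and ontology labels, for illustration only
web_instances = ["Low density polyethylene film", "oxygen permeability of the package"]
ontology = ["low density polyethylene", "oxygen permeability", "food product"]
print(match_instances(web_instances, ontology))
```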


2019 ◽  
Vol 33 (3) ◽  
pp. 347-361 ◽  
Author(s):  
Vesna Ocelić Bulatović ◽  
Anamarija Turković ◽  
Emi Govorčin Bajsić ◽  
Romana Zovko ◽  
Antun Jozinović ◽  
...  

Low-density polyethylene (LDPE) is extensively used as a packaging material, and as such has a short service life but long environmental persistence. An alternative for reducing the environmental impact of LDPE as a packaging material is to blend it with carbohydrate-based polymers such as starch. Therefore, the focus of this investigation was to prepare bio-based blends of LDPE and thermoplastic starch (TPS) containing different amounts of TPS using a Brabender kneading chamber. Due to the incompatibility of LDPE/TPS blends, a styrene–ethylene/butylene–styrene block copolymer grafted with maleic anhydride (SEBS-g-MA), containing 2 mol % anhydride groups, was added as a compatibilizer. The effects of the biodegradable, hydrophilic TPS, its content, and the incorporation of the compatibilizer on the properties of LDPE/TPS blends were analysed. The characterization was performed by means of thermogravimetric analysis (TG), differential scanning calorimetry (DSC), scanning electron microscopy (SEM), and water absorption (WA). Based on the morphological analysis, a good dispersion of the TPS phase in the LDPE matrix was obtained with the incorporation of the compatibilizer, which resulted in better thermal and barrier properties of these materials.


2020 ◽  
Vol 13 (6) ◽  
pp. 94-109
Author(s):  
Rajeev Kumar ◽  
Mamdouh Alenezi ◽  
Md Ansari ◽  
Bineet Gupta ◽  
...  

Nowadays, most cyber-attacks are initiated by highly malicious programs known as malware. Malware is very aggressive and can penetrate the security of information and communication systems. While different techniques are available for malware analysis, it is challenging to select the most effective approach. In this context, a decision-making process can be an efficient means of empirically assessing the impact of different methods for securing web applications. In this research study, we have used a methodology that integrates the Fuzzy AHP and Fuzzy TOPSIS techniques for evaluating the impact of different malware analysis techniques from a web application perspective. The study uses different versions of a university's web application for evaluating the impact of several existing malware analysis techniques. The findings of the study show that the Reverse Engineering approach is the most efficient technique for analyzing complex malware. The outcome of this study should aid future researchers and developers in selecting appropriate techniques for scanning web application code and enhancing its security.
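The paper combines Fuzzy AHP (to derive criteria weights) with Fuzzy TOPSIS (to rank the techniques); the sketch below shows only the crisp TOPSIS core, with made-up scores, weights and criteria, so it illustrates the ranking idea rather than the authors' fuzzy implementation.

```python
import numpy as np

def topsis(scores, weights, benefit):
    """Rank alternatives with classical TOPSIS.

    scores  : (alternatives x criteria) decision matrix
    weights : criteria weights summing to 1 (e.g. derived via AHP)
    benefit : True for criteria to maximise, False for criteria to minimise
    """
    norm = scores / np.linalg.norm(scores, axis=0)           # vector-normalise each column
    v = norm * weights                                        # apply criteria weights
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))   # positive ideal solution
    anti  = np.where(benefit, v.min(axis=0), v.max(axis=0))   # negative ideal solution
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)                            # closeness coefficient, higher is better

# Hypothetical scores for three techniques on three criteria (accuracy, coverage, cost)
techniques = ["Static analysis", "Dynamic analysis", "Reverse engineering"]
scores = np.array([[6.0, 5.0, 3.0],
                   [7.0, 6.0, 5.0],
                   [9.0, 8.0, 7.0]])
cc = topsis(scores, weights=np.array([0.5, 0.3, 0.2]), benefit=np.array([True, True, False]))
for name, c in sorted(zip(techniques, cc), key=lambda p: -p[1]):
    print(f"{name}: {c:.3f}")
```

A fuzzy variant would replace the crisp scores with fuzzy numbers (e.g. triangular) and defuzzify the resulting closeness coefficients before ranking.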


Author(s):  
Hadrian Peter

Data warehouses have established themselves as necessary components of an effective IT strategy for large businesses. To augment the streams of data being siphoned from transactional/operational databases, warehouses must also integrate increasing amounts of external data to assist in decision support. Modern warehouses can be expected to handle up to 100 terabytes or more of data (Berson and Smith, 1997; Devlin, 1998; Inmon, 2002; Imhoff et al, 2003; Schwartz, 2003; Day, 2004; Peter and Greenidge, 2005; Winter and Burns, 2006; Ladley, 2007). The arrival of newer generations of tools and database vendor support has smoothed the way for current warehouses to meet the needs of the challenging global business environment (Kimball and Ross, 2002; Imhoff et al, 2003; Ross, 2006). We cannot ignore the role of the Internet in modern business and its impact on data warehouse strategies. The web represents the richest source of external data known to man (Zhenyu et al, 2002; Chakrabarti, 2002; Laender et al, 2002), but we must be able to couple raw text or poorly structured data on the web with descriptions, annotations and other forms of summary meta-data (Crescenzi et al, 2001). In recent years the Semantic Web initiative has focussed on the production of "smarter data". The basic idea is that instead of making programs with near-human intelligence, we carefully add meta-data to existing stores so that the data becomes "marked up" with all the information necessary to allow not-so-intelligent software to perform analysis with minimal human intervention (Kalfoglou et al, 2004). The Semantic Web builds on established building-block technologies such as Unicode, URIs (Uniform Resource Identifiers) and XML (Extensible Markup Language) (Dumbill, 2000; Daconta et al, 2003; Decker et al, 2000). The modern data warehouse must embrace these emerging web initiatives. In this paper we propose a model which provides mechanisms for sourcing external data resources for analysts in the warehouse.
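The "smarter data" idea (adding meta-data to existing stores so that not-so-intelligent software can process them with minimal human intervention) can be pictured with a small, purely illustrative sketch; the element and attribute names below are invented and do not come from the proposed model or any published Semantic Web schema.

```python
import xml.etree.ElementTree as ET

def annotate(record, source_url, extracted_on):
    """Wrap a raw external-data record in descriptive meta-data so that
    downstream warehouse tools can process it without human help.
    (Element and attribute names are illustrative, not a published schema.)"""
    root = ET.Element("externalRecord")
    meta = ET.SubElement(root, "metaData")
    ET.SubElement(meta, "source", uri=source_url)
    ET.SubElement(meta, "extractedOn").text = extracted_on
    payload = ET.SubElement(root, "payload")
    for field, value in record.items():
        ET.SubElement(payload, "field", name=field).text = str(value)
    return ET.tostring(root, encoding="unicode")

# Hypothetical web-sourced record destined for the warehouse staging layer
print(annotate({"company": "Acme Ltd", "revenue": 12.5},
               "https://example.org/markets", "2008-03-01"))
```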


Author(s):  
Nashat Mansour ◽  
Nabil Baba

The number of internet web applications is rapidly increasing in a variety of fields, yet not much work has been done to ensure their quality, especially after modification. Modifying any part of a web application may affect other parts. If the stability of a web application is poor, then the impact of modification will be costly in terms of maintenance and testing. Ripple effect is a measure of the structural stability of source code upon changing a part of the code, which provides an assessment of how much a local modification in the web application may affect other parts. Limited work has been published on computing the ripple effect for web applications. In this paper, the authors propose a technique for computing the ripple effect in web applications. This technique is based on direct-change impact analysis and dependence analysis for web applications developed in the .NET environment. Also, a complexity metric is proposed to be included in computing the ripple effect in web applications.
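The paper's ripple effect computation for .NET web applications is not reproduced here; the sketch below only illustrates the underlying idea of propagating a direct change through a dependence graph and sizing the affected portion of the application. The graph, component names and ratio are hypothetical.

```python
from collections import deque

def ripple_set(dependents, changed):
    """Return all components transitively affected when the `changed`
    components are modified. `dependents` maps a component to the
    components that depend on it, i.e. the direction a change propagates."""
    affected, queue = set(changed), deque(changed)
    while queue:
        current = queue.popleft()
        for dep in dependents.get(current, ()):
            if dep not in affected:
                affected.add(dep)
                queue.append(dep)
    return affected

# Hypothetical dependence graph of a small web application
graph = {
    "DataAccess.cs": ["Login.aspx", "Reports.aspx"],
    "Login.aspx": ["Dashboard.aspx"],
    "Dashboard.aspx": ["Reports.aspx", "Admin.aspx"],
}
modules = {"Login.aspx", "Dashboard.aspx", "Reports.aspx", "Admin.aspx", "DataAccess.cs"}
impacted = ripple_set(graph, {"DataAccess.cs"})
print(sorted(impacted), f"ripple ratio = {len(impacted) / len(modules):.2f}")
```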


2021 ◽  
Author(s):  
Lutz Bornmann ◽  
Rüdiger Mutz ◽  
Robin Haunschild ◽  
Felix de Moya-Anegon ◽  
Mirko de Almeida Madeira Clemente ◽  
...  

Abstract In over five years, Bornmann, Stefaner, de Moya Anegon, and Mutz (2014b) and Bornmann, Stefaner, de Moya Anegón, and Mutz (2014c, 2015) have published several releases of the www.excellencemapping.net tool revealing (clusters of) excellent institutions worldwide based on citation data. With the new release, a completely revised tool has been published. It is not only based on citation data (bibliometrics), but also Mendeley data (altmetrics). Thus, the institutional impact measurement of the tool has been expanded by focusing on additional status groups besides researchers such as students and librarians. Furthermore, the visualization of the data has been completely updated by improving the operability for the user and including new features such as institutional profile pages. In this paper, we describe the datasets for the current excellencemapping.net tool and the indicators applied. Furthermore, the underlying statistics for the tool and the use of the web application are explained.


2018 ◽  
Vol 7 (2.30) ◽  
pp. 6
Author(s):  
Daljit Kaur ◽  
Dr Parminder Kaur

With the growth of the web and the Internet, every area of human life has been affected. People want to make their own or their organization's presence globally visible through this medium. Web applications and/or mobile apps are used to build this recognition and to attract clients worldwide. With the demand to put businesses or services online faster than anyone else, web applications are developed in haste and under pressure, and developers often ignore a few essential activities for securing them against severe attacks, which may mean a greater loss for the business. This work is an effort to understand the complex distributed environment of web applications and to show the impact of rushing the web development process.


2016 ◽  
Vol 1 (1) ◽  
Author(s):  
Nanou Peelman ◽  
Peter Ragaert ◽  
Elien Verguldt ◽  
Frank Devlieghere ◽  
Bruno De Meulenaer

Abstract The research aim was to evaluate the applicability of biobased plastics for packing long shelf-life food products, both on laboratory and industrial scale. Therefore, the shelf-life (at room temperature) of tortilla chips, dry biscuits and potato flakes packed under air or modified atmosphere (MAP) in xylan- and cellulose-based packages was evaluated and compared with their shelf-life in reference (conventional) packaging materials. These tests were followed by packaging trials on industrial lines. Furthermore, overall migration studies and printability tests were performed. Most of the biobased packages showed a sufficient barrier towards moisture and gases to serve as a food packaging material, and MAP packaging of long shelf-life food products is possible. For very moisture-sensitive food products (e.g. dry biscuits), however, no suitable packaging material was found. The quality of the tortilla chips and potato flakes could be guaranteed during their shelf-life, even when packaging materials with lower barrier properties were used. Still, brittleness and seal properties are critical for use on industrial scale (important for use on vertical flow packaging machines). Furthermore, the films were printable, and migration tests showed compliance with legislation. This study shows promising results towards the industrial application of biobased packaging materials for long shelf-life food products.


Author(s):  
David Parsons

This chapter explores how Web application software architecture has evolved from the simple beginnings of static content, through dynamic content, to adaptive content and the integrated client-server technologies of Web 2.0. It reviews how various technologies and standards have developed in a repeating cycle of innovation, which tends to fragment the Web environment, followed by standardisation, which enables the wider reach of new technologies. It examines the impact of Web 2.0, XML, Ajax and mobile Web clients on Web application architectures, and how server-side processes can support increasingly rich, diverse and interactive clients. It provides an overview of a server-side Java-based architecture for contemporary Web applications that demonstrates some of the key concepts under discussion. By outlining the various forces that influence architectural decisions, this chapter should help developers to take advantage of the potential of innovative technologies without sacrificing the broad reach of standards-based development.

