Practicing Software Engineering in the 21st Century
Latest Publications

TOTAL DOCUMENTS: 18 (FIVE YEARS 0)
H-INDEX: 1 (FIVE YEARS 0)
Published By IGI Global
ISBN: 9781931777506, 9781931777667

Author(s):  
Valentina Plekhanova

This chapter presents a project proposal that defines future work on engineering the learning processes in cognitive systems. The proposal outlines a number of directions in the fields of systems engineering, machine learning, knowledge engineering and profile theory that lead to the development of formal methods for the modeling and engineering of learning systems. The chapter describes a framework, based on computational methods, for formalizing and engineering cognitive processes. The proposed work studies cognitive processes in the software development process and considers a cognitive system as a multi-agent system of human cognitive agents. It is important to note that this framework can be applied to different types of learning systems, and that various techniques from different theories (e.g., system theory, quantum theory, neural networks) can be used to describe cognitive systems, which in turn can be represented by different types of cognitive agents.
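
To make the "cognitive system as a multi-agent system" reading slightly more concrete, the following is a deliberately minimal sketch, not the chapter's formal framework: each agent keeps a crude knowledge profile and updates it from experience, and the system aggregates over its agents. All class, attribute and parameter names are illustrative assumptions.

```python
# Minimal sketch only: a "cognitive system" as a collection of learning agents.
class CognitiveAgent:
    def __init__(self, name, skill=0.0):
        self.name = name
        self.skill = skill                 # a one-number stand-in for a richer profile

    def learn(self, experience, rate=0.1):
        """Move the agent's skill towards the level demanded by the experience."""
        self.skill += rate * (experience - self.skill)

class CognitiveSystem:
    def __init__(self, agents):
        self.agents = agents

    def run_task(self, difficulty):
        """Every agent learns from the task; return the team's mean skill."""
        for agent in self.agents:
            agent.learn(difficulty)
        return sum(a.skill for a in self.agents) / len(self.agents)

team = CognitiveSystem([CognitiveAgent("a1"), CognitiveAgent("a2", skill=0.5)])
print(team.run_task(difficulty=1.0))       # mean skill after one shared task
```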


Author(s):  
Vijay V. Raghavan

Popular approaches to studying information systems security include architectural, infrastructure-related and system-level security. This study focuses on software security as implemented and monitored during the systems development and implementation stages. Moving away from past checklist methods of studying software security, this study provides a model that can be used to categorize checklists into meaningful clusters. Several constructs are identified in the model, such as the principle of least privilege, execution monitoring, social engineering, and formalism versus pragmatism in security implementations. Identifying useful constructs to study can form the basis for evaluating security in software systems as well as provide guidelines for implementing security in newly developed systems.
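
As a hypothetical illustration of the kind of categorization such a model suggests, the sketch below groups free-form checklist items under the constructs named in the abstract. The keyword lists and example items are invented, not taken from the study.

```python
# Hypothetical sketch: cluster security checklist items by the construct they touch.
CONSTRUCT_KEYWORDS = {
    "principle of least privilege": ["privilege", "access right", "role"],
    "execution monitoring": ["monitor", "audit", "log"],
    "social engineering": ["phishing", "pretext", "awareness"],
    "formalism vs. pragmatism": ["policy", "standard", "workaround"],
}

def categorise(checklist_items):
    """Assign each checklist item to the first construct whose keywords it mentions."""
    clusters = {construct: [] for construct in CONSTRUCT_KEYWORDS}
    clusters["uncategorised"] = []
    for item in checklist_items:
        text = item.lower()
        for construct, keywords in CONSTRUCT_KEYWORDS.items():
            if any(keyword in text for keyword in keywords):
                clusters[construct].append(item)
                break
        else:
            clusters["uncategorised"].append(item)
    return clusters

print(categorise(["Review audit logs weekly", "Grant roles with minimal privileges"]))
```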


Author(s):  
Judith Kabeli, Peretz Shoval

FOOM (Functional and Object-Oriented Methodology) is an integrated methodology for information systems analysis and design that combines two essential software-engineering paradigms: the functional/data (process-oriented) approach and the object-oriented (OO) approach. FOOM has been applied in a variety of domains; this chapter presents its application to the specification of the IFIP Conference system, focusing on the analysis and design phases. The FOOM analysis phase includes data modeling and functional analysis activities and produces an initial class diagram and a hierarchy of OO data flow diagrams (OO-DFDs). The products of the design phase include: (a) a complete class diagram; (b) object classes for the menus, forms and reports; and (c) a behavior schema, which consists of detailed descriptions of the methods and the application transactions, expressed in pseudocode and message diagrams.
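
A hypothetical sketch, not taken from the chapter, of one object class such a design might produce for the IFIP Conference system: a class from the class diagram plus a method that would appear in the behavior schema as an application transaction. The class, attribute and method names are invented for illustration.

```python
# Illustrative only: one object class and one transaction-style method.
class Paper:
    def __init__(self, paper_id, title, authors):
        self.paper_id = paper_id
        self.title = title
        self.authors = authors
        self.reviews = []              # associated review records
        self.status = "submitted"

    def assign_reviewer(self, reviewer):
        """Transaction: attach a reviewer to the paper and record the state change."""
        review = {"reviewer": reviewer, "score": None}
        self.reviews.append(review)
        self.status = "under review"
        return review

paper = Paper("P-17", "Adaptive Methodologies", ["A. Author"])
paper.assign_reviewer("R. Reviewer")
print(paper.status)                    # "under review"
```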


Author(s):  
John Mendonca, Jeff Brewer

Historically, the approach to software engineering has been based on a search for an optimal (ideal) methodology — that is, the identification and application of a set of processes, methods and tools that can consistently and predictably lead to software development success. This chapter presents the basis for pursuing a more flexible and adaptive approach to methodology. Less methodical methodologies, under a variety of names, take a contingency-oriented approach. Because of the limitations in the nature of methodology, the high failure rate in software development, the need to develop methodology within an environmental context and the pressures of fast-paced “e-development,” the authors argue that further exploration and definition of an adaptive, contingency-based approach to methodology is justified.
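
As a purely hypothetical illustration of what a contingency-oriented choice might look like in practice, the sketch below scores a few project characteristics and maps them to a process style. The attributes, weights and threshold are invented and are not the authors' model.

```python
# Invented example of contingency-based methodology selection.
def recommend_approach(requirements_volatility, team_size, safety_critical):
    """Return a rough process recommendation from a few contingency factors."""
    adaptive_score = 0
    adaptive_score += 2 if requirements_volatility == "high" else 0
    adaptive_score += 1 if team_size <= 10 else -1
    adaptive_score -= 2 if safety_critical else 0
    return "adaptive / lightweight" if adaptive_score >= 2 else "plan-driven / heavyweight"

print(recommend_approach("high", team_size=6, safety_critical=False))   # adaptive / lightweight
print(recommend_approach("low", team_size=40, safety_critical=True))    # plan-driven / heavyweight
```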


Author(s):  
Rick Gibson

This chapter will identify the key aspects of software engineering and systems engineering in an effort to highlight areas of consensus and conflict, in support of current efforts by practitioners and academics in both disciplines to redefine their professions and bodies of knowledge. By using the Software Engineering Institute's Capability Maturity Model Integration (CMMI) project, which combines best practices from the systems and software engineering disciplines, it can be shown that significant points of agreement and consensus are evident. Nevertheless, valid objections to such integration remain as areas of conflict. This chapter will provide an opportunity for these two communities to resolve unnecessary differences in terminology and methodology that are reflected in their different perspectives and entrenched in their organizational cultures.


Author(s):  
Zoran Stojanovic, Ajantha Dahanayake

Although Component-Based Development (CBD) platforms and technologies, such as CORBA, COM+/.NET and Enterprise JavaBeans (EJB), are now de facto standards for the implementation and deployment of complex enterprise distributed systems, the full benefit of the component way of thinking has not yet been gained. Current CBD approaches and methods treat components mainly as binary-code implementation packages or as larger-grained business objects in system analysis and design. Little attention has been paid to the potential of the component way of thinking to fill the gap between business and information technology (IT) issues. This chapter proposes a service-based approach in which the component concept represents the point of convergence of business and technology concerns. The approach defines components as the main building blocks of a business-driven, service-based system architecture that provides effective business-IT alignment.
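
A hedged sketch of the "component as a service" idea: the component is specified by the business-level contract it offers, while the realization (EJB, .NET, CORBA object, or a plain class) stays behind the interface. The contract and class names below are illustrative assumptions, not taken from the chapter.

```python
# Illustrative only: a component defined by the service contract it offers.
from abc import ABC, abstractmethod

class OrderingService(ABC):
    """Business-level contract: what the component offers to its clients."""

    @abstractmethod
    def place_order(self, customer_id: str, items: list[str]) -> str:
        """Accept an order and return its identifier."""

class OrderingComponent(OrderingService):
    """One possible realisation; clients depend only on OrderingService."""

    def __init__(self):
        self._orders: dict[str, list[str]] = {}

    def place_order(self, customer_id: str, items: list[str]) -> str:
        order_id = f"{customer_id}-{len(self._orders) + 1}"
        self._orders[order_id] = items
        return order_id

component: OrderingService = OrderingComponent()
print(component.place_order("C42", ["item-1", "item-2"]))   # e.g. "C42-1"
```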


Author(s):  
Laura Felice, Daniel Riesco

During the RAISE specification development process, a variety of components and infrastructures are built. These components are not independent; rather, they are related to each other, especially when different systems are specified within the same infrastructure. The RAISE method is based on the idea that software development is a stepwise, evolutionary process of applying semantics-preserving transitions. Reuse is therefore crucial at all stages of development, yet this development process makes no explicit reference to specification reusability. This chapter presents a rigorous reuse process for RAISE Specification Language (RSL) components. We provide a mechanism for selecting a reusable component in order to guide RAISE developers in software specification and construction.
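
To give a flavour of what a selection mechanism over a catalogue of specification components could look like, here is a hypothetical sketch: candidates are ranked by how many of the required features (types, operations) their description mentions. The catalogue entries and feature names are invented; the chapter's actual mechanism for RSL components is more rigorous.

```python
# Invented example: rank catalogued components by feature coverage.
CATALOGUE = {
    "QUEUE_SPEC": {"type Queue", "empty", "enqueue", "dequeue"},
    "SET_SPEC": {"type Set", "empty", "add", "is_member"},
    "MAP_SPEC": {"type Map", "empty", "put", "get"},
}

def select_component(required_features):
    """Return catalogue components ordered by how many required features they cover."""
    required = set(required_features)
    ranked = sorted(
        CATALOGUE.items(),
        key=lambda entry: len(required & entry[1]),
        reverse=True,
    )
    return [(name, len(required & features)) for name, features in ranked]

print(select_component({"type Queue", "enqueue", "dequeue"}))
# [('QUEUE_SPEC', 3), ('SET_SPEC', 0), ('MAP_SPEC', 0)]
```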


Author(s):  
T. Y. Chen, Iyad Rahwan, Yun Yang

This chapter introduces a novel notion of temporal interaction diagrams for distributed and parallel programming. An interaction diagram is a graphical view of computation processes and communication between different entities in distributed and parallel processes. It can be used for the specification, implementation and testing of interaction policies in distributed and parallel systems. Expressing interaction diagrams in a linear form, known as fragmentation, facilitates the automation of design and testing of such systems. Existing interaction diagram formalisms lack the flexibility and capability to describe more general temporal order constraints: they support only rigid temporal order and hence have limited semantic expressiveness. We propose an improved interaction diagram formalism in which more general temporal constraints can be expressed. This enables us to capture multiple valid interaction sequences with a single interaction diagram.
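
One way to see how a single diagram can admit several valid sequences is to model its temporal constraints as a partial order over messages and check whether an observed sequence respects them. The sketch below is not the authors' formalism; the message names and constraints are hypothetical examples.

```python
# Minimal sketch: temporal constraints as "a happens before b" pairs; any pair
# not mentioned may occur in either order, so one constraint set describes
# multiple valid interaction sequences.
CONSTRAINTS = {
    ("request", "ack"),
    ("ack", "data"),
    ("request", "log"),   # "log" is otherwise unordered w.r.t. "ack" and "data"
}

def satisfies(sequence, constraints=CONSTRAINTS):
    """Return True if the observed message sequence respects every 'before' constraint."""
    position = {msg: i for i, msg in enumerate(sequence)}
    return all(
        position[a] < position[b]
        for a, b in constraints
        if a in position and b in position
    )

print(satisfies(["request", "log", "ack", "data"]))   # True
print(satisfies(["request", "ack", "log", "data"]))   # True  (another valid order)
print(satisfies(["ack", "request", "data", "log"]))   # False (violates request before ack)
```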


Author(s):  
Gordana Jovanovic-Dolecek

This chapter presents the design of narrowband highpass linear-phase finite impulse response (FIR) filters using the sharpening recursive running sum (RRS) filter and the interpolated finite impulse response (IFIR) structure. The novelty of the technique lies in using the sharpening RRS filter as an image suppressor in the IFIR structure, which considerably reduces the total number of multiplications per output sample.
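
A minimal numerical sketch of the sharpening idea, not the chapter's exact design: it applies the classic sharpening polynomial 3H^2 - 2H^3 to an RRS (moving-average) filter, delaying the 3H^2 branch so the linear-phase branches stay aligned. The length M is an arbitrary example, and numpy is assumed to be available.

```python
# Illustrative sharpening of an RRS filter with the polynomial 3*H^2 - 2*H^3.
import numpy as np

M = 9                              # RRS length (odd, so the branch delay is an integer)
h = np.ones(M) / M                 # H(z): recursive running sum / moving average

h2 = np.convolve(h, h)             # impulse response of H^2(z)
h3 = np.convolve(h2, h)            # impulse response of H^3(z)

# Combine 3*H^2 - 2*H^3, delaying the H^2 branch by (M-1)/2 samples so that
# both branches have the same group delay before they are added.
D = (M - 1) // 2
h_sharp = 3 * np.pad(h2, (D, len(h3) - len(h2) - D)) - 2 * h3

def mag(taps):
    """Magnitude response on [0, pi]."""
    w = np.linspace(0, np.pi, 512)
    return np.abs(np.exp(-1j * np.outer(w, np.arange(len(taps)))) @ taps)

print(round(mag(h)[0], 3), round(mag(h_sharp)[0], 3))  # both 1.0 at DC; the sharpened response is flatter there
```

Because the branches reuse the same RRS section, the sharpened filter adds structure rather than new coefficient multipliers, which is what keeps the multiplication count low.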


Author(s):  
Gordana Jovanovic-Dolecek, Javier Diaz-Carmona

This chapter describes the design of a narrowband lowpass finite impulse response (FIR) filter that uses a small number of multipliers per output sample (MPS). The method is based on the use of a frequency-improved recursive running sum (RRS), called the sharpening RRS filter, and the interpolated finite impulse response (IFIR) structure. The filter-sharpening technique uses multiple copies of the same filter according to an amplitude change function (ACF), which maps the transfer function before sharpening to a desired form after sharpening. Three ACFs are used in the design, as illustrated in the accompanying examples.
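
The IFIR part of the scheme can be sketched as follows. This is illustrative only: the chapter's actual design uses a sharpened RRS image suppressor and three ACFs, whereas the sketch uses a plain RRS, and the filter length, cutoff and stretching factor are arbitrary examples (scipy is assumed to be available).

```python
# Rough IFIR sketch: stretched model filter cascaded with an image suppressor.
import numpy as np
from scipy import signal

L = 4                                         # interpolation (stretching) factor
g = signal.firwin(31, 0.4)                    # model filter for the L-times wider band (cutoff in Nyquist units)

g_expanded = np.zeros(L * (len(g) - 1) + 1)   # G(z^L): insert L-1 zeros between taps
g_expanded[::L] = g                           # spectral images now appear around multiples of 2*pi/L

suppressor = np.ones(L) / L                   # plain RRS standing in for the sharpened-RRS image suppressor
h_ifir = np.convolve(g_expanded, suppressor)  # overall narrowband lowpass filter
print(len(g), len(h_ifir))                    # 31 multipliers in g vs. the much longer equivalent filter
```

The multiplier savings come from the model filter being designed for a band L times wider (hence much shorter) while the image suppressor is multiplier-free apart from scaling.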

