Cloud Native Data Architecture

2021 ◽  
pp. 325-369
Author(s):  
Shivakumar R Goniwada
Keyword(s):  
2019 ◽  
Vol 2 (1) ◽  
pp. 1-16
Author(s):  
Anjas Tryana

With the development of technology today, it is very important for every company to plan and develop a system that supports its business processes. Achieving the goals of an enterprise involves challenges and changes that require strategies for effective action and efficient use of resources. One important and increasingly widely used strategy is the use and improvement of information system support for the enterprise. Such planning can apply the enterprise architecture planning methodology, which produces a data architecture, an application architecture, a technology architecture, and a direction for their implementation. CV Biensi Fesyenindo is a retail garment company with branches throughout Indonesia, covering Kalimantan, Sulawesi, NTB, NTT, Bali, Java, and Sumatra. In its daily activities, the company carries out production through distribution processes to meet market and employee needs. The enterprise architecture model used in this study is Enterprise Architecture Planning (EAP). EAP is a process of defining enterprise architecture that focuses on data, application, and technology architectures in support of the business, together with a plan to implement the architecture. The EAP method has several stages: planning initiation, business modeling, current systems and technology, data architecture, application architecture, technology architecture, and implementation plans. The result of this study is a set of information system recommendations for CV Biensi Fesyenindo in the form of an enterprise architecture planning blueprint that defines five main business processes and comprises a data architecture, an application architecture, and a proposed technology architecture, presented in five chapters across 110 pages.


2019 ◽  
Vol 5 (1) ◽  
pp. 48-54
Author(s):  
Dicky Yudha Handika ◽  
Rahmat Mulyana ◽  
Nia Ambarsari

In support of the government's Electronic-Based Government System (Sistem Pemerintahan Berbasis Elektronik, SPBE) program, the Tourism and Culture Office of West Bandung Regency (Dinas Pariwisata dan Kebudayaan Kabupaten Bandung Barat, Disparbud KBB), particularly its General Personnel and Cultural Development functions, is expected to align business services with information technology services when changing both internal and public services, in order to develop the region's cultural and tourism potential and improve the economy of the people of West Bandung Regency. Enterprise Architecture (EA) provides the answer in the form of an IT blueprint documentation design. The EA design in this study uses the TOGAF ADM framework to meet system development needs through Phase C: Information System Architecture (Data Architecture & Application Architecture). Document analysis, interviews, and observation were carried out to gather the organization's information requirements during EA development. The result obtained is a solution for integrating data and application processes through Government Service Bus (GSB) technology.


Author(s):  
Michael Goul ◽  
T. S. Raghu ◽  
Ziru Li

As procurement organizations increasingly move from a cost-and-efficiency emphasis to a profit-and-growth emphasis, flexible data architecture will become an integral part of a procurement analytics strategy. It is therefore imperative for procurement leaders to understand and address digitization trends in supply chains and to develop strategies to create robust data architecture and analytics strategies for the future. This chapter assesses and examines the ways companies can organize their procurement data architectures in the big data space to mitigate current limitations and to lay foundations for the discovery of new insights. It sets out to understand and define the levels of maturity in procurement organizations as they pertain to the capture, curation, exploitation, and management of procurement data. The chapter then develops a framework for articulating the value proposition of moving between maturity levels and examines what the future entails for companies with mature data architectures. In addition to surveying the practitioner and academic research literature on procurement data analytics, the chapter presents detailed and structured interviews with over fifteen procurement experts from companies around the globe. The chapter finds several important and useful strategies that have helped procurement organizations design strategic roadmaps for the development of robust data architectures. It then identifies four archetypal procurement-area data architecture contexts. In addition, the chapter details an exemplary high-level mature data architecture for each archetype and examines the critical assumptions underlying each one. Data architectures built for the future need a design approach that supports both descriptive and real-time, prescriptive analytics.


2021 ◽  
pp. 1-13
Author(s):  
Daniel A. Contreras ◽  
Zachary Batist ◽  
Ciara Zogheib ◽  
Tristan Carter

The documentation and analysis of archaeological lithics must navigate a basic tension between examining and recording data on individual artifacts or on aggregates of artifacts. This poses a challenge both for artifact processing and for database construction. We present here an R Shiny solution that enables lithic analysts to enter data for both individual artifacts and aggregates of artifacts while maintaining a robust yet flexible data structure. This takes the form of a browser-based database interface that uses R to query existing data and transform new data as necessary so that users entering data of varying resolutions still produce data structured around individual artifacts. We demonstrate the function and efficacy of this tool (termed the Queryable Artifact Recording Interface [QuARI]) using the example of the Stelida Naxos Archaeological Project (SNAP), which, focused on a Paleolithic and Mesolithic chert quarry, has necessarily confronted challenges of processing and analyzing large quantities of lithic material.
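The core transformation the abstract describes, accepting entries at either individual or aggregate resolution while keeping the stored table structured around individual artifacts, can be illustrated with a short sketch. QuARI itself is an R Shiny application; the following is a hypothetical Python sketch of that normalization step only, and the field names (count, raw_material, class) are assumptions rather than QuARI's actual schema.

```python
# Minimal sketch (not QuARI itself) of expanding mixed-resolution data entry
# into one row per individual artifact. Field names are hypothetical.
from dataclasses import dataclass, field
from itertools import count
from typing import Dict, List

_artifact_ids = count(1)

@dataclass
class ArtifactRecord:
    artifact_id: int
    attributes: Dict[str, str] = field(default_factory=dict)

def expand_entry(entry: Dict) -> List[ArtifactRecord]:
    """Turn one data-entry submission into individual artifact rows.

    An 'aggregate' entry carries a count plus shared attributes; an
    'individual' entry describes a single artifact. Either way, the output
    is one row per artifact, so later per-artifact observations can be
    attached without restructuring the table.
    """
    n = entry.get("count", 1)
    shared = {k: v for k, v in entry.items() if k != "count"}
    return [ArtifactRecord(next(_artifact_ids), dict(shared)) for _ in range(n)]

# Example: a tray of 4 flakes logged as one aggregate entry...
rows = expand_entry({"count": 4, "raw_material": "chert", "class": "flake"})
# ...plus one individually recorded blade.
rows += expand_entry({"raw_material": "obsidian", "class": "blade"})
for r in rows:
    print(r.artifact_id, r.attributes)
```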


Author(s):  
Shinichi Fukushige ◽  
Yuki Matsuyama ◽  
Eisuke Kunii ◽  
Yasushi Umeda

Within the framework of sustainability in manufacturing industry, product lifecycle design is a key approach for constructing resource circulation systems of industrial products that drastically reduce environmental loads, resource consumption and waste generation. In such design, designers should consider both a product and its lifecycle from a holistic viewpoint, because the product’s structure, geometry, and other attributes are closely coupled with the characteristics of the lifecycle. Although product lifecycle management (PLM) systems integrate product data during its lifecycle into one data architecture, they do not focus on support for lifecycle design process. In other words, PLM does not provide explicit models for designing product lifecycles. This paper proposes an integrated model of a product and its lifecycle and a method for managing consistency between the two. For the consistency management, three levels of consistency (i.e., topological, geometric, and semantic) are defined. Based on this management scheme, the product lifecycle model allows designers to evaluate environmental, economic, and other performance of the designed lifecycle using lifecycle simulation.
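To make the three consistency levels more concrete, here is a hypothetical Python sketch of a consistency-management scheme between a product model and a lifecycle model. The paper does not specify data structures, so every class, field, and check below is an illustrative assumption, not the authors' implementation.

```python
# Illustrative sketch of checking product/lifecycle consistency at three
# levels (topological, geometric, semantic). All names are hypothetical.
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

@dataclass
class ProductModel:
    components: List[str]                 # product structure
    geometry: Dict[str, float]            # e.g. component volume
    attributes: Dict[str, str]            # e.g. material

@dataclass
class LifecycleModel:
    flows: List[Tuple[str, str]]          # (component, lifecycle process)
    process_capacity: Dict[str, float]    # e.g. max volume a process handles
    process_requirements: Dict[str, str]  # e.g. required material

def topological_check(p: ProductModel, lc: LifecycleModel) -> bool:
    # Every component must appear in at least one lifecycle flow.
    return set(p.components) <= {c for c, _ in lc.flows}

def geometric_check(p: ProductModel, lc: LifecycleModel) -> bool:
    # Component geometry must fit the capacity of its target process.
    return all(p.geometry.get(c, 0.0) <= lc.process_capacity.get(proc, float("inf"))
               for c, proc in lc.flows)

def semantic_check(p: ProductModel, lc: LifecycleModel) -> bool:
    # Component attributes must satisfy the target process requirements.
    return all(lc.process_requirements.get(proc) in (None, p.attributes.get(c))
               for c, proc in lc.flows)

CHECKS: Dict[str, Callable[[ProductModel, LifecycleModel], bool]] = {
    "topological": topological_check,
    "geometric": geometric_check,
    "semantic": semantic_check,
}

def consistency_report(p: ProductModel, lc: LifecycleModel) -> Dict[str, bool]:
    return {level: check(p, lc) for level, check in CHECKS.items()}

# Example: a one-component product flowing into a recycling process.
p = ProductModel(["housing"], {"housing": 0.8}, {"housing": "aluminum"})
lc = LifecycleModel([("housing", "recycle")], {"recycle": 1.0}, {"recycle": "aluminum"})
print(consistency_report(p, lc))  # all three levels consistent
```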


Author(s):  
M.Dolores Ruiz ◽  
Juan Gomez-Romero ◽  
Carlos Fernandez-Basso ◽  
Maria J. Martin-Bautista

2021 ◽  
Vol 10 (1) ◽  
pp. 99-107
Author(s):  
ByungRae Cha ◽  
Sun Park ◽  
JaeHyun Seo ◽  
JongWon Kim ◽  
Byeong-Chun Shin
Keyword(s):  

2016 ◽  
Vol 33 (4) ◽  
pp. 621-634 ◽  
Author(s):  
Jingyin Tang ◽  
Corene J. Matyas

The creation of a 3D mosaic is often the first step when using the high-spatial- and temporal-resolution data produced by ground-based radars. Efficient yet accurate methods are needed to mosaic data from dozens of radars to better understand the precipitation processes in synoptic-scale systems such as tropical cyclones. Research-grade radar mosaic methods for analyzing historical weather events should utilize data from both sides of a moving temporal window and process them in a flexible data architecture that is not available in most stand-alone software tools or real-time systems. Thus, these historical analyses require a different strategy for optimizing flexibility and scalability by removing time constraints from the design. This paper presents a MapReduce-based playback framework using Apache Spark's computational engine to interpolate large volumes of radar reflectivity and velocity data onto 3D grids. Designed to be friendly to use on a high-performance computing cluster, these methods may also be executed on a low-end configured machine. A protocol is designed to enable interoperability with GIS and spatial analysis functions in this framework. Open-source software is utilized to enhance radar usability in the nonspecialist community. Case studies during a tropical cyclone landfall show this framework's capability of efficiently creating a large-scale, high-resolution 3D radar mosaic with the integration of GIS functions for spatial analysis.
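The map/reduce step at the heart of such a playback framework, mapping each radar gate to a 3D grid cell and then reducing per cell to form the composite, can be sketched with PySpark. This is an illustrative sketch under simplifying assumptions (gates already georeferenced to Cartesian coordinates, a maximum-reflectivity composite, no temporal-window handling), not the paper's actual pipeline, which may differ in language, interpolation method, and grid design.

```python
# Illustrative PySpark sketch of the map/reduce core of a 3D radar mosaic:
# map each gate to a 3D grid cell, then reduce per cell (here: max dBZ).
# Inputs are assumed to be pre-georeferenced tuples (x_m, y_m, z_m, dbz).
from pyspark.sql import SparkSession

CELL_XY = 1000.0  # horizontal grid spacing in meters (assumed)
CELL_Z = 500.0    # vertical grid spacing in meters (assumed)

def to_cell(gate):
    x, y, z, dbz = gate
    key = (int(x // CELL_XY), int(y // CELL_XY), int(z // CELL_Z))
    return key, dbz

spark = SparkSession.builder.appName("radar-mosaic-sketch").getOrCreate()

# Toy gate records standing in for data decoded from many radars.
gates = spark.sparkContext.parallelize([
    (12500.0, 3200.0, 1100.0, 42.0),
    (12700.0, 3300.0, 1050.0, 47.5),
    (55000.0, -8000.0, 2600.0, 18.0),
])

mosaic = (gates
          .map(to_cell)          # map: gate -> (grid cell, dBZ)
          .reduceByKey(max)      # reduce: composite value per cell
          .collect())

for cell, dbz in mosaic:
    print(cell, dbz)

spark.stop()
```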


Complexity ◽  
2019 ◽  
Vol 2019 ◽  
pp. 1-13
Author(s):  
Cristina Sánchez-Rebollo ◽  
Cristina Puente ◽  
Rafael Palacios ◽  
Claudia Piriz ◽  
Juan P. Fuentes ◽  
...  

Social networks are being used by terrorist organizations to distribute messages with the intention of influencing people and recruiting new members. The research presented in this paper focuses on the analysis of Twitter messages to detect the leaders orchestrating terrorist networks and their followers. A big data architecture is proposed to analyze messages in real time in order to classify users according to different parameters, such as level of activity, the ability to influence other users, and the contents of their messages. Graphs have been used to analyze how messages propagate through the network, which involves a study of followers based on retweets and general impact on other users. Then, fuzzy clustering techniques were used to classify users into profiles, with the advantage over other classification techniques of providing a probability for each profile instead of a binary categorization. The algorithms were tested using a public database from Kaggle and other Twitter extraction techniques. The resulting profiles detected automatically by the system were manually analyzed, and the parameters that describe each profile correspond to the type of information that any expert may expect. Future applications are not limited to detecting terrorist activism: human resources departments can apply the power of profile identification to automatically classify candidates, security teams can detect undesirable clients in the financial or insurance sectors, and immigration officers can extract additional insights with these techniques.
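Fuzzy clustering assigns each user a degree of membership in every profile rather than a single hard label, which is the advantage the abstract highlights. The following NumPy sketch implements standard fuzzy c-means on a toy user-feature matrix; the features (activity, influence, content score) are illustrative stand-ins for the parameters named in the abstract, not the paper's actual feature set or implementation.

```python
# Minimal NumPy sketch of fuzzy c-means: each row of the returned membership
# matrix sums to 1 and reads as probability-like profile memberships.
import numpy as np

def fuzzy_cmeans(X, c=3, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Return (centers, memberships) for data X of shape (n_samples, n_features)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)              # memberships sum to 1 per sample
    for _ in range(max_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        dist = np.fmax(dist, 1e-10)                # avoid division by zero
        inv = dist ** (-2.0 / (m - 1.0))
        U_new = inv / inv.sum(axis=1, keepdims=True)
        if np.linalg.norm(U_new - U) < tol:
            U = U_new
            break
        U = U_new
    return centers, U

# Toy user features: [activity level, influence, content score] (hypothetical).
users = np.array([
    [0.9, 0.8, 0.7],   # highly active and influential
    [0.8, 0.9, 0.6],
    [0.2, 0.1, 0.1],   # mostly passive
    [0.1, 0.2, 0.2],
    [0.5, 0.4, 0.9],
])
centers, memberships = fuzzy_cmeans(users, c=2)
print(np.round(memberships, 2))  # per-user membership in each profile
```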

