Technology’s Impact on the Performance Management Transformation

Author(s):  
Ryan S. O’Leary ◽  
Elizabeth Lentz

Organizations are facing unprecedented levels of change as forces like hypercompetition, knowledge commoditization, and technology have a significant impact on performance and threaten organizational survival. These pressures, combined with the shortcomings of existing performance management (PM) approaches, are forcing organizations to develop more effective PM practices. Competing effectively in this environment demands novel, cutting-edge, and more impactful approaches. There has been a movement toward processes that are data driven, flexible, continuous, engaging, future focused, and development oriented, and that allow for more informed decision-making by organizations, managers, and employees. To support and drive this transformation, organizations are increasingly relying on technology, people analytics, and data science solutions. This chapter reviews the evolution of PM technology, examines its potential to add value and lead to behavioral change, reviews possible limitations and barriers to implementation to ensure the hype does not outpace reality, and provides practical guidance for organizations looking to implement these new technologies. It is argued that advances in technology can support the activities needed to truly drive performance, but that technology does not create change in and of itself. Emergent technology must be combined with organizational change strategies and evaluated for its individual, team, and organizational impact. In addition, it is imperative to examine how technology can be most effectively leveraged to improve the specific performance-driving behaviors that research has shown lead to more impactful PM approaches, rather than simply automating processes that have not worked well in the past.

2021 ◽  
Author(s):  
Ramez Saeed ◽  
Saad Abdelrahman ◽  
Andrea Scozari ◽  
Abdelazim Negm

ABSTRACT

With the fast-growing demand for all possible forms of remote work as a result of the COVID-19 pandemic, new technologies using satellite data have been strongly encouraged for multidisciplinary applications in fields such as agriculture, climate change, environment, coastal management, maritime, security, and the Blue Economy.

This work supports applying Satellite Derived Bathymetry (SDB) with available low-cost multispectral satellite imagery applications, instruments, and readily accessible data for different areas, given only their benthic parameters, water characteristics, and atmospheric conditions. The main goal of this work is to derive the bathymetric data needed for different hydrographic applications, such as nautical charting, coastal engineering, water quality monitoring, sediment movement monitoring, and supporting both green carbon and marine data science. This work also proposes and assesses an SDB procedure that makes use of publicly available multispectral satellite images (Sentinel-2 MSI) and applies algorithms available in the SNAP software package for extracting bathymetry and supporting bathymetric layers, as an alternative to highly expensive traditional in-situ hydrographic surveys. The procedure was applied to the SAFAGA harbor area, located south of Hurghada at (26°44′N, 33°56′E) on the Egyptian Red Sea coast. SAFAGA controls important maritime traffic lines in the Red Sea, such as the Safaga – Deba (Saudi Arabia) maritime cruises. Depths at SAFAGA range from 6 m to 22 m, surrounded by many shoal patches and confined waters that greatly affect the safety of maritime navigation. There is therefore a constant demand for updated nautical charts, which this work supports: its outcome provides bathymetric layer data for the approach channel and harbour usage bands of the SAFAGA electronic nautical chart with reasonable accuracies.

The coefficient of determination (R²) ranges from 0.42 to 0.71 after applying water column correction with the Lyzenga algorithm and deriving bathymetric data from the relationship, modeled by the Stumpf equation, between the reflectance/radiance of optical imagery collected by the Sentinel-2 missions and in-situ depth values. The adopted approach proved to give highly reasonable results that could be used in nautical chart compilation. Similar methodologies could be applied to inland water bodies. This study is part of the MSc thesis of the first author and falls within the framework of a still-running bilateral project between ASRT of Egypt and CNR of Italy.

Keywords: Algorithm, Bathymetry, Sentinel-2, nautical charting, Safaga port, satellite imagery, water depth, Egypt.
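The band-ratio step described above can be sketched in a few lines; this is a generic illustration of the Stumpf log-ratio model calibrated against in-situ depths, not the authors' exact SNAP workflow, and the variable names and the fixed constant n = 1000 are assumptions:

```python
import numpy as np

def stumpf_depth(blue, green, insitu_depth, n=1000.0):
    """Estimate depth with the Stumpf band-ratio model (illustrative sketch).

    blue, green  : water-leaving reflectance arrays in the two bands
    insitu_depth : known depths used to calibrate the linear model
    n            : fixed constant keeping both log terms positive
    """
    ratio = np.log(n * blue) / np.log(n * green)
    # Calibrate gain (m1) and offset (m0) against the in-situ depths
    m1, m0 = np.polyfit(ratio, insitu_depth, 1)
    depth = m1 * ratio + m0
    # R^2 of the fit, comparable to the 0.42-0.71 reported above
    ss_res = np.sum((insitu_depth - depth) ** 2)
    ss_tot = np.sum((insitu_depth - insitu_depth.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    return depth, r2
```

In practice the reflectances would come from atmospherically corrected, water-column-corrected Sentinel-2 pixels co-located with survey soundings.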


Author(s):  
Cat Drew

Data science can offer huge opportunities for government. With the ability to process larger and more complex datasets than ever before, it can provide better insights for policymakers and make services more tailored and efficient. As with any new technology, there is a risk that we fail to take up its opportunities and miss out on its enormous potential. We want people to feel confident to innovate with data. So, over the past 18 months, the Government Data Science Partnership has taken an open, evidence-based and user-centred approach to creating an ethical framework. It is a practical document that brings all the legal guidance together in one place, and is written in the context of new data science capabilities. As part of its development, we ran a public dialogue on data science ethics, including deliberative workshops, an experimental conjoint survey and an online engagement tool. The research supported the principles set out in the framework and provided useful insight into how we need to communicate about data science. It found that people had a low awareness of the term ‘data science’, but that showing data science examples can increase broad support for government exploring innovative uses of data. But people's support is highly context driven. People consider acceptability on a case-by-case basis, first thinking about the overall policy goals and likely intended outcome, and then weighing up privacy and unintended consequences. The ethical framework is a crucial start, but it does not solve all the challenges it highlights, particularly as technology is creating new challenges and opportunities every day. Continued research is needed into data minimization and anonymization, robust data models, algorithmic accountability, and transparency and data security. The work has also revealed the need to set out a renewed deal between citizen and state on data, to maintain and solidify trust in how we use people's data for social good.
This article is part of the themed issue ‘The ethical impact of data science’.


2020 ◽  
Author(s):  
Guillaume Drouen ◽  
Daniel Schertzer ◽  
Ioulia Tchiguirinskaia

As cities come under greater pressure from the global impact of climate change, in particular the risk of heavier rainfall and flooding, there is a growing need to establish a hierarchical form of resilience in which critical infrastructure can become sustainable. The main difficulty is that geophysics and urban dynamics are strongly nonlinear, with an associated extreme variability over a wide range of space-time scales. To better link fundamental and experimental research on these topics, an advanced urban hydro-meteorological observatory with the associated SaaS developments, the Fresnel platform (https://hmco.enpc.fr/portfolio-archive/fresnel-platform/), has been purposely set up to provide the concerned communities with the necessary observation data, thanks to an unprecedented deployment of higher-resolution sensors that readily yield Big Data.

To give an example, the installation of the polarimetric X-band radar at the ENPC campus (east of Paris) introduced a paradigm change in the prospects of environmental monitoring in Ile-de-France. The radar has been operated since May 2015 and has several characteristics that make it of central importance for the environmental monitoring of the region. In particular, it demonstrated the crucial importance of having high-resolution 3D+1 data, whereas earlier remote sensing developments had mostly focused on vertical measurements.

This presentation discusses the associated Fresnel SaaS (Software as a Service) platform as an example of present-day IT tools for dynamically enhancing urban resilience. It is rooted in an integrated suite of modular components based on an asynchronous, event-driven JavaScript runtime environment. It features a non-blocking interaction model and high scalability to ensure optimized availability. It includes a comprehensive database, accessible in real time, to support multi-criteria choices, and it has been built up through stakeholder consultation and participative co-creation. At the same time, these components are designed so that they are tunable for specific case studies with the help of an adjustable visual interface. Depending on the case study, these components can be integrated to satisfy particular needs with the help of maps, other visual tools, and forecasting systems, possibly from third parties.

All these developments have greatly benefited from the support of the Chair “Hydrology for a Resilient City” (https://hmco.enpc.fr/portfolio-archive/chair-hydrology-for-resilient-cities/), endowed by the world-leading industrial company in water management, and from previous EU framework programmes. To sustain the necessary public-private partnerships, Fresnel facilitates synergies between research and innovation, and fosters theoretical research, national and international collaborative networking, and the development of various aspects of data science for a resilient city.
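The non-blocking, event-driven model the abstract describes (its platform runs on a JavaScript runtime) can be illustrated in a language-neutral way; the sketch below is a minimal asyncio analogue, and the sensor-query functions are purely hypothetical:

```python
import asyncio

# Hedged sketch: each simulated "sensor query" yields control while waiting,
# so one thread serves many concurrent requests -- the essence of a
# non-blocking, event-driven interaction model.
async def query_sensor(sensor_id, delay):
    await asyncio.sleep(delay)  # stands in for non-blocking network I/O
    return f"sensor-{sensor_id}: ok"

async def main():
    # All three queries run concurrently; total wall time is roughly
    # that of the slowest query, not the sum of all three.
    return await asyncio.gather(
        query_sensor(1, 0.1),
        query_sensor(2, 0.1),
        query_sensor(3, 0.1),
    )

print(asyncio.run(main()))
```

The design trade-off is the usual one for high-fan-in telemetry platforms: a single event loop scales to many idle connections far more cheaply than a thread per sensor.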


2021 ◽  
Vol 280 ◽  
pp. 02004
Author(s):  
Valentyna Lavrenenko ◽  
Hanna Yanhol ◽  
Bohdan Tishkov

The development of the ideology of sustainable development has stimulated the emergence of companies’ performance management systems with an emphasis on the environmental aspects of their activities. Benchmarking, as a modern management tool, is often used for competitive analysis and for setting development goals. The scientific problem of this study is to assess the feasibility of applying benchmarking studies to evaluate the environmental aspects of the global industry. The purpose of the study is to identify the prerequisites for using benchmarking to improve environmental performance, and to identify best practices among world-leading companies. A logical information model for benchmarking is proposed in the study. On its basis, eight world leaders were selected, trends in the industry’s development were analysed, and reference values of environmental indicators were established. For environmental performance assessment, it is proposed to use indicators such as greenhouse gas emissions, energy consumption, material efficiency, and environmental management systems. A comparative benchmarking analysis of the world leaders and the 16 largest Ukrainian companies made it possible to determine the reserves for increasing environmental performance. The directions for increasing environmental performance are investment in resource-saving technologies, production of higher value-added products, investment in energy-saving and new technologies, improvement of management systems, and certification. These ideas are complemented by recommendations for improving environmental performance based on the philosophy of the Circular Economy Concept and Industry 4.0. The practical significance of the study is that Ukrainian companies can use these results to achieve higher environmental and economic outcomes.
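The core benchmarking computation (reference values from the leaders, then each company's gap to them) can be sketched briefly; the companies, indicator names, and numbers below are illustrative assumptions, not data from the study:

```python
# Hedged sketch: derive per-indicator reference values from a leader set,
# then measure a company's relative "reserve for improvement".
# Lower values are better for every indicator used here.
leaders = {
    "Leader A": {"ghg_t_per_t_product": 2.0, "energy_gj_per_t": 22.5},
    "Leader B": {"ghg_t_per_t_product": 2.1, "energy_gj_per_t": 20.0},
}

def reference_values(companies):
    """Benchmark value per indicator = best (minimum) among the leaders."""
    indicators = next(iter(companies.values())).keys()
    return {k: min(c[k] for c in companies.values()) for k in indicators}

def performance_gap(company, benchmark):
    """Relative reserve for improvement against each reference value."""
    return {k: (company[k] - benchmark[k]) / benchmark[k] for k in benchmark}

benchmark = reference_values(leaders)
some_company = {"ghg_t_per_t_product": 3.0, "energy_gj_per_t": 25.0}
print(performance_gap(some_company, benchmark))
# -> {'ghg_t_per_t_product': 0.5, 'energy_gj_per_t': 0.25}
```

A real study would add direction-of-improvement flags per indicator (some, like material efficiency, are better when higher) and normalise across units before comparison.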


2021 ◽  
Vol 63 (2) ◽  
pp. 89-106
Author(s):  
Vesna Buha ◽  
Miroslav Bjegović ◽  
Rada Lečić

The digital age has brought great challenges to organizations. New technologies, methodologies, and products require a competent project manager who is able to meet the requirements of the digital age. Thus, the aim of this paper is an analysis of the relevant theoretical context and the necessary competencies of the project manager, and a consideration of his position and actions in the organizational framework, especially in organizations dealing with security. The paper also analyses the DevSecOps model, which implements security within IT practices. The results of the presented research indicate that the total number of jobs performed by project managers shows global growth in both quantitative and qualitative terms. Research on human resource development in the field of project management distinguishes innovators from traditionalists. Innovators, in the age of globalization, have given priority to security training, followed by market topics, data science, innovative ways of behaving, and collaborative leadership. A case study is presented that considers the steps of building a new IT product with implemented activities that provide an adequate level of security.


2021 ◽  
Author(s):  
Alice Fremand

Open data is not a new concept. Over sixty years ago, in 1959, knowledge sharing was at the heart of the Antarctic Treaty, which included in Article III 1c the statement: “scientific observations and results from Antarctica shall be exchanged and made freely available”. Around the same time, the World Data Centre (WDC) system was created to manage and distribute the data collected from the International Geophysical Year (1957-1958) led by the International Council of Science (ICSU), building the foundations of today’s research data management practices.

What about now? The WDC system still exists through the World Data System (WDS). Open data has been endorsed by a majority of funders and stakeholders. Technology has dramatically evolved. And the profession of data manager/curator has emerged. Utilising their professional expertise means that their role is far wider than the long-term curation and publication of data sets.

Data managers are involved in all stages of the data life cycle: from data management planning and data accessioning to data publication and re-use. They implement open data policies, help write data management plans, and provide advice on how to manage data during, and beyond the life of, a science project. In liaison with software developers as well as scientists, they are developing new strategies to publish data, whether via data catalogues, via more sophisticated map-based viewer services, or in machine-readable form via APIs. Often, they bring the expertise of the field they are working in to better assist scientists in satisfying the Findable, Accessible, Interoperable, and Re-usable (FAIR) principles. Recent years have seen the development of a large community of experts that is essential for sharing, discussing, and setting new standards and procedures. The data are published to be re-used, and data managers are key to promoting high-quality datasets and participation in large data compilations.

To date, there is no magical formula for FAIR data. The Research Data Alliance is a great platform allowing data managers and researchers to work together to develop and adopt infrastructure that promotes data-sharing and data-driven research. However, the challenge of properly describing each data set remains. Today, scientists expect more and more from their data publications and data requests: they want interactive maps, more complex data systems, and the ability to query data, combine data from different sources, and publish them rapidly. By developing new procedures and standards, and by looking at new technologies, data managers help lay the foundations of data science.


New technologies in data science are allowing long-term investors to bring much more rigor to their operations. In this article the authors provide empirical examples in support of these data-driven advances, demonstrating their practical applications. They use the UC Investments office as their case study and discuss how the adoption of advanced data science techniques can move organizations past the current unsatisfactory state of the art and toward an unprecedented level of operational finesse. Specifically, the authors focus on a methodological innovation in the fair valuation of illiquid assets that is supported by an automated, rigorous process. They test this process in a real-world setting and find, at least in this case, that these advances can enhance roll-forward outputs in terms of timeliness, accuracy, and granularity. This finding has several potential impacts, not only for reporting, but also for investment, risk management, actuarial purposes, and even the personal compensation of teams.
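The general idea behind a valuation "roll forward" can be sketched as follows; this is a textbook-style illustration of the technique, not the authors' methodology or the UC Investments process, and the function and parameter names are assumptions:

```python
# Hedged sketch of a generic NAV roll forward for an illiquid fund stake:
# the last reported NAV is adjusted for interim cash flows and a
# public-market proxy return over the gap between report and valuation date.
def roll_forward_nav(last_nav, contributions, distributions, proxy_return):
    """Roll a stale reported NAV forward to the valuation date.

    last_nav      : most recent reported net asset value
    contributions : capital paid in since the report date
    distributions : capital returned since the report date
    proxy_return  : return of a public-market proxy over the same period
    """
    return (last_nav + contributions - distributions) * (1.0 + proxy_return)

# Example: $100m reported NAV, $5m called, $10m distributed, proxy up 4%
print(roll_forward_nav(100.0, 5.0, 10.0, 0.04))  # about 98.8
```

Automating this per holding, rather than quarterly in aggregate, is what makes the timeliness and granularity gains the article describes plausible.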

