Die Evolution der bundesdeutschen Forschungs- und Technologiepolitik: Rückblick und Bestandsaufnahme [The Evolution of German Federal Research and Technology Policy: Review and Assessment]

2002
Vol 3 (3)
pp. 279-301
Author(s):  
Andreas Fier ◽  
Dietmar Harhoff

We consider the development of German federal research and technology (R&T) policies since the 1960s and sketch the evolution of today's highly differentiated and complex set of policy instruments. Advances in economic theory and empirical results are reflected in this evolution, but they have not necessarily been its driving force. In some instances, innovative policy instruments were introduced to accommodate the state of the art in economic analysis; in other cases, such innovations preceded a thorough analysis of the respective policy instruments. A major point of concern is the lack of comprehensive evaluation and cost-benefit analyses in R&T policy. In this regard, German policy practice lags behind well-established procedures in other countries.

2021
Vol 3 (2)
pp. 84-97
Author(s):  
Peter Jeremiah Setiawan ◽  
Madeleine Celandine Guinevere ◽  
Fauzy Iskandar Alamsyah ◽  
Mohammad Irvan

Mastery of legal theory is one of the criteria of a good court. One of the legal theories currently being developed is the economic analysis of law. One decision in which the judge's considerations drew on the economic analysis of law is Decision No. 45/Pid.Sus/TPK/2011/PN.BDG. This article therefore analyzes that decision further. The study is legal research employing the statute, conceptual, and case approaches. It finds that the features of the economic analysis of law are: 1) a focus on the utilitarian philosophy of justice, whose fundamental concept is the felicific calculus; 2) a basis of consideration comprising a) economic theory as a foundation for legal analysis, b) cost-benefit analysis in creating law, and/or c) consideration of the opportunity cost of the law to be formed; and 3) wealth maximization as the intended output. With respect to Decision No. 45/Pid.Sus/TPK/2011/PN.BDG, it is arguable that the judges based their decision on the economic analysis of law, because its ratio decidendi fulfills all three characteristics of the theory.


Author(s):  
Jacques Thomassen ◽  
Carolien van Ham

This chapter presents the research questions and outline of the book, provides a brief review of the state of the art of legitimacy research in established democracies, and discusses the recurring theme of crisis in this literature since the 1960s. It includes a discussion of the conceptualization and measurement of legitimacy, seeking to relate legitimacy to political support and reflecting on how to evaluate empirical indicators: what symptoms indicate crisis? The chapter then explains the structure of the three main parts of the book. Part I systematically evaluates the empirical evidence for legitimacy decline in established democracies; Part II reappraises the validity of theories of legitimacy decline; and Part III investigates what (new) explanations can account for differences in legitimacy between established democracies. The chapter concludes with a short description of the chapters included in the volume.


Animals
2021
Vol 11 (5)
pp. 1297
Author(s):  
Juntae Kim ◽  
Hyo-Dong Han ◽  
Wang Yeol Lee ◽  
Collins Wakholi ◽  
Jayoung Lee ◽  
...  

Currently, the pork industry is incorporating in-line automation with the aim of increasing slaughtered pork carcass throughput while monitoring quality and safety. In Korea, 21 parameters (such as back-fat thickness and carcass weight) are used for quality grading of pork carcasses. Recently, the VCS2000 system, an automatic meat-yield grading system, was introduced to enhance grading efficiency and thereby increase pork carcass production. The VCS2000 system predicts pork carcass yield based on image analysis. This study conducted an economic analysis of the system using cost-benefit analysis. The metrics considered were net present value (NPV), internal rate of return (IRR), and the benefit/cost (B/C) ratio, and each was verified through sensitivity analysis. For our analysis, the benefits were grouped into three categories: reduced labor costs, improved meat yield production, and reduced pig feed consumption through optimization. The cost-benefit analysis of the system resulted in an NPV of approximately 615.6 million Korean won, an IRR of 13.52%, and a B/C ratio of 1.65.
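
For context, the three metrics can be reproduced with a few lines of code. The Python sketch below computes NPV, IRR (by bisection), and the B/C ratio; the cash-flow figures and the 5% discount rate are hypothetical placeholders for illustration, not the study's data.

# Illustrative sketch of the three cost-benefit metrics used in the study:
# net present value (NPV), internal rate of return (IRR), and the
# benefit/cost (B/C) ratio. The cash flows below are hypothetical
# placeholders, not the VCS2000 study's actual figures.

def npv(rate, cashflows):
    """Discount a series of yearly cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=1.0, tol=1e-6):
    """Find the discount rate at which NPV = 0 by bisection."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid  # NPV still positive: rate is too low
        else:
            hi = mid
    return (lo + hi) / 2

def bc_ratio(rate, benefits, costs):
    """Ratio of discounted benefits to discounted costs."""
    return npv(rate, benefits) / npv(rate, costs)

# Hypothetical example: an up-front investment followed by yearly benefits.
costs    = [100.0, 10.0, 10.0, 10.0, 10.0, 10.0]   # million KRW per year (assumed)
benefits = [0.0,   40.0, 40.0, 40.0, 40.0, 40.0]
net      = [b - c for b, c in zip(benefits, costs)]

rate = 0.05  # assumed discount rate
print(f"NPV: {npv(rate, net):.1f} million KRW")
print(f"IRR: {irr(net):.2%}")
print(f"B/C ratio: {bc_ratio(rate, benefits, costs):.2f}")

A sensitivity analysis of the kind the study reports would simply re-run these functions over a range of discount rates and cash-flow assumptions.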


Molecules
2021
Vol 26 (8)
pp. 2168
Author(s):  
Samir M. Ahmad ◽  
Oriana C. Gonçalves ◽  
Mariana N. Oliveira ◽  
Nuno R. Neng ◽  
José M. F. Nogueira

The analysis of controlled drugs in forensic matrices, i.e., urine, blood, plasma, saliva, and hair, is one of the current hot topics in the clinical and toxicological context. The use of microextraction-based approaches has gained considerable attention, mainly due to their simplicity, favorable cost-benefit ratio, and environmental sustainability. For this reason, the application of these innovative techniques has become more relevant than ever in programs for monitoring priority substances such as the main illicit drugs, e.g., opioids, stimulants, cannabinoids, hallucinogens, dissociative drugs, and related compounds. The present contribution aims to provide a comprehensive review of the state of the art, advantages, and future trends in the application of microextraction-based techniques for screening controlled drugs in the forensic context.


2021
Vol 32
pp. 100289
Author(s):  
Thomas van der Pol ◽  
Jochen Hinkel ◽  
Jan Merkens ◽  
Leigh MacPherson ◽  
Athanasios T. Vafeidis ◽  
...  

2021
Vol 14 (5)
pp. 785-798
Author(s):  
Daokun Hu ◽  
Zhiwen Chen ◽  
Jianbing Wu ◽  
Jianhua Sun ◽  
Hao Chen

Persistent memory (PM) is increasingly being leveraged to build hash-based indexing structures featuring cheap persistence, high performance, and instant recovery, especially with the recent release of Intel Optane DC Persistent Memory Modules. However, most of these structures have been evaluated on DRAM-based emulators under unrealistic assumptions, or with a focus on specific metrics while important properties were sidestepped. It is therefore essential to understand how well the proposed hash indexes perform on real PM and how they differ from each other when a wider range of performance metrics is considered. To this end, this paper provides a comprehensive evaluation of persistent hash tables. In particular, we evaluate six state-of-the-art hash tables, Level hashing, CCEH, Dash, PCLHT, Clevel, and SOFT, on real PM hardware. Our evaluation uses a unified benchmarking framework and representative workloads. Besides characterizing common performance properties, we also explore how hardware configurations (such as PM bandwidth, CPU instructions, and NUMA) affect the performance of PM-based hash tables. With our in-depth analysis, we identify design trade-offs and good paradigms in prior art, and suggest desirable optimizations and directions for the future development of PM-based hash tables.
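
As a rough illustration of what a unified benchmarking framework measures, the Python sketch below times a mixed insert/lookup workload against any dict-like index and reports throughput. This is a conceptual sketch only: the paper's framework benchmarks C/C++ hash tables on real Optane hardware, and PM-specific concerns (cache-line flushes, NUMA placement, bandwidth limits) are not modeled here.

# Conceptual sketch of a unified micro-benchmark: run the same mixed
# read/write workload against any dict-like index and report throughput.
# PM-specific effects (persistence instructions, NUMA, bandwidth) are
# deliberately out of scope for this illustration.

import random
import time

def run_workload(index, n_ops=1_000_000, read_ratio=0.5, seed=42):
    rng = random.Random(seed)  # fixed seed so runs are comparable
    keys = [rng.randrange(n_ops) for _ in range(n_ops)]
    start = time.perf_counter()
    for k in keys:
        if rng.random() < read_ratio:
            index.get(k)       # lookup (may miss)
        else:
            index[k] = k       # insert or update
    elapsed = time.perf_counter() - start
    return n_ops / elapsed     # throughput in ops/sec

# Example: benchmark Python's built-in dict as a stand-in index.
print(f"{run_workload(dict()):.0f} ops/sec")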


1978
Vol 3 (3)
pp. 148-159
Author(s):  
Howard S. Adelman

Presented are (1) a brief synthesis of several key conceptual and methodological concerns and some ethical perspectives related to the identification of psycho-educational problems and (2) conclusions regarding the current state of the art. The conceptual discussion focuses on differentiating prediction from identification and screening from diagnosis; three models used in developing assessment procedures are also presented. Methodologically, the minimal requirements for satisfactory research are described and current problems are highlighted. Three ethical perspectives are discussed: cost-benefit for the individual, the models, motives, and goals underlying practices, and cost-benefit for the culture. The current state of the art is seen as not supporting the efficacy of widespread use of currently available procedures for mass screening. Given this point and the methodological and ethical concerns discussed, it is suggested that policy makers reallocate limited resources away from mass identification and toward health maintenance and other approaches to prevention and early-age intervention.


2020
Vol 14 (4)
pp. 653-667
Author(s):  
Laxman Dhulipala ◽  
Changwan Hong ◽  
Julian Shun

Connected components is a fundamental kernel in graph applications. The fastest existing multicore algorithms for graph connectivity are based on some form of edge sampling and/or linking and compressing trees. However, many combinations of these design choices have been left unexplored. In this paper, we design the ConnectIt framework, which provides different sampling strategies as well as various tree linking and compression schemes. ConnectIt enables us to obtain several hundred new variants of connectivity algorithms, most of which extend to computing spanning forest. In addition to static graphs, we also extend ConnectIt to support mixes of insertions and connectivity queries in the concurrent setting. We present an experimental evaluation of ConnectIt on a 72-core machine, which we believe is the most comprehensive evaluation of parallel connectivity algorithms to date. Compared to a collection of state-of-the-art static multicore algorithms, we obtain an average speedup of 12.4x (2.36x average speedup over the fastest existing implementation for each graph). Using ConnectIt, we are able to compute connectivity on the largest publicly available graph (with over 3.5 billion vertices and 128 billion edges) in under 10 seconds on a 72-core machine, providing a 3.1x speedup over the fastest existing connectivity result for this graph in any computational setting. For our incremental algorithms, we show that they can ingest graph updates at rates of up to several billion edges per second. To guide the user in selecting the best variants in ConnectIt for different situations, we provide a detailed analysis of the different strategies. Finally, we show how the techniques in ConnectIt can be used to speed up two important graph applications: approximate minimum spanning forest and SCAN clustering.
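
The "linking and compressing trees" the abstract refers to is the classic union-find structure. A minimal sequential Python sketch of that baseline is shown below; ConnectIt's contribution lies in the many parallel sampling, linking, and compression variants layered on top of this idea, which the sketch does not attempt to reproduce.

# Minimal sequential sketch of the union-find (linking and compressing
# trees) baseline underlying connectivity algorithms. ConnectIt's parallel
# schemes are far more sophisticated; this shows only the core idea.

def connected_components(num_vertices, edges):
    parent = list(range(num_vertices))

    def find(u):
        # Walk to the root, then compress: point every visited node at it.
        root = u
        while parent[root] != root:
            root = parent[root]
        while parent[u] != root:
            parent[u], u = root, parent[u]
        return root

    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv  # link one tree under the other

    # Label every vertex by its component root.
    return [find(u) for u in range(num_vertices)]

labels = connected_components(6, [(0, 1), (1, 2), (4, 5)])
print(labels)  # vertices 0-2 share a label; 3 is alone; 4-5 share a label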

