How valuation approach choice affects financial analysts’ target price accuracy

Author(s):  
Gülcan Erkilet ◽  
Gerrit Janke ◽  
Rainer Kasperzak

Abstract This paper examines which valuation approaches financial analysts use to value a company and whether the chosen valuation approach affects target price accuracy. To address these questions, we conduct content analyses of 867 hand-collected analyst reports on German publicly listed companies published between January 2014 and June 2017. We find that sell-side analysts most frequently use the single-period market approach when formulating target prices, followed by the multi-period income approach and a mixture of both that combines their results, the so-called hybrid valuation approach. Additionally, we show that 612 of the analyzed analyst reports are based on a holistic valuation methodology rather than a sum of the parts valuation technique. Both univariate and multivariate analyses emphasize that the choice of valuation approach is significantly associated with the accuracy of price targets. Specifically, the income and market approaches lead to significantly more accurate target prices than the hybrid approach. We also find that target price accuracy is higher when analysts apply the holistic rather than the sum of the parts valuation approach to determine the fundamental value of the company. Additional results emphasize that target price accuracy improves when analysts use a sum of the parts valuation based solely on market or income approaches rather than hybrid approaches. Hence, we contribute to theory and practice by providing evidence on the link between the choice of valuation approach and analysts’ target price accuracy as well as on the performance of certain valuation techniques.
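To make the accuracy notion concrete, the sketch below shows one common way such metrics are operationalized: the absolute deviation of the target price from the realized share price at the end of the forecast horizon, plus a check of whether the target was met at any point within the window. This is an illustrative assumption rather than the paper's exact definition, and the function names and figures are hypothetical.

```python
# Illustrative target price accuracy metrics (hypothetical; the paper's exact
# operationalization may differ, e.g. met-anytime vs. met-at-horizon).

def target_price_error(target_price: float, realized_price: float) -> float:
    """Absolute forecast error scaled by the realized price at the horizon."""
    return abs(target_price - realized_price) / realized_price

def target_met_in_window(target_price: float, issue_price: float,
                         prices_in_window: list) -> bool:
    """True if the price path reaches the target at any point in the window."""
    if target_price >= issue_price:  # implied upside: look for a touch from below
        return any(p >= target_price for p in prices_in_window)
    return any(p <= target_price for p in prices_in_window)  # implied downside

# Example: a 12-month target of 55 EUR against a realized year-end price of 50 EUR
print(round(target_price_error(55.0, 50.0), 3))                     # 0.1
print(target_met_in_window(55.0, 48.0, [49.5, 52.0, 56.3, 51.0]))   # True
```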

2016 ◽  
Vol 32 (6) ◽  
pp. 8-11 ◽  
Author(s):  
Patrick Mazzariol ◽  
Mark Thomas

Purpose Academics and practitioners are in relative agreement on what drives a company’s fundamental value: primarily its current assets and future cash flows. The practice of paying a premium may thus be due to non-tangible factors associated with perceived value that are not currently captured by the company’s assets or the expected growth of its cash flows.
Design/methodology/approach This paper looks at the most common theoretical models used in calculating the value of a firm. It then explains how human factors can cause divergence from the originally set price.
Findings Empirical evidence shows that the price paid for a company can easily reach 40-50 per cent above this calculated value. Until valuation models can account for the factors that drive premium pricing, it is necessary to recognize that intangible and, in some cases, emotional aspects will have a great influence on the final price.
Practical implications The paper provides strategic insights and practical thinking that have influenced some of the world’s leading organizations.
Originality/value The briefing saves busy executives and researchers hours of reading time by selecting only the very best, most pertinent information and presenting it in a condensed and easy-to-digest format.


Author(s):  
Eric S Tvedte ◽  
Mark Gasser ◽  
Benjamin C Sparklin ◽  
Jane Michalski ◽  
Carl E Hjelmen ◽  
...  

Abstract The newest generation of DNA sequencing technology is highlighted by the ability to generate sequence reads hundreds of kilobases in length. Pacific Biosciences (PacBio) and Oxford Nanopore Technologies (ONT) have pioneered competitive long read platforms, with more recent work focused on improving sequencing throughput and per-base accuracy. We used whole-genome sequencing data produced by three PacBio protocols (Sequel II CLR, Sequel II HiFi, RS II) and two ONT protocols (Rapid Sequencing and Ligation Sequencing) to compare assemblies of the bacterium Escherichia coli and the fruit fly Drosophila ananassae. In both organisms tested, Sequel II assemblies had the highest consensus accuracy, even after accounting for differences in sequencing throughput. ONT and PacBio CLR produced longer reads than PacBio RS II and HiFi, and genome contiguity was highest when assembling these datasets. ONT Rapid Sequencing libraries had the fewest chimeric reads and superior quantification of E. coli plasmids compared with ligation-based libraries. The quality of assemblies can be enhanced by adopting hybrid approaches that use Illumina libraries for bacterial genome assembly or for polishing eukaryotic genome assemblies, and an ONT-Illumina hybrid approach would be more cost-effective for many users. Genome-wide DNA methylation could be detected using both technologies; however, ONT libraries enabled the identification of a broader range of known E. coli methyltransferase recognition motifs, in addition to undocumented D. ananassae motifs. The ideal choice of long read technology may depend on several factors, including the question or hypothesis under examination. No single technology outperformed the others in all metrics examined.
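For context on the contiguity comparison, genome contiguity is commonly summarized with the N50 statistic: the contig length at which contigs of that length or longer cover at least half of the assembly. The snippet below is a generic illustration and is not taken from the authors' pipeline.

```python
# Generic N50 calculation for assembly contiguity (illustrative, not the paper's code).

def n50(contig_lengths):
    """Length L such that contigs of length >= L cover at least half the assembly."""
    total = sum(contig_lengths)
    covered = 0
    for length in sorted(contig_lengths, reverse=True):
        covered += length
        if covered * 2 >= total:
            return length
    return 0

# Toy assembly of five contigs
print(n50([4_000_000, 1_200_000, 800_000, 300_000, 100_000]))  # 4000000
```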


Author(s):  
Kaixian Gao ◽  
Guohua Yang ◽  
Xiaobo Sun

With the rapid development of the logistics industry, customer demands have become increasingly high. The timeliness of distribution is one of the important factors that directly affect an enterprise's profit and customer satisfaction. If distribution routes are planned rationally, costs can be greatly reduced and customer satisfaction improved. Addressing the routing problem in company A's vehicle distribution operations, we establish mathematical models based on theory and practice. Given the characteristics of the model, a genetic algorithm is selected for route optimization. We then simulate the company's actual situation and use the genetic algorithm to compute route plans. The comparison shows that the genetic algorithm is well suited to complex optimization problems and highlights its practicability in this design. It resolves company A's unreasonable transportation arrangements, yielding higher efficiency and lower cost.
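Since the abstract does not detail the model or the genetic operators, the following is only a minimal sketch of how a genetic algorithm can be applied to a single-vehicle (TSP-style) routing instance: permutation encoding, order crossover, swap mutation, and elitist selection over a hypothetical distance matrix.

```python
# Minimal genetic-algorithm sketch for a single-vehicle routing (TSP-style) instance.
# Illustrative only; the distance data and operator choices are assumptions.
import random

DIST = [  # symmetric distances between depot (0) and customers 1..4 (hypothetical)
    [0, 4, 6, 8, 3],
    [4, 0, 5, 7, 6],
    [6, 5, 0, 2, 5],
    [8, 7, 2, 0, 4],
    [3, 6, 5, 4, 0],
]

def route_length(route):
    """Total length of a closed tour that starts and ends at the depot (node 0)."""
    tour = [0] + route + [0]
    return sum(DIST[a][b] for a, b in zip(tour, tour[1:]))

def order_crossover(p1, p2):
    """OX crossover: copy a slice from parent 1, fill the rest in parent 2's order."""
    i, j = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[i:j + 1] = p1[i:j + 1]
    fill = [g for g in p2 if g not in child]
    for k in range(len(child)):
        if child[k] is None:
            child[k] = fill.pop(0)
    return child

def mutate(route, rate=0.2):
    """Swap two customers with a small probability to keep diversity."""
    if random.random() < rate:
        a, b = random.sample(range(len(route)), 2)
        route[a], route[b] = route[b], route[a]
    return route

def solve(customers, pop_size=30, generations=200):
    pop = [random.sample(customers, len(customers)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=route_length)
        survivors = pop[:pop_size // 2]            # elitist selection
        children = [mutate(order_crossover(*random.sample(survivors, 2)))
                    for _ in range(pop_size - len(survivors))]
        pop = survivors + children
    return min(pop, key=route_length)

best = solve([1, 2, 3, 4])
print(best, route_length(best))
```

Order crossover is chosen here because it preserves the relative visiting order of customers, so offspring remain valid permutations without a repair step.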


2017 ◽  
Vol 21 (1) ◽  
pp. 12-17 ◽  
Author(s):  
David J. Pauleen

Purpose Dave Snowden has been an important voice in knowledge management over the years. As the founder and chief scientific officer of Cognitive Edge, a company focused on the development of the theory and practice of social complexity, he offers informative views on the relationship between big data/analytics and KM.
Design/methodology/approach A face-to-face interview was held with Dave Snowden in May 2015 in Auckland, New Zealand.
Findings According to Snowden, analytics in the form of algorithms are imperfect and can capture the reasoning and analytical capabilities of people only to a small extent. For this reason, while big data/analytics can be useful, they are limited and must be used in conjunction with human knowledge and reasoning.
Practical implications Snowden offers his views on big data/analytics and how they can be used effectively in real-world situations in combination with human reasoning and input, for example in fields ranging from resource management to individual health care.
Originality/value Snowden is an innovative thinker. He combines knowledge and experience from many fields and offers original views and understanding of big data/analytics, knowledge and management.


2021 ◽  
Vol 11 (5) ◽  
pp. 2338
Author(s):  
Rosanna Maria Viglialoro ◽  
Sara Condino ◽  
Giuseppe Turini ◽  
Marina Carbone ◽  
Vincenzo Ferrari ◽  
...  

Simulation-based medical training is considered an effective tool to acquire and refine technical skills, mitigating the ethical issues of Halsted’s model. This review aims to evaluate the literature on medical simulation techniques based on augmented reality (AR), mixed reality (MR), and hybrid approaches. The research identified 23 articles that meet the inclusion criteria: 43% combine two approaches (MR and hybrid), 22% combine all three, 26% employ only the hybrid approach, and 9% apply only the MR approach. Among the studies reviewed, 22% use commercial simulators, whereas 78% describe custom-made simulators. Each simulator is classified according to its target clinical application: training of surgical tasks (e.g., specific tasks for training in neurosurgery, abdominal surgery, orthopedic surgery, dental surgery, or otorhinolaryngological surgery, as well as generic tasks such as palpation) and education in medicine (e.g., anatomy learning). Additionally, the review assesses the complexity, reusability, and realism of the physical replicas, as well as the portability of the simulators. Finally, we describe whether and how the simulators have been validated. The review highlights that most of the studies do not have a significant sample size and include only a feasibility assessment and preliminary validation; thus, further research is needed to validate existing simulators and to verify whether improvements in performance in a simulated scenario translate into improved performance on real patients.


Author(s):  
Tomohiko Sakao ◽  
Erik Sundin

Remanufacturing has gained attention from industry, but the literature lacks the scientific comprehension needed to realize efficient remanufacturing. This hinders companies from commencing or improving remanufacturing efficiently. To fill this gap, the paper proposes a set of practical success factors for remanufacturing. To do so, it analyzes remanufacturing practices in industry through interviews with staff from remanufacturing companies with long experience. The practical success factors are found to be (1) addressing product and component value, (2) having a customer-oriented operation, (3) having an efficient core acquisition, (4) obtaining the correct information, and (5) having the right staff competence. Next, the paper analyzes remanufacturing processes theoretically with both cause-and-effect analysis and means-ends analysis. Since the factors show that, among other things, the product/service system (PSS) is highly relevant to remanufacturing in multiple ways, theories on the PSS are partly utilized. As a result, the distinctive nature of remanufacturing underlying these processes is found to be high variability, high uncertainty and, thus, also complexity. The insights obtained from practice and theory are found to support each other. In addition, a fishbone diagram for remanufacturing is proposed based on the analysis, comprising seven m's: two new m's (marketing and maintenance) are added on top of the traditional five m's (measurement, material, human, method, and machine) in order to improve customer value. The major contribution of the paper lies in its insights, which are grounded in both theory and practice.


Author(s):  
Bao-Fei Li ◽  
Parampreet Singh ◽  
Anzhong Wang

In this paper, we first provide a brief review of the effective dynamics of two recently well-studied models of modified loop quantum cosmologies (mLQCs), which arise from different regularizations of the Hamiltonian constraint and show the robustness of a generic resolution of the big bang singularity, which is replaced by a quantum bounce due to non-perturbative Planck scale effects. As in loop quantum cosmology (LQC), slow-roll inflation occurs generically in these modified models. We consider the cosmological perturbations following the dressed and hybrid approaches and clarify some subtle issues regarding the ambiguity of the extension of the effective potential of the scalar perturbations across the quantum bounce, and the choice of initial conditions. Both of the modified regularizations yield primordial power spectra that are consistent with current observations for the Starobinsky potential within the framework of either the dressed or the hybrid approach. However, differences in the primordial power spectra are identified among the mLQCs and LQC. In addition, for mLQC-I, striking differences arise between the dressed and hybrid approaches in the infrared and oscillatory regimes. While the differences between the two modified models can be attributed to differences in the Planck scale physics, the permissible choices of initial conditions and the differences between the two perturbation approaches are reported here for the first time. All these differences, arising from either the different regularizations or the different perturbation approaches, can in principle be observed in terms of non-Gaussianities.
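For orientation, in standard LQC the effective dynamics replace the classical Friedmann equation by a form in which the Hubble rate vanishes at a critical density, producing the bounce; this well-known expression is reproduced below for context only, since the mLQC-I and mLQC-II regularizations discussed in the paper modify it further.

```latex
% Effective Friedmann equation of standard LQC (context only; the mLQC
% regularizations lead to modified versions of this relation).
H^{2} = \frac{8\pi G}{3}\,\rho\left(1 - \frac{\rho}{\rho_{c}}\right),
\qquad H = 0 \ \text{at} \ \rho = \rho_{c} \sim \rho_{\mathrm{Pl}},
```

where \(H\) is the Hubble rate, \(\rho\) the matter energy density, and \(\rho_{c}\) the critical density at which the quantum bounce occurs.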


2020 ◽  
Vol 10 (2) ◽  
pp. 1
Author(s):  
Eddie Fisher ◽  
Yorkys Santana González

There appears to be a continuing and inconclusive debate amongst scholars as to whether theoretical knowledge or practical experience is more important in related areas such as education, recruitment and employability. This research, limited to a literature review and face-to-face interviews, conducted a systematic investigation to obtain and analyze valid and reliable research data to establish whether theoretical knowledge or practical experience is of paramount importance. The outcome of this research suggests that a hybrid approach should be adopted, with the major focus on practical experience supported by relevant theoretical knowledge and not the converse. A number of additional recommendations are presented on how to balance and close the gap between theory and practice, including a redesign of ordinary- and advanced-level educational teaching. Far greater emphasis needs to be placed on young people gaining early practical experience inside and outside the classroom. This can be achieved by developing practical workshops (pilot studies) for use in safe laboratory-type environments and by extending work placements within organizations during term times.


Author(s):  
Mina Farmanbar ◽  
Önsen Toygar

This paper proposes hybrid approaches based on both feature-level and score-level fusion strategies to provide a recognition system that is robust against distortions of the individual modalities. In order to compare the proposed schemes, a virtual multimodal database is formed from the FERET face and PolyU palmprint databases. The proposed hybrid systems concatenate features extracted by local and global feature extraction methods such as Local Binary Patterns, Log Gabor, Principal Component Analysis and Linear Discriminant Analysis. Match score level fusion is performed in order to show the effectiveness and accuracy of the proposed schemes. The experimental results on these databases show a significant improvement of the proposed schemes over unimodal systems and other multimodal face–palmprint fusion methods.
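To make the two fusion strategies concrete, the sketch below shows generic feature-level concatenation and weighted-sum score-level fusion with min-max normalization; the weights and the toy scores are assumptions for illustration, not the authors' exact configuration.

```python
# Illustrative feature-level and score-level fusion for a face-palmprint system.
# The weights, normalization, and toy scores are assumptions, not the paper's setup.
import numpy as np

def concat_features(face_feat, palm_feat):
    """Feature-level fusion: concatenate per-modality feature vectors."""
    return np.concatenate([face_feat, palm_feat])

def min_max_normalize(scores):
    """Map raw match scores to [0, 1] so different matchers become comparable."""
    lo, hi = scores.min(), scores.max()
    return (scores - lo) / (hi - lo) if hi > lo else np.zeros_like(scores)

def weighted_sum_fusion(face_scores, palm_scores, w_face=0.5):
    """Score-level fusion: weighted sum of normalized per-modality match scores."""
    return (w_face * min_max_normalize(face_scores)
            + (1 - w_face) * min_max_normalize(palm_scores))

# Toy example: match scores of one probe against five gallery identities
face = np.array([0.2, 0.9, 0.4, 0.1, 0.3])
palm = np.array([0.3, 0.7, 0.8, 0.2, 0.1])
fused = weighted_sum_fusion(face, palm, w_face=0.6)
print("best match:", int(np.argmax(fused)))  # identity with the highest fused score
```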


2014 ◽  
Vol 57 (4) ◽  
pp. 921-945 ◽  
Author(s):  
KATE DAVISON

Abstract This article considers the intersection between polite manners and company in eighteenth-century England. Through the laughter of gentlemen, it makes a case for a concept of occasional politeness, which is intended to emphasize that polite comportment was only necessary on certain occasions. In particular, it was the level of familiarity shared by a company that determined what was considered appropriate. There was unease with laughter in polite sociability, yet contemporaries understood that polite prudence could be waived when men met together in friendly homosocial encounters. In these circumstances, there existed a tacit acceptance of looser manners that might be called ‘intimate bawdiness’, which had its origins in a renaissance humanist train of thought that valorized wit as the centrepiece of male sociability. This argument tempers the importance of politeness by stressing the social contexts for which it was – and was not – a guiding principle. Ultimately, it suggests that the category of company might be one way of rethinking eighteenth-century sociability in a more pluralistic fashion, which allows for contradictory practices to co-exist. As such, it moves towards breaking down the binary oppositions of polite and impolite, elite and popular, and theory and practice that have been imposed on the period.

