The ingenious marketing of modern paintings

2014 ◽  
Vol 6 (2) ◽  
pp. 211-233
Author(s):  
Thomas M. Bayer ◽  
John Page

Purpose – This paper aims to analyze the evolution of the marketing of paintings and related visual products from its nascent stages in England around 1700 to the development of the modern art market by 1900, with a brief discussion connecting to the present. Design/methodology/approach – Sources consist of a mixture of primary and secondary materials, as well as a series of econometric and statistical analyses of specifically constructed and unique data sets that list more than 50,000 individual sales of paintings during this period. One set records sales of paintings at various English auction houses during the eighteenth and nineteenth centuries; the second set consists of all purchases and sales of paintings recorded in the stock books of the late nineteenth-century London art dealer, Arthur Tooth, during the years 1870/1871. The authors interpret the data under a commoditization model first introduced by Igor Kopytoff in 1986, which posits that markets and their participants evolve toward maximizing the efficiency of their exchange process within the prevailing exchange technology. Findings – The authors find that artists were largely responsible for a series of innovations in the art market that replaced the prevailing direct relationship between artist and patron with a modern market in which painters produced works on speculation to be sold by enterprising middlemen to an anonymous public. In this process, artists displayed a remarkable creativity and a seemingly instinctive understanding of the principles of competitive marketing that should dispel the erroneous but persistent notion that artistic genius and business savvy are incompatible. Research limitations/implications – A similar marketing analysis could be done of the development of the art markets of other leading countries, such as France, Italy and Holland, as well as of current developments in the art market. Practical implications – The same process of art market development that occurred in England is now occurring in Latin America and China. Also, the commoditization process continues in the present, now using the Internet and worldwide art dealers. Originality/value – This is the first article to trace the historical development of the marketing of art in all of its components: artists, dealers, artist organizations, museums, curators, art critics, the media and art historians.

2018 ◽  
Vol 39 (2) ◽  
pp. 56-59 ◽  
Author(s):  
Peter Buell Hirsch

Purpose "Alternative data" is a term describing the data exhaust that organizations, especially asset managers, are using to develop insights about companies to give them a trading edge. As the use of this data becomes more prevalent, it is critical that business leaders understand how this kind of data can be used against their organizations. This viewpoint articulates some of the steps they will need to take to develop that understanding. Design/methodology/approach The methodology used in this viewpoint is a review of recent literature covering alternative data and its uses. Findings This paper describes the different ways in which alternative data is being used and cites surprising examples of how this can make companies vulnerable or threaten their reputation. Research limitations/implications As an overview of selected examples from secondary sources, this paper is not a comprehensive treatment of the subject. Practical implications By studying the issues raised in the paper, business leaders can arm themselves with insights into the use of alternative data and mitigate reputational fallout from its use against their companies. Social implications A better understanding of how alternative data is being used can help protect both individuals and social organizations from being treated inequitably and increase transparency in the use of large and hidden data sets. Originality/value To the best of the author’s knowledge, this is the first treatment of the use of alternative data from the perspective of corporate reputation.


2014 ◽  
Vol 27 (8) ◽  
pp. 1233-1240 ◽  
Author(s):  
Ingrid Jeacle ◽  
Chris Carter

Purpose – The purpose of this paper is to consider the role of interdisciplinary accounting research and suggest ways of broadening its creative scope to embrace significant contemporary phenomena. Design/methodology/approach – The paper is conceptual in nature and therefore draws only on secondary sources. Findings – The paper suggests that one of the defining features of interdisciplinary accounting research is that it should be a creative space in which novel ideas emerge and new agendas flourish. The authors identify three such creative spaces of scholarly inquiry: the media space, the virtual space and the popular culture space. Originality/value – The paper identifies three new creative spaces in which interdisciplinary accounting research may continue to flourish. It also identifies a possible threat to creativity within future interdisciplinary accounting research.


Author(s):  
Jeeyun Oh ◽  
Mun-Young Chung ◽  
Sangyong Han

Despite the popularity of interactive movie trailers, rigorous research on one of the most apparent features of these interfaces – the level of user control – has been scarce. This study explored the effects of user control on users’ immersion in and enjoyment of movie trailers, moderated by the content type. We conducted a 2 (high user control versus low user control) × 2 (drama film trailer versus documentary film trailer) mixed-design factorial experiment. The results showed that the level of user control over movie trailer interfaces decreased users’ immersion when the trailer had an element of traditional story structure, such as a drama film trailer. Only with the drama movie trailer did participants in the high user control condition report being less fascinated with, absorbed in, focused on, mentally involved with, and emotionally affected by the trailer than participants in the low user control condition. The negative effect of user control on immersion for the drama trailer carried over to users’ enjoyment. The impact of user control over interfaces on immersion and enjoyment varies depending on the nature of the media content, which suggests a possible trade-off between the level of user control and entertainment outcomes.
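The analysis implied by this design can be sketched briefly. Below is a minimal illustration of a 2 × 2 mixed-design analysis in Python with the pingouin library, assuming (as the pattern of results suggests) that user control varied between subjects and trailer type within subjects; the data file and column names are hypothetical, not from the study.

```python
# Sketch of a 2x2 mixed-design ANOVA for the experiment described above.
# Assumptions: 'user_control' (high/low) is between-subjects, 'trailer_type'
# (drama/documentary) is within-subjects, 'immersion' is a composite rating.
import pandas as pd
import pingouin as pg

df = pd.read_csv("trailer_ratings.csv")  # hypothetical long-format data

aov = pg.mixed_anova(data=df, dv="immersion",
                     between="user_control", within="trailer_type",
                     subject="participant")
print(aov.round(3))

# A significant interaction would motivate simple-effects tests, e.g.
# comparing user-control conditions separately for each trailer type.
posthoc = pg.pairwise_tests(data=df, dv="immersion",
                            between="user_control", within="trailer_type",
                            subject="participant")
print(posthoc.round(3))
```

A significant interaction term, followed by a simple effect of user control within the drama trailer only, would mirror the pattern of results reported above.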


2004 ◽  
Vol 101 (Supplement 3) ◽  
pp. 326-333 ◽  
Author(s):  
Klaus D. Hamm ◽  
Gunnar Surber ◽  
Michael Schmücking ◽  
Reinhard E. Wurm ◽  
Rene Aschenbach ◽  
...  

Object. Innovative software solutions may enable image fusion to produce the desired data superposition for precise target definition and follow-up studies in radiosurgery/stereotactic radiotherapy in patients with intracranial lesions. The aim is to integrate the anatomical and functional information completely into the radiation treatment planning and to achieve an exact comparison for follow-up examinations. Special conditions and advantages of BrainLAB's fully automatic image fusion system are evaluated and described for this purpose. Methods. In 458 patients, the radiation treatment planning and some follow-up studies were performed using an automatic image fusion technique involving the use of different imaging modalities. Each fusion was visually checked and corrected as necessary. The computerized tomography (CT) scans for radiation treatment planning (slice thickness 1.25 mm), as well as stereotactic angiography for arteriovenous malformations, were acquired using head fixation with stereotactic arc or, in the case of stereotactic radiotherapy, with a relocatable stereotactic mask. Different magnetic resonance (MR) imaging sequences (T1, T2, and fluid-attenuated inversion-recovery images) and positron emission tomography (PET) scans were obtained without head fixation. Fusion results and the effects on radiation treatment planning and follow-up studies were analyzed. The precision level of the results of the automatic fusion depended primarily on the image quality, especially the slice thickness and the field homogeneity when using MR images, as well as on patient movement during data acquisition. Fully automated image fusion of different MR, CT, and PET studies was performed for each patient. Only in a few cases was it necessary to correct the fusion manually after visual evaluation. These corrections were minor and did not materially affect treatment planning. High-quality fusion of thin slices of a region of interest with a complete head data set could be performed easily. The target volume for radiation treatment planning could be accurately delineated using multimodal information provided by CT, MR, angiography, and PET studies. The fusion of follow-up image data sets yielded results that could be successfully compared and quantitatively evaluated. Conclusions. Depending on the quality of the originally acquired image, automated image fusion can be a very valuable tool, allowing for fast (∼1–2 minutes) and precise fusion of all relevant data sets. Fused multimodality imaging improves the target volume definition for radiation treatment planning. High-quality follow-up image data sets should be acquired for image fusion to provide exactly comparable slices and volumetric results that will contribute to quality control.
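For readers who want a concrete sense of what such automatic fusion involves, the following is a minimal sketch of rigid, mutual-information-based registration of an MR study to a planning CT using the open-source SimpleITK library. It is not BrainLAB's system; the file names and parameter values are illustrative assumptions.

```python
# Sketch: rigid multimodal registration (CT <- MR) via Mattes mutual
# information, a standard metric for aligning images of different
# modalities. File names and parameters are illustrative.
import SimpleITK as sitk

fixed = sitk.ReadImage("planning_ct.nii.gz", sitk.sitkFloat32)  # planning CT
moving = sitk.ReadImage("mr_t1.nii.gz", sitk.sitkFloat32)       # T1 MR study

reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
reg.SetMetricSamplingStrategy(reg.RANDOM)
reg.SetMetricSamplingPercentage(0.01)
reg.SetInterpolator(sitk.sitkLinear)
reg.SetOptimizerAsGradientDescent(learningRate=1.0, numberOfIterations=200)
reg.SetOptimizerScalesFromPhysicalShift()

# A rigid transform (6 degrees of freedom) suffices for intracranial anatomy.
initial = sitk.CenteredTransformInitializer(
    fixed, moving, sitk.Euler3DTransform(),
    sitk.CenteredTransformInitializerFilter.GEOMETRY)
reg.SetInitialTransform(initial, inPlace=False)

transform = reg.Execute(fixed, moving)

# Resample the MR study onto the CT grid so slices are directly comparable,
# e.g. for target delineation or follow-up comparison.
fused_mr = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0)
sitk.WriteImage(fused_mr, "mr_t1_in_ct_space.nii.gz")
```

As in the study, a visual check of the fused result (for example with overlay or checkerboard views) remains essential before the images are used for treatment planning.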


2019 ◽  
Vol 45 (9) ◽  
pp. 1183-1198
Author(s):  
Gaurav S. Chauhan ◽  
Pradip Banerjee

Purpose Recent papers on target capital structure show that debt ratios seem to vary widely in space and time, implying that the functional specifications of target debt ratios are of little empirical use. Further, target behavior cannot be adjudged correctly using debt ratios, as they could revert for mechanical reasons. The purpose of this paper is to develop an alternative strategy to test the target capital structure. Design/methodology/approach The authors treat a major “shock” to the debt ratios as an event and interpret a subsequent reversion as a movement toward a mean or target debt ratio. By doing this, the authors no longer need to specify target debt ratios as a function of firm-specific variables or any other rigid functional form. Findings Similar to the broad empirical evidence in developed economies, there is no perceptible and systematic mean reversion by Indian firms. However, unlike in developed countries, proportionate usage of debt to finance firms’ marginal financing deficits is extensive; equity is used rather sparingly. Research limitations/implications The trade-off theory could be convincingly refuted at least for the emerging market of India. The paper should stimulate further research into the reasons for the specific financing behavior of emerging market firms. Practical implications The results show that firms’ financing choices depend not only on their own firm-specific variables but also on the financial markets in which they operate. Originality/value This study attempts to assess mean reversion in debt ratios in a unique but reassuring manner. The results are confirmed by extensive calibration of the testing strategy using simulated data sets.
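The shock-and-reversion idea can be illustrated with simulated data of the kind the authors use to calibrate their testing strategy. The sketch below is not the paper's procedure; the thresholds, horizon and data-generating process are illustrative assumptions.

```python
# Stylized sketch of shock-based mean-reversion testing: find large jumps
# in simulated debt ratios and measure how much of each jump is undone
# over a fixed horizon. A pure random walk (no target) should show
# roughly zero median reversion; target behavior would show reversion > 0.
import numpy as np

rng = np.random.default_rng(0)
n_firms, n_years = 500, 20
shock_threshold = 0.15   # jump size that counts as a "shock" (illustrative)
horizon = 5              # years over which reversion is measured

# Debt ratios simulated as bounded random walks with no built-in target.
ratios = np.clip(
    0.3 + np.cumsum(rng.normal(0.0, 0.05, size=(n_firms, n_years)), axis=1),
    0.0, 1.0)

reversions = []
for firm in ratios:
    jumps = np.where(np.abs(np.diff(firm)) > shock_threshold)[0]
    for t in jumps:                      # shock occurs between years t and t+1
        if t + 1 + horizon < n_years:
            pre, post = firm[t], firm[t + 1]
            later = firm[t + 1 + horizon]
            # Fraction of the shock undone after `horizon` years:
            # 1 = full reversion to the pre-shock ratio, 0 = none.
            reversions.append((post - later) / (post - pre))

print(f"{len(reversions)} shocks; median reversion after {horizon} years: "
      f"{np.median(reversions):.2f}")
```

The appeal of this design is visible in the code: no functional form for the target ratio is ever specified, and reversion is measured directly against each firm's own pre-shock level.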


2020 ◽  
Vol 69 (8/9) ◽  
pp. 717-736
Author(s):  
Małgorzata Kowalska-Chrzanowska ◽  
Przemysław Krysiński

Purpose This paper aims to answer the question of how Polish representatives of social communication and media sciences communicate their most recent scientific findings in the media space: what types of publications they share, what activities these exemplify (sharing information about their own publications, leading discussions, formulating opinions), what form the scientific communication they create takes (publication of reference descriptions, full papers, preprints and postprints) and what the audience reception is (number of downloads, views, comments). Design/methodology/approach The authors present the results of an analysis of the presence of the most recent (2017–2019) publications by Polish representatives of the broadly understood social communication and media sciences in three selected social networking services for scientists: ResearchGate, Google Scholar and Academia.edu. The analyses covered 100 representatives of the scientific community (chosen by interval sampling) assigned, according to the OECD “Field of Science” classification, to the “media and communication” discipline in the “Ludzie nauki” (Men of Science) database. Findings The analyses show that Polish representatives of social communication and media sciences make little use of the potential of the three analysed services. Although 60% of them have a profile in at least one of the services, the rest are not present there at all. Of the 113 identified scientists’ profiles, as few as 65 feature publications from 2017 to 2019. The small number of alternative metrics recorded on these profiles implies, in turn, that if such metrics were to play an important role in evaluating the value and influence of scientific publications, the evaluation of the researched Polish representatives of social communication and media sciences would be unfavourable. Originality/value The limited presence of Polish representatives of the communication and media sciences in the three analysed services suggests that, for the time being, these services may only support the management of one’s own scientific output. This rather pessimistic picture of scientists’ activity in the analysed services may reflect a simple lack of need to be present in electronic channels of scientific communication, or a lack of trust in the analysed services, which in turn should be linked to their shortcomings and flaws. Unequivocal confirmation of these hypotheses, however, would require explorations covering a larger group of scientists, complemented by survey studies. Thus, this research may constitute merely a starting point for further explorations, including the elaboration of good practices for scientists’ use of social media.


2016 ◽  
Vol 12 (2) ◽  
pp. 126-149 ◽  
Author(s):  
Masoud Mansoury ◽  
Mehdi Shajari

Purpose This paper aims to improve recommendation performance for cold-start users and controversial items. Collaborative filtering (CF) generates recommendations on the basis of similarity between users: it uses the opinions of similar users to generate a recommendation for an active user. Because the similarity model, or neighbor selection function, is the key element for the effectiveness of CF, many variations of CF have been proposed. However, these methods are not very effective, especially for users who provide few ratings (i.e. cold-start users). Design/methodology/approach A new user similarity model is proposed that focuses on improving recommendation performance for cold-start users and controversial items. To show the validity of their similarity model, the authors conducted experiments demonstrating its effectiveness in calculating similarity values between users even when only a few ratings are available. In addition, the authors applied their user similarity model to a recommender system and analyzed its results. Findings Experiments on two real-world data sets are implemented and compared with some other CF techniques. The results show that the authors’ approach outperforms previous CF techniques on the coverage metric while preserving accuracy for cold-start users and controversial items. Originality/value The proposed approach addresses the conditions in which CF is unable to generate accurate recommendations. These conditions affect CF performance adversely, especially for cold-start users. The authors show that their similarity model overcomes CF’s weaknesses effectively and improves its performance even for cold-start users.
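To make the baseline concrete, here is a minimal sketch of the kind of user similarity computation that CF relies on: a Pearson correlation over co-rated items with significance weighting, a standard remedy for the unreliable similarities that arise when users have rated only a few items. This is a generic illustration, not the authors' proposed model.

```python
# Generic user-based CF similarity: Pearson correlation over co-rated
# items, damped when the overlap is small (cold-start users produce
# few co-rated items, making raw correlations unreliable).
import numpy as np

def user_similarity(ratings: np.ndarray, u: int, v: int,
                    min_overlap: int = 50) -> float:
    mask = (ratings[u] > 0) & (ratings[v] > 0)   # 0 means "unrated"
    n = int(mask.sum())
    if n < 2:
        return 0.0
    ru, rv = ratings[u, mask], ratings[v, mask]
    ru_c, rv_c = ru - ru.mean(), rv - rv.mean()
    denom = np.sqrt((ru_c ** 2).sum() * (rv_c ** 2).sum())
    if denom == 0.0:
        return 0.0
    pearson = float((ru_c * rv_c).sum() / denom)
    # Significance weighting: shrink similarities built on few co-ratings.
    return pearson * min(n, min_overlap) / min_overlap

# Toy example: 4 users x 6 items, ratings 1-5, 0 = unrated.
R = np.array([[5, 3, 0, 1, 0, 4],
              [4, 0, 0, 1, 1, 5],
              [1, 1, 0, 5, 0, 0],
              [0, 3, 4, 1, 0, 4]])
print(user_similarity(R, 0, 1, min_overlap=3))
```

Approaches like the one proposed in the paper replace or augment such a similarity function so that meaningful neighbors can still be found when overlaps are small or when items attract polarized ("controversial") ratings.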


2015 ◽  
Vol 16 (1) ◽  
pp. 62-85 ◽  
Author(s):  
Cheri Jeanette Duncan ◽  
Genya Morgan O'Gara

Purpose – The purpose of this paper is to examine the development of a flexible collections assessment rubric comprising a suite of tools for more consistently and effectively evaluating and expressing a holistic value of library collections to a variety of constituents, from administrators to faculty and students, with particular emphasis on the use of data already being collected at libraries to “take the temperature” of how responsive collections are in supporting institutional goals. Design/methodology/approach – Using a literature review, internal and external conversations, several collections pilot projects, and a variety of other investigative mechanisms, this paper explores methods for creating a more flexible, holistic collection development and assessment model using both qualitative and quantitative data. Findings – The products of scholarship that academic libraries include in their collections are expanding exponentially and range from journals and monographs in all formats, to databases, data sets, digital text and images, streaming media, visualizations and animations. Content is also being shared in new ways and on a variety of platforms. Yet the framework for evaluating this new landscape of scholarly output is in its infancy. So, how do libraries develop and assess collections in a consistent, holistic, yet agile, manner? Libraries must employ a variety of mechanisms to achieve this goal, while remaining flexible in adapting to the shifting collections environment. Originality/value – As far as the authors are aware, this is the first paper to examine an agile, holistic approach to collections using both qualitative and quantitative data.


2016 ◽  
Vol 36 (3/4) ◽  
pp. 242-257 ◽  
Author(s):  
Erika Cudworth

Purpose – The purpose of this paper is to map the field of sociological animal studies through some examples of critical and mainstream approaches and to consider their relation to advocacy. It makes the argument that while all these initiatives have made important contributions to the project of “animalising sociology” and suggest a need for change in species relations, the link between analysis and political strategy is uncertain. Design/methodology/approach – The paper develops its argument by using secondary sources, reviewing sociological positions and offering illustrations of possible interventions. Findings – Sociological interventions in the field of animal studies have been informed by critical perspectives, such as feminism and Marxism, or have taken less critical routes deploying actor-network theory and symbolic interactionism. Whilst those working in critical traditions may appear to have a more certain political agenda, an analysis of “how things are” does not always lead to a clear position on “what is to be done” in terms of social movement agendas or policy intervention. In addition, concepts deployed in advocacy such as “liberation”, “quality of life” or “care” are problematic when applied beyond the human. Despite this, there are possibilities for coalition and solidarity around certain claims for change. Research limitations/implications – If the central argument of the paper were taken seriously by general sociologists, then sociology might be more open to “animal studies”. An implication for existing sociological animal studies scholarship is to trouble some of the certainties around advocacy. Practical implications – If the central argument of the paper were taken seriously by advocacy groups, then the hiatus between “welfarism” and “liberation” might be overcome. Originality/value – There have been recent attempts to map the field of scholarship in animal studies, but surprisingly little consideration of how different emergent positions inform questions of advocacy and the possibilities for political intervention.


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Tressy Thomas ◽  
Enayat Rajabi

Purpose The primary aim of this study is to review the studies from different dimensions, including the type of methods, experimentation setup and evaluation metrics used in the novel approaches proposed for data imputation, particularly in the machine learning (ML) area. This ultimately provides an understanding of how well the proposed frameworks are evaluated and what types and ratios of missingness are addressed in the proposals. The review questions in this study are: (1) What ML-based imputation methods were studied and proposed during 2010–2020? (2) How are the experimentation setup, characteristics of data sets and missingness employed in these studies? (3) What metrics are used for the evaluation of imputation methods? Design/methodology/approach The review went through the standard identification, screening and selection process. The initial search of electronic databases for missing value imputation (MVI) based on ML algorithms returned a large number of papers, totaling 2,883, most of which did not present an MVI technique relevant to this study. The papers were first screened by title for relevance, and 306 were identified as appropriate. Upon review of the abstracts, 151 papers not eligible for this study were dropped, leaving 155 research papers suitable for full-text review. Of these, 117 papers were used in the assessment of the review questions. Findings This study shows that clustering- and instance-based algorithms are the most frequently proposed MVI methods. Percentage of correct prediction (PCP) and root mean square error (RMSE) are the most used evaluation metrics in these studies. For experimentation, the majority of the studies sourced their data sets from publicly available repositories. A common approach is to treat the complete data set as a baseline and to evaluate the effectiveness of imputation on test data sets with artificially induced missingness. The data set size and missingness ratio varied across the experimentations, while the missing data type and mechanism pertain to the capability of the imputation method. Computational expense is a concern, and experimentation using large data sets appears to be a challenge. Originality/value It is understood from the review that there is no single universal solution to the missing data problem. Variants of ML approaches handle different kinds of missingness well, depending on the characteristics of the data set. Most of the methods reviewed lack generalization with regard to applicability. Another concern related to applicability is the complexity of the formulation and implementation of the algorithm. Imputation based on k-nearest neighbors (kNN) and clustering algorithms, which are simple and easy to implement, is popular across various domains.
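The evaluation pattern described in the findings is simple to reproduce. The sketch below uses scikit-learn's KNNImputer on a complete reference data set, induces missing-completely-at-random (MCAR) entries, and scores RMSE on the held-out values; the data set, missingness ratio and neighbor count are illustrative choices, not drawn from the review.

```python
# Sketch of the common MVI evaluation loop: complete data as ground
# truth, artificially induced MCAR missingness, kNN imputation, RMSE
# computed only on the artificially removed entries.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.impute import KNNImputer

rng = np.random.default_rng(42)
X_true = load_iris().data          # complete baseline data set
missing_ratio = 0.2                # illustrative missingness ratio

# Induce missing-completely-at-random entries.
mask = rng.random(X_true.shape) < missing_ratio
X_missing = X_true.copy()
X_missing[mask] = np.nan

# kNN imputation, one of the most frequently proposed method families.
X_imputed = KNNImputer(n_neighbors=5).fit_transform(X_missing)

# RMSE over the artificially removed entries only.
rmse = np.sqrt(np.mean((X_imputed[mask] - X_true[mask]) ** 2))
print(f"kNN imputation RMSE at {missing_ratio:.0%} missingness: {rmse:.3f}")
```

Varying the missingness ratio, the mechanism (MCAR, MAR, MNAR) and the data set size in such a loop reproduces the main experimental dimensions along which the reviewed studies differ.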

