Blockchain-mediated Licensing: Legal Engineering for Artist Empowerment

2020 ◽  
Author(s):  
Charles Adjovu ◽  
Ewa Fabian

Licensing is one of the essential means of exploiting the monetary value of a musical work, yet it is fraught with issues and transactional costs that make it a difficult process for individuals and organizations. Many issues in music licensing arise from the legal complexity (e.g., national and international copyright law), business complexity (authentication, tracking, accounting, etc.), value web complexity (transparency of relationships among stakeholders), and technical complexity (e.g., establishing a global repertoire database for music, sufficient metadata standards) of working with music. In addition to these issues, there are specific transactional costs (identification, negotiation, monitoring, and enforcement) associated with the licensing process. To mitigate the complexity and transactional costs associated with music and the licensing process, researchers and technologists have been investigating how new technologies and design models from the Web3 space, such as blockchain, linked data, and Ricardian Contracts, can automate processes to reduce complexity, speed up payments, improve tracking, and provide other benefits in the music industry. In our report, we make our own attempt to reduce the complexity and transactional costs in the licensing process by developing an automated music license. We first conducted a literature review scoping the intersection of music complexity and Web3 technologies to provide background and context for automating music licensing. We then developed the Practical Tokenized Drafting (PTD) method, a set of core principles and practices for drafting Ricardian Contracts that interact with Web3 technologies (RC-Web3 Templates), and the Tokenized Music License (TML), an RC-Web3 Template standard form for music licensing on the OpenLaw platform. 
Both the PTD and TML can be adapted to meet the needs of music industry stakeholders and provide guidance to legal practitioners in drafting RC-Web3 Templates.
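The core Ricardian Contract idea underlying the TML, legal prose bound to machine-readable parameters and identified by a cryptographic hash that on-chain code can reference, can be sketched as follows. This is a minimal illustration only; the field names and values are invented and are not the actual TML/OpenLaw template variables.

```python
# Sketch of a Ricardian-style document: human-readable prose plus
# machine-readable parameters, bound together by a SHA-256 digest.
# All field names and values below are hypothetical placeholders.
import hashlib
import json

license_prose = "Licensor grants Licensee a non-exclusive right to stream the Work."
parameters = {
    "work_id": "isrc:XX-XXX-00-00000",  # hypothetical work identifier
    "licensee": "0xLicenseeAddress",     # placeholder blockchain address
    "royalty_rate_bps": 250,             # 2.5% royalty, in basis points
    "territory": "worldwide",
}

# Canonical serialization: the prose followed by sorted-key JSON, so the
# same terms always produce the same digest.
document = license_prose + "\n" + json.dumps(parameters, sort_keys=True)
contract_hash = hashlib.sha256(document.encode("utf-8")).hexdigest()
# `contract_hash` is the stable identifier that links the legal text
# and any smart-contract logic that executes against it.
```

The hash is what makes the template "tokenizable": a token or smart contract need only store the digest to commit to the full legal terms.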

Author(s):  
Stephanie Cosner Berzin ◽  
Claudia J. Coulton

Innovative applications of new digital technology present opportunities for social and human services to reach more people with greater impact on our most vexing social problems. These new technologies can be deployed to more strategically target social spending, speed up the development of effective programs, and bring a wider array of help to more individuals and communities.


Data Mining ◽  
2013 ◽  
pp. 336-365
Author(s):  
Bing He ◽  
Bin Xie ◽  
Sanjuli Agrawal ◽  
David Zhao ◽  
Ranga Reddy

With the ever-growing demand for high throughput from mobile users, 3G cellular networks are limited in the capacity they can offer for high-rate data services to large numbers of users. Consequently, many Internet services such as on-demand video and mobile TV are difficult to support satisfactorily on current 3G cellular networks. 3GPP Long Term Evolution (LTE) is a recently proposed 4G standard, representing a significant advance over 3G cellular technology. LTE offers uplink data rates of up to 50 Mbps and downlink rates of up to 100 Mbps for services such as traditional voice, high-speed data, multimedia unicast, and multimedia broadcasting. In a short time, it has been broadly accepted by major wireless vendors such as Verizon-Vodafone, AT&T, NTT-Docomo, KDDI, T-Mobile, and China Mobile. To achieve high link speeds, LTE adopts technologies new to 3G networks, such as Orthogonal Frequency Division Multiplexing (OFDM) and Multiple-Input Multiple-Output (MIMO). MIMO allows the use of more than one antenna at the transmitter and receiver for higher-rate data transmission. LTE bandwidth is scalable from 1.25 to 20 MHz, satisfying the needs of network operators that have different bandwidth allocations for services, based on their managed spectrum. In this chapter, we discuss the major advances of LTE and recent research efforts to improve its performance. Our illustration of LTE is comprehensive, spanning the physical layer to the link layer. LTE security is also discussed.
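The scalable-bandwidth point above can be made concrete with a small sketch. In the released LTE specification, each nominal channel bandwidth maps to a fixed number of 180 kHz resource blocks (12 subcarriers at 15 kHz OFDM spacing); note that the released spec settled on 1.4 MHz rather than the 1.25 MHz of the early drafts this chapter cites.

```python
# Illustrative mapping from LTE nominal channel bandwidth to resource
# blocks, per the released LTE specification (not from this chapter).
SUBCARRIER_SPACING_HZ = 15_000  # OFDM subcarrier spacing
SUBCARRIERS_PER_RB = 12         # subcarriers per resource block

# Nominal bandwidth (MHz) -> number of downlink resource blocks.
RESOURCE_BLOCKS = {1.4: 6, 3: 15, 5: 25, 10: 50, 15: 75, 20: 100}

def occupied_bandwidth_mhz(nominal_mhz):
    """Spectrum occupied by the resource blocks (guard bands excluded)."""
    rbs = RESOURCE_BLOCKS[nominal_mhz]
    return rbs * SUBCARRIERS_PER_RB * SUBCARRIER_SPACING_HZ / 1e6

# A 20 MHz carrier carries 100 resource blocks, occupying 18.0 MHz;
# the remainder is guard band.
```

The same OFDM numerology across all bandwidths is what lets one operator deploy in a 5 MHz allocation while another uses 20 MHz.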


2018 ◽  
Vol 14 (3) ◽  
pp. 167-183
Author(s):  
Ahmed Ktob ◽  
Zhoujun Li

This article describes how many new technologies have recently been introduced to the web; linked data is probably the most important. Individuals and organizations have started publishing their data on the web, adhering to a set of best practices. This data is published mostly in English; hence, only English-language agents can consume it. Meanwhile, although the number of Arabic users on the web is immense, few Arabic datasets are published. Publication catalogs are one of the primary sources of Arabic data that is not being exploited. Arabic catalogs provide a significant amount of meaningful data and metadata, commonly stored in Excel sheets. In this article, an effort has been made to help publishers easily and efficiently share their catalogs' data as linked data. Marefa is the first tool that automatically extracts RDF triples from Arabic catalogs, aligns them to the BIBO ontology, and links them with the Arabic chapter of DBpedia. An evaluation of the framework was conducted, and statistical measures were generated during the different phases of the extraction process.
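The mapping step described above, turning a catalog row into BIBO-aligned RDF triples, can be sketched in a few lines. This is a hypothetical illustration of the general technique, not Marefa's actual implementation; the function name and field mapping are invented, while the BIBO and Dublin Core namespace IRIs are the standard ones.

```python
# Hypothetical sketch: map one Arabic catalog row (e.g. from an Excel
# export) to RDF triples aligned to BIBO, serialized as N-Triples.
BIBO = "http://purl.org/ontology/bibo/"
DCT = "http://purl.org/dc/terms/"
RDF_TYPE = "http://www.w3.org/1999/02/22-rdf-syntax-ns#type"

def record_to_triples(subject_uri, record):
    """Map a catalog row (dict of field -> value) to N-Triples lines."""
    # Every catalog entry is typed as a bibo:Book in this sketch.
    triples = [f"<{subject_uri}> <{RDF_TYPE}> <{BIBO}Book> ."]
    field_map = {"title": DCT + "title", "creator": DCT + "creator"}
    for field, predicate in field_map.items():
        value = record.get(field)
        if value:
            # Literal values carry an Arabic language tag.
            triples.append(f'<{subject_uri}> <{predicate}> "{value}"@ar .')
    return triples

row = {"title": "مقدمة ابن خلدون", "creator": "ابن خلدون"}
for line in record_to_triples("http://example.org/book/1", row):
    print(line)
```

A real pipeline would additionally normalize values and look up matching resources in the Arabic DBpedia to emit `owl:sameAs` links, the linking phase the article evaluates.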


Popular Music ◽  
1993 ◽  
Vol 12 (3) ◽  
pp. 289-300 ◽  
Author(s):  
Jason Toynbee

After a decade of relative stability a flurry of closures and launches has changed the face of the British popular music press (see Table 1). This article examines the new structure and its development in three interlocking areas: in music industry organisation, rock and pop genres, and the process of discursive formation in the press itself. On the first of these, I argue that the function of the music press as an ‘institutional regulator’ (Hirsch 1990, p. 132) of music industry output has been subverted, during the 1980s, by a fluctuating record market and the growth of music programming on television and radio. A uni-directional model of cultural production is now problematic. Media, including the press, may be sponsors or initiators of music texts rather than mere filters. Secondly the New Pop, and more recently rave, has threatened the straightforward alignment of taste and cultural capital which underpinned rock hegemony. In the patchwork world of contemporary youth music the para-pedagogic work of the rock press in both guiding and excluding communities of taste has an extra urgency. My third point concerns critical method. Just because the rock/pop field has become decentred, the drive to fix meaning tends to generate crisis as successive critical solutions break down, revealed as monstrously excessive, as ‘more than’ the music. Journalists usually anticipate failure. But this only serves to speed up change and the movement towards the next untenable position.


2010 ◽  
Vol 10 (3) ◽  
pp. 174-180 ◽  
Author(s):  
Jonathan Purday

AbstractEuropeana is the focus of a number of IPR issues. The portal provides access to three sets of assets: the Open Source code base, the authority-controlled metadata and the digitised content. Europeana has integrated metadata standards across the heritage domains and will now licence the metadata as a resource for the development of linked data and semantic web applications. The main rights issues concern the digitisation of public domain content and orphan works. Digitisation of out-of-copyright analogue material does not create new rights and the Europeana Public Domain Charter provides guidelines for the sector. An inhibitor to digitisation is the problem of orphan works, now under review by the European Commission.


2021 ◽  
Author(s):  
Cristohper Ramos Flores

<p><b>This thesis presents a novel music-technology project, the HypeSax, which affords new roles to the saxophone and enhances its sound capacities. This document presents a discussion of the musical ideas and design criteria behind the development of this new instrument, addressing issues of embodiment that arise from the use of new technologies, and of what this new medium means in the discussion of the ontology of the musical work. This project is intended to research the medium through a case study, in which the medium becomes the central focus of my compositional decisions.</b></p> <p>As part of this project, a body of new musical works, associated with the HypeSax, was created. These compositions and the creative process from which they originated are analysed in relation to the HypeSax, questioning if the musical work is limited to the composition or if other processes such as the development of the medium, which in this case is the HypeSax, can be considered part of its ontology.</p> <p>The desire to understand and define the ontology of the musical work has led musicians, musicologists and philosophers to formulate multiple propositions that observe perspectives of creation and reception, as well as different ways in which these interact. This thesis proposes the integration of a new element in the conversation of the work-concept: the medium. The argument presented is that, in light of compositional practices in the twenty-first century, the creative work begins when musicians design instruments, software, audio setups, and other new technologies, actively transforming the medium through which their works are created. Although the medium has always been closely related to the composition, performance and reception of the work, it has not been considered an element in the ontology of the work. 
Nevertheless, it becomes impossible to ignore the importance of the medium as new technologies facilitate its manipulation as a part of the creative process. </p> <p>New works featuring the HypeSax are discussed, as well as how this novel medium provides the affordances and possibilities that allow the creation of said works. This case study serves to demonstrate the importance of the medium in the context of a new tripartite model of the work-concept where score, performance and medium are integrated, in a non-hierarchical structure, as one inseparable reality of music.</p>


2021 ◽  
Author(s):  
Kyle Brannick

<p>With major recording artist Thom Yorke predicting the record industry will crumble in “Months” (Hudson, 2010), and sensationalist headlines such as “iPods and Young People Have Utterly Destroyed Music” (Buchanan, 2009) becoming commonplace, this research attempts to determine the current state of New Zealand music in the digital age. Despite the doom and gloom coming from the press with regard to the music industry, musicians have continued to record, release, and promote their music as the costs of doing so continue to decline with the advent of new technologies. This research looks specifically at the music hosting website Bandcamp and determines what methods New Zealand musicians are currently using on the site in an effort to get their music into the ears and onto the hard drives of fans. Although a large amount of research has been performed on the impacts of piracy on music sales, very little has been conducted on what strategies musicians are implementing to increase their exposure and connect with their fan base in the 21st century, with no specific research having been performed on the unique circumstances faced by artists in New Zealand. This paper first presents a historical overview of the music industry in the last century, as well as a summary of where the industry currently stands with regard to Copyright, distribution methods, and price models, in order to provide perspective on the difficulties and variety of choices currently facing musicians. Within this research paper, several hypotheses were tested in order to determine what factors have a significant effect on the amount of exposure that an artist has received for their music. In order to test these hypotheses, the number of audio streams and downloads that an artist has received for their songs posted to the music hosting site Bandcamp was used as a measure of the amount of exposure that a specific artist has received. 
Due to the subjective nature of the quality of the music each musician creates, a survey was sent to over 500 New Zealand musicians who provided at least one song for download on the website, in order to gather as much overall data as possible on the success generated by New Zealand musicians online. A quantitative analysis was then performed to determine what social networking and music hosting sites are most popular with Kiwi artists; whether musicians are still creating physical copies of their works; and what licenses and payment models artists are applying to their songs. This analysis identified two factors as statistically significant in affecting the number of downloads and audio streams an artist receives on Bandcamp: the length of time that an artist has been present on the site, and the payment model that an artist applies to their works. In addition to the quantitative analysis of the success that artists were achieving on Bandcamp, a qualitative analysis was performed on the motivations artists had for applying specific pricing models and licenses to their works. This analysis found a nearly unanimous positive response from musicians who had applied traditional Copyright to their work when asked if they would allow their fans to share their music without express permission. This research also determined that a majority of musicians currently applying traditional Copyright to their works are unfamiliar, unaware, or uninformed about Creative Commons licenses, with traditional Copyright being applied more out of habit than out of a desire for the rights it grants. A discussion of what these results indicate for artists is also presented as a guide for current and future musicians looking to upload their music to Bandcamp, depending on the goals the musician is looking to achieve with their music. 
Finally, this paper concludes with an analysis of what limitations are present in the results of the research, as well as where the need exists for future research.</p>


2015 ◽  
Vol 59 (2) ◽  
pp. 97
Author(s):  
Lisa Romano

For the past few years, librarians have heard how Linked Data will be the future of bibliographic data. Linked Data for Libraries, Archives, and Museums: How to Clean, Link and Publish Your Metadata tries to make sense of the hype. The goal of this book is to introduce “the process of making your collections available, from the arduous processes of cleaning and connecting to publishing it for the world” (xiv). Specifically, this book describes metadata standards including Linked Data, associated tools and technologies, and the sustainability of metadata and technologies. The authors critically evaluate various options that can be used to clean, enrich, and publish metadata along with the history, advantages, and disadvantages of each.


2021 ◽  
Vol 8 ◽  
Author(s):  
Michelle S. Koo ◽  
Vance T. Vredenburg ◽  
John B. Deck ◽  
Deanna H. Olson ◽  
Kathryn L. Ronnenberg ◽  
...  

Emerging infectious diseases have been especially devastating to amphibians, the most endangered class of vertebrates. For amphibians, the greatest disease threat is chytridiomycosis, caused by one of two chytridiomycete fungal pathogens Batrachochytrium dendrobatidis (Bd) and Batrachochytrium salamandrivorans (Bsal). Research over the last two decades has shown that susceptibility to this disease varies greatly with respect to a suite of host and pathogen factors such as phylogeny, geography (including abiotic factors), host community composition, and historical exposure to pathogens; yet, despite a growing body of research, a comprehensive understanding of global chytridiomycosis incidence remains elusive. In a large collaborative effort, Bd-Maps was launched in 2007 to increase multidisciplinary investigations and understanding using compiled global Bd occurrence data (Bsal was not discovered until 2013). As its database functions aged and became unsustainable, we sought to address critical needs utilizing new technologies to meet the challenges of aggregating data to facilitate research on both Bd and Bsal. Here, we introduce an advanced central online repository to archive, aggregate, and share Bd and Bsal data collected from around the world. The Amphibian Disease Portal (https://amphibiandisease.org) addresses several critical community needs while also helping to build basic biological knowledge of chytridiomycosis. This portal could be useful for other amphibian diseases and could also be replicated for uses with other wildlife diseases. We show how the Amphibian Disease Portal provides: (1) a new repository for the legacy Bd-Maps data; (2) a repository for sample-level data to archive datasets and host published data with permanent DOIs; (3) a flexible framework to adapt to advances in field, laboratory, and informatics technologies; and (4) a global aggregation of Bd and Bsal infection data to enable and accelerate research and conservation. 
The new framework for this project is built using biodiversity informatics best practices and metadata standards to ensure scientific reproducibility and linkages across other biological and biodiversity repositories.


2019 ◽  

Drought is one of the prime abiotic stresses in the world. Among the new technologies available to speed up the release of new drought-tolerant genotypes is the emerging discipline of machine learning. This study presents machine learning for the identification, classification, and prediction of drought-tolerant maize inbred lines based on SSR genetic marker datasets generated from PCR reactions. A total of 356 reproducible SSR fragment alleles were detected across the 71 polymorphic SSR loci. A dataset of 12 inbred lines, with these fragments prepared as attributes, was imported into the RapidMiner software. After removal of duplicate, useless, and correlated features, 311 attributes remained polymorphic, ranging in size from 1500 to 3500 bp. The most important attribute fragment alleles were selected under different attribute-weighting schemes, and ten datasets were created using attribute-selection (weighting) algorithms. Different classification algorithms were then applied to these datasets. These can be used to identify groups of alleles with similar patterns of expression, and to create models that have been applied successfully to prediction, classification, and pattern recognition under drought stress. Some unsupervised models were able to differentiate tolerant inbred lines from susceptible ones. Four of the models were able to produce distinct decision trees with roots and leaves. The most important attribute alleles in almost all models were phi033a3, bnlg1347a1, and bnlg172a2, respectively, which can help to identify tolerant maize inbred lines with high precision.
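The attribute-weighting idea described above can be sketched in a few lines of plain Python: score each SSR fragment allele (coded as 0/1 presence) by how strongly it separates tolerant from susceptible lines. This is an illustrative stand-in for RapidMiner's weighting operators; the function, toy data, and the simple mean-difference score are all invented for the sketch.

```python
# Minimal attribute-weighting sketch for SSR allele data (hypothetical;
# the study used RapidMiner's weighting algorithms on 311 attributes).
def allele_weights(matrix, labels):
    """matrix: rows = inbred lines, columns = allele presence (0/1).
    labels: 1 = drought-tolerant, 0 = susceptible.
    Weight per allele = |mean presence in tolerant - mean in susceptible|."""
    tolerant = [row for row, y in zip(matrix, labels) if y == 1]
    susceptible = [row for row, y in zip(matrix, labels) if y == 0]
    weights = []
    for j in range(len(matrix[0])):
        mt = sum(r[j] for r in tolerant) / len(tolerant)
        ms = sum(r[j] for r in susceptible) / len(susceptible)
        weights.append(abs(mt - ms))
    return weights

# Toy data: 4 lines x 3 alleles (think phi033a3, bnlg1347a1, bnlg172a2).
X = [[1, 0, 1],
     [1, 1, 1],
     [0, 1, 0],
     [0, 0, 0]]
y = [1, 1, 0, 0]
w = allele_weights(X, y)
# Alleles present in every tolerant line and no susceptible line get
# the maximum weight of 1.0; uninformative alleles score 0.0.
top = max(range(len(w)), key=w.__getitem__)
```

High-weight alleles would then be kept as the attributes fed to the classification algorithms, mirroring the attribute-selection step before model building.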

