Record integrity and licensing your intellectual property

2009 ◽  
Vol 15 (1) ◽  
Author(s):  
John Avellanet

For the biotechnology executive looking to cut a deal with a pharmaceutical company, what matters is not the type of new molecule, but the quality of the records and data that back up the new drug's efficacy. Executives unprepared for pharma's due diligence weaken their own negotiating hand. This paper presents three key aspects of research data quality that any prospective pharma due diligence will examine. The recommendations stress the elements of a preparation strategy that will not only prove the biotechnology's scientific quality, but also give the biotechnology company an edge over others competing for a slice of the pharma pie.


Information ◽  
2020 ◽  
Vol 11 (4) ◽  
pp. 175 ◽  
Author(s):  
Tibor Koltay

This paper focuses on the characteristics of research data quality and aims to cover the most important issues related to it, giving particular attention to its attributes and to data governance. The corporate world's considerable interest in data quality is evident in the thinking and issues reported in business-related publications, even though there are apparent differences between values and approaches to data in corporate and in academic (research) environments. The paper also recognizes that addressing data quality would be unimaginable without considering big data.



Data ◽  
2020 ◽  
Vol 5 (4) ◽  
pp. 90 ◽  
Author(s):  
Stefano Canali

In this commentary, I propose a framework for thinking about data quality in the context of scientific research. I start by analyzing conceptualizations of quality as a property of information, evidence, and data, reviewing research in the philosophy of information, the philosophy of science, and the philosophy of biomedicine. I identify a push for purpose dependency as one of the main results of this review. On this basis, I present a contextual approach to data quality in scientific research, whereby the quality of a dataset depends on the context of use as much as on the dataset itself. I exemplify the approach by discussing current critiques and debates of scientific quality, showcasing how data quality can be approached contextually.



GigaScience ◽  
2021 ◽  
Vol 10 (4) ◽  
Author(s):  
Mikhail G Dozmorov ◽  
Katarzyna M Tyc ◽  
Nathan C Sheffield ◽  
David C Boyd ◽  
Amy L Olex ◽  
...  

Abstract
Background: Sequencing of patient-derived xenograft (PDX) mouse models allows investigation of the molecular mechanisms of human tumor samples engrafted in a mouse host; thus, both human and mouse genetic material is sequenced. Several methods have been developed to remove mouse sequencing reads from RNA-seq or exome sequencing PDX data and improve the downstream signal. However, for more recent chromatin conformation capture technologies (Hi-C), the effect of mouse reads remains undefined.
Results: We evaluated the effect of mouse read removal on the quality of Hi-C data using in silico created PDX Hi-C data with 10% and 30% mouse reads. Additionally, we generated 2 experimental PDX Hi-C datasets using different library preparation strategies. We evaluated 3 alignment strategies (Direct, Xenome, Combined) and 3 pipelines (Juicer, HiC-Pro, HiCExplorer) on Hi-C data quality.
Conclusions: Removal of mouse reads had little-to-no effect on data quality compared with the results obtained with the Direct alignment strategy. Juicer extracted more valid chromatin interactions for Hi-C matrices, regardless of the mouse read removal strategy. However, the pipeline effect was minimal, while the library preparation strategy had the largest effect on all quality metrics. Together, our study presents comprehensive guidelines on PDX Hi-C data processing.
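The filtering idea behind the mouse-read-removal strategies compared above can be illustrated with a minimal sketch. All data and field names here are hypothetical; real pipelines align each read to both the human and mouse references with an aligner and classify reads with tools such as Xenome. The principle, however, is that a read is retained only when it matches the human reference better than the mouse one.

```python
# Minimal sketch of mouse-read filtering in PDX sequencing data.
# The alignment scores below are invented for illustration; in practice
# they come from aligning each read to both reference genomes.

def filter_mouse_reads(reads):
    """Keep reads that align better to the human reference than to mouse."""
    return [r for r in reads if r["human_score"] > r["mouse_score"]]

reads = [
    {"id": "r1", "human_score": 60, "mouse_score": 12},  # human tumor read
    {"id": "r2", "human_score": 8,  "mouse_score": 55},  # mouse host read
    {"id": "r3", "human_score": 40, "mouse_score": 39},  # ambiguous, kept here
]

kept = filter_mouse_reads(reads)
print([r["id"] for r in kept])  # ['r1', 'r3']
```

Ambiguous reads (r3 above) are exactly where real classifiers differ, which is one reason the study compares Direct, Xenome, and Combined strategies.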



2020 ◽  
Vol 4 (4) ◽  
pp. 29 ◽  
Author(s):  
Otmane Azeroual

Databases such as research data management systems (RDMS) hold the research data in which information is searched for, and they provide techniques with which even large amounts of data can be evaluated efficiently. This includes managing research data and optimizing access to it, especially when it cannot be fully loaded into main memory. They also provide methods for grouping and sorting, and they optimize the requests made to them so that these can be processed efficiently even when large amounts of data are accessed. Research data offer one thing above all: the opportunity to generate valuable knowledge. The quality of the research data is of primary importance for this; only flawless research data can deliver reliable, beneficial results and enable sound decision-making. Correct, complete, and up-to-date research data are therefore essential for successful operational processes. Wrong decisions and inefficiencies in day-to-day operations are only the tip of the iceberg, since the problems caused by poor data quality span various areas and weaken entire university processes. This paper therefore addresses the problems of data quality in the context of RDMS, sheds light on how data quality can be ensured, and shows a way to fix the dirty research data that arise during integration, before they have a negative impact on business success.
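The kind of rule-based screening that can catch dirty research data before RDMS integration can be sketched as follows. The record fields and the three rules (completeness, correctness, validity) are illustrative assumptions, not taken from the paper:

```python
# Sketch of rule-based quality checks on research-data records prior to
# RDMS integration. Field names and rules are illustrative only.
from datetime import date

def check_record(rec):
    """Return a list of quality problems found in one record."""
    problems = []
    if not rec.get("title"):                                  # completeness
        problems.append("missing title")
    if rec.get("year") and rec["year"] > date.today().year:   # correctness
        problems.append("year in the future")
    if rec.get("doi") and not rec["doi"].startswith("10."):   # validity
        problems.append("malformed DOI")
    return problems

records = [
    {"title": "Survey A", "year": 2020, "doi": "10.1000/xyz"},
    {"title": "", "year": 2999, "doi": "doi:bad"},
]
for rec in records:
    print(rec.get("title") or "<untitled>", check_record(rec))
```

Records with a non-empty problem list would be routed to cleansing rather than loaded into the RDMS, which is the "fix before integration" step the paper argues for.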



PsycCRITIQUES ◽  
2013 ◽  
Vol 58 (22) ◽  
Author(s):  
Lennart Sjöberg


Author(s):  
Estefania Rabaneda Romero


Author(s):  
Maria Shilyaeva

The article is devoted to a term new to Russian practice, due diligence, which makes it possible to identify and reduce the risks that arise when buying or taking over a company, or when investing. In Russia, due diligence has its own characteristics and features, which are discussed in this article.



2018 ◽  
Author(s):  
Мария Григорьевна Алпатова ◽  
Мария Игоревна Щеглова ◽  
Elmira Kalybaevna Adil’bekova ◽  
Nuradin Alibaev ◽  
Arunas Svitojus

The conference is a major international forum for analyzing and discussing trends and approaches in basic and applied research, providing a platform for discussion of innovative, theoretical, and empirical work. Form of the conference: in absentia, without the form being specified in the collection of articles. Working languages: Russian and English. Doctors and candidates of science, scientists, specialists of various profiles, applicants for academic degrees, teachers, graduate students, undergraduates, and students are invited to participate. The journal uses a single-blind review process: all articles are initially evaluated by the editor for fit with the journal, and manuscripts considered appropriate are then usually sent to at least two independent peer reviewers to assess their scientific quality. The editor is responsible for the final decision on whether to accept or reject an article, and that decision is final. The main criterion used in assessing a submitted manuscript is uniqueness or innovation, in terms of the methodology being developed and/or its application to a problem of particular importance in the public or service sector and/or its setting, for example in a developing region of the world; that is, at least one of the model/methodology, the application, or the problem context must be unique and important. Additional criteria are the manuscript's accuracy, its organization and presentation (i.e., logical flow), and its writing quality.



2019 ◽  
Author(s):  
Изабелла Станиславовна Чибисова ◽  
Диана Ильгизаровна Шарипова ◽  
Альфия Галиевна Зулькарнаева ◽  
Ксения Александровна Дулова ◽  
Садег Амирзадеган ◽  
...  

The conference is a major international forum for analyzing and discussing trends and approaches in basic and applied research, providing a platform for discussion of innovative, theoretical, and empirical work. Form of the conference: in absentia, without the form being specified in the collection of articles. Working languages: Russian and English. Doctors and candidates of science, scientists, specialists of various profiles, applicants for academic degrees, teachers, graduate students, undergraduates, and students are invited to participate. The journal uses a single-blind review process: all articles are initially evaluated by the editor for fit with the journal, and manuscripts considered appropriate are then usually sent to at least two independent peer reviewers to assess their scientific quality. The editor is responsible for the final decision on whether to accept or reject an article, and that decision is final. The main criterion used in assessing a submitted manuscript is uniqueness or innovation, in terms of the methodology being developed and/or its application to a problem of particular importance in the public or service sector and/or its setting, for example in a developing region of the world; that is, at least one of the model/methodology, the application, or the problem context must be unique and important. Additional criteria are the manuscript's accuracy, its organization and presentation (i.e., logical flow), and its writing quality.



2017 ◽  
Vol 4 (1) ◽  
pp. 25-31 ◽  
Author(s):  
Diana Effendi

The Information Product Approach (IP Approach) is an information management approach that can be used to manage product information and to analyze data quality. An IP-Map can be used by organizations to facilitate the management of knowledge in collecting, storing, maintaining, and using data in an organized way. The data management process for academic activities at X University has not yet used the IP approach: the university has paid no attention to managing the quality of its information and has so far concerned itself only with the system applications used to automate data management in its academic processes. The IP-Map built in this paper can be used as a basis for analyzing the quality of data and information. Through the IP-Map, X University is expected to learn which parts of the process need improvement in its data and information quality management.

Index terms: IP Approach, IP-Map, information quality, data quality.
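The core of an IP-Map is a directed graph of blocks (data sources, processing steps, storage, and the final information product) connected by data flows, which lets an analyst trace a quality problem in a product back to its upstream causes. A toy sketch under that assumption, with block names invented for an academic setting rather than taken from the paper:

```python
# Toy representation of an IP-Map: named blocks connected by data flows.
# Block roles loosely follow the IP-Map notation (source, process, storage,
# sink); the academic example itself is hypothetical.

ip_map = {
    "blocks": {
        "registrar":   "source",   # raw student data entered
        "validate":    "process",  # quality checks on entry
        "academic_db": "storage",  # system holding the records
        "transcript":  "sink",     # information product consumed
    },
    "flows": [("registrar", "validate"),
              ("validate", "academic_db"),
              ("academic_db", "transcript")],
}

def upstream_of(ip_map, block):
    """Blocks whose output feeds (directly or indirectly) into `block` --
    the candidates to inspect when the product's quality is poor."""
    found, frontier = set(), {block}
    while frontier:
        frontier = {src for src, dst in ip_map["flows"]
                    if dst in frontier} - found
        found |= frontier
    return found

print(sorted(upstream_of(ip_map, "transcript")))
# ['academic_db', 'registrar', 'validate']
```

Walking the flows backwards from a low-quality information product is exactly the kind of analysis the paper expects the IP-Map to support at X University.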


