A Mobile-Oriented System for Integrity Preserving in Audio Forensics

2019, Vol 9 (15), pp. 3097
Author(s): Diego Renza, Jaime Andres Arango, Dora Maria Ballesteros

This paper addresses a problem in the field of audio forensics. With the aim of providing a solution that supports Chain of Custody (CoC) processes, we propose an integrity verification system that includes capture (mobile-based), hash code calculation and cloud storage. When the audio is recorded, a hash code is generated in situ by the capture module (an application) and is sent immediately to the cloud. Later, the integrity of the audio recording given as evidence can be verified against the information stored in the cloud. To validate the properties of the proposed scheme, we conducted several tests to evaluate whether two different inputs could generate the same hash code (collision resistance) and to evaluate how much the hash code changes when small changes occur in the input (sensitivity analysis). According to the results, all selected audio signals produce different hash codes, and these values are very sensitive to small changes to the recorded audio. In terms of computational cost, less than 2 s per minute of recording is required to calculate the hash code. These results show that our system is suitable for verifying the integrity of audio recordings that may be relied on as digital evidence.
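The abstract does not specify the hash construction or the cloud API, but the collision-resistance and sensitivity properties it describes are those of a standard cryptographic digest, so a minimal sketch of the capture-then-verify workflow might look as follows (SHA-256, the file paths and the cloud client are assumptions, not the authors' implementation):

```python
# Minimal sketch of the capture-then-verify idea, assuming a standard
# cryptographic hash (SHA-256); the paper's actual hash construction and
# cloud API are not specified in this abstract.
import hashlib

def hash_recording(path: str) -> str:
    """Compute a hex digest of a recorded audio file, streaming it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_recording(path: str, stored_digest: str) -> bool:
    """Re-hash the evidence file and compare against the digest stored in the cloud."""
    return hash_recording(path) == stored_digest

# At capture time the digest would be uploaded immediately (hypothetical client):
#   cloud_client.put(recording_id, hash_recording("evidence.wav"))
# At verification time:
#   assert verify_recording("evidence.wav", cloud_client.get(recording_id))
```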

Author(s): Matthew N.O. Sadiku, Adebowale E. Shadare, Sarhan M. Musa

The digital chain of custody is the record of preservation of digital evidence from collection to presentation in the court of law. It is an essential part of the digital investigation process. Its key objective is to ensure that the digital evidence presented to the court remains as originally collected, without tampering. The chain of custody is important for evidence to be admissible in court; without one, the opposing attorney can challenge or dismiss the evidence presented. The aim of this paper is to provide a brief introduction to the concept of the digital chain of custody.
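As a purely illustrative aid, and not a format defined by the paper, a digital chain-of-custody record could be sketched as an evidence digest plus an append-only list of custody events; all field names below are hypothetical:

```python
# Hypothetical sketch of a digital chain-of-custody record: the evidence digest
# fixes the artefact's content at collection, and each custody event is appended
# with who handled it, what was done, and when. Field names are illustrative only.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class CustodyEvent:
    handler: str     # person or system taking custody
    action: str      # e.g. "collected", "imaged", "transferred"
    timestamp: str   # ISO 8601, recorded at the time of the action

@dataclass
class ChainOfCustody:
    evidence_id: str
    sha256_digest: str                     # content hash taken at collection
    events: list = field(default_factory=list)

    def record(self, handler: str, action: str) -> None:
        self.events.append(CustodyEvent(
            handler=handler,
            action=action,
            timestamp=datetime.now(timezone.utc).isoformat(),
        ))

coc = ChainOfCustody("case-042-item-7", "3f5a...")   # digest shortened for the example
coc.record("Investigator A", "collected")
coc.record("Lab technician B", "imaged")
```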


2014, Vol 107 (9), pp. 30-36
Author(s): Yudi Prayudi, Ahmad Ashari, Tri K Priyambodo

2016
Author(s): Andrew Dawson, Peter Düben

This paper describes the rpe library, which has the capability to emulate the use of arbitrary reduced floating-point precision within large numerical models written in Fortran. The rpe software allows model developers to test how reduced floating-point precision affects the result of their simulations without having to make extensive code changes or port the model onto specialised hardware. The software can be used to identify parts of a program that are problematic for numerical precision and to guide changes to the program to allow a stronger reduction in precision. The development of rpe was motivated by the strong demand for more computing power. If numerical precision can be reduced for an application under consideration while still achieving results of acceptable quality, computational cost can be reduced, since a reduction in numerical precision may allow an increase in performance or a reduction in power consumption. For simulations with weather and climate models, savings due to a reduction in precision could be reinvested to allow model simulations at higher spatial resolution or complexity, or to increase the number of ensemble members to improve predictions. rpe was developed with a particular focus on the weather and climate modelling community, but the software could be used for numerical simulations in other domains.
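rpe itself is a Fortran library, so the following is only a conceptual Python sketch of what emulating reduced precision means: each intermediate result is rounded to a chosen number of significand bits before computation continues. It does not use or reproduce the rpe API.

```python
# Conceptual sketch of reduced-precision emulation (not the rpe Fortran API):
# round a double's 52-bit significand to `sbits` bits and carry on computing
# in ordinary doubles, mimicking how an emulator degrades each operation.
import struct

def reduce_precision(x: float, sbits: int) -> float:
    """Round x to a significand of `sbits` bits (0 < sbits <= 52)."""
    bits = struct.unpack(">Q", struct.pack(">d", x))[0]
    drop = 52 - sbits                      # low-order significand bits to remove
    half = 1 << (drop - 1) if drop > 0 else 0
    bits = (bits + half) >> drop << drop   # round to nearest, then truncate
    return struct.unpack(">d", struct.pack(">Q", bits))[0]

# Example: accumulate a sum at roughly half precision (10 significand bits)
total = 0.0
for i in range(1, 1001):
    total = reduce_precision(total + 1.0 / i, sbits=10)
print(total)   # compare against the full-precision harmonic sum, about 7.485
```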


2020
Author(s): Kevin Dew, L Signal, J Stairmand, A Simpson, D Sarfati

This study identified ways in which patients and medical specialists negotiated decisions about cancer treatment by observing decision-making discussions in situ. Audio-recordings of cancer care consultations with 18 patients, their support people, and their medical specialists, including medical oncologists, radiation oncologists and surgeons, were collected in different regions of New Zealand. Patients were followed up with interviews, and specialists provided consultation debriefings. The interpretation of the data drew on the concepts of epistemic and deontic rights to argue that in complex consultations, such as those that occur in cancer care, we need to reconsider the simple dichotomy of preferred consultation styles as either paternalistic or based on shared decision-making. Decision-making is a dynamic process in which specialists and patients are linked into networks that shape decision-making and in which rights to knowledge and rights to decision-making are interactionally negotiated. The level of information and understanding that patients desire in order to exercise these rights needs to be reconsidered.


Symmetry, 2020, Vol 12 (7), pp. 1193
Author(s): Shaochen Jiang, Liejun Wang, Shuli Cheng, Anyu Du, Yongming Li

Existing learning-based unsupervised hashing methods usually use a pre-trained network to extract features and then use the extracted feature vectors to construct a similarity matrix that guides the generation of hash codes through gradient descent. Existing research shows that gradient-descent-based algorithms cause the hash codes of paired images to be updated toward each other's position during training. For unsupervised training, this causes large fluctuations in the hash codes and limits the learning efficiency of the hash codes. In this paper, we propose a method named Deep Unsupervised Hashing with Gradient Attention (UHGA) to solve this problem. UHGA mainly includes the following steps: (1) use pre-trained network models to extract image features; (2) calculate the cosine distance between the features of each image pair and construct a similarity matrix from these distances to guide the generation of hash codes; (3) add a gradient attention mechanism during hash-code training to attend to the gradients. Experiments on two existing public datasets show that our proposed method obtains more discriminative hash codes.
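A minimal sketch of steps (1) and (2), building the cosine-similarity matrix from pre-extracted features and relating it to relaxed binary codes, is given below; the hashing network, the training loop and the gradient attention mechanism are omitted, and all shapes and names are illustrative rather than the paper's implementation.

```python
# Minimal sketch of steps (1)-(2): pre-extracted features -> cosine similarity
# matrix -> binary hash codes whose inner products are compared against it.
# The network, training loop and gradient attention mechanism are omitted.
import numpy as np

rng = np.random.default_rng(0)
features = rng.normal(size=(8, 512))            # stand-in for pre-trained CNN features

# Cosine similarity matrix used as the (pseudo-)supervisory signal
normalized = features / np.linalg.norm(features, axis=1, keepdims=True)
S = normalized @ normalized.T                   # entries in [-1, 1]

# Continuous codes (a random projection stands in for the hashing network)
proj = rng.normal(size=(512, 64))
continuous = np.tanh(features @ proj)           # relaxed codes in (-1, 1)
B = np.sign(continuous)                         # 64-bit binary codes in {-1, +1}

# A common reconstruction-style objective: code inner products should match S
loss = np.mean((continuous @ continuous.T / 64 - S) ** 2)
print(B.shape, loss)
```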


2020
Author(s): Nathalie Casas, Guilhem Mollon, Ali Daouadji

How do earthquakes start? What are the parameters influencing fault evolution? What are the local parameters controlling the seismic or aseismic character of slip?

To predict the dynamic behaviour of faults, it is important to understand slip mechanisms and their source. Laboratory or in-situ experiments can be very helpful, but tribological experience has shown that it is complicated to install local sensors inside a mechanical contact and that they could disturb the behaviour of the sheared medium. Even with technical improvements to laboratory tools, some interesting data regarding gouge kinematics and rheology remain very difficult or impossible to obtain. Numerical modelling is therefore another way of understanding the physics of earthquakes.

Fault zones usually present a granular gouge, formed from the wear material of previous slip events. That is why, in this study, we present a numerical model to observe the evolution and behaviour of fault gouges. We chose to focus on the physics of contacts inside a granular gouge at the millimetre scale, studying contact interactions and the friction coefficient between the different bodies. To access this kind of information, we implement a 2D granular fault gouge with Discrete Element Modelling in the software MELODY (Mollon, 2016). The gouge model involves two rough surfaces representing the rock walls separated by the granular gouge.

One of the interests of this code is its ability to represent realistic non-circular grain shapes with a Fourier-Voronoï method (Mollon et al., 2012). As most of the simulations reported in the literature use circular (2D) or spherical (3D) grains, we wanted to analyse numerically the contribution of angular grains. We confirm that they lead to higher friction coefficients and different global behaviours (Mair et al., 2002; Guo et al., 2004).

In a first model, we investigate dry contacts to highlight the influence of inter-particle cohesion and small particles on slip behaviour and static friction. A second model is carried out to observe aseismic and seismic slip occurring within the gouge. As stability depends on the interplay between the peak of static friction and the stiffness of the surrounding medium, the model includes the stiffness of the loading apparatus on the rock walls.

The work presented here focuses on millimetre-scale phenomena, and the employed model cannot be extended to the scale of the entire fault network for computational-cost reasons. It is expected, however, that it will lead to a better understanding of local behaviours that may be injected as simplified interface laws into larger-scale simulations.
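As a toy illustration of non-circular grain outlines in the spirit of Fourier-descriptor particle shapes (though not the MELODY or Fourier-Voronoï implementation, and with uncalibrated amplitudes), a closed contour can be generated from a random Fourier spectrum of radius perturbations:

```python
# Toy sketch: generate an angular grain outline from a random Fourier spectrum
# of radius perturbations around a mean radius. Amplitudes and harmonic range
# are illustrative, not calibrated against any gouge material.
import numpy as np

def grain_outline(mean_radius=1.0, n_points=128, harmonics=range(2, 9),
                  amplitude=0.05, seed=0):
    """Return (x, y) points of a closed, non-circular grain contour."""
    rng = np.random.default_rng(seed)
    theta = np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False)
    r = np.full_like(theta, mean_radius)
    for n in harmonics:
        a_n = amplitude * mean_radius * rng.standard_normal()
        phi_n = rng.uniform(0.0, 2.0 * np.pi)
        r += a_n * np.cos(n * theta + phi_n)    # add one harmonic of roughness
    return r * np.cos(theta), r * np.sin(theta)

x, y = grain_outline()
print(x.shape, y.shape)   # each contour could then be filled or meshed for DEM
```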


2007, Vol 01 (03), pp. 307-318
Author(s): Atman Jbari, Abdellah Adib, Driss Aboutajdine

In this paper, we address the problem of Blind Audio Separation (BAS) by evaluating the content of audio signals in the time-scale domain. Most of the proposed techniques rely on an independence, or at least uncorrelatedness, assumption about the source signals, exploiting mutual information or second/higher-order statistics. Here, we present a new algorithm for instantaneous mixtures that considers only the different time-scale signatures of the source signals. Our approach builds on the advantages of the wavelet transform and proposes a new representation, the Spatial Time-Scale Distribution (STSD), to characterize the energy and interference of the observed data. Separation is achieved by joint diagonalization, without a prior orthogonality constraint, of a set of selected diagonal STSD matrices. Several criteria are proposed, in the transformed time-scale space, to assess the content of the separated audio signals. We describe the logistics of the separation and of the content rating, and an exemplary implementation on synthetic signals and real audio recordings shows the high efficiency of the proposed technique in restoring the audio signal contents.
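As a simplified illustration of the underlying idea, though not the authors' STSD algorithm, a two-channel instantaneous mixture can be unmixed by jointly diagonalizing two spatial matrices computed over different wavelet scale bands; with only two matrices, the joint diagonalization reduces to a generalized eigendecomposition. The sketch below assumes pywt and scipy are available, that the sources occupy distinct time-scale regions, and that the band indices are illustrative.

```python
# Simplified illustration (not the authors' STSD algorithm): separate a 2x2
# instantaneous mixture by jointly diagonalizing two spatial matrices built
# from different wavelet scale bands, via a generalized eigendecomposition.
import numpy as np
import pywt
from scipy.linalg import eig

fs = 8000
t = np.arange(0, 1.0, 1.0 / fs)
s1 = np.sin(2 * np.pi * 150 * t)                  # low-frequency source
s2 = np.sin(2 * np.pi * 1500 * t)                 # high-frequency source
A = np.array([[1.0, 0.6], [0.5, 1.0]])            # unknown instantaneous mixing
X = A @ np.vstack([s1, s2])                       # observations, shape (2, N)

# Continuous wavelet transform of each observation, split into two scale bands
scales = np.arange(1, 65)
W = np.stack([pywt.cwt(x, scales, "morl")[0] for x in X])   # (2, n_scales, N)
fine = W[:, :16, :].reshape(2, -1)                # small scales: high frequencies
coarse = W[:, 40:, :].reshape(2, -1)              # large scales: low frequencies
R1, R2 = coarse @ coarse.T, fine @ fine.T         # 2x2 spatial scale-band matrices

# Generalized eigenvectors of (R1, R2) diagonalize both matrices at once; their
# transpose acts as the unmixing matrix, up to scaling and permutation.
_, V = eig(R1, R2)
estimated = np.real(V.T) @ X
print(estimated.shape)                            # (2, N): sources up to scale/order
```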


2015, Vol 3 (1)
Author(s): Cesar Villamizar, Ailin Orjuela, Marco Adarme

Forensic analysis consists of determining the causes of the compromise of a system's security. General rules and principles are currently established by bodies such as the International Organization on Computer Evidence (IOCE). The aim of this study was to characterize Colombian legislation with respect to the specific regulations needed to design computer-forensic techniques for extracting digital evidence and anchoring the chain of custody. A descriptive, documentary and applied research design was used, analysing different sources on information systems and on the integrity, confidentiality and availability of data under judicial custody. Current regulations support the use of computer-forensic techniques to extract digital evidence and secure the chain of custody, based on the constitutional protection of the right to privacy, so individual freedom must be respected and the other guarantees promoted. The regulations also rely on Law 527 of August 18, 1999, which governs data messages, magnetic and computer media, and electronic commerce in Colombia, and on Law 1273 of 2009, which protects information and criminalizes computer crimes.

