Image Embedding in Audio with a Gaussian-Distributed PN Code (Penyisipan Citra pada Audio dengan Kode PN Terdistribusi Gaussian)

Author(s):  
GELAR BUDIMAN ◽  
SUCI AULIA ◽  
I NYOMAN APRAZ RAMATRYANA

ABSTRACT: In this paper, an audio watermarking scheme is designed around a pseudo-noise (PN) code with a Gaussian (normal) distribution, using an image as the watermark embedded in the audio. The watermark, a binary image, is converted into a 1-dimensional vector and then combined with the normally distributed PN code, which is shaped by a psychoacoustic filter. The resulting signal is multiplied by a gain factor α before being added to the host audio to produce the watermarked audio. Simulation results show that the system achieves a high watermark capacity of 689.06 bps, good imperceptibility with SNR > 26 dB, and robustness against LPF attacks with cut-off frequencies of 6 kHz and above, additive-noise attacks at 40 dB and above, resampling at 16 kHz, LSC of 1%–10%, and MP3 compression at 192 kbps.
Keywords: Audio Watermarking, PN code, normal distribution, psychoacoustic filter
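The additive spread-spectrum step this abstract describes can be sketched as follows. This is a minimal illustration only: the Gaussian PN sequence, gain α = 0.01, chip rate, and toy signals are assumptions, and the psychoacoustic filtering stage is omitted.

```python
import numpy as np

def embed_watermark(host, bits, alpha=0.01, chip_rate=64, seed=0):
    """Spread each watermark bit over `chip_rate` samples using a
    normally distributed (Gaussian) PN sequence, scale by alpha,
    and add the result to the host audio."""
    rng = np.random.default_rng(seed)
    pn = rng.standard_normal(chip_rate)      # Gaussian PN code
    symbols = 2 * np.asarray(bits) - 1       # map {0,1} -> {-1,+1}
    wm = np.kron(symbols, pn)                # spread each bit over the chips
    out = host.copy()
    out[:wm.size] += alpha * wm
    return out

def extract_watermark(watermarked, host, n_bits, chip_rate=64, seed=0):
    """Informed detector: correlate the residual with the same PN
    sequence and take the sign of each per-bit correlation."""
    rng = np.random.default_rng(seed)
    pn = rng.standard_normal(chip_rate)
    resid = (watermarked - host)[:n_bits * chip_rate].reshape(n_bits, chip_rate)
    return (resid @ pn > 0).astype(int)

host = np.random.default_rng(1).standard_normal(8000)  # stand-in host audio
bits = [1, 0, 1, 1, 0, 0, 1, 0]
wm_audio = embed_watermark(host, bits)
print(extract_watermark(wm_audio, host, len(bits)))    # recovers the bits
```

Because the detector here has access to the original host, recovery is exact; a blind detector would correlate against the watermarked signal directly and rely on the PN code's near-orthogonality to the host.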

Author(s):  
IRMA SAFITRI ◽  
NUR IBRAHIM ◽  
HERLAMBANG YOGASWARA

ABSTRACT: This research develops a Compressive Sensing (CS) technique for audio watermarking using the Lifting Wavelet Transform (LWT) and Quantization Index Modulation (QIM) methods. LWT is a technique that decomposes a signal into 2 sub-bands, a low and a high sub-band. QIM is a computationally efficient watermarking method that uses side information. Audio watermarking was performed on 10-second audio files in *.wav format across 4 genres of music: pop, classic, rock, and metal. The embedded watermarks were black-and-white images in *.bmp format, measuring 32x32 and 64x64 pixels respectively. Performance was evaluated by measuring SNR, ODG, BER, and PSNR. The robustness of the watermarked audio was tested against 7 kinds of attacks, including LPF, BPF, HPF, MP3 compression, noise, and echo. The optimal result was an SNR of 85.32 dB, an ODG of -8.34×10⁻¹¹, a BER of 0, and a PSNR of ∞.
Keywords: Audio watermarking, QIM, LWT, Compressive Sensing.
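The QIM embedding this abstract relies on can be sketched as scalar quantization onto two offset lattices. This is a minimal illustration: the step size delta = 0.1 is an assumed value, and the LWT decomposition that would supply the coefficients is omitted.

```python
import numpy as np

def qim_embed(coeffs, bits, delta=0.1):
    """Embed one bit per coefficient: quantize to an integer multiple
    of delta for bit 0, to a half-integer multiple for bit 1."""
    c = np.asarray(coeffs, dtype=float)
    b = np.asarray(bits)
    q = np.round(c / delta - b / 2)       # nearest point on the
    return (q + b / 2) * delta            # bit-dependent lattice

def qim_extract(coeffs, delta=0.1):
    """Decide each bit by which of the two offset lattices is closer."""
    c = np.asarray(coeffs, dtype=float)
    d0 = np.abs(c - np.round(c / delta) * delta)
    d1 = np.abs(c - (np.round(c / delta - 0.5) + 0.5) * delta)
    return (d1 < d0).astype(int)

rng = np.random.default_rng(0)
coeffs = rng.standard_normal(8)           # stand-in transform coefficients
bits = rng.integers(0, 2, 8)
marked = qim_embed(coeffs, bits)
print(qim_extract(marked), bits)          # recovered bits match embedded bits
```

The embedding distortion is bounded by delta/2 per coefficient, which is the usual QIM trade-off between imperceptibility (small delta) and robustness (large delta).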


2021 ◽  
Vol 14 (11) ◽  
pp. 540
Author(s):  
Eyden Samunderu ◽  
Yvonne T. Murahwa

Developments in the world of finance have led the authors to assess the adequacy of relying on normal distribution assumptions alone when measuring risk. Cushioning against risk has always created a plethora of complexities and challenges; hence, this paper analyses the statistical properties of various risk measures under non-normal distributions and provides a financial blueprint for managing risk. The premise is that relying on the old assumption of normality alone is inaccurate and has led to models that misstate risk. Our empirical design first examined an overview of the use of returns in measuring risk and an assessment of the current financial environment. As an alternative to conventional measures, the paper employs a mosaic of risk techniques, reflecting the fact that there is no single universal risk measure. The next step examined the current risk proxies in use, such as the Gaussian-based value at risk (VaR) measure. The authors then analysed multiple alternative approaches that do not rely on the normality assumption, such as other variations of VaR, as well as econometric models that can be used in risk measurement and forecasting. VaR is a widely used measure of financial risk which provides a way of quantifying and managing the risk of a portfolio, and is arguably the most important tool for evaluating market risk as one of several threats to the global financial system. Following an extensive literature review, a data set composed of three main asset classes was used: bonds, equities and hedge funds. The first step was to determine to what extent returns are not normally distributed. After testing the hypothesis, it was found that the majority of returns are not normally distributed, instead exhibiting skewness and kurtosis above or below three.
The study then applied various VaR methods in order to determine the most efficient ones. Different timelines were used to carry out stressed VaR calculations, and it was seen that during periods of crisis the volatility of asset returns was higher. Subsequent steps examined the relationships between the variables, with correlation tests and time series analysis leading to forecasts of the returns. It was noted that these methods cannot be used in isolation. We therefore adopted a mosaic of all the VaR methods, which included studying the behaviour of assets and their relation to each other. We also examined the environment as a whole and then applied forecasting models to value returns accurately; this gave a much more accurate and relevant risk measure than the initial assumption of normality.
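The contrast the study draws between Gaussian (parametric) VaR and non-parametric alternatives can be illustrated with a short sketch. The simulated fat-tailed returns, the 99% confidence level, and the hard-coded z-score are assumptions for illustration, not the paper's data or exact methods.

```python
import numpy as np

def gaussian_var(returns):
    """Parametric 99% VaR under the normality assumption: -(mu - z*sigma)."""
    z99 = 2.3263478740408408      # Phi^{-1}(0.99), hard-coded to avoid scipy
    mu, sigma = returns.mean(), returns.std(ddof=1)
    return z99 * sigma - mu

def historical_var(returns):
    """Non-parametric (historical-simulation) 99% VaR: the empirical
    1st-percentile loss of the observed return series."""
    return -np.quantile(returns, 0.01)

rng = np.random.default_rng(0)
# Student-t returns: same finite variance, fatter tails than a normal
fat_tailed = rng.standard_t(df=5, size=100_000) * 0.01
print(gaussian_var(fat_tailed), historical_var(fat_tailed))
```

For fat-tailed data like these, the historical estimate exceeds the Gaussian one: assuming normality understates tail risk, which is exactly the paper's motivation for a mosaic of measures.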


2021 ◽  
Vol 10 (1) ◽  
pp. 85-93
Author(s):  
Ubudia Hiliaily Chairunnnisa ◽  
Abdul Hoyyi ◽  
Hasbi Yasin

The basic assumption often used in bond valuation is that of the Black-Scholes model. In practice, the Black-Scholes model assumes normally distributed asset returns, but in reality a company's asset returns are frequently not normally distributed, which leads to improperly specified bond valuation models. The Fast Fourier Transform (FFT) model was developed as a solution to this problem. The FFT model is a Fourier transform technique with high accuracy, and it is more efficient because it works with characteristic functions. In this research, a model is constructed to compute bond valuations in a way that exploits the computational power of the FFT. The characteristic function used is that of the Variance Gamma process, whose advantage is its ability to capture return behaviour that is not normally distributed. The data used in this study are Sustainable Bonds I of Bank Danamon Phase I Year 2019 Series B, Sustainable Bonds II of Bank CIMB Niaga II Phase IV Year 2018 Series C, and Sustainable Subordinated Bonds II of Bank UOB Indonesia Phase II 2019. The results show that the FFT model using the Variance Gamma characteristic function gives more precise results for asset returns that are not normally distributed.
Keywords: Bonds, Bond Valuation, Black-Scholes, Fast-Fourier Transform, Variance Gamma
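A hedged sketch of the core computation: inverting the Variance Gamma characteristic function to a density with an FFT. The parameter values (sigma, nu, theta) and the grid sizes are illustrative assumptions, not values calibrated to the bonds above.

```python
import numpy as np

def vg_charfn(u, sigma=0.2, nu=0.5, theta=-0.1):
    """Variance Gamma characteristic function at unit time."""
    return (1 - 1j * u * theta * nu + 0.5 * sigma**2 * nu * u**2) ** (-1 / nu)

def density_via_fft(charfn, n=4096, du=0.05):
    """Recover a probability density from its characteristic function by
    discretizing f(x) = (1/pi) * Re ∫_0^∞ e^{-iux} φ(u) du with an FFT."""
    u = np.arange(n) * du
    dx = 2 * np.pi / (n * du)                 # Nyquist-type grid relation
    x = (np.arange(n) - n / 2) * dx           # output grid centered at 0
    w = np.ones(n); w[0] = 0.5                # trapezoid weight at u = 0
    # the (-1)^j factor shifts the output grid so x = 0 lands mid-array
    integrand = charfn(u) * w * (-1.0) ** np.arange(n)
    f = np.fft.fft(integrand).real * du / np.pi
    return x, f

x, f = density_via_fft(vg_charfn)
dx = x[1] - x[0]
print(f.sum() * dx)                           # total mass, close to 1
```

Pricing methods in the Carr-Madan family apply the same machinery to a damped option payoff transform rather than the raw density; the grid relation dx·du = 2π/n is what makes a single FFT evaluate all strikes (here, all x values) at once.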


2019 ◽  
Vol 43 (2) ◽  
pp. 277-281 ◽  
Author(s):  
R. Magdeev ◽  
Al. Tashlinskii

In this paper, a comparative analysis of the correlation-extreme method, the contour analysis method and the stochastic gradient identification method is carried out for identifying objects in binary images. The results are obtained for the situation where the possible deformations of an identified object with respect to a template can be reduced to a similarity model; that is, the template and the object may differ in scale, orientation angle, shift along the base axes, and additive noise. Identification of an object is understood as recognition of its image together with an estimate of its deformation parameters relative to the template.
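The correlation-extreme idea can be sketched for the shift component of the similarity model alone; handling of rotation, scale, and additive noise is omitted, and the toy binary image is an assumption.

```python
import numpy as np

def estimate_shift(image, template):
    """Locate a template in a binary image by finding the peak (the
    'correlation extremum') of the FFT-based cross-correlation surface."""
    corr = np.fft.ifft2(
        np.fft.fft2(image) * np.conj(np.fft.fft2(template, s=image.shape))
    ).real
    return np.unravel_index(np.argmax(corr), corr.shape)

# Toy example: a 5x5 square placed at row 12, column 20 of a 64x64 image
img = np.zeros((64, 64))
img[12:17, 20:25] = 1
tpl = np.ones((5, 5))
print(estimate_shift(img, tpl))   # recovers the (row, col) shift (12, 20)
```

A full similarity-model search would repeat this over a grid of rotations and scales, which is what makes the correlation-extreme method expensive compared with gradient-based identification.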


2021 ◽  
Vol 2021 ◽  
pp. 1-13
Author(s):  
Payam Amini ◽  
Abbas Moghimbeigi ◽  
Farid Zayeri ◽  
Leili Tapak ◽  
Saman Maroufizadeh ◽  
...  

Associated longitudinal response variables are subject to variation caused by repeated measurements over time, along with the association between the responses. When modelling a longitudinal ordinal outcome with generalized linear mixed models, integrating over a normally distributed random intercept in the proportional-odds ordinal logistic regression does not yield a closed form. In this paper, we jointly modelled a longitudinal count and an ordinal response variable, assuming a Bridge distribution for the random intercept in the ordinal logistic regression submodel, and compared the results to those obtained with a normal distribution. The two associated response variables are linked through correlated random intercepts: the random intercept in the count submodel follows a normal distribution, while the random intercept in the ordinal submodel follows a Bridge distribution. Estimation was carried out using a likelihood-based approach, in both direct and conditional joint modelling formulations. To illustrate the performance of the model, a simulation study was conducted. Based on the simulation results, assuming a Bridge distribution for the random intercept of the ordinal logistic regression yields accurate estimates even when the random intercept is actually normally distributed. Moreover, accounting for the association between the longitudinal count and ordinal responses produced estimates with lower standard errors than univariate analysis. In addition to giving the parameters the same interpretation in the marginal and conditional estimates, the Bridge-distribution assumption for the random intercept of the ordinal logistic regression yielded more efficient estimates than the normal distribution.
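The ordinal submodel discussed here can be sketched as a small simulation. The cutpoints, slope, and the use of a normal random intercept are illustrative choices; sampling from an actual Bridge distribution (the paper's key ingredient, which keeps the marginal model logistic) is omitted.

```python
import numpy as np

def simulate_ordinal(n_subjects=500, n_times=4, beta=1.0,
                     cuts=(-1.0, 1.0), sd_b=1.0, seed=0):
    """Simulate longitudinal 3-category ordinal data from a
    proportional-odds model with a subject-level random intercept b_i:
    P(Y_it <= k) = logistic(cut_k - beta * x_it - b_i)."""
    rng = np.random.default_rng(seed)
    b = rng.normal(0, sd_b, n_subjects)[:, None]   # random intercepts
    x = rng.normal(size=(n_subjects, n_times))     # time-varying covariate
    eta = beta * x + b
    # cumulative probabilities P(Y <= k), shape (n_cuts, subjects, times)
    p_le = 1 / (1 + np.exp(-(np.array(cuts)[:, None, None] - eta)))
    u = rng.uniform(size=(n_subjects, n_times))
    y = (u[None] > p_le).sum(axis=0)               # category 0, 1 or 2
    return x, y

x, y = simulate_ordinal()
print(np.bincount(y.ravel()))                      # counts per category
```

Fitting this model by maximum likelihood requires integrating b_i out numerically when it is normal; replacing the normal with a Bridge distribution makes that integral available in closed form, which is the efficiency the abstract reports.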


Blood ◽  
2005 ◽  
Vol 106 (11) ◽  
pp. 4025-4025
Author(s):  
Lisa Wakeman ◽  
Roger Munro ◽  
Nick Dorward ◽  
Ann Benton ◽  
Andy Gibb ◽  
...  

Abstract Reference ranges (RRs) in coagulation are applicable only to specific analyser and reagent combinations and frequently need to be re-established if any of these are changed. In no other sphere of clinical laboratory practice are RRs more affected by such a wide range of demographic and pre-analytical variables. For most routine clinical laboratories, therefore, the collection of multiple, separate RRs is not feasible, so a representative group of healthy adults such as laboratory staff frequently constitutes the reference population from which these limits are calculated. Early morning venous samples were collected into glass B-D Vacutainers (Ref: 367691) from 221 healthy laboratory personnel (F = 159; M = 62) aged 20–63 yrs for both genders. Age groups were equally represented. Samples were processed on a Sysmex CA-1500 analyser within 1 hour of collection. Appropriate NCCLS guidelines were followed throughout. Reagents employed were: Actin FSL (APTT); Innovin (PT); Dade-Behring reference, calibration and deficient plasmas (factor assays); Dade-Behring kit ref: OWWR15 (ATIII); Chromogenix kit ref: 82209863 (Protein C). Outliers were excluded, the data were examined for normal distribution using histograms, and significance levels were calculated with the Anderson-Darling test of normality. RRs for normally distributed parameters were calculated as the mean ± 2 SD. RRs for non-normally distributed parameters were calculated by taking a natural-log transformation and the antilog of the 2.5th and 97.5th percentiles. Parameters with an Anderson-Darling p-value below 0.05 in the table below are non-normally distributed.
Parameter           Reference Range   Anderson-Darling p   M vs F p-value (* = significant;
                                                           t-test where normally distributed)
PT (sec)            10.0 – 11.8       <0.005               0.003*
APTT (sec)          24.7 – 31.7       0.006                0.232
TCT (sec)           13.8 – 17.4       0.035                0.198
Fib (g/L, Clauss)   1.6 – 4.2         0.190                t-test not significant
Fib (g/L, derived)  2.1 – 4.9         0.200                t-test not significant
II (%)              82 – 133          <0.005               0.019*
V (%)               70 – 150          0.021                0.303
VII (%)             60 – 164          0.008                0.037*
X (%)               75 – 147          0.539                t-test not significant
VIII (%)            48 – 204          <0.005               0.520
IX (%)              65 – 142          <0.005               0.275
XI (%)              61 – 142          <0.005               0.394
XII (%)             59 – 133          0.088                t-test not significant
Protein C (%)       75 – 160          0.036                0.024*
ATIII (%)           86 – 128          0.329                t-test not significant

Kruskal-Wallis tests on our data indicate that all coagulation factors are positively associated with age except factors IX and XII. A significant difference (p=0.014) in factor VIIIc was found between blood group O and non-group-O subjects. Significant correlation was found between declining APTTs and associated increasing factor VIIIc when measured in individual volunteers.
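The two reference-range recipes described in the abstract can be sketched directly. The simulated APTT values and the skewed series below are hypothetical stand-ins for the 221 volunteer results.

```python
import numpy as np

def reference_range(values, normal=True):
    """Mean +/- 2 SD for normally distributed parameters; otherwise the
    antilog of the 2.5th and 97.5th percentiles of ln(values)."""
    v = np.asarray(values, dtype=float)
    if normal:
        m, s = v.mean(), v.std(ddof=1)
        return m - 2 * s, m + 2 * s
    logs = np.log(v)
    lo, hi = np.percentile(logs, [2.5, 97.5])
    return np.exp(lo), np.exp(hi)

rng = np.random.default_rng(0)
aptt = rng.normal(28.2, 1.75, 221)     # hypothetical APTT results (sec)
print(reference_range(aptt))           # roughly (24.7, 31.7), as in the table
skewed = rng.lognormal(3, 0.3, 221)    # hypothetical right-skewed parameter
print(reference_range(skewed, normal=False))
```

The log-transform branch matters because applying mean ± 2 SD to skewed data produces limits that are too wide on one side and can even go negative for strictly positive quantities.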


1980 ◽  
Vol 17 (4) ◽  
pp. 1108-1113 ◽  
Author(s):  
Gunnar Englund

It is well known that the number of renewals in the time interval [0, t] for an ordinary renewal process is approximately normally distributed under general conditions. We give a remainder term estimate for this normal distribution approximation.
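The normal approximation in question can be checked by simulation: for inter-arrival mean mu and variance sigma^2, N(t) is approximately Normal(t/mu, t*sigma^2/mu^3). Exponential inter-arrivals and the horizon t below are illustrative choices, not part of the paper.

```python
import numpy as np

def count_renewals(t, sampler, rng):
    """Count renewals in [0, t] for i.i.d. positive inter-arrival times."""
    total, n = 0.0, 0
    while True:
        total += sampler(rng)
        if total > t:
            return n
        n += 1

rng = np.random.default_rng(0)
t, mu = 200.0, 2.0   # Exp(mean 2) inter-arrivals: sigma^2 = mu^2 = 4
counts = np.array([count_renewals(t, lambda r: r.exponential(mu), rng)
                   for _ in range(5000)])
# CLT for renewal processes: mean ~ t/mu = 100, var ~ t*sigma^2/mu^3 = 100
print(counts.mean(), counts.var())
```

For the exponential case N(t) is exactly Poisson(t/mu), so mean and variance coincide; the remainder-term estimate in the article quantifies how fast such counts approach normality for general inter-arrival laws.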

