Gaussian Filtering Method to Remove Noise in Images

2021 ◽  
Vol 10 (1) ◽  
pp. 53
Author(s):  
I Dewa Gede Rama Satya ◽  
I Made Widiartha

Capturing every moment has become commonplace in this era. One way to capture a moment is with a photo, but the results are often unsatisfactory, and noise is one of the many causes. Noise is a disturbance introduced during digital data storage or by the image data receiver that degrades image quality. It can be caused by physical (optical) faults in the capture device, such as dust on the camera lens, or by improper processing. Various methods can remove this noise, and Gaussian filtering is one of them. In this research we implement it in Matlab. The input files are photos in JPG format with noise levels above 75%. After image processing, the initially noisy images are clearer and the noise is reduced.
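The abstract's implementation is in Matlab, but the idea behind Gaussian filtering is compact enough to sketch in pure Python: build a normalized Gaussian kernel and convolve it with the image so each pixel becomes a weighted average of its neighbourhood. This is a minimal illustrative sketch, not the authors' code; the kernel size, sigma, and border handling (clamping) are assumptions.

```python
import math

def gaussian_kernel(size=3, sigma=1.0):
    """Build a normalized 2-D Gaussian kernel (weights sum to 1)."""
    half = size // 2
    kernel = [[math.exp(-(x * x + y * y) / (2 * sigma * sigma))
               for x in range(-half, half + 1)]
              for y in range(-half, half + 1)]
    total = sum(sum(row) for row in kernel)
    return [[v / total for v in row] for row in kernel]

def gaussian_filter(image, size=3, sigma=1.0):
    """Convolve a grayscale image (list of lists) with a Gaussian kernel.
    Border pixels are handled by clamping coordinates to the image edge."""
    kernel = gaussian_kernel(size, sigma)
    h, w = len(image), len(image[0])
    half = size // 2
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            acc = 0.0
            for ki in range(size):
                for kj in range(size):
                    ii = min(max(i + ki - half, 0), h - 1)
                    jj = min(max(j + kj - half, 0), w - 1)
                    acc += image[ii][jj] * kernel[ki][kj]
            out[i][j] = acc
    return out

# A flat gray patch with one noisy "salt" pixel: the filter pulls the
# outlier back toward its neighbours while leaving flat regions alone.
noisy = [[100] * 5 for _ in range(5)]
noisy[2][2] = 255
smoothed = gaussian_filter(noisy, size=3, sigma=1.0)
```

The same smoothing that suppresses the noisy pixel also blurs genuine edges slightly, which is the usual trade-off of Gaussian filtering.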

2016 ◽  
Vol 3 (2) ◽  
pp. 207-216
Author(s):  
Jani Kusanti ◽  
Yusuf Zain Santosa

Identification of malaria parasites in red blood cells has been carried out, with the aim of providing a tool that helps microscopy experts identify parasites more quickly. This study compared the accuracy of identifying and classifying parasites based on shape patterns versus texture patterns. The steps taken were an image quality improvement process, segmentation with the Otsu method, and feature extraction on the image data to be tested, followed by recognition of shape and texture patterns. The final step was to identify and classify the Plasmodium falciparum parasite into 12 classes using Learning Vector Quantization (LVQ). The results indicate that shape patterns provide higher accuracy than texture patterns: LVQ with shape-pattern input correctly identified 91% of the image data, while texture-pattern input correctly identified 48%.
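The Otsu segmentation step named above picks a global threshold that maximizes the between-class variance of the gray-level histogram. A minimal pure-Python sketch of the method (the toy pixel values standing in for a dark background and bright parasites are invented for illustration):

```python
def otsu_threshold(pixels, levels=256):
    """Return the intensity threshold that maximizes between-class
    variance (Otsu's method) for a flat list of integer gray levels."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * hist[i] for i in range(levels))
    sum_bg, weight_bg = 0.0, 0
    best_t, best_var = 0, -1.0
    for t in range(levels):
        weight_bg += hist[t]
        if weight_bg == 0:
            continue                      # no background pixels yet
        weight_fg = total - weight_bg
        if weight_fg == 0:
            break                         # no foreground pixels left
        sum_bg += t * hist[t]
        mean_bg = sum_bg / weight_bg
        mean_fg = (sum_all - sum_bg) / weight_fg
        var_between = weight_bg * weight_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Bimodal toy data: background around 50, bright objects around 200.
pixels = [50, 52, 48, 51, 49, 200, 198, 202, 201]
t = otsu_threshold(pixels)
mask = [1 if p > t else 0 for p in pixels]  # binary segmentation mask
```

The resulting binary mask is what the shape- and texture-feature extraction stages would then operate on.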


Author(s):  
Roger Clarke

The last thirty years of computing have left many people heavily dependent on digital data. Meanwhile, there has been a significant change in the patterns of data storage and of processing. Despite the many risks involved in data management, there is a dearth of guidance and support for individuals and small organisations. A generic risk assessment is presented, resulting in practicable backup plans applicable to the needs of those categories of IT user.


Author(s):  
SANTHOSH KUMAR R ◽  
CYRIL PRASANNA RAJ ◽  
Y. MANJULA ◽  
M.Z. KURIAN

With the rapid growth of digital data exchange, information security has become much more important in data storage and transmission. Due to the increasing use of images in industrial processes, it is essential to protect confidential image data from unauthorized access, and better identification of which data is relevant to human perception at higher compression ratios is needed. In this DWT-AES processor, a reconfigurable secure image coding scheme is proposed. Its prominent feature is partial encryption with key lengths of 128, 192, or 256 bits, while maintaining a considerable security level. This paper presents an FPGA implementation of the AES algorithm; linking the two designs yields secure image coding.
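The core idea of partial encryption is to cipher only the perceptually significant portion of the coded image (e.g. the DWT approximation coefficients) and leave the rest in the clear, cutting the encryption workload. The sketch below illustrates that structure only: the paper's cipher is AES, which is not in the Python standard library, so a SHA-256 counter-mode keystream is used here as a stand-in, and the "significant"/"detail" byte strings are invented placeholders rather than real DWT output.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Illustrative keystream (SHA-256 in counter mode). Stand-in for the
    AES cipher with a 128/192/256-bit key used in the actual design."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def partial_encrypt(significant: bytes, detail: bytes, key: bytes):
    """Encrypt only the perceptually significant part of the bitstream
    (e.g. DWT approximation coefficients); detail stays in the clear."""
    ks = keystream(key, len(significant))
    enc = bytes(a ^ b for a, b in zip(significant, ks))
    return enc, detail

key = b"0123456789abcdef0123456789abcdef"   # 256-bit key (hypothetical)
approx = b"low-frequency coefficients"       # placeholder significant data
detail = b"high-frequency detail"            # placeholder clear data
enc, clear = partial_encrypt(approx, detail, key)
dec, _ = partial_encrypt(enc, detail, key)   # XOR keystream is its own inverse
```

Because only the significant coefficients are ciphered, the unencrypted detail bands alone do not reconstruct a legible image, which is the security argument behind selective encryption.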


Author(s):  
Richard S. Chemock

One of the most common tasks in a typical analysis lab is the recording of images. Many analytical techniques (TEM, SEM, and metallography, for example) produce images as their primary output. Until recently, the most common method of recording images was film. Current PS/2® systems offer very large capacity data storage devices and high resolution displays, making it practical to work with analytical images on PS/2s, thereby sidestepping the traditional film and darkroom steps. This change in operational mode offers many benefits: cost savings, throughput, archiving and searching capabilities, as well as direct incorporation of the image data into reports.

The conventional way to record images involves film, either sheet film (with its associated wet chemistry) for TEM or Polaroid® film for SEM and light microscopy. Although film is inconvenient, it has the highest quality of all available image recording techniques. The fine-grained film used for TEM has a resolution that would exceed a 4096x4096x16-bit digital image.
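To put the "very large capacity" storage remark in perspective, the raw size of a single uncompressed 4096x4096x16-bit image can be worked out directly:

```python
# Raw storage for one uncompressed 4096 x 4096 image at 16 bits/pixel.
width, height = 4096, 4096
bits_per_pixel = 16
size_bytes = width * height * bits_per_pixel // 8  # 33,554,432 bytes
size_mib = size_bytes / 2**20                      # 32.0 MiB per image
```

At 32 MiB per frame, even a modest archive of analytical images quickly justifies the large-capacity storage devices mentioned above.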


2019 ◽  
Vol 2019 (1) ◽  
pp. 360-368
Author(s):  
Mekides Assefa Abebe ◽  
Jon Yngve Hardeberg

Different whiteboard image degradations greatly reduce the legibility of pen-stroke content as well as the overall quality of the images. Consequently, researchers have addressed the problem through various image enhancement techniques. Most state-of-the-art approaches apply common image processing techniques such as background-foreground segmentation, text extraction, contrast and color enhancement, and white balancing. However, such conventional enhancement methods are incapable of recovering severely degraded pen-stroke content and produce artifacts in the presence of complex pen-stroke illustrations. To surmount these problems, the authors propose a deep learning based solution: they contribute a new whiteboard image data set and adopt two deep convolutional neural network architectures for whiteboard image quality enhancement. Their evaluations of the trained models demonstrate superior performance over conventional methods.


2018 ◽  
Vol 6 (3) ◽  
pp. 359-363
Author(s):  
A. Saxena ◽  
◽  
S. Sharma ◽  
S. Dangi ◽  
A. Sharma ◽  
...  

Author(s):  
Ye. Didenko ◽  
O. Stepanenko

One indicator of effective artillery use is the accuracy of fire on enemy objects. Accuracy is achieved by completing all measures for the preparation of shooting and fire control. The main measures of ballistic preparation are to determine and take into account the summary deviation of the initial (muzzle) velocity. The existing procedure for determining this deviation for the check (main) cannon of a battery leads to an accumulation of ballistic preparation errors, and artillery units are insufficiently supplied with means of determining the initial speed of the projectile. Among the many known methods for measuring the initial velocity, insufficient attention has been paid to analyzing the processes that occur during a shot in the "charge-shell-barrel" system. Under the pressure of the powder gases in the barrel channel and the forces of interaction between the projectile and the barrel, elastic deformations arise in the radial direction. Strain gauge sensors are well suited to measuring these deformations: under elastic deformation the sensor's resistance varies in proportion to its value, so monitoring radial deformation over time reveals the moment the projectile passes a gauge mounted on the outer surface of the barrel. The speed of the shell (mine) in the barrel can then be determined from the time between signal pulses of strain gauges located a known distance apart. The simplicity of the proposed method makes it possible to equip each cannon (mortar) with autonomous means of measuring the initial velocity, and with the simultaneous introduction of automatic fire control systems the measurement results can be taken into account automatically.
This will change the existing procedure for determining the total deviation of the initial velocity and improve the accuracy, timeliness, and suddenness of opening artillery fire, which are components of its efficiency.
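The velocity calculation from the two strain-gauge pulses reduces to distance over time difference. A minimal sketch, with the gauge spacing and pulse timestamps invented for illustration:

```python
def muzzle_velocity(gauge_spacing_m: float, t1_s: float, t2_s: float) -> float:
    """Average projectile speed between two strain gauges mounted a known
    distance apart on the barrel: v = d / (t2 - t1)."""
    dt = t2_s - t1_s
    if dt <= 0:
        raise ValueError("second pulse must arrive after the first")
    return gauge_spacing_m / dt

# Hypothetical numbers: gauges 0.5 m apart, pulses 625 microseconds apart.
v = muzzle_velocity(0.5, 0.0, 0.000625)  # 800 m/s
```

In practice the accuracy of this estimate depends on the timing resolution of the pulse detection, since the flight time between gauges is well under a millisecond at typical muzzle velocities.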

