Design and Implementation of RS(450, 406) Decoder

Author(s):  
Akhilesh Yadav ◽  
Poonam Jindal ◽  
Devaraju Basappa

Nowadays, Reed-Solomon codes are used very frequently for data transmission between a transmitter and a receiver. FEC codes involve two principal operations: (1) computing parity symbols at the encoder and (2) transmitting the message symbols together with the parity symbols and decoding the received codeword at the receiver using a decoding algorithm. Gigabit Automotive Ethernet is used in vehicles to provide higher bandwidth for many kinds of applications and to connect the functional components of the vehicle. This error correction technique is used in Gigabit Automotive Ethernet to mitigate channel noise during data transmission. RS(450, 406) is a powerful error correction technique used in Automotive Ethernet. This paper focuses only on the analysis of Reed-Solomon decoding. Reed-Solomon decoding is an efficient technique for correcting both burst and random errors. The critical step of Reed-Solomon decoding is solving for the error evaluator and error locator polynomials, a step also known as the key equation solver (KES).
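The decoding steps named above (syndromes, then the key equation, then correction) can be sketched with a toy single-error RS decoder. This is an illustrative sketch, not the paper's design: it works over GF(256) rather than GF(512), and it solves the single-error key equation by inspection, whereas a full decoder such as the one for RS(450, 406) uses a KES algorithm like Berlekamp-Massey.

```python
# Toy single-error Reed-Solomon decode over GF(256): syndromes -> key
# equation -> correction. Assumed field/parameters, not the paper's RTL.
PRIM = 0x11d                      # x^8+x^4+x^3+x^2+1, primitive over GF(2)
EXP, LOG = [0] * 512, [0] * 256
v = 1
for i in range(255):
    EXP[i], LOG[v] = v, i
    v <<= 1
    if v & 0x100:
        v ^= PRIM
for i in range(255, 512):         # duplicate table to skip a modulo in mul
    EXP[i] = EXP[i - 255]

def gf_mul(a, b):
    return 0 if a == 0 or b == 0 else EXP[LOG[a] + LOG[b]]

def gf_div(a, b):
    return 0 if a == 0 else EXP[(LOG[a] - LOG[b]) % 255]

def poly_mul(p, q):               # coefficients highest degree first
    r = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] ^= gf_mul(a, b)
    return r

def poly_eval(p, x):              # Horner's rule
    y = 0
    for c in p:
        y = gf_mul(y, x) ^ c
    return y

g = [1]
for i in range(4):                # generator with roots alpha^0..alpha^3 -> t = 2
    g = poly_mul(g, [1, EXP[i]])
codeword = poly_mul([7, 0, 0, 1, 2], g)   # any multiple of g(x) is a codeword

received = list(codeword)
received[3] ^= 0x55               # inject a single symbol error

S = [poly_eval(received, EXP[j]) for j in range(4)]   # syndromes S_0..S_3
assert any(S)                     # nonzero syndromes: an error is present
# A single error e at term degree k gives S_j = e * alpha^(j*k), so:
k = LOG[gf_div(S[1], S[0])]       # error location (degree of errored term)
e = S[0]                          # error magnitude
received[len(received) - 1 - k] ^= e
assert received == codeword       # corrected
```

With two or more errors the ratio trick above no longer works; that is exactly the case the KES (e.g. Berlekamp-Massey) handles by building the locator polynomial iteratively from all the syndromes.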

Error correction and detection during data transmission is a major issue, and many error correction techniques are available to address it. Reed-Solomon coding is the most powerful forward error correction technique used in Gigabit Automotive Ethernet to combat channel noise during data transmission. Cars become smarter day by day, and more advanced electronics are being used in-vehicle. Gigabit Automotive Ethernet (1000BASE-T1) provides high bandwidth for many kinds of applications and connects different functional parts of the car. Reed-Solomon (RS) coding is the powerful forward error correction (FEC) technique used in 1000BASE-T1 Automotive Ethernet. RS(450, 406) is also known as a shortened Reed-Solomon code. RS codes are generally used in communication systems due to their ability to correct both random and burst errors. Reed-Solomon codes are non-binary systematic linear block codes, widely used in high-speed communication systems, and are implemented using Galois field (GF) arithmetic. Automotive Ethernet is encoded using the RS(450, 406) code over GF(512) for FEC. This RS code can correct errors in up to t = 22 symbols, whereas binary coding techniques correct errors in t bits. In this paper we implement the RS (Reed-Solomon) code in Verilog using Cadence NCSim and use Cadence SimVision to display the timing diagrams. This RS code is a 9-bit-symbol shortened (450, 406) code.
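The figures quoted above (GF(512), t = 22 symbols, symbol-level burst tolerance) follow from the code parameters. The sketch below builds GF(2^9) tables and derives them; the primitive trinomial x^9 + x^4 + 1 is an assumed common choice for illustration, since the 1000BASE-T1 specification fixes its own field representation.

```python
# GF(512) arithmetic and RS(450, 406) capability, derived from parameters
# alone (assumed primitive polynomial; not taken from the standard's text).
M = 9
PRIM = 0x211                    # x^9 + x^4 + 1
exp = [0] * (2 ** M)
log = [0] * (2 ** M)
v = 1
for i in range(2 ** M - 1):
    exp[i], log[v] = v, i
    v <<= 1
    if v & (1 << M):
        v ^= PRIM
# All 511 nonzero elements reached: the polynomial really is primitive.
assert len(set(exp[:511])) == 511

def gf_mul(a, b):               # multiplication of 9-bit symbols
    return 0 if a == 0 or b == 0 else exp[(log[a] + log[b]) % 511]

n, k = 450, 406                 # shortened code over GF(512)
t = (n - k) // 2                # correctable 9-bit symbols per codeword
burst_bits = (t - 1) * M + 1    # a bit-burst this short spans at most t symbols
print(t, burst_bits)            # -> 22 190
```

This is why symbol-based correction beats bit-based correction for bursts: one corrupted 9-bit symbol counts as a single error regardless of how many of its bits flipped.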


2010 ◽  
Vol 44-47 ◽  
pp. 1070-1074
Author(s):  
Xi Su ◽  
Peng Bai ◽  
Yan Ping Feng ◽  
Yuan Yuan Wu

As the high-speed spreading scheme adopted by the IEEE 802.11b standard, Complementary Code Keying (CCK) modulation performs well in error probability only at high signal-to-noise ratios. However, when the SNR is comparatively low, or when the number of users on the same channel suddenly increases, the system's error probability soars. Building on a study of CCK and of RS encoding and decoding algorithms, this paper analyzes the influence of code parameters on error correction and accordingly proposes a coding scheme suited to CCK: RS(255, 239). Simulation results demonstrate that once the SNR exceeds 5.1 dB, system performance is greatly enhanced.
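The trade-off behind the RS(255, 239) choice can be read off the code definition itself (standard RS relations, not the paper's simulation):

```python
# RS(255, 239) parameters: correction capability vs. rate overhead.
n, k, m = 255, 239, 8          # codeword symbols, data symbols, bits per symbol
t = (n - k) // 2               # correctable symbols per codeword
rate = k / n                   # fraction of the channel carrying data
print(t, round(rate, 3))       # -> 8 0.937
```

Eight correctable symbols per 255-symbol block at under 7% overhead is what makes this code attractive as an outer code around CCK at low SNR.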


Sensors ◽  
2021 ◽  
Vol 21 (6) ◽  
pp. 2009
Author(s):  
Fatemeh Najafi ◽  
Masoud Kaveh ◽  
Diego Martín ◽  
Mohammad Reza Mosavi

Traditional authentication techniques, such as cryptographic solutions, are vulnerable to various attacks on session keys and data. Physical unclonable functions (PUFs), such as dynamic random access memory (DRAM)-based PUFs, have been introduced as promising security blocks to enable cryptography and authentication services. However, PUFs are often sensitive to internal and external noise, which causes reliability issues. The requirement of additional robustness and reliability leads to the use of error-reduction methods such as error correction codes (ECCs) and pre-selection schemes, which incur considerable extra overhead. In this paper, we propose deep PUF: a deep convolutional neural network (CNN)-based scheme using latency-based DRAM PUFs without the need for any additional error correction technique. The proposed framework provides a higher number of challenge-response pairs (CRPs) by eliminating the pre-selection and filtering mechanisms. The entire complexity of device identification is moved to the server side, which enables the authentication of resource-constrained nodes. Experimental results from a 1 Gb DDR3 device show that responses under varying conditions can be classified with at least a 94.9% accuracy rate using the CNN. After applying the proposed authentication steps to the classification results, we show that the probability of identification error can be drastically reduced, leading to highly reliable authentication.


2017 ◽  
Vol 9 ◽  
pp. 03007
Author(s):  
Nikolaos Bardis ◽  
Nikolaos Doukas ◽  
Oleksandr P. Markovskyi

2010 ◽  
Vol 56 (2) ◽  
pp. 663-668 ◽  
Author(s):  
Jun Lee ◽  
Seong-Hun Lee

Author(s):  
Rohitkumar R Upadhyay

Abstract: Hamming codes are the first nontrivial family of error-correcting codes: they can correct one error in a block of binary symbols. In this paper we extend the notion of error correction to error reduction and present several decoding methods with the goal of improving the error-reducing capabilities of Hamming codes. First, the error-reducing properties of Hamming codes under standard decoding are demonstrated and explored. We show a lower bound on the average number of errors present in a decoded message when two errors are introduced by the channel, for general Hamming codes. Other decoding algorithms are investigated experimentally, and it is found that these algorithms improve the error-reduction capabilities of Hamming codes beyond the aforementioned lower bound of standard decoding.
Keywords: coding theory, Hamming codes, Hamming distance

