VIGENERE CIPHER WITH PLAINTEXT MODIFICATION

2020 ◽  
Vol 20 (1) ◽  
pp. 15
Author(s):  
Dwi Rahmasari Kinasih Gusti ◽  
Kiswara Agung Santoso ◽  
Ahmad Kamsyakawuni

Cryptography is the study of encoding messages with attention to security aspects. Cryptography uses two types of keys, namely symmetric keys and asymmetric keys. The Vigenere cipher is a technique for encrypting messages with a symmetric key, and it can be combined with several patterns and with ASCII codes. The pattern used can vary as long as the text can be returned to the original message (can be decrypted). In this paper, we modify the plaintext before encrypting it with the Vigenere cipher. The plaintext is modified by flipping and shifting rows of bits. The effect of the change to the algorithm can be seen from the correlation values obtained: the smaller the correlation value, the better the algorithm. The results of this study show that the correlation value of the Vigenere cipher with modified plaintext is better than that of the Vigenere cipher with the original plaintext. Keywords: ASCII, Patterned, Bits, Vigenere Cipher
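The abstract does not spell out the exact bit operations, so the sketch below only illustrates the general idea: it assumes "flip" means reversing each byte's bit pattern and "shift" means a fixed left rotation, followed by a byte-wise Vigenere encryption over the full ASCII range.

```python
# Minimal sketch (assumptions: "flip" = reverse each byte's bits,
# "shift" = rotate left by 3; the paper may define these operations differently).

def flip_bits(byte: int) -> int:
    """Reverse the 8-bit pattern of a byte, e.g. 0b00000001 -> 0b10000000."""
    return int(f"{byte:08b}"[::-1], 2)

def rotate_left(byte: int, n: int = 3) -> int:
    """Rotate the 8 bits of a byte left by n positions."""
    n %= 8
    return ((byte << n) | (byte >> (8 - n))) & 0xFF

def modify_plaintext(data: bytes) -> bytes:
    """Apply the hypothetical flip-then-rotate modification to every byte."""
    return bytes(rotate_left(flip_bits(b)) for b in data)

def vigenere_encrypt(data: bytes, key: bytes) -> bytes:
    """Byte-wise Vigenere over the ASCII/byte range: add the repeating key mod 256."""
    return bytes((b + key[i % len(key)]) % 256 for i, b in enumerate(data))

ciphertext = vigenere_encrypt(modify_plaintext(b"HELLO WORLD"), b"KEY")
```

Decryption reverses the steps: subtract the key modulo 256, then undo the rotation and the bit flip.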

2021 ◽  
Vol 58 (1) ◽  
pp. 3420-3427
Author(s):  
P. A. S. D. Perera ◽  
G. S. Wijesiri

Present-day society depends heavily on digital technology, which is used in many applications such as banking and e-commerce transactions, computer passwords, etc. It is therefore important to protect information when storing and sharing it. Cryptography is the study of secret writing, which applies complex mathematical rules to convert the original message into an incomprehensible form. Graph theory is applied in the field of cryptography because graphs can easily be converted into matrices. There are two approaches to cryptography: symmetric cryptography and asymmetric cryptography. This paper proposes a new connection between graph theory and symmetric cryptography to protect information from unauthorized parties. The proposed methodology uses a matrix as the secret key, which adds more security to the cryptosystem. It converts the plaintext into several graphs, represents these graphs in their matrix form, and thereby generates several ciphertexts. The size of the resulting ciphertexts is larger than the plaintext size.
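The abstract does not fix the block size, the graph family, or the modulus; the sketch below illustrates only the general pattern it describes: each plaintext block is laid out as the weighted adjacency matrix of a small path graph and then multiplied by a secret key matrix modulo 256. All concrete choices here are assumptions, not the authors' construction.

```python
import numpy as np

BLOCK = 3  # 3 characters -> a path graph on 4 vertices, stored in a 4x4 matrix

def block_to_graph_matrix(block: bytes) -> np.ndarray:
    """Weighted adjacency matrix of a path v0-v1-...-vn, edge i weighted by block[i]."""
    n = len(block) + 1
    M = np.zeros((n, n), dtype=np.int64)
    for i, b in enumerate(block):
        M[i, i + 1] = M[i + 1, i] = b
    return M

def encrypt(plaintext: bytes, key: np.ndarray) -> list[np.ndarray]:
    """Return one ciphertext matrix per padded plaintext block: C = (K @ M) mod 256."""
    padded = plaintext + b"\x00" * (-len(plaintext) % BLOCK)
    blocks = [padded[i:i + BLOCK] for i in range(0, len(padded), BLOCK)]
    return [(key @ block_to_graph_matrix(b)) % 256 for b in blocks]

key = np.array([[3, 1, 4, 1], [5, 9, 2, 6], [5, 3, 5, 8], [9, 7, 9, 3]])  # toy key matrix
ciphertexts = encrypt(b"GRAPHS", key)
```

Each 3-character block becomes a 4 x 4 ciphertext matrix, so the ciphertext is larger than the plaintext, as the abstract notes; decryption additionally requires the key matrix to be invertible modulo the chosen modulus.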


2021 ◽  
Vol 2 (2) ◽  
pp. 87-94
Author(s):  
Avinash Krishnan Raghunath ◽  
Dimple Bharadwaj ◽  
M Prabhuram ◽  
Aju D

Cryptography is a technique to secure data transmissions and ensure confidentiality, authenticity and integrity of data exchanged over digital networks by utilizing mathematical algorithms to transform the plain text (original message) into cipher text (encrypted message) using a key or seed value. The general consensus regarding the use of non-deterministic true random numbers (TRN), which are generated from the physical environment such as entropy keys, atmospheric noise, etc., as a public or private key has received limited encouragement due to the demanding hardware requirements needed to extract the necessary data from the environment. Therefore, this research aims at designing and developing a lightweight program to generate a true random number (TRNG) key using live audio recordings, which is further randomized using the system date and time. These TRNs can be used to replace the deterministic pseudo-random number cryptographic keys presently used by industries for symmetric key encryption algorithms, which leaves such an algorithm only conditionally secure. Using the audio-based TRNG key would render the same encryption algorithm unconditionally secure.
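A minimal sketch of this kind of key derivation, assuming the live audio has already been captured to a WAV file; the filename and the SHA-256 reduction step are illustrative choices, not the authors' program.

```python
import hashlib
import time
import wave

def audio_trng_key(wav_path: str = "recording.wav") -> bytes:
    """Derive a 256-bit key from recorded audio samples mixed with the system time."""
    with wave.open(wav_path, "rb") as wav:
        samples = wav.readframes(wav.getnframes())    # raw PCM bytes as the entropy source
    timestamp = str(time.time_ns()).encode()          # system date/time as extra randomization
    return hashlib.sha256(samples + timestamp).digest()  # 32-byte symmetric key

key = audio_trng_key()   # usable as, e.g., an AES-256 key
```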


2018 ◽  
Vol 11 (2) ◽  
pp. 177-186
Author(s):  
Desi Nurnaningsih ◽  
Angga Aditya Permana

ABSTRACT: Data is valuable to all computer users, and cryptography has recently become the method used to secure it. Cryptography is the study of mathematical techniques for securing information, turning an original message (plaintext) into hidden text (ciphertext) and then back into the original message. Cryptography has three important elements: key generation, encryption, and decryption. Within cryptography, the block cipher algorithms include AES (Advanced Encryption Standard), which belongs to the modern symmetric key ciphers; this algorithm uses the same key during the encryption and decryption processes, so that the meaning of our data becomes difficult to discern. The algorithm technique is used to convert data into certain codes, so that the stored information cannot be read by anyone except those who are entitled to it. Therefore, a data security system is very necessary to maintain the confidentiality of information.
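A minimal sketch of AES used as a modern symmetric cipher in the sense described above, with one shared key for both encryption and decryption; CBC mode, PKCS7 padding and the Python `cryptography` package are illustrative choices, not necessarily what the authors used.

```python
import os
from cryptography.hazmat.primitives import padding
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(32)   # AES-256 key, shared by sender and receiver
iv = os.urandom(16)    # fresh initialisation vector per message

def encrypt(plaintext: bytes) -> bytes:
    padder = padding.PKCS7(128).padder()
    padded = padder.update(plaintext) + padder.finalize()
    enc = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
    return enc.update(padded) + enc.finalize()

def decrypt(ciphertext: bytes) -> bytes:
    dec = Cipher(algorithms.AES(key), modes.CBC(iv)).decryptor()
    padded = dec.update(ciphertext) + dec.finalize()
    unpadder = padding.PKCS7(128).unpadder()
    return unpadder.update(padded) + unpadder.finalize()

assert decrypt(encrypt(b"rahasia")) == b"rahasia"   # same key recovers the plaintext
```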


Author(s):  
Trisna Setiawaty ◽  
Olven Manahan

Security is one of the essential needs for data or information, given the importance of that information or data to the party or person concerned. This research aims to design and build a data security system for deeds of sale that can help the Notary/PPAT. The system was built using RC4 (Rivest Cipher 4), a symmetric key cryptographic algorithm and a stream cipher, with both an encryption and a decryption process. Encryption is the process of encoding the original message, or plaintext, into encrypted text (ciphertext), while decryption is the process of decoding the ciphertext back into plaintext (the original data). This research resulted in a system that is able to transform readable data into data that is not easily understood.
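RC4 itself is fully specified, so a compact sketch of its key-scheduling and keystream-generation steps can be given; the key and message below are placeholders, and the surrounding deed-of-sale system is not shown.

```python
def rc4(key: bytes, data: bytes) -> bytes:
    """RC4 stream cipher: the same call encrypts and decrypts (XOR with the keystream)."""
    # Key-scheduling algorithm (KSA): permute the state array S with the key.
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation algorithm (PRGA): XOR the keystream with the data.
    out = bytearray()
    i = j = 0
    for byte in data:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

ciphertext = rc4(b"notary-key", b"deed of sale data")
plaintext = rc4(b"notary-key", ciphertext)   # applying RC4 again restores the original
```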


2017 ◽  
Author(s):  
Andysah Putera Utama Siahaan

The human fingerprint has always been a way to verify the originality of ownership, and it can be connected to security methods to increase the security level. The Hill cipher is one of the cryptographic algorithms to which a digital fingerprint pattern can be attached. Several matrix sizes can be used in its implementation; this study focuses on a 3 x 3 matrix, which provides nine integer numbers for performing the encryption, whose determinant has already been tested beforehand. The concept is to link the digital fingerprint pattern to produce an automatic key generator. Not every determinant value allows the ciphertext to be turned back into the original message. A threshold is used to adjust the determinant; it produces different numbers when shifted. The correct numbers are then placed in the matrix, and once the numbers are available, the cryptographic process can be performed.
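A minimal sketch of the determinant test described above: a 3 x 3 key matrix (imagined here as derived from fingerprint features) is only usable if its determinant is invertible modulo 26, and a simple shift of the entries stands in for the paper's threshold adjustment, which is a hypothetical reading of that step.

```python
from math import gcd
import numpy as np

MOD = 26  # A-Z alphabet

def usable_key(matrix: np.ndarray) -> bool:
    """A Hill cipher key works only if gcd(det(K), 26) == 1, so K is invertible mod 26."""
    det = int(round(np.linalg.det(matrix))) % MOD
    return gcd(det, MOD) == 1

def adjust_with_threshold(matrix: np.ndarray, max_shift: int = MOD) -> np.ndarray:
    """Shift all entries until the determinant becomes invertible mod 26 (hypothetical rule)."""
    for shift in range(max_shift):
        candidate = (matrix + shift) % MOD
        if usable_key(candidate):
            return candidate
    raise ValueError("no usable key found within the threshold")

def hill_encrypt(text: str, key: np.ndarray) -> str:
    nums = [ord(c) - ord('A') for c in text.upper() if c.isalpha()]
    nums += [0] * (-len(nums) % 3)                   # pad to blocks of 3
    blocks = np.array(nums).reshape(-1, 3).T         # columns are plaintext vectors
    cipher = (key @ blocks) % MOD
    return "".join(chr(int(v) + ord('A')) for v in cipher.T.flatten())

key = adjust_with_threshold(np.array([[2, 4, 5], [9, 2, 1], [3, 17, 7]]))  # toy "fingerprint" matrix
print(hill_encrypt("FINGERPRINT", key))
```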


In this paper we propose energy optimization of a symmetric key encryption and decryption algorithm using a genetic algorithm, where the symmetric key used for encryption and decryption is generated so as to optimize energy in terms of the power drawn from the laptop battery. This method combines the concepts of genetic algorithms and cryptography in a distinctive way. The algorithm generates pseudo-random numbers deterministically and applies crossover and mutation deterministically, exploiting the features of the GA because the GA is fast. Finally, the actual message is recovered as ASCII codes by the decryption algorithm, with relatively minimised power consumption.
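A heavily simplified sketch of the idea, in which a deterministically seeded population is evolved with one-point crossover and bit-flip mutation and the fittest chromosome becomes a symmetric XOR key; the fitness function, population size and key length are illustrative assumptions, not the authors' design.

```python
import random

KEY_BITS = 128

def evolve_key(seed: int, generations: int = 20, pop_size: int = 8) -> bytes:
    """Deterministically evolve a bit-string key with one-point crossover and mutation."""
    rng = random.Random(seed)                                  # deterministic PRNG
    pop = [[rng.randint(0, 1) for _ in range(KEY_BITS)] for _ in range(pop_size)]
    balance = lambda c: -abs(sum(c) - KEY_BITS // 2)           # toy fitness: balanced 0/1 counts
    for _ in range(generations):
        pop.sort(key=balance, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        for a, b in zip(parents, parents[1:] + parents[:1]):
            point = rng.randrange(1, KEY_BITS)                 # one-point crossover
            child = a[:point] + b[point:]
            child[rng.randrange(KEY_BITS)] ^= 1                # bit-flip mutation
            children.append(child)
        pop = parents + children
    best = max(pop, key=balance)
    return bytes(int("".join(map(str, best[i:i + 8])), 2) for i in range(0, KEY_BITS, 8))

key = evolve_key(seed=2024)
encrypt = lambda data: bytes(b ^ key[i % len(key)] for i, b in enumerate(data))
print(encrypt(encrypt(b"low-power message")))   # XOR twice restores the plaintext
```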


2018 ◽  
Vol 41 ◽  
Author(s):  
Maria Babińska ◽  
Michal Bilewicz

Abstract: The problem of extended fusion and identification can be approached from a diachronic perspective. Based on our own research, as well as findings from the fields of social, political, and clinical psychology, we argue that the way contemporary emotional events shape local fusion is similar to the way in which historical experiences shape extended fusion. We propose a reciprocal process in which historical events shape contemporary identities, whereas contemporary identities shape interpretations of past traumas.


2020 ◽  
Vol 43 ◽  
Author(s):  
Aba Szollosi ◽  
Ben R. Newell

Abstract The purpose of human cognition depends on the problem people try to solve. Defining the purpose is difficult, because people seem capable of representing problems in an infinite number of ways. The way in which the function of cognition develops needs to be central to our theories.


1976 ◽  
Vol 32 ◽  
pp. 233-254
Author(s):  
H. M. Maitzen

Ap stars are peculiar in many aspects. During this century astronomers have been trying to collect data about them and have found such a confusing variety of peculiar behaviour, even from star to star, that Struve stated in 1942 that at least we know these phenomena are not supernatural. A real push to start deeper theoretical work on Ap stars was given by an additional piece of observational evidence, namely the discovery of magnetic fields on these stars by Babcock (1947). This originated the concept that magnetic fields are the cause of the spectroscopic and photometric peculiarities. Great leaps for astronomical mankind were the Oblique Rotator model by Stibbs (1950) and Deutsch (1954), which, by the way, provided mathematical tools for the later handling of pulsar geometries, and the discovery of the phase coincidence of the extrema of the magnetic field, spectrum and photometric variations (e.g. Jarzebowski, 1960).


Author(s):  
W.M. Stobbs

I do not have access to the abstracts of the first meeting of EMSA, but at this, the 50th Anniversary meeting of the Electron Microscopy Society of America, I have an excuse to consider the historical origins of the approaches we take to the use of electron microscopy for the characterisation of materials. I have myself been actively involved in the use of TEM for the characterisation of heterogeneities for little more than half of that period. My own view is that it was between the 3rd International Meeting at London and the 1956 Stockholm meeting, the first of the European series, that the foundations of the approaches we now take to the characterisation of a material using the TEM were laid down. (This was 10 years before I took dynamical theory to be etched in stone.) It was at the 1956 meeting that Menter showed lattice resolution images of sodium faujasite and Hirsch, Horne and Whelan showed images of dislocations in the XIVth session on “metallography and other industrial applications”. I have always, incidentally, been delighted by the way the latter authors misinterpreted astonishingly clear thickness fringes in a beaten (”) foil of Al as being contrast due to “large strains”, an error which they corrected with admirable rapidity as the theory developed. At the London meeting the research described covered a broad range of approaches, including many that are only now being rediscovered as worth further effort: however, such is the power of “the image” to persuade that the above two papers set trends which influence, perhaps too strongly, the approaches we take now. Menter was clear that the way the planes in his image tended to be curved was associated with the imaging conditions rather than with lattice strains, and yet it now seems to be common practice to assume that the dots in an “atomic resolution image” can faithfully represent the variations in atomic spacing at a localised defect. Even when the more reasonable approach is taken of matching the image details with a computed simulation for an assumed model, the non-uniqueness of the interpreted fit seems to be rather rarely appreciated. Hirsch et al., on the other hand, made a point of using their images to get numerical data on characteristics of the specimen they examined, such as its dislocation density, which would not be expected to be influenced by uncertainties in the contrast. Nonetheless the trends were set, with microscope manufacturers producing higher and higher resolution microscopes, while the blind faith of the users in the image produced as being a near directly interpretable representation of reality seems to have increased rather than been generally questioned. But if we want to test structural models we need numbers, and it is the analogue-to-digital conversion of the information in the image which is required.

