Perspective Chapter: Cryptography – Recent Advances and Research Perspectives
Submitted: 06 May 2023 Reviewed: 15 May 2023 Published: 27 December 2023
DOI: 10.5772/intechopen.111847
From the Edited Volume
Biometrics and Cryptography
Edited by Sudhakar Radhakrishnan and Carlos M. Travieso-González
Cryptography is a branch of both mathematics and computer science and is closely related to information security. This chapter explores the earliest known cryptographic methods, including the scytale, the Caesar cipher, substitution ciphers, and transposition ciphers, and explains how these methods evolved over time. The development of symmetric and asymmetric key cryptography, hash functions, and digital signatures is also discussed. The chapter highlights major historical events and technological advancements that have driven the need for stronger and more efficient encryption methods. In addition, it explores the potential for integrating artificial intelligence tools with cryptographic algorithms and the future of encryption technology.
Keywords
- cryptography
- mathematics
- computer science
- information security
- Caesar cipher
- substitution ciphers
- transposition ciphers
- symmetric key cryptography
- asymmetric key cryptography
- hash functions
- digital signatures
- historical events
- technological advancements
- artificial intelligence
Author Information
Monther Tarawneh*
- Computer Science Department, Isra University, Amman, Jordan
*Address all correspondence to: [email protected]
1. Introduction
Cryptography is the science of converting information into an unreadable format to protect confidential messages from unauthorized access [ 1 ]. Cryptographic algorithms have come a long way since the early days of cryptography and have evolved to keep up with the changing technological landscape. In this chapter, we will explore the history of cryptographic algorithms and their evolution over time.
The earliest known cryptographic methods date back to ancient civilizations, where simple substitution and transposition ciphers were used to conceal messages and prevent unauthorized people from understanding them. These methods evolved over time into more complex ciphers, such as the Caesar cipher of ancient Rome and, much later, the Vigenère cipher. The development of the printing press and the subsequent increase in literacy rates created a need for more secure methods of encryption, which led to more sophisticated systems such as the Playfair cipher and the Enigma machine.
Symmetric key cryptography is one of the oldest and most widely used types of encryption. It is based on the concept of using the same key to encrypt and decrypt a message. The history of symmetric key algorithms dates back to ancient times, when simple substitution ciphers were used to encrypt messages. Over time, more complex algorithms were developed, such as the Hill cipher and the data encryption standard (DES). The development of the advanced encryption standard (AES) in the late twentieth century marked a significant improvement in symmetric key cryptography, as it provided stronger encryption and faster processing times.
Asymmetric key cryptography, also known as public-key cryptography, is a more recent development in the field of cryptography. It is based on the use of two different keys—a public key and a private key—to encrypt and decrypt messages. The concept of asymmetric key cryptography was first introduced by Whitfield Diffie and Martin Hellman in 1976 [ 2 ]. This led to the development of various algorithms such as the Rivest-Shamir-Adleman (RSA) algorithm [ 3 ] and the Diffie-Hellman key exchange [ 4 ].
Hash functions are another important component of modern-day encryption. A hash function is a mathematical function that takes an input (or message) and produces a fixed-length output (or hash) [ 5 ]. Hash functions are used to ensure the integrity of data, as any change to the original input will result in a different hash. The history of hash functions dates back to the 1950s, when the concept of message digests was introduced. Over time, more complex algorithms were developed, such as the secure hash algorithm (SHA) and the message digest (MD) families [ 5 , 6 ].
Digital signatures are used to provide authentication and non-repudiation in digital communications. A digital signature is a mathematical scheme for demonstrating the authenticity of a digital message or document. Digital signature algorithms date back to the late 1970s and early 1980s, following the introduction of public-key cryptography. Over time, various algorithms were developed, such as the digital signature algorithm (DSA) and the elliptic curve digital signature algorithm (ECDSA) [ 7 ].
The evolution of cryptographic algorithms has been driven by major historical events and technological advancements. With the advent of the internet and the increase in digital communication, the need for stronger and more efficient encryption methods became more pressing. As computing power continues to increase, the potential for cracking encryption algorithms also increases. This has led to the need for stronger and more advanced cryptographic algorithms, such as post-quantum cryptography, which can withstand attacks from quantum computers.
In addition to the potential threats to encryption technology, there is also the potential for integrating artificial intelligence tools with cryptographic algorithms. For example, machine learning algorithms could be used to identify potential vulnerabilities in encryption systems and improve their security.
As the digital landscape continues to evolve, the importance of staying ahead of the curve in encryption technology cannot be overstated. This chapter provides an overview of the history and evolution of cryptographic algorithms, highlighting the need for ongoing innovation and development in this field. By continuing to push the boundaries of encryption technology, we can help to safeguard the privacy and security of sensitive data in the digital age.
Encryption is a critical component of modern communication and information security [ 8 ]. By converting data into a secure format that can only be accessed with the correct key or password, encryption ensures that sensitive information is protected from unauthorized access. Throughout history, cryptography has played a significant role in the security of sensitive information from the early substitution ciphers used by ancient civilizations to the modern public-key encryption algorithms.
Recent developments in technology have led to new challenges and opportunities in the field of cryptography. The rise of quantum computing [ 9 ], blockchain technology [ 10 ], and the need for secure communication in an increasingly connected world have all driven new research and innovation in the field of cryptography [ 11 ].
This chapter provides an overview of various cryptographic techniques, including symmetric and asymmetric encryption, hashing, digital signatures, homomorphic encryption, multiparty computation, and lightweight cryptography. Each of these techniques has its own strengths and weaknesses and is suited to different use cases and scenarios. The chapter also explores the future of cryptography, including developments in post-quantum cryptography, blockchain-based cryptography, and other emerging technologies. By understanding the principles and applications of modern cryptography, we can better protect our digital assets and maintain the privacy and security of our communication.
2. Ancient cryptography methods
The history of cryptography dates back to ancient civilizations, where people used various methods to protect their messages from unauthorized access. The earliest examples of cryptography being used to protect information were found in an inscription carved around 1900 BC, in the main chamber of the tomb of the nobleman Khnumhotep II, in Egypt [ 12 , 13 ]. The inscription, known as the “Cryptography Inscription,” described a method for hiding the meaning of hieroglyphic inscriptions by using symbols to represent individual letters. The symbols were then scrambled in a specific way to make the text difficult to read. The main purpose of the “Cryptography Inscription” was not to hide the message but rather to change its form in a way that would make it appear dignified. While the symbols used in the inscription were scrambled, they were still readable by those who were familiar with the method of substitution used. It means that the inscription was intended for a specific audience who were already familiar with the method rather than as a means of keeping the message secret from all who might view it.
2.1 Substitution cipher
Monoalphabetic substitution: a basic cryptography method where each character of the plaintext is replaced with a corresponding character of cipher text. The same substitute symbol or letter is used every time a particular plaintext letter appears. For example, if “A” is substituted with “D,” every “A” in the plaintext will be replaced with “D” in the cipher text as shown in Figure 1 . This makes it vulnerable to frequency analysis attacks as the frequency of each letter in the cipher text will correspond to the frequency of the original letters in the plaintext. Therefore, it is considered a weak encryption method and is no longer used for serious cryptographic applications. However, it can still be used as a simple way to obscure text such as in puzzles or games.
One of the earliest examples of a monoalphabetic substitution cipher is the Caesar cipher, which was used by Julius Caesar to communicate secretly with his generals. In this cipher, each letter in the plaintext is shifted a certain number of places down the alphabet. For example, if the shift value is three, then the letter A is replaced by D, B is replaced by E, and so on shown in Figure 2 . The recipient of the message would need to know the shift value to decrypt the message.
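To make the shift concrete, here is a minimal Python sketch of the Caesar cipher described above; the function names are illustrative and not taken from the chapter.

```python
# Minimal Caesar cipher: shift every letter by a fixed amount, wrapping around the alphabet.
def caesar_encrypt(plaintext, shift):
    result = []
    for ch in plaintext.upper():
        if ch.isalpha():
            # Map A..Z to 0..25, add the shift modulo 26, and map back to a letter.
            result.append(chr((ord(ch) - ord("A") + shift) % 26 + ord("A")))
        else:
            result.append(ch)  # leave spaces and punctuation unchanged
    return "".join(result)

def caesar_decrypt(ciphertext, shift):
    return caesar_encrypt(ciphertext, -shift)

print(caesar_encrypt("ATTACK AT DAWN", 3))   # DWWDFN DW GDZQ
print(caesar_decrypt("DWWDFN DW GDZQ", 3))   # ATTACK AT DAWN
```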
Another example of a monoalphabetic substitution cipher is the simple substitution cipher in which each plaintext letter is replaced by a corresponding symbol or letter from a fixed substitution pattern. Unlike the Caesar cipher, the substitution pattern for the simple substitution cipher is not based on a fixed shift value. Instead, the substitution pattern is usually chosen randomly or based on a key provided to the recipient.
Despite being simple to implement, monoalphabetic substitution ciphers are not secure by today's standards, as their fixed letter-to-letter mapping makes it easy for an attacker to crack the code.
Polyalphabetic substitution: It is made up of multiple monoalphabetic substitutions. In this method, a series of monoalphabetic substitutions are performed on the plaintext, using different substitution alphabets for each letter of the plaintext. This helps to make the ciphertext more difficult to crack as the same plaintext letter can be encrypted in different ways depending on its position in the message.
Vigenère cipher is the most known polyalphabetic substitution, which was invented in the sixteenth century and used by the French military for several centuries [ 14 ]. The Vigenère cipher uses a series of different alphabets, each generated by shifting the previous alphabet by one letter. The cipher is implemented using the Vigenère square (or table), which is made up of twenty-six distinct cipher alphabets as shown in Figure 3 . In the header row, the alphabet is written in its normal order. In each subsequent row, the alphabet is shifted one letter to the right until a 26 × 26 block of letters is formed.
Monoalphabetic substitution cryptography.
Caesar cipher with 1, 2, 3, and 4 shifts to the left.
Vigenère square.
The Vigenère cipher can be applied in a simple way, similar to the Caesar cipher, or in a more sophisticated way in which a keyword drives the encryption: the keyword is repeated over the length of the plaintext, and each letter of the keyword shifts the corresponding letter of the plaintext by a certain number of positions in the alphabet. For example, encrypting “security” the simple way gives “TGFYWOAG,” but using the sophisticated way with “IBRI” as a keyword, the ciphertext is “AFTCZJKG.” To make the cipher more secure, Vigenère suggested using a different keyword for each message rather than reusing the same keyword over and over again. He also suggested using longer keywords to make the cipher even harder to crack. However, if the length of the keyword is known, the cipher can be broken using frequency analysis [ 15 ].
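The keyword variant can be sketched in a few lines of Python; with the keyword "IBRI" it reproduces the "AFTCZJKG" ciphertext for "security" given above. The code assumes the plaintext contains only letters, and the function names are illustrative.

```python
# Keyword Vigenère: each plaintext letter is shifted by the alphabet position
# of the corresponding (repeated) keyword letter.
def vigenere_encrypt(plaintext, keyword):
    plaintext, keyword = plaintext.upper(), keyword.upper()
    out = []
    for i, ch in enumerate(plaintext):
        shift = ord(keyword[i % len(keyword)]) - ord("A")
        out.append(chr((ord(ch) - ord("A") + shift) % 26 + ord("A")))
    return "".join(out)

def vigenere_decrypt(ciphertext, keyword):
    ciphertext, keyword = ciphertext.upper(), keyword.upper()
    out = []
    for i, ch in enumerate(ciphertext):
        shift = ord(keyword[i % len(keyword)]) - ord("A")
        out.append(chr((ord(ch) - ord("A") - shift) % 26 + ord("A")))
    return "".join(out)

print(vigenere_encrypt("SECURITY", "IBRI"))   # AFTCZJKG
print(vigenere_decrypt("AFTCZJKG", "IBRI"))   # SECURITY
```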
One-time pad encryption/decryption example.
The one-time pad cipher is not a type of Vigenère cipher. It is a completely different encryption method based on a long, randomly generated key that is at least as long as the plaintext. The key is made up of a series of random symbols, and each symbol is used only once to encrypt one character of the plaintext. Because the key is truly random and used only once, the one-time pad cipher is considered unbreakable, provided that the key is kept secret and destroyed after use by both the sender and the receiver. Figure 4 shows an example of one-time pad encryption and decryption.
The key must be at least as long as the plaintext for the one-time pad to be unbreakable, because the one-time pad achieves perfect secrecy: the ciphertext provides no information about the plaintext, even to an attacker with unlimited computational power.
Generating truly random keys that are as long as the plaintext is a challenging task, and transmitting them securely to the recipient is also a difficult problem. This is why the one-time pad is mostly used in special cases such as diplomatic and intelligence traffic. In addition, the one-time pad guarantees only confidentiality, not integrity: an attacker who intercepts the ciphertext cannot recover the plaintext but can still modify the ciphertext to change the meaning of the message. The one-time pad requires a unique key for every message, and the keys should be securely destroyed after use to prevent reuse.
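As a simple illustration of the principle, the toy sketch below XORs message bytes with an equally long random key. It is for illustration only: in practice the key must come from a true random source and be distributed and destroyed securely.

```python
import secrets

# Toy one-time pad over bytes: ciphertext = plaintext XOR key, with the key as long as the message.
def otp_xor(data: bytes, key: bytes) -> bytes:
    assert len(key) == len(data), "the key must be exactly as long as the message"
    return bytes(d ^ k for d, k in zip(data, key))

message = b"MEET AT NOON"
key = secrets.token_bytes(len(message))   # must be truly random and never reused
ciphertext = otp_xor(message, key)
recovered = otp_xor(ciphertext, key)      # XOR with the same key decrypts
print(ciphertext.hex(), recovered)
```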
The Playfair cipher is a polygraphic substitution cipher invented in 1854 by Sir Charles Wheatstone [ 16 ]. It was the first cipher that allowed for the encryption of pairs of letters instead of single letters. The Playfair cipher uses a 5 × 5 grid of letters, with each letter of the alphabet appearing once. The letters in the grid are usually chosen using a keyword. The keyword is then written into the grid, and the remaining spaces are filled with the letters of the alphabet in order.
The standard construction and encryption rules are:
- Build a 5 × 5 table from a keyword with no repeating letters, then fill in the remaining cells with the rest of the alphabet in order, skipping the letter J.
- Split the message into pairs of letters; repeated plaintext letters that fall in the same pair are separated with an X, and if there is an odd letter at the end of the message, an X is appended.
- If both letters of a pair are in the same column, move each letter down one position, wrapping around upon reaching the end of the table.
- If both letters are in the same row, move each letter right one position, again wrapping around.
- Otherwise, swap each letter with the letter in its own row at the opposite end of the rectangle the pair defines.

Playfair cipher steps (A: simple and B: sophisticated).
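A compact Python sketch of these rules is given below (I and J share a cell, and X pads repeated or trailing letters). The helper names and the example keyword are illustrative rather than taken from the chapter.

```python
# Build the 5 x 5 Playfair grid from a keyword, omitting J (merged with I).
def build_grid(keyword):
    seen = []
    for ch in keyword.upper() + "ABCDEFGHIKLMNOPQRSTUVWXYZ":
        ch = "I" if ch == "J" else ch
        if ch.isalpha() and ch not in seen:
            seen.append(ch)
    return [seen[i:i + 5] for i in range(0, 25, 5)]

# Split the message into letter pairs, inserting X between repeated letters and at the end.
def make_pairs(text):
    letters = [("I" if c == "J" else c) for c in text.upper() if c.isalpha()]
    pairs, i = [], 0
    while i < len(letters):
        a = letters[i]
        b = letters[i + 1] if i + 1 < len(letters) else "X"
        if a == b:
            pairs.append((a, "X"))
            i += 1
        else:
            pairs.append((a, b))
            i += 2
    return pairs

def encrypt_pair(grid, a, b):
    pos = {grid[r][c]: (r, c) for r in range(5) for c in range(5)}
    (ra, ca), (rb, cb) = pos[a], pos[b]
    if ra == rb:                            # same row: move each letter one position right
        return grid[ra][(ca + 1) % 5] + grid[rb][(cb + 1) % 5]
    if ca == cb:                            # same column: move each letter one position down
        return grid[(ra + 1) % 5][ca] + grid[(rb + 1) % 5][cb]
    return grid[ra][cb] + grid[rb][ca]      # rectangle: swap the column coordinates

grid = build_grid("MONARCHY")
print("".join(encrypt_pair(grid, a, b) for a, b in make_pairs("HIDE THE GOLD")))
```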
In 1917, an electromechanical machine was developed [ 17 ] that used a rotating disc with an embedded key to encode a substitution table that changed with every new character typed; it was the first example of a rotor machine. The following year, the German engineer Arthur Scherbius invented the Enigma machine [ 18 ], which used multiple rotors instead of one. Initially designed for commercial use, the Enigma machine was soon adopted by the German military, which recognized its potential and began using it to send coded transmissions.
2.2 Transposition cipher
A transposition cipher is an early method in which the letters of the message are rearranged according to a certain pattern, but the letters themselves are not changed, as shown in Figure 6 . Unlike substitution ciphers, which replace plaintext characters with different symbols or letters, transposition ciphers do not change the characters themselves; they simply reorder the characters to create a new message. The security of a transposition cipher is based on the difficulty of reconstructing the original message from the reordered characters without knowledge of the transposition pattern used.
Transposition cipher example.
The Rail Fence cipher is a type of transposition cipher that was first used during the American Civil War. The technique involves writing the plaintext in a zigzag pattern across a fixed number of rows of a grid and then reading the letters off row by row to produce the ciphertext. The number of rows in the grid can be adjusted to increase the complexity of the cipher.
For example, suppose we want to encrypt the message “HELLO WORLD” using a Rail Fence cipher with three rows. Write the letters on a grid as shown in Figure 7 .
Rail Fence encryption example.
To decrypt the message, the receiver reconstructs the grid by filling its rows with the ciphertext letters and then reads the plaintext along the original zigzag path.
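The zigzag can be expressed directly in code. Below is a minimal sketch, assuming spaces are removed before encryption (so "HELLO WORLD" is treated as "HELLOWORLD"); decryption reverses the process as described above and is omitted for brevity.

```python
# Rail Fence encryption: write the text in a zigzag over `rails` rows, then read the rows in order.
def rail_fence_encrypt(plaintext, rails):
    text = plaintext.replace(" ", "").upper()
    rows = [[] for _ in range(rails)]
    row, step = 0, 1
    for ch in text:
        rows[row].append(ch)
        if row == 0:
            step = 1             # head downwards from the top rail
        elif row == rails - 1:
            step = -1            # bounce back up from the bottom rail
        row += step
    return "".join("".join(r) for r in rows)

print(rail_fence_encrypt("HELLO WORLD", 3))   # HOLELWRDLO
```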
While these ancient methods of cryptography may seem primitive by today’s standards, they laid the foundation for the development of more complex encryption techniques in the future. The principles of substitution and transposition ciphers are still used in modern cryptography, and the need for secure communication continues to drive the evolution of cryptographic algorithms.
3. Symmetric key cryptography
Symmetric key cryptography schemes are categorized as stream ciphers or block ciphers. Stream ciphers work on a single bit at a time and execute some form of feedback structure so that the key is continually changing. A block cipher encrypts one block at a time, using the same key on each block. In general, the same plaintext block will always encrypt to the same ciphertext when the same key is used in a block cipher, whereas in a stream cipher the same plaintext will encrypt to different ciphertext each time.
The history of symmetric key cryptography can be traced back to the days of Julius Caesar, who used a simple substitution cipher to protect his military communications. Over time, various types of symmetric key encryption algorithms were developed, such as the Vigenère cipher, which used a polyalphabetic substitution method, and the Enigma machine, which used a combination of substitution and transposition methods.
3.1 Data encryption standard (DES)
The data encryption standard (DES) encrypts 64-bit plaintext blocks under a 56-bit effective key using sixteen Feistel rounds. Its main steps are as follows. Initial permutation (IP): The 64-bit input plaintext is shuffled (rearranged) according to a fixed permutation table to produce the permuted input. The initial permutation and its inverse are defined by tables that indicate the position of each bit of the input in the output, as shown in Figure 8 . These permutation tables are used in the encryption and decryption processes to rearrange the bits of the input according to the specified permutation.
Separation: The left and right halves of each 64-bit intermediate value are treated as separate 32-bit quantities, labeled L (left) and R (right).
Expansion: The input key for each round is 48 bits and the right side (R) is 32 bits. In order to XOR Ki with Ri, we need to expand the length of Ri to 48 bits. The expansion table in Figure 10 is used for this purpose.
The 64-bit key is permuted using a fixed permutation called the permutation choice 1 (PC-1), as shown in Figure 11 . Eight of the 64 key bits are parity bits and are not used in the encryption process, so the output of this step is a 56-bit key.
The 56-bit key is then split into two 28-bit halves, C0 and D0.
Each of the halves is subjected to a series of circular shifts or rotations. In particular, for rounds 1, 2, 9, and 16, the shifts are one bit, while for all other rounds, the shifts are two bits.
After each shift, the two halves are combined to form a 56-bit value, which is then permuted using a fixed permutation called the permutation choice 2 (PC-2) as shown in Figure 9 . The output of this step is a 48-bit subkey.
This process is repeated for each round of the encryption process, resulting in a total of 16 subkeys.
The subkeys are used in the encryption process as inputs to the round function, which combines them with the plaintext to produce the ciphertext.
Substitution: This 48-bit result passes through a substitution function that produces a 32-bit output. The S-boxes, also known as substitution boxes, are the only nonlinear elements in the DES design. The S-boxes introduce confusion into the ciphertext by replacing each 6-bit block of the input with a different 4-bit output. There are 8 S-boxes in DES, as shown in Figure 12 , each taking a 6-bit input and producing a 4-bit output. The two outer bits of the 6-bit input select one of the four rows of the S-box, and the middle four bits select one of the 16 columns, giving 4 × 16 = 64 entries in each S-box.
Permutation: The 32-bit outputs from the S-boxes are then concatenated and subjected to a fixed permutation using the P-box permutation.
Final permutation (IP-1): The pre-output is shuffled according to another fixed permutation table, which is the inverse of the initial permutation, to produce the 64-bit cipher text. The figure shows the internal structure of a single round.
The initial permutation and its inverse.
Internal structure of single round.
Expansion permutation table.
Tables used in subkeys generation.
S-boxes used in the substitution step in DES.
The main steps are summarized in Figure 13 . The DES key schedule generates sixteen 48-bit round keys from the initial 56-bit key. These keys are used in each round of the encryption process to modify the plaintext. The key schedule applies a series of operations, including a permutation, a compression function, and left shifts, to the 56-bit key, and the resulting subkeys are used one at a time in each round of the encryption process.
DES Algorithm steps.
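The round structure just described follows the general Feistel pattern, sketched schematically below. Here the round function stands for the expansion, key mixing, S-box substitution, and P-box permutation of one DES round; the toy round function and values are placeholders, so this is a structural illustration rather than a full DES implementation.

```python
# Schematic Feistel rounds as used in DES (not a complete DES implementation).
def feistel_encrypt(left, right, subkeys, round_function):
    for k in subkeys:                       # DES uses 16 subkeys, one per round
        left, right = right, left ^ round_function(right, k)
    return right, left                      # the halves are swapped before the final permutation

def feistel_decrypt(left, right, subkeys, round_function):
    # Decryption is the same structure with the subkeys applied in reverse order.
    return feistel_encrypt(left, right, list(reversed(subkeys)), round_function)

# Toy round function on 32-bit integers, purely for illustration.
toy_round = lambda r, k: ((r * 31 + k) ^ (r >> 3)) & 0xFFFFFFFF

L, R = 0x01234567, 0x89ABCDEF
keys = list(range(1, 17))
cl, cr = feistel_encrypt(L, R, keys, toy_round)
print(feistel_decrypt(cl, cr, keys, toy_round) == (L, R))   # True: the structure is invertible
```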
However, due to its small key size, DES is now considered insecure [ 19 ]. One widely deployed stopgap was triple DES (3DES), which applies the DES algorithm three times with (usually) three independent 56-bit keys:
The plaintext is encrypted using the first 56-bit key (K1) with the DES algorithm to produce a ciphertext.
The ciphertext from step 1 is decrypted using the second 56-bit key (K2) with the DES algorithm to produce an intermediate value.
The intermediate value from step 2 is encrypted again using the third 56-bit key (K3) with the DES algorithm to produce the final ciphertext.
Thus, 3DES involves encrypting the plaintext with K1, decrypting the result with K2, and encrypting again with K3. The three keys K1, K2, and K3 are usually independent keys generated randomly, although some variants of 3DES use a “keying option” that allows for fewer keys to be used while still maintaining a higher level of security.
While 3DES is slower than DES due to its triple encryption process, it is still considered a relatively fast algorithm and can be implemented in hardware as well as software. Nevertheless, 3DES has since been largely replaced by the advanced encryption standard (AES).
3.2 Advanced encryption standard (AES)
The AES (Advanced Encryption Standard) is a symmetric block cipher that operates on fixed-size 128-bit blocks and supports key sizes of 128, 192, and 256 bits. It was standardized by NIST (National Institute of Standards and Technology) in 2001 as a replacement for the aging DES (Data Encryption Standard) cipher.
The AES was selected from a pool of 15 candidate algorithms that were submitted in response to a call for proposals issued by NIST in 1997 [ 21 ]. The selection process involved several rounds of analysis and testing, culminating in the selection of Rijndael [ 22 ], a cipher developed by Belgian cryptographers Joan Daemen and Vincent Rijmen, as the winner.
The AES encryption and decryption algorithms use a series of rounds in which all operations are performed on 8-bit bytes ( Figure 14 ). Each round of processing works on the input state array and produces an output state array, and the state array produced by the last round is rearranged into a 128-bit output block. The state array is a 4 × 4 matrix of bytes that represents the input block. In each round, the state array is modified by a series of operations that include byte substitution, permutation, and arithmetic operations over a finite field, as shown in the figure below. After the final round, the state array contains the encrypted or decrypted data, which are then copied to an output matrix to produce the final ciphertext or plaintext block.
The structure of AES algorithm.
SubBytes : The substitute bytes stage of AES uses a fixed S-box, a 256-byte lookup table, to perform a byte-by-byte substitution of the input block. The S-box is designed so that each input byte is replaced by a unique output byte, and the inverse S-box used in decryption maps each output byte back to its original input byte. The S-box is a nonlinear component of the AES algorithm, which helps to increase the resistance of the cipher to various attacks. For example, the byte 0x19 is mapped to the value at the intersection of row 1 and column 9 of the S-box, which is 0xD4, as shown in Figure 15 .
ShiftRows : The shiftRows stage is a permutation step that cyclically shifts the bytes in each row of the state array by a certain number of bytes. This operation is applied to each row independently, with no mixing of the bytes between the rows. The number of bytes shifted is determined by the row number: the first row is not shifted at all, the second row is shifted by one byte to the left, the third row is shifted by two bytes to the left, and the fourth row is shifted by three bytes to the left as shown in Figure 16 .
This operation provides diffusion of the input data, which increases the security of the cipher. The inverse operation, used for decryption, is a cyclic shift to the right instead of the left so that the original byte positions are restored.
MixColumns : each column of the state array is treated as a polynomial over the finite field GF(2^8), where each byte is a coefficient of the polynomial. The bytes are then multiplied by a fixed polynomial, and the result is reduced modulo another fixed polynomial. This transformation ensures that each byte in a column is dependent on all four bytes in the same column as demonstrated in Figure 17 .
The multiplication and reduction are done using a pre-computed table of values. The table is constructed in such a way that multiplication is reduced to a simple table lookup and XOR operation.
During decryption, the inverse operation of MixColumns is performed. This involves multiplying each column by a different fixed polynomial and reducing the result modulo another fixed polynomial.
AddRoundkey : Each byte of the current block is XORed with the corresponding byte of the round key. The round key is derived from the main encryption key using a key schedule algorithm, which generates a set of round keys for each round of encryption. This stage adds a layer of confusion to the encryption process, making the cipher more difficult to analyze and break. Figure 18 describes the AddRoundkey process in AES.
S-Box used in AES.
ShiftRows operation and its output (with example).
Mix column function.
Description of the AddRoundkey in AES.
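To make two of these transformations concrete, the sketch below implements ShiftRows and AddRoundKey on a 4 × 4 state in plain Python; SubBytes and MixColumns are omitted because they require the full S-box table and finite-field arithmetic. The state and round-key values are arbitrary examples, so this is a didactic fragment rather than a complete AES round.

```python
# The state and the round key are 4 x 4 matrices of bytes, as in the AES state array.
def shift_rows(state):
    # Row r is rotated left by r positions: row 0 unchanged, row 1 by one byte, and so on.
    return [row[r:] + row[:r] for r, row in enumerate(state)]

def add_round_key(state, round_key):
    # Every byte of the state is XORed with the corresponding byte of the round key.
    return [[s ^ k for s, k in zip(srow, krow)] for srow, krow in zip(state, round_key)]

state = [[0x19, 0xA0, 0x9A, 0xE9],
         [0x3D, 0xF4, 0xC6, 0xF8],
         [0xE3, 0xE2, 0x8D, 0x48],
         [0xBE, 0x2B, 0x2A, 0x08]]
round_key = [[0xA0, 0x88, 0x23, 0x2A] for _ in range(4)]   # toy round key, illustration only

print(shift_rows(state)[1])              # row 1 rotated left by one byte
print(add_round_key(state, round_key)[0])
```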
The AES key expansion algorithm takes the cipher key (128, 192, or 256 bits) as input and generates a sequence of round keys, one for each round of the AES encryption process plus an initial round key. The key expansion algorithm uses a key schedule to generate these round keys, performing a series of operations on the input key to produce an expanded key.
The key schedule begins by copying the input key into the first four words of the key schedule. Then, the key expansion algorithm applies a series of operations to the last four words of the current key schedule to generate the next four words. This process is repeated until the key schedule contains the necessary number of round keys for the specified key size. For example, for a 128-bit key, the key schedule will generate 11 round keys, one for each of the 10 rounds of AES encryption plus an initial round key. For a 192-bit key, the key schedule will generate 13 round keys, and for a 256-bit key, the key schedule will generate 15 round keys.
RotWord performs a one-byte circular left shift on a word.
SubWord performs a byte substitution on each byte of its input word, using the S-box.
The result of steps 1 and 2 is XORed with a round constant, Rcon[j].
The values of Rcon[j] in hexadecimal.
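As a small illustration, the sketch below shows RotWord and one common way to generate the round constants Rcon[j] by repeated doubling in GF(2^8); the SubWord S-box lookup is omitted because it needs the full 256-entry table. The function names are illustrative.

```python
def rot_word(word):
    # One-byte circular left shift of a 4-byte word.
    return word[1:] + word[:1]

def rcon_values(n):
    # Round constants: start at 0x01 and double in GF(2^8), reducing with the AES polynomial on overflow.
    rc, out = 0x01, []
    for _ in range(n):
        out.append(rc)
        rc = (rc << 1) ^ (0x11B if rc & 0x80 else 0)
    return out

print(rot_word([0x09, 0xCF, 0x4F, 0x3C]))    # bytes rotated left by one position
print([hex(v) for v in rcon_values(10)])     # 0x1, 0x2, 0x4, ..., 0x80, 0x1b, 0x36
```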
The AES cipher is widely used in various applications, including secure communications, data storage, and authentication. Its security has been extensively analyzed, and it is considered to be highly secure against various types of attacks.
3.3 More symmetric algorithms
Blowfish [ 23 ]: A symmetric key block cipher that uses variable-length keys (up to 448 bits) and a block size of 64 bits. Blowfish is widely used in cryptographic applications and is known for its fast encryption and decryption speed.
Twofish [ 24 ]: A symmetric key block cipher that is a successor to Blowfish. It uses a block size of 128 bits and supports key sizes up to 256 bits. Twofish is considered a strong and secure encryption algorithm but is slower than some other algorithms.
Rivest Cipher 4 (RC4) [ 25 ]: A symmetric key stream cipher that is widely used in wireless networks, secure socket layer (SSL), and other applications. RC4 uses a variable-length key (up to 2048 bits) to generate a stream of pseudo-random bytes, which are XORed with the plaintext to produce the ciphertext. However, RC4 has been found to be vulnerable to attacks and is now considered insecure for many applications.
3.4 Mode of operation
Since block ciphers operate on fixed-size blocks of data, they cannot be directly used to encrypt or decrypt messages that are larger than the block size. A mode of operation is a technique used to apply a block cipher to encrypt or decrypt data that is larger than the block size of the cipher.
Modes of operation are used to overcome this limitation by allowing the encryption or decryption of data that is larger than the block size of the cipher. These modes provide methods to break up the input message into blocks, and then apply the block cipher to each block. This process is typically performed using feedback mechanisms that generate input for each subsequent block, based on the output of the previous block.
Electronic codebook (ECB): This is the simplest mode of operation, where each block of plaintext is encrypted independently with the same key as shown in Figure 20 . However, it is not suitable for encrypting large amounts of data or data with a predictable structure. It suffers from the lack of diffusion, which means that identical plaintext blocks will result in identical ciphertext blocks. This makes it vulnerable to attacks as patterns in the plaintext can be easily observed in the ciphertext. For example, an image encrypted with ECB mode will have visible patterns and blocks, making it easy for an attacker to identify certain parts of the image even without decrypting it. Therefore, it is not recommended to use ECB mode for encrypting lengthy messages or sensitive data.
Cipher block chaining (CBC): The cipher block chaining (CBC) mode of operation addresses the issue of repetitive plaintext blocks in ECB mode. This mode XORs each plaintext block with the previous ciphertext block before encryption, as shown in Figure 21 . This provides diffusion and makes the encryption process more secure than ECB. It is worth noting that the chaining also causes errors to propagate: if a ciphertext block is corrupted or modified during transmission, the corresponding plaintext block and the one that follows will be affected during decryption, which can make tampering easier to detect.
However, the fact that a one-bit change in a plaintext block or in the IV affects all following ciphertext blocks can also be a weakness. This can make it difficult to implement certain types of secure communications protocols, such as those that require random access to encrypted data. Additionally, CBC requires a secure and unpredictable initialization vector (IV) for each message, which can be challenging to generate and transmit securely in some scenarios. Finally, as with any mode of operation that relies on a shared secret key, CBC is vulnerable to attacks that exploit weaknesses in the underlying block cipher or key management protocols.
Cipher feedback (CFB): In this mode, the block cipher is used as a feedback mechanism to create a stream cipher. The plaintext is XORed with the output of the block cipher, and the result is encrypted to produce the ciphertext as shown in Figure 22 . This mode allows for variable-length plaintext and provides a self-synchronizing stream cipher. The initial value is called the initialization vector (IV), and it is used to seed the process. The size of the shift registers determines the amount of feedback. For example, if s = 8, the encryption process operates on an 8-bit subset of the plaintext block at a time. If s = n, then the entire plaintext block is used at once.
One advantage of CFB mode is that it allows for error propagation to be contained. If a bit error occurs during transmission, only the block that contains the error is affected. The other blocks remain unchanged. However, one disadvantage of CFB mode is that it is sequential, which means that it cannot be parallelized.
Output feedback (OFB): OFB mode operates on full blocks of plaintext and ciphertext, like other block cipher modes of operation. However, instead of encrypting the plaintext directly, the block cipher is used to encrypt an IV to generate a keystream, which is then XORed with the plaintext to produce the ciphertext, as shown in Figure 23 . Because the keystream does not depend on the plaintext or ciphertext, it can be precomputed before the message is available, although the keystream blocks themselves are generated sequentially. The main difference between OFB and CFB is that OFB generates a keystream that is independent of the plaintext, while CFB uses the ciphertext as feedback to generate the keystream.
Counter (CTR): This mode encrypts a counter value with a block cipher to produce a keystream, which is then XORed with the plaintext to produce the ciphertext. This mode is similar to OFB, but it allows for parallel encryption and decryption and supports random access to encrypted data. The counter is incremented for each block of plaintext, and the resulting keystream is used to encrypt that block; see Figure 24 . The advantage of the CTR mode is that it allows for parallel encryption and decryption of blocks, since the keystream is generated independently of the plaintext or ciphertext. This can lead to significant speed improvements over other modes, particularly for large messages.
One potential drawback of CTR mode is the need to ensure that the counter values are never repeated as this could compromise the security of the encryption. This can be achieved by using a unique counter value for each block of plaintext, for example by using a nonce (a number used only once) as part of the counter value.
ECB mode encryption.
CBC mode encryption.
CFB mode encryption.
OFB mode encryption.
Counter mode encryption.
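The chaining and counter constructions can be sketched generically. The code below uses a deliberately weak placeholder in place of a real block cipher, purely to show how CBC chains blocks together and how CTR turns a block cipher into a keystream generator; it assumes the message length is a multiple of the block size and is not secure.

```python
BLOCK = 8  # toy block size in bytes

def toy_block_encrypt(block, key):
    # Placeholder "block cipher" for illustration only: XOR with the key, then rotate the bytes.
    x = bytes(b ^ k for b, k in zip(block, key))
    return x[1:] + x[:1]

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def cbc_encrypt(plaintext, key, iv):
    prev, out = iv, []
    for i in range(0, len(plaintext), BLOCK):
        # Chain each plaintext block with the previous ciphertext block before "encrypting".
        prev = toy_block_encrypt(xor_bytes(plaintext[i:i + BLOCK], prev), key)
        out.append(prev)
    return b"".join(out)

def ctr_encrypt(plaintext, key, nonce):
    out = []
    for i in range(0, len(plaintext), BLOCK):
        counter_block = nonce + i.to_bytes(4, "big")        # nonce || counter
        keystream = toy_block_encrypt(counter_block, key)   # keystream is independent of the plaintext
        out.append(xor_bytes(plaintext[i:i + BLOCK], keystream))
    return b"".join(out)

key = b"K" * BLOCK
print(cbc_encrypt(b"SIXTEEN BYTE MSG", key, iv=b"\x00" * BLOCK).hex())
print(ctr_encrypt(b"SIXTEEN BYTE MSG", key, nonce=b"\x00" * 4).hex())
```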
4. Asymmetric key cryptography
Asymmetric key cryptography, also known as public-key cryptography, is a cryptographic system that uses a pair of keys to encrypt and decrypt data. The pair of keys consists of a public key, which is known to everyone, and a private key, which is kept secret by its owner. The public key is used for encrypting the data, while the private key is used for decrypting the data. Unlike symmetric key cryptography, where the same key is used for both encryption and decryption, in asymmetric key cryptography, the two keys are mathematically related, but it is computationally infeasible to derive the private key from the public key.
The main advantage of asymmetric key cryptography is that it provides a secure method of communication between two parties without the need for a pre-shared secret key. Asymmetric key cryptography is used in many applications, including digital signatures, key exchange, and encryption of sensitive data.
Some examples of asymmetric key cryptographic algorithms include RSA [ 26 ], Diffie-Hellman [ 27 ], and elliptic curve cryptography (ECC) [ 28 ]. These algorithms are widely used in various applications, including secure communication, digital signatures, and online transactions [ 29 ].
4.1 RSA (Rivest-Shamir-Adleman)
RSA is a widely used public-key cryptosystem, named after its inventors Ron Rivest, Adi Shamir, and Leonard Adleman. Its security is based on the difficulty of factoring large integers, which serves as the foundation for its mathematical operation. RSA has been used for over four decades and is still considered a secure and practical public-key cryptosystem. RSA involves the generation of a public and a private key pair. The public key is distributed to others, while the private key is kept secret. The public key can be used to encrypt messages that only the owner of the private key can decrypt. The key generation, encryption, and decryption steps are as follows:
Key generation:
- Choose two large prime numbers p and q.
- Calculate n = p * q and φ(n) = (p − 1) * (q − 1).
- Choose an integer e such that 1 < e < φ(n) and gcd(e, φ(n)) = 1. This value is called the public exponent.
- Compute d, the multiplicative inverse of e modulo φ(n). This value is called the private exponent.

Encryption:
- Represent the plaintext M as a positive integer less than n.
- Compute the ciphertext C as C = M^e mod n.

Decryption: Compute the plaintext M as M = C^d mod n.
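A worked example with deliberately tiny primes (far too small for real use) traces these steps; the numbers follow the common textbook example rather than anything specific to this chapter.

```python
from math import gcd

# Key generation with toy primes (real keys use primes of 1024 bits or more).
p, q = 61, 53
n = p * q                      # 3233
phi = (p - 1) * (q - 1)        # 3120
e = 17
assert gcd(e, phi) == 1
d = pow(e, -1, phi)            # 2753, the multiplicative inverse of e modulo phi (Python 3.8+)

# Encryption and decryption of a small message M < n.
M = 65
C = pow(M, e, n)               # 2790
assert pow(C, d, n) == M
print(n, d, C)
```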
The security of RSA is based on the difficulty of factoring large composite numbers into their prime factors. Breaking RSA encryption requires factoring the modulus n into its two prime factors p and q, which is a computationally intensive task for large values of n. Therefore, the security of RSA increases as the size of the keys and the modulus increase.
4.2 Diffie-Hellman
Diffie-Hellman (DH) is a key exchange algorithm that allows two parties to establish a shared secret key over an insecure channel. It was developed by Whitfield Diffie and Martin Hellman in 1976 and is based on the discrete logarithm problem in modular arithmetic.
Alice and Bob publicly agree on a large prime number p and a primitive root of p, denoted by g.
Alice randomly chooses a secret integer a and calculates A = g^a mod p. She sends A to Bob.
Bob randomly chooses a secret integer b and calculates B = g^b mod p. He sends B to Alice.
Alice computes the shared secret key as K = B^a mod p.
Bob computes the shared secret key as K = A^b mod p.
Alice and Bob now have a shared secret key that can be used for symmetric encryption.
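These steps can be traced with deliberately small numbers (a real exchange uses primes of 2048 bits or more); the values below are a standard textbook illustration, not taken from the chapter.

```python
# Toy Diffie-Hellman exchange with small public parameters.
p, g = 23, 5                     # public prime and generator
a, b = 4, 3                      # Alice's and Bob's secret exponents

A = pow(g, a, p)                 # Alice sends A = g^a mod p = 4
B = pow(g, b, p)                 # Bob sends   B = g^b mod p = 10

k_alice = pow(B, a, p)           # K = B^a mod p
k_bob = pow(A, b, p)             # K = A^b mod p
assert k_alice == k_bob == 18    # both sides derive the same shared secret
print(k_alice)
```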
The security here relies on the fact that computing the discrete logarithm of A or B to the base g modulo p is computationally infeasible. This means that an attacker who intercepts A and B cannot calculate a or b and therefore cannot compute the shared secret key K.
The DH algorithm can be used for secure communication by combining it with a symmetric encryption algorithm: the shared secret key derived using DH serves as the key for the symmetric algorithm, providing confidentiality for the communication. It is widely used in many cryptographic protocols, such as Secure Sockets Layer (SSL)/Transport Layer Security (TLS), the Secure Shell protocol (SSH), and virtual private networks (VPNs) [ 31 , 32 ]. However, DH does not provide authentication [ 32 ], so a man-in-the-middle attack is possible if the channel is not authenticated. To address this issue, DH is often used in combination with digital signatures or other authentication mechanisms [ 33 ].
5. Hash functions
A cryptographic hash function maps an input of arbitrary length to a fixed-length output and should satisfy the following properties:
- Deterministic: The same input should always produce the same output.
- Uniform: The output should appear to be random and uniformly distributed, even if the input has patterns or biases.
- One-way: It should be computationally infeasible to derive the input data from the hash value.
- Collision-resistant: It should be computationally infeasible to find two different input values that produce the same hash output.
Hash functions are commonly used in various security applications such as password storage, digital signatures, and message authentication codes.
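These properties are easy to observe with a standard-library hash: the same input always yields the same SHA-256 digest, while changing a single character produces a completely different one.

```python
import hashlib

d1 = hashlib.sha256(b"transfer 100 to alice").hexdigest()
d2 = hashlib.sha256(b"transfer 900 to alice").hexdigest()  # one character changed
print(d1)
print(d2)                                                   # bears no resemblance to d1
print(d1 == hashlib.sha256(b"transfer 100 to alice").hexdigest())   # True: deterministic
```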
6. Digital signatures
Digital signatures are used to ensure the authenticity, integrity, and non-repudiation of a digital document or message. The process of creating a digital signature involves applying a mathematical algorithm to the message or document using the signer’s private key. The resulting value, known as the signature, is unique to both the message and the signer’s private key.
The receiver of the message or document can verify the signature using the signer’s public key, which confirms that the message was indeed sent by the signer and that it has not been altered since it was signed.
Digital signatures can be used in a variety of applications, including software updates, online transactions, and legal documents. They provide a means of verifying the identity of the sender, ensuring the integrity of the message or document, and preventing the sender from denying that they sent it.
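The sign/verify flow can be sketched by combining a standard hash with the toy RSA parameters used earlier. This is "textbook" RSA signing for illustration only; real signature schemes such as DSA, ECDSA, or RSA-PSS add padding and other protections.

```python
import hashlib

# Toy RSA key pair (the same tiny parameters as in the RSA example above).
n, e, d = 3233, 17, 2753

def sign(message: bytes) -> int:
    # Hash the message, reduce it into the modulus, and apply the private exponent.
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, signature: int) -> bool:
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h

sig = sign(b"pay Bob 10")
print(verify(b"pay Bob 10", sig))   # True
print(verify(b"pay Bob 99", sig))   # almost certainly False: the altered message hashes differently
```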
7. Future of cryptography
Cryptography has come a long way since its early beginnings, and it continues to play a critical role in securing our digital world today. The advancement of technology has led to more complex and sophisticated encryption methods, which have become essential for protecting sensitive information such as financial transactions, personal data, and confidential communication. With the rise of the internet and mobile technology, cryptography has become more important than ever. It is used in everything from e-commerce to social media to secure online communication [ 34 ]. As technology continues to evolve, so will the field of cryptography, and new techniques and algorithms will be developed to stay ahead of emerging threats. The future of cryptography holds great promise as researchers work to develop quantum-resistant encryption and new methods for securing blockchain technology. As we rely more and more on digital communication and storage, the role of cryptography in securing our data will only become more critical.
7.1 Quantum cryptography
Quantum computers have the potential to break many of the current cryptographic schemes that rely on the difficulty of certain mathematical problems [ 35 ]. Post-quantum cryptography aims to develop new cryptographic schemes that are resistant to attacks by quantum computers [ 36 ], while quantum cryptography makes use of the principles of quantum mechanics themselves to provide a high level of security and to protect information in transit.
In traditional cryptography, the security of the system relies on the complexity of mathematical algorithms, while in quantum cryptography, the security relies on the laws of physics. In particular, quantum cryptography exploits quantum-mechanical properties such as the no-cloning theorem and, in some protocols, quantum entanglement, which involves the correlation of quantum states between two particles.
The most widely known application of quantum cryptography is quantum key distribution (QKD) [ 37 ]. QKD is a protocol that enables two parties to establish a shared secret key that is completely secure against eavesdropping, even by an attacker with unlimited computing power. QKD works by transmitting a series of quantum states, or qubits, between two parties, typically named Alice and Bob. The qubits are generated using a laser and a polarizer. Alice sends a random sequence of polarizations to Bob, who measures the polarizations using his own set of polarizers. By comparing the polarizations, Alice and Bob can detect the presence of an eavesdropper.
There are many challenges to overcome before quantum cryptography can be widely adopted. One of the main challenges is the difficulty of building practical quantum cryptography systems, which require precise control of the quantum states involved. Additionally, there is a need for more research in quantum computing, as well as a need for new protocols that can be used to secure communications in different contexts.
7.2 Homomorphic encryption
Homomorphic encryption is another type of encryption that allows computation to be performed on ciphertext [ 38 ], which means that data can be encrypted and manipulated without the need to decrypt it first. In other words, it enables computations to be performed on data without revealing the data itself. This is a significant breakthrough in the field of cryptography as it allows for secure computation and data analysis without compromising privacy [ 39 ]. Homomorphic encryption has numerous applications in various fields such as finance, healthcare, and cloud computing [ 40 ]. For instance, it can be used to perform secure data analysis on sensitive data [ 41 ], such as medical records, without the need to reveal the data to unauthorized parties. It can also be used in cloud computing to protect data privacy while still allowing for secure computation in the cloud.
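A tiny illustration of the idea: unpadded ("textbook") RSA is multiplicatively homomorphic, so multiplying two ciphertexts yields a ciphertext of the product of the plaintexts. Fully homomorphic schemes generalize this to arbitrary computations; the parameters below reuse the toy RSA values from earlier and are for illustration only.

```python
# Multiplicative homomorphism of textbook RSA: E(m1) * E(m2) mod n == E(m1 * m2).
n, e, d = 3233, 17, 2753
m1, m2 = 7, 9

c1, c2 = pow(m1, e, n), pow(m2, e, n)
c_product = (c1 * c2) % n                  # computed on ciphertexts only
assert pow(c_product, d, n) == (m1 * m2) % n
print(pow(c_product, d, n))                # 63, obtained without decrypting c1 or c2 individually
```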
7.3 Block chain cryptography
Blockchain-based cryptography is a critical component of blockchain technology, which is widely used in various fields such as finance, healthcare, and supply chain management [ 42 ]. A blockchain is a distributed ledger that records transactions in a secure and transparent manner, and cryptography is used to ensure the confidentiality, integrity, and authenticity of the data stored in the blockchain network.
One of the essential cryptographic techniques used in blockchain is the digital signature. A digital signature is a mathematical scheme that validates the authenticity and integrity of a message or data. Digital signatures are used to verify transactions in the blockchain network, ensuring that the sender is the actual owner of the assets and preventing any tampering of the data [ 42 ].
Another critical cryptographic technique used in the blockchain is hash functions. Hash functions are used to create a unique digital fingerprint of data stored in the blockchain network. This unique digital fingerprint, also known as a hash value, ensures that the data is tamper-proof and cannot be altered without being detected.
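The tamper-evidence this provides can be seen in a miniature hash chain, where each block stores the hash of its predecessor; this is a simplified sketch, not an actual blockchain implementation.

```python
import hashlib, json

# Each block records the hash of the previous block, so altering any earlier block
# changes every subsequent hash and is immediately detectable.
def block_hash(block):
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

chain = [{"index": 0, "data": "genesis", "prev": "0" * 64}]
for i, data in enumerate(["tx: A->B 5", "tx: B->C 2"], start=1):
    chain.append({"index": i, "data": data, "prev": block_hash(chain[-1])})

def is_valid(chain):
    return all(chain[i]["prev"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))

print(is_valid(chain))         # True
chain[1]["data"] = "tx: A->B 500"
print(is_valid(chain))         # False: block 2's stored hash no longer matches the tampered block 1
```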
Blockchain technology also employs public-key cryptography, which is a cryptographic technique that uses a pair of keys, one public and one private. Public keys are used to encrypt data, while private keys are used to decrypt data. This technique ensures the confidentiality and security of data stored in the blockchain network.
Blockchain-based cryptography plays a vital role in ensuring the security and transparency of data stored in the blockchain network. As blockchain technology continues to evolve, we can expect to see new cryptographic techniques and algorithms that will further enhance the security and efficiency of blockchain-based applications.
7.4 Multiparty computation
Multiparty computation (MPC) is a cryptographic technique that enables a group of parties to jointly compute a function on their private inputs, without revealing those inputs to each other or to any third party. This technique allows parties to collaborate and compute a result without sharing their individual data, which can be particularly useful in scenarios where data privacy is critical, such as in financial transactions or medical research [ 43 ].
Each party inputs its private data into the system, which then generates a shared output based on the combined inputs of all parties. The protocol ensures that no individual party can learn anything about the private inputs of any other party, and the final output is only known to those parties who have contributed inputs.
MPC has many practical applications, including secure auctions, electronic voting systems, and privacy-preserving data analysis. However, it can be computationally expensive, especially when the number of parties and the complexity of the function being computed increase. Despite these challenges, MPC is a powerful tool for achieving secure collaboration and computation among multiple parties [ 44 ].
7.5 Lightweight cryptography
Lightweight cryptography refers to a subset of cryptographic algorithms that are specifically designed to operate efficiently on low-resource devices such as smart cards, RFID tags, and wireless sensor nodes. These devices often have limited processing power, memory, and energy resources, making it challenging to implement traditional cryptographic algorithms on them. Lightweight cryptography aims to address these challenges by developing cryptographic algorithms that have low computational and memory requirements, while still providing a reasonable level of security.
The development of lightweight cryptography has become increasingly important with the proliferation of the Internet of Things (IoT) and other low-power, low-cost devices. These devices are becoming more prevalent in our daily lives, and many of them require secure communication and authentication. Lightweight cryptography can provide a practical and efficient solution for securing these devices, without sacrificing security. Some examples of lightweight cryptography algorithms include SIMON and SPECK block ciphers, which were designed by the National Security Agency (NSA) for use in constrained environments. Another example is the lightweight version of the advanced encryption standard (AES), known as AES-Lite. These algorithms have been adopted by various standardization bodies and are widely used in industry for securing low-resource devices.
8. Conclusions
Cryptography is a critical aspect of modern information security. It has evolved significantly over time, from basic substitution ciphers to sophisticated algorithms that provide secure communication and transactions. Today, we have various types of cryptographic schemes, including symmetric and asymmetric encryption, hash functions, digital signatures, homomorphic encryption, and multiparty computation. The development of lightweight cryptography has also enabled secure communication and transactions on low-power devices such as IoT devices. As technology continues to advance, the field of cryptography will play an increasingly vital role in ensuring secure communication and transactions in an interconnected world. The future of cryptography is exciting and promising, and we can expect to see more innovations that will enhance the security and privacy of our digital world.
- 1. Bruce S. Applied cryptography: protocols, algorithms, and source code in C. 2nd ed. Hoboken, New Jersey: John Wiley & Sons; 1996
- 2. Diffie W, Hellman ME. Multiuser cryptographic techniques. In: Proceedings of the June 7-10, 1976, national computer conference and exposition. ACM Digital Library; 1976. pp. 109-112
- 3. Blakley GR, Borosh I. Rivest-Shamir-Adleman public key cryptosystems do not always conceal messages. Computers & Mathematics with Applications. 1979; 5 :169-178
- 4. Rescorla E. Diffie-Hellman Key Agreement Method. 2070-1721, 1999
- 5. Sobti R, Geetha G. Cryptographic hash functions: A review. International Journal of Computer Science Issues (IJCSI). 2012; 9 :461
- 6. Rogaway P, Shrimpton T. Cryptographic hash-function basics: Definitions, implications, and separations for preimage resistance, second-preimage resistance, and collision resistance. In: FSE, 2004, Lecture Notes in Computer Science. Vol. 3017. Springer Verlag; 2004. pp. 371-388
- 7. Menezes AJ, van Oorschot PC, Vanstone SA. Handbook of applied cryptography (202101 ed.). 2021; 1 :1-810
- 8. Wong D. Real-world cryptography. Shelter Island, NY: Manning Publications; 2021
- 9. Chaubey NK, Prajapati BB. Quantum cryptography and the future of cyber security. Hershey, PA: IGI Global; 2020. DOI: 10.4018/978-1-7998-2253-0
- 10. Poongothai T, Jayarajan K, Rajeshkumar G, Patra P. Blockchain technology in healthcare applications. Journal of Critical Reviews. 2020; 7 :8701-8707
- 11. Bertaccini M. Cryptography algorithms: A guide to algorithms in blockchain, quantum cryptography, zero-knowledge protocols, and homomorphic encryption. Birmingham, UK: Packt Publishing, Limited; 2022. DOI: 10.1007/978-183882-844-4
- 12. Singh S. The Code Book. Vol. 7. New York: Doubleday; 1999
- 13. Davies D. A brief history of cryptography. Information Security Technical Report. 1997; 2 :14-17
- 14. Mendelsohn CJ. Blaise de Vigenère and the “Chiffre Carré”. In: Proceedings of the American Philosophical Society. 1940; 83 (4):103-129
- 15. Schrödel T. Breaking short Vigenère ciphers. Cryptologia. 2008; 32 :334-347
- 16. Wade NJ. Charles Wheatstone (1802–1875). ed: SAGE Publications ed. Vol. 31. London, England: Sage UK; 2002. pp. 265-272
- 17. Kruh L. Cipher equipment. Cryptologia. 1977; 1 :143-149
- 18. Smart NP, Smart NP. The enigma machine. Cryptography Made Simple. 2016; 64 (2):133-161
- 19. Sidhu A. Analyzing modern cryptography techniques and reviewing their timeline. Security and Communication Networks. 2023; 10 :1-18
- 20. Stamp M. Information security: principles and practice. Hoboken, NJ: John Wiley & Sons; 2011
- 21. Smid ME. Development of the advanced encryption standard. Journal of Research of the National Institute of Standards and Technology. 2021; 126 :1-18
- 22. Daemen J, Rijmen V. AES proposal: Rijndael. National Institute of Standards and Technology; 1999
- 23. Schneier B. Description of a new variable-length key, 64 bit block cipher (Blowfish). In: Fast Software Encryption: Cambridge Security Workshop Cambridge, UK, December 9 11, 1993 Proceedings. Berlin, Heidelberg: Springer; 2005. pp. 191-204
- 24. Schneier B. The twofish encryption algorithm. Dr Dobb's Journal: Software Tools for the Professional Programmer. 1998; 23 :30-34
- 25. Rivest RL. The RC4 encryption algorithm, 1992. Vol. 25. RSA Data Security Inc.; 2016. pp. 1-23.
- 26. Rivest RL, Shamir A, Adleman L. A method for obtaining digital signatures and public-key cryptosystems. Communications of the ACM. 1978; 21 :120-126
- 27. Hellman M. New directions in cryptography. IEEE Transactions on Information Theory. 1976; 22 :644-654
- 28. Lenstra HW. Factoring integers with elliptic curves. Annals of Mathematics. 1987; 126 (3):649-673
- 29. Pachghare V. Cryptography and information security. Noida, Uttar Pradesh, India: PHI Learning Pvt. Ltd.; 2019
- 30. Katz J, Lindell Y. Introduction to modern cryptography. Boca Raton, FL: CRC Press; 2020
- 31. Li Y. Design and analysis of cryptographic protocols [Dissertation], 2015. Bochum: Ruhr-Universität Bochum; 2016
- 32. Carts DA. A review of the Diffie-Hellman algorithm and its use in secure internet protocols. SANS Institute; 2001; 751 :1-7
- 33. Medina R III. Systems and Methods for Digital Signature Detection. Google Patents; 2015
- 34. Tarawneh M, AlZyoud F, Sharrab Y, Kanaker H. Secure E-health framework in cloud-based environment. In: 2022 International Arab Conference on Information Technology (ACIT). IEEE; 2022. pp. 1-5
- 35. Subramani S, Svn SK. Review of security methods based on classical cryptography and quantum cryptography. Cybernetics and Systems. 2023; 54 (1):1-19
- 36. Mavroeidis V, Vishi K, Zych MD, Jøsang A. The impact of quantum computing on present cryptography. arXiv Preprint arXiv:1804.00200. 2018
- 37. Renner R. Security of quantum key distribution. International Journal of Quantum Information. 2008; 6 :1-127
- 38. Lauter KE, Dai W, Laine K. Protecting privacy through homomorphic encryption. Cham, Switzerland: Springer; 2022
- 39. Doan TVT, Messai M-L, Gavin G, Darmont J. A survey on implementations of homomorphic encryption schemes. The Journal of Supercomputing. 2023; 79 :15098-15139
- 40. Chatterjee A, Aung KMM. Fully homomorphic encryption in real world applications. Singapore: Springer; 2019
- 41. Viand A, Knabenhans C, Hithnawi A. Verifiable fully homomorphic encryption. arXiv Preprint arXiv:2301.07041. 2023
- 42. Bolfing A. Cryptographic Primitives in Blockchain Technology: A Mathematical Introduction. New York, USA: Oxford University Press; 2020
- 43. Goldreich O. Secure multi-party computation. Manuscript. Preliminary version. 1998; 78 :1-78
- 44. Darby ML, Nikolaou M. MPC: Current practice and challenges. Control Engineering Practice. 2012; 20 :328-342
© 2023 The Author(s). Licensee IntechOpen. This chapter is distributed under the terms of the Creative Commons Attribution 3.0 License , which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
- Perspective
- Published: 11 May 2022
Transitioning organizations to post-quantum cryptography
- David Joseph ORCID: orcid.org/0000-0002-6040-4061 1 ,
- Rafael Misoczki 2 ,
- Marc Manzano 1 ,
- Joe Tricot 1 ,
- Fernando Dominguez Pinuaga 1 ,
- Olivier Lacombe 2 ,
- Stefan Leichenauer 1 ,
- Jack Hidary 1 ,
- Phil Venables 2 &
- Royal Hansen 2
Nature volume 605, pages 237–243 (2022)
Quantum computers are expected to break modern public key cryptography owing to Shor’s algorithm. As a result, these cryptosystems need to be replaced by quantum-resistant algorithms, also known as post-quantum cryptography (PQC) algorithms. The PQC research field has flourished over the past two decades, leading to the creation of a large variety of algorithms that are expected to be resistant to quantum attacks. These PQC algorithms are being selected and standardized by several standardization bodies. However, even with the guidance from these important efforts, the danger is not gone: there are billions of old and new devices that need to transition to the PQC suite of algorithms, leading to a multidecade transition process that has to account for aspects such as security, algorithm performance, ease of secure implementation, compliance and more. Here we present an organizational perspective of the PQC transition. We discuss transition timelines, leading strategies to protect systems against quantum attacks, and approaches for combining pre-quantum cryptography with PQC to minimize transition risks. We suggest standards to start experimenting with now and provide a series of other recommendations to allow organizations to achieve a smooth and timely PQC transition.
Data availability.
The datasets analysed in the report are available from SUPERCOP at https://bench.cr.yp.to/supercop.html . Source data are provided with this paper.
Author information
Authors and affiliations.
SandboxAQ, Palo Alto, CA, USA
David Joseph, Marc Manzano, Joe Tricot, Fernando Dominguez Pinuaga, Stefan Leichenauer & Jack Hidary
Google, Mountain View, CA, USA
Rafael Misoczki, Olivier Lacombe, Phil Venables & Royal Hansen
Contributions
D.J., R.M. and M.M. drafted the paper and provided technical expertise. J.T., F.D.P., O.L., P.V. and S.L. participated in extensive discussions, providing business and organizational perspectives and edits, and J.H. and R.H. drove the project from an executive level, helping to gather resources, provide direction and edit the manuscript. A substantial part of this paper was written while all the authors were a part of Alphabet.
Corresponding author
Correspondence to David Joseph .
Ethics declarations
Competing interests.
The authors declare no competing interests.
Peer review
Peer review information.
Nature thanks Tanja Lange and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.
About this article
Cite this article.
Joseph, D., Misoczki, R., Manzano, M. et al. Transitioning organizations to post-quantum cryptography. Nature 605 , 237–243 (2022). https://doi.org/10.1038/s41586-022-04623-2
Download citation
Received : 18 May 2021
Accepted : 08 March 2022
Published : 11 May 2022
Issue Date : 12 May 2022
DOI : https://doi.org/10.1038/s41586-022-04623-2
Evaluating the merits and constraints of cryptography-steganography fusion: a systematic analysis
- Regular Contribution
- Open access
- Published: 05 May 2024
- Volume 23 , pages 2607–2635, ( 2024 )
- Indy Haverkamp 1 &
- Dipti K. Sarmah 1
In today's interconnected world, safeguarding digital data's confidentiality and security is crucial. Cryptography and steganography are two primary methods used for information security. While these methods have diverse applications, there is ongoing exploration into the potential benefits of merging them. This review focuses on journal articles from 2010 onwards and conference papers from 2018 onwards that integrate steganography and cryptography in practical applications. The results are gathered through different databases like Scopus, IEEE, and Web of Science. Our approach involves gaining insights into real-world applications explored in the existing literature and categorizing them based on domains and technological areas. Furthermore, we comprehensively analyze the advantages and limitations associated with these implementations, examining them from three evaluation perspectives: security, performance, and user experience. This categorization offers guidance for future research in unexplored areas, while the evaluation perspectives provide essential considerations for analyzing real-world implementations.
1 Introduction
Our daily lives are becoming increasingly linked with the digital realm, encompassing various activities such as messaging, cloud data storage, and financial transactions. Ensuring the security and confidentiality of this data is vital. Cryptography and steganography, two essential sciences of information security [ 74 , 77 ], offer solutions to render messages unintelligible to eavesdroppers and imperceptible to detection, respectively. These techniques play a crucial role in protecting sensitive information. Both fields serve the purpose of ensuring the confidentiality of data [ 69 ], albeit in different ways: cryptography shields the content of a message through the use of encryption keys, ensuring its protection, whereas steganography conceals the very presence of the message within a "cover" medium [ 74 ]. While cryptography finds extensive usage in various everyday applications, both techniques have their respective domains of application and can potentially be combined for enhanced security measures. Steganography encompasses a wide range of techniques and can be applied in different forms, such as images, audio, video, and text, to many applications, for example, IoT communication [ 7 , 21 , 39 ], military [ 71 ], cloud storage [ 2 , 18 , 46 , 67 ], and more [ 28 , 31 , 32 , 89 , 93 ]. The growth of interest in steganography was sparked in two ways: the multimedia industry could greatly benefit from possible watermarking techniques, and government restrictions on cryptographic schemes triggered interest in alternative ways to keep communication secret [ 8 ] (Fig. 1).
Graph of published journal articles and conference papers on Scopus ( https://www.scopus.com/ —with query: ("cryptography" AND "steganography") AND ("application" OR "real-world") AND ("security" OR "cyberattack" OR "cybersecurity")) from 1996 to June 2023
Figure 1 visually represents the exponential growth of publications focusing on the applications of combining steganography and cryptography, as observed in Scopus. This trend highlights the increasing interest in merging or comparing these two disciplines within the research community. While the combination of multiple security mechanisms may appear advantageous, it is important to note that the suitability of combining cryptography with steganography can vary. Several factors, including bandwidth availability [ 37 , 81 ] and latency considerations [ 88 ], can influence the feasibility of such integration. For instance, incorporating additional layers of security may result in increased data size, potentially exceeding the available bandwidth and causing slower transmission speeds. Interestingly, the computational complexity of a combined approach does not always exhibit a linear increase. A notable example is presented in [ 25 ], where steganography with Diffie-Hellman encryption demonstrated the same time complexity as steganography alone. However, when using RSA [ 91 ] encryption, a higher time complexity was observed [ 25 ]. Therefore, the choice between these techniques heavily relies on the specific security requirements of the given situation and the particular types of cryptography and steganography employed. In this paper, "a combined approach" refers to the combined use of steganography and cryptography; 'method' and 'scheme' refer interchangeably to a paper's combined implementation.
Data gathering and study selection processes of both literature searches
As the number of systems requiring protection from cyberattacks continues to rise, the exploration of applications where steganography and cryptography can be combined becomes increasingly intriguing. Nonetheless, to identify potential areas for improvement or future research, it is imperative to gain a comprehensive understanding of the current state of research in this field.
The goal of this research is threefold, as outlined below; these goals also help to formulate the research questions.
The research conducts a systematic literature review that aims to bring a novel perspective by identifying and analyzing papers on the combined application of cryptography and steganography across real-world applications.
The research categorizes these applications along two dimensions: their application domain (e.g., Medical or Transportation) and their technological domain (e.g., Cloud Computing or Internet of Things).
The research also explores several relevant studies to identify the advantages, limitations, and trade-offs discussed in the existing literature and gain insight into how the performance of these combined implementations can be effectively analyzed.
The findings derived from this comprehensive review yield valuable insights into the current research landscape, contributing to advancements in fortifying systems against cyber threats. Consequently, these findings prompt the formulation of the following research questions, which further drive exploration and inquiry in this field. The primary research question focuses on exploring the advantages and limitations of utilizing a combined steganography and cryptography approach in diverse real-world applications as a means to enhance security against cyberattacks on a system.
To address this primary question, three key sub-questions necessitate analysis:
What are the various real-world applications where combined steganography and cryptography approaches can be used? (RQ1)
What are the advantages, limitations, and trade-offs of using a combined approach in these applications? (RQ2)
How are implementations of a combined approach evaluated across different real-world applications? (RQ3)
By addressing these sub-questions, a comprehensive understanding of the benefits, constraints, and evaluation methods surrounding the combined application of steganography and cryptography can be obtained, leading to significant insights for bolstering system security against cyber threats.
This paper is organized as follows. Section 1 is this Introduction. Section 2 discusses the background and related work on steganography and cryptography techniques as well as evaluation methods. Section 3 elaborates on the research methodology, including the search strategy for the literature review, the databases used to collect resources, and the tools used to optimize the efficiency of the review process. The results are presented in Sect. 4, which covers the different types of applications and the approaches used to categorize them, explores their limitations and advantages, and analyzes these methods to provide insights into the combination of cryptography and steganography in terms of security, performance, and user perspective. Section 5 gives the concluding remarks and presents the future scope of the research. References are listed at the end of the paper.
2 Background and related work
Organizations, researchers, and end users show high interest in the sciences of steganography and cryptography for enhancing security across different applications and domains. In this research, we analyzed several papers that focus on the combination of cryptography and steganography to identify the real gaps and the pros and cons of combining both sciences. For that, we focused on several relevant applications. One important and interesting application lies in the medical domain, where Bhardwaj [ 13 ] addresses the critical challenge of ensuring patient data privacy and security in telemedicine applications. The author proposes an enhanced reversible data-hiding technique operating in the encrypted domain. The proposed algorithm embeds secret messages in hexadecimal form and utilizes four binary bits of electronic patient information (EPI) in each cover image block, ensuring secure communication. Moreover, this approach mitigates underflow and overflow problems, enabling precise information embedding even in low-intensity pixel areas.
On the other side, the research [ 22 ] discusses the growing challenge of securing medical data in healthcare applications due to the expanding presence of the Internet of Things (IoT). They propose a hybrid security model combining cryptography and steganography to protect diagnostic text data within medical images. The encryption process precedes the embedding of encrypted data into cover images, both color and grayscale, to accommodate varying text sizes. Performance evaluation based on six statistical parameters indicates promising results, with PSNR values ranging from 50.59 to 57.44 for color images and from 50.52 to 56.09 for grayscale images. The proposed model demonstrates its effectiveness in securely hiding confidential patient data within cover images while maintaining high imperceptibility and capacity, with minimal degradation in the received stego-image.
Further, the research [ 34 ] notes that as the elderly population grows and heart disease becomes more common, hospitals worldwide are expected to adopt remote electrocardiogram (ECG) patient monitoring systems. These systems gather large volumes of ECG signals from patients at home, along with other health measurements such as blood pressure and temperature, for analysis by remote monitoring systems. Keeping patient information safe while it is transmitted over public networks and stored on hospital servers is therefore crucial. The study introduces a wavelet-based technique for hiding patient data inside ECG signals, combining encryption and scrambling to protect the data. The technique embeds patient information into ECG signals without noticeably altering them: tests show that the distortion introduced is below 1% and that the ECG readings remain diagnostically usable, so clinicians can still interpret the ECGs after the hidden patient information is extracted, keeping medical data both private and accurate.
Furthermore, the paper [ 41 ] proposes a novel steganography technique in their work, aiming to enhance the security of data transmission in telemedicine applications. This technique involves concealing patient information within medical images using a dynamically generated key, determined through graph coloring and the pixel count of the cover image. By combining steganography with cryptography, the patient information is encrypted using the RSA algorithm to strengthen security measures. Notably, this proposed method ensures reversibility, allowing for the lossless restoration of original medical images after data extraction from the stego medical image. Experimental evaluations demonstrate the efficacy of this approach, showcasing its superior security compared to alternative information hiding methods, particularly in terms of key generation complexity and the quality of restored images as measured by Peak Signal-to-Noise Ratio (PSNR) and Mean Square Error (MSE).
The researchers in [ 45 ] also worked along similar lines to enhance robust security measures in the handling of medical images, particularly when sensitive patient records are involved. To address this, a 128-bit secret key is generated based on the histogram of the medical image. Initially, the digital imaging and communications in medicine (DICOM) image undergoes a decomposition process to extract its sensitive features. The resulting image is then divided into blocks dependent on the generated key, followed by key-dependent diffusion and substitution processes. Encryption is performed over five rounds to ensure robust security. Subsequently, the secret key is embedded within the encrypted image using steganography, further enhancing the security of the proposed cipher. At the receiver's end, the secret key is extracted from the embedded image and decryption is carried out in reverse.
An innovative approach is presented in this paper [ 48 ], proposing an integrated method that combines cryptography and steganography to bolster data security in automotive applications. In this technique, data is first encrypted using a modified RSA cryptographic algorithm, and the encrypted data is then embedded along the edges of an image using the Least Significant Bit (LSB) technique. Edge detection [ 55 ] is accomplished using a fuzzy logic approach. This integrated approach is primarily designed for applications such as Diagnostics over Internet Protocol (DoIP) and Software Updates over the Air (SOTA), which involve the exchange of highly sensitive data. Additionally, the authenticity of the source of software updates is verified using a Hash Algorithm in SOTA.
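To make the encrypt-then-embed idea concrete, the sketch below shows a generic least-significant-bit (LSB) embedder in Python/NumPy that writes payload bits (for example, ciphertext bits produced by the cryptographic layer) into caller-supplied pixel positions, such as pixels flagged as edges by a detector. This is a minimal illustration under our own assumptions (an 8-bit greyscale cover image, positions already chosen), not the modified-RSA and fuzzy-logic edge scheme of [ 48 ].

```python
import numpy as np

def embed_bits_lsb(image: np.ndarray, bits: list, positions: list) -> np.ndarray:
    """Write one payload bit into the least significant bit of each selected pixel."""
    stego = image.copy()
    for bit, (row, col) in zip(bits, positions):
        # Clear the LSB of the pixel, then set it to the payload bit.
        stego[row, col] = (int(stego[row, col]) & 0xFE) | (bit & 1)
    return stego

def extract_bits_lsb(stego: np.ndarray, positions: list) -> list:
    """Recover the embedded bits from the same pixel positions."""
    return [int(stego[row, col]) & 1 for row, col in positions]
```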
Additionally, this paper [ 57 ] introduces a technique for encrypting and decrypting patient medical details, medical images, text, and pictorial forms using unique algorithms, aligning with the literature discussed above in the medical field. However, this research enhances security through the utilization of chaotic signals [ 59 ]. Signal generation and analysis are conducted using Matlab 7.10, demonstrating the efficacy of this method. In similar lines, the paper by Parah et al. [ 61 ] introduces a novel and reversible data hiding scheme tailored for e-healthcare applications, emphasizing high capacity and security. The Pixel to Block (PTB) conversion technique is employed to generate cover images efficiently, ensuring the reversibility of medical images without the need for interpolation. To enable tamper detection and content authentication at the receiver, a fragile watermark and Block Checksum are embedded in the cover image, computed for each 4 × 4 block. Intermediate Significant Bit Substitution (ISBS) is utilized to embed Electronic Patient Records (EPR), watermark, and checksum data, preventing LSB removal/replacement attacks. Evaluation of the scheme includes perceptual imperceptibility and tamper detection capability under various image processing and geometric attacks. Experimental results demonstrate the scheme's reversibility, high-quality watermarked images, and effective tamper detection and localization.
In this research [ 75 ], the authors propose a new and secure steganography-based End-to-End (E2E) verifiable online voting system to address issues within the voting process. This research introduces a novel approach to online voting by integrating visual cryptography with image steganography to bolster system security while maintaining system usability and performance. The voting system incorporates a password-hashing-based scheme and a threshold decryption scheme for added security measures. Further, the research [ 78 ] discusses the advantages of combining both steganography and cryptography to achieve more secure communication. Initially, the Advanced Encryption Standard (AES) algorithm is adapted and employed to encrypt the secret message. Subsequently, the encrypted message is concealed using a steganography method. This hybrid technique ensures dual-layered security, offering both high embedding capacity and quality stego images for enhanced data protection.
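As a rough illustration of the dual-layer idea (encrypt first, then hide), the sketch below uses AES-GCM from the widely used `cryptography` Python package for the first layer; the resulting bytes would then be converted to a bit stream and passed to a steganographic embedder such as the LSB routine sketched above. This is a generic pipeline under our own assumptions, not the adapted AES of [ 78 ].

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_for_embedding(message: bytes, key: bytes) -> bytes:
    """Cryptographic layer of a generic encrypt-then-embed pipeline: AES-GCM
    with a fresh 96-bit nonce prepended so the extractor can decrypt later."""
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, message, None)
    return nonce + ciphertext

# Example usage (key management is out of scope here):
# key = AESGCM.generate_key(bit_length=256)
# payload = encrypt_for_embedding(b"secret message", key)
# bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
```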
Furthermore, the authors in [ 87 ] introduce a novel Reversible data hiding in encrypted images (RDHEI) scheme leveraging the median edge detector (MED) and a hierarchical block variable length coding (HBVLC) technique. In this approach, the image owner predicts the pixel values of the carrier image with MED, followed by slicing the prediction error array into bit-planes and encoding them plane by plane. Experimental results demonstrate that the proposed scheme not only restores secret data and the carrier image without loss but also surpasses state-of-the-art methods in embedding rate across images with diverse features.
The papers discussed so far primarily centered on application domains. In contrast, we also examined several papers that focus primarily on technological domains. This paper [ 1 ] presents the Circle Search Optimization with Deep Learning Enabled Secure UAV Classification (CSODL-SUAVC) model tailored for Industry 4.0 environments. The CSODL-SUAVC model aims to achieve two core objectives: secure communication via image steganography and image classification. The proposed methodology involves Multi-Level Discrete Wavelet Transformation (ML-DWT), CSO-related Optimal Pixel Selection (CSO-OPS), and signcryption-based encryption. The proposed CSODL-SUAVC model is experimentally validated using benchmark datasets, demonstrating superior performance compared to recent approaches across various evaluation aspects.
In their paper [ 5 ], the authors introduce an improved system designed to safeguard sensitive text data on personal computers by combining cryptography and steganography techniques. The system's security is fortified by employing RSA cryptography followed by audio-based steganography. The study includes system modeling and implementation for testing, aimed at exploring the relationship between security, capacity, and data dependency. Experimentation involved securing data within 15 differently sized audio files, yielding insightful results.
Additionally, the research in [ 7 ] discusses the promising growth of the Internet of Things (IoT) and the prevalent use of digital images, which has resulted in an increased adoption of image steganography and cryptography. However, current systems encounter challenges related to security, imperceptibility, and capacity. In response, they propose a new Crypt-steganography scheme that integrates three primary elements: hybrid additive cryptography (HAC) for secure message encryption, the bit interchange method (BIGM) to ensure imperceptibility during embedding, and a novel image partitioning method (IPM) for enhanced randomness in pixel selection. Evaluations confirm the scheme's effectiveness in addressing these challenges.
Also, the authors of [ 9 ] presented a novel approach to safeguard data on the cloud using Reversible Data Hiding in an Encrypted Image (RDHEI), coupled with homomorphic encryption and a rhombus pattern prediction scheme. With this method, third parties can perform data-hiding operations on encrypted images without knowledge of the original content, ensuring high-level security. The proposed method demonstrates strong protective measures, as evidenced by experimentations. Additionally, the approach enables seamless image recovery and covert extraction.
Further, in this paper, the authors [ 10 ], explore how malicious Android applications are evading detection by hiding within images using techniques like Concatenation, Obfuscation, Cryptography, and Steganography. They assess the vulnerability of ten popular Android anti-malware solutions to these methods. Surprisingly, only one solution detected two hiding techniques, while the others remained blind to all eight. This evaluation offers insights into the evolving landscape of Android malware and the effectiveness of current detection systems.
Insufficient security measures in data transmission lead to problems with data integrity, confidentiality, and data loss, especially with big data. Executing multiple security algorithms reduces throughput and increases security overhead, impacting robustness against data loss. Conversely, compression techniques sacrifice data confidentiality. Existing studies lack comprehensive security policies to address these concerns collectively. Therefore, the authors in their paper [ 14 ] propose an integrated approach to enhance confidentiality and provide backup for accidental data loss by combining the simplified data encryption standard (SDES) and advanced pattern generation. A novel error control technique maximizes data integrity against transmission errors. A new compression method improves robustness against data loss while maintaining efficiency. Enhanced confidentiality and integrity are achieved through advanced audio steganography. Implementing this integrated technique in a GPU environment accelerates execution speed and reduces complexity. Experiments validate the method's effectiveness in ensuring data confidentiality and integrity, outperforming contemporary approaches.
By covering one more technological aspect, the research [ 19 ] introduces a secure Near Field Communication (NFC) smartphone access system using digital keys and Encrypted Steganography Graphical Passwords (ESGP). User perceptions and intentions are evaluated through experiments and surveys, emphasizing security as a key factor in adopting NFC ESGP systems. This offers valuable insights for enhancing security through two-factor authentication on NFC-enabled smartphones.
Further, Fog computing, an intriguing domain bridging the cloud and the Internet of Things (IoT), necessitates a secure communication channel to prevent attacks. In the paper [ 33 ], the strengths and weaknesses of hybrid strategies of cryptography and steganography in fog environments, where real-time transmission is crucial, are discussed. This paper presents a novel Fog-Based Security Strategy (FBS2) that integrates cryptography and steganography techniques. The Cryptography Technique (PCT) entails two phases: confusion and diffusion, which scramble and modify pixel values of secret images using innovative methodologies. The Steganography Technique utilizes the discrete wavelet packet transform, employing a new matching procedure based on the most significant bits of encrypted secret images and cover image pixels. Experimental results illustrate FBS2's superiority in efficiency, security, and processing time, making it well suited for fog environments.
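The confusion and diffusion phases mentioned above can be illustrated with a textbook-style sketch: confusion permutes pixel positions under a key-derived seed, and diffusion chains each pixel with its predecessor so that a small change in the input spreads through the whole output. This is a simplified illustration, not the PCT of FBS2 [ 33 ]; the seed-based permutation is our own stand-in for proper key scheduling.

```python
import numpy as np

def confuse_and_diffuse(image: np.ndarray, seed: int) -> np.ndarray:
    """Confusion: scramble pixel positions with a keyed permutation.
    Diffusion: XOR-chain each pixel with the previous one so local
    changes propagate through the whole cipher image."""
    rng = np.random.default_rng(seed)          # stand-in for key scheduling
    flat = image.astype(np.uint8).flatten()
    confused = flat[rng.permutation(flat.size)]
    diffused = confused.copy()
    for i in range(1, diffused.size):
        diffused[i] ^= diffused[i - 1]
    return diffused.reshape(image.shape)
```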
Furthermore, the paper [ 36 ] explores Industry 5.0, which merges human and machine capabilities to meet complex manufacturing demands through optimized robotized processes. Industry 5.0 utilizes collaborative robots (cobots) for improved productivity and safety, while unmanned aerial vehicles (UAVs) are expected to have a significant role. Despite UAVs' advantages like mobility and energy efficiency, challenges such as security and reliability persist. To address this, the article presents AIUAV-SCC, an artificial intelligence-based framework tailored for Industry 5.0. It consists of two main phases: image steganography-based secure communication and deep learning (DL)-based classification. Initially, a new image steganography technique is employed, integrating multilevel discrete wavelet transformation, quantum bacterial colony optimization, and encryption processes.
Another interesting method is proposed in the research [ 85 ] for encrypting digital images using a special type of mathematical system called a chaotic system. Chaotic systems have properties that make them very difficult to predict and control, which is useful for encryption. The method proposed in this paper uses a specific type of chaotic system called the two-dimensional Hénon-Sine map (2D-HSM), which has been designed to be more effective than other chaotic systems for this purpose. Additionally, the method incorporates a technique inspired by DNA to further enhance the encryption process. This new encryption scheme aims to protect images when they are sent over the Internet. The paper presents experimental tests to show that this scheme performs better than other methods in terms of security and resistance to attacks.
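A minimal sketch of chaos-based image encryption is shown below, using the classic logistic map \( x_{n+1} = r x_{n} (1 - x_{n}) \) as the keystream source. The cited paper uses the more elaborate two-dimensional Hénon-Sine map and a DNA-inspired step, so this sketch only conveys the general idea, with parameter choices that are ours.

```python
import numpy as np

def logistic_keystream(length: int, x0: float, r: float = 3.99) -> np.ndarray:
    """Generate a byte keystream from the logistic map; (x0, r) act as the key."""
    x = x0
    stream = np.empty(length, dtype=np.uint8)
    for i in range(length):
        x = r * x * (1.0 - x)
        stream[i] = int(x * 256) % 256
    return stream

def chaotic_xor(image: np.ndarray, x0: float) -> np.ndarray:
    """XOR every pixel with the keystream; applying the same call twice decrypts."""
    flat = image.astype(np.uint8).flatten()
    return (flat ^ logistic_keystream(flat.size, x0)).reshape(image.shape)
```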
Furthermore, advanced cloud computing is considered one of the prominent technologies offering cost-saving benefits and flexible services. With the increasing volume of multimedia data, many data owners choose to outsource their data to the cloud. However, this trend raises privacy concerns, as users relinquish control over their data. To address these concerns, reversible data hiding schemes for encrypted image data in cloud computing have been proposed by [ 86 ]. These schemes aim to ensure data security without relying on the trustworthiness of cloud servers. Theoretical analysis confirms the security and correctness of the proposed encryption model, with acceptable computation costs adjustable based on security needs.
We also focused on Conference papers that explore the combination of cryptography and steganography, covering various applications and technological domains. The work at [ 23 ], presents a novel framework that combines a hybrid encryption scheme using chaotic maps and 2D Discrete Wavelet Transform (DWT) Steganography to enhance security by maintaining patient privacy. Additionally, a web-based monitoring platform is deployed for tracking electronic medical records during transmission. Experimental results show that the proposed framework outperforms existing approaches in terms of accuracy, sensitivity, and perceptibility, with high imperceptibility and limited degradation in the stego image.
Along similar lines, the authors [ 29 ] present the aim of their study to protect the privacy and confidentiality of data during multimedia exchanges between two IoT hops in uncertain environments. To achieve this, a robust multilevel security approach based on information hiding and cryptography is proposed to deter attackers and ensure data confidentiality. Existing schemes often struggle to strike a balance between medical image quality and security, and directly embedding secret data into images followed by encryption can make it easy for intruders to detect and extract hidden information. This study yields superior results in terms of imperceptibility and security by employing the right method in the right context.
Also, another application aspect Reversible data hiding (RDH) ensures secure digital data transmission, especially vital in telemedicine where medical images and electronic patient records (EPR) are exchanged. This study [ 47 ] proposes a novel RDH scheme that embeds EPR data during image encryption. Using a block-wise encryption technique, the scheme hides EPR data bits within the encrypted image. A support vector machine (SVM)-based classification scheme is employed for data extraction and image recovery. Experimental results show superior performance compared to existing schemes in terms of embedding rate and bit error rate.
Further, Network security is crucial in safeguarding against malicious attacks, especially with the rapid growth of e-commerce worldwide. This study [ 52 ] proposes a novel approach to enhance online shopping security by minimizing the sharing of sensitive customer data during fund transfers. Combining text-based steganography, visual cryptography, and OTP (One Time Password), the proposed payment system ensures customer data privacy, prevents identity theft, and increases customer confidence. By utilizing steganography and visual cryptography, the method minimizes information sharing between consumers and online merchants, thereby enhancing data security and preventing misuse of information.
Moving forward, another interesting study [ 62 ] focuses on e-commerce platform transactions. This study proposes a two-layered security mechanism for e-transactions using dynamic QR codes. The first layer involves encapsulating payment information within a dynamic QR code, unique to each order, which includes bank details, user information, and order specifics. The second layer employs encryption through Secure Electronic Transactions (SET) to further secure the payment process. This dual-layer approach enhances security by introducing dynamic QR codes, reducing vulnerability to cyber-attacks and ensuring secure transmission of payment data. On the other hand, the authors [ 3 ] proposed a lightweight dynamic encryption algorithm using DNA-based stream ciphers. This algorithm generates a one-time dynamic key (DLFSR) based on collected data, encoding both the text and key into a dynamic DNA format. The ciphertext is then produced through an addition process using a proposed table, with decryption information hidden within for key distribution. Statistical tests and performance evaluations demonstrate the algorithm's effectiveness in providing security for restricted devices, outperforming previous approaches.
To safeguard IPv6 packet identities against Denial-of-Service (DoS) attacks, this paper [ 6 ] proposes a combination of cryptography and steganography methods. Ensuring secure communication in IPv6 network applications is crucial due to prevalent issues like DoS attacks and IP spoofing. The proposed approach involves generating unique identities for each node, encrypting them, and embedding them into transmitted packets. Upon reception, packets are verified to authenticate the source before processing. The paper conducts nine experiments to evaluate the proposed scheme, which includes creating IPv6 addresses, applying logistics mapping, RSA encryption, and SHA2 authentication. Network performance is assessed using OPNET modular, demonstrating improved computation power consumption and better overall results, including memory usage, packet loss, and traffic throughput. In a similar line, the paper [ 11 ] suggests a hybrid security method using hashing, encryption, and image steganography to better protect user credentials in databases. The aim is to help developers integrate strong password security practices into their software development process to prevent data breaches. Experimental results show the effectiveness of this approach.
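For the SHA-2-based source-authentication step described for the IPv6 scheme, a keyed hash is the natural building block. The sketch below uses HMAC-SHA-256 from the Python standard library to tag and verify a node identity; it is a simplified stand-in under our own assumptions (a pre-shared secret between nodes), not the exact construction in [ 6 ].

```python
import hashlib
import hmac

def tag_identity(node_id: bytes, shared_secret: bytes) -> bytes:
    """Derive an authentication tag for a node identity with HMAC-SHA-256;
    the sender would embed this tag (after encryption) into outgoing packets."""
    return hmac.new(shared_secret, node_id, hashlib.sha256).digest()

def verify_identity(node_id: bytes, shared_secret: bytes, tag: bytes) -> bool:
    """Receiver side: recompute the tag and compare in constant time."""
    return hmac.compare_digest(tag_identity(node_id, shared_secret), tag)
```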
Security is crucial across various applications, including cloud storage and messaging. While AES, DES, and RSA are common encryption methods, relying solely on one can lead to vulnerabilities if the encryption key is compromised. To address this, hybrid cryptography is employed in this research [ 12 ], combining existing techniques with three new methods. Data is divided into three sections and encrypted with AES, DES, and RSA respectively. Encryption keys are stored using LSB steganography in an image, ensuring additional security. Users must retrieve the keys from the image to access and decrypt the data stored in the cloud, enhancing overall security. Further, Castillo et al. [ 17 ] present a new mobile app that secures images using AES encryption and LSB steganography. It employs a 256-bit AES key for robust protection and utilizes the Diffie-Hellman algorithm for secure key exchange. The app development follows the Rapid Application Development Model, ensuring iterative refinement and early testing. Evaluation based on ISO/IEC/IEEE 29119 Testing Standards indicates user satisfaction with an overall mean rating of 4.17.
As mentioned above, one of the interesting areas is Cloud Computing (CC) which has emerged as a popular model for delivering services over the Internet, with Software as a Service (SaaS) being a prominent example. Despite its benefits, security remains a concern. This paper [ 24 ] presents an application model for securing SaaS applications hosted on private clouds. The model consists of two micro-services: an Application Layer Firewall to prevent malicious activity, and a secure login application for sensitive data transmission. Additionally, a Hidden Markov Model layer is implemented for intrusion detection. The second micro-service uses Advanced Encryption Standard (AES) for document encryption within the private cloud. Further security is provided through a novel Video Steganography approach using the Least Significant Bit (LSB) technique. Overall, the paper outlines a comprehensive approach to enhance security in SaaS applications.
Further, considering confidentiality and integrity important aspects for sharing confidential information while communication, the research [ 38 ] introduces “Stag-chain”, a blockchain-based design combining steganography, AES encryption, and InterPlanetary File System (IPFS) Protocol for decentralized cloud storage. The image file is stored on the cloud temporarily, replaced by a normal image afterward. This scheme aims to develop an app ensuring data confidentiality, secure data transmission, and protection against unauthorized access. Furthermore, Madavi et al. [ 43 ] introduce a compact steganography technique for robust data hiding while maintaining perfect invisibility. It combines DES, AES, and RC4 encryption methods for enhanced security. The study aims to achieve data security using steganography with the Least Significant Bit (LSB) Algorithm and Hybrid Encryption, encrypting user input and concealing it within image files for maximum security during message transmission.
Additionally, the authors [ 51 ] introduce a highly secure web-based authentication system utilizing Image Steganography and the 128-bit Advanced Encryption Standard (AES) algorithm. This system encrypts user passwords using AES. Also, face identification photographs are used as stegoimages to conceal the encrypted passwords, further enhancing security. The proposed work demonstrated resilience against advanced steganalysis attacks, including the chi-squared attack and neighborhood histogram. The authors recommended this secure authentication method for future web applications dealing with sensitive user data.
In [ 65 ], the authors investigate using audio signal processing for cybersecurity in voice applications. As voice interfaces become ubiquitous in devices, the research focuses on securely identifying and authenticating users through cryptography and audio steganography, ensuring both security and usability. Also, the paper [ 66 ] introduces security strategies aimed at enhancing data protection in the cloud, addressing concerns such as confidentiality, accessibility, and integrity. By leveraging steganography, encryption-decryption techniques, compression, and file splitting, our proposed approach aims to overcome the limitations of traditional data protection methods, providing clients with an effective and secure means to store and share information.
Further, the transmission of satellite images via the Internet has gained considerable attention, especially with the rise of cloud and web-based satellite information services. Ensuring secure and high-quality data transfer to users has become a priority. To address this in the research [ 70 ], a combination of steganography and cryptography techniques is employed. Steganography hides data within images, audio, or video, while cryptography ensures data remains unintelligible to cyber attackers. This fusion approach offers a unique method for information protection. The paper proposes combining steganography algorithms such as Least Significant Bit (LSB) and Most Significant Bit (MSB) with cryptography algorithms like Rivest-Shamir-Adleman (RSA) for enhanced security.
This is another interesting research under technology development [ 73 ]. The rise of multimedia applications has fueled the use of digital archives, with cloud storage being a common choice for storing, transmitting, and sharing multimedia content. However, the reliance on cloud services poses security risks, compromising data privacy. To mitigate these risks, data access is restricted to authenticated users, and data is encrypted before storage in the cloud. Cipher Text-Policy Attribute-Based Encryption (CP-ABE) is used to encrypt data and control access, but traditional CP-ABE requires substantial computing resources. To address this, an efficient pairing-free CP-ABE scheme using elliptic curve cryptography is proposed, reducing memory and resource requirements. However, even with CP-ABE, plaintext retrieval is easier with cryptanalysis. To enhance data security and ownership, cryptography is combined with steganography, embedding ciphertext into images to thwart cryptanalysis and improve data security and privacy, particularly for multimedia applications.
Further, Modern healthcare relies on secure medical imaging systems for accurate diagnosis. This paper [ 79 ] proposes a method to protect the JPEG compression processor used in these systems from threats like counterfeiting and Trojan insertion. By integrating robust structural obfuscation and hardware steganography, the approach ensures double-layered defense with minimal design cost. Also, Online shopping presents risks such as credit card fraud and identity theft. This paper [ 80 ] introduces a novel scheme to detect and prevent phishing sites using extended visual cryptography, steganography, and an Android application. The scheme reduces user interaction by automatically uploading shares and QR code details during authentication, enhancing security by minimizing errors from manual intervention.
Ensuring image security and copyright protection, especially post-COVID-19, is challenging. This paper [ 98 ] introduces SecDH, a medical data hiding scheme designed to address these challenges specifically for COVID-19 images. The scheme begins by normalizing the cover image to enhance resistance against geometric attacks and computes a normalized principal component for embedding. Experimental results show SecDH's imperceptibility and advantages over traditional schemes. Along similar lines, this research [100] introduces a robust technique with a high embedding capacity for color images. By fusing multi-focus images using NSCT and computing hash values for authentication, the technique enhances information richness and security. Embedding the fused image and hash value into the cover media using transformed-domain schemes, along with encryption, ensures higher security. Additionally, a hybrid optimization algorithm computes an optimal factor for improved imperceptibility and robustness. Experimental results demonstrate the technique's effectiveness and resistance to common attacks, achieving a 9.5% increase in robustness and an 8.8% enhancement in quality compared to existing works.
Further, the research [ 99 ] proposes SIELNet, a robust encryption algorithm for color images. Utilizing a novel chaotic map and custom network, SIELNet ensures secure data transmission and storage. Experimental results validate its superior performance, promising enhanced data integrity in Industry 5.0.
Furthermore, the evaluation of these techniques relies on a diverse set of metrics that assess their performance in terms of security, robustness, capacity, perceptual quality, and statistical characteristics. This research background provides an overview of the key evaluation metrics, tools, and attacks used for steganography and cryptography, including their definitions and significance in assessing the effectiveness of covert communication methods. With the help of the following information on evaluation criteria, tools, and attacks, numerous research papers spanning both the cryptography and steganography domains have been analyzed and are presented in Table 10. This provides readers with in-depth information to facilitate a clear understanding of the Results section.
Evaluation criteria
Peak signal to noise ratio (PSNR) [ 1 , 5 , 7 , 92 , 95 , 97 ] PSNR is a widely used metric in image processing that quantifies the quality of reconstructed signals by measuring the ratio of the peak signal power to the noise power. In steganography, PSNR is employed to evaluate the perceptual quality of stego images by comparing them to their original counterparts, with higher PSNR values indicating better image fidelity. The PSNR of an 8-bit grey-level image is defined as \(PSNR=10\,{log}_{10}\left(\frac{{255}^{2}}{MSE}\right)\) dB, where MSE is the mean square error between the original and stego images.
Mean square error (MSE) [ 1 , 22 , 92 , 95 , 97 ] MSE measures the average squared difference between the pixel values of the original and reconstructed signals, providing a quantitative measure of reconstruction accuracy. In steganography, MSE is utilized to assess the distortion introduced by embedding hidden data, with lower MSE values indicating reduced perceptual distortion.
Correlation coefficient (CC) [ 1 , 9 , 22 , 95 ] CC serves as a robust metric commonly applied to evaluate message correlation, particularly within image formats through median filtering. While not extensively employed in steganography, its utility is more pronounced when messages adopt image form. In the realm of image watermarking, CC finds wider usage owing to the prevalent image-based nature of watermarks. Notably, CC does not hinge on error quantification but centers on computing the correlation between the original message image pixels and their counterparts extracted from the message. Consequently, CC values, ranging from −1 to 1, signify correlation strength, with 1 denoting optimal correlation. Its computation can be executed using the following equation: \(CC=\frac{\sum_{i}\left({x}_{i}-\overline{x}\right)\left({y}_{i}-\overline{y}\right)}{\sqrt{\sum_{i}{\left({x}_{i}-\overline{x}\right)}^{2}}\sqrt{\sum_{i}{\left({y}_{i}-\overline{y}\right)}^{2}}}\), where \({x}_{i}\) and \({y}_{i}\) are the original and extracted message pixel values and \(\overline{x}\) and \(\overline{y}\) are their respective means.
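For illustration, the three metrics above can be computed in a few lines of NumPy. The sketch below is not taken from any of the surveyed papers; the array names, the 255 peak value for 8-bit images, and the toy LSB perturbation are assumptions made for the example.

```python
import numpy as np

def mse(cover: np.ndarray, stego: np.ndarray) -> float:
    """Average squared pixel difference between the cover and stego images."""
    return float(np.mean((cover.astype(np.float64) - stego.astype(np.float64)) ** 2))

def psnr(cover: np.ndarray, stego: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB; higher values indicate better fidelity."""
    err = mse(cover, stego)
    return float("inf") if err == 0 else 10.0 * np.log10(peak ** 2 / err)

def cc(original: np.ndarray, extracted: np.ndarray) -> float:
    """Pearson correlation (-1 to 1) between original and extracted message pixels."""
    return float(np.corrcoef(original.ravel(), extracted.ravel())[0, 1])

# Toy example: a stego image that differs from the cover only in some LSBs.
rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
stego = cover ^ rng.integers(0, 2, size=cover.shape, dtype=np.uint8)
print(f"MSE={mse(cover, stego):.3f}, PSNR={psnr(cover, stego):.2f} dB, CC={cc(cover, stego):.4f}")
```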
Capacity [ 5 , 92 , 95 , 97 ] Capacity refers to the maximum amount of hidden information that can be embedded within a cover signal without causing perceptible distortion. In steganography, capacity metrics assess the payload capacity of steganographic algorithms, guiding the selection of embedding techniques to achieve a balance between data hiding capacity and perceptual quality.
Structural similarity index (SSIM) [ 7 , 22 , 33 , 96 ] SSIM is a metric used in image processing to quantify the similarity between two images. It considers luminance, contrast, and structure, mimicking human visual perception, and is widely used in research to evaluate the quality of image compression, denoising, and restoration algorithms.
Human visual system (HVS) metrics [ 7 , 95 ] HVS metrics model the perceptual characteristics of the human visual system to evaluate the visual quality and perceptibility of stego signals. In steganography, HVS metrics such as the Structural Similarity Index (SSIM) and perceptual entropy are utilized to assess the visibility of embedded data and ensure imperceptibility to human observers.
Entropy [ 33 , 88 , 96 ] Entropy measures the randomness or uncertainty of a signal and is used to quantify the information content of cover and stego signals. In steganography, entropy metrics assess the statistical properties of stego signals, with lower entropy values indicating a higher degree of hidden information. The entropy can be calculated for an 8-bit image as follows:
\(H\left(I\right)=-\sum_{i=1}^{{2}^{8}}P\left({I}_{i}\right)\,{log}_{b}\,P\left({I}_{i}\right)\), where \({I}_{i}\) denotes the \(i\)-th intensity value and \(P\left({I}_{i}\right)\) represents the probability of the intensity value \({I}_{i}\).
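As a minimal sketch, the entropy of an 8-bit image can be estimated from its intensity histogram as follows; the use of log base 2 (entropy in bits per pixel) and the synthetic test image are assumptions made for illustration.

```python
import numpy as np

def image_entropy(img: np.ndarray) -> float:
    """Shannon entropy in bits/pixel over the 256 intensity levels."""
    hist = np.bincount(img.ravel(), minlength=256).astype(np.float64)
    p = hist / hist.sum()
    p = p[p > 0]                         # skip empty bins (0 * log 0 is taken as 0)
    return float(-np.sum(p * np.log2(p)))

noise_like = np.random.default_rng(1).integers(0, 256, size=(128, 128), dtype=np.uint8)
print(image_entropy(noise_like))         # close to 8 bits/pixel for near-uniform data
```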
Histogram analysis [ 9 , 45 , 92 , 95 , 97 ] Histogram analysis examines the distribution of pixel intensities in cover and stego signals to detect statistical anomalies introduced by steganographic embedding. In steganalysis, histogram-based metrics evaluate the statistical differences between cover and stego signals, facilitating the detection of hidden information.
Bit error ratio (BER) [ 9 , 13 , 22 , 95 ] BER quantifies the ratio of incorrectly received bits to the total number of transmitted bits and is used to measure the accuracy of data transmission in digital communication systems. In steganography, BER is employed to evaluate the accuracy of data extraction from stego signals, with lower BER values indicating a higher level of data integrity.
Bits per pixel (BPP) [ 41 , 61 , 85 , 95 ] BPP measures the average number of embedded bits per pixel in stego images and is used to quantify the embedding efficiency of steganographic algorithms [ 96 ]. In steganography, BPP metrics assess the trade-off between embedding capacity and visual quality, guiding the selection of embedding parameters.
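The two ratios above reduce to simple bookkeeping; the helper functions below are an illustrative sketch, and the payload size and image shape used in the example are assumptions.

```python
import numpy as np

def bit_error_ratio(embedded_bits: np.ndarray, extracted_bits: np.ndarray) -> float:
    """Fraction of extracted bits that differ from the bits originally embedded."""
    return float(np.mean(embedded_bits != extracted_bits))

def bits_per_pixel(payload_bits: int, image_shape: tuple) -> float:
    """Average number of hidden bits carried by each cover pixel."""
    return payload_bits / float(image_shape[0] * image_shape[1])

print(bits_per_pixel(payload_bits=4096, image_shape=(64, 64)))   # 1.0 bpp
```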
Signal-to-noise ratio (SNR) [ 14 , 95 ] SNR measures the ratio of signal power to noise power and is used to quantify the quality of transmitted signals in communication systems. In steganography, SNR metrics evaluate the robustness of steganographic algorithms to noise interference, with higher SNR values indicating better signal quality.
Amplitude difference (AD) [ 14 ] AD measures the difference in amplitude or magnitude between the original plaintext and the corresponding ciphertext resulting from the encryption process. It quantifies the level of distortion introduced during encryption, with lower AD values indicating minimal alteration in amplitude between the plaintext and ciphertext. The assessment of AD aids in evaluating the perceptual quality and robustness of cryptographic algorithms, ensuring that encrypted data retains fidelity and is resistant to unauthorized tampering.
Avalanche effect (AE) [ 14 ] AE characterizes the sensitivity of a cryptographic algorithm to small changes in the input, resulting in significant changes in the output ciphertext. A robust cryptographic algorithm exhibits a pronounced avalanche effect, where even minor modifications in the input plaintext lead to extensive changes in the resulting ciphertext. AE plays a pivotal role in assessing the security and strength of encryption algorithms, as it indicates the extent to which encrypted data conceals underlying patterns and resists cryptanalysis attempts aimed at deciphering the original plaintext.
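The avalanche effect can be estimated empirically by flipping a single input bit and counting how many output bits change. The sketch below uses SHA-256 from Python's standard library purely as a stand-in primitive (an assumption; none of the surveyed schemes is implied).

```python
import hashlib

def bit_diff(a: bytes, b: bytes) -> int:
    """Number of differing bits between two equal-length byte strings."""
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

msg = bytearray(b"avalanche effect test vector")
flipped = bytearray(msg)
flipped[0] ^= 0x01                          # flip one input bit

d1 = hashlib.sha256(bytes(msg)).digest()
d2 = hashlib.sha256(bytes(flipped)).digest()
print(f"{bit_diff(d1, d2) / (len(d1) * 8):.2%} of output bits changed")  # ~50% is ideal
```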
Bits per code (BPC) [ 14 ] BPC refers to the average number of bits used to represent each symbol or code in a given data encoding scheme or communication system. It quantifies the efficiency of data representation and transmission by measuring the ratio of the total number of bits to the total number of codes or symbols transmitted. In data encoding and compression techniques, a lower BPC indicates higher efficiency in representing data using fewer bits, while ensuring minimal information loss or distortion.
Throughput [ 14 ]: Throughput represents the rate at which data is successfully transmitted or processed over a communication channel or system within a specific time. It measures the amount of data transferred per unit time and is typically expressed in bits per second (bps) or a similar unit of data transmission rate. Throughput is influenced by factors such as channel bandwidth, data encoding efficiency, error correction mechanisms, and system latency. Higher throughput values indicate greater data transmission capacity and efficiency, enabling faster and more reliable communication.
Uncorrectable error rate (UER) [ 14 ] UER is a metric used in error detection and correction systems to quantify the frequency or probability of errors that cannot be successfully detected or corrected by error correction mechanisms. It represents the rate of errors that remain undetected or uncorrected despite the implementation of error detection and correction techniques. A low Uncorrectable Error Rate is desirable in communication systems, indicating a high level of reliability and effectiveness in error detection and correction processes.
Cronbach’s alpha (CA) [ 19 ] Cronbach's alpha is a measure of internal consistency and reliability of steganographic or cryptographic algorithms. It ensures that they consistently perform as intended across different datasets or scenarios.
Composite reliability (CR) [ 19 ] Composite reliability is another measure of internal consistency reliability, similar to Cronbach's alpha. It evaluates the reliability of a set of items in measuring a latent construct, taking into account the factor loadings of the items.
Average variance extracted (AVE) [ 19 ] AVE is a measure of convergent validity in structural equation modeling (SEM). It assesses the amount of variance captured by a latent construct in relation to the variance due to measurement error.
Structural equation modeling (SEM) [ 19 ] SEM is a statistical method used to test and validate theoretical models that specify relationships among observed and latent variables. It allows researchers to assess the structural relationships between variables and evaluate the goodness-of-fit of the proposed model.
Normalized chi-square (Normalized χ2) [ 19 ] Normalized chi-square is a goodness-of-fit measure used in SEM, indicating the discrepancy between the observed and expected covariance matrices relative to the degrees of freedom.
Goodness-of-fit index (GFI) [ 19 ] GFI is a measure of the overall fit of the structural equation model to the observed data. It assesses the extent to which the model reproduces the observed covariance matrix.
Root mean square error (RMSE) [ 19 ] RMSE is a measure of discrepancy between observed and predicted values in SEM. It quantifies the average difference between observed and model-estimated covariance matrices, with lower RMSE values indicating better model fit.
Normed fit index (NFI) [ 19 ] NFI is a goodness-of-fit index in SEM that evaluates the relative improvement in the fit of the proposed model compared to a null model. Higher NFI values indicate a better fit.
Tucker-Lewis index (TLI) [ 19 ] TLI, also known as the Non-Normed Fit Index (NNFI), is a measure of incremental fit in SEM. It compares the proposed model to a baseline model with uncorrelated variables, with TLI values close to 1 indicating a good fit.
Comparative fit index (CFI) [ 19 ] CFI is another measure of incremental fit in SEM, assessing the improvement in the fit of the proposed model relative to a null model. CFI values close to 1 indicate a good fit.
Normalized cross-correlation coefficient (NCCC) [ 33 ] NCCC is employed to measure the similarity between the cover and stego-images. A high NCCC value close to 1 signifies that the steganographic process has been performed effectively, resulting in minimal detectable differences between the original cover image and the stego-image, thereby ensuring the concealment of hidden information within the cover image. It can be evaluated as \({\gamma }_{p,q}=\frac{cov\left(p,q\right)}{\sqrt{D\left(p\right)}\sqrt{D\left(q\right)}}\), where p and q represent two variables that can denote either the secret and decrypted images in the cryptography process or the cover and stego-images in the steganography process. The correlation coefficient is \(\gamma\), and \(cov\left(p,q\right)\), \(D\left(p\right)\), and \(D\left(q\right)\) correspond to the covariance and variances [ 33 ] of the variables p and q.
Number of pixel change rates (NPCR) [ 33 ] The NPCR metric is utilized during the encryption stage to evaluate the disparity between cipher images before and after a single-pixel alteration in a plaintext image. Let P represent the total number of pixels, and let C1 and C2 denote the cipher images before and after the pixel change, respectively. Additionally, D is a bipolar array defined such that \(D\left(i,j\right)=0\) if \(C1\left(i,j\right)=C2\left(i,j\right)\) and \(D\left(i,j\right)=1\) otherwise. The NPCR determines the percentage of differing pixel values between the original and encrypted images. This metric gauges the resilience of the encryption method against potential intrusions and attacks, with higher NPCR values indicating a stronger strategy. \(N\left(C1,C2\right)=\sum_{i,j}\frac{D\left(i,j\right)}{P}\times 100\%\).
Unified average changing intensity (UACI) [ 33 ] UACI calculates the mean intensity of the differences between two cipher images using the following formula: \(UACI=\frac{1}{P}\sum_{p,q}\frac{\left|{I}_{1}\left(p,q\right)-{I}_{2}\left(p,q\right)\right|}{255}\times 100\%\), where \({I}_{1}\) and \({I}_{2}\) represent the two encrypted images derived from the original image by altering a single pixel, \(P\) is the total number of pixels, and \(p\) and \(q\) denote the coordinates of the pixel under consideration.
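A compact NumPy sketch of both differential metrics is given below; it assumes the inputs are 8-bit cipher images of equal size, and the random test images are placeholders rather than outputs of any cipher from the survey.

```python
import numpy as np

def npcr(c1: np.ndarray, c2: np.ndarray) -> float:
    """Percentage of pixel positions whose cipher values differ."""
    return float(np.mean(c1 != c2) * 100.0)

def uaci(c1: np.ndarray, c2: np.ndarray) -> float:
    """Mean absolute intensity difference, normalised by 255, in percent."""
    diff = np.abs(c1.astype(np.float64) - c2.astype(np.float64))
    return float(np.mean(diff / 255.0) * 100.0)

rng = np.random.default_rng(2)
c1 = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
c2 = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
print(npcr(c1, c2), uaci(c1, c2))          # ideal values are roughly 99.6 and 33.5
```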
Percentage residual difference (PRD) [ 34 ] This metric assesses the variance between the original ECG host signal and the resulting watermarked ECG signal, calculated as \(PRD=\sqrt{\frac{\sum_{i=1}^{N}{\left({x}_{i}-{y}_{i}\right)}^{2}}{\sum_{i=1}^{N}{x}_{i}^{2}}}\), where \(x\) is the original ECG signal and \(y\) is the watermarked signal.
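The PRD reduces to one line of NumPy; in the sketch below the sinusoidal host signal and the noise-based distortion are assumptions used only to exercise the formula.

```python
import numpy as np

def prd(x: np.ndarray, y: np.ndarray) -> float:
    """Percentage residual difference between the host and watermarked signals."""
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    return float(np.sqrt(np.sum((x - y) ** 2) / np.sum(x ** 2)))

t = np.linspace(0.0, 1.0, 500)
host = np.sin(2 * np.pi * 5 * t)                               # toy ECG-like host signal
watermarked = host + 0.01 * np.random.default_rng(3).standard_normal(t.size)
print(prd(host, watermarked))                                   # small values mean low distortion
```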
Weighted wavelet percentage residual difference (WWPRD) [ 34 ] This metric is used particularly in the context of watermarking techniques. It is employed to evaluate the effectiveness of image watermarking algorithms by quantifying the perceptual differences between the original image and the watermarked version. In WWPRD, the residual difference between the original image and the watermarked image is calculated in the wavelet domain. By analyzing the WWPRD values, researchers can assess the trade-off between watermark invisibility (how imperceptible the watermark is to human observers) and robustness (how resistant the watermark is to various image processing operations and attacks).
Steganography and steganalysis tools used
Stegdetect [ 10 ] This tool is designed to detect and analyze hidden information within digital media, providing users with powerful steganalysis capabilities. It employs advanced algorithms and techniques to identify subtle modifications or anomalies in digital files that may indicate the presence of hidden information. Stegdetect [ 10 ] is widely used by digital forensics experts, law enforcement agencies, and cybersecurity professionals to uncover hidden threats and investigate potential security breaches.
Steganalysis attacks [ 61 , 75 , 86 ]
Salt and pepper noise Salt and Pepper Noise, also known as impulse noise, introduces sporadic white and black pixels in an image, resembling grains of salt and pepper scattered throughout the image. This type of noise typically occurs due to errors in data transmission or faults in image acquisition devices.
Additive white gaussian noise (AWGN) AWGN is a type of noise that follows a Gaussian distribution and is characterized by its constant power spectral density across all frequencies. It represents random variations in pixel values added to the original image, often resulting from electronic interference or sensor noise in imaging devices.
Median filtering Median filtering is a spatial domain filtering technique commonly used to remove impulsive noise such as Salt and Pepper Noise. It replaces each pixel value with the median value of its neighboring pixels within a defined window, effectively reducing the impact of outliers caused by noise.
Lowpass filtering Lowpass filtering is a technique used to suppress high-frequency components in an image while preserving low-frequency information. It is commonly employed to mitigate noise by smoothing the image, thereby reducing the effect of high-frequency noise components such as AWGN.
Wiener filtering Wiener filtering is a signal processing technique used to deconvolve images corrupted by additive noise, such as AWGN. It employs a frequency domain approach to estimate and suppress the noise while enhancing the signal-to-noise ratio in the restored image.
Sharpening Sharpening techniques aim to enhance the perceived sharpness and clarity of an image by accentuating edges and details. However, when applied to noisy images, sharpening can exacerbate the visibility of noise, making it a potential target for attacks aimed at degrading image quality.
Histogram equalization attack Histogram equalization is a technique used to adjust the contrast of an image by redistributing pixel values across a wider dynamic range. However, adversaries can exploit this technique to amplify the visibility of noise, especially in regions with low contrast, thereby degrading the overall quality of the image.
Rotation attack Rotation attacks involve rotating an image by a certain angle, which can introduce geometric distortions and potentially exacerbate the visibility of noise. Adversaries may employ rotation attacks to degrade the quality of images, particularly those affected by noise, as part of malicious activities or security breaches.
Pitch removal attacks These involve the removal or alteration of specific pitch frequencies in audio signals. These attacks are often used in scenarios where certain frequency components need to be suppressed or modified, such as in audio watermarking or enhancement techniques.
Bit-plane removal attacks This type of attack targets the bit-plane decomposition of images. In digital image processing, images are often represented using a bit-plane decomposition, where each bit-plane represents a different level of image detail or intensity. Bit-plane removal attacks aim to remove or modify specific bit-planes, thereby altering the visual appearance or content of the image.
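For completeness, the image-domain attacks listed above can be reproduced with standard NumPy/SciPy routines when benchmarking robustness. The sketch below is illustrative only; the noise density, filter sizes, rotation angle, and the random placeholder stego image are all assumptions.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(4)
stego = rng.integers(0, 256, size=(128, 128)).astype(np.float64)    # placeholder stego image

# Salt-and-pepper noise: force a random 2% of pixels to 0 or 255.
noisy = stego.copy()
mask = rng.random(stego.shape) < 0.02
noisy[mask] = rng.choice([0.0, 255.0], size=int(mask.sum()))

# Additive white Gaussian noise with standard deviation 5.
awgn = np.clip(stego + rng.normal(0.0, 5.0, stego.shape), 0, 255)

# Median and lowpass (Gaussian) filtering attacks.
median_attacked = ndimage.median_filter(stego, size=3)
lowpass_attacked = ndimage.gaussian_filter(stego, sigma=1.0)

# Rotation attack by 5 degrees, introducing geometric distortion.
rotated = ndimage.rotate(stego, angle=5.0, reshape=False, mode="nearest")
```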
Chi-Square attack [ 89 ] This is a prominent technique used to detect the presence of hidden information within digital media, particularly images. This attack leverages statistical analysis to uncover inconsistencies or anomalies in the distribution of pixel values within an image. The rationale behind the Chi-Square Attack lies in the fact that steganographic embedding typically introduces subtle changes to the statistical properties of an image, such as the distribution of pixel values. These changes, while imperceptible to the human eye, can be detected through statistical analysis methods like the chi-square test.
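A minimal version of the histogram-based chi-square test, in the spirit of the Westfeld-Pfitzmann attack on LSB replacement, is sketched below. Applying the test to the whole image at once and using a deliberately skewed synthetic cover (even intensities only, so the contrast is visible) are simplifying assumptions.

```python
import numpy as np
from scipy.stats import chi2

def chi_square_embedding_probability(img: np.ndarray) -> float:
    """Estimated probability that LSB replacement has equalised the pairs of values."""
    hist = np.bincount(img.ravel(), minlength=256).astype(np.float64)
    observed = hist[0::2]                          # even bins h[2k]
    expected = (hist[0::2] + hist[1::2]) / 2.0     # mean of each pair of values
    keep = expected > 0
    stat = np.sum((observed[keep] - expected[keep]) ** 2 / expected[keep])
    dof = int(keep.sum()) - 1
    # Full embedding equalises the pairs, so a small statistic (value near 1) suggests a payload.
    return float(1.0 - chi2.cdf(stat, dof))

rng = np.random.default_rng(5)
cover = (rng.integers(0, 128, size=(128, 128)) * 2).astype(np.uint8)   # even intensities only
stego = (cover & 0xFE) | rng.integers(0, 2, size=cover.shape, dtype=np.uint8)
print(chi_square_embedding_probability(cover), chi_square_embedding_probability(stego))
```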
Regular singular (RS) analysis [ 75 ] RS analysis involves analyzing the regular and singular components of an image to identify irregularities or inconsistencies introduced by steganographic embedding techniques. This analysis leverages mathematical properties to distinguish between the regular content of an image and any additional hidden data.
Binary similarity measures (BSM) analysis [ 75 ] BSM are statistical measures used to assess the similarity between the binary representations of two images. In steganalysis, these measures are employed to compare the binary data of an original image with that of a potentially steganographic image. Deviations or discrepancies in binary similarity may indicate the presence of hidden data.
The next section discusses the methodology employed by this research.
3 Methodology
In this section, we outline the reproducible search strategy employed for conducting a comprehensive literature survey. Initially, data collection was performed using the selected databases, namely Scopus, IEEE Digital Library, and ISI Web of Science, with search queries formulated as detailed in Sect. 3.1 . Subsequently, the study selection process was executed, as elucidated in Sect. 3.2 . Finally, the data were extracted from the literature, as described in Sect. 3.3 . The Parsifal tool was employed to streamline the review process, including the tasks of reviewing, screening, and extracting the relevant literature.
3.1 Data gathering (DG)
The initial step in the literature exploration process involves data gathering. Two distinct literature searches were conducted: one encompassing journal articles and a supplementary search focused on conference papers. These papers are also discussed in Sect. 2 . The results of the additional literature search contribute primarily to gaining further insights related to RQ1. To explore the selected databases effectively, essential keywords and criteria were identified. While both literature searches share common keywords, their criteria, such as publication year and language, were slightly adjusted to ensure a manageable scope. These criteria were refined through an iterative process that involved fine-tuning the keywords and assessing the quantity of relevant literature available on Scopus. The final keywords used for the search query can be expressed as follows:
Search Query: ("cryptography" AND "steganography") AND ("application" OR "real-world") AND ("security" OR "cyberattack" OR "cybersecurity").
Upon utilizing the specified keywords, the three databases collectively yielded a total of 749 results as of May 24th, 2023. Subsequently, inclusion criteria, encompassing year, language, and type, were applied to filter the obtained results. The application of these criteria is detailed in the following two sub-sections.
(a) DG-Literature Search 1: Journal Articles. A comprehensive literature search was conducted specifically for journal articles, with the databases accessed on May 24th, 2023. The criteria applied to this search are as follows:
Only literature published from 2010 onwards was included.
The literature must be classified as a journal article, excluding review papers, conference papers, books, and other sources.
Publications from any region are considered, but they must be in English.
The search encompassed the examination of titles, abstracts, and keywords. These criteria collectively establish the following additional query options:
year >= 2010
AND language == English
AND type == Journal Article
These search criteria, along with the keywords from Sect. 3.1, resulted in a total of 217 journal articles:
Scopus: 179
Web of Science: 31
After removing duplicates using the Parsifal tool, 194 journal articles were left for further analysis.
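As an illustration of how these criteria translate into a concrete database query, the snippet below assembles a Scopus-style advanced search string. The field syntax (TITLE-ABS-KEY, PUBYEAR, DOCTYPE, LANGUAGE) follows common Scopus conventions and is an assumption of this sketch, not a query reproduced from the chapter.

```python
keywords = (
    '("cryptography" AND "steganography") AND '
    '("application" OR "real-world") AND '
    '("security" OR "cyberattack" OR "cybersecurity")'
)

journal_query = (
    f"TITLE-ABS-KEY({keywords}) "
    "AND PUBYEAR > 2009 "          # literature from 2010 onwards
    "AND DOCTYPE(ar) "             # journal articles only
    "AND LANGUAGE(english)"
)
print(journal_query)
```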
(b) DG-Literature Search 2: Conference Papers. Furthermore, a supplementary literature search focusing on conference papers was conducted, with the databases accessed on June 23rd, 2023. The search criteria and query vary slightly from the previous literature search, as outlined below:
Only conference papers from conference proceedings published from 2018 onwards were considered.
Review papers, journal articles, books, and other sources were excluded.
Similar to the previous search, publications from any region were eligible, provided they were in English.
These criteria lead to the following query options:
year >= 2018
AND language == English
AND type == Conference Paper
AND source type == Conference Proceedings
These search criteria, along with the keywords from Sect. 3.1, resulted in a total of 147 conference papers:
Web of Science: 11
After removing duplicates using the Parsifal tool, 113 conference papers were left for further analysis.
3.2 Study selection (SS)
The subsequent stage of the literature exploration process involves the selection of pertinent studies, which comprises two distinct phases. The following seven conditions were established to ensure that only literature addressing the research questions outlined in Sect. 1 is considered, while filtering out literature of insufficient quality. It is important to note that the two literature searches applied these conditions differently.
The paper focuses on researching the combination of cryptography and steganography disciplines.
The paper investigates the application of cryptography and steganography within specific domains (e.g., medical, military, financial) or contexts rather than a general application for "secure communications."
The paper addresses efforts to enhance the security of a system or process rather than solely transmitting additional data.
Is the objective of the paper clearly defined?
Have related works been adequately studied?
Is the methodology employed in the paper clearly described?
Are the results presented clearly and measurably?
(a) SS-Literature Search 1: Journal Articles. In the first literature search, focused on journal articles, papers were assessed for relevance based on conditions 1–3 (Sect. 3.2 ), considering the information presented in the title and abstract. Subsequently, papers were further scrutinized to determine whether they met conditions 4–7 (Sect. 3.2 ) by examining their contents. Only papers that fulfilled all seven conditions were included in the selection process. As a result of this rigorous selection process, the initial set of results was reduced to 24 journal articles. The flow chart depicted in Fig. 2 a illustrates the sequential steps involved in data gathering and study selection. Papers that discussed no specific application, such as "secure communications," were not categorized as such, since a significant number of such papers had already been omitted during the search query phase and including the remainder would have resulted in an incomplete compilation of relevant articles.
(b) SS-Literature Search 2: Conference Papers. In the second literature search, focusing on conference papers, the selection process entailed examining papers for conditions 1–3 (Sect. 3.2 ) based on the information presented in the title and abstract. These conditions were crucial in determining whether a paper should be considered for RQ1. As a result of this selection process, 21 conference papers met the criteria. It is worth noting that two papers were identified as having been released before 2018 and were subsequently manually filtered out. The flow chart illustrated in Fig. 2 b provides a visual representation of the data-gathering process and study selection for this search.
3.3 Data extraction (DE)
The third step of exploring literature is extracting data. Data extraction consists of two parts, both performed using Parsifal. To answer RQ1 features related to a paper’s application have been extracted (both literature searches). The list of features evolved during the process of extraction as it was expanded, restructured, and finalized (Sect. 3.1 ) to encompass all encountered literature. Next, to answer RQ2 and RQ3, information related to the algorithms and metrics, advantages, limitations, and evaluation methods discussed by the literature were extracted (only literature search 1: journal articles). The results of data gathering, study selection, and data extraction are presented in the subsequent sections.
4 Results
In this section, we present the comprehensive findings derived from the systematic review, addressing the research questions outlined in Sect. 1 . To facilitate a better understanding of the findings, figures and tables are provided. The subsequent sections are organized in alignment with the order of the research questions. Section 4.1 delves into the encountered types of applications and explores potential categorization approaches. Additionally, Sect. 4.2 discusses the applications, their limitations, and advantages identified during the review process. Lastly, Sect. 4.3 focuses on the analysis methods employed in the literature. By following this structured arrangement, we aim to provide a clear and cohesive presentation of the research findings, offering valuable insights into the combined application of steganography and cryptography in various domains and contexts.
4.1 RQ1: exploring applications
For each study, relevant characteristics pertaining to the context in which the combined application of steganography and cryptography is explored were extracted. The analysis of the literature emphasizes the significance of categorizing the application of each article in two distinct ways:
The application domain : This refers to the specific industry sector or domain in which an application operates. The encountered application domains include financial, government, medical, and transportation.
The technological domain/technology [72]: This aspect involves identifying one or more technological topics associated with an application. Technologies are considered tools that can be employed across various domains to solve diverse problems or perform various tasks. The encountered technologies include Big Data, Blockchain, Cyber-Physical Systems (CPS), Cloud Computing (Cloud), Edge Computing (Edge), Fog Computing (Fog), Internet of Things (IoT), IPv6, Machine Learning (ML), Mobile Computing (Mobile), Personal Computing (Personal), Satellite Imaging (Satellite), Unmanned Aerial Vehicles (UAVs), and Voice Operated Systems (Voice).
By employing these two distinct categorizations, namely Application Domain and Technological Domain , it becomes possible to identify specific commonalities and differences within the applications. This facilitates informed research and the development of tailored solutions for specific application domains or technologies. Notably, this categorization approach differs from how other reviews, as exemplified by [ 45 ], typically categorize applications. While some studies may focus on applications specific to a particular application domain, such as the medical domain, other articles ([ 1 , 5 , 7 , 9 , 10 , 14 , 19 , 33 , 36 , 85 , 86 , 90 ]) may exclusively concentrate on applications within a technological domain. A technological domain can be applicable across numerous application domains. Given these considerations, categorization by application domain is given precedence, and cases where the application or technological domain could not be determined have been excluded from categorization. Furthermore, irrespective of the application or technological domain, the specific focus or functionality of each application is also determined.
Functionality: This refers to the specific features, tasks, or roles performed by an application within its domain. It is important to note that security is considered a common role across the explored literature and is, therefore, not specified as functionality. Examples of functionalities include Smart Monitoring, Anonymization, Healthcare Data Transmission, Vehicle Diagnostics, Malware Detection, and Industry 4.0/5.0 Implementation.
The subsequent sections present the results obtained from both literature searches, providing further insights into the combined application of steganography and cryptography.
4.1.1 Journal articles
The findings from literature search 1, pertaining to RQ1, are presented in two tables. Table 1 provides an overview of articles and their corresponding application domains, while Table 2 focuses on the technologies employed, reflecting the split categorization approach. The core functionality of each paper studied for this review is explicitly mentioned in both tables. In cases where certain studies solely concentrate on a technological domain, potential application domains have been specified in italics (please refer to Table 2 ). These application domains are either suggested by the authors themselves or inferred based on similar literature. It is worth noting that technology often has applicability across a broader range of application domains. In such instances, the application domain is identified as ‘Cross-Domain.’ As showcased in Table 1 and Table 2 , a total of 12 journal articles from each table were analyzed, with each article focusing on distinct application domains and their corresponding technological domains.
Figure 3 a displays journal articles published from 2010 to 2023, categorized by application domains (Medical, Government, and Transportation) or technological domains (N/A). The figure reveals a modest increase in articles exclusively centered on technological domains, surpassing those focused on application domains. Considering the diverse potential of these technologies across various application domains (e.g., IoT [ 63 ]), it is advisable to prioritize innovation in a broader sense. Subsequently, refining these technologies for specific application domains holds the potential for even greater rewards. On the other hand, Fig. 3 b presents conference papers published between 2018 and 2023. In addition to the medical domain, as observed in the Journal articles shown in Fig. 3 a, there is a notable trend toward the financial domain in conference papers in the realm of combining and applying cryptography and steganography. More information on the Conference papers can be found in subsection 4.1.2.
Distribution of literature in application and technological domains (N/A) over time
Figure 4 provides visual representations of the distribution of application domains (Fig. 4 a) and technological domains (Fig. 4 c) based on the data presented in Table 1 . In Fig. 4 a, it is evident that the majority (n = 9, [ 13 , 22 , 34 , 41 , 45 , 56 , 61 , 78 , 88 ]) of the total 12 articles focus on the medical domain, suggesting a relatively narrow focus of research in this area. Furthermore, only a small number of articles concentrate on governmental applications (n = 2, [ 75 , 87 ]) and transportation (n = 1, [ 48 ]). Similarly, the occurrences of technological domains are visualized in Fig. 4 c. Notably, technologies with an occurrence of 1 are grouped under 'Other,' which includes Big Data, Fog Computing, Web Applications, Personal Computing, Edge Computing, and Cyber-Physical Systems. The visualization in Fig. 4 b and d is completed in subsection 4.1.2, where the conference papers are analyzed in depth. After analysis of the journal articles, it becomes evident that only one article from 2018 focuses on an application in the Transportation domain. Furthermore, in both 2021 and 2022, there is a lack of publications in the medical domain, whereas the four preceding years had such publications. Another notable observation is the surge in articles focusing on a specific technology in 2022. However, the applications discussed in these articles ([ 7 , 36 , 86 , 90 ]) seem unrelated, making it challenging to identify any underlying reason behind this trend.
Distributions of domains of journal articles (left) and conference papers (right)
Furthermore, an attempt was made to employ the VOSviewer tool to identify any authorship overlap among the identified journal articles. However, none of the articles displayed any shared authors, indicating a dispersed distribution of researchers working on the topic. This suggests that research on the combined approach of steganography and cryptography is relatively new, aligning with the increasing trend observed in the number of articles over the past 13 years in Fig. 3 a. However, it is important to consider that additional factors may contribute to this observation. A more detailed discussion of journal articles focusing on specific application domains is provided in Sect. 4.2 .
4.1.2 Conference papers
To gain further insights, conference papers were also subjected to analysis. The results of this additional literature search are presented in Tables 3 and 4 , providing additional data for a comprehensive review. Similarly to the journal articles, the publication years of conference papers are depicted in Fig. 3 b. Notably, there has been a relatively consistent number of papers published each year, suggesting either a sustained interest in combining steganography and cryptography or a stabilization of the field following a previous period of change. However, due to time constraints, papers published before 2018 were not explored in this study. Surprisingly, from 2018 to 2023, out of the 21 papers analyzed, only a few (n = 5) focused on specific application domains (as seen in Fig. 4 b). These papers predominantly spanned the medical domain (n = 3, [ 23 , 29 , 47 ]) and a newly emerging financial domain (n = 2, [ 52 , 62 ]). Once again, the medical domain emerged as the most popular area of application. Furthermore, while approximately 50% of the identified literature in journal articles explored applications in specific domains, only 24% of conference papers did the same. This disparity may further emphasize the trend of developing technologies in a more generalized sense rather than focusing exclusively on specific application domains. Similarly, Fig. 4 d showcases the technological domains, revealing the presence of three prominent technologies shared between journal articles and conference papers: Mobile Computing, the Internet of Things, and Cloud Computing, with Cloud Computing being particularly prevalent. It should be noted that making a direct comparison between the two searches is challenging due to the difference in the time covered by the literature.
4.2 RQ2: advantages, limitations, and trade-offs
In this section, we discuss the observations made regarding the algorithms and methodologies employed in the journal articles. Firstly, we present general observations, and subsequently, we delve into the three application domains encountered in the journal articles, namely Government, Medical, and Transportation. The research papers are arranged in ascending order according to these three categories (as listed in Table 1 ). The full data collected for RQ2 can be found in Table 5 . Further, there are other categories identified from Tables 1 and 2 , such as Cross-Domain, Medical-Military, Energy, Finance, and Military; Cross-Domain is also reflected in Table 5 . The research papers themselves are discussed in Sect. 2 .
Government application domain
This category focuses on two articles [ 75 , 87 ] that explore the application of both steganography and cryptography in the government domain, specifically in the areas of surveillance and voting. These articles are listed in Table 6 . Each article presents different approaches with their respective strengths and limitations.
The first article [ 75 ] proposes a two-tiered video surveillance system that offers robustness against cipher-breaking attacks. However, the quality of the recovered data is dependent on the compression rate of the Compressed Sensing (CS) technique used. Additionally, the system could be enhanced to accommodate more than two levels of authorization.
The second article [ 87 ] introduces an online voting system that ensures individual verifiability and security. However, it is susceptible to certain security challenges, such as collusion among polling officers and network eavesdropping. The system provides receipts to voters, but this poses a potential issue if users lose their receipts. Improvements such as exploring alternative algorithms could enhance the system's performance, for example by reducing the size of the receipts.
Overall, these articles highlight different aspects and considerations in the government domain when implementing steganography and cryptography, emphasizing both the strengths and areas for potential improvement in their respective approaches.
Medical application domain
This section focuses on nine articles that explore applications in the medical domain. The articles are listed in Table 7 , along with their respective advantages and limitations.
Among these articles, three papers ([ 56 , 61 , 88 ]) incorporate the use of chaotic algorithms in their encryption methods. For example, [ 56 ] presents a transmission system for generic data that utilizes chaotic encryption based on a 2D-Henon map ([ 84 ]). However, limited practical implementation details are provided, and future work could draw upon [ 53 ] for a more in-depth analysis of the implementation aspects. One drawback is that these three papers lack performance analysis and key measurements such as Computation Time (CT) and Throughput (TP) for the chaotic algorithms. This limitation hampers the assessment of their potential for real-time systems. Nevertheless, [ 61 , 88 ], which also employ chaotic encryption, can serve as inspiration for similar approaches. It should be noted that not all chaotic encryption algorithms are suitable for real-time systems, owing to their complex iterative operations. However, less resource-intensive methods like [ 60 ] could be considered viable alternatives. This aspect could be explored as a future research direction in the field.
Health data in IoT
Two papers ([ 22 , 34 ]) focus on health data transmissions from IoT devices, particularly in the context of remote patient monitoring. These devices typically prioritize low power consumption and low computational complexity. In [ 34 ], data is concealed within ECG signals, while [ 22 ] utilizes image steganography. Both papers employ encryption before embedding the data. In [ 34 ], the receiver must possess knowledge of the encryption and embedding keys, and no key is transmitted. On the other hand, [ 22 ] embeds both the data and the encryption key. [ 34 ] employs XOR cipher for its computational simplicity, while [ 22 ] utilizes AES ([ 30 ]) and RSA ([ 90 ]) encryption methods. It is worth considering more secure or efficient alternatives, such as TEA and its variants [ 50 ] or hardware-accelerated AES ([ 55 ]), for IoT devices. Both papers utilize multi-level DWT (Discrete Wavelet Transform) for steganography. These differences highlight the range of methodologies employed to safeguard patient data during IoT transmissions.
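To make the encrypt-then-embed pattern concrete, the sketch below hides an encrypted health reading in an image. A repeating-key XOR keystream and single-bit LSB substitution stand in for the AES/XOR ciphers and multi-level DWT embedding used in [ 22 , 34 ]; these substitutions, the toy reading, and the random cover image are assumptions made to keep the example short, not the papers' actual methods.

```python
import numpy as np

def xor_keystream(data: bytes, key: bytes) -> bytes:
    """Toy stream cipher: XOR each byte with a repeating key (not secure in practice)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def embed_lsb(cover: np.ndarray, payload: bytes) -> np.ndarray:
    """Hide the payload bits in the least significant bits of the flattened cover."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    stego = cover.copy().ravel()
    stego[: bits.size] = (stego[: bits.size] & 0xFE) | bits
    return stego.reshape(cover.shape)

def extract_lsb(stego: np.ndarray, n_bytes: int) -> bytes:
    """Recover n_bytes of payload from the stego image's LSB plane."""
    bits = stego.ravel()[: n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

key = b"\x13\x37\xc0\xde"                                      # hypothetical shared key
reading = b'{"hr": 72, "spo2": 98}'                            # toy health reading
cover = np.random.default_rng(6).integers(0, 256, size=(64, 64), dtype=np.uint8)

stego = embed_lsb(cover, xor_keystream(reading, key))          # encrypt first, then embed
recovered = xor_keystream(extract_lsb(stego, len(reading)), key)
assert recovered == reading
```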
Embedding location restrictions
Among the medical papers focused on healthcare data transmissions, two ([ 41 , 88 ]) discuss methods that impose restrictions on data embedding locations. In [ 88 ], the Distance Regularized Level Set Evolution (DRLSE) algorithm [ 42 ] is utilized to identify the Region of Interest (ROI) and Non-Region of Interest (NROI) in a medical image. Data is embedded in the NROI using adaptive Pixel Expansion Embedding (PEE) to achieve higher capacity. For the ROI, a custom algorithm based on histogram-shifting with contrast enhancement is employed to ensure visual clarity. In this paper, data embedding is performed before image encryption. In contrast, [ 41 ] also identifies ROI and NROI areas, specifically in DICOM images. However, in this case, the encryption process is conducted before the identification of these areas. Edge detection techniques such as the Gabor Filter and Canny Edge [ 55 ] are employed for area identification. Patient data is only embedded in the NROI to preserve image quality. Additionally, to maintain the verifiability of integrity, which is crucial in medical applications, an ROI-generated hash is embedded in the NROI. These approaches demonstrate different strategies for data embedding in specific areas of medical images, highlighting the preservation of image quality, visual clarity, and the importance of integrity verification in healthcare applications.
Transportation application domain
An article focuses on an application in the transportation domain, and its advantages and limitations are listed in Table 8 . In [ 48 ], a system is proposed to securely deliver diagnostic data to manufacturers and handle firmware updates. Although the system is innovative, there could be potential drawbacks, such as extended decryption times and potential inefficiency when dealing with larger software updates. To address these challenges, future work could investigate the utilization of more efficient cryptographic algorithms and adapt the method to better accommodate larger files, which is common when dealing with updates. Moreover, future research in the transportation domain could explore vehicle-to-vehicle (V2V) networks, where minimizing the speed and size of communication is essential.
4.3 General observations
Several observations can be made regarding all the journal articles. Firstly, the steganography methods commonly employed in the identified applications primarily focus on images, as indicated in Table 9 .
There is a noticeable underutilization of other cover mediums such as audio, signal, hardware, video, and text. This gap in research highlights the need for further investigation in these areas. Within the medical domain specifically, 7 out of 9 articles utilize image steganography. The choice of image-based steganography in medical applications is effective, considering the frequent use of medical imaging. However, there is potential for diversifying data types by exploring other forms of steganography, such as video steganography in recorded surgeries or expanding signal steganography beyond ECG signals. This diversification would enhance the usability and robustness of steganography in various systems.
Secondly, in certain applications ([ 22 , 44 ]), the encryption key is embedded together with the data in the cover medium. This eliminates the requirement for a separate communication channel (in the case of dynamic keys) or pre-established cryptographic keys.
Thirdly, it is noteworthy that 42% of the identified articles, spanning various application and technological domains, incorporate a Reversible Data Hiding (RDH) technique. RDH techniques enable the lossless reconstruction of the original cover media after the hidden data has been extracted. This capability is particularly crucial in sectors such as healthcare, where preserving the integrity of the original data, such as medical imagery, is often of utmost importance [ 13 , 22 , 41 , 44 , 61 , 88 ].
Based on these findings, it is evident that there is a need to diversify research in terms of methods and cover mediums . Attention should be given to addressing security challenges in government applications, while a more comprehensive assessment of the performance of chaotic algorithms in medical domains is required. Additionally, there is a call for exploring a wider range of steganography methods for healthcare data transmissions. In the transportation domain, it is advisable to explore other cryptographic algorithms to effectively handle larger data files. Overall, research efforts can significantly enhance data security across various sectors by addressing these areas of improvement.
4.4 RQ3: analyzing evaluation methods used
In this section, we discuss the analysis and evaluation methods utilized in the Journal articles, which are listed in Table 10 . The analysis of steganography typically revolves around four main concepts: capacity, robustness, security, and imperceptibility (sometimes divided into undetectability and invisibility) [ 4 , 68 , 82 ]. On the other hand, cryptography evaluation focuses on security, encryption time, key size, plain vs. cipher size , and other related factors [ 26 , 83 ]. Considering the similarities between these concepts, they are grouped into three perspectives: Security, Performance, and User. These perspectives are interconnected and interdependent, as demonstrated in Fig. 5 .
The three discussed analysis perspectives
4.4.1 Security perspective
Like cryptography, which is vulnerable to attacks such as ciphertext-only and known-plaintext attacks [ 49 ], steganography is susceptible to analogous attack types, including known-carrier and known-message attacks [ 49 ]. The significance of safeguarding against these attacks is contingent upon the order in which steganography and cryptography are applied.
When data is embedded first and then encrypted, the primary defense against attacks lies in the strength of the encryption itself. Several articles, such as [ 13 , 57 , 87 , 88 ] (listed in Table 5 in Sect. 4.2 ), follow this order of operations. Among these articles, some also address advanced attacks, including histogram equalization ([ 9 , 44 , 61 , 88 ]), while only one article tackles rotation attacks ([ 61 ]). Conversely, when data is encrypted first, the primary defense against attacks lies in the strength or imperceptibility of the stego object. The majority of applications follow this order of operations, as evidenced by articles such as [ 13 , 22 , 34 , 41 , 44 , 48 , 61 , 75 ], among others. These implementations primarily focus on achieving steganographic imperceptibility, utilizing metrics such as PSNR, SSIM, MSE, and BER. They rely heavily on cryptographic evaluations from previous works. Even articles proposing custom or more complex encryption methods ([ 34 , 44 , 48 , 56 , 61 , 88 ]) still analyze cryptographic security as an integral part of their evaluation.
The following insights are drawn based on the security perspective:
Vulnerability to attacks Similar to cryptography, steganography is prone to various attacks, including ciphertext and plaintext attacks. This underscores the necessity of implementing robust defenses to safeguard against potential security breaches.
Order of operations The sequence in which steganography and cryptography are applied influences the defense mechanisms against attacks. Whether data is embedded first and then encrypted, or vice versa, dictates where the primary defense lies, either in the strength of encryption or the imperceptibility of the stego object.
Advanced attack consideration Some articles address advanced attacks, such as histogram equalization and rotation attacks, highlighting the importance of considering sophisticated attack vectors that may compromise the invisibility of stego objects.
Emphasis on imperceptibility The majority of implementations prioritize achieving steganographic imperceptibility by encrypting data first. This emphasizes the importance of concealing hidden data within digital media while maintaining the appearance and quality of the original content.
Integration of cryptographic security Even articles proposing custom encryption methods analyze cryptographic security comprehensively. This integration underscores the interdependence between cryptographic measures and steganographic techniques in ensuring the overall security of hidden information.
4.4.2 Performance perspective
The performance of encountered systems can be influenced by several factors, including computation time (CT), capacity (related to steganography), and key size (related to cryptography).
Computation time , which encompasses both steganography and cryptography, is particularly important as it correlates with power consumption, making it a crucial consideration in real-time and power-sensitive systems. While some articles like [ 1 , 13 , 33 , 36 , 44 , 48 ] incorporate CT measurements, only two similar applications [ 1 , 36 ] specifically address the need for managing power consumption in their environments. CT measurements are often discussed as "total time" or analyzed individually for different components of the system, such as embedding time, extraction time, encryption time, and more. This approach allows for more targeted performance improvements. Interestingly, among the seven articles exploring applications in the Internet of Things (IoT), three articles [ 7 , 22 , 34 ] do not utilize time-based analysis metrics. This omission makes it challenging to accurately assess the performance and efficiency of their proposed applications. A time-based analysis is vital for a comprehensive understanding of application performance as it not only reveals the speed of processes but also provides insights into the efficient utilization of system resources.
Another significant metric to consider is capacity . The balance between imperceptibility and capacity holds importance depending on the specific application. In certain (real-time) applications where relatively small data fragments are shared, the capacity of the cover medium may not be as critical. In such cases, imperceptibility may also be of lesser relevance. Out of the 24 articles analyzed, capacity is evaluated in 9 articles [ 5 , 7 , 9 , 13 , 41 , 61 , 75 , 85 , 86 ], either in comparison to other implementations or by examining different parameters within the same implementation. It is worth noting that only one article ([ 7 ]) focusing on IoT applications specifically analyzed the capacity of the employed steganographic method. The key size in cryptographic algorithms can have a significant impact on encryption time, as explained in [ 40 ]. In the context of IoT, [ 22 ] specifically addresses cryptographic operations using an AES key size of 128 bits. Although AES-128 is generally regarded as secure, larger key sizes can be employed. The utilization of more efficient encryption algorithms could potentially allow for the use of larger keys while maintaining similar encryption times. Surprisingly, the discussion or justification of key sizes for well-known cryptographic algorithms does not appear to be frequently addressed in the analyzed literature.
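To make the key-size trade-off tangible, the rough sketch below times AES-128 against AES-256 in CTR mode using the third-party cryptography package; the 16 MiB dummy payload and single-run timing are illustrative assumptions rather than a benchmarking methodology drawn from the surveyed papers.

```python
import os
import time
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

payload = os.urandom(16 * 1024 * 1024)                 # 16 MiB of dummy data

for key_bits in (128, 256):
    key, nonce = os.urandom(key_bits // 8), os.urandom(16)
    encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    start = time.perf_counter()
    encryptor.update(payload)
    encryptor.finalize()
    print(f"AES-{key_bits}: {time.perf_counter() - start:.3f} s")
```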
Upon examining various research papers listed in Table 10 , the following insights regarding performance are observed:
The analysis revealed key considerations related to computation time, capacity, and key size.
Computation time was emphasized as critical due to its association with power consumption, especially in real-time and power-sensitive systems.
Capacity, concerning the balance between imperceptibility and capacity in steganography, was noted to vary depending on specific application requirements.
The analysis underscored the significant impact of key size selection in cryptographic algorithms on encryption time, highlighting the importance of careful consideration in algorithm design.
Despite the importance of these factors, the analysis revealed areas where certain metrics, such as time-based analysis in IoT applications, were lacking, making it challenging to comprehensively assess performance and efficiency.
4.4.3 User perspective
The user perspective evaluates how effectively a system incorporating steganography and cryptography aligns with the user's workflow, emphasizing factors such as ease of use, comprehension, trust, processing time, and system stability. The impact of the system on the user's workflow is particularly crucial for applications where the user directly interacts with the system. However, even in cases where the system operates in the background, it can still potentially influence the user's experience, albeit to a slightly lesser degree. From the reviewed literature, it is observed that only a limited number of studies include usability tests to analyze user experience. For instance, the implementation of an e-voting system discussed in [ 75 ] incorporates usability and user acceptance testing using Nielsen's quality components [ 58 ] and Davis' Technology Acceptance Model (TAM) [ 20 ], respectively. These well-established methods assess the usability and acceptance of the system. Similarly, the NFC access control scheme presented in [ 19 ] includes usability, perceived vulnerability, perceived security, and behavioral intention tests to examine how the proposed security scheme could influence user behavior. The methods utilized in this study were adapted from previous works [ 15 , 35 , 76 ].
Applications such as remote patient monitoring ([ 34 ]) aim to provide a user-friendly experience, requiring minimal complex setup from the user's perspective. It is mentioned that any additional complexity introduced by the implementation of steganography or cryptography should ideally be abstracted away from the user. However, the only user interaction highlighted in the article is related to the imperceptibility of the Human Visual System (HVS), where doctors inspect ECGs. Similarly, the application of hiding files in audio files on PCs [ 5 ] is closely related to end-users, but the article does not delve further into this aspect and omits user testing in this regard. This omission creates an evaluation gap, as it fails to comprehend the actual user experience and potential areas for improvement. User experience can be significantly influenced by other perspectives, such as security and performance. If the combination of steganography and cryptography leads to excessively slow data processing or if the system lacks robustness against attacks like compression or cropping, it could compromise the user's ability to effectively manage stego objects (e.g., share or post-process them). This vulnerability could potentially result in data loss or corruption, ultimately degrading the overall user experience. Therefore, robust implementations of steganography and cryptography are essential for maintaining a high-quality user experience.
After analyzing the User perspective criteria, we identify the following insights:
Despite the importance of user experience, there's a noted lack of usability tests in the reviewed literature, with only a few studies incorporating established methods like Nielsen's quality components and Davis' Technology Acceptance Model (TAM).
Applications aim to provide a user-friendly experience, with additional complexity introduced by steganography or cryptography ideally abstracted away from the user to ensure ease of use.
User experience can be significantly impacted by factors like security and performance, with slow data processing or lack of robustness against attacks compromising the effective management of stego objects and degrading overall user experience.
Robust implementations of steganography and cryptography are crucial for maintaining a high-quality user experience, highlighting the importance of considering user-centric factors in system design and evaluation.
4.5 General observations
In summary, the evaluation of steganography and cryptography requires a comprehensive analysis that encompasses security, performance, and user perspectives. Unfortunately, several studies overlook certain metrics, creating gaps in our understanding of computation time, capacity, key size, and user-friendliness. It is crucial to strike a balance between steganography and cryptography to ensure an optimal user experience, robust security, and efficient performance. Future research should aim to address these oversights and strive for a more comprehensive evaluation framework.
5 Conclusion and future scope
This review examines the state of combined steganography and cryptography applications in journal articles and conference papers, categorized by application and technological domains. While medical applications dominate, the IoT and Cloud Computing domains also show active research. Real-time constraints and privacy protection are prominent concerns in the technological domains. The combined approach provides data security and privacy benefits, but trade-offs and limitations remain, and further research is needed to address these challenges and improve methodologies. The evaluation metrics vary, emphasizing domain-specific knowledge. A comprehensive framework is proposed, incorporating security, performance, and user perspectives. However, there is a notable lack of user testing in the literature, highlighting the need for user-centric system design.

This review has limitations. It focused solely on conference papers for RQ1 due to time constraints; conference papers are valuable sources of the latest findings and innovative practices in the rapidly evolving field of information security, making them relevant not just for RQ1 but also for RQ2 and RQ3. Additionally, the search keywords were limited to "cryptography" and "steganography," whereas related terms such as "encryption" or "data hiding" may also be used in the literature.

Future research could explore applications in diverse domains such as transportation and energy. Comparative studies could shed light on the advantages of using steganography or cryptography individually in different scenarios. Further investigations into non-image steganographic media and into the impact of combining steganography and cryptography on end-user experience and acceptance are also warranted.
Data availability
The datasets generated during and/or analyzed during the current study are available from the corresponding author upon reasonable request.
Alissa, K.A., Maray, M., Malibari, A.A., Alazwari, S., Alqahtani, H., Nour, M.K., Al Duhayyim, M.: Optimal deep learning model enabled secure UAV classification for industry. Comput. Mater. Contin. 74 (3), 5349–5367 (2023)
Abbas, M.S., Mahdi, S.S., Hussien, S.A.: Security improvement of cloud data using hybrid cryptography and steganography. In: 2020 International Conference on Computer Science and Software Engineering (CSASE), pp. 123–127. IEEE (2020)
Al Abbas, A.A.M., Ibraheem, N.B.: Using DNA in a dynamic lightweight algorithm for stream cipher in an IoT application. In: 2022 International Symposium on Multidisciplinary Studies and Innovative Technologies (ISMSIT), pp. 232–240. IEEE (2022)
Al-Ani, Z.K., Zaidan, A.A., Zaidan, B.B., Alanazi, H.: Overview: main fundamentals for steganography. arXiv preprint arXiv:1003.4086 . (2010)
Al-Juaid, N., Gutub, A.: Combining RSA and audio steganography on personal computers for enhancing security. SN Appl. Sci. 1 , 1–11 (2019)
Ali, M.H., Al-Alak, S.: Node protection using hiding identity for IPv6 based network. In: 2022 Muthanna International Conference on Engineering Science and Technology (MICEST), pp. 111–117. IEEE (2022)
Alsamaraee, S., Ali, A.S.: A crypto-steganography scheme for IoT applications based on bit interchange and crypto-system. Bull. Electr. Eng. Inf. 11 (6), 3539–3550 (2022)
Anderson, R.J., Petitcolas, F.A.: On the limits of steganography. IEEE J. Sel. Areas Commun. 16 (4), 474–481 (1998)
Anushiadevi, R., Amirtharajan, R.: Design and development of reversible data hiding-homomorphic encryption & rhombus pattern prediction approach. Multimed. Tools Appl. 82 (30), 46269–46292 (2023)
Badhani, S., Muttoo, S.K.: Evading android anti-malware by hiding malicious applications inside images. Int. J. Syst. Assur. Eng. Manag. 9 , 482–493 (2018)
Banga, P.S., Portillo-Dominguez, A.O., Ayala-Rivera, V.: Protecting user credentials against SQL injection through cryptography and image steganography. In: 2022 10th International Conference in Software Engineering Research and Innovation (CONISOFT), pp. 121–130. IEEE (2022)
Bharathi, P., Annam, G., Kandi, J.B., Duggana, V.K., Anjali, T.: Secure file storage using hybrid cryptography. In: 2021 6th International Conference on Communication and Electronics Systems (ICCES), pp. 1–6. IEEE (2021)
Bhardwaj, R.: An improved reversible data hiding method in encrypted domain for E-healthcare. Multimed. Tools Appl. 82 (11), 16151–16171 (2023)
Bhattacharjee, S., Rahim, L.B.A., Watada, J., Roy, A.: Unified GPU technique to boost confidentiality, integrity and trim data loss in big data transmission. IEEE Access 8 , 45477–45495 (2020)
Bhuiyan, M., Picking, R.: A gesture controlled user interface for inclusive design and evaluative study of its usability. J. Softw. Eng. Appl. 4 (09), 513 (2011)
Bokhari, M.U., Shallal, Q.M.: A review on symmetric key encryption techniques in cryptography. Int. J. Comput. Appl. 147 (10), 43 (2016)
Castillo, R.E., Cayabyab, G.T., Castro, P.J.M., Aton, M.R.: Blocksight: a mobile image encryption using advanced encryption standard and least significant bit algorithm. In: Proceedings of the 1st International Conference on Information Science and Systems, pp. 117–121 (2018)
Caviglione, L., Podolski, M., Mazurczyk, W., Ianigro, M.: Covert channels in personal cloud storage services: the case of dropbox. IEEE Trans. Ind. Inf. 13 (4), 1921–1931 (2016)
Cheong, S.N., Ling, H.C., Teh, P.L.: Secure encrypted steganography graphical password scheme for near field communication smartphone access control system. Expert Syst. Appl. 41 (7), 3561–3568 (2014)
Davis, F.D.: Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q. 13 , 319–340 (1989)
Dhawan, S., Chakraborty, C., Frnda, J., Gupta, R., Rana, A.K., Pani, S.K.: SSII: secured and high-quality steganography using intelligent hybrid optimization algorithms for IoT. IEEE Access 9 , 87563–87578 (2021)
Elhoseny, M., Ramírez-González, G., Abu-Elnasr, O.M., Shawkat, S.A., Arunkumar, N., Farouk, A.: Secure medical data transmission model for IoT-based healthcare systems. IEEE Access 6 , 20596–20608 (2018)
Gamal, S.M., Youssef, S.M., Abdel-Hamid, A.: Secure transmission and repository platform for electronic medical images: case study of retinal fundus in teleophthalmology. In: 2020 International Conference on Computing, Electronics & Communications Engineering (iCCECE), pp. 9–14. IEEE (2020)
Ghuge, S.S., Kumar, N., Savitha, S., Suraj, V.: Multilayer technique to secure data transfer in private cloud for SaaS applications. In: 2020 2nd International Conference on Innovative Mechanisms for Industry Applications (ICIMIA), pp. 646–651. IEEE (2020)
Gupta, S., Goyal, A., Bhushan, B.: Information hiding using least significant bit steganography and cryptography. Int. J. Modern Educ. Comput. Sci. 4 (6), 27 (2012)
Gururaja, H.S., Seetha, M., Koundinya, A.K.: Design and performance analysis of secure elliptic curve cryptosystem. Int. J. Adv. Res. Comput. Commun. Eng. 2 (8), 1 (2013)
Haque, M.E., Zobaed, S.M., Islam, M.U., Areef, F.M.: Performance analysis of cryptographic algorithms for selecting better utilization on resource constraint devices. In: 2018 21st International Conference of Computer and Information Technology (ICCIT), pp. 1–6. IEEE (2018)
Sri, P.H., Chary, K.N.: Secure file storage using hybrid cryptography. Int. Res. J. Mod. Eng. Technol. Sci. (2022). https://doi.org/10.56726/IRJMETS32383
Hashim, M.M., Rhaif, S.H., Abdulrazzaq, A.A., Ali, A.H., Taha, M.S.: Based on IoT healthcare application for medical data authentication: Towards a new secure framework using steganography. In: IOP Conference Series: Materials Science and Engineering, vol. 881, no. 1, p. 012120. IOP Publishing (2020)
Heron, S.: Advanced encryption standard (AES). Netw. Secur. 2009 (12), 8–12 (2009)
Hussain, M., Wahab, A.W.A., Batool, I., Arif, M.: Secure password transmission for web applications over internet using cryptography and image steganography. Int. J. Secur. Appl. 9 (2), 179–188 (2015)
Hussein, A.A., Jumah Al-Thahab, O.Q.: Design and simulation a video steganography system by using FFTturbo code methods for copyrights application. Eastern-Euro. J. Enterp. Technol. 2 (9), 104 (2020)
Hussein, S.A., Saleh, A.I., Mostafa, H.E.D.: A new fog based security strategy (FBS 2) for reliable image transmission. J. Ambient Intell. Humaniz. Comput. 11 , 3265–3303 (2020)
Ibaida, A., Khalil, I.: Wavelet-based ECG steganography for protecting patient confidential information in point-of-care systems. IEEE Trans. Biomed. Eng. 60 (12), 3322–3330 (2013)
Ifinedo, P.: Understanding information systems security policy compliance: an integration of the theory of planned behavior and the protection motivation theory. Comput. Secur. 31 (1), 83–95 (2012)
Jain, D.K., Li, Y., Er, M.J., Xin, Q., Gupta, D., Shankar, K.: Enabling unmanned aerial vehicle borne secure communication with classification framework for industry 5.0. IEEE Trans. Ind. Inf. 18 (8), 5477–5484 (2021)
Jankowski, B., Mazurczyk, W., Szczypiorski, K.: PadSteg: introducing inter-protocol steganography. Telecommun. Syst. 52 , 1101–1111 (2013)
Kavitha, V., Sruthi, G.S., Thoshinny, B., Riduvarshini, S.R.: Stagchain–a steganography based application working on a blockchain environment. In: 2022 3rd International Conference on Electronics and Sustainable Communication Systems (ICESC), pp. 674–681. IEEE (2022)
Khan, H.A., Abdulla, R., Selvaperumal, S.K., Bathich, A.: IoT based on secure personal healthcare using RFID technology and steganography. Int. J. Electr. Comput. Eng. 11 (4), 3300 (2021)
Kumar, M.G.V., Ragupathy, U.S.: A survey on current key issues and status in cryptography. In: 2016 International Conference on Wireless Communications, Signal Processing and Networking (WiSPNET), pp. 205–210. IEEE (2016)
Kumar, N., Kalpana, V.: A novel reversible steganography method using dynamic key generation for medical images. Indian J. Sci. Technol. 8 (16), 1 (2015)
Li, C., Xu, C., Gui, C., Fox, M.D.: Distance regularized level set evolution and its application to image segmentation. IEEE Trans. Image Process. 19 (12), 3243–3254 (2010)
Madavi, K.B., Karthick, P.V.: Enhanced cloud security using cryptography and steganography techniques. In: 2021 International Conference on Disruptive Technologies for Multi-Disciplinary Research and Applications (CENTCON), vol. 1, pp. 90–95. IEEE (2021)
Mancy, L., Vigila, S.M.C.: A new diffusion and substitution-based cryptosystem for securing medical image applications. Int. J. Electron. Secur. Digit. Forens. 10 (4), 388–400 (2018)
Mandal, P.C., Mukherjee, I., Paul, G., Chatterji, B.N.: Digital image steganography: a literature survey. Inf. Sci. 609 , 1451–1488 (2022)
Mandal, S., Khan, D.A.: Enhanced-longest common subsequence based novel steganography approach for cloud storage. Multimed. Tools Appl. 82 (5), 7779–7801 (2023)
Manikandan, V.M., Masilamani, V.: Reversible data hiding scheme during encryption using machine learning. Proc. Comput. Sci. 133 , 348–356 (2018)
Mayilsamy, K., Ramachandran, N., Raj, V.S.: An integrated approach for data security in vehicle diagnostics over internet protocol and software update over the air. Comput. Electr. Eng. 71 , 578–593 (2018)
Mishra, R., Bhanodiya, P.: A review on steganography and cryptography. In: 2015 International Conference on Advances in Computer Engineering and Applications, pp. 119–122. IEEE (2015)
Mishra, Z., Acharya, B.: High throughput novel architectures of TEA family for high speed IoT and RFID applications. J. Inf. Secur. Appl. 61 , 102906 (2021)
Mogale, H., Esiefarienrhe, M., Letlonkane, L.: Web authentication security using image steganography and AES encryption. In: 2018 International Conference on Intelligent and Innovative Computing Applications (ICONIC), pp. 1–7. IEEE (2018)
More, S.S., Mudrale, A., Raut, S.: Secure transaction system using collective approach of steganography and visual cryptography. In: 2018 International Conference on Smart City and Emerging Technology (ICSCET), pp. 1–6. IEEE (2018)
Mostaghim, M., Boostani, R.: CVC: chaotic visual cryptography to enhance steganography. In: 2014 11th International ISC Conference on Information Security and Cryptology, pp. 44–48. IEEE (2014)
Munoz, P.S., Tran, N., Craig, B., Dezfouli, B., Liu, Y.: Analyzing the resource utilization of AES encryption on IoT devices. In: 2018 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC), pp. 1200–1207. IEEE (2018)
Nadernejad, E., Sharifzadeh, S., Hassanpour, H.: Edge detection techniques: evaluations and comparisons. Appl. Math. Sci. 2 (31), 1507–1520 (2008)
Bremnavas, I., Mohamed, I.R., Shenbagavadivu, N.: Secured medical image transmission through the two dimensional chaotic system. Int. J. Appl. Eng. Res. 10 (17), 38391–38396 (2015)
Neetha, S.S., Bhuvana, J., Suchithra, R.: An efficient image encryption reversible data hiding technique to improve payload and high security in cloud platforms. In: 2023 6th International Conference on Information Systems and Computer Networks (ISCON), pp. 1–6. IEEE (2023)
Nielsen, J., Molich, R.: Heuristic evaluation of user interfaces. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 249–256 (1990)
Nissar, A., Mir, A.H.: Classification of steganalysis techniques: a study. Digit. Signal Process. 20 (6), 1758–1770 (2010)
Pande, A., Zambreno, J.: A chaotic encryption scheme for real-time embedded systems: design and implementation. Telecommun. Syst. 52 , 551–561 (2013)
Parah, S.A., Ahad, F., Sheikh, J.A., Bhat, G.M.: Hiding clinical information in medical images: a new high capacity and reversible data hiding technique. J. Biomed. Inform. 66 , 214–230 (2017)
Patil, N., Kondabala, R.: Two-layer secure mechanism for electronic transactions. In: 2022 International Conference on Recent Trends in Microelectronics, Automation, Computing and Communications Systems (ICMACC), pp. 174–181. IEEE (2022)
Perwej, Y., Haq, K., Parwej, F., Mumdouh, M., Hassan, M.: The internet of things (IoT) and its application domains. Int. J. Comput. Appl. 975 (8887), 182 (2019)
Chen, C.P., Zhang, C.Y.: Data-intensive applications, challenges, techniques and technologies: a survey on big data. Inf. Sci. 275 , 314–347 (2014)
Phipps, A., Ouazzane, K., Vassilev, V.: Enhancing cyber security using audio techniques: a public key infrastructure for sound. In: 2020 IEEE 19th International Conference on Trust, Security and Privacy in Computing and Communications (TrustCom), pp. 1428–1436. IEEE (2020)
Pokharana, A., Sharma, S.: Encryption, file splitting and file compression techniques for data security in virtualized environment. In: 2021 Third International Conference on Inventive Research in Computing Applications (ICIRCA), pp. 480–485. IEEE (2021)
Prabu, S., Ganapathy, G.: Steganographic approach to enhance the data security in public cloud. Int. J. Comput. Aided Eng. Technol. 13 (3), 388–408 (2020)
Pradhan, A., Sahu, A.K., Swain, G., Sekhar, K.R.: Performance evaluation parameters of image steganography techniques. In: 2016 International Conference on Research Advances in Integrated Navigation Systems (RAINS), pp. 1–8. IEEE (2016)
Kumar, P., Sharma, V.K.: Information security based on steganography & cryptography techniques: a review. Int. J. 4 (10), 246–250 (2014)
Preethi, P., Prakash, G.: Secure fusion of crypto-stegano based scheme for satellite image application. In: 2021 Asian Conference on Innovation in Technology (ASIANCON), pp. 1–6. IEEE (2021)
Ramamoorthy, U., Loganathan, A.: Analysis of video steganography in military applications on cloud. Int. Arab J. Inf. Technol. 19 (6), 897–903 (2022)
Angel, N.A., Ravindran, D., Vincent, P.D.R., Srinivasan, K., Hu, Y.C.: Recent advances in evolving computing paradigms: cloud, edge, and fog technologies. Sensors 22 (1), 196 (2021)
Reshma, V., Gladwin, S.J., Thiruvenkatesan, C.: Pairing-free CP-ABE based cryptography combined with steganography for multimedia applications. In: 2019 International Conference on Communication and Signal Processing (ICCSP), pp. 0501–0505. IEEE (2019)
Rout, H., Mishra, B.K.: Pros and cons of cryptography, steganography and perturbation techniques. IOSR J. Electron. Commun. Eng. 76 , 81 (2014)
Issac, B., Rura, L., Haldar, M.K.: Implementation and evaluation of steganography based online voting system. Int. J. Electr. Gov. Res. 12 (3), 71–93 (2016)
Ryu, Y.S., Koh, D.H., Ryu, D., Um, D.: Usability evaluation of touchless mouse based on infrared proximity sensing. J. Usability Stud. 7 (1), 31–39 (2011)
Saleh, M.E., Aly, A.A., Omara, F.A.: Data security using cryptography and steganography techniques. Int. J. Adv. Comput. Sci. Appl. 7 (6), 390 (2016)
Sengupta, A., Rathor, M.: Structural obfuscation and crypto-steganography-based secured JPEG compression hardware for medical imaging systems. IEEE Access 8 , 6543–6565 (2020)
Shaji, A., Stephen, M., Sadanandan, S., Sreelakshmi, S., Fasila, K.A.: Phishing site detection and blacklisting using EVCS, steganography based on android application. In: International Conference on Intelligent Data Communication Technologies and Internet of Things (ICICI) 2018, pp. 1384–1390. Springer International Publishing (2019)
Siregar, B., Gunawan, H., Budiman, M.A.: Message security implementation by using a combination of hill cipher method and pixel value differencing method in mozilla thunderbird email client. In: Journal of Physics: Conference Series, vol. 1255, no. 1, p. 012034. IOP Publishing (2019)
Stanescu, D., Stratulat, M., Ciubotaru, B., Chiciudean, D., Cioarga, R., Micea, M.: Embedding data in video stream using steganography. In: 2007 4th International Symposium on Applied Computational Intelligence and Informatics, pp. 241–244. IEEE (2007)
Subhedar, M.S., Mankar, V.H.: Current status and key issues in image steganography: a survey. Comput. Sci. Rev. 13 , 95–113 (2014)
Wang, X., Zhang, J., Schooler, E.M., Ion, M.: Performance evaluation of attribute-based encryption: toward data privacy in the IoT. In: 2014 IEEE International Conference on Communications (ICC), pp. 725–730. IEEE (2014)
Wu, J., Liao, X., Yang, B.: Image encryption using 2D Hénon-Sine map and DNA approach. Signal Process. 153 , 11–23 (2018)
Xiong, L., Shi, Y.: On the privacy-preserving outsourcing scheme of reversible data hiding over encrypted image data in cloud computing. Comput. Mater. Contin. 55 (3), 523 (2018)
Xu, S., Horng, J.H., Chang, C.C., Chang, C.C.: Reversible data hiding with hierarchical block variable length coding for cloud security. IEEE Trans. Dependable Secure Comput. (2022). https://doi.org/10.1109/TDSC.2022.3219843
Yang, Y., Xiao, X., Cai, X., Zhang, W.: A secure and high visual-quality framework for medical images by contrast-enhancement reversible data hiding and homomorphic encryption. IEEE Access 7 , 96900–96911 (2019)
Zhang, L., Hu, X., Rasheed, W., Huang, T., Zhao, C.: An enhanced steganographic code and its application in voice-over-IP steganography. IEEE Access 7 , 97187–97195 (2019)
Zhang, X.G., Yang, G.H., Ren, X.X.: Network steganography based security framework for cyber-physical systems. Inf. Sci. 609 , 963–983 (2022)
Zhou, X., Tang, X.: Research and implementation of RSA algorithm for encryption and decryption. In: Proceedings of 2011 6th International Forum on Strategic Technology, vol. 2, pp. 1118–1121. IEEE (2011)
Sarmah, D.K., Kulkarni, A.J.: JPEG based steganography methods using cohort intelligence with cognitive computing and modified multi random start local search optimization algorithms. Inf. Sci. 430 , 378–396 (2018)
Yang, Y., Newsam, S.: Bag-of-visual-words and spatial extensions for land-use classification. In: Proceedings of the 18th SIGSPATIAL International Conference on Advances in Geographic Information Systems, pp. 270–279 (2010)
AID: A scene classification dataset, https://www.kaggle.com/datasets/jiayuanchengala/aid-scene-classification-datasets . Accessed 29 Feb 2024
Elshoush, H.T., Mahmoud, M.M.: Ameliorating LSB using piecewise linear chaotic map and one-time pad for superlative capacity, imperceptibility and secure audio steganography. IEEE Access 11 , 33354–33380 (2023)
Michaylov, K.D., Sarmah, D.K.: Steganography and steganalysis for digital image enhanced forensic analysis and recommendations. J. Cyber Secur. Technol. (2024). https://doi.org/10.1080/23742917.2024.2304441
Sarmah, D.K., Kulkarni, A.J.: Improved cohort intelligence—a high capacity, swift and secure approach on JPEG image steganography. J. Inf. Secur. Appl. 45 , 90–106 (2019)
Singh, O.P., Singh, A.K., Agrawal, A.K., Zhou, H.: SecDH: security of COVID-19 images based on data hiding with PCA. Comput. Commun. 191 , 368–377 (2022)
Singh, K.N., Baranwal, N., Singh, O.P., Singh, A.K.: SIELNet: 3D chaotic-map-based secure image encryption using customized residual dense spatial network. IEEE Trans. Consumer Electron. (2022). https://doi.org/10.1109/TCE.2022.3227401
Mahto, D.K., Singh, A.K., Singh, K.N., Singh, O.P., Agrawal, A.K.: Robust copyright protection technique with high-embedding capacity for color images. ACM Trans. Multimed. Comput. Commun. Appl. (2023). https://doi.org/10.1145/3580502
Author information
Authors and affiliations
SCS/EEMCS, University of Twente, P.O. Box 217, 7500AE, Enschede, Overijssel, The Netherlands
Indy Haverkamp & Dipti K. Sarmah
Contributions
Indy Haverkamp: Conceptualization, Methodology, Validation, Investigation, Formal Analysis, Data Curation, Writing—Original Draft, Visualization. Dipti Kapoor Sarmah: Methodology, Writing—Review & Editing, Visualization, Supervision, Project administration.
Corresponding author
Correspondence to Dipti K. Sarmah .
Ethics declarations
Conflict of interest
The authors have no competing interests to declare that are relevant to the content of this article.
Human and animal participants
Informed consent
All authors agreed with the content and all gave explicit consent to submit.
Additional information
Publisher's note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .
About this article
Haverkamp, I., Sarmah, D.K. Evaluating the merits and constraints of cryptography-steganography fusion: a systematic analysis. Int. J. Inf. Secur. 23 , 2607–2635 (2024). https://doi.org/10.1007/s10207-024-00853-9
Accepted: 12 April 2024
Published: 05 May 2024
Issue Date: August 2024
DOI: https://doi.org/10.1007/s10207-024-00853-9
- Image steganography
- Cryptography
- Real-world applications
- Evaluation perspectives
- Open access
- Published: 09 February 2024
Artificial intelligence and quantum cryptography
- Petar Radanliev (ORCID: orcid.org/0000-0001-5629-6857)
Journal of Analytical Science and Technology, volume 15, Article number: 4 (2024)
The technological advancements made in recent times, particularly in artificial intelligence (AI) and quantum computing, have brought about significant changes in technology. These advancements have profoundly impacted quantum cryptography, a field where AI methodologies hold tremendous potential to enhance the efficiency and robustness of cryptographic systems. However, the emergence of quantum computers has created a new challenge for existing security algorithms, commonly called the ‘quantum threat’. Despite these challenges, there are promising avenues for integrating neural network-based AI in cryptography, which has significant implications for future digital security paradigms. This summary highlights the key themes in the intersection of AI and quantum cryptography, including the potential benefits of AI-driven cryptography, the challenges that need to be addressed, and the prospects of this interdisciplinary research area.
Introduction
Quantum cryptography is an advanced subfield of cryptography that employs the principles of quantum mechanics to ensure secure communication. Unlike classical cryptography, which typically utilises complex mathematical algorithms to encode data, quantum cryptography uses the physical properties of quantum particles, such as photons, to create an inherently secure communication system.
The cornerstone of quantum cryptography is quantum key distribution (QKD), a method that enables two parties to generate a shared random secret key, which is essential for encrypting and decrypting messages in such a way that any eavesdropper’s presence can be detected. The security of QKD is rooted in fundamental quantum mechanical principles, such as the Heisenberg uncertainty principle and quantum entanglement.
The Heisenberg uncertainty principle states that measuring a quantum system inevitably alters its state. Thus, any attempt by an eavesdropper to intercept and measure the quantum keys will introduce detectable anomalies, alerting the communicating parties to the presence of an intrusion.
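The following minimal Monte Carlo sketch (a classical simulation, not a quantum implementation) illustrates this detection mechanism for a BB84-style exchange: an intercept-resend eavesdropper pushes the sifted-key error rate to roughly 25%, whereas an undisturbed channel shows essentially none. The number of transmitted bits and the EAVESDROP flag are illustrative assumptions.

```python
# Classical Monte Carlo sketch of BB84-style eavesdropping detection.
import numpy as np

rng = np.random.default_rng(42)
n = 20000

alice_bits = rng.integers(0, 2, n)
alice_bases = rng.integers(0, 2, n)          # 0 = rectilinear, 1 = diagonal

def measure(bits, prep_bases, meas_bases):
    """If bases match, the bit is read correctly; otherwise the outcome is random."""
    random_outcomes = rng.integers(0, 2, len(bits))
    return np.where(prep_bases == meas_bases, bits, random_outcomes)

EAVESDROP = True
if EAVESDROP:
    eve_bases = rng.integers(0, 2, n)
    eve_bits = measure(alice_bits, alice_bases, eve_bases)
    # Eve resends in her own basis, disturbing the states she guessed wrong.
    channel_bits, channel_bases = eve_bits, eve_bases
else:
    channel_bits, channel_bases = alice_bits, alice_bases

bob_bases = rng.integers(0, 2, n)
bob_bits = measure(channel_bits, channel_bases, bob_bases)

# Sifting: keep only positions where Alice's and Bob's bases agree.
keep = alice_bases == bob_bases
qber = np.mean(alice_bits[keep] != bob_bits[keep])
print(f"sifted key length: {keep.sum()}, QBER: {qber:.3f}")
# Roughly 0.25 QBER with an intercept-resend attacker versus ~0 without one,
# which is how the communicating parties detect the intrusion.
```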
Quantum entanglement is another fundamental concept in quantum mechanics that links two quantum particles so that the state of one instantaneously affects the state of the other, regardless of the distance separating them. This property can be used to establish a secure key between two parties.
The primary benefit of quantum cryptography is its potential to provide communication channels impervious to eavesdropping. It overcomes many limitations of traditional cryptographic methods, particularly in the face of advancing computational power such as quantum computers. This makes it a crucial area of study for ensuring the security of sensitive data in the quantum computing era.
The convergence of AI and quantum cryptography has been a recent topic of great interest among scientific and technological experts. Both fields have changed their respective industries: AI has made remarkable strides in healthcare and finance by leveraging its exceptional ability to process data, recognise patterns, and make informed decisions. In parallel, quantum cryptography provides unparalleled security based on physical laws, primarily through quantum key distribution (QKD) and related protocols.
The alignment of AI and quantum cryptography is no accident. In our present digital age, marked by significant data transfers and escalating cybersecurity threats, it’s logical to integrate AI’s computational power with quantum cryptography’s unbreakable security measures. By examining extensive amounts of data, AI algorithms have the potential to elevate quantum cryptographic procedures, making them more robust and efficient. Meanwhile, quantum cryptography can provide a secure framework for AI systems, ensuring that the data and algorithms they manage remain impervious to breaches.
Quantum cryptography has become increasingly important due to the imminent arrival of quantum computers. These computers can crack classical cryptographic codes in a short amount of time, which poses a significant threat to modern cybersecurity. Therefore, combining AI and quantum cryptography is not just an academic exercise but a necessary measure to address this pressing issue.
This review thoroughly explores the intersection of AI and quantum cryptography. We take a deep dive into the historical development of both areas, how they interact with each other, and the challenges and opportunities they bring, and we spotlight significant experiments and applications in the field. We aim to give readers a complete comprehension of the current research environment and to stress the immense potential of this combination for the future.
The convergence of AI and quantum cryptography represents a ground-breaking union of two transformative fields. AI has transformed how we process and analyse data, while quantum cryptography offers unparalleled security in information transmission. As these two domains continue to evolve, their intersection provides a captivating area for exploration. This paper explores the interplay, potential advancements, and challenges of AI and quantum cryptography.
Objectives of the study
This study aims to explore the historical background of AI and quantum cryptography and examine the current research and application scenario at their intersection. We will also analyse the challenges of integrating AI with quantum cryptography and highlight possible opportunities and prospects in this interdisciplinary field.
Research questions
How have the fields of artificial intelligence and quantum cryptography evolved historically?
How can AI improve Quantum Cryptographic protocols and vice versa?
What are the main challenges in combining AI and quantum cryptography?
What opportunities emerge from the interaction of AI and quantum cryptography, and how might they influence future research and applications?
The following sections will explore the exciting and interdisciplinary intersection, guiding researchers and enthusiasts.
A brief history of both AI and quantum cryptography
Introduction to cryptography
The study of cryptography, also known as cryptology, originates from the Greek words kryptós ("hidden" or "secret"), graphein ("to write"), and -logia ("study"). In Greek, cryptography thus means "secret writing" (Liddell 1894).
The basis of modern cryptography is cryptographic algorithms designed around the concept of ‘computational hardness assumption’ (Braverman et al. 2015 ). It finds practical applications in various sectors, such as chip-based payment cards, digital currencies, computer passwords, and military communications (Paar and Pelzl 2009 ). It plays a crucial role in cybersecurity and securing communications with encryption (e.g. HTTPS, PGP).
In the realm of cryptocurrencies and crypto-economics, Zero Knowledge Proofs (ZKP), cryptographic keys, and cryptographic hash functions are commonly used cryptographic techniques.
Encryption algorithms include the Triple Data Encryption Algorithm (3DEA, also known as Triple DES or 3DES) and the Advanced Encryption Standard (AES). 3DES encrypts data three times with the Data Encryption Standard (DES) cipher. DES itself is based on the Lucifer symmetric-key cipher and is standardised as the Data Encryption Algorithm (DEA) (Feistel 1971).
Another popular encryption method is the asymmetric RSA public-key encryption algorithm developed by Ron Rivest, Adi Shamir, and Leonard Adleman (Rivest et al. 1978 ).
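For illustration only, the toy walkthrough below reproduces the RSA key-generation, encryption, and decryption steps with tiny primes; real deployments use primes of a thousand or more bits together with padding schemes such as OAEP, so this is a teaching sketch rather than usable cryptography.

```python
# Toy, insecure RSA walkthrough with tiny primes (illustrative only).
from math import gcd

p, q = 61, 53                      # toy primes (far too small for real use)
n = p * q                          # public modulus
phi = (p - 1) * (q - 1)            # Euler's totient of n

e = 17                             # public exponent, must be coprime with phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)                # private exponent: modular inverse of e mod phi

message = 42                       # a message encoded as an integer < n
ciphertext = pow(message, e, n)    # encryption: m^e mod n
recovered = pow(ciphertext, d, n)  # decryption: c^d mod n

print(ciphertext, recovered)       # recovered == 42
```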
In addition, HIPAA regulatory compliance, the GDPR (GDPR 2023; ICO 2023), and PCI-DSS also play significant roles in ensuring the safety and security of sensitive information.
Cryptography vs cybersecurity
In recent years, most cryptographic development has been driven by cybersecurity. In this short section, we emphasise the specific strengths and vulnerabilities of recent cryptographic applications in cybersecurity.
First and foremost, good cryptography depends on the difficulty of the underlying mathematical problem. In other words, the encryption is only as strong as the mathematical problem on which the cryptographic algorithm relies.
The second factor is implementation quality because correct implementation is critical to the algorithm's security.
The third requirement is key secrecy, because secret keys must be stored somewhere, usually by a centralised trusted authority.
A hacker attempting to break a cryptosystem will therefore begin by attacking the underlying mathematical problem, looking for vulnerabilities in the implementation, or attempting to obtain access to the secret keys.
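A classic example of the implementation point is tag verification: the sketch below contrasts a naive byte-by-byte comparison, whose early exit leaks timing information, with Python's constant-time hmac.compare_digest. The key, message, and forged tag are illustrative.

```python
# Implementation quality matters even when the underlying math is sound.
import hmac, hashlib

key = b"secret-key"
message = b"transfer 100 to account 42"
tag = hmac.new(key, message, hashlib.sha256).digest()

def naive_verify(expected, received):
    if len(expected) != len(received):
        return False
    for a, b in zip(expected, received):
        if a != b:          # early exit -> data-dependent running time
            return False
    return True

def constant_time_verify(expected, received):
    # Runs in time independent of where the bytes differ.
    return hmac.compare_digest(expected, received)

forged = bytes([tag[0] ^ 1]) + tag[1:]
print(naive_verify(tag, forged), constant_time_verify(tag, tag))
```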
Quantum cryptography vs low memory cryptography
The National Institute of Standards and Technology (NIST) has announced Ascon as the algorithm that will serve as the official standard for lightweight cryptography on low-memory internet-of-things (IoT) devices. The process of selecting the best, most secure, and most efficient algorithm has been ongoing since the NIST competition was announced in 2018, and the standard may not be ready until late 2023. It is essential to note, however, that other institutes, such as ISO and ENISA, have yet to select the most appropriate algorithms. Other standard-setting organisations around the world will likely leverage NIST's efforts; the alternative is to go through this process themselves, leaving their IoT infrastructure vulnerable to cyber threats in the meantime.
According to NIST, the most notable aspect of the selection process was the effectiveness of the new algorithms: ‘most of the finalists exhibited performance advantages over NIST standards on various target platforms without introducing security concerns’. This is striking given that the NIST framework is among the most frequently updated and most widely recognised cybersecurity frameworks, while other standard-setting organisations have not even begun identifying a lightweight cryptographic standard despite the large number of candidate algorithms available. This reaffirms that cybersecurity and cryptography are strongly linked to the global standardisation of security frameworks and regulations.
The original NIST request for submissions for the lightweight cryptography standard resulted in 57 solutions submitted for review. Lightweight cryptography ensures that data is securely transmitted to and from the "innumerable" tiny IoT devices, necessitating a new category of cryptographic algorithms. Most IoT micromachines, sensors, actuators, and other low-memory devices used for network guidance and communication operate on very limited electrical power. These devices have minimal circuitry, like the electronics in keyless entry fobs and Radio-Frequency Identification (RFID) tags used in supply chains and warehouses. By comparison, even the most basic mobile phone has a far less constrained chip; the primary advantage of these Internet of Things technologies is their low cost and small size. Existing cryptographic algorithms require more computational power and electronic resources than such IoT devices have, so the primary weakness of IoT devices is tied directly to their primary strength.
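As a rough illustration of this cost concern, the sketch below times AES-GCM on a tiny sensor-style payload using the Python cryptography library; numbers measured on a laptop only gesture at the issue, since on a microcontroller both the per-message computation and the 28 bytes of nonce-plus-tag overhead weigh far more heavily. The payload and iteration count are illustrative.

```python
# Measuring the per-message cost of a conventional AEAD on a tiny payload.
import os, time
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)
aead = AESGCM(key)
reading = b'{"temp":21.7,"id":"sensor-12"}'      # 30-byte IoT-style message

t0 = time.perf_counter()
for _ in range(10_000):
    nonce = os.urandom(12)
    ct = aead.encrypt(nonce, reading, None)      # ciphertext + 16-byte tag
elapsed = time.perf_counter() - t0

overhead = len(nonce) + len(ct) - len(reading)   # bytes added per message
print(f"avg encrypt time: {elapsed / 10_000 * 1e6:.1f} µs, overhead: {overhead} bytes")
```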
Quantum cryptography presents a unique approach compared to lightweight cryptography like Ascon, which caters to low-memory devices like IoT devices. It follows the principles of quantum mechanics and primarily focuses on quantum key distribution (QKD), offering security that is theoretically impossible to break.
NIST is concentrating on Ascon to protect data on small IoT devices with limited computing abilities. On the other hand, quantum cryptography aims to utilise the distinctive characteristics of quantum bits (qubits) for secure communication, regardless of the device's computational power. One of the main obstacles of quantum cryptography is its current scalability and compatibility with conventional communication systems. Lightweight cryptography, on the other hand, must maintain security despite limited computational resources. Due to their computational limitations, IoT devices face challenges in employing conventional cryptographic algorithms. If direct quantum cryptography methods were to be implemented, these devices could face even more significant difficulties.
The convergence of classical and quantum domains has paved the way for developing hybrid cryptographic techniques that can provide enhanced security measures, even on low-power devices. Such solutions are designed to combine the strengths of both classical and quantum systems, ensuring the utmost protection of sensitive data and information. By leveraging the unique properties of quantum mechanics, hybrid cryptographic algorithms can overcome the limitations of classical cryptography and offer advanced levels of security that are essential in today's digital age.
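One hedged sketch of such a hybrid construction is shown below: a classical X25519 shared secret is combined, via HKDF, with a second secret assumed to arrive over a quantum-safe channel (QKD or a post-quantum KEM, represented here by a placeholder random byte string), so the derived session key remains safe as long as either ingredient does. This is an illustrative pattern, not a standardised scheme.

```python
# Hybrid key derivation: classical ECDH secret combined with a quantum-safe secret.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Classical Diffie-Hellman over Curve25519.
alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()
classical_secret = alice_priv.exchange(bob_priv.public_key())

# Placeholder for a secret delivered by a quantum-safe mechanism (QKD or PQC KEM).
quantum_safe_secret = os.urandom(32)

# Combine both ingredients into one session key with HKDF.
session_key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"hybrid-classical+quantum-demo",
).derive(classical_secret + quantum_safe_secret)

print(session_key.hex())
```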
Review of advancements in artificial intelligence
Although the concept of machines and statues that mimic human thought and behaviour can be found in ancient myths and legends, the scientific field of AI emerged in the mid-twentieth century. In 1950, British mathematician Alan Turing proposed the Turing Test as a benchmark for a machine's ability to exhibit intelligent behaviour indistinguishable from that of a human.
Over the years, AI research has experienced peaks and troughs, commonly called "AI winters" and "AI springs." In the 1960s, there was considerable optimism and funding for AI, as early problem-solving algorithms and knowledge representation showed potential. However, computational limitations and the difficulty of emulating human intelligence soon became apparent. The 1980s witnessed a revival with the development of expert systems, which mimicked human decision-making skills. Nevertheless, by the end of the decade, the shortcomings of these systems became more apparent. In Fig. 1, we can visually compare the complexity of different algorithms.
Fig. 1. Navigating through popular and traditional ML algorithms
Some of the more complex algorithms seen in Fig. 1 did not exist in the 1980s. The twenty-first century has brought remarkable progress in computational power and data accessibility. With the help of machine learning and deep learning, machines can now handle extensive datasets and efficiently perform tasks such as speech and image recognition. As a result, AI has become a crucial component of modern technological advancement.
Review of advancements in quantum cryptography
The foundation of quantum cryptography can be traced back to the early twentieth century. Quantum mechanics raised challenges and opportunities for information processing due to the counterintuitive properties of quantum systems, such as superposition and entanglement.
During the 1970s and 1980s, there were significant advancements made in quantum information theory. Charles Bennett and Gilles Brassard introduced the quantum key distribution (QKD) concept in 1984 with the BB84 protocol, based on previous quantum mechanics and information theory research. This protocol utilised quantum mechanics principles to allow two parties to create a shared, secret random key that was secure due to physical laws.
In the years that followed, there was significant development in both the theoretical and practical aspects of quantum cryptography. Besides key distribution, quantum cryptographic protocols have expanded their scope to include quantum digital signatures and secure direct communication. With progress in photonics and quantum technologies, these protocols have been implemented and tested in real-world scenarios, opening doors for commercial quantum-secure communication networks.
Although originating from different scientific traditions, AI and quantum cryptography have converged through fundamental insights, technological advancements, and a continuous pursuit of understanding and innovation. This convergence presents numerous opportunities and challenges, potentially transforming information security and computational intelligence.
Review of the integration of AI with quantum cryptography
The technological advancements in AI and quantum computing have been monumental, leading to significant changes in various domains, including cryptography. One of the primary objectives of integrating AI with quantum cryptography is to harness AI’s computational prowess to enhance the efficiency, security, and robustness of quantum cryptographic systems. AI methodologies, with their ability to process vast amounts of data, recognise intricate patterns, and adapt to new scenarios, can significantly contribute to optimising quantum cryptographic protocols and addressing the complex challenges they face.
In parallel, quantum cryptography offers a unique avenue to safeguard AI systems, given its foundational security based on the laws of quantum mechanics. This integration is timely and relevant in our digital era, characterised by extensive data exchanges and escalating cybersecurity threats. Here, the role of AI becomes crucial. By analysing and interpreting large datasets, AI algorithms can play a pivotal role in elevating the security and effectiveness of quantum cryptographic practices.
However, the emergence of quantum computers has introduced a new and formidable challenge for cryptographic systems –the ‘quantum threat.’ This threat looms over traditional cryptographic methods, rooted in the fact that quantum computers have the potential to break many of the cryptographic algorithms currently in use. Thus, the synergy of AI and quantum cryptography is not merely an academic pursuit but a necessary evolution in our approach to digital security. AI-driven methodologies in quantum cryptography aim to anticipate, mitigate, and robustly defend against the quantum threat, ensuring a secure computational future.
This review delves deep into the interaction between AI and quantum cryptography, exploring their historical development, the challenges presented by the advent of quantum computing, and the transformative potential of their integration. By doing so, it aims to provide a comprehensive understanding of the current landscape and the exciting prospects this interdisciplinary fusion offers for the future of secure computation.
Review of the quantum threat
The ‘quantum threat’ refers to the potential vulnerability of existing cryptographic systems in the face of advanced quantum computing capabilities. Cryptographic methods like RSA and ECC (Elliptic Curve Cryptography) depend on the computational difficulty of specific mathematical problems. For example, RSA relies on the challenge of factoring large prime numbers, and ECC depends on the complexity of solving the elliptic curve discrete logarithm problem. These problems, currently considered difficult for classical computers, could potentially be solved efficiently by quantum computers using algorithms such as Shor’s algorithm.
Quantum computers operate on principles of quantum mechanics, such as superposition and entanglement, to process information differently than classical computers. This capability allows them to perform specific calculations much more efficiently than traditional computers. Shor’s algorithm demonstrates that a quantum computer could factor large numbers exponentially faster than the best-known algorithms running on a classical computer. As a result, the encryption systems that depend on the difficulty of these problems for security would become vulnerable once sufficiently powerful quantum computers are developed.
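The number-theoretic core that Shor's algorithm exploits can be illustrated classically, as in the sketch below: once the order r of a modulo N is known, gcd(a^(r/2) ± 1, N) usually yields a nontrivial factor. The quantum speed-up lies entirely in finding r; the brute-force order search used here is exactly what becomes infeasible for the large moduli used by RSA. N = 15 and a = 7 are toy values.

```python
# Classical illustration of the number theory behind Shor's factoring algorithm.
from math import gcd

def order(a, N):
    """Smallest r > 0 with a^r ≡ 1 (mod N) — brute force, infeasible for large N."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_factor(N, a):
    if gcd(a, N) != 1:
        return gcd(a, N)              # lucky guess already shares a factor
    r = order(a, N)                   # the step a quantum computer speeds up
    if r % 2 == 1:
        return None                   # need an even order; try another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None                   # trivial case; try another a
    return gcd(y - 1, N)

N = 15
print(shor_factor(N, 7))              # prints 3 (or 5), a nontrivial factor of 15
```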
The quantum threat is not just a theoretical concern but a near-future reality. The advent of quantum computing thus necessitates the development of new cryptographic systems that are secure against quantum attacks, often referred to as ‘quantum-resistant’ or ‘post-quantum’ cryptography. These systems aim to use algorithms and cryptographic methods that quantum computers cannot efficiently break.
Integrating AI with quantum cryptography is a strategic response to this threat. AI's advanced pattern recognition and predictive capabilities can aid in developing, testing, and optimising quantum-resistant algorithms. Moreover, AI can contribute to the real-time assessment and adaptation of cryptographic systems, making them more resilient against the rapidly evolving landscape of quantum computing. This makes the convergence of AI and quantum cryptography a critical area of research for ensuring data security and privacy in the forthcoming quantum computing era.
Research methodology
This research employs a qualitative approach within an interpretive paradigm to comprehensively investigate the intricate relationship between AI and quantum cryptography. With the emergence of standardised tools and ontologies that strive to enhance information exchange and automate vulnerability management, the cybersecurity landscape is evolving rapidly. One such tool is the ‘Reference Ontology for Cybersecurity Operational Information’ (Takahashi and Kadobayashi 2015 ). This tool provides a structured framework for cybersecurity information and facilitates its exchange within the domain of cybersecurity operations. This approach proposes a reference ontology for cybersecurity operational information that promotes collaboration and information exchange among organisations. The ontology structures cybersecurity information and aligns with industry specifications. The authors worked with cybersecurity organisations to develop the ontology and demonstrated its usability by discussing industry specification coverage. They also established an adaptable information structure that complements industry specifications and outlines a prototype cybersecurity knowledge base that facilitates information exchange. This article explores the potential usage scenarios of the ontology and knowledge base in cybersecurity operations. The proposed ontology aims to advance the exchange of cybersecurity information.
The CYBEX framework (Rutkowski et al. 2010 ) is a significant step towards establishing a global standard for exchanging cybersecurity information. As an ITU-T initiative, CYBEX aims to standardise how cybersecurity entities communicate and ensure the integrity of this exchange. Introducing CYBEX will reduce the fragmentation of cybersecurity information availability, allowing for a more uniform defence posture worldwide. This paper outlines the framework’s specifications, practical applications, and progress. CYBEX is uniquely structured around five functional blocks: Information Description, Information Discovery, Information Query, Information Assurance, and Information Transport. Together, these blocks enhance the automation and efficiency of cybersecurity operations, potentially reducing human error and operational costs. While these works provide valuable insights and contribute to the overarching goals of security information exchange and vulnerability management, they are not the central focus of this survey. As such, our research does not delve into these areas in detail but acknowledges their significance in the broader context of cybersecurity.
This research aims to enhance our understanding of the impact of these two technological advancements on cybersecurity. This study is informed by global efforts to develop, refine, and establish a range of quantum-safe cryptography algorithms (Kumar Sep. 2022 ).
Data collection
We employed two primary methodologies to gather data. Firstly, we gathered primary data from industry standards and guidelines (Nist et al. 2016 ; NIST 2023a , b , 2011 ; Tabassi 2023 ; https://www.nist.gov/news-events/news/2018/04/nist-releases-version-11-its-popular-cybersecurity-framework ). Then, we conducted a case study with the authors and the organisations behind these standards. These interactions were systematically recorded, transcribed, and coded for further analysis. The process is recorded and can be visualised in Fig. 2 .
Fig. 2. AI model evaluation and validation
Secondly, we conducted a comprehensive literature review, examining reputable scholarly journals and books. Our focus was on papers that critically evaluated the role of encryption in the context of AI and quantum mechanics (Kop 2023), particularly the literature on quantum technology applications (Broadbent et al. 2015) and their societal impact, which were integrated during the analysis (Elaziz and Raheman 2022).
Data analysis
Thematic analysis (Yin 2009a ) was the primary method to analyse the interactions between national and international standards. To begin with, preliminary codes were generated based on thoroughly scrutinising the interactions (Eisenhardt 1989 ). These codes were then sorted and organised into more comprehensive themes. It was a detailed and iterative analysis process, requiring ongoing data review to ensure an accurate representation (Yin 2009b ). Moreover, valuable insights from academic literature were incorporated into the analysis (Eisenhardt 1989 ), explicitly focusing on quantum technology applications' societal impact (Alyami et al. 2021 ).
Validation procedures
To uphold the validity of our research, we employed the triangulation technique for evaluating software security through quantum computing techniques, such as the durability perspective (Alyami et al. 2021), the hybrid fuzzy ANP-TOPSIS approach (Agrawal et al. 2020), and the integrated hesitant fuzzy-based decision-making framework for evaluating sustainable and renewable energy (Sahu et al. 2023). This involved verifying the insights we derived from case study interactions against the conclusions drawn from scholarly literature. Furthermore, we engaged with peer-reviewed papers and assessed specific data portions and the corresponding analyses; these sources were pivotal in anchoring the research's findings and aligning them with the broader academic dialogue.
Review of the interplay between AI and quantum cryptography
The convergence of AI (Ying 2010 ) and quantum cryptography (Shapna Akter 2023) is a fascinating development that offers exciting computational and information security possibilities. This intersection represents a novel approach to secure communication and intelligent data processing that has the potential to revolutionise the way we perceive and utilise technological advancements. In this article, we will delve deeper into this fusion, closely examining its technical details, recent progress, and challenges to regulatory standards. This comprehensive analysis aims to provide a more nuanced understanding of this cutting-edge field and its potential implications for the future of technology and security.
AI in quantum cryptography
In modern cryptography (Paar and Pelzl 2009), one can find S-boxes, complex mathematical structures that are essential components within many symmetric-key algorithms. These S-boxes have been created using vectorial Boolean functions in conjunction with AI, specifically by utilising neural network-based techniques (Nitaj and Rachidi 2023). This AI-driven approach allows for a more streamlined design process and supports the analysis of cryptographic properties, ultimately yielding more optimised and secure cryptographic protocols (Sevilla and Moreno 2019). Through this method, the speed and efficiency of the design process are enhanced (Ying 2010; Diffie and Hellman 1976) while also ensuring that the result is a robust and reliable cryptographic protocol (Ayoade et al. 2022).
Optimising quantum key distribution (QKD)
Quantum cryptography is a highly secure communication method based on the principles of quantum mechanics. It relies on the QKD (quantum key distribution) method, which allows two parties to exchange a secret, shared random key for encrypting and decrypting their messages. The BB84 protocol is a well-known example of the QKD methods (Shamshad et al. 2022 ).
QKD is a highly secure method but is not immune to errors and security breaches. Hence, AI has the potential to enhance QKD in several ways.
Firstly, AI can help with error correction, an inevitable occurrence in any real-world QKD system. By predicting and correcting errors, AI can ensure the quantum key’s integrity, which is essential for maintaining the security of the communication channel.
Secondly, AI-powered techniques can continuously monitor QKD systems to detect potential security breaches or eavesdropping attempts. This enhances security analysis and keeps the system safe from unauthorised access or tampering.
Finally, AI algorithms can optimise the rate of quantum key generation (Ying 2010 ) by considering environmental factors and hardware performance. This helps generate a quicker and more efficient key rate, crucial for high-speed communication channels. By leveraging AI-powered techniques, QKD can become even more secure and reliable, paving the way for the future of secure communication.
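As a minimal sketch of the monitoring idea (not a production system or any vendor's API), the snippet below watches a stream of quantum bit error rate (QBER) readings, learns a baseline from a calibration window, and flags windows whose mean deviates sharply, as an intercept-resend attack would cause. The simulated data, window size, and z-score threshold are illustrative.

```python
# Simple statistical monitoring of a simulated QKD link's QBER stream.
import numpy as np

rng = np.random.default_rng(7)

# Simulated QBER readings: a noisy baseline around 2%, with an intrusion-like
# jump towards 25% (intercept-resend level) in the final stretch.
baseline = rng.normal(0.02, 0.004, 400)
attack = rng.normal(0.25, 0.01, 50)
qber_stream = np.concatenate([baseline, attack]).clip(0, 0.5)

# Learn the normal regime from an initial calibration window.
calib = qber_stream[:200]
mu, sigma = calib.mean(), calib.std()

window = 20
for start in range(0, len(qber_stream) - window, window):
    chunk = qber_stream[start:start + window]
    z = (chunk.mean() - mu) / (sigma / np.sqrt(window))   # z-score of the window mean
    if z > 5:                                             # illustrative threshold
        print(f"samples {start}-{start + window}: QBER {chunk.mean():.3f} -> possible eavesdropping")
```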
Quantum cryptography in AI: securing AI systems
In today's technologically advanced world, industries that rely on AI must prioritise the security of their algorithms and the data they handle. Data breaches can have severe consequences, including reputational damage and financial loss. One way to add an extra layer of security to AI systems is by using quantum cryptographic techniques. These techniques use the principles of quantum mechanics to protect data from potential attackers, making it computationally infeasible for anyone to breach the system. By implementing these advanced security measures, industries can help ensure the safety and integrity of their AI systems and the sensitive data they process.
Quantum principles in AI algorithms
The principles that govern the world of quantum physics differ vastly from those of classical physics, and they can be a source of inspiration and innovation in the design of advanced AI algorithms. One such phenomenon is quantum entanglement, which has been proposed as a way to optimise AI algorithms, particularly the training of neural networks (Ying 2010), potentially yielding faster and more efficient AI models. Because entangled particles share correlated states regardless of the distance between them, researchers have suggested that entanglement could be leveraged to develop AI models that process information in previously impossible ways. Such advances could reshape the field of AI and pave the way for even more sophisticated applications.
Regulatory landscape and standards
Integrating AI technology with quantum cryptography has presented novel challenges (Kop 2023 ) in regulatory and standards compliance (Ying 2010 ). To address this, various international organisations have come together to establish comprehensive guidelines and protocols for ensuring the reliability and security of quantum cryptographic systems. These efforts aim to establish a dependable and trustworthy framework to support the continued development and deployment of advanced quantum cryptographic solutions.
Notable advancements in data privacy and security have been made with the help of prominent organisations such as ISO/IEC (ISO 2022 , 2017 , 2023 ; NIST 2023a , b , c , d , e , 2001 , f , g , 2022a , b , 2018 , 2014 , 2011 ; Tabassi 2023 ; SWID 2023 ; Petrov 2021 ; Udroiu et al. 2022 ; Catril Opazo 2021 ; NIST 2020 ; NIST 800-53 2020 ; NIST Advanced Manufacturing Office 2013 ; Johnson et al. 2016 ; https://advisera.com/27001academy/what-is-iso-27001/ ; https://www.nist.gov/news-events/news/2018/04/nist-releases-version-11-its-popular-cybersecurity-framework ; https://csrc.nist.gov/Projects/block-cipher-techniques ; https://csrc.nist.gov/Projects/post-quantum-cryptography ; https://csrc.nist.gov/Projects/lightweight-cryptography ; https://csrc.nist.gov/Projects/pec ; https://www.nist.gov/cyberframework/getting-started ), and EU/UK GDPR (GDPR 2023 ; ICO 2023 ). These entities have provided valuable insights and guidelines for protecting sensitive information, thus promoting user trust and confidence. With their contributions, the industry is better equipped to address emerging threats and challenges, paving the way for a more secure digital landscape.
The International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC) have taken on the critical task of launching projects aimed at standardising quantum cryptographic protocols, including the key-establishment procedures vital for the secure transmission of sensitive and confidential information. These projects help ensure that quantum cryptography is widely accepted as a trusted and reliable method for secure communication in various industries, including finance, healthcare, and government. With the standardisation of these protocols, organisations can have greater confidence in the security of their communication systems, which is essential in today's increasingly interconnected world.
The National Institute of Standards and Technology (NIST) (NIST 2023a, b), a federal agency within the United States Department of Commerce, has extensively developed benchmarks and standards for quantum cryptographic systems. This ensures these systems meet rigorous security requirements for safeguarding sensitive information in the quantum computing era. NIST's efforts aim to promote a secure and reliable framework for quantum communication and cryptography, which are expected to play a vital role in the future of cybersecurity.
AI regulation presents a unique set of challenges. While standardisation issues arise in the quantum realm, AI faces its own regulatory obstacles, including concerns about data privacy, ethical considerations, and transparency in decision-making. Addressing these concerns requires global conversations on how best to regulate AI. For instance, the General Data Protection Regulation (GDPR) in the European Union places transparency and accountability requirements on automated decision-making, encouraging the responsible use of AI. The challenges of regulating AI are complex and multifaceted, but addressing them is necessary to ensure the safe and responsible development and use of this technology.
The merging of AI and quantum cryptography presents a promising future, but obstacles remain in implementation, refinement, and compliance with legal requirements. Achieving the full potential of this combination calls for a collaborative approach involving scholars, policymakers, and industry professionals, and the challenges must be acknowledged and addressed together as the field progresses.
Challenges and opportunities: integrating AI and quantum cryptography
The intersection of AI and quantum cryptography presents exciting possibilities, but combining these two ground-breaking fields is challenging. This chapter delves into the significant challenges and opportunities resulting from their integration. Neural network-based AI has shown considerable promise in enhancing cryptographic systems, with several practical applications demonstrating its potential. Notably, machine learning techniques have been used to design and optimise S-boxes (substitution boxes) in symmetric key cryptography. S-boxes are critical components of many cryptographic algorithms, such as AES (Advanced Encryption Standard), where they introduce nonlinearity and confusion into the encryption process. AI-driven methods can analyse the properties of S-boxes, such as nonlinearity and differential uniformity, to guide the design of more secure and efficient cryptographic algorithms, as sketched below.
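To make the kind of property such methods would evaluate concrete, the following minimal sketch computes the nonlinearity of a small S-box via the Walsh-Hadamard transform. The 4-bit S-box values are purely illustrative (they are not AES's S-box), and the routine is a brute-force evaluation rather than any published AI-driven search.

```python
# Nonlinearity of a small S-box via the Walsh-Hadamard transform.
# The 4-bit S-box below is illustrative only (it is not AES's S-box).

def parity(x: int) -> int:
    """Parity (XOR of all bits) of an integer."""
    return bin(x).count("1") & 1

def nonlinearity(sbox: list[int], n: int) -> int:
    """Nonlinearity = 2^(n-1) - max |Walsh coefficient| / 2, taken over all
    nonzero output masks b and all input masks a."""
    max_walsh = 0
    for b in range(1, 2 ** n):           # nonzero output mask
        for a in range(2 ** n):          # input mask
            w = sum((-1) ** (parity(b & sbox[x]) ^ parity(a & x))
                    for x in range(2 ** n))
            max_walsh = max(max_walsh, abs(w))
    return 2 ** (n - 1) - max_walsh // 2

# Example 4-bit S-box (a permutation of 0..15 chosen for illustration).
toy_sbox = [0xE, 0x4, 0xD, 0x1, 0x2, 0xF, 0xB, 0x8,
            0x3, 0xA, 0x6, 0xC, 0x5, 0x9, 0x0, 0x7]
print("nonlinearity:", nonlinearity(toy_sbox, 4))
```

An optimisation loop (whether evolutionary, reinforcement-learning-based, or otherwise) would score candidate S-boxes with metrics like this one and keep the best candidates.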
Another application is in the field of cryptanalysis. AI algorithms and deep learning models have been used to perform automated cryptanalysis on various cryptographic algorithms. By training neural networks on examples of plaintext and the corresponding ciphertext, these models can attempt to predict the key or decipher messages without it, thereby identifying potential vulnerabilities in cryptographic algorithms.
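A minimal sketch of the training setup, under strong simplifying assumptions: a small neural network learns one key bit of a deliberately weak toy cipher (a single XOR byte) from plaintext/ciphertext pairs. Real ciphers are designed precisely so that this direct approach fails; the example only illustrates how such a dataset and model are assembled.

```python
# Toy "neural cryptanalysis": learn a key bit of a one-byte XOR cipher
# from (plaintext, ciphertext) pairs. The cipher is deliberately weak;
# modern ciphers are built so that this kind of direct learning fails.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def encrypt(p: int, k: int) -> int:
    return p ^ k                          # toy cipher: single XOR byte

def to_bits(x: int) -> list[int]:
    return [(x >> i) & 1 for i in range(8)]

# Build a dataset: random plaintexts and keys, label = key bit 0.
X, y = [], []
for _ in range(5000):
    p, k = int(rng.integers(256)), int(rng.integers(256))
    c = encrypt(p, k)
    X.append(to_bits(p) + to_bits(c))
    y.append(k & 1)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
clf.fit(np.array(X[:4000]), np.array(y[:4000]))
print("key-bit accuracy:", clf.score(np.array(X[4000:]), np.array(y[4000:])))
```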
In addition to enhancing traditional cryptographic systems, neural network-based AI plays a pivotal role in addressing the challenges posed by quantum computers. Quantum computers exploit specific vulnerabilities in widely used cryptographic algorithms. For instance, Shor's algorithm takes advantage of a quantum computer's ability to efficiently factor large numbers, thereby breaking RSA encryption, which relies on the difficulty of factoring the product of two large prime numbers. Similarly, quantum computers can efficiently solve the discrete logarithm problem, undermining the security of elliptic curve cryptography (ECC) and the Diffie-Hellman key exchange.
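For context, a textbook-scale illustration of why RSA's security collapses once factoring is easy: with toy primes, recovering p and q immediately yields the private exponent. The numbers are pedagogical; real RSA moduli are 2048 bits or more, and Shor's algorithm is what would make factoring them feasible on a large quantum computer.

```python
# Textbook RSA with toy parameters: once n is factored, the private key
# follows immediately. Shor's algorithm threatens exactly this step.
from math import gcd

p, q = 61, 53                 # toy primes (never this small in practice)
n = p * q                     # public modulus
phi = (p - 1) * (q - 1)
e = 17                        # public exponent, coprime to phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)           # private exponent (modular inverse, Python 3.8+)

m = 42
c = pow(m, e, n)              # encrypt
assert pow(c, d, n) == m      # decrypt with the legitimate private key

# An attacker who can factor n (trivial here, hard classically for large n,
# feasible with Shor's algorithm on a large quantum computer) recovers d:
for cand in range(2, n):
    if n % cand == 0:
        p_found, q_found = cand, n // cand
        break
d_attack = pow(e, -1, (p_found - 1) * (q_found - 1))
print("recovered plaintext:", pow(c, d_attack, n))
```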
These vulnerabilities stem from the quantum principle of superposition, which allows quantum computers to evaluate multiple possibilities simultaneously, and quantum entanglement, which enables them to correlate the properties of separated particles. These characteristics enable quantum computers to perform specific calculations much faster than classical computers, rendering current cryptographic methods vulnerable.
Integrating AI with quantum-resistant cryptographic research is essential to developing new algorithms that can withstand the capabilities of quantum computers. For example, AI can simulate quantum attacks on cryptographic algorithms, helping researchers understand and mitigate vulnerabilities. Furthermore, AI-driven optimisation techniques can aid in the creation of more efficient and secure post-quantum cryptographic algorithms, ensuring the continued protection of digital information in the quantum era.
Challenges: technological limitations
While quantum systems have the potential to provide unparalleled computational power, numerous technological limitations make their practical implementation difficult (Gill et al. 2022). One of the primary challenges in this field is the design of distributed quantum systems, which requires significant advancements in quantum hardware and error correction techniques (Awan et al. 2022). Despite these challenges, researchers remain dedicated to exploring the potential of quantum computing and developing new strategies to overcome the obstacles that stand in the way of progress.
Data challenges in AI and the transition to post-quantum cryptography
Integrating AI systems with quantum cryptographic systems is a complex process dependent on data quality, volume, privacy, security, and potential biases.
Real-time applications face several challenges in implementing AI-driven quantum cryptography. The scalability and performance of these technologies remain challenging, especially for large-scale data encryption and internet communication. Quantum cryptographic systems require significant infrastructure and can be resource-intensive, making large-scale deployments challenging. Integrating advanced quantum cryptographic methods into existing communication systems without disrupting service is complex, and ensuring seamless operation during the transition to quantum-secure systems is crucial. Real-time applications demand minimal latency, and AI algorithms combined with quantum cryptographic processes can introduce latency that affects the efficiency and usability of real-time systems. Quantum cryptographic systems are sensitive to environmental factors, leading to higher error rates and making it challenging to ensure reliability and accuracy in diverse environments.
Integrating AI with quantum cryptography appears feasible and could lead to significant advancements in cryptographic security. AI algorithms can enhance quantum cryptographic protocols, making them more adaptable and efficient, and AI-driven approaches provide a pathway to developing and optimising quantum-resistant cryptographic algorithms that help mitigate the quantum threat. Despite the challenges, successful implementations and potential applications of AI in quantum cryptographic systems point to a promising future, including secure communication channels, enhanced data privacy, and robust security solutions for various industries.
Continued research and development are crucial to address the challenges in real-time applications, improve scalability, reduce latency, and ensure compatibility with existing systems. The results underscore the necessity for policy development and industry engagement to facilitate the transition to quantum-secure cryptographic systems. This involves standardising practices, investing in infrastructure, and promoting collaboration among academia, industry, and policymakers. This process can be visualised in Fig. 3.
Data challenges in AI data lifecycle management caused by quantum cryptography
The successful implementation of AI in the context shown in Fig. 3 requires the use of post-quantum cryptographic methods, particularly given the anticipated arrival of quantum computers (Aldoseri et al. 2023). However, the transition to these methods must be carefully considered and prepared, as standardisation and widespread acceptance may pose significant challenges. As such, it is crucial to prioritise the development of robust and reliable solutions that can effectively address these issues and ensure the safety and security of sensitive data.
Opportunities for enhanced security mechanisms and AI-driven quantum systems
The potential integration of AI's impressive data processing capabilities with the physics-based security guarantees of quantum cryptography could give rise to communication channels that are highly resistant to both classical and quantum threats. With the rapid advancements in quantum computation, mounting evidence suggests that quantum systems will soon outstrip classical systems for certain computational tasks (Ayoade et al. 2022). AI could significantly enhance quantum systems, leading to faster algorithms and streamlined cryptographic protocols with far-reaching implications. Such developments could transform secure communication and data transfer. The merging of quantum concepts with artificial intelligence also has the potential to open new research areas, attracting greater funding for quantum cryptography and pushing the boundaries of both fields.
There are significant challenges when merging AI and quantum cryptography, but the potential rewards are vast. Researchers can unlock a wealth of possibilities that lay the foundation for new advancements in computation and security. These advancements can revolutionise how we approach these fields and significantly impact society.
Public key (PK) cryptography plays a vital role in this effort. Asymmetric cryptography, or public key cryptography, uses two mathematically linked keys: a public key and a private key. Unlike symmetric cryptography, which relies on a single key for both encryption and decryption, PK cryptography uses separate keys for each operation. Because only the private key must be kept secret, the public key can be distributed openly without compromising security. PK cryptography enables secure communication and cryptographic features such as key exchange, digital signatures, and data encryption, and it is a crucial component of modern cryptographic systems, offering enhanced security, scalability, and adaptability across various applications.
A crucial concept in cryptography is digital signature generation. To generate a digital signature, the signatory must first create a key pair consisting of a private key and a public key. The private key is kept confidential and never shared, whereas the public key is made available. A unique hash of the document or message to be signed is generated using a hash function. This hash value uniquely represents the content of the document. Hash signing occurs when the signer encrypts the generated hash value using their private key. This links their signature to a particular document. Upon encrypting the hash value, a cryptographic digital signature is generated, unique to both the document and the signer.
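The steps above map directly onto standard library calls. Below is a minimal sketch using the widely used Python `cryptography` package (RSA keys, SHA-256, PKCS#1 v1.5 padding); in practice the library handles the hashing and padding inside the sign call rather than the signer literally "encrypting" a hash by hand.

```python
# Key generation, signing and verification with the Python `cryptography`
# package (RSA, SHA-256, PKCS#1 v1.5 padding).
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.exceptions import InvalidSignature

# 1. Key pair: the private key stays secret, the public key is shared.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# 2-3. Hash the document and bind the signature to it with the private key.
document = b"Contract: transfer 100 units to account 42."
signature = private_key.sign(document, padding.PKCS1v15(), hashes.SHA256())

# 4. Anyone holding the public key can verify the signature.
try:
    public_key.verify(signature, document, padding.PKCS1v15(), hashes.SHA256())
    print("signature valid")
except InvalidSignature:
    print("signature invalid")
```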
The combination of AI and quantum cryptography presents exciting opportunities. Despite the significant challenges that must be overcome, the potential rewards are vast, and the implications could be far-reaching. Merging these two fields can unlock a wealth of possibilities that lay the foundation for new advancements in computation and security. This could revolutionise secure communication and data transfer, leading to new research areas and pushing the boundaries of both fields.
Quantum cryptography
Quantum cryptography is a technique that can provide strong security guarantees based on the principles of quantum mechanics. In contrast to traditional cryptography, which relies on hard mathematical problems, quantum cryptography exploits the behaviour of quantum particles, so its security rests on physical principles rather than computational assumptions. One of its central components is quantum key distribution (QKD), which allows two parties to generate a shared secret random key that can be used for secure communication. Furthermore, any attempt to eavesdrop on the quantum channel disturbs the transmitted quantum states and can therefore be detected, revealing the presence of an intruder and providing an added layer of protection for the communication between the two parties.
Role of artificial intelligence in security
The role of AI in cybersecurity has become increasingly significant in recent times due to its ability to leverage machine learning and advanced algorithms to rapidly identify patterns, anomalies, and potential threats within vast data sets. This capability is especially critical in a world where cyber threats constantly evolve and become more sophisticated. AI not only helps to identify cyber threats in real-time, but it also provides predictive analysis to anticipate potential vulnerabilities, enabling proactive security measures. Furthermore, AI-driven systems can enhance authentication processes, simplify security operations, and facilitate faster responses to identified threats. AI is revolutionising cybersecurity by providing a powerful tool to combat cyber threats and protect sensitive data.
Previous studies on AI and quantum cryptography
There is ongoing research into the relationship between artificial intelligence and quantum cryptography, a growing area of study. A study by Ayoade et al. (2022) demonstrated the impressive capabilities of quantum computing compared with traditional systems, suggesting the potential for AI at the quantum level. Gupta et al. (2023) explore how AI and machine learning can aid quantum computing in the healthcare industry. A 2019 discussion examined how quantum cryptography could protect communication between trusted parties from unauthorised listeners, indicating potential intersections with AI-driven security measures. These studies highlight the importance of continued exploration of this interdisciplinary field, as AI and quantum cryptography can shape the future of cybersecurity.
Artificial intelligence in cryptography
Overview of AI techniques in cryptography
AI has transformed many fields, including cryptography. Using machine learning techniques, AI offers new ways to tackle old and new cryptographic problems. Neural network-based AI is particularly useful for improving cryptographic methods and cryptanalysis (Nitaj and Rachidi 2023). AI's ability to quickly analyse vast amounts of data makes it an essential tool for identifying patterns and predicting potential cryptographic threats, which helps enhance security measures.
AI in classical cryptography
In traditional cryptography, AI is mainly used for cryptanalysis. Machine learning algorithms trained to recognise patterns and deviations in encrypted data can, in favourable cases, narrow down candidate encryption keys or recover plaintext without the key. These AI methods also help strengthen classical encryption techniques by exposing weaknesses, making the techniques more resilient against brute-force attacks and other standard decryption methods. The interplay of AI and classical cryptography has progressed considerably, with cryptography contributing to advances in AI techniques and vice versa.
Integrating quantum cryptography and AI presents both challenges and opportunities (Kop 2023). As quantum computing technology advances, vulnerabilities may emerge in existing cryptographic algorithms, but AI's predictive abilities can help identify these weaknesses and assist in creating algorithms that are resistant to quantum attacks (Zolfaghari et al.). Additionally, AI techniques can enhance quantum key distribution procedures, supporting secure communication in quantum networks. While this field is still in its early stages, it has the potential to bring about transformative advances in secure communication in the near future.
Principles of quantum cryptography
The security of quantum cryptography is based on the principles of quantum mechanics, a field of physics that examines the behaviour of subatomic particles. It functions because data preserved in quantum states cannot be replicated or accessed without altering the original state. This fundamental concept, the "no-cloning theorem," is essential in safeguarding quantum cryptographic networks (Shapna Akter 2023).
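Stated slightly more formally (a standard result of quantum information theory, included here for completeness rather than taken from this chapter):

```latex
% No-cloning theorem: no single unitary copies an arbitrary unknown state.
\nexists\, U \text{ unitary such that }
U\bigl(\lvert\psi\rangle \otimes \lvert 0\rangle\bigr)
  = \lvert\psi\rangle \otimes \lvert\psi\rangle
\quad \text{for all states } \lvert\psi\rangle .
```

A short argument: if such a U existed, unitarity would force the inner product of any two states to satisfy $\langle\psi|\phi\rangle = \langle\psi|\phi\rangle^{2}$, which fails for states that are neither identical nor orthogonal.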
Quantum key distribution
Quantum key distribution (QKD) is a secure method that uses quantum mechanical principles to create and distribute cryptographic keys between two parties (Gyongyosi and Imre 2020; Tsai et al. 2021). The BB84 protocol is one of the most widely used QKD protocols. A key feature of QKD is that it can detect attempts at eavesdropping: if a third party tries to intercept the exchanged quantum keys, the transmitted quantum states are disturbed, immediately alerting the communicating parties to a possible security breach (Diamanti et al. 2016).
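A highly simplified BB84 sketch, assuming an idealised setting (no physical qubits, no eavesdropper, no error correction or privacy amplification): random bits are encoded in random bases, measured in random bases, and only the positions where the bases match are kept as the sifted key.

```python
# Toy BB84 sifting: no physical channel, no eavesdropper, no error
# correction or privacy amplification -- only the basis-reconciliation step.
import secrets

N = 32
alice_bits  = [secrets.randbelow(2) for _ in range(N)]
alice_bases = [secrets.randbelow(2) for _ in range(N)]   # 0 = rectilinear, 1 = diagonal
bob_bases   = [secrets.randbelow(2) for _ in range(N)]

# With no noise and no eavesdropper, Bob's measurement equals Alice's bit
# whenever the bases match; otherwise the outcome is random.
bob_results = [bit if ab == bb else secrets.randbelow(2)
               for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Public discussion: keep only positions where the bases matched.
sifted_alice = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
sifted_bob   = [r for r, ab, bb in zip(bob_results, alice_bases, bob_bases) if ab == bb]

assert sifted_alice == sifted_bob      # holds only in this idealised setting
print(f"raw length {N}, sifted key length {len(sifted_alice)}")
```

In a real run, the parties would additionally compare a random sample of sifted bits: a high error rate signals eavesdropping or channel noise.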
Quantum cryptographic protocols
There are various applications of quantum cryptographic protocols beyond QKD, for instance quantum digital signatures, quantum coin flipping, and quantum secure direct communication. These protocols use quantum mechanics to perform tasks that are difficult or impossible with traditional cryptography, enabling more robust security guarantees (Broadbent et al. 2015).
Challenges and solutions
The concept of quantum cryptography presents new possibilities for secure communication, but it also comes with its own set of challenges. In the real world, implementing QKD networks is difficult due to issues such as quantum channel loss, noise, and technological limitations (Lovic 2020). However, researchers are actively working to overcome these obstacles. Post-quantum cryptography (PQC) offers algorithms that can withstand quantum adversaries, bridging the gap between classical and quantum cryptography techniques (Tsai et al. 2021).
Intersection of AI and quantum cryptography
Synergistic approaches
The convergence of AI and quantum cryptography presents unprecedented opportunities for secure computation and improved cryptographic protocols. As AI models become more complex, quantum-secure algorithms are of paramount importance. Quantum computing could also provide a platform on which certain AI workloads run far more efficiently than on classical hardware, allowing some AI operations to be performed more quickly and effectively.
AI for enhanced quantum cryptographic protocols
Quantum cryptographic protocols such as BB84 can be optimised using AI's machine learning capabilities (Shor 1994). By analysing quantum states and predicting the likelihood of eavesdropping, artificial intelligence can dynamically adjust quantum key distribution parameters to improve security. In addition, AI can aid in developing post-quantum cryptographic algorithms, ensuring resistance to quantum computer attacks.
Quantum computing for AI model security
Novel encryption techniques can be introduced when combining quantum computing with AI, making AI models more secure (Bennett and Brassard 2020). Quantum bits (qubits) can simultaneously represent multiple states, providing a higher-dimensional computation space for artificial intelligence that can be utilised to develop ever-evolving encryption systems. This type of dynamic encryption can present difficulties for potential attackers (Mallow et al. 2022).
Potential risks and mitigations
Integrating artificial intelligence and quantum cryptography holds promise but is not without risk. A constantly evolving encryption system may introduce new vulnerabilities or be challenging to administer. It is essential to balance innovation and risk management, ensuring that ethical and security considerations remain at the forefront of development as quantum technologies advance.
Applications and implications
The convergence of quantum computing and AI has driven progress in several scientific domains, including cryptography. Quantum computation can strengthen the encryption methodologies applied to AI algorithms, making them more resistant to attack. Moreover, cryptography itself is evolving with the emergence of quantum key distribution (QKD), which exploits the distinctive traits of quantum mechanics.
In addition to cryptography, quantum computing is revolutionising biochemical research by providing cutting-edge computational potential. Quantum computers could simulate intricate biochemical interactions and lead to significant medical advancements.
The consolidation of quantum computing and AI holds tremendous potential to revolutionise various industries. However, the ongoing development of these technologies also brings ethical dilemmas to the forefront. Quantum capabilities could decrypt sensitive data, posing privacy risks, and the vast potential of quantum-AI convergence may produce dependencies that can be exploited maliciously.
To harness the full potential of quantum and AI integration while mitigating associated risks, policymakers must proactively understand the complexities of these technologies. Regulatory bodies must ensure data privacy and security while safeguarding individual rights and societal welfare. The difficulty lies in balancing the potential benefits and risks of these technologies.
The combination of quantum computing and AI has tremendous potential in various scientific domains and industries. However, it is essential to consider these technologies’ ethical considerations and regulatory implications to harness their potential fully. Policymakers and regulatory bodies must ensure data privacy and security while safeguarding individual rights and societal welfare.
Case studies: the intersection of AI and quantum cryptography
Implementation of AI in quantum cryptographic systems
The convergence of AI and quantum mechanics has paved the way for innovative encryption methods that efficiently tackle the ever-changing and increasingly complex security risks (Awan et al. 2022). By combining the power of quantum computing with AI algorithms, these techniques can effectively safeguard sensitive data and prevent unauthorised access, ensuring a high level of protection for critical information (Taylor 2020).
Real-world applications and results
Quantum AI approaches are being explored to improve data protection and transaction security in the banking industry. AI methods have influenced encryption practice, leading to more advanced security measures that can counter constantly evolving threats. Traditional security measures have limitations that make detecting advanced and insider threats difficult, and cyber attackers increasingly use AI, data poisoning, and model theft to automate attacks, making AI-based cybersecurity techniques necessary in response.
The CS-FSM method and the K-nearest neighbour (KNN) algorithm are two such methods. The CS-FSM method uses the enhanced encryption standard (EES) algorithm to encrypt and decrypt data, ensuring information security in the financial sector. The KNN algorithm detects and prevents malware attacks by making predictions using training data. These methods enhance the performance of cybersecurity systems, improving their resistance to cyberattacks, data privacy, scalability, risk reduction, data protection, and attack prevention.
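The CS-FSM and EES details are specific to the cited work, but the KNN component can be sketched generically: a K-nearest-neighbour classifier trained on labelled feature vectors to flag suspicious samples. The features below are random placeholders, not features from any real malware dataset.

```python
# Generic K-nearest-neighbour malware/anomaly classifier sketch.
# Features here are random placeholders; a real system would use engineered
# features (API-call counts, byte entropy, n-gram statistics, ...).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)

# Synthetic data: 500 benign and 500 malicious samples, 10 features each,
# with the malicious class shifted so the two classes are separable.
benign    = rng.normal(loc=0.0, scale=1.0, size=(500, 10))
malicious = rng.normal(loc=1.5, scale=1.0, size=(500, 10))
X = np.vstack([benign, malicious])
y = np.array([0] * 500 + [1] * 500)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

clf = KNeighborsClassifier(n_neighbors=5)
clf.fit(X_train, y_train)
print("hold-out accuracy:", clf.score(X_test, y_test))
```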
Quantum artificial intelligence has also been adopted in retail to provide more secure and efficient transactions. By leveraging quantum AI’s power, retailers can safeguard their customers’ data and ensure seamless transactions. This technology offers a highly reliable solution to protect customers' sensitive information.
The fusion of AI and Quantum Mechanics can lead to significant advancements in cryptographic systems. While shifting to quantum cryptographic systems has numerous benefits, it also presents implementation challenges that can be overcome with careful planning and execution. The benefits of incorporating quantum artificial intelligence into cryptography are evident, particularly in sectors such as retail, where customer data protection and transaction security have been significantly improved.
Integrating AI and quantum mechanics in cryptographic systems has tremendous potential to revolutionise data protection and transaction security in various industries. This intersection creates more robust and secure systems that can withstand evolving cyber threats, crucial for safeguarding sensitive information. It also allows for the development of innovative cryptographic techniques and quantum-resistant algorithms.
To advance this field, researchers must continue to innovate and explore these technologies' ethical implications and sustainability. Policymakers should support research and development while ensuring data privacy and security by creating policies that promote best practices. Industry professionals should invest in research and development, stay updated with the latest advancements, and train the workforce to adapt to these new technologies. They should also participate in shaping policies and standards that affect the deployment of these technologies.
The potential benefits of integrating AI and quantum cryptography are vast and exciting, and it holds the promise of creating a secure computational environment in an era where quantum computing is set to become a significant player. By enhancing data security, industries could increase consumer trust and transform online banking, e-commerce, healthcare, national security, and telecommunications transactions.
Overall, the intersection of AI and quantum cryptography is a dynamic and evolving field that can future-proof cryptographic systems and enhance global digital security. With international collaboration in establishing global standards and practices, we can realise the full potential of these technologies and take data security to a whole new level.
The future of AI-powered quantum cryptography
We must delve deeper into the various sectors utilising AI-powered quantum cryptography. By doing so, researchers can gain a better understanding of the practical challenges and advantages that arise within each sector. This, in turn, can lead to the development of more effective and efficient applications of AI-powered quantum cryptography.
Considering recent technological progress, it is imperative to thoroughly analyse the ethical considerations, particularly concerning data privacy and the possibility of abuse. We must take these concerns seriously and ensure that measures are in place to safeguard against any potential negative consequences of using innovative technologies. As such, it is crucial to consider the implications of any new developments and approach them cautiously, always considering the potential impact on individuals and society.
It is imperative to thoroughly scrutinise the sustainability and flexibility of these mechanisms, particularly considering the constant advancements in both AI and quantum mechanics. This careful examination will enable us to ensure their long-term effectiveness and potential for adaptation to future developments.
The potential for increased research capabilities can be achieved by collaborating with AI and quantum physics professionals. By combining their expertise, a more comprehensive approach can be taken towards advancing scientific inquiry. The potential of AI and quantum cryptography is highly promising. Through dedicated research, this technology can be fully unlocked in the future.
Our discussion has explored the intricate relationship between AI and quantum cryptography, revealing that combining these two domains can effectively enhance cryptographic systems and fortify security measures. Integrating AI and quantum cryptography has led to remarkable advancements in sectors such as banking and e-commerce, facilitating the development of robust security protocols and bolstering users’ trust in these sectors.
The field of AI-driven quantum cryptography is rapidly evolving, with ongoing research and expected advancements that have the potential to revolutionise the field. Hybrid cryptographic systems, automated cryptographic protocol design, quantum key distribution enhancements, post-quantum cryptography development, quantum machine learning for cryptanalysis, and secure multi-party computation (MPC) are hotspots for innovation and breakthroughs.
Researchers are actively exploring the integration of quantum-resistant algorithms with traditional cryptographic methods. AI-powered optimisation and analysis can be crucial in developing and fine-tuning these hybrid systems for maximum efficiency and security. These hybrid systems leverage the strengths of both quantum and classical cryptography, providing enhanced security against both classical and quantum threats.
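One common pattern for such hybrids is to combine a classical and a post-quantum shared secret so that the derived session key remains safe as long as either component holds. The sketch below uses real X25519 and HKDF from the Python `cryptography` package, while `FakePQKEM` is a deliberately insecure stand-in that only mimics a KEM interface; a real deployment would substitute an ML-KEM (Kyber) binding.

```python
# Hybrid key establishment sketch: classical X25519 combined with a
# post-quantum KEM. FakePQKEM is a placeholder, NOT real cryptography.
import os
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

class FakePQKEM:
    """Insecure stand-in exposing a keygen/encapsulate/decapsulate interface."""
    def keygen(self):
        sk = os.urandom(32)
        return sk, sk                              # toy: "public key" equals secret key
    def encapsulate(self, pk):
        ss = os.urandom(32)                        # fresh shared secret
        ct = bytes(a ^ b for a, b in zip(ss, pk))  # toy ciphertext
        return ct, ss
    def decapsulate(self, sk, ct):
        return bytes(a ^ b for a, b in zip(ct, sk))

# Classical part: ordinary X25519 Diffie-Hellman.
alice_x, bob_x = X25519PrivateKey.generate(), X25519PrivateKey.generate()
classical_secret = alice_x.exchange(bob_x.public_key())

# Post-quantum part (simulated): Bob encapsulates to Alice's PQ public key.
kem = FakePQKEM()
pq_pk, pq_sk = kem.keygen()
pq_ct, pq_ss_bob = kem.encapsulate(pq_pk)
pq_ss_alice = kem.decapsulate(pq_sk, pq_ct)
assert pq_ss_alice == pq_ss_bob

# The session key depends on BOTH secrets, so it stays secure if either the
# classical or the post-quantum assumption survives.
session_key = HKDF(algorithm=hashes.SHA256(), length=32,
                   salt=None, info=b"hybrid-kex-demo").derive(
    classical_secret + pq_ss_alice)
print("derived", len(session_key), "byte session key")
```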
In the automated design of cryptographic protocols, AI, specifically machine learning and neural networks, is a promising research direction. AI algorithms can analyse vast amounts of data to identify patterns and potential vulnerabilities in cryptographic protocols, leading to more robust and secure system design. This approach could lead to the discovery of novel cryptographic methods inherently resistant to quantum attacks.
Research is underway to use AI to improve the performance and reliability of QKD systems. AI algorithms can help optimise the QKD process, reduce errors, and enhance key generation rates. This includes the use of AI in adaptive QKD, where the parameters of the QKD system are dynamically adjusted in response to changing environmental conditions and potential security threats.
AI is expected to accelerate the development of post-quantum cryptography algorithms. By simulating quantum attacks, AI can help identify potential weaknesses in current algorithms and guide the design of new quantum-resistant cryptographic schemes. This could lead to the creation of a new generation of cryptographic algorithms that can secure data against classical and quantum computational threats.
The emerging field of quantum machine learning, which combines quantum computing with machine learning algorithms, has potential applications in cryptanalysis. Quantum-enhanced machine learning could analyse encrypted data more efficiently, leading to faster and more effective cryptanalysis. This research could provide insights into the resilience of cryptographic algorithms against advanced quantum computing techniques.
With the advancements in AI and quantum cryptography, secure multi-party computation (MPC) is expected to become more robust and efficient. AI can assist in optimising the protocols and algorithms used in MPC, ensuring secure, collaborative computation among multiple parties without revealing individual data inputs.
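As a minimal illustration of the underlying idea (independent of any AI optimisation), the following sketch has three parties compute the sum of their private inputs via additive secret sharing, so no party ever sees another party's input.

```python
# Secure summation via additive secret sharing over a prime field: each
# party splits its private input into random shares, and only the aggregate
# of all shares reveals the total.
import secrets

P = 2**61 - 1                       # prime modulus for the share arithmetic

def share(value: int, n_parties: int) -> list[int]:
    """Split `value` into n additive shares modulo P."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

private_inputs = [12, 7, 30]        # one private value per party
n = len(private_inputs)

# Party i sends share j of its input to party j.
all_shares = [share(v, n) for v in private_inputs]

# Each party locally sums the shares it received (one per input)...
partial_sums = [sum(all_shares[i][j] for i in range(n)) % P for j in range(n)]

# ...and only the combination of all partial sums reveals the total.
total = sum(partial_sums) % P
print("joint sum:", total)          # 49, with no single input ever disclosed
```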
However, as these research areas develop, it is essential to consider ethical implications and ensure that advancements in AI-driven quantum cryptography are aligned with global data protection standards and privacy concerns. The future of AI-driven quantum cryptography promises enhanced security and efficiency while posing challenges and responsibilities regarding ethical use and global regulation.
To advance our understanding of AI-driven quantum cryptography, companies that depend on secure data transmissions should allocate resources towards research and development that combines artificial intelligence and quantum mechanics. This could result in more resilient and adaptable cryptographic systems, ultimately improving data security. Additionally, organisations should prioritise training their employees to adapt to these cutting-edge technologies.
In conclusion, combining AI and quantum cryptography is a promising field with significant potential in enhancing data security and privacy. Ongoing research and advancements in hybrid cryptographic systems, automated cryptographic protocol design, quantum key distribution enhancements, post-quantum cryptography development, quantum machine learning for cryptanalysis, and secure multi-party computation are expected to revolutionise the field. However, it is crucial to consider ethical implications and ensure that advancements in AI-driven quantum cryptography are aligned with global data protection standards and privacy concerns.
Availability of data and materials
All data and materials are included in the article.
https://www.nist.gov/news-events/news/2023/02/nist-selects-lightweight-cryptography-algorithms-protect-small-devices .
https://csrc.nist.gov/News/2023/lightweight-cryptography-nist-selects-ascon .
https://www.nist.gov/news-events/news/2018/04/nist-issues-first-call-lightweight-cryptography-protect-small-electronics .
Advisera, “What is the meaning of ISO 27001?”. https://advisera.com/27001academy/what-is-iso-27001/
Agrawal A, et al. Software security estimation using the hybrid fuzzy ANP-TOPSIS approach: design tactics perspective. Symmetry. 2020;12(4):598. https://doi.org/10.3390/SYM12040598 .
Aldoseri A, Al-Khalifa KN, Hamouda AM. Re-thinking data strategy and integration for artificial intelligence: concepts, opportunities, and challenges. Appl Sci. 2023;13(12):7082. https://doi.org/10.3390/APP13127082 .
Alyami H, et al. The evaluation of software security through quantum computing techniques: a durability perspective. Appl Sci. 2021;11(24):11784. https://doi.org/10.3390/APP112411784 .
Awan U, Hannola L, Tandon A, Goyal RK, Dhir A. Quantum computing challenges in the software industry. A fuzzy AHP-based approach. Inf Softw Technol. 2022;147:106896. https://doi.org/10.1016/J.INFSOF.2022.106896 .
Ayoade O, Rivas P, Orduz J. Artificial intelligence computing at the quantum level. Data. 2022;7(3):28. https://doi.org/10.3390/DATA7030028 .
Bennett CH, Brassard G. Quantum cryptography: public key distribution and coin tossing. Theor Comput Sci. 2020;560(P1):7–11. https://doi.org/10.1016/j.tcs.2014.05.025 .
Braverman M, Ko YK, Weinstein O. Approximating the best Nash equilibrium in n^{o(log n)}-time breaks the exponential time hypothesis. In: Proceedings of the twenty-sixth annual ACM-SIAM symposium on discrete algorithms (SODA). 2015. p. 970–82. https://doi.org/10.1137/1.9781611973730.66 .
Broadbent A, Schaffner C. Quantum cryptography beyond quantum key distribution. Des Codes Cryptogr. 2015;78(1):351–82. https://doi.org/10.1007/S10623-015-0157-4 .
Catril Opazo JE, NIST cybersecurity framework in South America: Argentina, Brazil, Chile, Colombia, and Uruguay (2021)
Diamanti E, Lo HK, Qi B, Yuan Z. Practical challenges in quantum key distribution. Npj Quantum Inf. 2016;2(1):1–12. https://doi.org/10.1038/npjqi.2016.25 .
Diffie W, Hellman ME. New directions in cryptography. IEEE Trans Inf Theory. 1976;22(6):644–54. https://doi.org/10.1109/TIT.1976.1055638 .
Eisenhardt KM. Building theories from case study research. Acad Manag Rev. 1989;14(4):532. https://doi.org/10.2307/258557 .
Elaziz A, Raheman F. The future of cybersecurity in the age of quantum computers. Fut Internet. 2022;14(11):335. https://doi.org/10.3390/FI14110335 .
Feistel H, Block cipher cryptographic system (1971)
GDPR, What is GDPR, the EU’s new data protection law?—GDPR.eu. Accessed 07 Jul 2023. https://gdpr.eu/what-is-gdpr/
Gill SS, et al. AI for next generation computing: Emerging trends and future directions. Internet of Things. 2022;19:100514. https://doi.org/10.1016/J.IOT.2022.100514 .
Gupta S, Modgil S, Bhatt PC, Chiappetta Jabbour CJ, Kamble S. Quantum computing led innovation for achieving a more sustainable Covid-19 healthcare industry. Technovation. 2023;120:102544. https://doi.org/10.1016/J.TECHNOVATION.2022.102544 .
Gyongyosi L, Imre S. Secret key rate adaption for multicarrier continuous-variable quantum key distribution. SN Comput Sci. 2020;1(1):1–17. https://doi.org/10.1007/s42979-019-0027-7 .
ICO, Information Commissioner’s Office (ICO): The UK GDPR, UK GDPR guidance and resources. Accessed 08 July 2023. https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/lawful-basis/a-guide-to-lawful-basis/lawful-basis-for-processing/consent/
ISO, “ISO/IEC 27035–1:2016—Information technology—Security techniques—Information security incident management—Part 1: Principles of incident management.” Accessed 25 July 2023. https://www.iso.org/standard/60803.html
ISO, “ISO - International Organization for Standardization.” Accessed 26 Dec 2017. https://www.iso.org/home.html
ISO, “ISO/IEC 27001 and related standards: Information security management,” 2022
ISO, “ISO/IEC DIS 42001 - Information technology—Artificial intelligence—Management system.” Accessed 06 April 2023. https://www.iso.org/standard/81230.html
Johnson C, Badger L, Waltermire D, Snyder J, Skorupka C. Guide to cyber threat information sharing. NIST Spec Publ. 2016. https://doi.org/10.6028/NIST.SP.800-150 .
Kop M. Quantum-ELSPI: a novel field of research. Digit Soc. 2023;2(2):1–17. https://doi.org/10.1007/S44206-023-00050-6 .
Kumar M. Post-quantum cryptography Algorithm’s standardization and performance analysis. Array. 2022;15:100242. https://doi.org/10.1016/J.ARRAY.2022.100242 .
Liddell HG. A greek-english lexicon. Cape Palmas: Harper; 1894.
Lovic V, Quantum key distribution: advantages, challenges and policy 2020. https://doi.org/10.17863/CAM.58622
Mallow GM, Hornung A, Barajas JN, Rudisill SS, An HS, Samartzis D. Quantum computing: the future of big data and artificial intelligence in spine. Spine Surg Relat Res. 2022;6(2):93. https://doi.org/10.22603/SSRR.2021-0251 .
NIST, “Advanced Encryption Standard (AES),” 2001. Accessed 19 March 2023. https://web.archive.org/web/20170312045558/http://nvlpubs.nist.gov/nistpubs/FIPS/NIST.FIPS.197.pdf
NIST, “Framework for Improving Critical Infrastructure Cybersecurity,” 2014. Accessed 24 Dec 2017. https://www.nist.gov/sites/default/files/documents/cyberframework/cybersecurity-framework-021214.pdf
NIST, “Cybersecurity Framework Version 1.1,” 2018
NIST, “Product Integration using NVD CVSS Calculators,” 2022. https://nvd.nist.gov/Vulnerability-Metrics/Calculator-Product-Integration
NIST, “Key Management - Symmetric Block Ciphers, Pair-Wise Key Establishment Schemes,” 2022, [Online]. https://csrc.nist.gov/projects/key-management/key-establishment
NIST, “Artificial intelligence | NIST.” Accessed 06 April 2023. https://www.nist.gov/artificial-intelligence
NIST, “AI Risk Management Framework | NIST,” National Institute of Standards and Technology. Accessed 18 April 2023. Available: https://www.nist.gov/itl/ai-risk-management-framework
NIST, “Software Security in Supply Chains: Software Bill of Materials (SBOM) | NIST,” National Institute of Standards and Technology. Accessed 18 April 2023. https://www.nist.gov/itl/executive-order-14028-improving-nations-cybersecurity/software-security-supply-chains-software-1
NIST, “Post-Quantum Cryptography | CSRC | Competition for Post-Quantum Cryptography Standardisation,” 2023. Accessed 06 Sept 2023. https://csrc.nist.gov/projects/post-quantum-cryptography
NIST, “SP 800-61 Rev. 2, Computer Security Incident Handling Guide | CSRC.” Accessed 25 July 2023. https://csrc.nist.gov/pubs/sp/800/61/r2/final
NIST, “Post-Quantum Cryptography | CSRC | Selected Algorithms: Public-key Encryption and Key-establishment Algorithms,” 2023. Accessed 06 Sept 2023. https://csrc.nist.gov/Projects/post-quantum-cryptography/selected-algorithms-2022
NIST, “NVD—CVSS v3 Calculator,” CVSS Version 3.1. Accessed 03 Jan 2023. https://nvd.nist.gov/vuln-metrics/cvss/v3-calculator
NIST 800-53, “Security and Privacy Controls for Information Systems and Organizations,” 2020
NIST Advanced Manufacturing Office, “Advanced Manufacturing Partnership,” 2013. Accessed 04 May 2020. https://www.nist.gov/amo/programs
NIST C, Cybersecurity Framework | NIST . 2016. https://www.nist.gov/cyberframework
NIST, “Block Cipher Techniques.” https://csrc.nist.gov/Projects/block-cipher-techniques
NIST, “Post-Quantum Cryptography PQC.” https://csrc.nist.gov/Projects/post-quantum-cryptography
NIST, “Privacy-Enhancing Cryptography PEC.” https://csrc.nist.gov/Projects/pec
NIST, “Lightweight Cryptography.” https://csrc.nist.gov/Projects/lightweight-cryptography
NIST, “Cybersecurity Framework.” https://www.nist.gov/cyberframework/getting-started
NIST, “Hash Functions,” 2020. https://csrc.nist.gov/Projects/Hash-Functions
NIST, “NIST Special Publication 800–128,” 2011. https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-128.pdf
NIST, “NIST Version 1.1,” National Institute of Standards and Technology, U.S. Department of Commerce. https://www.nist.gov/news-events/news/2018/04/nist-releases-version-11-its-popular-cybersecurity-framework
Nitaj A, Rachidi T. Applications of neural network-based AI in cryptography. Cryptography. 2023;7(3):39. https://doi.org/10.3390/CRYPTOGRAPHY7030039 .
Paar C, Pelzl J. Understanding cryptography: a textbook for students and practitioners. Berlin: Springer; 2009.
Petrov M, Adapted SANS cybersecurity policies for NIST cybersecurity framework
Rivest RL, Shamir A, Adleman L. A method for obtaining digital signatures and public-key cryptosystems. Commun ACM. 1978;21(2):120–6.
Rutkowski A, et al. CYBEX. ACM SIGCOMM Comput Commun Rev. 2010;40(5):59–64. https://doi.org/10.1145/1880153.1880163 .
Sahu K, Srivastava RK, Kumar S, Saxena M, Gupta BK, Verma RP. Integrated hesitant fuzzy-based decision-making framework for evaluating sustainable and renewable energy. Int J Data Sci Anal. 2023;16(3):371–90. https://doi.org/10.1007/S41060-023-00426-4 .
Sevilla J, Moreno P, Implications of quantum computing for artificial intelligence alignment research 2019
Shamshad S, Riaz F, Riaz R, Rizvi SS, Abdulla S. An enhanced architecture to resolve public-key cryptographic issues in the internet of things (IoT), employing quantum computing supremacy. Sensors (basel). 2022;22(21):271–6. https://doi.org/10.3390/S22218151 .
Shapna Akter M. Quantum cryptography for enhanced network security: a comprehensive survey of research developments and future directions. 2023
Shor PW. Algorithms for quantum computation: discrete logarithms and factoring. In: Proceedings—annual IEEE symposium on foundations of computer science, FOCS, 1994. pp. 124–134. https://doi.org/10.1109/SFCS.1994.365700
SWID, “Software Identification (SWID) Tagging | CSRC | NIST,” National Institute of Standards and Technology. Accessed 19 April 2023. [Online]. https://csrc.nist.gov/projects/Software-Identification-SWID
Tabassi E, AI risk management framework | NIST. (2023) https://doi.org/10.6028/NIST.AI.100-1
Takahashi T, Kadobayashi Y. Reference ontology for cybersecurity operational information. Comput J. 2015;58(10):2297–312. https://doi.org/10.1093/COMJNL/BXU101 .
Taylor RD. Quantum artificial intelligence: a ‘precautionary’ U.S. approach? Telecomm Policy. 2020;44(6):101909. https://doi.org/10.1016/J.TELPOL.2020.101909 .
Tsai CW, Yang CW, Lin J, Chang YC, Chang RS. Quantum key distribution networks: challenges and future research issues in security. Appl Sci. 2021;11(9):3767. https://doi.org/10.3390/APP11093767 .
Udroiu A-M, Dumitrache M, Sandu I, Improving the cybersecurity of medical systems by applying the NIST framework. In 2022 14th international conference on electronics, computers and artificial intelligence (ECAI). IEEE, 2022, pp 1–7
Yin RK. Case study research: design and methods, vol. 5. Sage; 2009.
Ying M. Quantum computation, quantum theory and AI. Artif Intell. 2010;174(2):162–76. https://doi.org/10.1016/j.artint.2009.11.009 .
Zolfaghari B, Rabieinejad E, Yazdinejad A, Parizi RM, Dehghantanha A. Crypto makes AI evolve.
Acknowledgements
Eternal gratitude to the Fulbright Visiting Scholar Project.
This work has been supported by the PETRAS National Centre of Excellence for IoT Systems Cybersecurity, which has been funded by the UK EPSRC under Grant Number EP/S035362/1, and ESRC Grant Number: ES/V003666/1.
Author information
Authors and affiliations
Department of Computer Sciences, University of Oxford, Oxford, UK
Petar Radanliev
School of Management, University of Bath, Bath, UK
Contributions
Dr PR—sole author—was responsible for reviewing the researched literature, conceived the study, was involved in protocol development, gaining ethical approval, patient recruitment and data analysis, wrote the first draft of the manuscript and reviewed and edited the manuscript and approved the final version of the manuscript.
Corresponding author
Correspondence to Petar Radanliev .
Ethics declarations
Ethics approval and consent to participate
The University of Oxford ethical committee has granted ethical approval under reference R51864/002.
Competing interests
The author declares no conflict of interest or competing interests.
Additional information
Publisher's note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .
About this article
Cite this article.
Radanliev, P. Artificial intelligence and quantum cryptography. J Anal Sci Technol 15 , 4 (2024). https://doi.org/10.1186/s40543-024-00416-6
Received : 12 December 2023
Accepted : 28 January 2024
Published : 09 February 2024
DOI : https://doi.org/10.1186/s40543-024-00416-6
- Artificial intelligence
- Quantum algorithms
- Neural networks
- Quantum-AI integration
- Quantum threats
- AI-enhanced security
- Quantum information processing
Quantum Cryptography: Recently Published Documents
Quantum and Post‐Quantum Cryptography
Quantum Sampling for Finite Key Rates in High Dimensional Quantum Cryptography
The Long Road Ahead to Transition to Post-Quantum Cryptography
Anticipating the output of the competition for new cryptographic algorithms.
Post-Quantum Cryptography: A Solution to Quantum Computing on Security Approaches
Examples of Quantum IT in New Technologies of Computation
The paper defines elements of quantum IT with reference to classical computation technologies. It explains the principles of transforming computational algorithms into the domain of quantum computation using optimisation and matrix calculus. Exemplary applications of classical algorithms are presented, along with possibilities for realising them in the domain of quantum IT. The author presents possibilities for using quantum algorithms in new computation technologies concerning quantum cryptography and data analysis with complex computations.
Post-Quantum and Code-Based Cryptography—Some Prospective Research Directions
Cryptography has been used from time immemorial for preserving the confidentiality of data and information in storage or transit. Cryptography research has accordingly evolved from the classical Caesar cipher, through modern cryptosystems based on modular arithmetic, to contemporary cryptosystems that must contend with quantum computing. The emergence of quantum computing poses a major threat to modern cryptosystems based on modular arithmetic, whereby even the computationally hard problems that constitute the strength of modular arithmetic ciphers could be solved in polynomial time. This threat triggered post-quantum cryptography research to design and develop post-quantum algorithms that can withstand quantum computing attacks. This paper provides an overview of the various research directions that have been explored in post-quantum cryptography and, specifically, the various code-based cryptography research dimensions that have been explored. Identifying potential research directions yet to be explored in code-based cryptography, from the perspective of codes, is a key contribution of this paper.
Trends in Natural Language Processing: Scope and Challenges
Quantum cryptography is a comparatively new and special type of cryptography that uses quantum mechanics to provide strong protection of data and information and unconditionally secure communications. This is achieved with quantum key distribution (QKD) protocols, which represent an essential practical application of quantum computation. In this paper the authors explore the concept of QKD by reviewing how QKD works, examining a few QKD protocols, and presenting a practical example of quantum cryptography using QKD together with certain limitations, from the perspective of computer science in particular and quantum physics in general.
Securing the future internet of things with post‐quantum cryptography
Efficient Implementation of Finite Field Arithmetic for Binary Ring-LWE Post-Quantum Cryptography Through a Novel Lookup-Table-Like Method
Quantum Cryptography
Perspectives on Migration Toward Post-Quantum Cryptography
One of the most significant shifts in the field of security is expected to be the migration to post-quantum cryptography (PQC), especially for embedded systems. This transition is crucial due to the evolving capabilities of quantum computers, which pose an unprecedented threat to current cryptographic methods.
At the recent embedded world North America conference, Joost Renes, principal security architect and cryptographer at NXP Semiconductors, discussed this shift, highlighting the importance of cryptographic resilience in embedded systems.
The quantum threat and current cryptography
"Quantum computers have a potential impact on cryptography," Renes said, underscoring the advancements in quantum computing that could threaten the use of popular algorithms like RSA and ECC.
Quantum algorithms like Shor's and Grover's could break or weaken these encryption systems, forcing industries to rethink their current reliance on such cryptographic methods. Shor's algorithm can efficiently solve the prime factorization problem and the discrete logarithm problem, effectively breaking the mathematical foundations of RSA and ECC security.
Grover’s algorithm, on the other hand, speeds up the search process, effectively reducing the security level of symmetric encryption schemes like AES or weakening these encryption systems. RSA and ECC, which are based on difficult mathematical problems, become solvable with a sufficiently powerful quantum computer.
Renes emphasized the seriousness of this threat, pointing out that "the security of these algorithms is based on classical hardware assumptions," which could be easily overcome by quantum advancements. For instance, with AES (Advanced Encryption Standard), the security is not completely broken by quantum computing, but its effectiveness is significantly reduced, particularly a halving of the security level in the case of AES-128.
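The halving follows directly from Grover's quadratic speed-up for unstructured search over the key space (a standard back-of-the-envelope estimate, not a figure from the talk):

```latex
% Grover's search over an n-bit key space needs roughly sqrt(2^n) evaluations:
T_{\text{classical}} \approx 2^{128}, \qquad
T_{\text{Grover}} \approx \sqrt{2^{128}} = 2^{64}
```

In other words, AES-128 retains only about 64 bits of effective security against an idealized quantum brute-force search, which is why AES-256 is the usual recommendation for long-lived systems.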
Although the timeline for when quantum computers will become a genuine threat to current cryptographic systems is still debated, with estimates ranging from 2030 to 2035, Renes stressed that organizations should start thinking about transitioning now. "The White House has come out and said 2035 is when we expect these quantum computers to be large enough," he said, adding that others, like the BSI, estimate it could be sooner, around 2030.
This is no small matter for embedded systems, which are often deployed in environments where they are expected to remain operational for extended periods, sometimes up to 20 years. "If you're putting a device out there, its lifetime can be very long, 10 years, 15 years, 20 years," Renes said. Thus, even if quantum computers are not an immediate threat, systems deployed today must be agile enough to handle the post-quantum era.
Overall, the timeline for quantum computers becoming a threat is debated, but early transition planning is needed. A first-mover advantage in quantum computing could yield large benefits once systems become powerful enough to disrupt current cryptographic methods. Such advances will put environments that still rely on traditional cryptography at risk, and those environments will have to migrate to quantum-resistant schemes sooner or later.
PQC: The solution
The solution to the quantum threat lies in PQC, designed to resist quantum attacks while still being implementable on classical hardware. Renes clarifies that PQC should not be confused with quantum key distribution (QKD), which relies on quantum principles and hardware. Instead, PQC focuses on algorithms that can run on today's microcontrollers and embedded devices.
One of the primary challenges of this transition is the sheer complexity of the migration, particularly in the embedded systems domain. Renes noted that current algorithms are well-established and optimized, while post-quantum algorithms often require more resources.
For example, "public key sizes can grow to 1.5 kilobytes, and ciphertext sizes could also reach up to 1.5 kilobytes," which is significantly larger than the compact sizes of today's keys and ciphertexts. This increase can substantially impact embedded systems, which are typically constrained by memory and processing power.
The migration toward PQC is not just about futureproofing; it is also driven by emerging standards and compliance requirements. Renes pointed out that organizations like the NSA, NIST and the European Union Agency for Cybersecurity (ENISA) are already defining timelines for PQC adoption.
"CNSA 2.0, for instance, is pushing to have firmware signatures transitioned to post-quantum standards by 2025," he said. The urgency of these timelines reflects the serious nature of the quantum threat.
Renes also highlighted the concept of "crypto agility," which involves designing systems that can seamlessly adapt to new cryptographic algorithms. This is particularly crucial for embedded systems relying on secure boot and update mechanisms. "In 5-10 years' time, TLS [Transport Layer Security] is going to adopt post-quantum algorithms," he said, and systems deployed today must be able to update their software to support these future algorithms.
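Crypto agility is largely an interface question: firmware should verify signatures through an algorithm-agnostic layer so that the algorithm named in the image metadata can change without changing the boot logic. Below is a hedged Python sketch of such a registry; the algorithm identifiers and stub verifiers are illustrative placeholders, not any vendor's API.

```python
# Crypto-agility sketch: signature verification selected by an algorithm
# identifier carried in the signed image's metadata, so a post-quantum
# verifier can be added later without changing the boot/update logic.
from typing import Callable, Dict

Verifier = Callable[[bytes, bytes, bytes], bool]   # (message, signature, public_key)

VERIFIERS: Dict[str, Verifier] = {}

def register(alg_id: str):
    """Register a verifier under a stable algorithm identifier."""
    def wrap(fn: Verifier) -> Verifier:
        VERIFIERS[alg_id] = fn
        return fn
    return wrap

@register("ecdsa-p256-sha256")                 # today's classical algorithm
def verify_ecdsa(msg: bytes, sig: bytes, pk: bytes) -> bool:
    raise NotImplementedError("wire to the existing classical implementation")

@register("ml-dsa-65")                         # a post-quantum algorithm added later
def verify_mldsa(msg: bytes, sig: bytes, pk: bytes) -> bool:
    raise NotImplementedError("wire to the post-quantum implementation")

def verify_image(alg_id: str, msg: bytes, sig: bytes, pk: bytes) -> bool:
    """Boot/update path: reject unknown algorithms, dispatch known ones."""
    verifier = VERIFIERS.get(alg_id)
    return bool(verifier and verifier(msg, sig, pk))
```

The design choice is that adding a new algorithm only adds a registry entry; the verification call site, and therefore the boot flow, never changes.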
Additionally, other potential uses of quantum algorithms could enhance existing systems. For example, quantum algorithms could optimize large-scale logistics by providing faster route calculations, enhance ML by solving complex data problems more efficiently and even improve drug discovery through more powerful simulations.
Moreover, quantum cryptographic protocols, such as quantum-secure digital signatures, could also be used to bolster security in communication systems, offering an advanced safeguard against evolving cyber threats.
Embedded systems: challenges and opportunities
For embedded systems, the migration to PQC presents significant challenges and opportunities. On one hand, the resource constraints of these systems make the adoption of PQC more complex. "The initial implementations of some post-quantum algorithms, such as the predecessor to MOTSA, required up to 128 kilobytes of RAM," Renes said, adding that this is far beyond what many embedded systems can handle.
However, the industry has made advancements in optimizing these algorithms. For example, companies like IBM and Google have demonstrated quantum algorithms that significantly improve optimization problems, while NXP has successfully reduced the memory requirements of post-quantum algorithms.
Furthermore, collaboration among tech giants and research institutions is accelerating the practical implementation of these algorithms, providing tangible solutions that help overcome the hardware constraints of embedded systems.
At NXP, for instance, the memory requirements for these algorithms have been reduced drastically. "We reduced the memory footprint from 123 kilobytes to only 8 kilobytes," Renes said, albeit with some performance trade-offs. These optimizations ensure that PQC can be deployed on embedded systems without requiring a complete overhaul of existing hardware architectures.
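The article does not say how that reduction was achieved. As a purely generic illustration of the kind of time-for-memory trade-off involved (not NXP's implementation), the sketch below computes a Merkle-tree root, the core structure behind hash-based post-quantum signatures such as LMS, XMSS and SPHINCS+, by streaming leaves with a stack of only tree-height nodes instead of holding the whole tree in RAM. The mix32 helper is a toy placeholder for a real hash such as SHA-256.

```c
/* Generic time/memory trade-off illustration (NOT NXP's optimization):
 * compute a Merkle root over 2^h leaves while keeping only about h
 * intermediate nodes of state, instead of materializing the whole tree. */
#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>

#define TREE_HEIGHT 10u                           /* 1024 leaves, ~11 words of state */

static uint32_t mix32(uint32_t a, uint32_t b) {   /* toy placeholder, NOT cryptographic */
    uint32_t x = (a * 0x9E3779B1u) ^ (b + 0x7F4A7C15u);
    x ^= x >> 15; x *= 0x2C1B3C6Du; x ^= x >> 13;
    return x;
}

static uint32_t leaf(uint32_t index) {            /* stand-in for hashing leaf data */
    return mix32(index, 0xA5A5A5A5u);
}

/* Streaming Merkle root: O(2^h) hashing work, O(h) memory. */
static uint32_t merkle_root_streaming(void) {
    uint32_t stack[TREE_HEIGHT + 1];              /* node values  */
    uint32_t level[TREE_HEIGHT + 1];              /* node heights */
    unsigned top = 0;

    for (uint32_t i = 0; i < (1u << TREE_HEIGHT); i++) {
        stack[top] = leaf(i);
        level[top] = 0;
        top++;
        /* Whenever the two topmost nodes sit at the same height, merge them. */
        while (top >= 2 && level[top - 1] == level[top - 2]) {
            stack[top - 2] = mix32(stack[top - 2], stack[top - 1]);
            level[top - 2]++;
            top--;
        }
    }
    return stack[0];                              /* single node left: the root */
}

int main(void) {
    printf("root = %08" PRIx32 " (state kept: %u words instead of %u leaves)\n",
           merkle_root_streaming(), TREE_HEIGHT + 1, 1u << TREE_HEIGHT);
    return 0;
}
```

Real hash-based signature implementations push this much further, recomputing authentication paths on demand rather than caching them, which is where the speed-versus-RAM trade-off Renes mentions typically comes from.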
Moreover, Renes highlighted the importance of implementation security, noting that physical attacks such as side-channel leakage and fault injection (for example, voltage glitching) can compromise both current and post-quantum algorithms, as shown in Figure 2.
Side-channel attacks exploit information leakage from the physical implementation, while fault injection involves deliberately introducing errors to manipulate system behavior. The ongoing effort to secure these implementations, Renes said, will be critical in ensuring the resilience of embedded systems in the post-quantum era.
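The article stays at this general level. As a hedged sketch of two textbook countermeasures, the snippet below shows a constant-time comparison (so runtime reveals nothing about where two digests differ) and a redundant check against a non-trivial "accept" constant (so a single voltage or clock glitch is less likely to turn a rejection into an acceptance). The helper names are hypothetical and not taken from the article.

```c
/* Two generic hardening patterns: (1) constant-time comparison against timing
 * side channels, (2) a repeated check with a magic "accept" value to raise the
 * bar for fault injection. Names are illustrative, not from any real API. */
#include <stddef.h>
#include <stdint.h>

#define VERIFY_OK   0x5AC3A271u   /* deliberately not 0 or 1 */
#define VERIFY_FAIL 0xA53C5D8Eu

/* (1) Constant time: runtime does not depend on where the buffers differ. */
static int ct_equal(const uint8_t *a, const uint8_t *b, size_t len) {
    uint8_t diff = 0;
    for (size_t i = 0; i < len; i++)
        diff |= (uint8_t)(a[i] ^ b[i]);
    return diff == 0;
}

/* (2) Fault-injection hardening: evaluate the condition twice and require
 * both results to agree before returning the magic "accept" value. */
static uint32_t check_digest(const uint8_t *expected, const uint8_t *actual, size_t len) {
    volatile uint32_t first  = ct_equal(expected, actual, len) ? VERIFY_OK : VERIFY_FAIL;
    volatile uint32_t second = ct_equal(expected, actual, len) ? VERIFY_OK : VERIFY_FAIL;
    if (first == VERIFY_OK && second == VERIFY_OK)
        return VERIFY_OK;
    return VERIFY_FAIL;
}

int main(void) {
    const uint8_t ref[4] = { 0xde, 0xad, 0xbe, 0xef };
    const uint8_t bad[4] = { 0xde, 0xad, 0xbe, 0xee };
    return check_digest(ref, bad, sizeof ref) == VERIFY_OK;  /* mismatch rejected: returns 0 */
}
```

Neither pattern is specific to PQC, but as the article notes, both classical and post-quantum implementations need this kind of protection on embedded targets.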
Looking ahead: The future of cryptography
The migration toward PQC is inevitable, and while the timeline for when quantum computers will become a real threat is still uncertain, the industry must act now to prepare. "Even if today there is no quantum computer that can break these algorithms, we need to ensure that the systems we deploy today can handle the threats of tomorrow," Renes advised.
The transition will be challenging, particularly for embedded systems, but the work being done by organizations like NXP to optimize PQC for resource-constrained environments is paving the way for an efficient migration. "We're seeing significant progress in adoption, and it's only a matter of time before post-quantum cryptography becomes the new standard," Renes said. In conclusion, the large-scale migration toward PQC for embedded systems is both a necessity and an opportunity. By investing in crypto agility and ensuring compliance with emerging standards, the industry can protect itself from the quantum threat while continuing to innovate in the embedded systems space.
Abhishek Jadhav is a contributing writer for EE Times.