1. Introduction
A large amount of sensitive information is transmitted through the internet, and such information can be easily accessed through public as well as private networks. Images, which represent an important share of the transmitted information, can be photographs, agreements, contracts, identification documents, account statements, or other kinds of scanned documents with a high intrinsic value. Any person with a mobile device and an internet connection can send images through the internet and store them on physical devices or in the cloud. The security and confidentiality of such information have led to the development of several efficient cryptographic systems.
In some practical applications in scientific and engineering fields such as astronomy or medicine, the use of images without compression or loss of information is commonly required. Among the image formats that meet these requirements is the Microsoft-designed Bitmap (BMP) format, which contains all the image information in a simple format operating in the spatial domain. It can be modified and easily edited because it can be debugged and viewed without special tools; it stores the color of each pixel, and although it was designed to be a Windows standard, it is currently supported by a variety of operating systems [1]. In several countries, such as Mexico, where this research was carried out, it is prohibited to lose information in documents with sensitive content that need to be encrypted [2]; hence, the use of BMP or another lossless scheme is imperative when digital documents must be stored. To encrypt BMP images, reference [3] proposed an efficient cryptographic algorithm based on elliptic curves and chaos. However, the size of BMP images is too large for many applications, so it is necessary to use a compression format. Thus, this paper proposes an encryption algorithm for the JPEG (Joint Photographic Experts Group) format [4], which is widely used in electronic devices such as smartphones, computers, and digital cameras. JPEG compression is carried out in two stages: the baseline stage, which involves no loss of information, and the DCT (Discrete Cosine Transform) stage, which considerably reduces the file size by discarding redundant pixel information using a quantization matrix, thereby reducing the image size and making file transmission and storage easier [4]. It is important to mention that the picture quality depends on the compression level; usually, a level from 95% to 70% gives excellent quality but low compression, while at the maximum level, 0% to 20%, the resulting image quality degrades and sometimes becomes unrecognizable. Moreover, most other image formats are based on the RGB (Red, Green, Blue) color model, but the JPEG format employs the YUV color model, where “Y” denotes luminance while “U” and “V” denote chrominance; it works under the principle that the human eye is unable to identify certain color bands [5]. Thus, although there is a loss of information, it is imperceptible to the human eye.
Protecting images with sensitive information transmitted through public communication channels is an important task that has been a topic of active research during the last several decades, leading to the development of several efficient cryptographic algorithms. To this end, researchers have recently developed efficient cryptosystems based on chaotic systems and/or elliptic curves that avoid discrete logarithm attacks, since it has been proven that they are robust against differential [6,7], linear [8], and even statistical cryptanalysis.
Several efficient cryptographic schemes based on elliptic curves and chaos have been proposed recently in the literature, such as those described below. In [9], the authors propose two image encryption schemes in which the encryption keys are generated using the logistic equation presented in Equation (13) and Henon maps. The image encryption is then performed using the solution points of an elliptic curve and applying bitwise XOR operations in three different stages. In [10], the authors propose a three-step scheme in which two chaotic maps, known as LTM and TSM, first permute the pixels of the input image; the input parameters of these maps are obtained from the SHA-512 of the input image, from which the encryption keys are also generated. Then, using the asymmetric Elliptic Curve-ElGamal cryptosystem, the image is encrypted. Finally, a genetic algorithm based on DNA sequences applies diffusion to the previously obtained output. In [11], the authors propose an encryption scheme with two scenarios; both use an Elliptic Curve Diffie-Hellman protocol and another key generated from the SHA-256 of the input image. Furthermore, a triple chaotic STH map is used in the encoding process, and a digital signature is added to verify the origin of the file before it is decrypted. In [12], the authors propose a scheme that uses ECC to share a point of an elliptic curve from which the initial parameters of a logistic map and the number of rounds of Arnold's map, used to permute the pixels of the image, are obtained. The parameter k of the elliptic curve is obtained with SHA-512, and the encryption keys are created by multiplying k with a generator point. In [13], the authors propose a four-step scheme. First, they reduce the input image, which in this case is an RGB image, and extend it in grayscale over the entire field E(F_p). Second, they apply an enhanced 4D Arnold cat map t times to permute the image pixels. Next, they encrypt the image with ECC. Finally, they apply a 3D Lorenz chaotic map to add diffusion to the final encrypted image. Lastly, in [14], the authors propose a new method called the EC-GRP permutation operation, which combines the cryptographic properties of ECC and the group operation. This procedure requires a pseudo-random bit generator with good cryptographic attributes, which is used to encrypt the input image. According to the researchers, the results obtained are highly resistant to cryptanalysis tests.
Despite the efficient algorithms described above, several issues remain to be improved. To this end, this paper proposes a cryptographic algorithm that improves on the above-mentioned schemes, and whose main contributions are summarized as follows: (a) It uses elliptic curves with a constant l equal to zero for the generation of the set of encryption keys. (b) It proposes an algorithm to generate elliptic curves, which are additionally required to comply with certain characteristics. (c) It uses a chaotic logistic equation to generate permutations and a substitution box, plus its inverse, with a non-linearity level of 100. (d) It implements a fingerprint so that the receiver can verify that the received file corresponds to the one sent by the issuer. (e) It encrypts the images in a single block, which allows an adequate encryption speed to be obtained. The cryptosystem was evaluated using several tests. After analyzing the evaluation results reported in Section 5, it follows that the proposed structure is robust and capable of withstanding linear, differential, statistical, brute-force, and modification attacks, as well as such well-known attacks as the discrete logarithm and MOV attacks.
The rest of this paper is organized as follows. In Section 2, the preliminaries of this research are stated. Section 3 describes how the encryption keys, the permutations, the substitution box, and the complete encryption algorithm are developed. Section 4 provides essential information, such as the images that are encrypted, an explanation of every test necessary to demonstrate how robust our proposal is against cryptanalysis, and the results obtained. In Section 5, the analysis of the data obtained after applying all the tests is provided. Finally, Section 6 presents the conclusions of this research.
4. Experiments and Security Analysis
In this section, we carry out all the experiments required to prove whether this cryptosystem is resistant against the main types of cryptanalysis and modifications. Other essential information is provided as well. To facilitate understanding, graphs and tables are included.
4.1. Images for Experiments
The proposed cryptosystem has undergone several tests to demonstrate its robustness against differential, linear, and statistical attacks. To carry out each experiment, JPEG images of different sizes have been chosen; these are:
Lena.jpeg, Baboon.jpeg, and Boat.jpeg, with dimensions of 512 × 512 pixels.
Barbara.jpeg, with dimensions of 720 × 576 pixels.
City.jpeg, with dimensions of 1280 × 720 pixels.
House.jpeg, with dimensions of 1920 × 1080 pixels.
Security.jpeg, with dimensions of 4900 × 3464 pixels.
The first four images are commonly used in peer-reviewed papers, and the others are proposed in this research. The proposed algorithm was implemented as a desktop application in the Java programming language, and the BufferedImage library was used to support certain tasks, such as reading the pixels of each image [32]. All pictures are shown in Figure 3.
4.2. Elliptic Curve to Generate the Encryption Keys
To perform all the experiments, an elliptic curve with the following data has been chosen:
1bdb5c33c01c799169c62dae71c4176c5d1b0d
1b1dbcab63715d81db85c8db589685e3b5f1ec
59196553ed14991dbb65913217a9e4cffd676c103fa9df60df6b4b1681806d3571a8bfba36a
5e74aaa3d208d1c5322ef4c2327fa7e23143aa8eab32b38e0dd1705f83c7cd9909626a3f039
179d2aa8f48234714c8bbd308c9fe9f88c50eb8285ae4ae46740e766125f66f46313fd78995
8e28e3a5cef3c15189a8b1d9a8d87a2ffc9c79146fde29b7984fd997ea4b2fbc
742fae6e43b6b33e4d4b49672c2097c87bbcbdba4f7b28b8969e844f19f3941acf646d88ae
55546d0927fc1d205516fbbe4ceb24ed2495e01adc421a814a5b404167a3674e8f2c15d9bf
Next, a brief analysis of the chosen elliptic curve is conducted:
If the discriminant 4a³ + 27b² mod p is calculated, the result obtained is 5a67ecc091bc618e74d07f4905361dd6c7e67fc44d6d26c76fbfbd9625a8b43a4eddb4fee4a, which is different from zero; therefore, this is a non-singular curve.
Later, if the trace t mod p is calculated, the result obtained is 179d2aa8f48234714c8bbd308c9fe9f88c50eb8285ae4ae46740e766125f66f46313fd78995, which is also different from zero; therefore, it is verified that this elliptic curve is not supersingular.
On the other hand, it is easy to observe that #E(F_p) and p are different; in this way, it is stated that this elliptic curve is not of trace one.
Thus, it is concluded that the chosen elliptic curve fulfills the four requirements described in Section 2.1 and is safe and suitable for generating the set of round keys.
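To make these requirements concrete, the following is a minimal Java sketch (Java being the implementation language mentioned in Section 4.1) of the three checks above, assuming a curve of the short Weierstrass form y² = x³ + ax + b over F_p whose point count #E(F_p) has already been computed; all class, method, and parameter names are illustrative and not taken from the paper's implementation.

```java
import java.math.BigInteger;

// Hedged sketch of the checks in this section; names are illustrative.
public final class CurveChecks {

    // Non-singular: the discriminant 4a^3 + 27b^2 must not vanish mod p.
    static boolean isNonSingular(BigInteger a, BigInteger b, BigInteger p) {
        BigInteger disc = a.pow(3).multiply(BigInteger.valueOf(4))
                .add(b.pow(2).multiply(BigInteger.valueOf(27)))
                .mod(p);
        return disc.signum() != 0;
    }

    // Not supersingular: the trace t = p + 1 - #E(F_p) must not be 0 mod p.
    static boolean isNotSupersingular(BigInteger cardinality, BigInteger p) {
        BigInteger t = p.add(BigInteger.ONE).subtract(cardinality);
        return t.mod(p).signum() != 0;
    }

    // Not of trace one (anomalous): #E(F_p) must be different from p.
    static boolean isNotTraceOne(BigInteger cardinality, BigInteger p) {
        return !cardinality.equals(p);
    }
}
```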
4.3. Encrypted Images
Figure 4 shows the encrypted results of the images shown in Figure 3. They prove that it is visually impossible to find a pattern that allows inferring their origin. Nevertheless, it is important to understand that a visual inspection is not enough to demonstrate that the ciphertext is impossible to reverse; for that reason, in the next sections, several tests will be applied to ascertain this mathematically.
4.4. Statistical Cryptanalysis
An important aspect of any cryptosystem is quantifying the quality of the encryption, that is, the level of randomness, which determines its resistance to statistical attacks that can recover the encryption key or the plaintext through biases or patterns present in the ciphertext. Statistical tests such as those presented below are useful for this purpose.
4.4.1. Entropy
Entropy is one of the most important contributions of the American mathematician Claude E. Shannon [33]. In an image, it must be computed for each band, depending on the type of color space. This test analyzes the histogram of the figure after it has been encrypted and determines whether its frequency distribution is much more uniform than that of the original image. It is known that 255 is the maximum value of a pixel, and its binary expression requires 8 bits; this implies that the perfect entropy of an image after being encrypted is 8, which is unlikely to be reached in practice. Therefore, any value greater than 7.9 indicates a high entropy level [15]. Equation (17) is used to calculate entropy.
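As an illustration, the following minimal Java sketch computes Equation (17), H = −Σ p(i)·log₂ p(i), from the histogram of a single 8-bit channel; representing the channel as an int array of values in [0, 255] is an assumption made for the example.

```java
// Minimal sketch: Shannon entropy of one 8-bit channel (Equation (17)).
public final class EntropyTest {
    static double entropy(int[] channel) {
        int[] hist = new int[256];
        for (int v : channel) hist[v]++;          // frequency of each intensity
        double h = 0.0;
        for (int count : hist) {
            if (count == 0) continue;             // 0 * log(0) is taken as 0
            double p = (double) count / channel.length;
            h -= p * Math.log(p) / Math.log(2.0); // log base 2
        }
        return h;                                 // close to 8 after encryption
    }
}
```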
The results obtained in this test are shown in
Table 3. The evaluation results show that the entropy provided by the proposed scheme is quite close to 8, which is the maximum theoretical value. Moreover,
Figure 5 shows the histograms per every RGB channel of the Lena image used in the experiments to visualize its distribution before and after encryption. Here, it can be shown that after encryption, the estimated histogram is almost flat, independently of the histogram shape of the image before encryption. These types of graphs are widely used to carry out a visual inspection in this test.
4.4.2. Correlation Coefficient
The correlation coefficient addresses the problem of analyzing the relationship between two variables [34]. In the specific case of image encryption, it analyzes whether the position of two contiguous pixels is determined by some given pattern, that is, whether there is a dependency between neighboring pixels in the image. Otherwise, it is said that there is no correlation and both are randomly positioned. This measure is obtained with Equation (18). If the measurement yields 0, the two variables are completely independent; if 1 or −1 is obtained, the conclusion is that they are fully dependent. In practical terms, it is almost impossible to get exactly zero, but any value close to it indicates a high level of randomness between the pixels of an encrypted image. The most appropriate way to carry out this measurement is in three directions: horizontal, that is, a pixel and its neighbor to the right; vertical, that is, a pixel and its neighbor below; and diagonal, that is, a pixel and its neighbor below shifted one space to the right.
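As an illustration of Equation (18), the sketch below computes the horizontal correlation coefficient of one channel of a w × h image stored row-major, pairing each pixel with its right neighbor; the vertical and diagonal variants only change the index of the second pixel. Sampling all adjacent pairs, rather than a random subset, is an assumption of this example.

```java
// Minimal sketch: horizontal correlation of adjacent pixels (Equation (18)).
public final class CorrelationTest {
    static double horizontalCorrelation(int[] px, int w, int h) {
        long n = 0;
        double sx = 0, sy = 0, sxx = 0, syy = 0, sxy = 0;
        for (int r = 0; r < h; r++) {
            for (int c = 0; c + 1 < w; c++) {
                double x = px[r * w + c];
                double y = px[r * w + c + 1];     // neighbor to the right
                sx += x; sy += y; sxx += x * x; syy += y * y; sxy += x * y;
                n++;
            }
        }
        double cov = sxy / n - (sx / n) * (sy / n);
        double dx  = sxx / n - (sx / n) * (sx / n);
        double dy  = syy / n - (sy / n) * (sy / n);
        return cov / Math.sqrt(dx * dy);          // near 0 for encrypted images
    }
}
```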
Table 4, Table 5 and Table 6 show the correlation coefficients obtained; only absolute values are reported. The evaluation results show that the horizontal, vertical, and diagonal correlation values of the images under analysis closely approach zero, which means that it is not possible to infer one pixel of the image under analysis from another. Thus, if the correlation between the pixels of the encrypted image approaches zero, it is not possible to infer the original image using only information from the encrypted one. In addition to the experimental data given in Table 2, Table 3, Table 4, Table 5 and Table 6, Figure 6 shows the scatter plot for every RGB channel of one of the images used in the experiments to visualize the position of the pixels before and after encryption; these types of graphs are widely used to carry out a visual inspection in this test. From this figure, it follows that in the encrypted image, it is not possible to estimate the value of pixel (x, y) using the values of pixels at other locations of the same image.
4.4.3. χ² Test
This statistical test is based on the statement of two hypotheses: the first of them is known as the null hypothesis and is denoted H0 [35]; in this case, its role is to affirm that the pixels of an image after being encrypted have a random distribution. The second one is known as the alternative hypothesis and is denoted H1; its function is to reject the assertion proposed by the null hypothesis. Using Equation (19), the χ² statistic is calculated, which determines which of the two hypotheses is accepted and which is rejected. Oi refers to the observed frequencies, that is, those of the encrypted image, and Ei refers to the expected frequencies, of which there are 256 in this case.
It is known that in tests based on hypotheses, there are two possible errors: a type I error, the most important one, that is, wrongly rejecting H0; and a type II error, that is, wrongly accepting H0. For this research work, the type I error is used, whose probability is the significance level α. In practice, any χ² value below the threshold of 308 is enough for H0 to be accepted [36].
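As an illustration of Equation (19), the sketch below accumulates χ² = Σ (Oi − Ei)² / Ei over the 256 intensity values of one channel, assuming the usual uniform expectation in which each of the 256 bins is expected to hold N/256 of the N pixels; the result is then compared against the threshold of 308.

```java
// Minimal sketch: chi-square statistic of one 8-bit channel (Equation (19)).
public final class ChiSquareTest {
    static double chiSquare(int[] channel) {
        int[] observed = new int[256];
        for (int v : channel) observed[v]++;        // observed frequencies O_i
        double expected = channel.length / 256.0;   // uniform expectation E_i
        double chi2 = 0.0;
        for (int o : observed) {
            double d = o - expected;
            chi2 += d * d / expected;
        }
        return chi2;                                // H0 accepted when below 308
    }
}
```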
Table 7 shows the results obtained from the test after the experiments.
4.5. Differential and Linear Cryptanalysis
Cryptanalysis is the antagonistic science of cryptography whose objective is to develop attacks capable of compromising or breaking encryption algorithms. Symmetric and block encryption cryptosystems must be robust against at least two types of cryptanalysis: differential and linear [37].
Differential attacks were first proposed by Eli Biham and Adi Shamir with the aim of breaking the DES cryptosystem [7]. Attacks of this type exploit the high probability of the existence of certain differences in the plaintext, ∆X, and differences in the last round of encryption, ∆Y, a pair known as the differential. This attack selects inputs and analyzes outputs to find the encryption key.
The linear attack was proposed for the first time by Mitsuru Matsui, who sought to break the DES cryptosystem through known plaintext at the input [6]. This attack considers linear correlations between some of the plaintext bits (input block) and the output bits (cipher block) to infer the cipher key. There are various tests that a cryptosystem must undergo to verify that it is not vulnerable to the attacks described above; this process is carried out below.
4.5.1. NPCR and UACI
Both standards serve to test the resistance of any cryptosystem against differential cryptanalysis; they function as follows. Suppose there are two encrypted images, C1 and C2, which come from two images whose only difference is one pixel and which have been encrypted with the same keys. If the proposed cryptosystem is robust, images C1 and C2 must be practically different, which can be measured with the NPCR and UACI standards [38]. The first is defined by Equation (20) and the second by Equation (21), where C1 and C2 refer to the images, T refers to the number of pixels in each image (multiplied by the number of planes), and D is defined in Equation (22).
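The sketch below illustrates Equations (20)-(22) on one channel: D(i) marks the positions where the two ciphertexts differ, NPCR is the percentage of such positions, and UACI averages the absolute intensity differences scaled by 255. Treating each channel as a flat int array is an assumption of the example.

```java
// Minimal sketch: NPCR and UACI for one channel (Equations (20)-(22)).
public final class NpcrUaciTest {
    static double[] npcrUaci(int[] c1, int[] c2) {      // same length T
        int changed = 0;
        double accum = 0.0;
        for (int i = 0; i < c1.length; i++) {
            if (c1[i] != c2[i]) changed++;              // D(i) of Equation (22)
            accum += Math.abs(c1[i] - c2[i]) / 255.0;   // UACI contribution
        }
        double npcr = 100.0 * changed / c1.length;      // expected near 99.6%
        double uaci = 100.0 * accum / c1.length;        // expected near 33.4%
        return new double[] { npcr, uaci };
    }
}
```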
In this experiment, both images are encrypted with the same set of encryption keys, which are generated from the main key corresponding to the SHA-1 of both. In all the images used, pixel 3750 of the blue channel has been modified. In practical terms, percentages in the range of 99.5% to 99.6% are expected for the NPCR and between 33.4% and 33.5% for the UACI. The results obtained in this test are shown in Table 8.
4.5.2. Avalanche Effect
The avalanche effect, also known as the avalanche attack, is another important way to test the resistance of any cryptosystem against differential attacks [39]. This standard works under the same principle observed in the NPCR and UACI tests, where a tiny change made in an image produces a practically different ciphered image, although in this case, the change is made at the bit level. Having said that, let P1 and P2 be two plain images with only one different bit; they are then ciphered using a group of round keys generated from a main key. Thus, the resulting images C1 and C2 must show a bit change rate approaching 50% [40]. Using Equations (22) and (23), it is possible to obtain such a measure.
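A minimal sketch of this measure, under the assumption that the two ciphertexts are available as byte arrays of equal length, is the bit change rate computed as the Hamming distance between them divided by the total number of bits:

```java
// Minimal sketch: bit change rate between two ciphertexts (avalanche effect).
public final class AvalancheTest {
    static double bitChangeRate(byte[] cipher1, byte[] cipher2) {
        long changedBits = 0;
        for (int i = 0; i < cipher1.length; i++) {
            int xor = (cipher1[i] ^ cipher2[i]) & 0xFF;      // differing bits
            changedBits += Integer.bitCount(xor);
        }
        return 100.0 * changedBits / (8.0 * cipher1.length); // ideally near 50%
    }
}
```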
The results obtained in the avalanche test are given in Table 9. The evaluation results show that the avalanche effect achieved by the proposed scheme is quite close to the ideal value of 50%.
4.5.3. Chosen/Known Plain-Text Attacks
These types of attacks fit into the category of linear cryptanalysis, and there are four of them in total: the known-plaintext attack, the chosen-plaintext attack, the ciphertext-only attack, and the chosen-ciphertext attack. The first two are the most important, and it is accepted that any cryptosystem capable of withstanding them will also withstand the last two [39]. The procedure to test whether a cryptosystem can resist them consists of encrypting two images, one white and one black, then measuring their entropies and correlation coefficients and verifying whether they fall within the parameters described in Section 4.4.1 and Section 4.4.2. In this experiment, five images of each are used, corresponding to the dimensions of those chosen for the experimentation stage. Table 10 shows the average results obtained in this test.
4.6. Key Sensitivity Test
The objective of this test is to analyze the percentage of different pixels between the ciphertexts of two equal images, P1 and P2. Both are encrypted with a set of encryption keys generated from a specific main key; in the case of the second image, the same key is used as for P1 but with a small modification, such as the change of one bit. Decrypting the first ciphertext with the keys of the second must be impossible, and vice versa. The percentage of change is calculated with Equation (24), and in practical terms, minimum DiffImg values of 99% are expected.
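A minimal sketch of Equation (24), computing DiffImg as the percentage of positions where the two ciphertexts of the same image (encrypted with keys differing by one bit) disagree, could look as follows; the flat-array representation is assumed for illustration.

```java
// Minimal sketch: DiffImg percentage for the key sensitivity test (Equation (24)).
public final class KeySensitivityTest {
    static double diffImg(int[] cipherA, int[] cipherB) { // same length
        int different = 0;
        for (int i = 0; i < cipherA.length; i++) {
            if (cipherA[i] != cipherB[i]) different++;
        }
        return 100.0 * different / cipherA.length;        // expected above 99%
    }
}
```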
Table 11 shows the results obtained from the test after the experiments.
4.7. Keyspace Analysis
A brute-force attack on an encryption key is inescapable; its objective is to try all possible combinations until finding the one that matches. In this case, the keys are generated from a solution point of an elliptic curve, which implies that there are approximately p different possibilities, where p is the prime number that defines the field. Furthermore, for a key to be considered secure, it must have a minimum size of 128 bits. Since the keys depend on an integer of 256 bits, the size of the keyspace used in this proposal is about 2^256, which is far beyond the minimum required.
4.8. Occlusion and Noise Attacks
When an encrypted image is transmitted over an insecure medium, it runs the risk of being intentionally or incidentally modified or distorted by an attacker. Any clipping, obstruction, or change in pixels that can be interpreted as added noise implies information that has been lost and is impossible to recover. Nevertheless, if the encryption quality of the image is high and all the pixels have been distributed in a way that can be considered random, much of the original information will still be displayable when the image is decrypted. Therefore, it is very important to measure the resistance of any cryptosystem mainly against two attacks: occlusion and added noise.
Furthermore, in this test, an image that displays a written message is used, which is shown in Figure 7. Sometimes, these types of pictures are sent to share notes or news, and attackers commonly try to interfere with or damage them to prevent the communication from being completed.
Regarding the first attack, Lena.jpeg and Text.jpeg are encrypted, and then various sections with an amplitude of 25%, 50%, and 75% of the total surface are intentionally cut out. Thus, when the image is deciphered, it is visually analyzed whether the remaining rearranged pixels are enough to infer what the original image was. The second attack consists of adding salt-and-pepper noise to the encrypted image, which in practice consists of scattering black or white pixels pseudo-randomly over the surface; in this case, the noise is added in densities of 25%, 50%, and 75%, and then the same visual analysis is performed as in the first attack. The obtained results are shown in Figure 8, Figure 9, Figure 10, Figure 11, Figure 12 and Figure 13.
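As an illustration of the second attack, the sketch below scatters salt-and-pepper noise over a given fraction of an encrypted channel before decryption; the in-place int-array representation and the use of java.util.Random are assumptions of the example, not details of the paper's implementation.

```java
import java.util.Random;

// Minimal sketch: salt-and-pepper noise at a given density (0.25, 0.50, 0.75).
public final class NoiseAttack {
    static void addSaltAndPepper(int[] px, double density, long seed) {
        Random rnd = new Random(seed);
        for (int i = 0; i < px.length; i++) {
            if (rnd.nextDouble() < density) {
                px[i] = rnd.nextBoolean() ? 255 : 0; // white (salt) or black (pepper)
            }
        }
    }
}
```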
4.9. Encryption Time
The computer used for all the tests had the following hardware resources:
Processor: Intel Core i3-7350K, 4.00 GHz, dual-core.
RAM: 8 GB, 1600 MHz, DDR3L.
Video card: Intel HD Graphics 630.
Hard disk: SATA, 7200 RPM.
Operating system: Windows 10 Pro.
Having explained this, the encryption speed was measured five times for every image; Table 12 shows the average speed calculated.
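A sketch of this timing methodology is given below: each image is encrypted the stated number of times and the average wall-clock time is reported. The Cipher interface and its encrypt method are placeholders standing in for the proposed algorithm.

```java
// Minimal sketch: averaging the encryption time over several runs.
public final class TimingHarness {
    interface Cipher { byte[] encrypt(byte[] plain); }

    static double averageMillis(Cipher cipher, byte[] image, int runs) {
        long totalNanos = 0;
        for (int i = 0; i < runs; i++) {
            long start = System.nanoTime();
            cipher.encrypt(image);
            totalNanos += System.nanoTime() - start;
        }
        return totalNanos / (runs * 1_000_000.0);  // milliseconds per run
    }
}
```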
4.10. Comparison with Other Articles of the State of the Art
Table 13 shows the average results obtained in this research in four tests for entropy and the correlation coefficient, while Table 14 shows a comparison of the NPCR, UACI, and avalanche values obtained by the proposed scheme against those obtained by similar papers recently published in the literature. Table 13 and Table 14 show that the evaluation results provided by the proposed scheme are quite competitive with other previously proposed schemes.
5. Analysis and Results
This section provides an analysis of the evaluation results obtained by every test, in the strict sequential order in which they were performed. The first evaluated parameter is the entropy; according to the results of Section 4.4.1, this test seeks a value indicating that the encrypted image has an almost uniform frequency distribution. If a value greater than 7.9 is obtained, the entropy is high and complies with the previous criterion. It can be seen from Table 3 that the lowest calculated value is 7.9992, which is very close to the ideal value of 8 for an image encoded with 8 bits/pixel. The next evaluated parameter is the correlation coefficient. In Section 4.4.2, it was explained that this test measures the level of dependence between the contiguous pixels of an encrypted image in three directions: horizontal, vertical, and diagonal. There is high non-linearity when the values approach zero, which happens in all cases. It should be remembered that the results of Table 4, Table 5 and Table 6 are expressed in absolute values. These tables show that the correlation values of the encrypted signals approach zero, which means that knowledge of some pixel values does not allow the rest of the encrypted image to be estimated. The χ² test proposed in Section 4.4.3 is based on the proposition of two hypotheses; the first indicates that the encryption carried out is random, and the second contradicts it; the acceptance of each one depends on a χ² threshold that must be less than 308. It is observed that the values recorded in Table 7 are less than 300 in all cases.
The resistance to differential and linear cryptanalysis was also evaluated. To this end, the NPCR and UACI were estimated. As mentioned in Section 4.5.1, this test uses both standards to measure the quantity and percentage of different pixels between two images whose only difference is one pixel. It was also explained that minimum values of 99.5% are expected for the NPCR and values not less than 33.4% for the UACI; as seen in Table 8, the lowest value in the first case is 99.60%, and in the second, it is 33.44%. Another test is the avalanche test, which evaluates whether a tiny change made in an image produces a practically different ciphered image. The evaluation results in Table 9 show that the obtained values are very close to 50%, which is the ideal value. The chosen/known plaintext attacks are the other recommended evaluation. In this test, black and white images of different dimensions are encrypted, and subsequently, their entropies and correlation coefficients are measured, which are recorded in Table 10. All measurements obtained are within the ranges specified in Section 4.4.1 and Section 4.4.2.
The keyspace and key sensitivity were also evaluated. In the case of the first, it was already explained in Section 4.7 that the minimum size for a key to be considered secure is 128 bits, which is far exceeded, since in our proposal the keyspace is about 2^256. On the other hand, the second test measures the percentage of distinct pixels when an image is encrypted with two keys with minimal variation. Table 11 shows percentages greater than 99%, which corresponds to what is expected in practice. The other evaluated attacks are the occlusion and noise attacks. The objective of both is to determine with a visual inspection how much information is recovered from an encrypted image that is covered or cropped to a certain extent, as in the occlusion attack, or randomly covered with white or black pixels after adding salt-and-pepper noise. In both cases, the attack is carried out at extensions or intensities of 25%, 50%, and 75%. Figure 8, Figure 9, Figure 10, Figure 11, Figure 12 and Figure 13 show that although the level of visibility after decryption is lower at higher intensities, it is still possible to perceive the message. Following the foregoing, it is determined that the results obtained are good and all the tests have been satisfactorily passed.
Finally, Table 13 and Table 14 compare the results of the entropy, correlation coefficient, NPCR, UACI, and avalanche tests of the presented proposal with other state-of-the-art articles. It is easy to see that the results obtained are close to those reported by other recently proposed papers. In some cases, they are equal or slightly lower, and in others, better measurements are obtained; therefore, it can be said that the system proposed in this research is competitive with the existing state of the art.
To conclude, future works that could enhance this research are mentioned as follows. Researchers could develop substitution boxes with a higher level of non-linearity and better cryptographic properties, or use other chaotic systems such as the Lorenz equations. The authors consider that it would also be of great value to verify the effectiveness of this proposal, or others, for the encryption of audio or video. Finally, the introduction of a steganography scheme could increase the security of the system, as could a scheme that includes a key distribution system.