Controlling the Difficulty of Combinatorial Optimization Problems for Fair Proof-of-Useful-Work-Based Blockchain Consensus Protocol
Abstract
1. Introduction
2. Related Work
2.1. Controlling the Difficulty in PoW and PoUW Protocols from the Literature
2.1.1. PoW Difficulty Adjustment
2.1.2. PoUW-Based Protocols That Utilize Difficulty Adjustment
2.1.3. PoUW-Based Protocols Considering AI, ML, and DL
2.1.4. Controlling the Difficulty in PoUW
2.2. Difficulty Estimation for CO Problem Instances
3. Controlling the Difficulty—Methodology
3.1. Difficulty Estimation Module
3.2. Instance Grouping Module
4. Case Study
4.1. Difficulty Estimation Module
4.1.1. Experimental Setup
4.1.2. ArcFlow Case Study
4.1.3. GIST Case Study
4.2. Instance Grouping Module Case Study
5. Discussion
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Notation
Abbreviations

Abbreviation | Meaning
---|---
ANN | Artificial Neural Network |
AUC | Area Under the Curve |
BC | Blockchain |
BPP | Bin Packing Problem |
BIT | Block Insertion Time |
CNN | Convolutional Neural Network |
CO | Combinatorial Optimization |
COCP | Combinatorial Optimization Consensus Protocol |
CPU | Central Processing Unit |
CRISP-DM | CRoss Industry Standard Process for Data Mining |
DL | Deep Learning |
DPLS | Doubly Parallel Local Search |
GIST | Greedy Iterative Stochastic Transformation |
JSSP | Job Shop Scheduling Problem |
LSTM | Long Short Term Memory |
MFP | Maximum Flow Problem |
MIP | Mixed Integer Programming |
ML | Machine Learning |
NSGA-II | Non-dominated Sorting Genetic Algorithm II |
P||Cmax | Problem of scheduling independent tasks on identical parallel machines |
PCA | Principal Component Analysis |
DLBC | Deep Learning-Based Consensus Protocol |
PoS | Proof-of-Search |
PoUW | Proof-of-Useful-Work |
PoW | Proof-of-Work |
RMSE | Root Mean Squared Error |
RNN | Recurrent Neural Networks |
ROC | Receiver Operating Characteristics |
R-squared | Coefficient of Determination |
SAT | Propositional Satisfiability |
SNN | Shallow Neural Network |
SPP | Strip Packing Problem |
SSP | Subset Sum Problem |
TSP | Traveling Salesman Problem |
Symbols

Symbol | Meaning
---|---
∅ | Empty set
| Package size tolerance parameter
| Type of activation function
| Type of activation function
| Type of activation function
| Number of chosen coordinates in TSP instance
| Number of coordinates in TSP instance
| Number of coordinates in hard TSP instance
| Threshold for a good-enough solution for TSP instance
i | Submitted instance
| Instance score (difficulty)
| Current time
f | BIT value
| Number of insertion cycles an instance is expected to wait in the pool before it is solved
| Group of instances
G | Set of groups of instances
L | Number of groups
| Deadline for execution of instance
H | Heap of instances not included in any group yet
j | Instance from heap
M | Set of mandatory instances
| Group formed of mandatory instances
| Subset sum problem solver function
n | Number of tasks in P||Cmax
m | Number of processors in P||Cmax
P | Set of tasks in P||Cmax
| Processing time of task q in P||Cmax
y | Time needed for problem execution (target variable)
| w-th example of instance for P||Cmax
Blockchain and machine learning jargon

Term | Definition
---|---
Miners | BC participants who have the computer hardware and appropriate software needed to mine digital currencies or solve complex mathematical problems |
Consensus Protocol | Mechanism to perform BC management without the central authority |
Transaction | A unit measure of data in BC |
Block | The basic container of information in a blockchain; each block stores transactions as data |
Cryptographic Puzzle | A mathematical puzzle that miners must solve in PoW-based BCs in order to append their blocks |
Proof-of-Work (PoW) | A common mechanism used to validate peer-to-peer transactions and maintain highly secured immutability of the blockchain |
Proof-of-Useful-Work (PoUW) | Energy efficient consensus protocol that re-purposes the computational effort required to maintain protocol security to solve complex real-world problems |
Proof-of-Learning (PoLe) | A PoW variant that exploits the computational power of miners to train ML models as the useful work in the consensus protocol |
Proof-of-Search (PoS) | Combines BC consensus formation with solving optimization problems |
Instance | An example of a problem with all the inputs needed to compute a solution to the problem |
Feature | Individual measurable property or characteristic of an instance |
Solution Space | The set of all possible solutions for the combinatorial optimization problem |
Randomized Algorithm | An algorithm that employs a degree of randomness as a part of its logic or procedure |
Parameter | The configuration variable that is internal to the model and whose value can be estimated from the given data |
Hyperparameter | The explicitly specified parameter that controls the training process |
Training (Train) Dataset | A dataset of examples used during the learning process to fit the parameters |
Validation Dataset | A dataset of examples used to tune the hyperparameters |
Test Dataset | A dataset that is independent of the training dataset, but follows the same probability distribution |
Data Normalization | The organization of data to appear similar across all records and fields |
Data Standardization | The process of converting data to a common format suitable for analysis by users |
Data Preprocessing | Data mining technique used to transform the raw data into a useful and efficient format |
Bias | Describes how well a model matches the training set |
Confusion Matrix | A table that is used to define the performance of a classification algorithm |
Transfer Learning | Taking the relevant parts of a pre-trained ML model and applying them to a new problem |
Depth | Number of layers in ML model |
Group | Group of instances that the miner should solve before publishing a new block |
Heap | All instances that are not included in any group yet |
Package | Successfully created group |
Model | An abstract representation of a decision process |
Framework | A tool that provides ready-made components or solutions that are customized in order to speed up development |
Cloud Service | Refers to a wide range of services delivered on demand to companies and customers over the internet |
Relative Bound | Difference between upper and lower bound of a solution relative to the upper bound value |
Scaling | A technique to standardize the independent features present in the data in a fixed range |
Oversampling | A technique that creates synthetic samples by randomly sampling the characteristics from occurrences in the minority class |
Neuron | A connection point in an ANN |
Activation Function | Decides whether a neuron should be activated or not |
Layer | A structural unit of the network topology in a DL model’s architecture |
Hidden Layer | A layer in between input layer and output layer |
Dropout | A regularization method that approximates training a large number of ANNs with different architectures in parallel |
Output Layer | The last layer of neurons that produces given outputs for the ANN |
Input Layer | Brings the initial data into the ANN for further processing by subsequent layers of artificial neurons |
Regression | A technique for investigating the relationship between independent variables or features and a dependent variable or outcome |
Classification | A supervised learning task that categorizes a set of data into classes |
Pool | Place to store unprocessed objects |
Hierarchical Model | A model in which lower levels are sorted under a hierarchy of successively higher-level units |
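The grouping-related terms above (Group, Heap, Package, Pool) can be illustrated with a short sketch. The snippet below is only a simplified greedy illustration under stated assumptions: the paper’s grouping module is formulated via a subset-sum solver, whereas here a first-fit heuristic packs instances whose summed difficulty scores fall within a tolerance of a target value; the function name `group_instances` and the parameters `target` and `eps` are hypothetical.

```python
from typing import List, Tuple

def group_instances(scores: List[float], target: float, eps: float) -> Tuple[List[List[int]], List[int]]:
    """Greedy first-fit sketch: pack instance indices into packages whose
    summed difficulty scores land within [target*(1-eps), target*(1+eps)];
    anything left over stays in the heap (pool) for a later cycle."""
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    groups: List[List[int]] = []
    heap: List[int] = []
    current, total = [], 0.0
    for i in order:
        if total + scores[i] <= target * (1 + eps):
            current.append(i)
            total += scores[i]
            if total >= target * (1 - eps):   # package is large enough: close it
                groups.append(current)
                current, total = [], 0.0
        else:
            heap.append(i)                    # does not fit this package: back to the pool
    heap.extend(current)                      # an unfinished package returns to the heap
    return groups, heap

# Example: pack ten instances toward a target difficulty of 30 with 10% tolerance.
packages, heap = group_instances(
    [0.1, 4.6, 5.8, 9.4, 11.4, 15.4, 19.9, 1.4, 6.1, 4.9], target=30.0, eps=0.1)
print(packages, heap)
```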
References
- Ball, M.; Rosen, A.; Sabin, M.; Vasudevan, P.N. Proofs of Useful Work. IACR Cryptology ePrint Archive. Last update 2021. Available online: https://eprint.iacr.org/2017/203.pdf (accessed on 1 April 2021).
- Shibata, N. Proof-of-search: Combining blockchain consensus formation with solving optimization problems. IEEE Access 2019, 7, 172994–173006. [Google Scholar] [CrossRef]
- Fitzi, M.; Kiayias, A.; Panagiotakos, G.; Russell, A. Ofelimos: Combinatorial Optimization via Proof-of-Useful-Work – A Provably Secure Blockchain Protocol. IACR Cryptology ePrint Archive. 2021. Available online: https://eprint.iacr.org/2021/1379.pdf (accessed on 28 January 2021).
- Todorović, M.; Matijević, L.; Ramljak, D.; Davidović, T.; Urošević, D.; Jakšić-Krüger, T.; Jovanović, Đ. Proof-of-Useful-Work: BlockChain Mining by Solving Real-life Optimization Problems. Symmetry 2022, 14, 1831. [Google Scholar] [CrossRef]
- Haouari, M.; Mhiri, M.; El-Masri, M.; Al-Yafi, K. A novel proof of useful work for a blockchain storing transportation transactions. Inf. Process. Manag. 2022, 59, 102749. [Google Scholar] [CrossRef]
- Li, W. Adapting Blockchain Technology for Scientific Computing. arXiv 2018, arXiv:1804.08230. [Google Scholar]
- Loe, A.F.; Quaglia, E.A. Conquering generals: An NP-hard proof of useful work. In Proceedings of the 1st Workshop on Cryptocurrencies and Blockchains for Distributed Systems, Munich, Germany, 15 June 2018; pp. 54–59. [Google Scholar]
- Syafruddin, W.A.; Dadkhah, S.; Köppen, M. Blockchain Scheme Based on Evolutionary Proof of Work. In Proceedings of the 2019 IEEE Congress on Evolutionary Computation (CEC), Wellington, New Zealand, 10–13 June 2019; pp. 771–776. [Google Scholar]
- Chin, Z.H.; Yap, T.T.V.; Tan, I.K. Simulating the adjustment of the difficulty in blockchain with SimBlock. In Proceedings of the 2nd ACM International Symposium on Blockchain and Secure Critical Infrastructure, Taipei, Taiwan, 6 October 2020; pp. 192–197. [Google Scholar]
- Feng, W.; Cao, Z.; Shen, J.; Dong, X. RTPoW: A Proof-of-Work Consensus Scheme with Real-Time Difficulty Adjustment Algorithm. In Proceedings of the 2021 IEEE 27th International Conference on Parallel and Distributed Systems (ICPADS), Beijing, China, 14–16 December 2021; pp. 233–240. [Google Scholar]
- Hutter, F.; Hamadi, Y.; Hoos, H.H.; Leyton-Brown, K. Performance prediction and automated tuning of randomized and parametric algorithms. In Proceedings of the 12th International Conference on Principles and Practice of Constraint Programming—CP 2006, Nantes, France, 25–29 September 2006; Springer: Berlin/Heidelberg, Germany, 2006; pp. 213–228. [Google Scholar]
- Smith-Miles, K.; Lopes, L. Measuring instance difficulty for combinatorial optimization problems. Comput. Oper. Res. 2012, 39, 875–889. [Google Scholar] [CrossRef]
- Baldominos, A.; Saez, Y. Coin.AI: A proof-of-useful-work scheme for blockchain-based distributed deep learning. Entropy 2019, 21, 723. [Google Scholar] [CrossRef] [Green Version]
- Lihu, A.; Du, J.; Barjaktarevic, I.; Gerzanics, P.; Harvilla, M. A Proof of Useful Work for Artificial Intelligence on the Blockchain. arXiv 2020, arXiv:2001.09244. [Google Scholar]
- Li, B.; Chenli, C.; Xu, X.; Shi, Y.; Jung, T. DLBC: A Deep Learning-Based Consensus in Blockchains for Deep Learning Services. arXiv 2020, arXiv:1904.07349v2. [Google Scholar]
- Schröer, C.; Kruse, F.; Gómez, J.M. A systematic literature review on applying CRISP-DM process model. Procedia Comput. Sci. 2021, 181, 526–534. [Google Scholar] [CrossRef]
- Alipour, H.; Muñoz, M.A.; Smith-Miles, K. Enhanced instance space analysis for the maximum flow problem. Eur. J. Oper. Res. 2023, 304, 411–428. [Google Scholar] [CrossRef]
- Strassl, S.; Musliu, N. Instance space analysis and algorithm selection for the job shop scheduling problem. Comput. Oper. Res. 2022, 141, 105661. [Google Scholar] [CrossRef]
- Jooken, J.; Leyman, P.; De Causmaecker, P. A new class of hard problem instances for the 0–1 knapsack problem. Eur. J. Oper. Res. 2022, 301, 841–854. [Google Scholar] [CrossRef]
- Piechowiak, K.; Drozdowski, M.; Sanlaville, É. Framework of algorithm portfolios for strip packing problem. Comput. Ind. Eng. 2022, 172, 108538. [Google Scholar] [CrossRef]
- Pinedo, M.L. Scheduling: Theory, Algorithms, and Systems; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2012. [Google Scholar]
- Mrad, M.; Souayah, N. An Arc-Flow Model for the Makespan Minimization Problem on Identical Parallel Machines. IEEE Access 2018, 6, 5300–5307. [Google Scholar] [CrossRef]
- Ostojić, D.; Davidović, T.; Jakšić Kruger, T.; Ramljak, D. Comparative Analysis of Heuristic Approaches to P||Cmax. In Proceedings of the 11th International Conference on Operations Research and Enterprise Systems, ICORES 2022, Online Streaming, 3–5 February 2022; pp. 352–359. [Google Scholar]
- Dwork, C.; Naor, M. Pricing via processing or combatting junk mail. In Annual International Cryptology Conference; Springer: Cham, Switzerland, 1992; pp. 139–147. [Google Scholar]
- Nakamoto, S. Bitcoin: A Peer-to-Peer Electronic Cash System. 2008. Available online: https://nakamotoinstitute.org/bitcoin/ (accessed on 29 November 2019).
- Garay, J.; Kiayias, A.; Leonardos, N. The bitcoin backbone protocol: Analysis and applications. In Annual International Conference on the Theory and Applications of Cryptographic Techniques; Springer: Cham, Switzerland, 2015; pp. 281–310. [Google Scholar]
- Pass, R.; Seeman, L.; Shelat, A. Analysis of the blockchain protocol in asynchronous networks. In Annual International Conference on the Theory and Applications of Cryptographic Techniques; Springer: Cham, Switzerland, 2017; pp. 643–673. [Google Scholar]
- Meshkov, D.; Chepurnoy, A.; Jansen, M. Short paper: Revisiting difficulty control for blockchain systems. In Data Privacy Management, Cryptocurrencies and Blockchain Technology; Springer: Cham, Switzerland, 2017; pp. 429–436. [Google Scholar]
- Noda, S.; Okumura, K.; Hashimoto, Y. An Economic Analysis of Difficulty Adjustment Algorithms in Proof-of-Work Blockchain Systems. 2019. Available online: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3410460 (accessed on 20 September 2022).
- Aggarwal, V.; Tan, Y. A Structural Analysis of Bitcoin Cash’s Emergency Difficulty Adjustment Algorithm. 2019. Available online: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3383739 (accessed on 20 September 2022).
- Zhang, S.; Ma, X. A general difficulty control algorithm for proof-of-work based blockchains. In Proceedings of the ICASSP 2020—2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Barcelona, Spain, 4–8 May 2020; pp. 3077–3081. [Google Scholar]
- Zheng, K.; Zhang, S.; Ma, X. Difficulty prediction for proof-of-work based blockchains. In Proceedings of the 2020 IEEE 21st International Workshop on Signal Processing Advances in Wireless Communications (SPAWC), Atlanta, GA, USA, 26–29 May 2020; pp. 1–5. [Google Scholar]
- Chin, Z.H.; Yap, T.T.V.; Tan, I.K.T. Genetic-Algorithm-Inspired Difficulty Adjustment for Proof-of-Work Blockchains. Symmetry 2022, 14, 609. [Google Scholar] [CrossRef]
- Hutter, F.; Xu, L.; Hoos, H.H.; Leyton-Brown, K. Algorithm runtime prediction: Methods & evaluation. Artif. Intell. 2014, 206, 79–111. [Google Scholar]
- Cook, W. Concorde TSP Solver. 2016. Available online: http://www.math.uwaterloo.ca/tsp/concorde.html (accessed on 1 August 2022).
- Karimi-Mamaghan, M.; Mohammadi, M.; Meyer, P.; Karimi-Mamaghan, A.M.; Talbi, E.G. Machine learning at the service of meta-heuristics for solving combinatorial optimization problems: A state-of-the-art. Eur. J. Oper. Res. 2022, 296, 393–422. [Google Scholar] [CrossRef]
- Vaughan, D. Analytical Skills for AI and Data Science: Building Skills for an AI-Driven Enterprise; O’Reilly Media, Inc.: Sebastopol, CA, USA, 2020. [Google Scholar]
- Allen, M.; Leung, R.; McGrenere, J.; Purves, B. Involving domain experts in assistive technology research. Univers. Access Inf. Soc. 2008, 7, 145–154. [Google Scholar] [CrossRef]
- Janssens, J. Data Science at the Command Line; O’Reilly Media, Inc.: Sebastopol, CA, USA, 2021. [Google Scholar]
- Oymak, S.; Li, M.; Soltanolkotabi, M. Generalization guarantees for neural architecture search with train-validation split. In Proceedings of the 38th International Conference on Machine Learning, Virtual, 18–24 July 2021; PMLR: London, UK, 2021; pp. 8291–8301. [Google Scholar]
- Saunshi, N.; Gupta, A.; Hu, W. A Representation Learning Perspective on the Importance of Train-Validation Splitting in Meta-Learning. In Proceedings of the 38th International Conference on Machine Learning, Virtual, 18–24 July 2021; PMLR: London, UK, 2021; pp. 9333–9343. [Google Scholar]
- Patro, S.; Sahu, K.K. Normalization: A preprocessing stage. arXiv 2015, arXiv:1503.06462. [Google Scholar] [CrossRef]
- Singh, D.; Singh, B. Investigating the impact of data normalization on classification performance. Appl. Soft Comput. 2020, 97, 105524. [Google Scholar] [CrossRef]
- Bhanja, S.; Das, A. Impact of data normalization on deep neural network for time series forecasting. arXiv 2018, arXiv:1812.05519. [Google Scholar]
- Hou, Z.; Hu, Q.; Nowinski, W.L. On minimum variance thresholding. Pattern Recognit. Lett. 2006, 27, 1732–1743. [Google Scholar] [CrossRef]
- Khalid, S.; Khalil, T.; Nasreen, S. A survey of feature selection and feature extraction techniques in machine learning. In Proceedings of the 2014 Science and Information Conference, London, UK, 27–29 August 2014; pp. 372–378. [Google Scholar]
- Cai, J.; Luo, J.; Wang, S.; Yang, S. Feature selection in machine learning: A new perspective. Neurocomputing 2018, 300, 70–79. [Google Scholar] [CrossRef]
- Hall, M.A. Correlation-Based Feature Selection for Machine Learning. Ph.D. Thesis, The University of Waikato, Hamilton, New Zealand, 1999. [Google Scholar]
- LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature 2015, 521, 436–444. [Google Scholar] [CrossRef] [PubMed]
- Goodfellow, I.; Bengio, Y.; Courville, A. Deep Learning; MIT Press: Cambridge, MA, USA, 2016. [Google Scholar]
- Novaković, J.D.; Veljović, A.; Ilić, S.S.; Papić, Ž.; Milica, T. Evaluation of classification models in machine learning. Theory Appl. Math. Comput. Sci. 2017, 7, 39–46. [Google Scholar]
- Reich, Y.; Barai, S. Evaluating machine learning models for engineering problems. Artif. Intell. Eng. 1999, 13, 257–272. [Google Scholar] [CrossRef]
- Jiao, Y.; Du, P. Performance measures in evaluating machine learning based bioinformatics predictors for classifications. Quant. Biol. 2016, 4, 320–330. [Google Scholar] [CrossRef] [Green Version]
- Haghighi, S.; Jasemi, M.; Hessabi, S.; Zolanvari, A. PyCM: Multiclass confusion matrix library in Python. J. Open Source Softw. 2018, 3, 729. [Google Scholar] [CrossRef] [Green Version]
- Marom, N.D.; Rokach, L.; Shmilovici, A. Using the confusion matrix for improving ensemble classifiers. In Proceedings of the 2010 IEEE 26th Convention of Electrical and Electronics Engineers in Israel, Eilat, Israel, 17–20 November 2010; pp. 000555–000559. [Google Scholar]
- Bergstra, J.; Bardenet, R.; Bengio, Y.; Kégl, B. Algorithms for hyper-parameter optimization. Adv. Neural Inf. Process. Syst. 2011, 24. Available online: https://proceedings.neurips.cc/paper/2011/file/86e8f7ab32cfd12577bc2619bc635690-Paper.pdf (accessed on 1 August 2022).
- Feurer, M.; Hutter, F. Hyperparameter optimization. In Automated Machine Learning; Springer: Cham, Switzerland, 2019; pp. 3–33. [Google Scholar]
- Yang, L.; Shami, A. On hyperparameter optimization of machine learning algorithms: Theory and practice. Neurocomputing 2020, 415, 295–316. [Google Scholar] [CrossRef]
- Shawki, N.; Nunez, R.R.; Obeid, I.; Picone, J. On Automating Hyperparameter Optimization for Deep Learning Applications. In Proceedings of the 2021 IEEE Signal Processing in Medicine and Biology Symposium (SPMB), Philadelphia, PA, USA, 4 December 2021; pp. 1–7. [Google Scholar]
- Coffman, E.G., Jr.; Garey, M.R.; Johnson, D.S. Approximation algorithms for bin packing: A survey. In Approximation Algorithms for NP-Hard Problems; PWS Publishing Company: Boston, MA, USA, 1996; pp. 46–93. [Google Scholar]
- Pisinger, D. Algorithms for Knapsack Problems; DIKU rapport 95/1; University of Copenhagen: Copenhagen, Denmark, 1995. [Google Scholar]
- Davidović, T.; Šelmić, M.; Teodorović, D.; Ramljak, D. Bee colony optimization for scheduling independent tasks to identical processors. J. Heuristics 2012, 18, 549–569. [Google Scholar] [CrossRef]
- Graham, R.L. Bounds on multiprocessing timing anomalies. SIAM J. Appl. Math. 1969, 17, 416–429. [Google Scholar] [CrossRef] [Green Version]
- Yamashiro, H.; Nonaka, H. Estimation of processing time using machine learning and real factory data for optimization of parallel machine scheduling problem. Oper. Res. Perspect. 2021, 8, 100196. [Google Scholar] [CrossRef]
- Flach, P. Performance evaluation in machine learning: The good, the bad, the ugly, and the way forward. In Proceedings of the AAAI Conference on Artificial Intelligence, Honolulu, HI, USA, 27 January–1 February 2019; Volume 33, pp. 9808–9814. [Google Scholar]
- Lorenzo-Seva, U. How to Report the Percentage of Explained Common Variance in Exploratory Factor Analysis; Department of Psychology: Tarragona, Spain, 2013. [Google Scholar]
- Shelke, M.S.; Deshmukh, P.R.; Shandilya, V.K. A review on imbalanced data handling using undersampling and oversampling technique. Int. J. Recent Trends Eng. Res 2017, 3, 444–449. [Google Scholar]
- Yang, J.B.; Shen, K.Q.; Ong, C.J.; Li, X.P. Feature selection for MLP neural network: The use of random permutation of probabilistic outputs. IEEE Trans. Neural Netw. 2009, 20, 1911–1922. [Google Scholar] [CrossRef]
- Song, F.; Guo, Z.; Mei, D. Feature selection using principal component analysis. In Proceedings of the 2010 International Conference on System Science, Engineering Design and Manufacturing Informatization, Yichang, China, 12–14 November 2010; Volume 1, pp. 27–30. [Google Scholar]
- Cheng, Q.; Varshney, P.K.; Arora, M.K. Logistic regression for feature selection and soft classification of remote sensing data. IEEE Geosci. Remote Sens. Lett. 2006, 3, 491–494. [Google Scholar] [CrossRef]
- Abdel-Gawad, A.; Ratner, S. Adaptive Optimization of Hyperparameters in L2-Regularised Logistic Regression; Technical report; 2007. Available online: http://cs229.stanford.edu/proj2007/AbdelGawadRatner-AdaptiveHyperparameterOptimization.pdf (accessed on 1 August 2022).
- Khadem, H.; Eissa, M.R.; Nemat, H.; Alrezj, O.; Benaissa, M. Classification before regression for improving the accuracy of glucose quantification using absorption spectroscopy. Talanta 2020, 211, 120740. [Google Scholar] [CrossRef]
Feature Notation | Explanation of Feature |
---|---|
n | Cardinality of the task set P, i.e., the number of tasks
m | Number of processors |
av.length | Average length of elements in set P |
median | Median value of elements in set P |
std.dev | Standard deviation of set P |
max | Maximum value in set P |
min | Minimum value in set P |
range | Difference between maximum and minimum value in set P |
k | Number of different values of elements in set P |
| Difference between a solution’s upper and lower bound relative to the upper bound
| Average number of tasks per processor
| Polynomial feature of
| Polynomial feature of
| Polynomial feature of
| Polynomial feature of
| Polynomial feature of
subtype | Instances with the same subtype have the same value
| Probability distribution of the random generator for elements of P
| Index of instance in the dataset with the same subtype and class [1–10]
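For concreteness, the basic statistical features in the table can be computed directly from an instance’s task processing times P and processor count m. The sketch below is an assumption-labeled illustration (NumPy-based, with the function name `instance_features` chosen here), not the authors’ implementation; the relative-bound, subtype, distribution, and polynomial features are omitted because they depend on the instance-generation setup.

```python
import numpy as np

def instance_features(P, m):
    """Basic statistical features of a P||Cmax instance, given its task
    processing times P and processor count m (names follow the table)."""
    P = np.asarray(P, dtype=float)
    n = P.size
    return {
        "n": n,                              # number of tasks
        "m": m,                              # number of processors
        "av.length": float(P.mean()),        # average task length
        "median": float(np.median(P)),
        "std.dev": float(P.std()),
        "max": float(P.max()),
        "min": float(P.min()),
        "range": float(P.max() - P.min()),
        "k": int(np.unique(P).size),         # number of distinct task lengths
        "tasks per processor": n / m,        # average number of tasks per processor
    }

# Example: a small instance with 6 tasks on 2 processors.
print(instance_features([4, 7, 7, 10, 3, 5], m=2))
```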
Feature | Correlation with Target y |
---|---|
y | 1.000000 |
std.dev | 0.539327 |
max | 0.538683 |
av.length | 0.538287 |
median | 0.536837 |
k | 0.529941 |
range | 0.528752 |
min | 0.495868 |
| 0.423243
| 0.420624
| 0.412105
subtype | 0.386236
| 0.377794
| 0.364556
n | 0.356450
| 0.337697
| 0.300864
| 0.246362
m | 0.100150
| 0.005231
Feature | Correlation with Target y |
---|---|
y | 1.000000 |
| 0.476547
m | 0.442684
| 0.430256
| 0.418171
| 0.411345
| 0.382118
| 0.340878
av.length | 0.306203
median | 0.305065
min | 0.302683
max | 0.297263
| 0.289090
range | 0.283401
std.dev | 0.253432
| 0.250783
k | 0.215968
| 0.211151
n | 0.160885
| 0.007393
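A ranking like the two tables above can be reproduced, under assumptions, by correlating every feature column with the runtime target y and ordering the features by magnitude. The sketch below uses pandas’ default Pearson correlation; the actual correlation measure, the DataFrame layout, and the file name arcflow_features.csv are assumptions, not details taken from the paper.

```python
import pandas as pd

def rank_features_by_correlation(df: pd.DataFrame, target: str = "y") -> pd.Series:
    """Correlate every numeric feature column with the target column and
    order the result by the magnitude of the correlation."""
    corr = df.select_dtypes("number").corr()[target]
    return corr.loc[corr.abs().sort_values(ascending=False).index]

# Hypothetical usage: one row per instance, feature columns plus the target y.
# df = pd.read_csv("arcflow_features.csv")
# print(rank_features_by_correlation(df, target="y"))
```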
i | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10
Instance score | 0.000013 | 0.0001 | 0.000105 | 0.000231 | 0.000471 | 0.000607 | 0.001301 | 0.275697 | 1.420326 | 4.651514
Instance index | 10 | 20 | 6 | 19 | 17 | 3 | 13 | 7 | 12 | 11
i | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20
Instance score | 4.905763 | 4.928929 | 5.492491 | 5.833624 | 6.058383 | 6.132013 | 9.387262 | 11.366228 | 15.392948 | 19.907111
Instance index | 8 | 4 | 9 | 14 | 2 | 18 | 15 | 1 | 5 | 16
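Assuming the second and third rows of the table list each sorted score and its original (1-based) instance index, the ordering can be reproduced with a simple argsort over the per-instance scores. The snippet below rebuilds the unsorted score array from the table and is illustrative only.

```python
import numpy as np

# Instance scores indexed by original instance number 1..20,
# reconstructed from the table above.
scores = np.array([11.366228, 6.058383, 0.000607, 4.928929, 15.392948,
                   0.000105, 0.275697, 4.905763, 5.492491, 0.000013,
                   4.651514, 1.420326, 0.001301, 5.833624, 9.387262,
                   19.907111, 0.000471, 6.132013, 0.000231, 0.000100])
order = np.argsort(scores)          # positions of scores in ascending order
for rank, pos in enumerate(order, start=1):
    # rank i, sorted score, original instance index (1-based)
    print(rank, scores[pos], pos + 1)
```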
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).