Active Learning: Encoder-Decoder-Outlayer and Vector Space Diversification Sampling
Abstract
1. Introduction
- Proposal of an Encoder-Decoder-Outlayer (EDO) active learning method for text classification;
- Exploration of the applicability of EDO, demonstrating its effectiveness in addressing the problem of limited labeled data;
- Exploration of different models and techniques, such as BERT-base, S-BERT, the Universal Sentence Encoder, Word2Vec, and Document2Vec, to optimize datasets for deep learning;
- Proposal of the use of t-SNE for dimension reduction and comparison of sentence vectors (a minimal sketch follows this list).
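As a concrete illustration of the last point, the following minimal sketch embeds a few sentences with an S-BERT model and projects the vectors to 2-D with t-SNE for comparison. The "all-MiniLM-L6-v2" checkpoint and the t-SNE settings are assumptions for the example; the paper does not fix a specific configuration.

```python
# Minimal sketch: embed sentences with an S-BERT model and project the
# resulting vectors to 2-D with t-SNE for visual comparison.
# Assumed (not specified in the paper): the "all-MiniLM-L6-v2" checkpoint
# and a small perplexity suitable for a tiny demo.
from sentence_transformers import SentenceTransformer
from sklearn.manifold import TSNE

sentences = [
    "The market rallied after the earnings report.",
    "The team won the championship last night.",
    "The new phone has a larger battery.",
]

encoder = SentenceTransformer("all-MiniLM-L6-v2")
vectors = encoder.encode(sentences)              # shape: (3, 384)

# perplexity must be smaller than the number of samples
tsne = TSNE(n_components=2, perplexity=2, random_state=0)
coords = tsne.fit_transform(vectors)             # shape: (3, 2)
print(coords)
```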
2. Literature Review
3. Data Description
4. Methodology
Basic Framework
5. Experimental Results
5.1. Settings
- (1) Train an auto-encoder-decoder (omitted here; borrowed from Sentence-BERT):
  - a. Only the encoder, the decoder, and the training data reside in GPU memory.
- (2) Buffer the encoder vectors (encode all data items into vectors):
  - a. Parallelizable;
  - b. Only the encoder and the prediction data reside in GPU memory.
- (3) Train the Outlayer (a minimal sketch follows this list):
  - a. Only the Outlayer and one batch of training vectors and labels reside in GPU memory.
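The sketch below illustrates steps (2) and (3) under stated assumptions: a pre-trained Sentence-BERT checkpoint ("all-MiniLM-L6-v2", not specified in the paper) stands in for the trained encoder of step (1), and the Outlayer is represented by a single linear classifier trained with Adam. The architecture and hyperparameters are illustrative, not the paper's exact configuration.

```python
# Sketch of steps (2) and (3): buffer encoder vectors once, then train only a
# small output layer ("Outlayer") so that just the classifier and one batch of
# vectors/labels need to sit in GPU memory.
import torch
import torch.nn as nn
from sentence_transformers import SentenceTransformer

device = "cuda" if torch.cuda.is_available() else "cpu"

# --- Step (2): encode all items once and buffer the vectors on CPU ----------
texts = ["great movie", "terrible service", "loved it", "waste of time"]
labels = torch.tensor([1, 0, 1, 0])

encoder = SentenceTransformer("all-MiniLM-L6-v2", device=device)
with torch.no_grad():
    vectors = torch.tensor(encoder.encode(texts))   # buffered on CPU
emb_dim = vectors.shape[1]

# --- Step (3): train only the Outlayer on the buffered vectors --------------
outlayer = nn.Linear(emb_dim, 2).to(device)
optimizer = torch.optim.Adam(outlayer.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(20):
    logits = outlayer(vectors.to(device))           # move one batch to GPU
    loss = loss_fn(logits, labels.to(device))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
print("final training loss:", loss.item())
```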
5.2. Vector Space Diversification Sampling
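Vector space diversification sampling selects items that spread out over the encoder's vector space rather than clustering in one region. Purely as an illustration of that general idea, and not as a reimplementation of the paper's exact VSD rule, the sketch below performs greedy farthest-point selection over a pool of buffered sentence vectors; the function name and the random pool are hypothetical.

```python
# Illustrative sketch only: greedy farthest-point selection over buffered
# sentence vectors, one common way to diversify samples in a vector space.
# This is not claimed to be the paper's exact VSD selection rule.
import numpy as np

def diversify_sample(vectors: np.ndarray, k: int, seed: int = 0) -> list:
    """Pick k indices whose vectors are spread out (greedy farthest-point)."""
    rng = np.random.default_rng(seed)
    chosen = [int(rng.integers(len(vectors)))]      # random first pick
    dists = np.linalg.norm(vectors - vectors[chosen[0]], axis=1)
    for _ in range(k - 1):
        nxt = int(dists.argmax())                   # farthest from chosen set
        chosen.append(nxt)
        dists = np.minimum(dists, np.linalg.norm(vectors - vectors[nxt], axis=1))
    return chosen

# Example: pick 15 diverse items from 10,000 buffered 384-d vectors
pool = np.random.rand(10_000, 384).astype(np.float32)
print(diversify_sample(pool, k=15)[:5])
```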
5.3. Data and Vector Space: Understanding and Unfamiliarity
5.4. Results
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Bražinskas, A.; Nallapati, R.; Bansal, M.; Dreyer, M. Efficient few-shot fine-tuning for opinion summarization. arXiv 2022, arXiv:2205.02170. [Google Scholar]
- Dodge, J.; Ilharco, G.; Schwartz, R.; Farhadi, A.; Hajishirzi, H.; Smith, N. Fine-tuning pretrained language models: Weight initializations, data orders, and early stopping. arXiv 2020, arXiv:2002.06305. [Google Scholar]
- Hu, E.J.; Shen, Y.; Wallis, P.; Allen-Zhu, Z.; Li, Y.; Wang, S.; Wang, L.; Chen, W. LoRA: Low-rank adaptation of large language models. arXiv 2021, arXiv:2106.09685. [Google Scholar]
- Settles, B. Active Learning Literature Survey; Computer Sciences Technical Report 1648; University of Wisconsin–Madison: Madison, WI, USA, 2009. Available online: https://www.semanticscholar.org/paper/Active-Learning-Literature-Survey-Settles/818826f356444f3daa3447755bf63f171f39ec47 (accessed on 1 April 2023). [Google Scholar]
- Dai, Y.; Yang, C.; Liu, Y.; Yao, Y. Latent-Enhanced Variational Adversarial Active Learning Assisted Soft Sensor. IEEE Sens. J. 2023. [Google Scholar] [CrossRef]
- Deng, H.; Yang, K.; Liu, Y.; Zhang, S.; Yao, Y. Actively exploring informative data for smart modeling of industrial multiphase flow processes. IEEE Trans. Ind. Inform. 2020, 17, 8357–8366. [Google Scholar] [CrossRef]
- Xie, B.; Yuan, L.; Li, S.; Liu, C.H.; Cheng, X.; Wang, G. Active learning for domain adaptation: An energy-based approach. In Proceedings of the AAAI Conference on Artificial Intelligence, Virtual, 22 February–1 March 2022; Volume 36, pp. 8708–8716. [Google Scholar]
- Liu, Y.; Yan, Z.; Tan, J.; Li, Y. Multi-purpose Oriented Single Nighttime Image Haze Removal Based on Unified Variational Retinex Model. IEEE Trans. Circuits Syst. Video Technol. 2022. [Google Scholar]
- Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention Is All You Need. In Proceedings of the Advances in Neural Information Processing Systems 30 (NIPS 2017), Long Beach, CA, USA, 4–9 December 2017; Volume 30. Available online: https://arxiv.org/pdf/1706.03762.pdf (accessed on 2 April 2023).
- Floris Jacobs, P.; Maillette de Buy Wenniger, G.; Wiering, M.; Schomaker, L. Active Learning for Reducing Labeling Effort in Text Classification Tasks. arXiv 2021, arXiv:2109.04847. [Google Scholar]
- Reimers, N.; Gurevych, I. Sentence-BERT: Sentence embeddings using Siamese BERT-networks. arXiv 2019, arXiv:1908.10084. [Google Scholar]
- Cer, D.; Yang, Y.; Kong, S.Y.; Hua, N.; Limtiaco, N.; John, R.S.; Constant, N.; Guajardo-Cespedes, M.; Yuan, S.; Tar, C.; et al. Universal sentence encoder. arXiv 2018, arXiv:1803.11175. [Google Scholar]
- Mikolov, T.; Chen, K.; Corrado, G.; Dean, J. Efficient estimation of word representations in vector space. arXiv 2013, arXiv:1301.3781. [Google Scholar]
- Le, Q.; Mikolov, T. Distributed Representations of Sentences and Documents. arXiv 2014, arXiv:1405.4053. [Google Scholar]
- Van der Maaten, L.; Hinton, G. Visualizing Data Using T-SNE. J. Mach. Learn. Res. 2008, 9, 2579–2605. [Google Scholar]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification. arXiv 2015, arXiv:1502.01852. [Google Scholar]
- Dettmers, T.; Lewis, M.; Belkada, Y.; Zettlemoyer, L. LLM.int8(): 8-bit matrix multiplication for transformers at scale. arXiv 2022, arXiv:2208.07339. [Google Scholar]
- Dogo, E.M.; Afolabi, O.J.; Nwulu, N.I.; Twala, B.; Aigbavboa, C.O. A Comparative Analysis of Gradient Descent-Based Optimization Algorithms on Convolutional Neural Networks. In Proceedings of the 2018 International Conference on Computational Techniques, Electronics and Mechanical Systems (CTEMS), Belgaum, India, 21–22 December 2018; Niranjan, S.K., Kavitha, C., Kavitha, K.S., Sathish Kumar, T., Eds.; IEEE Bangalore Section; IEEE: Piscataway, NJ, USA, 2018; pp. 92–99. [Google Scholar]
- Markowitz, H. Portfolio Selection. J. Financ. 1952, 7, 77–91. [Google Scholar] [CrossRef]
- Leung, M.F.; Wang, J. Cardinality-constrained portfolio selection based on collaborative neurodynamic optimization. Neural Netw. 2022, 145, 68–79. [Google Scholar] [CrossRef] [PubMed]
- Gu, P.; Tian, S.; Chen, Y. Iterative Learning Control Based on Nesterov Accelerated Gradient Method. IEEE Access 2019, 7, 115836–115842. [Google Scholar] [CrossRef]
- Kingma, D.P.; Ba, J. Adam: A method for stochastic optimization. arXiv 2014, arXiv:1412.6980. [Google Scholar]
- Ioffe, S.; Szegedy, C. Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift. arXiv 2015, arXiv:1502.03167. [Google Scholar]
- Bello, A.; Ng, S.-C.; Leung, M.-F. A BERT Framework to Sentiment Analysis of Tweets. Sensors 2023, 23, 506. [Google Scholar] [CrossRef] [PubMed]
Dataset Name | #Items | Percent | F1 (Trivial) | F1 (Rand) | F1 (VSD Min) | F1 (VSD Ave) | F1 (VSD Max) | F1 (VSD Min-Rand) | F1 (VSD Ave-Rand) | F1 (VSD Max-Rand) |
---|---|---|---|---|---|---|---|---|---|---|
agnews | 15 | 0.0019737 | 0.25 | 0.2002 | 0.3228 | 0.5219 | 0.6358 | 0.1226 | 0.3217 | 0.4356 |
amazonpolar | 15 | 3.75 × 10^−5 | 0.5 | 0.3702 | 0.3425 | 0.5259 | 0.6999 | −0.0277 | 0.1556 | 0.3296 |
dbpedia | 15 | 0.0002143 | 0.0714 | 0.1766 | 0.2006 | 0.3052 | 0.3899 | 0.024 | 0.1286 | 0.2133 |
emotion | 15 | 0.0075 | 0.1667 | 0.1115 | 0.2069 | 0.2293 | 0.2613 | 0.0954 | 0.1178 | 0.1498 |
imdb | 15 | 0.0006 | 0.5 | 0.55 | 0.484 | 0.6614 | 0.7568 | −0.0659 | 0.1114 | 0.2069 |
yelp | 15 | 0.0003 | 0.2 | 0.1122 | 0.2269 | 0.2529 | 0.3048 | 0.1148 | 0.1408 | 0.1926 |
agnews | 25 | 0.0032895 | 0.25 | 0.3564 | 0.6411 | 0.7071 | 0.7356 | 0.2847 | 0.3508 | 0.3793 |
amazonpolar | 25 | 6.25 × 10^−5 | 0.5 | 0.3975 | 0.6152 | 0.6831 | 0.7443 | 0.2178 | 0.2856 | 0.3468 |
dbpedia | 25 | 0.0003571 | 0.0714 | 0.2002 | 0.364 | 0.4611 | 0.5514 | 0.1638 | 0.2609 | 0.3512 |
emotion | 25 | 0.0125 | 0.1667 | 0.1675 | 0.2343 | 0.2513 | 0.2805 | 0.0668 | 0.0838 | 0.113 |
imdb | 25 | 0.001 | 0.5 | 0.7555 | 0.7061 | 0.7439 | 0.7749 | −0.0494 | −0.0116 | 0.0194 |
yelp | 25 | 0.0005 | 0.2 | 0.1501 | 0.2229 | 0.2901 | 0.3296 | 0.0728 | 0.14 | 0.1795 |
agnews | 50 | 0.0065789 | 0.25 | 0.3747 | 0.6887 | 0.764 | 0.7933 | 0.314 | 0.3893 | 0.4186 |
amazonpolar | 50 | 0.000125 | 0.5 | 0.7615 | 0.6986 | 0.7393 | 0.7825 | −0.0629 | −0.0222 | 0.021 |
dbpedia | 50 | 0.0007143 | 0.0714 | 0.3767 | 0.5342 | 0.6205 | 0.6847 | 0.1574 | 0.2438 | 0.308 |
emotion | 50 | 0.025 | 0.1667 | 0.234 | 0.2581 | 0.2854 | 0.3171 | 0.0241 | 0.0514 | 0.0831 |
imdb | 50 | 0.002 | 0.5 | 0.5518 | 0.725 | 0.7491 | 0.7981 | 0.1732 | 0.1973 | 0.2463 |
yelp | 50 | 0.001 | 0.2 | 0.1907 | 0.3106 | 0.3361 | 0.3589 | 0.1199 | 0.1454 | 0.1682 |
agnews | 70 | 0.0092105 | 0.25 | 0.686 | 0.7384 | 0.7722 | 0.8027 | 0.0523 | 0.0861 | 0.1167 |
amazonpolar | 70 | 0.000175 | 0.5 | 0.6215 | 0.7615 | 0.7775 | 0.7926 | 0.14 | 0.156 | 0.1711 |
dbpedia | 70 | 0.001 | 0.0714 | 0.6304 | 0.7154 | 0.7532 | 0.7797 | 0.0851 | 0.1228 | 0.1494 |
emotion | 70 | 0.035 | 0.1667 | 0.2284 | 0.2912 | 0.3144 | 0.3532 | 0.0629 | 0.086 | 0.1248 |
imdb | 70 | 0.0028 | 0.5 | 0.6959 | 0.7356 | 0.7709 | 0.7924 | 0.0396 | 0.0749 | 0.0965 |
yelp | 70 | 0.0014 | 0.2 | 0.3397 | 0.3512 | 0.3706 | 0.4004 | 0.0115 | 0.0309 | 0.0607 |
agnews | 100 | 0.0131579 | 0.25 | 0.6515 | 0.7909 | 0.8005 | 0.8194 | 0.1394 | 0.149 | 0.1679 |
amazonpolar | 100 | 0.00025 | 0.5 | 0.7612 | 0.7697 | 0.7852 | 0.8049 | 0.0085 | 0.024 | 0.0437 |
dbpedia | 100 | 0.0014286 | 0.0714 | 0.5767 | 0.8067 | 0.8254 | 0.8538 | 0.23 | 0.2487 | 0.2771 |
emotion | 100 | 0.05 | 0.1667 | 0.3133 | 0.2981 | 0.3223 | 0.3429 | −0.0152 | 0.009 | 0.0296 |
imdb | 100 | 0.004 | 0.5 | 0.7312 | 0.7404 | 0.7638 | 0.7783 | 0.0092 | 0.0326 | 0.0471 |
yelp | 100 | 0.002 | 0.2 | 0.3616 | 0.3499 | 0.3778 | 0.3936 | −0.0117 | 0.0162 | 0.032 |
agnews | 200 | 0.0263158 | 0.25 | 0.8166 | 0.7887 | 0.8047 | 0.8314 | −0.0279 | −0.0118 | 0.0148 |
amazonpolar | 200 | 0.0005 | 0.5 | 0.7259 | 0.8026 | 0.8162 | 0.8338 | 0.0767 | 0.0904 | 0.108 |
dbpedia | 200 | 0.0028571 | 0.0714 | 0.77 | 0.8874 | 0.8968 | 0.9134 | 0.1174 | 0.1268 | 0.1434 |
emotion | 200 | 0.1 | 0.1667 | 0.3619 | 0.3158 | 0.3451 | 0.3659 | −0.0461 | −0.0168 | 0.004 |
imdb | 200 | 0.008 | 0.5 | 0.8139 | 0.7795 | 0.7864 | 0.7938 | −0.0344 | −0.0275 | −0.0201 |
yelp | 200 | 0.004 | 0.2 | 0.3788 | 0.3804 | 0.3956 | 0.4087 | 0.0016 | 0.0168 | 0.0299 |
agnews | 300 | 0.0394737 | 0.25 | 0.8077 | 0.8077 | 0.8257 | 0.8468 | −0.0001 | 0.018 | 0.039 |
amazonpolar | 300 | 0.00075 | 0.5 | 0.8344 | 0.8216 | 0.8391 | 0.8519 | −0.0128 | 0.0047 | 0.0174 |
dbpedia | 300 | 0.0042857 | 0.0714 | 0.8926 | 0.9048 | 0.9182 | 0.9269 | 0.0122 | 0.0256 | 0.0343 |
emotion | 300 | 0.15 | 0.1667 | 0.3807 | 0.3537 | 0.3675 | 0.3876 | −0.027 | −0.0132 | 0.007 |
imdb | 300 | 0.012 | 0.5 | 0.7767 | 0.7714 | 0.789 | 0.8036 | −0.0053 | 0.0123 | 0.0269 |
yelp | 300 | 0.006 | 0.2 | 0.3996 | 0.408 | 0.4186 | 0.4282 | 0.0084 | 0.019 | 0.0286 |
agnews | 500 | 0.0657895 | 0.25 | 0.8327 | 0.8167 | 0.832 | 0.8401 | −0.0159 | −0.0006 | 0.0075 |
amazonpolar | 500 | 0.00125 | 0.5 | 0.8355 | 0.8282 | 0.8384 | 0.8452 | −0.0074 | 0.0028 | 0.0096 |
dbpedia | 500 | 0.0071429 | 0.0714 | 0.902 | 0.9266 | 0.9389 | 0.9464 | 0.0246 | 0.0369 | 0.0444 |
emotion | 500 | 0.25 | 0.1667 | 0.4049 | 0.3811 | 0.3987 | 0.4088 | −0.0238 | −0.0062 | 0.0038 |
imdb | 500 | 0.02 | 0.5 | 0.8126 | 0.8092 | 0.811 | 0.8173 | −0.0034 | −0.0016 | 0.0047 |
yelp | 500 | 0.01 | 0.2 | 0.4317 | 0.4299 | 0.4381 | 0.4432 | −0.0017 | 0.0064 | 0.0115 |
agnews | 700 | 0.0921053 | 0.25 | 0.8477 | 0.8275 | 0.8427 | 0.8534 | −0.0202 | −0.005 | 0.0058 |
amazonpolar | 700 | 0.00175 | 0.5 | 0.8421 | 0.8373 | 0.8496 | 0.8638 | −0.0048 | 0.0076 | 0.0217 |
dbpedia | 700 | 0.01 | 0.0714 | 0.9239 | 0.9425 | 0.9484 | 0.9524 | 0.0186 | 0.0245 | 0.0285 |
emotion | 700 | 0.35 | 0.1667 | 0.464 | 0.4084 | 0.4222 | 0.4388 | −0.0556 | −0.0417 | −0.0251 |
imdb | 700 | 0.028 | 0.5 | 0.8334 | 0.8153 | 0.8212 | 0.8261 | −0.0182 | −0.0123 | −0.0073 |
yelp | 700 | 0.014 | 0.2 | 0.4501 | 0.4356 | 0.4446 | 0.4565 | −0.0145 | −0.0055 | 0.0064 |
agnews | 1000 | 0.1315789 | 0.25 | 0.8626 | 0.8407 | 0.8507 | 0.8592 | −0.0219 | −0.012 | −0.0034 |
amazonpolar | 1000 | 0.0025 | 0.5 | 0.85 | 0.8519 | 0.8588 | 0.8657 | 0.0019 | 0.0088 | 0.0157 |
dbpedia | 1000 | 0.0142857 | 0.0714 | 0.9482 | 0.9477 | 0.9545 | 0.962 | −0.0005 | 0.0063 | 0.0138 |
emotion | 1000 | 0.5 | 0.1667 | 0.4519 | 0.4304 | 0.4469 | 0.4565 | −0.0215 | −0.005 | 0.0046 |
imdb | 1000 | 0.04 | 0.5 | 0.843 | 0.8197 | 0.8274 | 0.8405 | −0.0233 | −0.0157 | −0.0025 |
yelp | 1000 | 0.02 | 0.2 | 0.4644 | 0.4447 | 0.4553 | 0.4697 | −0.0197 | −0.0091 | 0.0053 |
agnews | 2000 | 0.2631579 | 0.25 | 0.8688 | 0.8636 | 0.8668 | 0.8721 | −0.0053 | −0.002 | 0.0033 |
amazonpolar | 2000 | 0.005 | 0.5 | 0.8737 | 0.8722 | 0.8774 | 0.8839 | −0.0015 | 0.0037 | 0.0102 |
dbpedia | 2000 | 0.0285714 | 0.0714 | 0.9611 | 0.9623 | 0.965 | 0.9665 | 0.0012 | 0.0039 | 0.0054 |
emotion | 2000 | 1 | 0.1667 | 0.5049 | 0.478 | 0.4848 | 0.4945 | −0.0269 | −0.0201 | −0.0104 |
imdb | 2000 | 0.08 | 0.5 | 0.8507 | 0.8386 | 0.8472 | 0.8527 | −0.0121 | −0.0035 | 0.002 |
yelp | 2000 | 0.04 | 0.2 | 0.4857 | 0.4683 | 0.4776 | 0.4839 | −0.0174 | −0.0082 | −0.0018 |
agnews | 3000 | 0.3947368 | 0.25 | 0.8711 | 0.8689 | 0.8704 | 0.8721 | −0.0022 | −0.0007 | 0.0011 |
amazonpolar | 3000 | 0.0075 | 0.5 | 0.8822 | 0.8748 | 0.8822 | 0.8869 | −0.0074 | 0 | 0.0048 |
dbpedia | 3000 | 0.0428571 | 0.0714 | 0.9672 | 0.967 | 0.9699 | 0.9716 | −0.0001 | 0.0027 | 0.0045 |
emotion | 3000 | 1.5 | 0.1667 | 0.512 | 0.5063 | 0.5129 | 0.5188 | −0.0057 | 0.001 | 0.0068 |
imdb | 3000 | 0.12 | 0.5 | 0.839 | 0.8522 | 0.8541 | 0.8578 | 0.0132 | 0.0151 | 0.0187 |
yelp | 3000 | 0.06 | 0.2 | 0.4875 | 0.4853 | 0.4885 | 0.4931 | −0.0022 | 0.001 | 0.0056 |
agnews | 5000 | 0.6578947 | 0.25 | 0.8794 | 0.8736 | 0.8785 | 0.8865 | −0.0058 | −0.0009 | 0.0071 |
amazonpolar | 5000 | 0.0125 | 0.5 | 0.892 | 0.8813 | 0.889 | 0.8942 | −0.0107 | −0.003 | 0.0022 |
dbpedia | 5000 | 0.0714286 | 0.0714 | 0.9706 | 0.9713 | 0.972 | 0.973 | 0.0007 | 0.0014 | 0.0024 |
emotion | 5000 | 2.5 | 0.1667 | 0.5372 | 0.5241 | 0.534 | 0.543 | −0.0131 | −0.0031 | 0.0059 |
imdb | 5000 | 0.2 | 0.5 | 0.8703 | 0.8565 | 0.8642 | 0.8727 | −0.0138 | −0.0061 | 0.0024 |
yelp | 5000 | 0.1 | 0.2 | 0.504 | 0.4986 | 0.505 | 0.5137 | −0.0055 | 0.001 | 0.0097 |
agnews | 10,000 | 1.3157895 | 0.25 | 0.8861 | 0.8865 | 0.8884 | 0.8909 | 0.0003 | 0.0023 | 0.0048 |
amazonpolar | 10,000 | 0.025 | 0.5 | 0.9012 | 0.898 | 0.8999 | 0.9029 | −0.0032 | −0.0013 | 0.0017 |
dbpedia | 10,000 | 0.1428571 | 0.0714 | 0.9758 | 0.9743 | 0.9752 | 0.9761 | −0.0014 | −0.0006 | 0.0003 |
emotion | 10,000 | 5 | 0.1667 | 0.5621 | 0.5508 | 0.5567 | 0.5634 | −0.0112 | −0.0053 | 0.0013 |
imdb | 10,000 | 0.4 | 0.5 | 0.8787 | 0.8694 | 0.8744 | 0.8766 | −0.0093 | −0.0043 | −0.0021 |
yelp | 10,000 | 0.2 | 0.2 | 0.528 | 0.5232 | 0.5284 | 0.5332 | −0.0047 | 0.0004 | 0.0052 |
agnews | 16,000 | 2.1052632 | 0.25 | 0.8977 | 0.8927 | 0.895 | 0.8983 | −0.005 | −0.0027 | 0.0006 |
amazonpolar | 16,000 | 0.04 | 0.5 | 0.9042 | 0.9029 | 0.9057 | 0.9074 | −0.0012 | 0.0016 | 0.0032 |
dbpedia | 16,000 | 0.2285714 | 0.0714 | 0.9778 | 0.9779 | 0.9781 | 0.9782 | 0.0001 | 0.0003 | 0.0004 |
emotion | 16,000 | 8 | 0.1667 | 0.5841 | 0.5743 | 0.5844 | 0.5958 | −0.0099 | 0.0003 | 0.0116 |
imdb | 16,000 | 0.64 | 0.5 | 0.8802 | 0.8808 | 0.8844 | 0.8869 | 0.0006 | 0.0041 | 0.0066 |
yelp | 16,000 | 0.32 | 0.2 | 0.5412 | 0.5387 | 0.5426 | 0.5463 | −0.0025 | 0.0014 | 0.0051 |
agnews | 20,000 | 2.6315789 | 0.25 | 0.8944 | 0.8949 | 0.8964 | 0.8986 | 0.0005 | 0.002 | 0.0041 |
amazonpolar | 20,000 | 0.05 | 0.5 | 0.9098 | 0.9055 | 0.9077 | 0.9107 | −0.0043 | −0.0021 | 0.0009 |
dbpedia | 20,000 | 0.2857143 | 0.0714 | 0.9791 | 0.9785 | 0.979 | 0.9794 | −0.0005 | −0.0001 | 0.0004 |
imdb | 20,000 | 0.8 | 0.5 | 0.8844 | 0.8797 | 0.8831 | 0.8846 | −0.0047 | −0.0013 | 0.0002 |
yelp | 20,000 | 0.4 | 0.2 | 0.5511 | 0.548 | 0.5502 | 0.5556 | −0.0031 | −0.0009 | 0.0044 |
agnews | 25,000 | 3.2894737 | 0.25 | 0.9004 | 0.8952 | 0.8982 | 0.9005 | −0.0052 | −0.0021 | 0.0002 |
amazonpolar | 25,000 | 0.0625 | 0.5 | 0.9106 | 0.9092 | 0.9103 | 0.912 | −0.0014 | −0.0003 | 0.0015 |
dbpedia | 25,000 | 0.3571429 | 0.0714 | 0.9798 | 0.9793 | 0.9798 | 0.9804 | −0.0005 | 0 | 0.0006 |
imdb | 25,000 | 1 | 0.5 | 0.8863 | 0.8842 | 0.8866 | 0.8879 | −0.0021 | 0.0003 | 0.0016 |
yelp | 25,000 | 0.5 | 0.2 | 0.5537 | 0.5476 | 0.5539 | 0.5568 | −0.0061 | 0.0002 | 0.003 |
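In the table above, the last three columns are the corresponding VSD F1 scores minus the random-sampling F1 in the same row (up to rounding of the displayed values); for example, in the first agnews row, 0.3228 − 0.2002 = 0.1226. The short pandas sketch below, with column names assumed to match the header, reproduces these derived columns from the first two data rows.

```python
# Sketch: recompute the "VSD − Rand" difference columns from the raw F1 columns.
# Column names are assumed to match the table header above; values are the
# first two data rows (agnews and amazonpolar at 15 items).
import pandas as pd

df = pd.DataFrame({
    "F1 (Rand)":    [0.2002, 0.3702],
    "F1 (VSD Min)": [0.3228, 0.3425],
    "F1 (VSD Ave)": [0.5219, 0.5259],
    "F1 (VSD Max)": [0.6358, 0.6999],
})
for col in ["Min", "Ave", "Max"]:
    df[f"F1 (VSD {col}-Rand)"] = (df[f"F1 (VSD {col})"] - df["F1 (Rand)"]).round(4)
print(df)
```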