1. Pyrak, B.; Gubica, T.; Rogacka-Pyrak, K. Cyclodextrin nanosponges as bioenhancers of phytochemicals. Prospect. Pharm. Sci. 2024, 22(3), 170–177. DOI: 10.56782/pps.272
2. Zielińska-Pisklak, M.; Michalik, K.A.; Szeleszczuk, Ł. Complexes of Fat-Soluble Vitamins with Cyclodextrins. Int. J. Mol. Sci. 2025, 26(13), Art. No: 6110. DOI: 10.3390/ijms26136110
3. Araj, S.K.; Szeleszczuk, Ł. A Review on Cyclodextrins/Estrogens Inclusion Complexes. Int. J. Mol. Sci. 2023, 24(10), Art. No: 8780. DOI: 10.3390/ijms24108780
4. Pyrak, B.; Rogacka-Pyrak, K.; Gubica, T.; Szeleszczuk, Ł. Exploring Cyclodextrin-Based Nanosponges as Drug Delivery Systems: Understanding the Physicochemical Factors Influencing Drug Loading and Release Kinetics. Int. J. Mol. Sci. 2024, 25(6), Art. No: 3527. DOI: 10.3390/ijms25063527
5. Napiórkowska, E.; Szeleszczuk, Ł. Review of Applications of β-Cyclodextrin as a Chiral Selector for Effective Enantioseparation. Int. J. Mol. Sci. 2024, 25(18), Art. No: 10126. DOI: 10.3390/ijms251810126
6. Crini, G. A History of Cyclodextrins. Chem. Rev. 2014, 114(21), 10940–10975. DOI: 10.1021/cr500081p
7. Christoforides, E.; Andreou, A.; Koskina, P.; Bethanis, K. Selective Crystallization of trans-Nerolidol in β-Cyclodextrin: Crystal Structure and Molecular Dynamics Analysis. Crystals 2025, 15(9), Art. No: 802. DOI: 10.3390/cryst15090802
8. Napiórkowska, E.; Szeleszczuk, Ł. Conformational landscape of β-cyclodextrin: a computational resource for host-guest modeling in supramolecular systems. J. Comput. Aided Mol. Des. 2025, 39, Art. No: 117. DOI: 10.1007/s10822-025-00694-1
9. Gackowski, M.; Madriwala, B.; Studzińska, R.; Koba, M. Novel Isosteviol-Based FXa Inhibitors: Molecular Modeling, In Silico Design and Docking Simulation. Molecules 2023, 28(13), Art. No: 4977. DOI: 10.3390/molecules28134977
10. Spirande, E.; Miryashkin, T.; Kolmakov, A.; Shapeev, A. Automated prediction of thermodynamic properties via Bayesian free-energy reconstruction from molecular dynamics. Comput. Condens. Matter 2025, 45, Art. No: e01163. DOI: 10.1016/j.cocom.2025.e01163
11. Boczar, D.; Michalska, K. A Review of Machine Learning and QSAR/QSPR Predictions for Complexes of Organic Molecules with Cyclodextrins. Molecules 2024, 29(13), Art. No: 3159. DOI: 10.3390/molecules29133159
12. Gackowski, M.; Szewczyk-Golec, K.; Pluskota, R.; Koba, M.; Mądra-Gackowska, K.; Woźniak, A. Application of Multivariate Adaptive Regression Splines (MARSplines) for Predicting Antitumor Activity of Anthrapyrazole Derivatives. Int. J. Mol. Sci. 2022, 23(9), Art. No: 5132. DOI: 10.3390/ijms23095132
13. Gackowski, M.; Madriwala, B.; Koba, M. In silico design, docking simulation, and ANN-QSAR model for predicting the anticoagulant activity of thiourea isosteviol compounds as FXa inhibitors. Chem. Pap. 2023, 77, 7027–7044. DOI: 10.1007/s11696-023-02994-y
14. Todkar, R.; Shirote, P.; Mohite, S. In Silico Screening and DFT Analysis of Nelumbo nucifera Phytochemicals as Potential BACE-1 Inhibitors for Alzheimer’s disease. Prospect. Pharm. Sci. 2025, 23(4), 29–36. DOI: 10.56782/pps.379
15. Tahıl, G.; Delorme, F.; Le Berre, D.; Monflier, É.; Sayede, A.; Tilloy, S. Curated dataset of association constants between a cyclodextrin and a guest for machine learning. Chem. Data Collect. 2023, 45, Art. No: 101022. DOI: 10.1016/j.cdc.2023.101022
16. Mikolov, T.; Chen, K.; Corrado, G.; Dean, J. Efficient Estimation of Word Representations in Vector Space. arXiv 2013, arXiv:1301.3781. DOI: 10.48550/arXiv.1301.3781
17. Huertas, C.; Juárez-Ramírez, R.; Raymond, C. Heat Map based Feature Ranker: In Depth Comparison with Popular Methods. Intell. Data Anal. 2018, 22(5), 1009–1037. DOI: 10.3233/IDA-173481
18. Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O.; Blondel, M.; Prettenhofer, P.; Weiss, R.; Dubourg, V.; Vanderplas, J.; Passos, A.; Cournapeau, D.; Brucher, M.; Perrot, M.; Duchesnay, É. Scikit-learn: Machine Learning in Python. J. Mach. Learn. Res. 2011, 12(85), 2825–2830.
19. Chen, T.; Guestrin, C. XGBoost: A Scalable Tree Boosting System. Proc. KDD 2016, 2016, 785–794. DOI: 10.1145/2939672.2939785
20. Ke, G.; Meng, Q.; Finley, T.; Wang, T.; Chen, W.; Ma, W.; Ye, Q.; Liu, T.-Y. LightGBM: A Highly Efficient Gradient Boosting Decision Tree. Adv. Neural Inf. Process. Syst. 2017, 30, 3146–3154.
21. Prokhorenkova, L.; Gusev, G.; Vorobev, A.; Dorogush, A.V.; Gulin, A. CatBoost: Unbiased Boosting with Categorical Features. Adv. Neural Inf. Process. Syst. 2018, 31, 6638–6648.
22. Friedman, J.H. Greedy Function Approximation: A Gradient Boosting Machine. Ann. Stat. 2001, 29(5), 1189–1232. DOI: 10.1214/aos/1013203451
23. Breiman, L. Random Forests. Mach. Learn. 2001, 45, 5–32. DOI: 10.1023/A:1010933404324
24. Akiba, T.; Sano, S.; Yanase, T.; Ohta, T.; Koyama, M. Optuna: A Next-generation Hyperparameter Optimization Framework. Proc. KDD 2019, 2019, 2623–2631. DOI: 10.1145/3292500.3330701
25. Hyndman, R.J.; Koehler, A.B. Another Look at Measures of Forecast Accuracy. Int. J. Forecast. 2006, 22(4), 679–688. DOI: 10.1016/j.ijforecast.2006.03.001
26. Lundberg, S.M.; Lee, S.-I. A Unified Approach to Interpreting Model Predictions. Adv. Neural Inf. Process. Syst. 2017, 30, 4765–4774.