International Journal of Innovative Research in Computer and Communication Engineering

ISSN Approved Journal | Impact factor: 8.771 | ESTD: 2013 | Follows UGC CARE Journal Norms and Guidelines



TITLE An Adaptive Machine Learning Framework for Automated Pattern Recognition in Large-Scale Datasets
ABSTRACT This paper presents an adaptive pattern recognition framework that combines deep learning-based feature extraction with classical machine learning techniques for effective classification of structured data. The proposed system employs an autoencoder to learn compact latent representations of the input features, transforming raw data into a more informative feature space. These learned features are then passed to a Random Forest classifier to perform accurate and robust classification. A synthetic dataset drawn from Gaussian distributions is generated to simulate structured patterns with controlled overlap between classes, allowing a realistic evaluation of the model. Data preprocessing is carried out using standard scaling to normalize feature distributions and improve training stability. The autoencoder is trained in an unsupervised manner by minimizing the Mean Squared Error reconstruction loss, facilitating efficient feature learning, while the Random Forest leverages ensemble learning to enhance classification performance and reduce overfitting. The system is evaluated using performance metrics such as accuracy, precision, recall, F1-score, and the confusion matrix, demonstrating strong classification capability under moderately overlapping class conditions. Furthermore, the framework follows a batch learning approach, in which the model is updated through periodic retraining on new data, making it suitable for applications involving evolving data patterns while maintaining computational efficiency.
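The pipeline described in the abstract can be sketched in a few lines of scikit-learn. This is a minimal, illustrative reconstruction, not the authors' implementation: the paper does not state its dataset parameters, autoencoder architecture, or hyperparameters, so the class means, the 8-unit latent layer, and the use of `MLPRegressor` (trained to reconstruct its input under MSE loss, standing in for the autoencoder) are all assumptions.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, confusion_matrix

rng = np.random.default_rng(42)

# Synthetic Gaussian dataset with controlled class overlap (illustrative parameters)
n, d = 1000, 20
X0 = rng.normal(loc=0.0, scale=1.0, size=(n, d))   # class 0
X1 = rng.normal(loc=1.0, scale=1.0, size=(n, d))   # class 1, shifted mean -> moderate overlap
X = np.vstack([X0, X1])
y = np.array([0] * n + [1] * n)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

# Standard scaling, fit on training data only
scaler = StandardScaler().fit(X_tr)
X_tr_s, X_te_s = scaler.transform(X_tr), scaler.transform(X_te)

# Autoencoder stand-in: an MLP regressor trained to reconstruct its input
# (unsupervised, MSE loss); the 8-unit hidden layer is the latent representation.
ae = MLPRegressor(hidden_layer_sizes=(8,), activation="tanh",
                  max_iter=2000, random_state=0)
ae.fit(X_tr_s, X_tr_s)

def encode(X_scaled):
    # Forward pass through the encoder half only (first weight matrix + bias)
    return np.tanh(X_scaled @ ae.coefs_[0] + ae.intercepts_[0])

# Random Forest classifies in the learned latent space
rf = RandomForestClassifier(n_estimators=100, random_state=0)
rf.fit(encode(X_tr_s), y_tr)

pred = rf.predict(encode(X_te_s))
acc = accuracy_score(y_te, pred)
print("accuracy:", round(acc, 3))
print("confusion matrix:\n", confusion_matrix(y_te, pred))
```

Periodic batch retraining, as the abstract describes, would simply repeat `scaler.fit`, `ae.fit`, and `rf.fit` on the accumulated data at each update interval.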
AUTHOR N. DURGA PRASANNA, K. SOMA SEKHAR, A. SAI PRAVEEN, K. MANOBHIRAM, B. ABHISHEK; Assistant Professor and B.Tech Students, Dept. of IT, Sir CR Reddy College of Engineering, Eluru, India
VOLUME 183
DOI 10.15680/IJIRCCE.2026.1404036
PDF pdf/36_An Adaptive Machine Learning Framework for Automated Pattern Recognition in Large-Scale Datasets.pdf
KEYWORDS
References [1] L. Breiman, “Random Forests,” Machine Learning, vol. 45, no. 1, pp. 5–32, 2001.
[2] G. E. Hinton and R. R. Salakhutdinov, “Reducing the Dimensionality of Data with Neural Networks,” Science, vol. 313, no. 5786, pp. 504–507, 2006.
[3] D. P. Kingma and M. Welling, “Auto-Encoding Variational Bayes,” arXiv preprint arXiv:1312.6114, 2014.
[4] F. Pedregosa et al., “Scikit-learn: Machine Learning in Python,” Journal of Machine Learning Research, vol. 12, pp. 2825–2830, 2011.
[5] M. Abadi et al., “TensorFlow: A System for Large-Scale Machine Learning,” in Proc. 12th USENIX Symposium on Operating Systems Design and Implementation (OSDI), 2016, pp. 265–283.
[6] I. Goodfellow, Y. Bengio, and A. Courville, Deep Learning. MIT Press, 2016.
[7] T. Hastie, R. Tibshirani, and J. Friedman, The Elements of Statistical Learning. Springer, 2009.
[8] J. Han, M. Kamber, and J. Pei, Data Mining: Concepts and Techniques. Morgan Kaufmann, 2011.
[9] S. Raschka and V. Mirjalili, Python Machine Learning. Packt Publishing, 2017.
[10] C. M. Bishop, Pattern Recognition and Machine Learning. Springer, 2006.
[11] A. Géron, Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow. O’Reilly, 2019.
[12] Z. Ghahramani, “Probabilistic Machine Learning and Artificial Intelligence,” Nature, vol. 521, pp. 452–459, 2015.
Copyright © IJIRCCE 2020. All rights reserved.