Prediction Model for Shopping Mall Future Development by Deep Learning and Machine Learning Engine using Anaconda environment

Authors (2) : Prof. Akshat Khaskalam, Ruchi Soni

Machine Learning (ML) software, which implements ML algorithms, is widely used in application domains such as finance, business, and engineering. Faults in an ML implementation can cause substantial losses in these domains, so it is critical to test ML software effectively in order to detect and eliminate its faults. Testing ML software is difficult, however, especially when it comes to producing the expected prediction needed to check behavioural correctness (for example, via expected properties or expected outputs). To tackle this problem, this work presents a novel approach for testing implementations of supervised learning. The insight underlying the approach is that multiple independently written implementations of a supervised learning algorithm can exist, and a majority of them are likely to produce the expected output for a test input, even if none of the implementations is fault-free. In particular, the proposed approach derives the expected output for a test input by running it on n implementations of the supervised learning algorithm and taking the common output produced by a majority (determined by a percentage threshold) of these n implementations. The approach includes techniques to address challenges specific to supervised learning, such as defining the dataset and resolving inconsistent algorithm configurations across implementations. In addition, to improve the dependability of supervised learning during in-field usage while incurring low runtime overhead, the approach includes an efficient implementation technique. Evaluations show that the approach is effective in detecting real faults in real-world ML software, including Naïve Bayes and k-nearest neighbour implementations, and that the proposed technique substantially reduces the need to run many implementations while maintaining high prediction accuracy.
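The majority-output idea described above can be sketched in a few lines of Python. This is a minimal illustration, not the paper's actual system: the two pure-Python KNN variants, the function names, the toy dataset, and the 50% threshold are all assumptions introduced here to show how a common output from n implementations could serve as the expected output for a test input.

```python
from collections import Counter
import heapq

def knn_sort(train, labels, x, k=3):
    # Implementation 1: sort all training points by squared Euclidean
    # distance to x and vote among the k nearest labels.
    order = sorted(range(len(train)),
                   key=lambda i: sum((a - b) ** 2 for a, b in zip(train[i], x)))
    votes = Counter(labels[i] for i in order[:k])
    return votes.most_common(1)[0][0]

def knn_heap(train, labels, x, k=3):
    # Implementation 2: the same algorithm, independently written using
    # a k-smallest heap selection instead of a full sort.
    nearest = heapq.nsmallest(
        k, range(len(train)),
        key=lambda i: sum((a - b) ** 2 for a, b in zip(train[i], x)))
    votes = Counter(labels[i] for i in nearest)
    return votes.most_common(1)[0][0]

def majority_oracle(implementations, train, labels, x, threshold=0.5):
    # Run the test input on all n implementations and return the output
    # produced by more than `threshold` of them; None if no such majority
    # exists (i.e., no expected output can be derived for this input).
    outputs = Counter(impl(train, labels, x) for impl in implementations)
    label, count = outputs.most_common(1)[0]
    return label if count / len(implementations) > threshold else None

# Toy test input: two well-separated clusters.
train = [(1.0, 1.0), (1.2, 0.8), (8.0, 8.0), (8.2, 7.9)]
labels = ["a", "a", "b", "b"]
expected = majority_oracle([knn_sort, knn_heap], train, labels, (1.1, 0.9))
print(expected)  # both implementations agree on "a"
```

Even if one implementation contained a fault, a disagreement with the majority on some test input would flag that implementation for inspection, which is the fault-detection mechanism the abstract describes.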

Authors and Affiliations

Prof. Akshat Khaskalam
Takshshila Institute of Engineering and Technology, Jabalpur, Madhya Pradesh, India
Ruchi Soni
Takshshila Institute of Engineering and Technology, Jabalpur, Madhya Pradesh, India

Keywords : ML, Python, Naïve Bayes, KNN, Supervised Learning.

Publication Details

Published in : Volume 6 | Issue 2 | March-April 2019
Date of Publication : 2019-04-30
License : This work is licensed under a Creative Commons Attribution 4.0 International License.
Page(s) : 463-468
Manuscript Number : IJSRSET1962136
Publisher : Technoscience Academy

Print ISSN : 2395-1990, Online ISSN : 2394-4099

Cite This Article :

Prof. Akshat Khaskalam, Ruchi Soni, "Prediction Model for Shopping Mall Future Development by Deep Learning and Machine Learning Engine using Anaconda environment", International Journal of Scientific Research in Science, Engineering and Technology (IJSRSET), Print ISSN : 2395-1990, Online ISSN : 2394-4099, Volume 6, Issue 2, pp. 463-468, March-April 2019.
Journal URL : https://ijsrset.com/IJSRSET1962136