Simulation and Analysis of Mathematical Methods in Real-Time Engineering Applications


Written and edited by a group of renowned specialists in the field, this outstanding new volume addresses the primary computational techniques for developing new technologies in soft computing. It also covers the security, privacy, artificial intelligence, and practical approaches needed by engineers and scientists across all fields of science and technology, and it highlights current research intended to advance not only mathematics but every area of science, research, and development where these disciplines intersect. Because the book focuses on emerging concepts in machine learning, artificial intelligence algorithmic approaches, and soft computing techniques, it is an invaluable tool for researchers, academicians, data scientists, and technology developers.
The newest and most comprehensive volume in the area of mathematical methods for real-time engineering, this groundbreaking work belongs in any engineer's or scientist's library. Also suitable as a textbook, it serves both as a working handbook for the new hire or student and as a reference for the veteran engineer.


References

1. M. Cai, H.-S. Zhang and W. Wang, "A power control algorithm based on SIR with multiplicative stochastic noise," Proceedings of 2004 International Conference on Machine Learning and Cybernetics, Shanghai, China, 2004, pp. 2743-2746, vol. 5, doi: 10.1109/ICMLC.2004.1378324.

2. M. Muselli, A. Bertoni, M. Frasca, A. Beghini, F. Ruffino and G. Valentini, "A Mathematical Model for the Validation of Gene Selection Methods," IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol. 8, no. 5, pp. 1385-1392, Sept.-Oct. 2011, doi: 10.1109/TCBB.2010.83.

3. N. Dutta, U. Subramaniam and P. Sanjeevikumar, "Mathematical models of classification algorithm of Machine learning," 2018, doi: 10.5339/qproc.2019.imat3e2018.3.

4. G. Guo, H. Wang, D. Bell, Y. Bi and K. Greer, "KNN Model-Based Approach in Classification," School of Computing and Mathematics, University of Ulster, Newtownabbey BT37 0QB, Northern Ireland, UK.

5. S. Ji, L. T. Watson and L. Carin, "Semi-supervised Learning of Hidden Markov Models via a Homotopy Method," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 31, no. 2, pp. 275-287, Feb. 2009, doi: 10.1109/TPAMI.2008.71.

6. M. Pavithra, R. Rajmohan, T. A. Kumar and R. Ramya, "Prediction and Classification of Breast Cancer Using Discriminative Learning Models and Techniques," Machine Vision Inspection Systems, Volume 2: Machine Learning-Based Approaches, pp. 241-262, 2021.

7. S. Mohanty and H. N. Das Bebartta, "Performance Comparison of SVM and K-NN for Oriya Character Recognition," (IJACSA) International Journal of Advanced Computer Science and Applications, Special Issue on Image Processing and Analysis, 2011, pp. 112-115.

8. D. Bouchoffra and F. Ykhlef, "Mathematical models for machine learning and pattern recognition," 2013 8th International Workshop on Systems, Signal Processing and their Applications (WoSSPA), Algiers, 2013, pp. 27-30, doi: 10.1109/WoSSPA.2013.6602331.

9. R. Veena, S. Fauziah, S. Mathew, I. Petra and J. Hazra, "Data driven models for understanding the wind farm wake propagation pattern," 2016 International Conference on Cogeneration, Small Power Plants and District Energy (ICUE), Bangkok, 2016, pp. 1-5, doi: 10.1109/COGEN.2016.7728969.

10. H. J. Vishnukumar, B. Butting, C. Müller and E. Sax, "Machine learning and deep neural network — Artificial intelligence core for lab and real-world test and validation for ADAS and autonomous vehicles: AI for efficient and quality test and validation," 2017 Intelligent Systems Conference (IntelliSys), London, 2017, pp. 714-721, doi: 10.1109/IntelliSys.2017.8324372.

11. A. Salaün, Y. Petetin and F. Desbouvries, "Comparing the Modeling Powers of RNN and HMM," 2019 18th IEEE International Conference on Machine Learning and Applications (ICMLA), Boca Raton, FL, USA, 2019, pp. 1496-1499, doi: 10.1109/ICMLA.2019.00246.

12. S. A. Selvi, T. A. Kumar, R. S. Rajesh and M. A. T. Ajisha, "An Efficient Communication Scheme for Wi-Li-Fi Network Framework," 2019 Third International Conference on I-SMAC (IoT in Social, Mobile, Analytics and Cloud) (I-SMAC), Palladam, India, 2019, pp. 697-701, doi: 10.1109/I-SMAC47947.2019.9032650.

13. A. S. Fokas, N. Dikaios and G. A. Kastis, "Mathematical models and deep learning for predicting the number of individuals reported to be infected with SARS-CoV-2."

14. M. A. Bahloul, A. Chahid and T.-M. Laleg-Kirati, "Fractional-Order SEIQRDP Model for Simulating the Dynamics of COVID-19 Epidemic," IEEE Open Journal of Engineering in Medicine and Biology, vol. 1, pp. 249-256, 2020, doi: 10.1109/OJEMB.2020.3019758.

15. Y. Yang, W. Yu and D. Chen, "Prediction of COVID-19 spread via LSTM and the deterministic SEIR model," 2020 39th Chinese Control Conference (CCC), Shenyang, China, 2020, pp. 782-785, doi: 10.23919/CCC50068.2020.9189012.

1 *Corresponding author: akshathay@presidencyuniversity.in

2 †Corresponding author: pravinthraja@gmail.com

2

Edge Computing Optimization Using Mathematical Modeling, Deep Learning Models, and Evolutionary Algorithms

P. Vijayakumar*, Prithiviraj Rajalingam and S. V. K. R. Rajeswari

ECE Department, SRMIST, Kattankulathur, Chennai, India

Abstract

The rapid growth of the Internet of Things (IoT), with its advanced applications, requires high-speed, real-time computing power. Edge computing brings the computation of data closer to the machines where the data are collected, which decreases latency, bandwidth usage, and server resources and their cost. The significant challenges in edge computing are 1) optimal offloading decision making, 2) resource allocation, and 3) meeting Quality-of-Service (QoS) and Quality-of-Experience (QoE) requirements. This chapter addresses these challenges using mathematical models, deep learning, and evolutionary algorithms. A deep learning algorithm solves a highly complex problem by developing a model from training data or from observation (reinforcement learning); the deep learning approach converts an edge-computing optimization problem into a classification, regression, or intelligent decision-making problem and solves it. An evolutionary algorithm finds an optimal solution to a given problem through a process modeled on natural evolution, which makes it suitable for the multi-objective optimization problems of edge computing. Evolutionary algorithms such as the genetic algorithm and ant colony optimization can solve several edge-computing research problems, such as task scheduling.

Keywords: Edge computing, deep learning, machine learning, evolutionary algorithm

2.1 Introduction to Edge Computing and Research Challenges

Edge computing is a new distributed computing paradigm that places computation and data storage close to the data source, before any work is handed to the cloud. In simpler terms, edge computing works with smaller, real-time data, whereas the cloud works with big data. Edge computing enables quick response times and also saves bandwidth [1, 2]. In the use case of cloud-based augmented reality applications, latency and processing limitations are key obstacles to a purely cloud-based implementation because of the geographical distance from the infrastructure. Edge computing enters the picture as an advancement over cloud gaming, since it allows data to travel only a short distance, reducing lag times and latency [3]. Most often, edge computing helps cloud-based IoT systems provide computational services. A short recap of the cloud-based IoT system is provided below.
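The latency trade-off described above can be made concrete with the classic offloading comparison: a task is worth sending to an edge server only when transmission delay plus remote execution time beats local execution time. The sketch below is a hypothetical illustration; the CPU frequencies, bandwidth, and task sizes are invented for this example and are not taken from the chapter.

```python
# Hypothetical sketch of the basic offloading trade-off: run a task on the
# device, or ship it to a (faster) edge server over a limited link?

def local_time(cycles, f_local):
    """Execution time on the device's own CPU (seconds)."""
    return cycles / f_local

def offload_time(data_bits, bandwidth, cycles, f_edge):
    """Transmission delay plus execution time on the edge server."""
    return data_bits / bandwidth + cycles / f_edge

def should_offload(cycles, data_bits, f_local=1e9, f_edge=8e9, bandwidth=20e6):
    """Offload only when the edge path is strictly faster end to end."""
    return offload_time(data_bits, bandwidth, cycles, f_edge) < local_time(cycles, f_local)

# A compute-heavy, data-light task favours offloading:
print(should_offload(cycles=5e9, data_bits=1e6))   # prints True
# A data-heavy, compute-light task favours local execution:
print(should_offload(cycles=1e8, data_bits=1e9))   # prints False
```

The comparison illustrates why compute-heavy but data-light tasks (such as scene analysis for augmented reality) are good offloading candidates, while data-heavy, compute-light tasks may be cheaper to process on the device.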

2.1.1 Cloud-Based IoT and Need of Edge Computing

The Internet of Things (IoT) plays a vital role in daily human life by connecting devices through the internet so that they work together intelligently. Day by day, the IoT takes on a more crucial role in every domain [4]. For example, the IoT provides excellent service to medical applications: patient status, heart rate, blood pressure, and blood sugar level can be tracked, and if a patient enters a critical or unstable condition, the doctor can intervene based on the report generated by the IoT application [6]. IoT data can also be used to study different patients' lifestyles and activities to keep them from reaching a critical state. The IoT has therefore opened opportunities to deliver intelligent solutions rich in prediction and insight.

IoT devices function effectively because of several supporting technologies, chief among them cloud computing, which gives IoT deployments storage infrastructure, real-time processing of IoT data, and high-performance computing. This has made cloud computing a revolutionary part of IoT systems, providing smart, predictive handling of data [6]. With the evolution of IoT devices, cloud providers have taken great advantage of the opportunity to carry the communication and transfer of data between IoT devices. The result is the Cloud of Things, which joins cloud computing and IoT devices.
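As a forward pointer to the evolutionary approach named in the abstract, the sketch below shows a minimal genetic algorithm that assigns IoT tasks to edge nodes so that the busiest node finishes as early as possible (makespan minimization). All task costs, node counts, and algorithm parameters are invented for illustration and are not from this chapter.

```python
import random

# Toy instance: processing cost of each task, and the number of edge nodes.
TASK_COST = [4, 2, 7, 1, 5, 3, 6, 2]
NUM_NODES = 3

def makespan(assignment):
    """Total load of the most heavily loaded edge node (lower is better)."""
    load = [0] * NUM_NODES
    for task, node in enumerate(assignment):
        load[node] += TASK_COST[task]
    return max(load)

def evolve(pop_size=30, generations=60, mutation_rate=0.1, seed=0):
    """Minimal GA: truncation selection, one-point crossover, per-gene mutation."""
    rng = random.Random(seed)
    # Each chromosome maps task index -> node index.
    pop = [[rng.randrange(NUM_NODES) for _ in TASK_COST] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=makespan)            # fittest (smallest makespan) first
        survivors = pop[: pop_size // 2]  # keep the better half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(TASK_COST))
            child = a[:cut] + b[cut:]     # one-point crossover
            for i in range(len(child)):   # occasional random reassignment
                if rng.random() < mutation_rate:
                    child[i] = rng.randrange(NUM_NODES)
            children.append(child)
        pop = survivors + children
    return min(pop, key=makespan)

best = evolve()
print(best, makespan(best))
```

With a total load of 30 cycles over 3 nodes, a perfectly balanced schedule has makespan 10, and the GA converges close to that bound on this toy instance; the same chromosome-and-fitness framing carries over to the richer edge-scheduling objectives discussed later in the chapter.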
