
Dr. Saurabh Singh
Department of Industrial & Systems Engineering, Dongguk University, Seoul 999007, Korea

Research Keywords & Expertise

Artificial Intelligence
Cloud Computing
Cryptography
Blockchain
IoT Security

Fingerprints

Blockchain
Artificial Intelligence
Cloud Computing


Feed

Journal article
Published: 06 August 2021 in IEEE Transactions on Industrial Informatics

Critical infrastructure comprising on-demand devices, including secondary servers, comes into play in situations such as overload. The on-demand servers and devices require smart management solutions that form an integral part of the Artificial Intelligence of Things (AIoT). This work considers AIoT as a combination of Mobile Internet of Things (M-IoT) and AI requiring immediate response, a secondary support system, and computational resources. Privacy in AIoT is always a concern when sharing information, as intruders can eavesdrop on the settings of the system. This paper uses an osmotic computing paradigm, which enables the derivation of strategies to decide on the methods of sharing services via optimal and privacy-aware resource management in AIoT. A safety competition is built on top of configuration rewards that helps to attain privacy-by-design. The contributions of this work are expressed using theoretical analysis and numerical simulations.

ACS Style

Vishal Sharma; Teik Guan Tan; Saurabh Singh; P K Sharma. Optimal and privacy-aware resource management in AIoT using osmotic computing. IEEE Transactions on Industrial Informatics 2021, PP, 1-1.

AMA Style

Vishal Sharma, Teik Guan Tan, Saurabh Singh, P K Sharma. Optimal and privacy-aware resource management in AIoT using osmotic computing. IEEE Transactions on Industrial Informatics. 2021; PP(99):1-1.

Chicago/Turabian Style

Vishal Sharma; Teik Guan Tan; Saurabh Singh; P K Sharma. 2021. "Optimal and privacy-aware resource management in AIoT using osmotic computing." IEEE Transactions on Industrial Informatics PP, no. 99: 1-1.

Journal article
Published: 03 August 2021 in IEEE Access

The fifth generation (5G) of cellular technology is currently being deployed around the world. In the next decade of mobile networks, beyond-5G (B5G) cellular networks with under-development advanced technology enablers are expected to become a fully developed system that could offer tremendous opportunities for both enterprises and society at large. In more ambitious scenarios, B5G will be capable of facilitating much-improved performance with significant upgrades of key parameters such as massive connectivity, ultra-reliable low latency (URLL), spectral efficiency (SE), and energy efficiency (EE). Equipping non-orthogonal multiple access (NOMA) with other key drivers will help explore the system's applicability to a wide variety of applications and forge a path for future networks. NOMA empowers networks with seamless connectivity and can provide a secure transmission strategy for the industrial Internet of Things (IIoT) anywhere and anytime. Despite NOMA being a promising candidate for B5G networks, a comprehensive study that covers its operating principles, fundamental features, and technological feasibility for mmWave massive MIMO communications is not available. To address this, a simulation-based comparative study between NOMA and orthogonal multiple access (OMA) techniques for mmWave massive multiple-input multiple-output (MIMO) communications is presented, with performance discussions and identification of technology gaps. Throughout the paper, aspects of the operating principles, fundamental features, and technological feasibility of NOMA are discussed. It is also demonstrated that NOMA not only has good adaptability but can also outperform OMA techniques for mmWave massive MIMO communications. Some foreseeable challenges and future directions for applying NOMA to B5G networks are also provided.
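As a rough numerical companion to the NOMA-versus-OMA comparison (illustrative only; the channel gains, power split, and normalized noise below are invented, not taken from the paper), a two-user downlink sketch shows why power-domain NOMA with successive interference cancellation (SIC) can beat orthogonal resource sharing in sum rate:

```python
import math

def noma_rates(p_total, g_near, g_far, alpha_far=0.8, noise=1.0):
    """Two-user downlink power-domain NOMA achievable rates (bps/Hz).

    The far (weak) user gets the larger power share alpha_far; the near
    (strong) user decodes and cancels the far user's signal via SIC.
    """
    p_far, p_near = alpha_far * p_total, (1 - alpha_far) * p_total
    # Far user treats the near user's signal as interference.
    r_far = math.log2(1 + p_far * g_far / (p_near * g_far + noise))
    # Near user applies SIC, so it sees no intra-cell interference.
    r_near = math.log2(1 + p_near * g_near / noise)
    return r_near, r_far

def oma_rates(p_total, g_near, g_far, noise=1.0):
    """Orthogonal access: each user gets half the resource, full power."""
    r_near = 0.5 * math.log2(1 + p_total * g_near / noise)
    r_far = 0.5 * math.log2(1 + p_total * g_far / noise)
    return r_near, r_far

# Strong near-user channel, weak far-user channel (toy values).
noma = sum(noma_rates(10.0, g_near=2.0, g_far=0.2))
oma = sum(oma_rates(10.0, g_near=2.0, g_far=0.2))
```

With these toy scalar values the NOMA sum rate exceeds the OMA sum rate; the paper carries out the comparison over mmWave massive MIMO channels rather than this single-antenna model.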

ACS Style

Joydev Ghosh; Vishal Sharma; Huseyin Haci; Saurabh Singh; In-Ho Ra. Performance Investigation of NOMA versus OMA Techniques for mmWave Massive MIMO Communications. IEEE Access 2021, PP, 1-1.

AMA Style

Joydev Ghosh, Vishal Sharma, Huseyin Haci, Saurabh Singh, In-Ho Ra. Performance Investigation of NOMA versus OMA Techniques for mmWave Massive MIMO Communications. IEEE Access. 2021; PP(99):1-1.

Chicago/Turabian Style

Joydev Ghosh; Vishal Sharma; Huseyin Haci; Saurabh Singh; In-Ho Ra. 2021. "Performance Investigation of NOMA versus OMA Techniques for mmWave Massive MIMO Communications." IEEE Access PP, no. 99: 1-1.

Review article
Published: 03 May 2021 in Archives of Computational Methods in Engineering

Plant disease detection is a critical issue that needs to be addressed for a productive agriculture sector and economy. Detecting plant disease using traditional methods is a tedious job, as it requires a tremendous amount of work, time, and expertise. Automatic plant disease detection is an important research area that has recently gained a lot of attention among academicians, researchers, and practitioners. Machine Learning and Deep Learning can help identify plant disease at the initial stage, as soon as it appears on plant leaves. In this state-of-the-art review, a thorough investigation has been performed to evaluate the possibility of using Machine Learning models to identify plant diseases. In this study, diseases and infections of four types of crops, i.e., tomato, rice, potato, and apple, are considered. Initially, numerous possible infections and diseases of these four kinds of crops are studied, along with the reasons for their occurrence and the symptoms for their detection. An in-depth study of the different steps involved in plant disease detection and classification using Machine Learning and Deep Learning is provided. Various datasets available online for plant disease detection are also presented. Along with this, a detailed study of various existing Machine Learning and Deep Learning-based classification models proposed by researchers across the world for the four considered crops is discussed in terms of their performance evaluations, the datasets used, and the feature extraction methods. Finally, various challenges in the use of Machine Learning and Deep Learning for plant disease detection and future research directions are enumerated and presented.

ACS Style

Javaid Ahmad Wani; Sparsh Sharma; Malik Muzamil; Suhaib Ahmed; Surbhi Sharma; Saurabh Singh. Machine Learning and Deep Learning Based Computational Techniques in Automatic Agricultural Diseases Detection: Methodologies, Applications, and Challenges. Archives of Computational Methods in Engineering 2021, 1-37.

AMA Style

Javaid Ahmad Wani, Sparsh Sharma, Malik Muzamil, Suhaib Ahmed, Surbhi Sharma, Saurabh Singh. Machine Learning and Deep Learning Based Computational Techniques in Automatic Agricultural Diseases Detection: Methodologies, Applications, and Challenges. Archives of Computational Methods in Engineering. 2021:1-37.

Chicago/Turabian Style

Javaid Ahmad Wani; Sparsh Sharma; Malik Muzamil; Suhaib Ahmed; Surbhi Sharma; Saurabh Singh. 2021. "Machine Learning and Deep Learning Based Computational Techniques in Automatic Agricultural Diseases Detection: Methodologies, Applications, and Challenges." Archives of Computational Methods in Engineering: 1-37.

Review
Published: 11 January 2021 in Sensors

Ensuring soil strength, as well as preliminary construction cost and duration prediction, is a crucial preliminary aspect of any construction project. Similarly, building strong structures is very important in geotechnical engineering to ensure the bearing capability of structures against external forces. Hence, in this first-of-its-kind state-of-the-art review, the capability of various artificial intelligence (AI)-based models for accurate prediction and estimation of preliminary construction cost, duration, and shear strength is explored. Initially, background on this revolutionary AI technology, along with its different models suited to geotechnical and construction engineering, is presented. Various existing works in the literature on the usage of AI-based models for the abovementioned construction and maintenance applications are presented, along with their advantages, limitations, and future work. Through analysis, various crucial input parameters with great impact on the estimation of preliminary construction cost, duration, and soil shear strength are enumerated and presented. Lastly, various challenges in using AI-based models for accurate predictions in these applications, as well as factors contributing to cost-overrun issues, are presented. This study can thus greatly assist civil engineers in efficiently using the capabilities of AI to solve complex and risk-sensitive tasks, and it can also be used in Internet of Things (IoT) environments for automated applications such as smart structural health-monitoring systems.

ACS Style

Sparsh Sharma; Suhaib Ahmed; Mohd Naseem; Waleed S. Alnumay; Saurabh Singh; Gi Hwan Cho. A Survey on Applications of Artificial Intelligence for Pre-Parametric Project Cost and Soil Shear-Strength Estimation in Construction and Geotechnical Engineering. Sensors 2021, 21, 463.

AMA Style

Sparsh Sharma, Suhaib Ahmed, Mohd Naseem, Waleed S. Alnumay, Saurabh Singh, Gi Hwan Cho. A Survey on Applications of Artificial Intelligence for Pre-Parametric Project Cost and Soil Shear-Strength Estimation in Construction and Geotechnical Engineering. Sensors. 2021; 21(2):463.

Chicago/Turabian Style

Sparsh Sharma; Suhaib Ahmed; Mohd Naseem; Waleed S. Alnumay; Saurabh Singh; Gi Hwan Cho. 2021. "A Survey on Applications of Artificial Intelligence for Pre-Parametric Project Cost and Soil Shear-Strength Estimation in Construction and Geotechnical Engineering." Sensors 21, no. 2: 463.

Journal article
Published: 10 August 2020 in IEEE Access

To survive in a competitive environment in this era of rapid technological advancement, most organizations have adopted component-based software development strategies together with proper utilization of cloud-based services, which facilitate continuous configuration, reduce complexity, and enable faster system delivery for higher user satisfaction in dynamic scenarios. In cloud services, customers select services from web applications dynamically. Healthcare body sensors are commonly used to diagnose and continuously monitor patients for emergency treatment. These healthcare devices are connected to mobile phones, laptops, and similar equipment in a cloud environment over a network, and their applications change frequently. Thus, organizations rely on regression testing during changes and implementation to validate the quality and reliability of the system after alteration. However, for a large application with limited resources and frequently changing component management activities in a cloud computing environment, component-based system verification is difficult and challenging due to irrelevant and redundant test cases and faults. In this study, we propose a test case selection and prioritization framework that uses a design pattern to increase the fault detection rate. First, we select test cases on frequently accessed components using observer patterns; second, we prioritize the selected test cases by adopting several strategies. The proposed framework was validated by an experiment and compared with other techniques (previous-faults-based and random priority). The experimental results show that the proposed framework successfully verified changes and achieved a higher fault detection rate (more than 90%) than the previous-faults-based and random-priority techniques (more than 80%).
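The paper's observer-pattern selection logic is not reproduced in this feed; the following minimal sketch (field names and the threshold are hypothetical) only illustrates the two-stage idea of selecting test cases that touch frequently accessed components and then ordering them so that likely fault-revealing cases run first:

```python
# Hypothetical test-case records: component access frequency and the
# number of faults each case detected in earlier regression cycles.
test_cases = [
    {"id": "TC1", "access_freq": 3, "past_faults": 0},
    {"id": "TC2", "access_freq": 9, "past_faults": 2},
    {"id": "TC3", "access_freq": 9, "past_faults": 5},
    {"id": "TC4", "access_freq": 1, "past_faults": 1},
]

def select_and_prioritize(cases, min_access=2):
    """Stage 1: keep cases touching frequently accessed components.
    Stage 2: order them so likely fault-revealing cases run first."""
    selected = [c for c in cases if c["access_freq"] >= min_access]
    return sorted(selected,
                  key=lambda c: (c["access_freq"], c["past_faults"]),
                  reverse=True)

order = [c["id"] for c in select_and_prioritize(test_cases)]
# TC4 is filtered out; TC3 outranks TC2 on past faults.
```

A real implementation would derive `access_freq` from observer notifications on changed components rather than static records.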

ACS Style

Sadia Ali; Yaser Hafeez; N. Z. Jhanjhi; Mamoona Humayun; Muhammad Imran; Anand Nayyar; Saurabh Singh; In-Ho Ra. Towards Pattern-Based Change Verification Framework for Cloud-Enabled Healthcare Component-Based. IEEE Access 2020, 8, 148007-148020.

AMA Style

Sadia Ali, Yaser Hafeez, N. Z. Jhanjhi, Mamoona Humayun, Muhammad Imran, Anand Nayyar, Saurabh Singh, In-Ho Ra. Towards Pattern-Based Change Verification Framework for Cloud-Enabled Healthcare Component-Based. IEEE Access. 2020; 8(99):148007-148020.

Chicago/Turabian Style

Sadia Ali; Yaser Hafeez; N. Z. Jhanjhi; Mamoona Humayun; Muhammad Imran; Anand Nayyar; Saurabh Singh; In-Ho Ra. 2020. "Towards Pattern-Based Change Verification Framework for Cloud-Enabled Healthcare Component-Based." IEEE Access 8, no. 99: 148007-148020.

Journal article
Published: 27 July 2020 in Applied Sciences

Health-related limitations prohibit humans from working in hazardous environments, so cognitive robots are needed to work there. A robot cannot learn the spatial semantics of an environment or object, which hinders it from interacting with its working environment. To overcome this problem, in this work an agent is computationally devised that mimics grid- and place-neuron functionality to learn cognitive maps from the input spatial data of an environment or object. A novel quadrant-based approach is proposed to model the behavior of the grid neuron, which, like a real grid neuron, is capable of generating periodic hexagonal grid-like output patterns from the input body movement. Furthermore, a cognitive map formation and learning mechanism is proposed using the place-grid neuron interaction system, which is meant for predicting environmental sensations from body movement. A place sequence learning system is also introduced, which acts like an episodic memory of a trip, is forgettable based on usage frequency, and helps reduce the accumulation of error during a visit to distant places. The model has been deployed and validated in two different spatial-data-learning applications: 2D object detection by touch, and navigation in an environment. The result analysis shows that the proposed model is significantly associated with the expected outcomes.
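The quadrant-based grid-neuron model itself is not public in this feed, but the classic idealized grid-cell model, a sum of three plane-wave cosines with wave vectors 60 degrees apart, produces the kind of periodic hexagonal output pattern being mimicked:

```python
import math

def grid_cell_rate(x, y, lam=1.0):
    """Idealized grid-neuron response at position (x, y).

    Sums three cosines whose wave vectors sit 60 degrees apart, yielding
    a hexagonally periodic firing field with spatial period lam along
    each wave direction. (This is the standard textbook stand-in, not
    the paper's quadrant-based variant.)
    """
    k = 2 * math.pi / lam
    rate = 0.0
    for theta in (-math.pi / 6, math.pi / 6, math.pi / 2):  # -30, 30, 90 deg
        rate += math.cos(k * (math.cos(theta) * x + math.sin(theta) * y))
    return rate

peak = grid_cell_rate(0.0, 0.0)                 # all three cosines maximal
shifted = grid_cell_rate(2 / math.sqrt(3), 0.0)  # one lattice step along x
```

Translating the position by a hexagonal lattice vector (here `(2*lam/sqrt(3), 0)`) leaves the rate unchanged, which is the periodicity a grid neuron encodes.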

ACS Style

Rahul Shrivastava; Prabhat Kumar; Sudhakar Tripathi; Vivek Tiwari; Dharmendra Singh Rajput; Thippa Reddy Gadekallu; Bhivraj Suthar; Saurabh Singh; In-Ho Ra. A Novel Grid and Place Neuron’s Computational Modeling to Learn Spatial Semantics of an Environment. Applied Sciences 2020, 10, 5147.

AMA Style

Rahul Shrivastava, Prabhat Kumar, Sudhakar Tripathi, Vivek Tiwari, Dharmendra Singh Rajput, Thippa Reddy Gadekallu, Bhivraj Suthar, Saurabh Singh, In-Ho Ra. A Novel Grid and Place Neuron’s Computational Modeling to Learn Spatial Semantics of an Environment. Applied Sciences. 2020; 10(15):5147.

Chicago/Turabian Style

Rahul Shrivastava; Prabhat Kumar; Sudhakar Tripathi; Vivek Tiwari; Dharmendra Singh Rajput; Thippa Reddy Gadekallu; Bhivraj Suthar; Saurabh Singh; In-Ho Ra. 2020. "A Novel Grid and Place Neuron’s Computational Modeling to Learn Spatial Semantics of an Environment." Applied Sciences 10, no. 15: 5147.

Journal article
Published: 01 July 2020 in Sustainable Cities and Society

In the digital era, the smart city can become an intelligent society by utilizing advances in emerging technologies. Specifically, the rapid adoption of blockchain technology has led to a paradigm shift toward a new digital smart city ecosystem. A broad spectrum of blockchain applications promise solutions for problems in areas ranging from risk management and financial services to cryptocurrency, and from the Internet of Things (IoT) to public and social services. Furthermore, the convergence of Artificial Intelligence (AI) and blockchain technology is revolutionizing the smart city network architecture to build sustainable ecosystems. However, these technological advancements bring both opportunities and challenges when it comes to achieving the goals of creating sustainable smart cities. This paper provides a comprehensive literature review of the security issues and problems that impact the deployment of blockchain systems in smart cities. It presents a detailed discussion of several key factors in the convergence of blockchain and AI technologies that will help form a sustainable smart society. We discuss blockchain security enhancement solutions, summarizing the key points that can be used for developing various blockchain-AI-based intelligent transportation systems. We also discuss open issues and our future research directions, including new security suggestions and guidelines for a sustainable smart city ecosystem.

ACS Style

Saurabh Singh; Pradip Kumar Sharma; ByungUn Yoon; Mohammad Shojafar; Gi Hwan Cho; In-Ho Ra. Convergence of blockchain and artificial intelligence in IoT network for the sustainable smart city. Sustainable Cities and Society 2020, 63, 102364.

AMA Style

Saurabh Singh, Pradip Kumar Sharma, ByungUn Yoon, Mohammad Shojafar, Gi Hwan Cho, In-Ho Ra. Convergence of blockchain and artificial intelligence in IoT network for the sustainable smart city. Sustainable Cities and Society. 2020; 63:102364.

Chicago/Turabian Style

Saurabh Singh; Pradip Kumar Sharma; ByungUn Yoon; Mohammad Shojafar; Gi Hwan Cho; In-Ho Ra. 2020. "Convergence of blockchain and artificial intelligence in IoT network for the sustainable smart city." Sustainable Cities and Society 63: 102364.

Journal article
Published: 23 June 2020 in IEEE Access

Blockchain is attracting increasing attention for its applicability in the Internet of Things (IoT). In particular, it can store data in unalterable blocks and, through its secure peer-to-peer design, address the growing problem of transaction authorization in industrial and service-provisioning applications. Moreover, it facilitates decentralized transaction (TX) validation and a distributed ledger. However, the underlying algorithm for selecting TXs for validation may not be effective in terms of delay across the various services of an application, because the existing random-based or fee-based selections are delay-insensitive and do not guarantee a minimum delay for a time-critical TX. This paper proposes a blockchain-based transaction validation protocol for a secure distributed IoT network. It includes a context-aware TX validation technique, in which a TX is validated by a miner according to the priority of its service. In addition, we adopt a Software-Defined-Networking-enabled gateway as middleware between the IoT and the blockchain network, ensuring the control operations and security of the network at large scale. The proposed network model has been evaluated and compared with the Core network. The results show that prioritized TX validation is more delay-sensitive than the existing technique and provides better quality of service in the network.
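As an illustration of priority-driven TX selection (a toy mempool, not the paper's protocol; the transaction IDs and priority classes below are invented), a miner can simply pop the most delay-sensitive transaction first:

```python
import heapq
import itertools

class PriorityTxPool:
    """Toy mempool: miners pop the most delay-sensitive transaction first.

    A lower priority number means a more time-critical service class,
    in contrast to random- or fee-based selection.
    """
    def __init__(self):
        self._heap = []
        self._seq = itertools.count()  # tie-breaker keeps FIFO order

    def submit(self, tx_id, priority):
        heapq.heappush(self._heap, (priority, next(self._seq), tx_id))

    def next_for_validation(self):
        priority, _, tx_id = heapq.heappop(self._heap)
        return tx_id

pool = PriorityTxPool()
pool.submit("tx-telemetry", priority=5)
pool.submit("tx-alarm", priority=1)   # time-critical service
pool.submit("tx-billing", priority=9)
first = pool.next_for_validation()    # the alarm TX jumps the queue
```

The sequence counter guarantees that equal-priority transactions are validated in arrival order, so the queue never compares the transaction payloads themselves.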

ACS Style

A. S. M. Sanwar Hosen; Saurabh Singh; Pradip Kumar Sharma; Uttam Ghosh; Jin Wang; In-Ho Ra; Gi Hwan Cho. Blockchain-Based Transaction Validation Protocol for a Secure Distributed IoT Network. IEEE Access 2020, 8, 117266-117277.

AMA Style

A. S. M. Sanwar Hosen, Saurabh Singh, Pradip Kumar Sharma, Uttam Ghosh, Jin Wang, In-Ho Ra, Gi Hwan Cho. Blockchain-Based Transaction Validation Protocol for a Secure Distributed IoT Network. IEEE Access. 2020; 8:117266-117277.

Chicago/Turabian Style

A. S. M. Sanwar Hosen; Saurabh Singh; Pradip Kumar Sharma; Uttam Ghosh; Jin Wang; In-Ho Ra; Gi Hwan Cho. 2020. "Blockchain-Based Transaction Validation Protocol for a Secure Distributed IoT Network." IEEE Access 8: 117266-117277.

Special issue paper
Published: 12 June 2020 in Journal of Real-Time Image Processing

The human population is growing at a very rapid rate. With this progressive growth, it is extremely important to ensure that healthy food is available for the survival of the inhabitants of this planet. Moreover, the economies of developing countries are highly dependent on agricultural production. The overall economic balance is affected if there is a variance in the demand and supply of food or agricultural products. Diseases in plants are a great threat to crop yield, causing famines and economic slowdown. Our present study focuses on applying a machine learning model to classify a tomato disease image dataset so that the necessary steps to combat such an agricultural crisis can be taken proactively. In this work, the data are collected from the publicly available PlantVillage dataset. The significant features are extracted from the dataset using a hybrid principal component analysis-whale optimization algorithm. The extracted features are then fed into a deep neural network for classification of tomato diseases. The proposed model is evaluated against classical machine learning techniques to establish its superiority in terms of accuracy and loss-rate metrics.
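The hybrid PCA-whale-optimization feature extractor is not spelled out in this feed; as a stand-in, here is a bare-bones Whale Optimization Algorithm (WOA) minimizing a sphere function, the metaheuristic core such a pipeline would tune (the bounds, population size, and iteration count are arbitrary choices for illustration):

```python
import math
import random

def sphere(x):
    """Toy objective: minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def woa_minimize(f, dim=2, n_whales=30, iters=300, seed=0):
    """Bare-bones WOA on the box [-5, 5]^dim: shrinking encircling,
    random search, and the spiral bubble-net move around the best whale."""
    rng = random.Random(seed)
    whales = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_whales)]
    best = min(whales, key=f)[:]
    for t in range(iters):
        a = 2 - 2 * t / iters                     # linearly decreases 2 -> 0
        for w in whales:
            if rng.random() < 0.5:                # encircling / random search
                A = 2 * a * rng.random() - a
                C = 2 * rng.random()
                ref = best if abs(A) < 1 else rng.choice(whales)
                for d in range(dim):
                    D = abs(C * ref[d] - w[d])
                    w[d] = ref[d] - A * D
            else:                                 # spiral bubble-net move
                l = rng.uniform(-1, 1)
                for d in range(dim):
                    D = abs(best[d] - w[d])
                    w[d] = D * math.exp(l) * math.cos(2 * math.pi * l) + best[d]
            w[:] = [max(-5.0, min(5.0, x)) for x in w]
        cand = min(whales, key=f)
        if f(cand) < f(best):
            best = cand[:]
    return best, f(best)

best_x, best_val = woa_minimize(sphere)
```

In a feature-extraction pipeline the objective would instead score a candidate feature subset or projection, but the update rules are the same.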

ACS Style

Thippa Reddy Gadekallu; Dharmendra Singh Rajput; M. Praveen Kumar Reddy; Kuruva Lakshmanna; Sweta Bhattacharya; Saurabh Singh; Alireza Jolfaei; Mamoun Alazab. A novel PCA–whale optimization-based deep neural network model for classification of tomato plant diseases using GPU. Journal of Real-Time Image Processing 2020, 18, 1383-1396.

AMA Style

Thippa Reddy Gadekallu, Dharmendra Singh Rajput, M. Praveen Kumar Reddy, Kuruva Lakshmanna, Sweta Bhattacharya, Saurabh Singh, Alireza Jolfaei, Mamoun Alazab. A novel PCA–whale optimization-based deep neural network model for classification of tomato plant diseases using GPU. Journal of Real-Time Image Processing. 2020; 18(4):1383-1396.

Chicago/Turabian Style

Thippa Reddy Gadekallu; Dharmendra Singh Rajput; M. Praveen Kumar Reddy; Kuruva Lakshmanna; Sweta Bhattacharya; Saurabh Singh; Alireza Jolfaei; Mamoun Alazab. 2020. "A novel PCA–whale optimization-based deep neural network model for classification of tomato plant diseases using GPU." Journal of Real-Time Image Processing 18, no. 4: 1383-1396.

Journal article
Published: 12 June 2020 in Sensors

Traditional handwriting recognition systems have relied on handcrafted features and a large amount of prior knowledge, and training an optical character recognition (OCR) system on these prerequisites is a challenging task. Research in the handwriting recognition field is focused on deep learning techniques and has achieved breakthrough performance in the last few years. Still, the rapid growth in the amount of handwritten data and the availability of massive processing power demand improved recognition accuracy and deserve further investigation. Convolutional neural networks (CNNs) are very effective in perceiving the structure of handwritten characters and words in ways that help the automatic extraction of distinct features, making the CNN the most suitable approach for solving handwriting recognition problems. Our aim in the proposed work is to explore the various design options, such as the number of layers, stride size, receptive field, kernel size, padding, and dilation, for CNN-based handwritten digit recognition. In addition, we aim to evaluate various SGD optimization algorithms for improving the performance of handwritten digit recognition. A network's recognition accuracy increases by incorporating an ensemble architecture, but ensemble architectures introduce increased computational cost and high testing complexity. Here, our objective is to achieve comparable, or even better, accuracy by using a pure CNN architecture without an ensemble, along with reduced operational complexity and cost. Moreover, we present an appropriate combination of learning parameters in designing a CNN that leads us to reach a new absolute record in classifying MNIST handwritten digits. We carried out extensive experiments and achieved a recognition accuracy of 99.87% on the MNIST dataset.
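The design options listed above (stride, kernel size, padding, dilation) all act through the convolution output-size formula; a naive sketch of both the formula and a single valid-mode convolution (illustrative only, not the paper's architecture):

```python
def conv_output_size(n, k, stride=1, padding=0, dilation=1):
    """Spatial output size of one conv layer along one dimension."""
    effective_k = dilation * (k - 1) + 1   # dilation spreads the kernel taps
    return (n + 2 * padding - effective_k) // stride + 1

def conv2d(image, kernel, stride=1):
    """Naive valid-mode 2-D convolution (cross-correlation) on nested lists."""
    kh, kw = len(kernel), len(kernel[0])
    oh = conv_output_size(len(image), kh, stride)
    ow = conv_output_size(len(image[0]), kw, stride)
    out = [[0] * ow for _ in range(oh)]
    for i in range(oh):
        for j in range(ow):
            out[i][j] = sum(
                image[i * stride + u][j * stride + v] * kernel[u][v]
                for u in range(kh) for v in range(kw))
    return out

# A 28x28 MNIST-sized input, 5x5 kernel, stride 1, no padding -> 24x24 maps.
size = conv_output_size(28, 5)
edge = conv2d([[1, 2, 3], [4, 5, 6], [7, 8, 9]],
              [[1, 0], [0, -1]])  # 2x2 difference kernel -> 2x2 output
```

Real CNN stacks repeat this per channel and per filter; the formula is what the design-space search over strides, kernels, padding, and dilation is navigating.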

ACS Style

Savita Ahlawat; Amit Choudhary; Anand Nayyar; Saurabh Singh; ByungUn Yoon. Improved Handwritten Digit Recognition Using Convolutional Neural Networks (CNN). Sensors 2020, 20, 3344.

AMA Style

Savita Ahlawat, Amit Choudhary, Anand Nayyar, Saurabh Singh, ByungUn Yoon. Improved Handwritten Digit Recognition Using Convolutional Neural Networks (CNN). Sensors. 2020; 20(12):3344.

Chicago/Turabian Style

Savita Ahlawat; Amit Choudhary; Anand Nayyar; Saurabh Singh; ByungUn Yoon. 2020. "Improved Handwritten Digit Recognition Using Convolutional Neural Networks (CNN)." Sensors 20, no. 12: 3344.

Journal article
Published: 19 May 2020 in Sensors

In healthcare, interoperability is widely adopted for cross-departmental or cross-specialization cases. As the human body demands multiple specialized and cross-disciplined medical examinations, interoperability among business entities such as different departments and specializations, together with the involvement of legal and governmental monitoring, is not sufficient to reduce active medical cases. A patient-centric system with a high capability to collect, retrieve, store, and exchange data is the demand of present and future times. Such data-centric health processes would enable automated patient medication and trusted, patient-self-driven capabilities with high satisfaction. However, data-centric processes face a huge set of challenges, such as security, technology, governance, adoption, deployment, and integration. This work explores the feasibility of integrating resource-constrained-device-based wearable kidney systems into the Industry 4.0 network and facilitates data collection, liquidity, storage, retrieval, and exchange. Thereafter, a Healthcare 4.0 processes-based wearable kidney system is proposed that has the advantages of blockchain technology. Further, game-theory-based consensus algorithms are proposed for resource-constrained devices in the kidney system. The overall system design provides an example of the transition from a specialization- or department-centric approach to a data- and patient-centric approach that would bring more transparency, trust, and healthy practices to the healthcare sector. Results show a variation of 0.10 million GH/s to 0.18 million GH/s hash rate for the proposed approach. The chances of a majority attack in the proposed scheme are statistically proved to be minimal. Further, the Average Packet Delivery Rate (APDR) lies between approximately 95% and 97% without the presence of outliers. In the presence of outliers, network performance decreases below 80% APDR (to a minimum of 41.3%), which indicates that outliers are present in the network. Simulation results show that the Average Throughput (AT) lies between 120 Kbps and 250 Kbps.
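A toy Proof-of-Work loop illustrates the brute-force hashing cost that lightweight consensus schemes such as the proposed LPoG aim to reduce on resource-constrained wearables (the block payload and difficulty below are invented for illustration):

```python
import hashlib

def mine(block_data, difficulty_bits=12):
    """Toy PoW: find a nonce whose SHA-256 over (data + nonce) is below a
    target, i.e. starts with `difficulty_bits` zero bits. Expected work is
    about 2**difficulty_bits hashes, which is what drives the GH/s cost."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}:{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest.hex()
        nonce += 1

nonce, digest = mine("kidney-telemetry-block-42")
```

Raising `difficulty_bits` by one roughly doubles the expected number of hashes, which is why plain PoW is a poor fit for battery-powered body sensors.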

ACS Style

Adarsh Kumar; Deepak Kumar Sharma; Anand Nayyar; Saurabh Singh; ByungUn Yoon. Lightweight Proof of Game (LPoG): A Proof of Work (PoW)’s Extended Lightweight Consensus Algorithm for Wearable Kidneys. Sensors 2020, 20, 2868.

AMA Style

Adarsh Kumar, Deepak Kumar Sharma, Anand Nayyar, Saurabh Singh, ByungUn Yoon. Lightweight Proof of Game (LPoG): A Proof of Work (PoW)’s Extended Lightweight Consensus Algorithm for Wearable Kidneys. Sensors. 2020; 20(10):2868.

Chicago/Turabian Style

Adarsh Kumar; Deepak Kumar Sharma; Anand Nayyar; Saurabh Singh; ByungUn Yoon. 2020. "Lightweight Proof of Game (LPoG): A Proof of Work (PoW)’s Extended Lightweight Consensus Algorithm for Wearable Kidneys." Sensors 20, no. 10: 2868.

Journal article
Published: 24 April 2020 in Electronics

The enormous growth in internet usage has led to the development of various malicious software posing serious threats to computer security. The computational activities carried out over the network are highly susceptible to tampering and manipulation, which necessitates efficient intrusion detection systems. Network attacks are also dynamic in nature, which increases the importance of developing appropriate models for classification and prediction. Machine learning (ML) and deep learning algorithms have been prevalent choices in the analysis of intrusion detection system (IDS) datasets. Issues pertaining to the quality and quantity of data, and the handling of high-dimensional data, are managed by the use of nature-inspired algorithms. The present study uses the NSL-KDD and KDD Cup 99 datasets collected from the Kaggle repository. The data were cleansed using the min-max normalization technique and passed through the 1-N encoding method to achieve homogeneity. A spider monkey optimization (SMO) algorithm was used for dimensionality reduction, and the reduced dataset was fed into a deep neural network (DNN). The SMO-based DNN model generated classification results with 99.4% and 92% accuracy, 99.5% and 92.7% precision, 99.5% and 92.8% recall, and 99.6% and 92.7% F1-score, utilizing minimal training time. The model was further compared with principal component analysis (PCA)-based DNN and classical DNN models, and the results justified the advantage of the proposed model over the other approaches.
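The two preprocessing steps named above, min-max normalization and 1-N (one-hot) encoding, can be sketched directly (the feature values below are toy data, not NSL-KDD columns):

```python
def min_max_normalize(column):
    """Scale a numeric feature into [0, 1], as in the cleansing step."""
    lo, hi = min(column), max(column)
    if hi == lo:                       # constant column: map everything to 0
        return [0.0 for _ in column]
    return [(v - lo) / (hi - lo) for v in column]

def one_hot(column):
    """1-N encoding: one binary indicator per distinct category value."""
    categories = sorted(set(column))
    index = {c: i for i, c in enumerate(categories)}
    rows = [[1 if index[v] == i else 0 for i in range(len(categories))]
            for v in column]
    return rows, categories

durations = min_max_normalize([0, 5, 10])
encoded, cats = one_hot(["tcp", "udp", "tcp", "icmp"])
```

Together these put numeric and categorical features on a homogeneous footing before the dimensionality-reduction stage.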

ACS Style

Neelu Khare; Preethi Devan; Chiranji Chowdhary; Sweta Bhattacharya; Geeta Singh; Saurabh Singh; ByungUn Yoon. SMO-DNN: Spider Monkey Optimization and Deep Neural Network Hybrid Classifier Model for Intrusion Detection. Electronics 2020, 9, 692.

AMA Style

Neelu Khare, Preethi Devan, Chiranji Chowdhary, Sweta Bhattacharya, Geeta Singh, Saurabh Singh, ByungUn Yoon. SMO-DNN: Spider Monkey Optimization and Deep Neural Network Hybrid Classifier Model for Intrusion Detection. Electronics. 2020; 9(4):692.

Chicago/Turabian Style

Neelu Khare; Preethi Devan; Chiranji Chowdhary; Sweta Bhattacharya; Geeta Singh; Saurabh Singh; ByungUn Yoon. 2020. "SMO-DNN: Spider Monkey Optimization and Deep Neural Network Hybrid Classifier Model for Intrusion Detection." Electronics 9, no. 4: 692.

Original research
Published: 24 April 2020 in Journal of Ambient Intelligence and Humanized Computing

Diabetic retinopathy is a prominent cause of blindness among elderly people and has become a global medical problem over the last few decades. There are several scientific and medical approaches to screen and detect this disease, but most detection is done using retinal fundus imaging. The present study uses a principal component analysis (PCA)-based deep neural network model with the Grey Wolf Optimization (GWO) algorithm to classify the extracted features of a diabetic retinopathy dataset. The use of GWO enables the choice of optimal parameters for training the DNN model. The steps involved in this paper include standardization of the diabetic retinopathy dataset using a StandardScaler normalization method, followed by dimensionality reduction using PCA, then the choice of optimal hyperparameters by GWO, and finally training of the dataset using a DNN model. The proposed model is evaluated on the performance measures of accuracy, recall, sensitivity, and specificity. The model is further compared with the traditional machine learning algorithms: support vector machine (SVM), Naive Bayes classifier, decision tree, and XGBoost. The results show that the proposed model offers better performance than the aforementioned algorithms.
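A bare-bones Grey Wolf Optimizer shows how GWO steers a population toward a minimum; here a toy convex "validation loss" stands in for the hyperparameter-selection objective (the objective and all constants are invented for illustration):

```python
import random

def gwo_minimize(f, dim=2, n_wolves=15, iters=150, lo=-4.0, hi=4.0, seed=1):
    """Bare-bones GWO: the three best wolves (alpha, beta, delta) jointly
    guide every wolf's next position; the coefficient a shrinks 2 -> 0 to
    shift from exploration to exploitation."""
    rng = random.Random(seed)
    wolves = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_wolves)]
    for t in range(iters):
        a = 2 - 2 * t / iters
        alpha, beta, delta = [w[:] for w in sorted(wolves, key=f)[:3]]
        for w in wolves:
            for d in range(dim):
                guided = []
                for leader in (alpha, beta, delta):
                    A = 2 * a * rng.random() - a
                    C = 2 * rng.random()
                    D = abs(C * leader[d] - w[d])
                    guided.append(leader[d] - A * D)
                w[d] = max(lo, min(hi, sum(guided) / 3))
    best = min(wolves, key=f)
    return best, f(best)

# Toy surrogate for "choose hyperparameters minimizing validation loss":
# a convex bowl whose minimum (0.1, 0.9) stands in for the best setting.
loss = lambda x: (x[0] - 0.1) ** 2 + (x[1] - 0.9) ** 2
best_params, best_loss = gwo_minimize(loss)
```

In the actual pipeline `f` would train or validate a DNN per candidate setting, which is far more expensive but structurally identical.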

ACS Style

Thippa Reddy Gadekallu; Neelu Khare; Sweta Bhattacharya; Saurabh Singh; Praveen Kumar Reddy Maddikunta; Gautam Srivastava. Deep neural networks to predict diabetic retinopathy. Journal of Ambient Intelligence and Humanized Computing 2020, 1-14.

AMA Style

Thippa Reddy Gadekallu, Neelu Khare, Sweta Bhattacharya, Saurabh Singh, Praveen Kumar Reddy Maddikunta, Gautam Srivastava. Deep neural networks to predict diabetic retinopathy. Journal of Ambient Intelligence and Humanized Computing. 2020: 1-14.

Chicago/Turabian Style

Thippa Reddy Gadekallu; Neelu Khare; Sweta Bhattacharya; Saurabh Singh; Praveen Kumar Reddy Maddikunta; Gautam Srivastava. 2020. "Deep neural networks to predict diabetic retinopathy." Journal of Ambient Intelligence and Humanized Computing: 1-14.

Journal article
Published: 05 February 2020 in Electronics

Diabetic retinopathy is a major cause of vision loss and blindness affecting millions of people across the globe. Although there are established screening methods for detecting the disease, fluorescein angiography and optical coherence tomography, in the majority of cases patients remain unaware of their condition and fail to undergo such tests at an appropriate time. Early detection plays an extremely important role in preventing the vision loss that results when diabetes mellitus remains untreated for a prolonged period. Various machine learning and deep learning approaches have been applied to diabetic retinopathy datasets for classification and prediction of the disease, but most of them neglect data pre-processing and dimensionality reduction, leading to biased results. The dataset used in the present study is a diabetic retinopathy dataset collected from the UCI Machine Learning Repository. At the outset, the raw dataset is normalized using the StandardScaler technique, and Principal Component Analysis (PCA) is then used to extract the most significant features in the dataset. Further, the Firefly algorithm is applied for dimensionality reduction. This reduced dataset is fed into a deep neural network model for classification. The results generated by the model are evaluated against prevalent machine learning models, and they justify the superiority of the proposed model in terms of accuracy, precision, recall, sensitivity, and specificity.
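The pre-processing pipeline described above (standardize, then project onto the leading principal component) can be sketched in plain Python. This is an illustrative reimplementation, not the authors' code; a small two-feature toy dataset stands in for the UCI retinopathy features.

```python
def standardize(X):
    """Zero-mean, unit-variance per column (the StandardScaler step)."""
    n, d = len(X), len(X[0])
    means = [sum(row[j] for row in X) / n for j in range(d)]
    stds = [(sum((row[j] - means[j]) ** 2 for row in X) / n) ** 0.5 for j in range(d)]
    stds = [s if s > 0 else 1.0 for s in stds]   # guard constant columns
    return [[(row[j] - means[j]) / stds[j] for j in range(d)] for row in X]

def first_principal_axis(X, n_iters=200):
    """Top eigenvector of the covariance matrix, found by power iteration."""
    n, d = len(X), len(X[0])
    cov = [[sum(row[a] * row[b] for row in X) / n for b in range(d)]
           for a in range(d)]
    v = [1.0] + [0.0] * (d - 1)
    for _ in range(n_iters):
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]                # renormalize each step
    return v

def project(X, axis):
    """Reduce each sample to its coordinate along the principal axis."""
    return [sum(x, ) if False else sum(x * a for x, a in zip(row, axis)) for row in X for x in [row[0]]][:0] or [sum(x * a for x, a in zip(row, axis)) for row in X]

# Toy data: two strongly correlated features -> axis close to (1,1)/sqrt(2)
X = [[i, 2 * i + 0.1 * ((i % 3) - 1)] for i in range(20)]
Z = standardize(X)
axis = first_principal_axis(Z)
scores = project(Z, axis)
```

In practice one would keep as many components as needed to cover the desired variance; the sketch keeps only the first for brevity.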

ACS Style

Thippa Reddy Gadekallu; Neelu Khare; Sweta Bhattacharya; Saurabh Singh; Praveen Kumar Reddy Maddikunta; In-Ho Ra; Mamoun Alazab. Early Detection of Diabetic Retinopathy Using PCA-Firefly Based Deep Learning Model. Electronics 2020, 9, 274.

AMA Style

Thippa Reddy Gadekallu, Neelu Khare, Sweta Bhattacharya, Saurabh Singh, Praveen Kumar Reddy Maddikunta, In-Ho Ra, Mamoun Alazab. Early Detection of Diabetic Retinopathy Using PCA-Firefly Based Deep Learning Model. Electronics. 2020; 9 (2):274.

Chicago/Turabian Style

Thippa Reddy Gadekallu; Neelu Khare; Sweta Bhattacharya; Saurabh Singh; Praveen Kumar Reddy Maddikunta; In-Ho Ra; Mamoun Alazab. 2020. "Early Detection of Diabetic Retinopathy Using PCA-Firefly Based Deep Learning Model." Electronics 9, no. 2: 274.

Journal article
Published: 27 January 2020 in Electronics

The enormous popularity of the internet across all spheres of human life has introduced various risks of malicious attacks in the network. Malicious activities can proliferate effortlessly over the network, which has led to the emergence of intrusion detection systems. The patterns of attacks are also dynamic, necessitating efficient classification and prediction of cyber attacks. In this paper we propose a hybrid principal component analysis (PCA)-Firefly based machine learning model to classify intrusion detection system (IDS) datasets. The dataset used in the study is collected from Kaggle. The model first performs one-hot encoding to transform the IDS datasets. The hybrid PCA-Firefly algorithm is then used for dimensionality reduction, and the XGBoost algorithm is applied to the reduced dataset for classification. A comprehensive evaluation against state-of-the-art machine learning approaches justifies the superiority of our proposed approach, and the experimental results confirm that the proposed model performs better than existing machine learning models.
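The Firefly step in the hybrid pipeline above can be illustrated as a generic population optimizer. The sketch below is illustrative only, with a toy objective standing in for the feature-space fitness used in the paper; it shows the core move rule, where dimmer fireflies drift toward brighter ones with attractiveness decaying over distance.

```python
import math
import random

def firefly_minimize(f, dim, lo, hi, n=15, n_iters=80,
                     beta0=1.0, gamma=0.01, alpha=0.2, seed=7):
    """Firefly algorithm: lower objective = brighter firefly; each firefly
    moves toward every brighter one with attractiveness beta0*exp(-gamma*r^2),
    plus a small annealed random walk."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    for t in range(n_iters):
        cost = [f(x) for x in pop]
        for i in range(n):
            for j in range(n):
                if cost[j] < cost[i]:            # j is brighter: i moves toward j
                    r2 = sum((pop[i][d] - pop[j][d]) ** 2 for d in range(dim))
                    beta = beta0 * math.exp(-gamma * r2)
                    for d in range(dim):
                        step = (beta * (pop[j][d] - pop[i][d])
                                + alpha * (rng.random() - 0.5))
                        pop[i][d] = max(lo, min(hi, pop[i][d] + step))
                    cost[i] = f(pop[i])
        alpha *= 0.97                            # anneal the random walk over time
    best_pos = min(pop, key=f)
    return best_pos, f(best_pos)

sphere = lambda x: sum(v * v for v in x)
best_pos, best_val = firefly_minimize(sphere, dim=2, lo=-5.0, hi=5.0)
```

Note that the firefly currently holding the minimum never moves (no brighter firefly exists for it), so the population's best cost is non-increasing across iterations.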

ACS Style

Sweta Bhattacharya; Siva Rama Krishnan S; Praveen Kumar Reddy Maddikunta; Rajesh Kaluri; Saurabh Singh; Thippa Reddy Gadekallu; Mamoun Alazab; Usman Tariq. A Novel PCA-Firefly Based XGBoost Classification Model for Intrusion Detection in Networks Using GPU. Electronics 2020, 9, 219.

AMA Style

Sweta Bhattacharya, Siva Rama Krishnan S, Praveen Kumar Reddy Maddikunta, Rajesh Kaluri, Saurabh Singh, Thippa Reddy Gadekallu, Mamoun Alazab, Usman Tariq. A Novel PCA-Firefly Based XGBoost Classification Model for Intrusion Detection in Networks Using GPU. Electronics. 2020; 9 (2):219.

Chicago/Turabian Style

Sweta Bhattacharya; Siva Rama Krishnan S; Praveen Kumar Reddy Maddikunta; Rajesh Kaluri; Saurabh Singh; Thippa Reddy Gadekallu; Mamoun Alazab; Usman Tariq. 2020. "A Novel PCA-Firefly Based XGBoost Classification Model for Intrusion Detection in Networks Using GPU." Electronics 9, no. 2: 219.

Journal article
Published: 24 October 2019 in IEEE Internet of Things Journal
ACS Style

Wenjun Li; Huayi Xu; Huixi Li; Yongjie Yang; Pradip Kumar Sharma; Jin Wang; Saurabh Singh. Complexity and Algorithms for Superposed Data Uploading Problem in Networks With Smart Devices. IEEE Internet of Things Journal 2019, 7, 5882-5891.

AMA Style

Wenjun Li, Huayi Xu, Huixi Li, Yongjie Yang, Pradip Kumar Sharma, Jin Wang, Saurabh Singh. Complexity and Algorithms for Superposed Data Uploading Problem in Networks With Smart Devices. IEEE Internet of Things Journal. 2019; 7 (7):5882-5891.

Chicago/Turabian Style

Wenjun Li; Huayi Xu; Huixi Li; Yongjie Yang; Pradip Kumar Sharma; Jin Wang; Saurabh Singh. 2019. "Complexity and Algorithms for Superposed Data Uploading Problem in Networks With Smart Devices." IEEE Internet of Things Journal 7, no. 7: 5882-5891.

Journal article
Published: 10 October 2019 in IEEE Transactions on Network and Service Management
ACS Style

A. S. M. Sanwar Hosen; Saurabh Singh; Pradip Kumar Sharma; Sazzadur Rahman; In-Ho Ra; Gi Hwan Cho; Deepak Puthal. A QoS-Aware Data Collection Protocol for LLNs in Fog-Enabled Internet of Things. IEEE Transactions on Network and Service Management 2019, 17, 430-444.

AMA Style

A. S. M. Sanwar Hosen, Saurabh Singh, Pradip Kumar Sharma, Sazzadur Rahman, In-Ho Ra, Gi Hwan Cho, Deepak Puthal. A QoS-Aware Data Collection Protocol for LLNs in Fog-Enabled Internet of Things. IEEE Transactions on Network and Service Management. 2019; 17 (1):430-444.

Chicago/Turabian Style

A. S. M. Sanwar Hosen; Saurabh Singh; Pradip Kumar Sharma; Sazzadur Rahman; In-Ho Ra; Gi Hwan Cho; Deepak Puthal. 2019. "A QoS-Aware Data Collection Protocol for LLNs in Fog-Enabled Internet of Things." IEEE Transactions on Network and Service Management 17, no. 1: 430-444.

Research article
Published: 23 April 2019 in International Journal of Distributed Sensor Networks

The growing demand for a comfortable, automated lifestyle has encouraged the development of the smart home. A typical smart home includes many Internet of Things devices that run processes and generate large volumes of data to efficiently handle its users' demands. This growing demand raises a number of concerns for a smart home system in terms of scalability, efficiency, and security. All of these issues are difficult to manage, and existing studies lack the granularity to overcome them. Taking this requirement for security and efficiency as the problem at hand, this article presents a secure and efficient smart home architecture that incorporates blockchain and cloud computing technologies in a combined solution. Because of its decentralized nature, blockchain technology can provide processing services and keep transaction copies of the sensitive user data collected from the smart home. To ensure the security of the smart home network, our proposed model uses the multivariate correlation analysis technique to analyze network traffic and identify the correlation between traffic features. We have evaluated the performance of the proposed architecture using parameters such as throughput and found that blockchain is an efficient security solution for the future Internet of Things network.
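The traffic-analysis idea above (learn the pairwise correlations of traffic features under normal conditions, then flag windows whose correlation structure deviates) can be sketched as follows. The feature values and the deviation measure here are made up for illustration and are not taken from the paper.

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

def correlation_profile(windows):
    """Pairwise feature correlations over a list of per-window feature vectors."""
    d = len(windows[0])
    cols = [[w[j] for w in windows] for j in range(d)]
    return {(i, j): pearson(cols[i], cols[j])
            for i in range(d) for j in range(i + 1, d)}

def deviation(profile_a, profile_b):
    """Largest absolute change in any feature-pair correlation."""
    return max(abs(profile_a[k] - profile_b[k]) for k in profile_a)

# Normal traffic: packet count and byte count rise together.
normal = correlation_profile([[i, 2 * i] for i in range(1, 11)])
# Benign new window: same linear relationship, slightly shifted.
benign = correlation_profile([[i, 2 * i + 1] for i in range(1, 11)])
# Attack window: byte count no longer tracks packet count.
attack = correlation_profile([[i, (7 * i) % 10] for i in range(1, 11)])
```

A window whose deviation from the normal profile exceeds a chosen threshold would be flagged as anomalous traffic.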

ACS Style

Saurabh Singh; In-Ho Ra; Weizhi Meng; Maninder Kaur; Gi Hwan Cho. SH-BlockCC: A secure and efficient Internet of things smart home architecture based on cloud computing and blockchain technology. International Journal of Distributed Sensor Networks 2019, 15, 1.

AMA Style

Saurabh Singh, In-Ho Ra, Weizhi Meng, Maninder Kaur, Gi Hwan Cho. SH-BlockCC: A secure and efficient Internet of things smart home architecture based on cloud computing and blockchain technology. International Journal of Distributed Sensor Networks. 2019; 15 (4):1.

Chicago/Turabian Style

Saurabh Singh; In-Ho Ra; Weizhi Meng; Maninder Kaur; Gi Hwan Cho. 2019. "SH-BlockCC: A secure and efficient Internet of things smart home architecture based on cloud computing and blockchain technology." International Journal of Distributed Sensor Networks 15, no. 4: 1.

Journal article
Published: 21 March 2019 in IEEE Access

The idea of the Internet of Things (IoT) developed in parallel with wireless sensor networks (WSNs). In a mobile WSN, a sensor node is generally assumed to move around with random direction and speed, so the random waypoint (RWP) model is commonly used for node mobility modeling. Unfortunately, it does not consider the overlapping sensing coverage (OSC) that arises as nodes move continuously or at any given time, and there is no security mechanism to authenticate the sensor nodes and preserve their privacy. This results in a higher probability of OSC occurring in the network and exposes the network to threats from inside and outside attackers. To resolve these issues in 3D WSNs, this paper proposes a secure and privacy-preserving node mobility model in which nodes take part in periodic rounds securely. The model includes an ID-based authentication mechanism for nodes joining the network and detection of malicious nodes based on their survival strategies. Moreover, the decision making for the next destination during a pause time in a round is three-fold. First, a set of member nodes is elected. Then, all nodes randomly predict their prospective destinations, and the member nodes broadcast their prospective destination information to their neighbors. Finally, the neighborhood nodes adjust their prospective destinations considering the broadcast information, in order to reduce OSC in the network. Simulation experiments show that the proposed model reduces OSC in the network and detects malicious nodes, resulting in a higher effective sensing coverage rate of the network in a secure way.
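The destination-adjustment step above can be sketched geometrically: a node keeps its randomly predicted waypoint only if it does not overlap the sensing disks of the destinations its neighbors broadcast, otherwise it resamples. The sketch below is a 2-D simplification of the paper's 3-D setting, and all names, bounds, and the resampling policy are illustrative.

```python
import math
import random

def overlap_area(p, q, r):
    """Intersection area of two equal sensing disks of radius r."""
    d = math.dist(p, q)
    if d >= 2 * r:
        return 0.0
    # standard lens-area formula for two circles of equal radius
    return (2 * r * r * math.acos(d / (2 * r))
            - (d / 2) * math.sqrt(4 * r * r - d * d))

def adjust_destination(dest, neighbor_dests, r, field=100.0, tries=20, seed=3):
    """Resample the prospective waypoint while it overlaps neighbors'
    broadcast destinations, keeping the least-overlapping candidate seen."""
    rng = random.Random(seed)

    def total_overlap(p):
        return sum(overlap_area(p, q, r) for q in neighbor_dests)

    best, best_ov = dest, total_overlap(dest)
    for _ in range(tries):
        if best_ov == 0.0:
            break                        # disjoint coverage achieved
        cand = (rng.uniform(0, field), rng.uniform(0, field))
        ov = total_overlap(cand)
        if ov < best_ov:
            best, best_ov = cand, ov
    return best, best_ov

# A node whose predicted waypoint sits on top of a neighbor's destination
# resamples until the overlap shrinks.
new_dest, remaining = adjust_destination((50.0, 50.0), [(50.0, 50.0)], r=5.0)
```

By construction the returned overlap never exceeds that of the original waypoint, so each round can only reduce (never worsen) the node's contribution to OSC.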

ACS Style

A. S. M. Sanwar Hosen; Saurabh Singh; Vinayagam Mariappan; Maninder Kaur; Gi Hwan Cho. A Secure and Privacy Preserving Partial Deterministic RWP Model to Reduce Overlapping in IoT Sensing Environment. IEEE Access 2019, 7, 39702-39716.

AMA Style

A. S. M. Sanwar Hosen, Saurabh Singh, Vinayagam Mariappan, Maninder Kaur, Gi Hwan Cho. A Secure and Privacy Preserving Partial Deterministic RWP Model to Reduce Overlapping in IoT Sensing Environment. IEEE Access. 2019; 7: 39702-39716.

Chicago/Turabian Style

A. S. M. Sanwar Hosen; Saurabh Singh; Vinayagam Mariappan; Maninder Kaur; Gi Hwan Cho. 2019. "A Secure and Privacy Preserving Partial Deterministic RWP Model to Reduce Overlapping in IoT Sensing Environment." IEEE Access 7: 39702-39716.