Computer Science And Technology - Research Publications
Permanent URI for this collection: https://kr.cup.edu.in/handle/32116/82
Item: 3D modelling and visualization for Vision-based Vibration Signal Processing and Measurement (De Gruyter Open Ltd, 2021-04-10). Yao, Qi; Shabaz, Mohammad; Singh, Raj Karan; Lohani, Tarun Kumar; Bhatt, Mohammed Wasim; Panesar, Gurpreet Singh
With the advent of technological evolution, vision-based approaches enable remote measurement for vibration analysis. Structural vibration testing and modal parameter identification occupy an important position in the inspection and evaluation of bridge structures. To ensure that the bridge structure operates safely and reliably, a 3D visualization model is built with the international mining software Surpac, according to the geological data and actual engineering situation of the Qixiashan lead-zinc mine, and applications of visualization in mine production are discussed. The results show that the final solid model of the -425 stope can accurately display the spatial form of each layer of the stope through rotation, magnification and movement. The proposed system can effectively perform cutting, volume calculation and roaming in any direction, which has guiding significance for mine production management. The proposed 3D modelling and visualization algorithm for vibration signal processing and management achieves an accuracy of 98.75%, sensitivity of 99%, specificity of 99.64% and PPV of 99.89%. © 2021 Q. Yao et al., published by De Gruyter.

Item: Advance Stable Election Protocol in Wireless Sensor Network (Excel India Publishers, 2014). Singh, Mandeep; Sidhu, Navjot
A wireless sensor network (WSN) is an emerging research field in which large numbers of sensors collect and send data to a base station. Saving energy through routing techniques is a challenge, and clustering is the main technique used for this. Protocols such as SEP (Stable Election Protocol) and ESEP (Extended Stable Election Protocol) are clustering-based, heterogeneity-aware protocols.
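For readers unfamiliar with how SEP-style protocols weight the election, the core of the scheme can be sketched in a few lines of Python. This is an illustrative sketch of SEP's weighted cluster-head probabilities and the LEACH-style election threshold; all parameter values (p_opt, m, a) are chosen for illustration, not taken from either paper.

```python
def sep_probabilities(p_opt, m, a):
    """SEP's weighted cluster-head election probabilities: advanced nodes
    (fraction m, extra energy factor a) are elected proportionally more
    often than normal nodes, keeping the average probability at p_opt."""
    p_normal = p_opt / (1 + a * m)
    p_advanced = p_opt * (1 + a) / (1 + a * m)
    return p_normal, p_advanced

def election_threshold(p, round_no):
    """LEACH/SEP-style threshold T(n) that a node compares against a
    uniform random draw when deciding whether to become cluster head
    this round (nodes that already served in the epoch are excluded)."""
    epoch = int(round(1 / p))
    return p / (1 - p * (round_no % epoch))
```

For p_opt = 0.1, a fraction m = 0.2 of advanced nodes and energy factor a = 1, advanced nodes are elected twice as often as normal ones, while the population-wide average election probability stays at p_opt.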
In this paper, a new protocol, ASEP (Advance Stable Election Protocol), is proposed based on SEP. It changes the cluster-head election probability more efficiently and dynamically. The protocol's performance has been evaluated in MATLAB and graphical results are shown. ASEP outperforms SEP in terms of the round at which the first node dies and the total number of packets delivered.

Item: Advances and challenges in thyroid cancer: The interplay of genetic modulators, targeted therapies, and AI-driven approaches (Elsevier Inc., 2023-09-20). Bhattacharya, Srinjan; Mahato, Rahul Kumar; Singh, Satwinder; Bhatti, Gurjit Kaur; Mastana, Sarabjit Singh; Bhatti, Jasvinder Singh
Thyroid cancer continues to exhibit a rising incidence globally, predominantly affecting women. Despite stable mortality rates, the unique characteristics of thyroid carcinoma warrant a distinct approach. Differentiated thyroid cancer, comprising most cases, is effectively managed through standard treatments such as thyroidectomy and radioiodine therapy. However, rarer variants, including anaplastic thyroid carcinoma, necessitate specialized interventions, often employing targeted therapies. Although these drugs focus on symptom management, they are not curative. This review delves into the fundamental modulators of thyroid cancers, encompassing genetic, epigenetic, and non-coding RNA factors, while exploring their intricate interplay and influence. Epigenetic modifications directly affect the expression of causal genes, while long non-coding RNAs impact the function and expression of micro-RNAs, culminating in tumorigenesis. Additionally, this article provides a concise overview of the advantages and disadvantages associated with pharmacological and non-pharmacological therapeutic interventions in thyroid cancer. Furthermore, with technological advancements, integrating modern software and computing into healthcare and medical practice has become increasingly prevalent.
Artificial intelligence and machine learning techniques hold the potential to predict treatment outcomes, analyze data, and develop personalized therapeutic approaches catering to patient specificity. In thyroid cancer, cutting-edge machine learning and deep learning technologies analyze factors such as ultrasonography results for tumor textures and biopsy samples from fine-needle aspirations, paving the way for a more accurate and effective therapeutic landscape in the near future. © 2023 The Author(s)

Item: Analysis of MAODV MANET routing protocol on different mobility models (Institute of Electrical and Electronics Engineers Inc., 2018). Kaur, N.
A MANET is an infrastructure-less network with applications in fields such as the military, the commercial sector, and at the local level. A MANET is easy to set up, configure and build. Because it lacks infrastructure, a MANET relies on the cooperation of nodes for communication. Communication between nodes is made possible by protocols, which fall into three categories: reactive, proactive and hybrid. In reactive protocols, a path is established at run time; proactive protocols maintain predefined routing tables for each node; hybrid protocols combine both approaches. Modified Ad-hoc On-Demand Distance Vector (MAODV) is a reactive protocol that discovers all possible routes between the source and destination nodes and maintains them during data transmission. If the primary route fails, data packets are transferred through secondary routes established from routing tables that store multiple paths to the destination. In the real world, nodes in a MANET may move in any direction, so mobility models such as Random Way Point (RWP), Random Walk (RW), Random Direction (RD) and Gauss-Markov (GM) are considered, as they have different mobility patterns.
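The mobility patterns compared in this paper are standard synthetic models; as a point of reference, the simplest of them, Random Way Point, can be sketched as follows (pause time is omitted for brevity, and area and speed ranges are illustrative, not the paper's simulation settings):

```python
import math
import random

def random_waypoint(steps, area=(1000.0, 1000.0), speed=(1.0, 20.0), seed=42):
    """Minimal Random Way Point trace: pick a random destination and speed,
    move toward the destination each step, and pick a new destination and
    speed on arrival. Returns the list of visited positions."""
    rng = random.Random(seed)
    w, h = area
    x, y = rng.uniform(0, w), rng.uniform(0, h)
    dest = (rng.uniform(0, w), rng.uniform(0, h))
    v = rng.uniform(*speed)
    trace = [(x, y)]
    for _ in range(steps):
        dx, dy = dest[0] - x, dest[1] - y
        d = math.hypot(dx, dy)
        if d <= v:                       # reached the waypoint
            x, y = dest
            dest = (rng.uniform(0, w), rng.uniform(0, h))
            v = rng.uniform(*speed)
        else:                            # advance v metres toward it
            x += v * dx / d
            y += v * dy / d
        trace.append((x, y))
    return trace
```

Random Walk, Random Direction and Gauss-Markov differ mainly in how the next direction and speed are drawn, which is what gives the four models their different spatial distributions of nodes.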
Therefore, MAODV is implemented over the different mobility models to compute parameters such as End-to-End Delay (E2E Delay), Average Hop Count (AHC), and Throughput. © 2017 IEEE.

Item: Analysis of VANET geographic routing protocols on real city map (Institute of Electrical and Electronics Engineers Inc., 2018). Kaur, H.; Meenakshi
A Vehicular Ad hoc Network (VANET) is an escalating field of research and lays the basis for newer technologies such as Intelligent Transport Systems (ITS). Routing plays a crucial role in the performance of VANETs. VANET protocols are classified as topology-based and position-based. Research has shown that position-based protocols are better suited to VANETs than topology-based protocols, because geographic routing avoids the overhead and delay of maintaining routing tables; instead, the geographic positions of nodes, obtained from a Global Positioning System (GPS) device on each vehicle, are used for routing. In this paper, two geographic routing protocols, Anchor-based Street and Traffic Aware Routing (A-STAR) and Greedy Perimeter Stateless Routing (GPSR), are evaluated on a real city map. Simulating VANETs on real map scenarios provides accurate results and is also useful for designing and deploying VANETs in the real world. A real-world mobility model is important because it reflects the real-world performance of the protocols considered. Performance is analyzed in terms of throughput, packet delivery ratio, packet loss and average delay, with simulations carried out under varying node densities. A-STAR performed better than GPSR on the real city map because A-STAR adopts a street-awareness method of routing, whereas GPSR relies on greedy forwarding and routing around the perimeter. © 2017 IEEE.

Item: Analysis of Virtualization: Vulnerabilities and attacks over virtualized cloud computing (IASIR, 2014). Kanika; Sidhu, Navjot
Cloud computing is the fastest-growing technology in the IT world.
The technology reduces IT costs and provides on-demand services over the internet to individual users as well as organizations. Cloud computing is realized through the virtualization of resources such as hardware, platforms, operating systems and storage devices. Virtualization permits multiple operating systems to run on the same physical machine, and tenants are unaware of the other tenants with whom they share resources. The co-existence of multiple virtual machines can be exploited to gain access to another tenant's data or to launch denial-of-service attacks. The significant concerns are ensuring security and providing isolation between the operating systems. The paper explores various kinds of vulnerabilities and attacks associated with virtualization.

Item: Analysis of Wormhole Attack on AODV and DSR Protocols Over Live Network Data (Springer, 2020). Mishra, H.K.; Mittal, M.
Wireless ad hoc networks, due to their open deployment architecture, are highly exposed to many security-compromising attacks, which can do great damage to the privacy, security and robustness of a network. The wormhole attack is believed to be one of the hardest malicious attacks to detect, as it can be performed without breaching any key or breaking any cipher in a wireless ad hoc network. A wormhole attack forms a tunnel in the network using two or more malicious nodes, stealthily replaying data from one malicious node to the other end nodes in the same or a different network. In this way, the attacker exploits either flaws in the protocol design or in the network architecture, so security methods are required to make MANET routing protocols resistant to wormhole attacks. In this research work, the wormhole attack has been performed over the AODV and DSR protocols using real-time live data introduced into the simulator.
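The routing distortion a wormhole causes can be illustrated on a toy topology. The sketch below is not the authors' simulation setup; it only shows how a single tunnel link collapses hop counts, which is what lures traffic through the malicious nodes:

```python
from collections import deque

def hop_counts(n, extra_edges=()):
    """BFS hop counts from node 0 on a simple n-node ring, optionally with
    extra (wormhole) links spliced in by the attacker."""
    adj = {i: {(i - 1) % n, (i + 1) % n} for i in range(n)}
    for u, v in extra_edges:
        adj[u].add(v)
        adj[v].add(u)
    dist = {0: 0}
    q = deque([0])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

# Without the tunnel, node 10 on a 20-node ring is 10 hops from node 0;
# a wormhole between nodes 1 and 9 advertises a much shorter route, so
# hop-count-based protocols such as AODV and DSR prefer the tunnel path.
honest = hop_counts(20)
attacked = hop_counts(20, extra_edges=[(1, 9)])
```

Here the tunnel cuts the apparent distance to node 10 from 10 hops to 3, without the attacker breaking any cryptography — the tunnel simply replays packets.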
The prevention technique was observed to handle the attack successfully, restoring network performance and alleviating the effect of the attack on the network.

Item: Analyze Dark Web and Security Threats (Springer Science and Business Media Deutschland GmbH, 2023-05-03). Ansh, Samar; Singh, Satwinder
The deepest area of data storage, where data mining and data management are not possible without the Tor network, is known as the dark web. The dark web is a paradise for government- and privately-sponsored cybercriminals; in other words, it is the underworld of the Internet, used for sponsored and organized cybercrime. At the Tor entry relay (guard), the user's source IP is replaced by default with a local IP (i.e., 10.0.2.15), so every user machine is recognized by the same local IP. Allocating a single source IP for each user without collision makes the user anonymous, or invisible, over the Internet. The Tor browser hides the source IP much as a VPN does, but with the advantage that the Tor network's volunteer devices are used as a tunnel to establish communication, offering freedom from surveillance of user activity. The Tor browser builds a circuit (IP route) for user activity, where the circuit assigns an available Tor IP at the exit relay. The dark web uses the same IP at the entry relay around the world, but at the exit relay the IP differs and is available based on country. In the dark web network, data is transferred as encapsulated packets/messages placed behind three layers of different encryption. Six machine-learning classifiers (Logistic Regression, Random Forest, Gradient Boosting, AdaBoost, K-Nearest Neighbors, Decision Tree) are proposed for an optimal solution and used to analyze security threats performed on the dark web, based on the communication protocol and user activity as data flow and active state.
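As a self-contained illustration of the classification step (the paper's experiments presumably used standard library implementations), here is one of the six listed classifiers, K-Nearest Neighbors, written from scratch and applied to synthetic stand-in traffic features — the feature meanings and data are hypothetical:

```python
import random
from collections import Counter

def knn_predict(train, labels, x, k=5):
    """Plain K-Nearest-Neighbors majority vote over squared Euclidean
    distance -- one of the six classifiers the paper applies."""
    dist = sorted(
        (sum((a - b) ** 2 for a, b in zip(p, x)), y)
        for p, y in zip(train, labels)
    )
    votes = Counter(y for _, y in dist[:k])
    return votes.most_common(1)[0][0]

# Synthetic stand-in for two flow features (say, normalized packet size
# and inter-arrival time): class 0 clusters near the origin, class 1 away.
rng = random.Random(0)
X = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(50)] + \
    [(rng.gauss(4, 1), rng.gauss(4, 1)) for _ in range(50)]
y = [0] * 50 + [1] * 50
```

A query point near a class's cluster is assigned that class by the vote of its five nearest training points.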
© 2023, The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

Item: AODV Based Congestion Control Protocols: Review (IJCSIT, 2014). Bhatia, Bandana; Sood, Neha
An ad-hoc network is a network in which users communicate with each other by forming a temporary network without any centralized administration; each node acts both as a host and as a router. Such networks have highly deployable, dynamic and self-configurable topologies. Various routing protocols are defined for MANETs, following proactive, reactive or hybrid approaches. With many nodes transmitting packets over the network, the chance of losing packets increases greatly; likewise, as the size of data packets increases, congestion over the network increases, which may lead to packet losses. The existing routing protocols for MANETs do not support congestion control, as they are not congestion-adaptive. Many proposed protocols are congestion-adaptive and deal with congestion over the network. This paper discusses congestion control protocols in MANETs, covering three such protocols: EDAODV, AODV-I and CRP. Simulation results are gathered for AODV by varying the number of nodes and the size of the data packets for four performance metrics: throughput, routing overhead, packet delivery ratio and end-to-end delay.

Item: Application of Image Processing using Computer in Detection of Defective Printed Circuit Boards (International Journal of Advanced Research in Computer Science, 2017). Kaur, Beant; Kaur, Amandeep; Kaur, Gurmeet
These days, image processing is used in many applications; it is the process of performing different operations on images using various functions. In this paper, image processing is used to find defective printed circuit boards.
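Two of the statistics this kind of defect check relies on, entropy and standard deviation, are easy to compute directly from a grayscale image. The sketch below is a hypothetical illustration, not the authors' implementation (the Euler number, which requires connected-component analysis, is omitted):

```python
import math

def image_stats(img):
    """Shannon entropy and standard deviation of a grayscale image,
    given as a list of rows of pixel intensities."""
    pixels = [p for row in img for p in row]
    n = len(pixels)
    hist = {}
    for p in pixels:
        hist[p] = hist.get(p, 0) + 1
    entropy = -sum((c / n) * math.log2(c / n) for c in hist.values())
    mean = sum(pixels) / n
    std = (sum((p - mean) ** 2 for p in pixels) / n) ** 0.5
    return entropy, std

# Hypothetical check: a board image whose statistics drift far from a
# known-good reference is flagged as defective.
reference = [[200] * 8 for _ in range(8)]                         # uniform traces
defective = [[200] * 8 for _ in range(6)] + [[0] * 8, [255] * 8]  # broken rows
```

The uniform reference image has zero entropy and zero deviation, while the damaged rows push both statistics up — the kind of separation such parameters exploit.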
To find the defective printed circuit boards, parameters such as entropy, standard deviation and Euler number have been used. The results show the effectiveness of the proposed method.

Item: Application of UAV tilt photogrammetry in 3D modeling of ancient buildings (Springer, 2021-11-03). Guo, Qiu; Liu, Hechun; Hassan, Faez M.; Bhatt, Mohammed Wasim; Buttar, Ahmed Mateen
The initiation of photogrammetry in the late 1990s permitted 3D stereoscopic vision for the acquisition of information. A number of methodologies have been embraced by researchers to explore the innumerable aspects of photogrammetry, digital photography and image processing. Among these technologies, UAV-based tools have been employed to capture substantial areas quickly and efficiently, a task previously performed by conventional aircraft. The expansion of unmanned aerial vehicles (UAVs) into various fields has extended comprehensively to the 3D modeling of ancient buildings. This expansion carries the burden of obtaining highly precise information at the multi-angle level, and it is difficult for traditional technology to solve the 3D reconstruction problems of ancient buildings. To solve the problem of high-precision 3D information acquisition and multi-angle real-texture feature acquisition, this article proposes a new method of 3D reconstruction of ancient buildings that combines 3D laser scanning and tilt photogrammetry. The new method unifies the advantages of the two technologies and uses a feature-point matching algorithm to realize the accurate fusion of multisource data, yielding a complete three-dimensional model of the inside and outside of an ancient building. Taking a traditional ancestral hall of China as an example, the relative median error computed for the constructed 3D model is found to be as low as 5 mm.
The proposed method greatly improves modeling efficiency compared with the traditional method, and its accuracy is relatively high and meets the requirements of modeling accuracy, because the 3D model and elevation data of the ancient buildings constructed in this study are derived from high-precision point-cloud data. The accuracy of the model reaches the millimeter level according to the calculated error and relative median error, so the 3D model constructed in this study has high accuracy. This method thus provides significant technical support for the restoration and protection of ancient architectural cultural heritage. © 2021, The Society for Reliability Engineering, Quality and Operations Management (SREQOM), India and The Division of Operation and Maintenance, Lulea University of Technology, Sweden.

Item: Behavior analysis of LEACH protocol (Institute of Electrical and Electronics Engineers Inc., 2014). Maurya, Prashant; Kaur, Amanpreet; Choudhary, Rohit
A wireless sensor network (WSN) is an emerging field comprising sensor nodes with limited resources such as power and memory. It is used to monitor remote areas where recharging or replacing the battery of a sensor node is not possible, so energy is the most challenging issue in a WSN. Low-Energy Adaptive Clustering Hierarchy (LEACH) is the first significant protocol that consumes a small amount of energy while routing data to the base station. In this paper, the LEACH protocol is analyzed with different percentages of cluster heads and different locations of the base station in the network. © 2014 IEEE.

Item: Bell's inequality with biased experimental settings (Springer, 2022-04-30). Singh, Parvinder; Faujdar, Jyoti; Sarkar, Maitreyee; Kumar, Atul
We analyse the efficiency of nonlocal correlations in comparison with classical correlations under a biased experimental set-up, e.g.
for a nonlocal game or a class of Bell-CHSH inequality where both Alice and Bob choose their measurements with a certain bias. We demonstrate that quantum theory offers advantages over classical theory for the whole range of biasing parameters except in the limiting cases. Moreover, by using fine-grained uncertainty relations to distinguish between classical, quantum and superquantum correlations, we further confirm the underlying advantage of quantum correlations over classical correlations. Our results clearly show that all pure bipartite entangled states violate the Bell-CHSH inequality under the biased set-up. Although for the two-qubit mixed Werner state, the Horodecki state and a state proposed by Ma et al. (Phys Lett A 379:2802, 2015) the range of violation is the same in both biased and unbiased scenarios, the extent of violation differs between the two cases. We extend our analysis to detect nonlocal correlations using quantum Fisher information and demonstrate a necessary condition for capturing nonlocality in the biased scenario. Furthermore, we also describe the properties of nonlocal correlations under noisy conditions in a biased experimental set-up. © 2022, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.

Item: Bug Classification Depend Upon Refactoring Area of Code (Springer, 2023-01-05). Singh, Satwinder; Jalal, Maddassar; Kaur, Sharanpreet
Owing to rapid development in the software industry, much software is developed in ways that compromise software quality. As time passes, such software starts showing bugs, which adversely affect the working of the software system. Software sometimes undergoes repeated additions of functionality, so various classes and functions become bulky and cause smells in the code. Code smells degrade software quality and increase the maintenance cost of the software system.
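A common way to detect one of these smells, God Class, is a metrics-threshold heuristic in the style of Marinescu's detection strategies: high weighted method count (WMC), many accesses to foreign data (ATFD) and low class cohesion (TCC). The sketch below uses illustrative threshold values, not the settings of this study:

```python
def is_god_class(wmc, atfd, tcc, wmc_high=47, atfd_few=5, tcc_third=1 / 3):
    """Flag a class as a God Class candidate when it is functionally
    heavy (high WMC), accesses much foreign data (ATFD above a small
    threshold) and has low cohesion (TCC below one third). The default
    thresholds are illustrative, not taken from the paper."""
    return wmc >= wmc_high and atfd > atfd_few and tcc < tcc_third
```

A class with WMC 60, ATFD 12 and TCC 0.1 would be flagged; a small cohesive class with WMC 10, ATFD 2 and TCC 0.8 would not. Classes flagged this way mark refactoring areas, which is the mapping the study exploits for bug identification.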
Such bad smells should therefore be properly detected and eliminated from the software system in good time. By identifying smells in code, one can easily map the refactoring areas of the code and improve it; improving the code supports preventive maintenance and helps identify bugs quickly. To this end, the current study focuses on five types of code smell for identifying bugs in code: Data Class, God Class, Feature Envy, Refuse Parent Request and Brain Method. The data set used for this study covers smell extraction and identification from two versions of Eclipse, 3.6 and 3.7, which are renowned open-source, industry-size software systems. The supervised machine-learning classifiers J48, Random Forest and Naive Bayes are then used to identify bugs in code at the class level. The results show that J48 performed well and provides high accuracy for the identification of bugs with the help of bad smells. © 2023, The Institution of Engineers (India).

Item: Classification of Breast Cancer Mammographic Images Using A Light-Weighted Convolutional Neural Network (Institute of Electrical and Electronics Engineers Inc., 2023-04-10). Kaur, Palwinder; Kaur, Amandeep
Deep learning is a method in demand among radiologists to assist them in interpreting and classifying medical images correctly, and the Convolutional Neural Network (CNN) is the most widely used method for classifying and analysing images. In this paper, a light-weight CNN is presented for breast cancer classification using a dataset of breast mammography images. The suggested methodology improves the classification of mammary cancer images to assist radiologists in detection, and the proposed model can help diagnose mammary cancer from digital mammograms without any prior information about the existence of a cancerous lesion.
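The two building blocks such a light-weight CNN stacks, convolution and pooling, can be shown in plain Python. This is an illustrative forward pass of the operations only, unrelated to the paper's actual architecture or weights:

```python
def conv2d(img, kernel):
    """Valid 2-D convolution (strictly, cross-correlation, as in most
    CNN frameworks) of a grayscale image with a small kernel."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(img) - kh + 1
    out_w = len(img[0]) - kw + 1
    return [[sum(img[i + a][j + b] * kernel[a][b]
                 for a in range(kh) for b in range(kw))
             for j in range(out_w)]
            for i in range(out_h)]

def max_pool(img, size=2):
    """Non-overlapping max pooling, shrinking each spatial dimension
    by the pool size and keeping the strongest local response."""
    return [[max(img[i + a][j + b] for a in range(size) for b in range(size))
             for j in range(0, len(img[0]) - size + 1, size)]
            for i in range(0, len(img) - size + 1, size)]
```

A "light-weight" network in this sense stacks only a few such convolution-pool stages before a small dense classifier, keeping the parameter count low enough for fast training on large mammography datasets.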
The proposed CNN can categorize input medical images as malignant or benign with an accuracy of 99.35%, the highest accuracy achieved for such a large mammography dataset. © 2023 IEEE.

Item: Classification of defective modules using object-oriented metrics (Inderscience Enterprises Ltd., 2017). Singh, Satwinder; Singla, Rozy
Software defects are a crucial issue in today's software engineering. Most organisations use various techniques to predict defects in their products before they are delivered. Defect prediction techniques help organisations use their resources effectively, resulting in lower cost and time requirements. Various techniques are used to predict defects in software before delivery, e.g., clustering, neural networks and support vector machines (SVM). In this paper, two defect prediction techniques, K-means clustering and the multi-layer perceptron (MLP), are compared. The two techniques are implemented on different platforms: K-means clustering using the WEKA tool and the MLP using SPSS. The results are compared to find which algorithm produces better results. Object-oriented metrics are used for predicting defects in the software. Copyright © 2017 Inderscience Enterprises Ltd.

Item: Clustering of tweets: A novel approach to label the unlabelled tweets (Springer, 2020). Jan, T.G.
Twitter is one of the fastest-growing microblogging and online social networking sites, enabling users to send and receive messages in the form of tweets. Twitter is the platform of choice today for news analysis and discussion, which is why it has become a main target of attackers and cybercriminals. These attackers not only undermine the security of Twitter but also destroy the trust people place in it, polluting the platform by misusing it.
Misuse can take the form of hurtful gossip, cyberbullying, cyber-harassment, spam, pornographic content, identity theft, and common Web attacks such as phishing and malware downloading. The Twitter world is growing fast and is hence prone to spam, so there is a need for spam detection on Twitter. Spam detection using supervised algorithms relies wholly on labelled Twitter datasets, but labelling datasets manually is costly, time-consuming and challenging, and old labelled datasets are nowadays unavailable because of Twitter's data-publishing policies. There is therefore a need for an approach that labels tweets as spam or non-spam and thereby overcomes the effect of spam drift. In this paper, we downloaded a recent Twitter dataset and prepared an unlabelled dataset of tweets from it, then applied a cluster-then-label approach to label the tweets as spam and non-spam. The resulting labelled dataset can be used for spam detection on Twitter and for categorizing different types of spam.

Item: Code clone detection and analysis using software metrics and neural network: A Literature Review (Eighth Sense Research Group, 2015). Kumar, Balwinder; Singh, Satwinder
Code clones are duplicated code, which degrades software quality and hence increases maintenance cost. Detecting clones in a large software system is a very tedious task, but it is necessary for improving the design, structure and quality of software products. Object-oriented metrics such as DIT, NOC, WMC, LCOM and cyclomatic complexity, together with the various types of methods and variables, are good indicators of code clones, and artificial neural networks have immense detection and prediction capability. In this paper, various metric-based clone detection approaches and techniques are discussed.
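A minimal metric-based clone indicator of the kind surveyed here might simply flag classes whose object-oriented metric vectors nearly coincide. The class names, metric values and tolerance below are all hypothetical, chosen only to illustrate the idea:

```python
def metric_vector_clones(metrics, tol=0.05):
    """Flag pairs of classes whose metric vectors (e.g. WMC, DIT, NOC,
    LCOM) agree within a relative tolerance -- a crude stand-in for the
    metric-based clone indicators discussed in the review."""
    names = list(metrics)
    pairs = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            va, vb = metrics[a], metrics[b]
            if all(abs(x - y) <= tol * max(abs(x), abs(y), 1)
                   for x, y in zip(va, vb)):
                pairs.append((a, b))
    return pairs

# Hypothetical per-class metric vectors: (WMC, DIT, NOC, LCOM)
classes = {
    "Invoice":   (22, 2, 0, 0.61),
    "InvoiceV2": (22, 2, 0, 0.60),   # near-identical metrics: clone candidate
    "Report":    (9, 1, 3, 0.20),
}
```

In the neural-network variants the review covers, such metric vectors become the input features and the network learns the clone/non-clone boundary instead of a fixed tolerance.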
From the discussion it is concluded that clone detection using software metrics and an artificial neural network is the best technique for code clone detection, analysis and clone prediction.

Item: Combinational feature selection approach for network intrusion detection system (Institute of Electrical and Electronics Engineers Inc., 2015). Garg, T.; Kumar, Y.
In the era of the digital world, computer networks are receiving multidimensional advancements, and with these advancements more and more services are available for malicious exploitation. New vulnerabilities are found in common programs, and a vulnerability in a single computer might compromise the network of an entire company. There are two parallel ways to address this threat. The first is to ensure that a computer has no known security vulnerabilities before granting it access rights to the network. The other is to use an Intrusion Detection System (IDS). IDSs concentrate on detecting malicious network traffic, such as packets that would exploit a known security vulnerability. Generally, intrusions are detected by analyzing 41 attributes from the intrusion detection dataset. In this work we reduce the number of attributes using various ranking-based feature selection techniques, evaluated with ten classification algorithms, so that intrusions can be detected accurately in a short period of time. Combinations of the six reduced feature sets were then formed using the Boolean AND operator and their performance analyzed with the ten classification algorithms. Finally, the top ten feature-selection combinations were evaluated from among 1585 unique combinations; the combination of Symmetric and Gain Ratio, considering the top 15 attributes, has the highest performance. © 2014 IEEE.

Item: A comparative analysis and awareness survey of phishing detection tools (Institute of Electrical and Electronics Engineers Inc., 2018). Sharma, H.; Meenakshi, E.; Bhatia, S.K.
Phishing is a kind of attack in which phishers use spoofed emails and malicious websites to steal people's personal information. Nowadays various tools are freely available to detect phishing and other web-based scams, many of them browser extensions that generate a warning whenever the user browses a suspected phishing site. In this research paper, eight phishing detection tools are compared to find the best one by testing each tool on a dataset, and an awareness survey about these tools was then carried out. The dataset contains two thousand verified phishing websites reported from August 2016 to March 2017, collected from two anti-phishing platforms, the Anti-Phishing Working Group (APWG) and PhishTank, and 500 legitimate websites that users visit regularly (e.g., Citibank.com, PayPal.com, Alibaba.com, Askfm.in), used to test the effectiveness of the eight popular anti-phishing tools. After testing all the tools on the dataset, it was found that the AntiPhishing Toolbar did a very good job, identifying 94.32 percent of phishing as well as legitimate websites in the dataset. The awareness survey was conducted among fifty students pursuing M.Tech Computer Science & Technology and Cyber Security at the Central University of Punjab, and revealed that approximately 61 percent of respondents are completely unaware of phishing detection tools. © 2017 IEEE.