Building a Multi-class Prediction App for Malicious URLs

Abstract:

A malicious host URL points to a page that hosts a malicious snippet capable of misusing a user's computing resources, stealing confidential data, or carrying out other forms of attack. Such URLs are distributed across the world wide web under various categories such as spam, malware, and phishing. Although numerous methods for identifying malicious URLs have been developed in recent years, cyberattacks continue to occur.

This study contributes a three-tier system for detecting and protecting against harmful URLs. The first tier evaluates the performance of discriminative features in model creation. Discriminative features are derived from URL details and "Whois" webpage information, which helps improve detection performance with low latency and low computational complexity. The influence of feature variation on the detection results of parametric (neural network) and non-parametric classifiers is assessed to narrow down the most prominent features to adopt in the best model for multi-category URL identification. The study reveals that non-parametric ensemble models such as LightGBM, XGBoost, and Random Forest performed well, with detection accuracy above 95%, which facilitated building a real-time detection system that differentiates multiple attack types (such as malware, phishing, and spam).
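To illustrate the first tier, the sketch below derives a handful of lexical URL features and Whois-based domain-age features and trains a Random Forest multi-class classifier. The feature set, the column names (url, creation_date, expiration_date, label), and the label categories are assumptions for demonstration only, not the paper's exact feature list or pipeline.

```python
# Illustrative sketch (assumed features, not the authors' exact pipeline):
# lexical URL features plus Whois-based domain-age features, fed to a
# multi-class ensemble classifier.
from datetime import datetime
from urllib.parse import urlparse

import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split


def lexical_features(url: str) -> dict:
    """Simple URL-string features commonly used in malicious-URL detection."""
    parsed = urlparse(url if "://" in url else "http://" + url)
    return {
        "url_length": len(url),
        "host_length": len(parsed.netloc),
        "num_dots": url.count("."),
        "num_digits": sum(ch.isdigit() for ch in url),
        "num_hyphens": url.count("-"),
        "has_at_symbol": int("@" in url),
        "uses_https": int(parsed.scheme == "https"),
    }


def whois_features(creation_date, expiration_date) -> dict:
    """Domain-age features from Whois dates (fetched separately, e.g. via python-whois)."""
    now = datetime.utcnow()
    age_days = (now - creation_date).days if creation_date else -1
    span_days = (
        (expiration_date - creation_date).days
        if creation_date and expiration_date
        else -1
    )
    return {"domain_age_days": age_days, "registration_span_days": span_days}


def train(df: pd.DataFrame) -> RandomForestClassifier:
    """Assumes df has columns url, creation_date, expiration_date and a
    multi-class label column (e.g. benign / malware / phishing / spam)."""
    rows = []
    for _, r in df.iterrows():
        feats = lexical_features(r["url"])
        feats.update(whois_features(r["creation_date"], r["expiration_date"]))
        rows.append(feats)
    X, y = pd.DataFrame(rows), df["label"]

    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.2, stratify=y, random_state=42
    )
    clf = RandomForestClassifier(n_estimators=300, random_state=42)
    clf.fit(X_tr, y_tr)
    print("hold-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
    return clf
```

LightGBM or XGBoost classifiers, as evaluated in the study, could be swapped in for the Random Forest with the same feature frame.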

The second tier validates the entered URL against a global database to determine whether it has already been reported as suspicious by various detection engines. If not, it enables the user to update the global database with the details of the new, as-yet-unreported URL. Finally, the two modules are integrated into a web application built with Streamlit that provides end-to-end protection against malicious URLs.
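The sketch below shows one way the two tiers might be wired into a Streamlit app. The classifier and the global-database calls are stubs: the abstract does not name the specific external database or API, so these placeholders stand in for it.

```python
# Minimal Streamlit sketch (run with `streamlit run app.py`).
# classify_url, is_already_reported and report_url are hypothetical stubs
# for the trained ensemble model and the unnamed global detection database.
import streamlit as st


def classify_url(url: str) -> str:
    """Stub for the tier-1 model; the real app would call the trained ensemble."""
    return "benign"  # placeholder prediction


def is_already_reported(url: str) -> bool:
    """Stub for querying the global database of detection engines."""
    return False  # placeholder: treat every URL as not yet reported


def report_url(url: str) -> None:
    """Stub for submitting details of a new, unreported URL."""
    pass


st.title("Multi-class Malicious URL Detector")
url = st.text_input("Enter a URL to check")

if st.button("Scan") and url:
    # Tier 1: local multi-class prediction (e.g. benign / malware / phishing / spam).
    st.write(f"Model prediction: {classify_url(url)}")

    # Tier 2: cross-check against the global database of detection engines.
    if is_already_reported(url):
        st.warning("This URL is already reported as suspicious by external engines.")
    else:
        report_url(url)
        st.info("URL was not previously reported; its details have been submitted.")
```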

Presented in: Advanced Network Technologies and Intelligent Computing. ANTIC 2022.

AUTHORS

Vijayaraj Sundaram

Dr. Shinu Abhi
Professor and Director – Corporate Training

Dr. Rashmi Agarwal
Associate Professor
