
SAW Software Bug Report

Introduction:

SAW is an acronym for Scrap Any Website. This software is used for scraping websites to extract information that developers may need.

Provided below are some details about this software:

Name: Scrap Any Website
OS: Windows 10 version 17763.0 or higher
Memory: Not specified (Minimum), 4 GB (Recommended)
Features: Data Extraction, Website Scraping
Approximate size: 136.5 MB

My testing focused specifically on the scraping tasks feature; the bug report below documents my personal observations.

Problem Description:

Issue 1: I observed an unknown code, "-1", returned when scraping some URLs, which could confuse users.
To reproduce: enter at least 20 website URLs, enable the Discover New URLs option, then scrape the URLs simultaneously. Once the run completes, some URLs return the incorrect code "-1". See the linked spreadsheet for more details.
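SAW's internals are not public, but a "-1" status often appears in scraping tools as a sentinel for requests that fail before any HTTP status code is received (DNS failure, timeout, unsupported scheme). A minimal sketch of that pattern, under that assumption and with hypothetical function names that are not SAW's actual API:

```python
import urllib.request
import urllib.error

def fetch_status(url: str, timeout: float = 10.0) -> int:
    """Return the HTTP status code for a URL, or -1 if the request
    fails before any status is received (DNS error, timeout, etc.)."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as exc:
        return exc.code  # the server responded, just with an error status
    except (urllib.error.URLError, TimeoutError, ValueError):
        return -1        # no HTTP status available: sentinel value

# A URL with an unsupported scheme never yields an HTTP status:
print(fetch_status("notaurl://bad"))  # -1
```

If SAW works this way, surfacing a human-readable reason ("timed out", "could not resolve host") instead of a bare "-1" would remove the confusion reported above.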

Issue 2: I observed an inconsistency in the scraped-data statistics, which could create uncertainty about the software's reliability. The inconsistency can be reproduced when multiple URLs are tested: the scraped-data totals are miscalculated. See the image and further details in the linked spreadsheet.
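One way to pin down this kind of miscalculation is an invariant check: every scraped URL should be counted exactly once, so the success and failure counts must sum to the total. A small sketch of that sanity check, using hypothetical counter names since SAW's real statistics fields are not documented:

```python
from dataclasses import dataclass

@dataclass
class ScrapeStats:
    # Hypothetical counters; SAW's actual statistics fields are unknown.
    total_urls: int
    succeeded: int
    failed: int

def stats_consistent(stats: ScrapeStats) -> bool:
    """True only if every URL is accounted for exactly once."""
    return stats.succeeded + stats.failed == stats.total_urls

print(stats_consistent(ScrapeStats(total_urls=20, succeeded=18, failed=2)))  # True
print(stats_consistent(ScrapeStats(total_urls=20, succeeded=18, failed=1)))  # False
```

Running a check like this after each scraping session would reveal exactly which runs produce the inconsistent totals described above.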

Issue 3: I observed that users are unable to delete URLs. If a user mistakenly enters a URL, it is very difficult to remove; no feature or option is provided for this when scraping within a folder.
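Issue 3 is a missing feature rather than a defect. The behavior users need, removing a mistakenly added URL before scraping, could be sketched as follows (class and method names are hypothetical, not part of SAW):

```python
class UrlQueue:
    """Minimal URL list that lets a user remove a mistakenly added URL."""

    def __init__(self) -> None:
        self._urls: list[str] = []

    def add(self, url: str) -> None:
        if url not in self._urls:
            self._urls.append(url)

    def remove(self, url: str) -> bool:
        """Delete a URL before scraping; returns False if it was not present."""
        if url in self._urls:
            self._urls.remove(url)
            return True
        return False

    def urls(self) -> list[str]:
        return list(self._urls)

queue = UrlQueue()
queue.add("https://example.com")
queue.add("https://example.org")   # added by mistake
queue.remove("https://example.org")
print(queue.urls())                # ['https://example.com']
```

Even a simple per-URL delete button backed by logic like this would spare users from re-entering an entire URL list.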

Impact:

For Issue 1, I could not identify the actual error and believe there must be a software fault, which undermines my trust in the data the software displays. The miscalculated scraped-data statistics may stem from the same flaw, suggesting a cascade of errors. The inability to delete a particular URL and re-scrape the URL list also makes the software less user-friendly in some cases.

Bug Report Process:

As noted earlier, click the link below to access the spreadsheet for more information on this bug report process:

BUG REPORT

The stated observations should be revisited, and a user guide should be provided. This would help newcomers adopt the software and understand its seemingly confusing terminology.

Conclusion:

These identified issues significantly impact the software's usability and reliability. The presence of unexpected codes like "-1" during URL scraping creates confusion and undermines trust in the data output.
Inconsistencies in scraped-data statistics further complicate data analysis, potentially leading to flawed insights.

Additionally, the inability to delete URLs post-scraping hampers user flexibility and makes error correction cumbersome. Addressing these issues through thorough testing, documentation improvements, and user-friendly features will be essential in enhancing the software's functionality and user experience.

Samson Ajayi,
HNG 11 Intern, 2024.
