Research Article, May 2020

Data Wrangling on Crawled Data

Abstract

Data wrangling is the process of cleaning, structuring and enriching raw data into a desired format for better decision making in less time. It is mainly used to improve data quality. A crawler is a program that visits websites and reads their pages and other information in order to create entries for a search engine index. By cleaning, structuring and unifying cluttered and complex data into uniform sets, data wrangling ensures that crawled data becomes easy to access and analyze, so that no disorganized stack of data remains during analysis.
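To make the pipeline the abstract describes concrete, here is a minimal sketch of wrangling crawler output in Python. It is not code from the paper: the record fields (url, title, date), the cleaning rules, and the two date layouts are all illustrative assumptions. The sketch shows the three steps named above: cleaning (stripping HTML remnants and whitespace), structuring (normalizing dates to one format), and unifying (de-duplicating records by URL).

```python
import re
from html import unescape

# Hypothetical raw records as a crawler might emit them; values are
# illustrative only, chosen to show typical clutter (HTML entities,
# tags, inconsistent dates, duplicate URLs).
raw_records = [
    {"url": "https://example.com/a", "title": " Data &amp; Quality  ", "date": "2020/05/01"},
    {"url": "https://example.com/a", "title": "Data & Quality", "date": "01-05-2020"},
    {"url": "https://example.com/b", "title": "<b>Crawling</b>", "date": None},
]

def clean_title(title):
    """Cleaning: strip HTML tags, decode entities, collapse whitespace."""
    text = re.sub(r"<[^>]+>", "", unescape(title or ""))
    return re.sub(r"\s+", " ", text).strip()

def normalize_date(date):
    """Structuring: unify the two assumed date layouts into YYYY-MM-DD."""
    if not date:
        return None
    m = re.fullmatch(r"(\d{4})/(\d{2})/(\d{2})", date)   # e.g. 2020/05/01
    if m:
        return f"{m.group(1)}-{m.group(2)}-{m.group(3)}"
    m = re.fullmatch(r"(\d{2})-(\d{2})-(\d{4})", date)   # e.g. 01-05-2020
    if m:
        return f"{m.group(3)}-{m.group(2)}-{m.group(1)}"
    return None

# Unifying: one clean record per normalized URL (first occurrence wins).
wrangled = {}
for rec in raw_records:
    url = rec["url"].rstrip("/").lower()
    if url not in wrangled:
        wrangled[url] = {"title": clean_title(rec["title"]),
                         "date": normalize_date(rec["date"])}

for url, fields in wrangled.items():
    print(url, fields)
```

Running the sketch prints two records instead of three: the duplicate crawl of example.com/a is merged, its title is decoded, and both date styles come out as 2020-05-01.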

Keywords

crawled data, data quality, data cleaning, structuring
Details
Volume 07
Issue 05
Pages 1022-1025
ISSN 2395-0056