An Improved Learning Based Disaster Event Using Big Data Analytics

Research Article
Saranya M and Prema A
DOI: http://dx.doi.org/10.24327/ijrsr.2017.0806.0408
Subject: Science
Keywords: Big Data, Hadoop, ETL, MapReduce, HDFS
Abstract: 

Big data is a term for datasets so large or complex that traditional data processing applications are inadequate to deal with them. It is not merely data; it has become a subject in its own right, involving a range of tools, techniques, and frameworks. The challenge of extracting value from big data parallels in many ways the age-old problem of distilling business intelligence from transactional data. At the heart of this challenge is the process used to extract data from multiple sources, transform it to fit analytical needs, and load it into a data warehouse for subsequent analysis, a process known as "Extract, Transform & Load" (ETL). The Hadoop Distributed File System (HDFS) is the storage component of Hadoop and is used here to implement the disaster management process. This paper presents the MapReduce method required to implement big data analysis on top of HDFS.
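The abstract does not include the authors' implementation, so as an illustrative sketch only, the following Hadoop MapReduce job shows the general pattern the abstract describes: a mapper and reducer operating over records stored on HDFS. The class name DisasterEventCount and the event keywords ("flood", "earthquake", "cyclone") are hypothetical placeholders, not taken from the paper.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class DisasterEventCount {

  // Mapper: emits (keyword, 1) for each disaster-related keyword found in a record.
  // The keyword list here is a hypothetical example.
  public static class EventMapper extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString().toLowerCase());
      while (itr.hasMoreTokens()) {
        String token = itr.nextToken();
        if (token.equals("flood") || token.equals("earthquake") || token.equals("cyclone")) {
          word.set(token);
          context.write(word, ONE);
        }
      }
    }
  }

  // Reducer: sums the per-keyword counts produced by the mappers.
  public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "disaster event count");
    job.setJarByClass(DisasterEventCount.class);
    job.setMapperClass(EventMapper.class);
    job.setCombinerClass(SumReducer.class);
    job.setReducerClass(SumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // input directory on HDFS
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // output directory on HDFS
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

In a typical deployment this would be packaged as a JAR and submitted with "hadoop jar", with the input and output paths pointing at HDFS directories; the reducer also doubles as a combiner to cut shuffle traffic, a standard MapReduce optimization.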