Abstract

Yesterday’s “Big Data” is today’s “data.” As technology advances, new difficulties and new solutions emerge. In recent years, with the growth of Internet of Things (IoT) applications, the field of Data Mining has been confronted with the challenge of analyzing and interpreting data streams in real time and at high throughput. This challenge corresponds to the velocity dimension of big data. The rapid advancement of technology has come with increased use of social media, computer networks, cloud computing, and the IoT. Laboratory experiments also generate large quantities of data that must be gathered, handled, and evaluated. This massive amount of data is referred to as “Big Data.” Analysts have seen an upsurge in data containing both valuable and worthless elements, and data warehouses struggle to keep up with the rising volume of collected data when extracting usable information. This article provides an overview of big data architecture and platforms, tools for data stream processing, and examples of implementations. Our project focuses on streaming computing: building a data stream management system to deliver large-scale, cost-effective big data services. This study improves the feasibility of large-scale data processing for distributed, real-time computing, even when systems are overwhelmed.
