PREFERRED DOMAIN
LITERATURE SURVEY
PROBLEM DEFINITION
WORK OBJECTIVES
PROPOSED PROJECT TITLE

PROPOSED PROJECT TITLE: "Data Compression"

PREFERRED DOMAIN

The total amount of data held by all the big online storage and service companies (Google, Amazon, Microsoft and Facebook) comes to about 1.2 million terabytes. The amount of data uploaded to the Internet in a single second is a staggering 24,000 gigabytes.
Compression means reducing the physical size of data so that it occupies less storage space and memory. Compressed files are therefore easier to transfer, because there is a sizable reduction in the size of the data to be transferred.
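The size reduction described above can be illustrated with a short sketch using Python's standard zlib library; the library choice and the sample data are illustrative assumptions, not part of the project itself:

```python
import zlib

# Illustrative sample: highly repetitive data compresses well.
original = b"sensor_reading=23.5;" * 1000   # 20,000 bytes
compressed = zlib.compress(original)

print(len(original))     # size before compression
print(len(compressed))   # far fewer bytes after compression
assert len(compressed) < len(original)
```

Repetitive data like this shrinks dramatically; data with little redundancy compresses far less.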
LITERATURE SURVEY

1. Paper Title: Optimized RTL design and implementation of LZW algorithm for high bandwidth applications
   Discussion: Proposes a high-speed, low-complexity Register Transfer Level (RTL) design.
   Remarks: Implementation on an FPGA architecture is complex, and power consumption is high.

2. Paper Title: Improving Internet of Things communications through compression and classification
   Discussion: Leverages the advantages offered by classification methods to optimize communications, and enhances information transmission to simplify data communication.
   Remarks: Capable of appropriately handling any given data while at the same time increasing classification efficiency.

3. Paper Title: A data compression algorithm for sea route monitoring with a wireless sensor network
   Discussion: The on-line data are decomposed with the historical principal component coefficients, and then compressed according to the optimal codebook.
   Remarks: Depends on the principal components of the sensed data.

PROBLEM DEFINITION

"Proposed a two-layered compression framework that reduces the amount of data while maintaining a minimum error rate as well as avoiding bandwidth wastage."

WORK OBJECTIVES

To develop an algorithm for the fog node to collect data from different sources and check the authenticity of each device.
To compress the data using a lossy compression technique.

The Cloud takes the file from the Fog and regenerates the data values from the file, after the frequencies of all the different values have been calculated.

Decompression: the Cloud gets the data file according to the request, which contains the data values and their corresponding frequencies.

A two-layered compression framework to mitigate data storage in the fog and cloud nodes.
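The fog-to-cloud pipeline in the objectives above can be sketched as follows. This is a minimal sketch, assuming the lossy step is a simple quantization and the compressed file is a table of values and their frequencies; the function names, the quantization step size, and the sample readings are our illustrative assumptions, not the project's actual design:

```python
from collections import Counter

def fog_compress(readings, step=0.5):
    """Fog node: lossy compression. Round each reading to the nearest
    multiple of `step` (the lossy part, an assumed scheme), then keep
    only the distinct values with their frequencies."""
    quantized = [round(r / step) * step for r in readings]
    return Counter(quantized)  # acts as the {value: frequency} file

def cloud_decompress(freq_table):
    """Cloud node: regenerate the data values from the file of values
    and their corresponding frequencies."""
    return [value for value, count in freq_table.items()
            for _ in range(count)]

readings = [23.4, 23.6, 23.5, 24.1, 24.0, 23.4]
table = fog_compress(readings)       # small table instead of raw stream
restored = cloud_decompress(table)   # approximated values, same count
assert len(restored) == len(readings)
```

The fog node sends only the small frequency table instead of the raw stream, which is where the bandwidth saving comes from; the cloud can regenerate the full sequence of (approximated) values, at the cost of the error introduced by quantization.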