
LOSSLESS DATA COMPRESSION

AND DECOMPRESSION USING


HUFFMAN ALGORITHM

DESIGNED BY

ISHOLA CHRISTIANAH TOLUWALOPE


HND/19/COM/FT/461
Introduction
Data compression is a process by which a file (text, audio, or video) is transformed into another (compressed) file, such that the original file can be fully recovered from the compressed file without any loss of information. This is useful when one wants to save storage space: for example, before storing a 4 MB file, it may be preferable to first compress it to a smaller size. Compressed files are also exchanged much more easily over the internet, since they upload and download faster. We require the ability to reconstitute the original file from the compressed version at any time. Data compression is thus a method of encoding that allows a substantial reduction in the total number of bits needed to store or transmit a file. A compression program converts data from an easy-to-use format into one optimized for compactness; likewise, a decompression program returns the information to its original form.
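To make the idea concrete, the sketch below is a minimal illustration (not the project's actual code) of lossless Huffman compression and decompression: frequent symbols receive short bit codes, rare symbols receive longer ones, and decoding the bitstream recovers the original text exactly. The function names and the code-table representation are assumptions chosen for this sketch.

```python
import heapq
from collections import Counter

def build_codes(text):
    """Build a Huffman code table {symbol: bitstring} for the given text."""
    freq = Counter(text)
    # Each heap entry: (frequency, unique tiebreaker, {symbol: code-so-far})
    heap = [(f, i, {sym: ""}) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:  # single-symbol input: assign a 1-bit code
        return {sym: "0" for sym in heap[0][2]}
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)  # two least-frequent subtrees
        f2, _, c2 = heapq.heappop(heap)
        # Prefix '0' to one subtree's codes and '1' to the other's
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

def compress(text):
    codes = build_codes(text)
    return "".join(codes[ch] for ch in text), codes

def decompress(bits, codes):
    """Recover the original text: Huffman codes are prefix-free, so we
    can read bits until the buffer matches exactly one code."""
    inverse = {code: sym for sym, code in codes.items()}
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in inverse:
            out.append(inverse[buf])
            buf = ""
    return "".join(out)
```

For "abracadabra" this yields a 23-bit stream instead of the 88 bits of plain 8-bit ASCII, and `decompress` reproduces the input exactly, demonstrating the lossless property described above.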
STATEMENT OF THE PROBLEM

The file transfer application developed earlier and used by the organization still allows information redundancy in stored files. The proposed system is introduced to eradicate the lapses of the earlier version by applying data compression and decompression using the Huffman algorithm, thereby increasing effective data density.
AIMS AND OBJECTIVES
The main aim of this project is to develop an application that caters for data compression and decompression using the Huffman algorithm.
The project has the following objectives:
 To create an application that reduces data size using the Huffman algorithm and related techniques.
 To create binary representations of data that require less storage space.
 To provide an essential tool that decreases storage size and increases the speed of data transmission through communication channels.
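The storage-reduction objective can be quantified with a small worked example. The symbol frequencies and code lengths below are hypothetical values chosen for illustration: a fixed-length code for a 5-symbol alphabet needs 3 bits per symbol, while a Huffman code matched to the frequencies uses fewer bits overall.

```python
# Hypothetical symbol frequencies in a small file (assumption for illustration)
freq = {"a": 5, "b": 2, "r": 2, "c": 1, "d": 1}
total = sum(freq.values())  # 11 symbols

# Fixed-length encoding of a 5-symbol alphabet needs ceil(log2(5)) = 3 bits/symbol
fixed_bits = total * 3

# Huffman code lengths that a frequency-sorted tree would assign here
code_len = {"a": 1, "b": 3, "r": 3, "c": 3, "d": 3}
huff_bits = sum(freq[s] * code_len[s] for s in freq)

ratio = fixed_bits / huff_bits
print(fixed_bits, huff_bits, round(ratio, 2))  # 33 bits vs 23 bits, ratio 1.43
```

Even in this tiny example the variable-length representation saves roughly 30% of the bits, which is the "less storage space" the second objective refers to.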
LITERATURE REVIEW
Author: Rachit (2016)
Title: An Improved Image Compression Technique Using Huffman Coding and FFT
Aim: To compress digital images using Huffman coding and the Fast Fourier Transform, and to compare the results of both techniques.
Methodology: A new method of splitting (dividing) an input image into equal rows and columns, then summing all the individually compressed sub-images at the final stage.
Limitation: The MATLAB tool is required for the calculation of parameters. The techniques are compared with respect to mean square error (MSE), peak signal-to-noise ratio (PSNR), compression ratio (CR) and bits per pixel (BPP) for input images of different sizes.

Author: Lele
Title: A New Highly Efficient Algorithm for Lossless Binary Image Compression
Aim: The algorithm consists of two modules, Direct Redundancy Exploitation and Improved Arithmetic Coding, and is referred to as the Two Module Based Algorithm (TMBA).
Methodology: A new Static Binary Tree Model is introduced to efficiently model the Reduced Blocks produced by the Direct Redundancy Exploitation module.
Result: The Two Module Based Algorithm demonstrated excellent compression performance; simulation results showed that the proposed algorithm clearly outperformed the G3 and G4 coding schemes.

Author: Vikash & Sharma (2017)
Title: Lossless Image Compression through Huffman Coding Technique and Its Application in Image Processing using MATLAB
Aim: Images include information about the human body used for purposes such as medical examination and security; image compression is applied in areas such as profiling information and transmission systems.
Methodology: Differential pulse code modulation for image compression with a Huffman encoder, one of the latest approaches, providing a good compression ratio, peak signal-to-noise ratio and minimum mean square error.
Limitation: JPEG, JPEG-LS and JPEG2000 are a few well-known methods for lossless compression.
RESEARCH METHODOLOGY
Research methodology has many dimensions, and its scope is wider than that of a research method. A methodology comprises the underlying principles and rules that govern a system, while a method is a systematic procedure for a set of activities; a methodology therefore encompasses the methods used within a study. It is this broader view that the researcher adopted in undertaking this research.

SYSTEM FLOW DIAGRAM


Result and Discussion

Extracted Folder: the extraction page, which displays all decompressed files.

Zip Upload: allows the user to upload a ZIP file.

Decompress: the decompression module, where an uploaded ZIP file is decompressed.

Login: gives an authorized user access to the system.

Register: lets users fill in their information to gain access to the system.
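The Zip Upload and Decompress modules described above can be sketched with Python's standard `zipfile` library. This is only an illustrative outline, not the project's actual implementation; the function name, parameters, and output directory layout are assumptions.

```python
import zipfile
from pathlib import Path

def decompress_zip(zip_path, out_dir):
    """Extract every file in an uploaded ZIP archive into out_dir and
    return the list of member names (as shown on the Extracted Folder page)."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(out)       # lossless: files come back byte-for-byte
        return zf.namelist()
```

A real upload handler would additionally validate the archive (e.g. with `ZipFile.testzip()`) and restrict extraction paths before serving the result back to the user.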
CONCLUSION
This dissertation has achieved the goal set for it by designing a file management system with an efficient method to sort, merge and extract records using the Huffman algorithm. This algorithm helps in sorting different categories of information in the database, then merging several fields into one single record to generate the desired report in the required format.
This research has provided an overview of data compression and decompression methods of general utility. The algorithms have been evaluated in terms of the amount of compression they provide, algorithm efficiency, and susceptibility to error. While algorithm efficiency and susceptibility to error are relatively independent of the characteristics of the source ensemble, the amount of compression achieved depends to a great extent upon the characteristics of the source.
THANK YOU FOR
WATCHING MY
PRESENTATION SLIDES
