Data compression is the process of encoding information using fewer bits, so that it takes up less space when stored or transmitted. The compressed data needs much less disk space than the original, which means that extra content can be stored in the same amount of space. Different compression algorithms work in different ways: many remove only redundant bits, so there is no loss of quality once the data is uncompressed (lossless compression), while others discard bits considered unnecessary, so the uncompressed data has lower quality than the original (lossy compression). Compressing and uncompressing content requires considerable system resources, particularly CPU processing time, so any hosting platform that compresses data in real time must have enough processing power to support the feature. A simple example of compression is replacing a binary sequence such as 111111 with 6x1, i.e. storing the number of consecutive 1s or 0s instead of the whole sequence.
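The 111111 → 6x1 example above is known as run-length encoding. A minimal sketch in Python (the function names and the "6x1" token format are illustrative, not part of any particular library) could look like this:

```python
from itertools import groupby

def rle_encode(s: str) -> str:
    # Replace each run of identical characters with "<count>x<char>",
    # e.g. "111111" becomes "6x1" and "1110001" becomes "3x1,3x0,1x1".
    return ",".join(f"{len(list(group))}x{ch}" for ch, group in groupby(s))

def rle_decode(encoded: str) -> str:
    # Reverse the encoding: expand every "<count>x<char>" token
    # back into a run of identical characters.
    return "".join(ch * int(count)
                   for count, ch in (token.split("x") for token in encoded.split(",")))

print(rle_encode("111111"))                     # 6x1
print(rle_decode(rle_encode("1110001")))        # 1110001
```

Because decoding reproduces the input exactly, this is a lossless scheme; it pays off only when the data contains long runs of repeated symbols, which is why real-world algorithms combine it with other techniques.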

Data Compression in Web Hosting

The ZFS file system that runs on our cloud hosting platform uses a compression algorithm called LZ4. It can improve the performance of any site hosted in a web hosting account on our end, since it not only compresses data more effectively than the algorithms used by other file systems, but also uncompresses data faster than a hard drive can read it. This comes at the cost of considerable CPU processing time, which is not a problem for our platform, as it runs on clusters of powerful servers working together. Another advantage of LZ4 is that it allows us to generate backups faster and store them in less disk space, so we keep several daily backups of your files and databases without their generation affecting server performance. That way, we can always restore any content that you may have deleted by accident.
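The trade-off described above, spending CPU time on compression to save disk space while still restoring the data byte for byte, can be illustrated with Python's standard library. LZ4 itself is not in the standard library, so the sketch below uses zlib purely as a stand-in to show the concept of transparent lossless compression:

```python
import zlib

# A highly repetitive payload, 16 KiB, standing in for typical site files.
data = b"1010" * 4096

# Compression costs CPU time but shrinks the stored size considerably
# when the input is repetitive.
compressed = zlib.compress(data, level=6)

# Decompression restores the original content exactly, byte for byte,
# which is what makes the scheme lossless.
restored = zlib.decompress(compressed)

assert restored == data
print(f"original: {len(data)} bytes, compressed: {len(compressed)} bytes")
```

Running this shows the compressed buffer is a small fraction of the original, which is the same effect that lets a file system store more data, and generate smaller backups, on the same disks.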