Data compression is the process of encoding information using fewer bits than the original representation. Compressed data takes up considerably less disk space than the original, so much more content can be stored in the same amount of space. There are many compression algorithms that work in different ways. With some of them, only redundant bits are removed, so when the data is uncompressed there is no loss of quality (lossless compression). Others discard less important bits, so uncompressing the data later results in lower quality than the original (lossy compression). Compressing and uncompressing content takes a significant amount of system resources, especially CPU time, so any hosting platform that compresses data in real time needs enough processing power to support this feature. A simple example of how data can be compressed is to replace a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the sequence itself.
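The 6x1 example above is the idea behind run-length encoding, one of the simplest lossless compression techniques. A minimal sketch (illustrative only; real compressors such as LZ4 use far more sophisticated techniques):

```python
def rle_encode(data: str) -> list[tuple[int, str]]:
    """Collapse runs of identical characters into (count, char) pairs."""
    runs = []
    for ch in data:
        if runs and runs[-1][1] == ch:
            runs[-1] = (runs[-1][0] + 1, ch)  # extend the current run
        else:
            runs.append((1, ch))              # start a new run
    return runs

def rle_decode(runs: list[tuple[int, str]]) -> str:
    """Expand (count, char) pairs back into the original string."""
    return "".join(ch * count for count, ch in runs)

encoded = rle_encode("111111000011")
print(encoded)  # [(6, '1'), (4, '0'), (2, '1')]

# Lossless: decoding restores the input exactly.
print(rle_decode(encoded) == "111111000011")  # True
```

Note that run-length encoding only pays off on data with long repeated runs; on data with few repetitions it can even make the output larger, which is why practical algorithms combine several techniques.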

Data Compression in Cloud Web Hosting

The ZFS file system which runs on our cloud Internet hosting platform employs a compression algorithm called LZ4. It is significantly faster and more efficient than other widely used algorithms, particularly for compressing and uncompressing non-binary data, i.e. web content. LZ4 can even uncompress data faster than it can be read from a hard drive, which improves the performance of Internet sites hosted on ZFS-based platforms. Because the algorithm compresses data very well and does so quickly, we are able to generate several backup copies of all the content stored in the cloud web hosting accounts on our servers on a daily basis. Both your content and its backups take up less space, and since both ZFS and LZ4 work very fast, the backup generation does not affect the performance of the web servers where your content is kept.
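The lossless round trip that LZ4 performs can be sketched in a few lines. LZ4 itself is not in Python's standard library (a third-party package would be needed), so this sketch uses the built-in zlib module as a stand-in lossless compressor; the principle is the same: the data is restored byte for byte while the stored form takes fewer bytes. The sample HTML string is a made-up example, chosen because web content is full of repeated tags and attributes and therefore compresses very well.

```python
import zlib

# Hypothetical sample of repetitive web content: markup repeats constantly,
# which is exactly what compression algorithms exploit.
html = b"<ul>" + b"<li class='item'>entry</li>" * 200 + b"</ul>"

compressed = zlib.compress(html)

# Lossless: decompression restores the original data exactly.
assert zlib.decompress(compressed) == html

print(f"original:   {len(html)} bytes")
print(f"compressed: {len(compressed)} bytes")
```

On highly repetitive input like this, the compressed form is a small fraction of the original size, which is why keeping both live content and multiple daily backups compressed saves so much space.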