Data compression is the reduction of the number of bits needed to store or transmit information. Compressed data requires less disk space than the original, so more content can fit in the same amount of storage. Different compression algorithms work in different ways. With many of them, only redundant bits are removed, so once the data is uncompressed there is no loss of quality. Others discard bits judged unnecessary, and uncompressing such data later results in lower quality than the original. Compressing and uncompressing content consumes a significant amount of system resources, CPU time in particular, so any web hosting platform that employs real-time compression must have enough processing power to support the feature. One example of how data can be compressed is run-length encoding, which substitutes a binary sequence such as 111111 with 6x1, i.e. it "remembers" how many consecutive 1s or 0s there are instead of storing the whole sequence.
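
The idea behind that example can be shown in a few lines of code. Below is a minimal run-length encoding sketch in Python (the helper names rle_encode and rle_decode are illustrative, not taken from any real compressor); production algorithms are far more elaborate, but the principle of replacing a run of repeated symbols with a count is the same:

    def rle_encode(bits):
        """Collapse runs of repeated characters into (count, char) pairs."""
        runs = []
        for ch in bits:
            if runs and runs[-1][1] == ch:
                runs[-1] = (runs[-1][0] + 1, ch)  # extend the current run
            else:
                runs.append((1, ch))  # start a new run
        return runs

    def rle_decode(runs):
        """Expand (count, char) pairs back into the original string."""
        return "".join(ch * count for count, ch in runs)

    # "111111" collapses to a single run: [(6, '1')], i.e. "6x1"
    assert rle_encode("111111") == [(6, "1")]
    # The round trip is lossless:
    assert rle_decode(rle_encode("1110001")) == "1110001"

Because no information is thrown away, this is a lossless scheme: decoding always reproduces the original input exactly.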

Data Compression in Shared Web Hosting

Our cloud web hosting platform runs on the ZFS file system, which employs a compression algorithm known as LZ4. It can boost the performance of any Internet site hosted in a shared web hosting account with us, since not only does it compress data more efficiently than the algorithms used by other file systems, but it also uncompresses data faster than a hard disk can read it. This comes at the cost of a lot of CPU processing time, which is not a problem for our platform, as it uses clusters of powerful servers working together. Another advantage of LZ4 is that it enables us to generate backups faster and on less disk space, so we can keep several daily backups of your files and databases, and their generation will not affect the performance of the servers. That way, we can always restore any content you may have deleted by accident.
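
As a rough illustration of the lossless round trip that LZ4 performs, here is a minimal sketch using the third-party lz4 package for Python (an assumption for the example; it must be installed separately and is unrelated to how ZFS invokes LZ4 internally):

    import lz4.frame  # third-party package: pip install lz4

    # Redundant data, such as the repeating pattern below, compresses very well.
    original = b"ABCD" * 250_000  # roughly 1 MB of input

    compressed = lz4.frame.compress(original)
    restored = lz4.frame.decompress(compressed)

    assert restored == original  # lossless: the data comes back bit for bit
    print(len(original), "bytes ->", len(compressed), "bytes")

The compressed output is a small fraction of the input here precisely because the data is repetitive; real-world savings depend on how redundant the stored content is.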