The term data compression refers to reducing the number of bits of information that has to be stored or transmitted. Compression can be lossless or lossy: in the first case only redundant data is removed, so when the data is uncompressed later, the content and its quality are identical to the original; in the second case some of the actual data is discarded, so the quality of the uncompressed content is lower. Different compression algorithms are more effective for different kinds of data. Compressing and uncompressing data usually takes considerable processing time, so the server performing the operation needs sufficient resources to handle the information quickly enough. A simple example of how information can be compressed is to store how many consecutive positions in a binary sequence hold a 1 and how many hold a 0, instead of storing the individual 1s and 0s themselves.
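The example above describes run-length encoding. A minimal sketch in Python (the function names are illustrative, not part of any particular library) might look like this:

```python
def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Store each bit together with the length of its consecutive run."""
    runs: list[tuple[str, int]] = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            # Same bit as the previous position: extend the current run
            runs[-1] = (bit, runs[-1][1] + 1)
        else:
            # Bit changed: start a new run of length 1
            runs.append((bit, 1))
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Expand each (bit, count) pair back into the original sequence."""
    return "".join(bit * count for bit, count in runs)

data = "1111100000001111"
encoded = rle_encode(data)
print(encoded)  # [('1', 5), ('0', 7), ('1', 4)]
assert rle_decode(encoded) == data  # lossless: the original is restored exactly
```

Sixteen stored bits become three (bit, count) pairs, and decoding restores the input exactly, which is what makes this kind of compression lossless.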

Data Compression in Hosting

The ZFS file system, which runs on our cloud web hosting platform, uses a compression algorithm called LZ4. LZ4 is considerably faster and more efficient than most comparable algorithms, particularly for compressing and uncompressing non-binary data such as web content. It can even uncompress data faster than the data can be read from a hard disk drive, which improves the overall performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so quickly, we are able to generate several backup copies of all the content stored in the hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work extremely fast, generating the backups does not affect the performance of the web servers where your content is stored.
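For context, compression on a ZFS dataset is controlled through its `compression` property. The commands below are a generic sketch using a hypothetical pool and dataset name (`tank/www`), not a reflection of any specific platform's configuration:

```shell
# Enable LZ4 compression on a hypothetical dataset
# (requires the ZFS utilities and administrative privileges)
zfs set compression=lz4 tank/www

# Inspect the active algorithm and the compression ratio achieved so far
zfs get compression,compressratio tank/www
```

Because ZFS compresses and decompresses blocks transparently, applications reading and writing files on the dataset need no changes to benefit from it.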