Data compression is the reduction of the number of bits that need to be stored or transmitted, and it is particularly important in the web hosting field, because data kept on hard disk drives is often compressed so that it takes up less space. There are various algorithms for compressing data, and their effectiveness depends on the content. Some of them remove only redundant bits, so no information is lost - this is lossless compression. Others discard bits deemed unnecessary, which leads to lower quality once the data is decompressed - this is lossy compression. Compression consumes a lot of processing time, so a web hosting server must be powerful enough to compress and decompress data in real time. A simple example of how binary code can be compressed is to "remember" that there are five consecutive 1s, rather than storing all five 1s individually.
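The "five consecutive 1s" idea is known as run-length encoding. A minimal sketch in Python (an illustration of the general idea, not the algorithm any particular hosting platform uses) might look like this:

```python
# Run-length encoding sketch: instead of storing "11111" as five symbols,
# store the symbol once together with how many times it repeats.

def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Collapse runs of identical characters into (character, count) pairs."""
    runs: list[tuple[str, int]] = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            runs[-1] = (bit, runs[-1][1] + 1)  # extend the current run
        else:
            runs.append((bit, 1))              # start a new run
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Expand (character, count) pairs back into the original string."""
    return "".join(bit * count for bit, count in runs)

data = "1111100011"
encoded = rle_encode(data)
print(encoded)  # [('1', 5), ('0', 3), ('1', 2)]

# Lossless: decoding reproduces the input exactly.
assert rle_decode(encoded) == data
```

Note that run-length encoding only pays off when the data actually contains long runs; real-world algorithms such as LZ4 combine several techniques to handle more varied content.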

Data Compression in Hosting

The ZFS file system that runs on our cloud web hosting platform employs a compression algorithm called LZ4. LZ4 is considerably faster than comparable algorithms while still achieving a good compression ratio, especially for non-binary data such as web content. It can even decompress data faster than that data can be read from a hard disk, which improves the performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so quickly, we are able to generate several backups of all the content stored in the hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work very quickly, the backup generation does not affect the performance of the web hosting servers where your content is stored.
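For context on how this works at the file-system level, LZ4 compression on ZFS is typically enabled per dataset with a single property. The commands below are an illustrative sketch; the pool and dataset name "tank/web" is a hypothetical example, not a path on our platform:

```shell
# Enable LZ4 compression on a ZFS dataset (applies to newly written data).
zfs set compression=lz4 tank/web

# Inspect the setting and the compression ratio achieved so far.
zfs get compression,compressratio tank/web
```

Because compression happens transparently inside the file system, applications and websites read and write files as usual and never see the compressed form.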

Data Compression in Semi-dedicated Servers

If you host your Internet sites in a semi-dedicated server account with our company, you can take advantage of LZ4 - the powerful compression algorithm used by the ZFS file system on which our advanced cloud web hosting platform is built. What distinguishes LZ4 from other algorithms is that it combines a good compression ratio with considerably higher speed, particularly when decompressing website content. It can decompress data faster than uncompressed data can be read from a hard disk drive, so your Internet sites will perform faster. This speed comes at the expense of significant CPU processing time, which is not a problem for our platform, as it consists of a large number of clusters working together. Along with the better performance, you will have multiple daily backups at your disposal, so you can recover any deleted content with just a few clicks. The backups are kept for an entire month, and we can afford to store them because they require far less space than conventional backups.
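The reason compressed backups need so much less space is that web content is highly repetitive, and lossless compression exploits that without discarding anything. A small sketch using zlib from Python's standard library as a stand-in (the platform described above uses LZ4, which trades a somewhat lower ratio for much higher speed; the sample page is made up for illustration):

```python
import zlib

# A repetitive HTML-like payload, typical of web content.
page = b"<html><body>" + b"<p>Hello, visitor!</p>" * 200 + b"</body></html>"

compressed = zlib.compress(page)
print(len(page), "->", len(compressed), "bytes")

# Lossless: decompressing returns the page byte-for-byte,
# which is exactly what a backup restore relies on.
assert zlib.decompress(compressed) == page
```

Running this shows the compressed copy is a small fraction of the original size, which is why daily backups of an entire account remain affordable to keep for a month.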