The term data compression describes reducing the number of bits of information that has to be stored or transmitted. This can be achieved with or without loss of data, which means that what is removed during compression can be either redundant data or unnecessary data. When the data is uncompressed afterwards, in the first case the content and the quality will be identical to the original, while in the second case the quality will be lower. Different compression algorithms suit different kinds of information. Compressing and uncompressing data usually takes a lot of processing time, so the server executing the action should have plenty of resources to process your data quickly enough. One example of how information can be compressed is to store how many consecutive positions in the binary code contain 1 and how many contain 0, rather than storing the individual 1s and 0s.
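The bit-counting example above is known as run-length encoding (RLE). A minimal sketch, with illustrative function names not taken from the text, might look like this:

```python
# Minimal sketch of run-length encoding (RLE): store the length of each
# run of identical bits instead of the bits themselves.
# Function and variable names here are illustrative assumptions.

def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Turn a bit string into (bit, run_length) pairs."""
    runs = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            # Extend the current run of the same bit.
            runs[-1] = (bit, runs[-1][1] + 1)
        else:
            # A different bit starts a new run.
            runs.append((bit, 1))
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Rebuild the original bit string from the runs."""
    return "".join(bit * count for bit, count in runs)

data = "1111111100000011"
encoded = rle_encode(data)
print(encoded)  # [('1', 8), ('0', 6), ('1', 2)]
assert rle_decode(encoded) == data  # lossless: the round trip is exact
```

Because the round trip restores the input exactly, this is a lossless scheme, matching the first case described above.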

Data Compression in Cloud Hosting

The compression algorithm employed by the ZFS file system that runs on our cloud hosting platform is called LZ4. It can boost the performance of any site hosted in a cloud hosting account on our end, as it not only compresses information significantly better than the algorithms employed by alternative file systems, but also uncompresses data at speeds higher than the hard drive reading speeds. This requires a great deal of CPU processing time, which is not a problem for our platform, as it uses clusters of powerful servers working together. A further advantage of LZ4 is that it allows us to make backups much faster and on less disk space, so we can keep several daily backups of your databases and files, and their generation will not affect the performance of the servers. In this way, we can always recover any content that you may have erased by mistake.
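For readers who administer their own ZFS systems, LZ4 compression is typically enabled per dataset with the standard `zfs` tooling. A brief sketch follows; the pool and dataset names are illustrative assumptions, not details from our platform:

```shell
# Enable LZ4 compression on a ZFS dataset (dataset name is illustrative).
zfs set compression=lz4 tank/websites

# Inspect the configured algorithm and the achieved compression ratio.
zfs get compression tank/websites
zfs get compressratio tank/websites
```

The `compressratio` property reports how much space compression is actually saving, which is useful for judging whether a given workload benefits from it.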