Data compression is the process of reducing the number of bits needed to store or transmit information. Compressed data occupies substantially less disk space than the original, so more content can fit in the same amount of space. Compression algorithms work in different ways: lossless algorithms remove only redundant bits, so decompressing the data restores it exactly with no loss of quality, while lossy algorithms discard additional bits, so the decompressed data is of lower quality than the original. Compressing and decompressing content consumes a significant amount of system resources, particularly CPU time, so any hosting platform that applies compression in real time needs ample processing power to support this feature. A simple example of how data can be compressed is to replace a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the entire sequence.
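The 6x1 example is a form of run-length encoding, a simple lossless scheme. A minimal sketch in Python (the function names and the `6x1` output format are illustrative, matching the example above):

```python
def rle_encode(bits: str) -> str:
    """Run-length encode a bit string, e.g. '111111' -> '6x1'."""
    if not bits:
        return ""
    parts = []
    count = 1
    for prev, cur in zip(bits, bits[1:]):
        if cur == prev:
            count += 1          # same bit continues the current run
        else:
            parts.append(f"{count}x{prev}")  # run ended; record its length
            count = 1
    parts.append(f"{count}x{bits[-1]}")      # record the final run
    return ",".join(parts)

def rle_decode(encoded: str) -> str:
    """Reverse the encoding exactly: '6x1' -> '111111' (no quality loss)."""
    if not encoded:
        return ""
    out = []
    for part in encoded.split(","):
        count, bit = part.split("x")
        out.append(bit * int(count))
    return "".join(out)

print(rle_encode("111111"))    # 6x1
print(rle_encode("1110001"))   # 3x1,3x0,1x1
```

Because decoding restores the input bit-for-bit, this illustrates the lossless case; note that run-length encoding only saves space when the data actually contains long runs.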