Sometimes we need to compress Python objects (lists, dictionaries, strings, etc.) before saving them to a cache, and decompress them after reading from the cache.
First, be sure compression is actually needed: check whether the data structures are too big to fit in the cache uncompressed. Compression and decompression add CPU overhead, which has to be traded off against the gains made by caching in the first place.
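One way to make that tradeoff concrete is to measure the serialized size and only compress payloads above some threshold. This is a minimal sketch; the threshold value, function names, and one-byte flag scheme are all my own assumptions, not anything standard:

```python
import pickle
import zlib

# Hypothetical threshold: only compress payloads larger than ~1 KiB,
# since small objects rarely repay the CPU cost of compression.
COMPRESS_THRESHOLD = 1024

def serialize_for_cache(obj):
    """Pickle an object; compress the pickle only if it is big enough to matter."""
    raw = pickle.dumps(obj)
    if len(raw) > COMPRESS_THRESHOLD:
        # Prefix with a one-byte flag so the reader knows whether to decompress.
        return b"z" + zlib.compress(raw)
    return b"r" + raw

def deserialize_from_cache(blob):
    """Reverse serialize_for_cache: inspect the flag, then unpickle."""
    flag, payload = blob[:1], blob[1:]
    if flag == b"z":
        payload = zlib.decompress(payload)
    return pickle.loads(payload)
```

A small dict stays uncompressed, while a large, repetitive list gets compressed transparently; callers just round-trip through the two functions.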
If we really need compression, then we probably want to use zlib.
If we are going to use zlib, we might want to experiment with the different compression levels available in the compress method, to balance CPU time against compression ratio. From the zlib documentation:
Compresses the data in string, returning a string containing compressed data. level is an integer from 1 to 9 controlling the level of compression; 1 is fastest and produces the least compression, 9 is slowest and produces the most. The default value is 6. Raises the error exception if any error occurs.
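A quick way to explore the tradeoff is to compress the same payload at a few levels and compare sizes. A sketch, using a made-up repetitive byte string as the test payload:

```python
import zlib

# Repetitive sample data; real cache payloads will compress differently.
data = b"the quick brown fox jumps over the lazy dog " * 500

# Compare the fastest (1), default (6), and most thorough (9) levels.
for level in (1, 6, 9):
    compressed = zlib.compress(data, level)
    print(f"level {level}: {len(data)} -> {len(compressed)} bytes")

# Decompression takes no level argument; any level round-trips the same way.
assert zlib.decompress(zlib.compress(data, 1)) == data
```

Timing each level with timeit on your actual cached objects would complete the picture, since the ratio gains at level 9 often come with a noticeable CPU cost.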