
compress

Maximum Entropy Compressor - Using Probability Hashes w/Hash-Chaining to compress "impossible" data sets.

Copyright © 2021 by Brett Kuntz. All rights reserved.

See the /full/ directory for the latest test iteration that compresses and decompresses. Both files are stand-alone; you do not need to compile both for either one to work.

Compression takes about 35 minutes on an EC2 c5.24xlarge machine; decompression takes place in real time.

Neither the compressor nor the decompressor requires more than a few kilobytes of RAM to operate. They are best implemented on an ASIC or GPU.