Optimize decoding tree to save more memory #2

Open
opened 2020-12-31 08:05:22 -05:00 by tcsullivan · 1 comment
tcsullivan commented 2020-12-31 08:05:22 -05:00 (Migrated from github.com)

See [here](https://www.reddit.com/r/cpp/comments/kn1wpe/compiletime_string_compression_using_huffman/ghjlj02/). Most importantly, we need to allow for larger decoding trees. Secondly though (and maybe this should be a separate issue), the decoding tree could likely be refined/redesigned to either take up less memory or work more efficiently.
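One direction for shrinking the tree is a flat index-based node array instead of pointer-linked nodes: indices can be 16-bit (or smaller), the nodes pack contiguously, and the array size is not tied to a fixed template depth. The sketch below is only illustrative, not the repository's actual layout; `Node`, `decode`, and the toy codebook (`a=0`, `b=10`, `c=11`) are all assumptions.

```cpp
#include <array>
#include <cassert>
#include <cstdint>
#include <string>
#include <string_view>

// Flat-array Huffman decode tree: each node stores child indices rather
// than pointers, so a node is 5 bytes here instead of two pointers + char.
struct Node {
    std::int16_t left;   // index of the 0-branch child, -1 at a leaf
    std::int16_t right;  // index of the 1-branch child, -1 at a leaf
    char ch;             // decoded symbol, meaningful only at leaves
};

// Tree for the assumed codebook a=0, b=10, c=11.
constexpr std::array<Node, 5> tree{{
    {1, 2, 0},      // 0: root
    {-1, -1, 'a'},  // 1: leaf 'a'
    {3, 4, 0},      // 2: internal node for the '1' prefix
    {-1, -1, 'b'},  // 3: leaf 'b'
    {-1, -1, 'c'}   // 4: leaf 'c'
}};

// Walk the tree bit by bit, emitting a symbol and restarting at the
// root each time a leaf is reached.
std::string decode(std::string_view bits)
{
    std::string out;
    int node = 0;
    for (char b : bits) {
        node = (b == '0') ? tree[node].left : tree[node].right;
        if (tree[node].left == -1) {
            out += tree[node].ch;
            node = 0;
        }
    }
    return out;
}
```

With 16-bit indices the same structure scales to far larger trees than a depth-limited template encoding, which is the "allow for larger decoding trees" half of the issue.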
tcsullivan commented 2020-12-31 09:54:43 -05:00 (Migrated from github.com)

Forgot that the input data gets left uncompressed if compression doesn't save space. The bug was simply in `decoder::end` returning a byte beyond what it should have for the uncompressed scenario. I'll commit the fix for the bug, but will still leave this issue up for the possibility of enhancing the decoder tree.
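The off-by-one described above can be sketched like this; the `Decoder` type and its members are hypothetical stand-ins for the real class, shown only to illustrate the uncompressed fallback path.

```cpp
#include <cassert>
#include <string>
#include <string_view>

// Hypothetical decoder holding a payload that was stored verbatim because
// compression would not have saved space. In that case end() must be
// begin() + size: the buggy version effectively returned one byte past that.
struct Decoder {
    std::string bytes;  // raw, uncompressed payload

    const char* begin() const { return bytes.data(); }
    const char* end()   const { return bytes.data() + bytes.size(); }
};

// The [begin, end) range now covers exactly the stored bytes.
std::string_view contents(const Decoder& d)
{
    return std::string_view(d.begin(), static_cast<std::size_t>(d.end() - d.begin()));
}
```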