in reply to Re: Data compression by 50% + : is it possible?
in thread Data compression by 50% + : is it possible?

sorry for the off-thread updates :)

the distribution of values is exactly like the simulated one. so every time the automaton spits out a number, it is larger than the previous one, never consecutive (i.e. never larger by just +1), and never larger than 90.
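just to illustrate those constraints, here is a quick sketch (the function name and the example rows are made up, not your generator's output):

```python
def valid_row(values, cap=90):
    # Constraints described above: each value strictly larger than the
    # previous one, never by exactly +1, and never above the cap.
    for prev, cur in zip(values, values[1:]):
        if cur <= prev or cur - prev == 1:
            return False
    return all(v <= cap for v in values)

print(valid_row([3, 7, 20, 90]))  # gaps of 4, 13, 70 -> True
print(valid_row([3, 4, 20]))      # 4 follows 3 consecutively -> False
print(valid_row([3, 7, 95]))      # 95 exceeds 90 -> False
```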

the readable output is what it spits out, but binary is ok. i just need to store it.

Frequency table looks ok, but i'll check it one more time and post if i find irregularities. In any case, i understand what you and roboticus did. Thank you for the input!! :)

PS: yes, the order i was talking about was within rows


Re^3: Data compression by 50% + : is it possible?
by LanX (Saint) on May 13, 2019 at 15:36 UTC
    Great! :)

    FWIW Compress::Huffman looks promising.

    It seems to take a frequency table as input (actually probabilities, so divide my table by 10000), store it together with the encoded bit string, and decode it again.
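    Compress::Huffman is a Perl module, so the following is not its API — just a language-neutral toy sketch (in Python, with made-up counts) of what Huffman coding does with such a table: build a prefix code from the frequencies, encode symbols to a bit string, decode them again.

```python
import heapq

def huffman_code(freq):
    # Build a prefix code from a symbol -> frequency table.
    # Probabilities work identically; only the scale differs.
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(sorted(freq.items()))]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)  # lightest subtree
        w2, _, c2 = heapq.heappop(heap)  # second lightest
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

def huffman_decode(bits, code):
    # Walk the bit string, emitting a symbol whenever a full code matches.
    inv = {v: s for s, v in code.items()}
    out, cur = [], ""
    for b in bits:
        cur += b
        if cur in inv:
            out.append(inv[cur])
            cur = ""
    return out

freq = {"2": 5000, "7": 3000, "46": 1500, "88": 500}  # made-up counts
code = huffman_code(freq)
bits = "".join(code[s] for s in ["2", "7", "2", "88"])
print(huffman_decode(bits, code))  # -> ['2', '7', '2', '88']
```

    The most frequent symbol gets the shortest code, which is where the saving comes from; scaling all counts by the same factor (e.g. dividing by 10000 to get probabilities) leaves the resulting code unchanged.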

    On a side note:

    You might want to experiment with changed weights, that is, multiply each frequency by the length of its key (e.g. the frequency of 246 by 3) and recalculate the proportions.

    This could give you better compression compared to the old format, because there this entry costs 3 characters.

    (I'm not sure here, it's mind boggling)
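    The reweighting idea above can be sketched like this (the counts are made up, and key length is used as a stand-in for the number of characters the entry costs in the old format):

```python
# Made-up frequency table: symbol -> count out of 10000.
freq = {"2": 6000, "46": 2500, "246": 1500}

# Weight each count by how many characters the key costs in the
# old textual format, then renormalize to probabilities.
weighted = {s: c * len(s) for s, c in freq.items()}
total = sum(weighted.values())
probs = {s: w / total for s, w in weighted.items()}

print(weighted)  # {'2': 6000, '46': 5000, '246': 4500}
print(probs["246"] > 1500 / 10000)  # longer keys gain weight -> True
```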

    Cheers Rolf
    (addicted to the Perl Programming Language :)
    Wikisyntax for the Monastery
    Football
    Perl is like chess, only without the dice