in reply to Data compression by 50% + : is it possible?
Impossible! If your data is truly random.
You need a gnat's under 6.5 bits to represent each of your values, and you are currently using 8 bits. Even if you could pack them exactly (i.e. using fractional bits), the best you could achieve is 6.5/8 = 81.25% of the original size, a reduction of just 18.75%.
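To illustrate that 18.75% ceiling, here is a minimal sketch of fractional-bit packing, assuming (hypothetically, since the original values aren't specified) an alphabet of 90 symbols, so log2(90) ≈ 6.49 bits per value. It treats the whole sequence as one big base-90 number:

```python
import math

BASE = 90  # hypothetical alphabet size: log2(90) ≈ 6.49 bits/value

def pack(values):
    # Treat the sequence as the digits of one big base-90 number,
    # then serialise that number as bytes.
    n = 0
    for v in values:
        n = n * BASE + v
    length = max(1, math.ceil(len(values) * math.log2(BASE) / 8))
    return n.to_bytes(length, "big")

def unpack(data, count):
    # Reverse the process: peel off base-90 digits from the integer.
    n = int.from_bytes(data, "big")
    out = []
    for _ in range(count):
        n, v = divmod(n, BASE)
        out.append(v)
    return list(reversed(out))

values = [i % BASE for i in range(1000)]
packed = pack(values)
# 1000 one-byte values pack into 812 bytes: ~81.25% of the original size,
# and no lossless scheme can do better on this alphabet.
```

That ~81% is a hard floor for truly random values drawn uniformly from such an alphabet, regardless of how clever the packing is.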
If you try a dictionary lookup, you're into the luck of the draw. Some random datasets might contain enough common (sub)sequences to allow Lempel–Ziv or similar to get close to 50%; but for other datasets, the same algorithm will produce barely any reduction, or even an increase.
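That luck of the draw is easy to demonstrate with zlib's DEFLATE (one LZ77-family codec) on random bytes versus repetitive bytes; the inputs here are illustrative, not from the original question:

```python
import os
import zlib

random_data = os.urandom(10000)        # truly random: no exploitable repeats
repetitive = b"abcdef" * 1667          # ~10 kB full of common subsequences

# Random input: the output is at least as big as the input (plus header
# overhead) -- an increase, not a reduction.
print(len(zlib.compress(random_data)))

# Repetitive input: collapses to a few dozen bytes, far beyond 50%.
print(len(zlib.compress(repetitive)))
```

Same algorithm, two datasets of the same size, wildly different results: the reduction comes from structure in the data, not from the compressor.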
See Kolmogorov_complexity for more.
Replies are listed 'Best First'.

- Re^2: Data compression by 50% + : is it possible? by bliako (Abbot) on May 12, 2019 at 00:06 UTC
  - by BrowserUk (Patriarch) on May 12, 2019 at 01:52 UTC
  - by roboticus (Chancellor) on May 12, 2019 at 13:07 UTC
  - by BrowserUk (Patriarch) on May 12, 2019 at 23:33 UTC
  - by baxy77bax (Deacon) on May 13, 2019 at 08:53 UTC
- Re^2: Data compression by 50% + : is it possible? by LanX (Saint) on May 12, 2019 at 02:56 UTC
  - by BrowserUk (Patriarch) on May 12, 2019 at 03:06 UTC
  - by LanX (Saint) on May 12, 2019 at 03:34 UTC