A while back (I’m going to WAG it at 5 years, but you know how these things go), there was a big ruckus about a radical new form of data compression that was going to “revolutionize computing.”
It was something invented by an amateur, and it was being kept secret while it was being developed for commercial use. But it had supposedly been seen by an expert who was willing to vouch for its potential.
IIRC, it was supposed to compress virtually any data to something like 1/10 of its original size, and was going to immediately allow us to increase transmission bandwidth, storage capacity, etc. by an order of magnitude.
Anyone know what happened to this “revolution?”
You may be talking about fractal compression. Around the timeframe you mention, chaos theory was the new bandwagon of the mathematical/computer world (thanks to Jeff Goldblum’s character in Jurassic Park). The theory was basically that you could compress things without needing to store the actual data itself anywhere; you would simply store a much smaller “reminder” that would allow the computer to extrapolate back to the original data. I heard a lot of theories but saw no implementations, so I guess it just was not practical to implement, and it has been pushed to the back burner along with most of the other great things chaos theory was going to do for us.
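To make that “reminder” idea a little more concrete, here’s a toy sketch (my own illustration, not anything from an actual fractal codec): an iterated function system where the whole picture is stored as nothing but three sets of affine-map coefficients, and the decoder regenerates it by iterating them, the classic chaos-game construction of the Sierpinski triangle. Real fractal compression also has to go the other way (find the maps from an arbitrary image), which is the hard, compute-heavy part.

import random

# Toy illustration of the IFS idea: the "compressed" data is just three
# affine maps, and the decoder rebuilds a detailed picture by iterating them.
# Each map is (a, b, c, d, e, f), meaning (x, y) -> (a*x + b*y + e, c*x + d*y + f).
MAPS = [
    (0.5, 0.0, 0.0, 0.5, 0.00, 0.0),
    (0.5, 0.0, 0.0, 0.5, 0.50, 0.0),
    (0.5, 0.0, 0.0, 0.5, 0.25, 0.5),
]

def decode(n_points=20000, size=60):
    # Start anywhere in the unit square; the orbit gets pulled onto the attractor.
    grid = [[' '] * size for _ in range(size)]
    x, y = 0.0, 0.0
    for i in range(n_points):
        a, b, c, d, e, f = random.choice(MAPS)
        x, y = a * x + b * y + e, c * x + d * y + f
        if i > 20:  # skip the first few points while the orbit settles down
            grid[int((1.0 - y) * (size - 1))][int(x * (size - 1))] = '#'
    return '\n'.join(''.join(row) for row in grid)

if __name__ == '__main__':
    print(decode())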
I doubt that any compression scheme could claim a 10X ratio on every data set, but there are many focused solutions that do much better than that: MP3 for music, MPEG for video, JPEG for still images, Lempel-Ziv for text. I’m not sure which one, specifically, you’re referring to, but I do remember that about 5-7 years ago fractal compression was the up-and-comer. It boasted compression ratios of 1000X or more for certain classes of data - it definitely was not general purpose, and the last I heard it was so compute-intensive that they were considering designing special hardware to make it economical.
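For what it’s worth, the reason nothing can hit 10X on every data set is simple counting: a lossless compressor has to map distinct inputs to distinct outputs, and there aren’t enough shorter strings to go around. A quick back-of-the-envelope sketch (the 16-bit figure is purely illustrative):

# Pigeonhole sketch: count how many "compressed" strings are even available.

def strings_shorter_than(n_bits):
    # Number of bit strings of length 0 through n_bits - 1.
    return sum(2 ** k for k in range(n_bits))  # equals 2**n_bits - 1

n = 16
inputs = 2 ** n                      # every possible 16-bit file
outputs = strings_shorter_than(n)    # every string strictly shorter than 16 bits
print(inputs, "possible inputs, only", outputs, "shorter strings to map them to")
# -> 65536 possible inputs, only 65535 shorter strings to map them to
# So at least one input cannot shrink at all; demanding 10X compression of
# everything is hopeless, since only 2**(n // 10 + 1) - 1 strings are that short.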
I did a quick patent search and the only general purpose compression schemes I found that match your general time frame are fractal compression by Michael F. Barnsley and Alan D. Sloan (1995), a binary encoding scheme by Mitsubishi (1994), and a recursive compression scheme by David C. James (1996).
The last one may be the one you’re thinking about, but many experts believe that it was not properly validated before the patent was filed. Check out section 9.5 of the compression FAQ for details:
http://www.faqs.org/faqs/compression-faq/part1/
The FAQ may help you narrow down the search…
Thanks, JoeyBlades, for the link. This may have been the one I was thinking of. Even if it was not, it’s refreshing to see that someone is keeping tabs on these nuts.
When I was seeing those stories being widely reported in the popular press, I felt much like the child must have felt in “The Emperor’s New Clothes”. I couldn’t believe that they were reporting this in a serious tone, repeating the “inventors’” claims as if they made sense. I kept waiting for someone to come out and say how far-fetched it was. Not only were the claims technically impossible, but the behavior of the “inventors” (e.g. refusing to demonstrate the compression on test files) made it obvious that they were up to something.