Subject: Compression of random data
Category: Computers
Asked by: snorkelman-ga
List Price: $10.00
Posted: 11 Jan 2004 14:47 PST
Expires: 10 Feb 2004 14:47 PST
Question ID: 295380

Could you please describe or refer to an algorithm that would enable a universal, general-purpose program to compress any stream of random data? It would be a plus if this could be implemented on a Turing machine or, failing that, a Windows PC.
There is no answer at this time.
|
Subject: Re: Compression of random data
From: efn-ga on 12 Jan 2004 10:57 PST

It is mathematically impossible to have an algorithm that can compress every possible input without loss. For a discussion, see: http://www.faqs.org/faqs/compression-faq/part1/section-8.html

--efn
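The impossibility efn-ga refers to is the standard counting (pigeonhole) argument. A minimal sketch of it in Python (my illustration, not from the linked FAQ):

```python
# Pigeonhole argument against a universal compressor: a lossless encoder
# must be injective (distinct inputs must map to distinct outputs).
# There are 2**n bit strings of length exactly n, but only 2**n - 1 bit
# strings of length strictly less than n (counting the empty string), so
# at least one n-bit input cannot be mapped to a shorter output.
for n in range(1, 16):
    inputs = 2 ** n                          # bit strings of length n
    shorter = sum(2 ** k for k in range(n))  # bit strings of length < n
    assert shorter == inputs - 1             # always one slot too few
```

Since this holds for every n, no program can shrink all of its inputs; any compressor that shortens some inputs must lengthen others.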
Subject: Re: Compression of random data
From: scubapup-ga on 17 Jan 2004 15:29 PST

You have a Turing machine? Hehe, wow. Check this guy out, he's got Turing and compression written all over his site. Whether he's for real or not, I'm not sure: http://www.geocities.com/malbrain/
Subject: Re: Compression of random data
From: sheyeh-ga on 11 Feb 2004 14:36 PST

The Lempel-Ziv algorithm is a universal lossless compression algorithm that is asymptotically optimal; that is, as the sequence grows longer, the compression ratio approaches the best that can be achieved. This algorithm is actually used in the various versions of zip (gzip, WinZip, etc.). It is quite easy to implement. See http://www.data-compression.com/lossless.html and Section 12.10 of Cover and Thomas, "Elements of Information Theory," for a slightly different description and a proof of optimality.
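The Lempel-Ziv variant that is easiest to write down is LZ78, which parses the input into previously unseen phrases and emits (dictionary index, next character) pairs. A minimal Python sketch, assuming that encoding (the function names and the index-0 convention are mine, not from the linked page):

```python
def lz78_encode(data):
    """Encode a string as (dictionary index, next char) pairs.
    Index 0 stands for the empty phrase."""
    dictionary = {}   # phrase -> index, assigned in order of discovery
    pairs = []
    phrase = ""
    for ch in data:
        if phrase + ch in dictionary:
            phrase += ch                       # extend the current match
        else:
            pairs.append((dictionary.get(phrase, 0), ch))
            dictionary[phrase + ch] = len(dictionary) + 1
            phrase = ""
    if phrase:
        # Input ended mid-phrase: emit the known phrase with no new char.
        pairs.append((dictionary[phrase], ""))
    return pairs

def lz78_decode(pairs):
    """Rebuild the phrase dictionary and concatenate the phrases."""
    phrases = [""]   # index 0 is the empty phrase
    out = []
    for index, ch in pairs:
        phrase = phrases[index] + ch
        phrases.append(phrase)
        out.append(phrase)
    return "".join(out)
```

For example, `lz78_encode("abab")` yields `[(0, "a"), (0, "b"), (1, "b")]`, and `lz78_decode` inverts it exactly; the asymptotic-optimality result sheyeh-ga cites applies as the input grows long and phrases lengthen.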