100 to 1 Compressor

Started by Sergiu FUNIERU, February 23, 2010, 11:15:06 PM


BlackVortex

tl;dr
but good compressors already use precompression filters and file-format recognition. FreeArc is a good example, and it's open source:
http://freearc.org/

P.S.: This thread has some golden stuff  :green2

oex

:lol I recommend that before anyone else posts in favour of a 100-to-1 compressor, they try compressing something and think long and hard about why it won't work.... Some of the posts in favour do have ideas that are already in use in many compressors, backed by some very sophisticated mathematics, but some things really are impossible, even for us ASMers :bg
We are all of us insane, just to varying degrees and intelligently balanced through networking

http://www.hereford.tv

joemc

Oh no, not this thread again! :bg
I was just starting to forget Sergiu's face too.

MichaelW

Quote from: Arnold Archibald on May 24, 2010, 08:05:54 AM
As for the tests on an algo using random data, they are useless, given that "random" in this context just means data that contains unusable noise without any usage context.
We are missing a perspective on this: depending upon how any given program decodes and uses its native files, those files will, when interpreted by another program, appear either the same (or similar) or effectively as noise.

These tests were aimed at the "any file" claim.

Quote
A true random generator CAN create a file that contains RLE-encodable data (the term "clustering" is used for this data type).

Yes, but only with a low probability, and the longer the run the lower the probability.
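To put rough numbers on that, here is a quick back-of-the-envelope sketch in C (assuming uniformly random bytes; the 100 KB file size is just an example):

Code:
#include <stdio.h>
#include <math.h>

/* For uniformly random bytes, a run of k identical bytes starting at a
   given position has probability (1/256)^(k-1), so the expected number
   of run starts of length >= k in an n-byte file is roughly n / 256^(k-1). */
int main(void)
{
    double n = 100.0 * 1024.0;               /* a 100 KB file */
    for (int k = 2; k <= 6; k++)
        printf("runs of length >= %d: ~%g expected\n",
               k, n / pow(256.0, (double)(k - 1)));
    return 0;
}

For a 100 KB file that works out to roughly 400 runs of length 2, but only about 1.6 of length 3 and 0.006 of length 4, so there is almost nothing for RLE to gain.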



eschew obfuscation

Rockoon

Sigh....

You folks are being trolled by someone who can't do what they claim. Period. The claim is impossible.

Also, I am very surprised that nobody on this forum, of all forums, knows of the pigeonhole principle.

http://en.wikipedia.org/wiki/Pigeonhole_principle

Taking his claim of compressing 100KB or larger into 1KB or less: he is claiming he can compress each of the 2^800000 possible input files into a unique output, when the output space has only 2^8000 possible states.

Impossible. Period.
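Scaled down to sizes that can actually be enumerated, the counting argument looks like this (a toy sketch: 16-bit "files" compressed to 8-bit codes; the particular mapping chosen is arbitrary, and by counting, no other mapping can do better):

Code:
#include <stdio.h>

/* Pigeonhole demo: try to give each 16-bit "file" a unique 8-bit
   "compressed" code. There are 65536 inputs but only 256 codes,
   so collisions are guaranteed no matter which function we pick. */
int main(void)
{
    unsigned count[256] = {0};
    for (unsigned f = 0; f < 65536; f++)
        count[f & 0xFF]++;          /* stand-in for any 16-to-8-bit mapping */

    unsigned collisions = 0;
    for (int c = 0; c < 256; c++)
        if (count[c] > 1)
            collisions++;

    printf("%u of 256 codes are reused, so lossless decompression is impossible\n",
           collisions);
    return 0;
}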
When C++ compilers can be coerced to emit rcl and rcr, I *might* consider using one.

oex

I have thought about this long and hard and have realised it is possible and so obvious....

Step 1: Take 100 floppy disks of your favorite OAP software
Step 2: Copy them to a CD

And hey presto.... 100 times space compression.... You just have to think outside the box :bdg

* Remember you heard it here first :bg
We are all of us insane, just to varying degrees and intelligently balanced through networking

http://www.hereford.tv

frktons

Quote from: Sergiu Funieru on March 02, 2010, 04:59:19 PM
To stop offending more people with my claims, I'll reply to this thread again when my program is done.

I hope sooner or later Sergiu is going to post something concrete.
If he succeeds, I'll be happy for him and for the new ideas he can spread.
If he doesn't, I'll be happy for him as well, because he will learn something new about the difficulty of realizing working software versus merely planning it.

I'm open to the dream, not only to science  :P
Mind is like a parachute. You know what to do in order to use it :-)

brethren

Quote from: joemc on May 27, 2010, 03:26:39 AM
I was just starting to forget Sergiu's face too.


you can't forget!!!!

dedndave

he seems like a nice enough guy, even if a bit misguided
i am not particularly impressed with the compression on that image, though   :P

MichaelW

No offence to Sergiu, but images of him are like images of me, the smaller the better.
eschew obfuscation

dedndave

yah - i'm nothing to look at, either
i hadda put wifies pic up so people would like me   :lol

BlackVortex

Haha, great thread, would click again!

Twister

This would be very complicated for media files. Those are known to get only around 1-10% compression even from the best commercial archivers.

Mirage

Eh, couldn't a ratio like that be achieved by the compressor itself having a list of chunks of byte sequences that it maps to smaller codes while still keeping them unique? Mind you, it would have to take pretty big chunks, and the list itself would be pretty large, but doing that seems feasible to me.  :lol
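In miniature, the idea looks something like this (a hypothetical three-entry table, purely for illustration; the catch is that mapping arbitrary chunks uniquely needs a table with as many entries as there are possible chunks, so the "list" ends up bigger than the files it compresses):

Code:
#include <stdio.h>
#include <string.h>

/* Sketch of a fixed-dictionary coder: known byte sequences are replaced
   by a 1-byte code. The dictionary entries here are made up. */
static const char *dict[] = { "Hello, world", "MASM32 forum", "compression" };

int main(void)
{
    const char *input = "MASM32 forum";
    for (unsigned code = 0; code < 3; code++) {
        if (strcmp(input, dict[code]) == 0) {
            printf("stored as 1 byte: code %u\n", code);
            return 0;
        }
    }
    printf("not in dictionary: must store all %u bytes verbatim\n",
           (unsigned)strlen(input));
    return 0;
}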

frktons

Quote from: Mirage on August 31, 2010, 03:59:59 PM
Eh, couldn't a ratio like that be achieved by the compressor itself having a list of chunks of byte sequences that it maps to smaller codes while still keeping them unique? Mind you, it would have to take pretty big chunks, and the list itself would be pretty large, but doing that seems feasible to me.  :lol

If you have a 4 GB compressor program, you can get a 100 KB file compressed with a good ratio: the data just moves into the program itself.
:lol
Mind is like a parachute. You know what to do in order to use it :-)