  • There’s more to it than that. Firstly, at a theoretical level you’re dealing with the concepts of entropy and information density. A given file has a certain level of information in it. Compressing it is sort of like distilling the file down to its purest form. Once you’ve reached that point, there’s nothing left to “boil away” without losing information.

    Secondly, from a more practical point of view, compression algorithms are designed to work nicely with “normal” real world data. For example, as a programmer you might notice that your data often contains runs of repeated digits. So say you have this data: “11188885555555”. That’s easy to compress by describing the runs: three 1s, four 8s, and seven 5s. So we can compress it to this: “314875”. This is called “Run Length Encoding” and it just compressed our data by more than half!

    But look what happens if we try to apply the same compression to our already compressed data. There are no repeated digits, there’s just one 3, then one 1, and so on: “131114181715”. It doubled the size of our data, almost back to the original size.

    This is a contrived example but it illustrates the point. If you apply an algorithm to data that it wasn’t designed for, it will perform badly.
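    A minimal sketch of the run-length encoding described above (my own toy implementation, not from any particular library; it assumes digit data with runs shorter than 10, so each run fits in one count digit):

    ```python
    from itertools import groupby

    def rle_encode(s: str) -> str:
        # For each run of identical digits, emit the run length followed by the digit.
        # Assumes every run is shorter than 10 so the count is a single digit.
        return "".join(f"{len(list(g))}{d}" for d, g in groupby(s))

    def rle_decode(s: str) -> str:
        # Read (count, digit) pairs and expand each one back into a run.
        return "".join(int(c) * d for c, d in zip(s[::2], s[1::2]))

    print(rle_encode("11188885555555"))  # 314875 -- 14 characters down to 6
    print(rle_encode("314875"))          # 131114181715 -- re-compressing makes it grow
    print(rle_decode("314875"))          # 11188885555555 -- round-trips back
    ```

    Running the encoder twice reproduces the blow-up from the example: the second pass sees no repeated digits, so every character costs two characters of output.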
  • It definitely helps. You can sometimes logic yourself out of a spiral by acknowledging the emotion and why it’s there, while simultaneously rejecting the need for feeling it right now.

    It’s like “hey cool thanks brain I get that you want me to make sure that the bad thing doesn’t happen again so you’re looping that memory and the feeling that came with it. But actually that’s not helpful, that situation actually (wasn’t dangerous) / (won’t happen again) / (isn’t something I can solve right now), so let’s move on.”

    With practice, your brain usually says “ok, no worries”, and you can move on. It’s not really that simple, but that’s the idea.