I am in a situation where I need to write scripts that parse log files, decompress archives into a strangely named folder under /tmp, and then re-check the archives the scanner flagged as "decompression bombs" -- and I would really, really rather not have to handle that last part by hand.
I know what a decompression bomb is, and I am fairly confident I have full control over the contents of the intentionally large files in the folders I am going to scan. I still need to scan those files; everything must be scanned in case something evil is lurking in the dark corners of the file system. I also know that my reasonably current system can handle the load.
Is there any way to manually adjust the threshold at which the scanner classifies a file as hopeless to decompress, from a performance and disk-space point of view? I understand this might spoil some excellent benchmark results for the software, but I need a virus scanner that actually decompresses the files and checks them for threats, before somebody else does that and finds out the hard way.
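To make the question concrete: I have not named the scanner, but if it were, say, ClamAV, I mean knobs of this kind in clamd.conf (the values here are purely illustrative, not recommendations):

```
# clamd.conf -- raise the archive-scanning limits (illustrative values)
MaxScanSize 2000M    # total amount of data scanned per file, across all layers
MaxFileSize 1000M    # largest decompressed file that will still be scanned
MaxRecursion 17      # nesting depth of archives within archives
MaxFiles 50000       # number of files extracted from a single archive
AlertExceedsMax yes  # flag (rather than silently skip) files over the limits
```

If the scanner in question exposes anything comparable, that is exactly the kind of threshold I want to move.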
Also, it would help to find the exact definition of the limit at which the scanner throws in the towel.
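My working assumption is that such limits are usually some combination of an expansion ratio and an absolute expanded size, so that I could pre-check my own archives before feeding them to the scanner. A minimal sketch of that idea, with made-up thresholds (MAX_RATIO and MAX_EXPANDED are my own illustrative values, not any scanner's defaults):

```python
import io
import zipfile

# Hypothetical thresholds for illustration only -- a real scanner ships its own.
MAX_RATIO = 100          # expanded bytes per compressed byte
MAX_EXPANDED = 1 << 30   # 1 GiB total expanded size

def looks_like_bomb(zip_bytes):
    """Return (ratio, expanded_size, flagged) for an in-memory zip archive."""
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        # Sum the declared uncompressed sizes without extracting anything.
        expanded = sum(info.file_size for info in zf.infolist())
    compressed = len(zip_bytes)
    ratio = expanded / compressed if compressed else float("inf")
    return ratio, expanded, ratio > MAX_RATIO or expanded > MAX_EXPANDED

# Build a small bomb-ish archive: 10 MiB of zeros compresses extremely well.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("zeros.bin", b"\x00" * (10 * 1024 * 1024))

ratio, expanded, flagged = looks_like_bomb(buf.getvalue())
print(f"ratio={ratio:.0f}:1 expanded={expanded} flagged={flagged}")
```

If somebody can confirm whether the real limit is defined this way (and where it is documented), that would answer the question.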
Thank you, mom, for all the love, but this time I think I can take care of myself.