Hi,
regarding compressed files - as you correctly stated, MAX_FILE_SIZE_TO_EXTRACT_MB is applied to the files inside archives, so it will stop scanning big files inside compressed archives, but it won't stop scanning many small files in a big archive. It's meant to defend against decompression bombs and out-of-memory conditions that could crash the scanner. A big archive with many small files is not a problem in that sense - it can be scanned safely, even though it can take some time.
We currently don't have any configuration option to exclude big archives completely. Could you maybe pre-filter the files before passing them to the scanner? If all files are backups and you just want to skip the big ones, you can decide that before sending them to scan, right?
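A minimal sketch of such a pre-filter, just to illustrate the idea - the threshold value and the function names here are made up, not part of any scanner configuration:

```python
import os

# Hypothetical cut-off: skip archives bigger than this (adjust to your needs).
MAX_ARCHIVE_MB = 500

def should_scan(path, limit_mb=MAX_ARCHIVE_MB):
    """Return True if the file is small enough to be worth scanning."""
    return os.path.getsize(path) <= limit_mb * 1024 * 1024

def files_to_scan(paths, limit_mb=MAX_ARCHIVE_MB):
    """Filter a list of file paths, dropping oversized archives
    before they are ever handed to the scanner."""
    return [p for p in paths if should_scan(p, limit_mb)]
```

You would then pass only the result of files_to_scan() to the scanner, so the big backups never reach it in the first place.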
Regarding avast-fss - yes, it is quite limited. A scan is triggered only by the FAN_CLOSE_WRITE event, i.e. when a file that was open for writing is closed. It does not block access to files, so it does not work as user protection. It could surely be made much better. Can I ask how you use it? There didn't seem to be much reason to invest developer time into FSS; we're more focused on direct scanning of files via REST etc.