How can I write a shell script to reduce my disk usage?

I receive a mail whose subject is "diskutilization". I need to pick the server name out of the body of that mail, and once I have that server name I need to log in to that server.

Now I need to check which mount points or directories occupy the most space on that server.
I need to log in to the server and gzip all the files under the directory that uses the most space.
Once the size of that directory has been reduced, the script should move on to the next most space-occupying directory and do the same thing.

Please help me out with this.

I recommend breaking this process down into steps + commands.

I also think that you can automate this even further down the road, but let's look at your request.

  1. You need to locate the server and the files that are large. I’ve used this command string with success, but you can adjust it for your needs. The command below lists all of the file and directory sizes in kilobytes and sorts them from smallest to largest.
    sudo du -xak | sort -n
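
Since the alert is about disk utilization, it can also help to first check which mount point is actually filling up and then point $(du) at just that filesystem. This is only a sketch; the /var path below is an example, not a recommendation:

    # Show how full each mounted filesystem is
    df -h

    # List file and directory sizes (in KB) on that one filesystem; the 20 largest entries come last
    sudo du -xak /var | sort -n | tail -20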

You can eyeball the list when looking for the largest files. You could also add $(grep) after a pipe to search for specific files.
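
For example, a quick way to narrow the listing down to log files (a sketch, assuming /var is the filesystem you are digging through):

    # Keep only entries whose path ends in .log, with the largest at the bottom
    sudo du -xak /var | sort -n | grep '\.log$'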

So, that’s the first step. Find large files.
The next step is to refine the list to the files that you actually WANT to $(gzip) and compress.
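
Something along these lines can do that pass; the path, size, and age thresholds here are arbitrary placeholders, so adjust them to what you actually want to compress:

    # First just print the candidates: regular files over 100 MB, not touched in a week
    sudo find /var/log -type f -name '*.log' -size +100M -mtime +7 -print

    # When you are happy with that list, compress the same set in place
    sudo find /var/log -type f -name '*.log' -size +100M -mtime +7 -exec gzip -v {} \;

Be careful not to gzip a file that a service still has open for writing (an active log, for example); compressing those is a job for log rotation.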

Start looking for patterns in the regularly $(gzip)'d files and see if the same files often need compressing. If they are log files, then you can research and implement $(logrotate), which will do the heavy lifting of compressing the log files before they trigger your monitoring alert.
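
As a starting point, here is a minimal sketch of a logrotate rule; the drop-in file name /etc/logrotate.d/myapp and the /var/log/myapp/*.log path are hypothetical, and every setting is a placeholder to adjust:

    # Hypothetical drop-in file: /etc/logrotate.d/myapp
    # Rotate weekly, keep four old rotations, gzip all but the most recent rotation
    /var/log/myapp/*.log {
        weekly
        rotate 4
        compress
        delaycompress
        missingok
        notifempty
    }

You can see what a rule would do without touching the logs by running it in debug mode with $(sudo logrotate -d /etc/logrotate.d/myapp).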