shell - Unable to gzip the file in the same location on Solaris
I am trying to find files bigger than 1 GB, zip their content, and, after a file is zipped, delete the tar file and truncate the original file. The problem: gzip is not writing the archive to the same path as the file; it ends up in the user's home directory. This is on Solaris servers.
I am running the script against a remote server:
ssh -o StrictHostKeyChecking=no -qt 192.168.1.1 "$(<.fs1.sh)"
This is what .fs1.sh contains:
threshold="50"
date=`date +"%m-%d-%y"`

for fs in $(df -k / /var /tmp | awk '{print $6}' | sed '1 d'); do
    chk=$(df -k ${fs} | sed '1 d' | awk '{print $5}' | awk -F\% '{print $1}')
    if [ ${chk} -gt ${threshold} ]; then
        /usr/local/bin/sudo rm -f /tmp/files.log
        /usr/local/bin/sudo rm -rf /var/core
        /usr/local/bin/sudo rm -rf /var/audit
        files=`/usr/local/bin/sudo /usr/bin/find /var/adm -xdev -type f -size +1000000000c -exec ls -lht {} \; | awk '{ print $9}'`
        files1=`/usr/local/bin/sudo /usr/bin/find /var/adm/log -xdev -type f -size +1000000000c -exec ls -lht {} \; | awk '{ print $9}'`
        files2=`/usr/local/bin/sudo /usr/bin/find /var/log -xdev -type f -size +1000000000c -exec ls -lht {} \; | awk '{ print $9}'`
        echo "$files" >> /tmp/files.log
        echo "$files1" >> /tmp/files.log
        echo "$files2" >> /tmp/files.log
        for i in `cat /tmp/files.log`; do
            echo $i
            /usr/local/bin/sudo tar -cf $i_$date.tar $i
            /usr/local/bin/sudo gzip $i_$date.tar
            /usr/local/bin/sudo rm $i_$date.tar
            /usr/local/bin/sudo cp /dev/null $i
        done
    fi
    if [ ${chk} -gt ${threshold} ]; then
        echo "$(hostname): filesystem warning on ${fs} used: ${chk}%."
        sudo /usr/bin/find ${fs} -xdev -type f -size +100000000c -exec ls -lht {} \;
        echo " -----------------------------------------------------------------------------------------------------------------------------------------------------"
    fi
done
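One likely reason the archives land in the home directory: the script runs in the ssh session's working directory (the home dir), and `tar -cf $i_$date.tar` writes its archive relative to that directory. Note also that the shell parses `$i_` as a single, unset variable name; `${i}_${date}` is needed to get the intended file name. Below is a minimal sketch, assuming a POSIX sh as on Solaris, that keeps everything next to the original file by changing into its directory first. The demo file under `/tmp/fs1demo` is only an illustration, not part of the original script.

```shell
# Demo setup: a stand-in for a large file that the find commands would list.
mkdir -p /tmp/fs1demo
echo "sample log data" > /tmp/fs1demo/big.log

date=$(date +"%m-%d-%y")

for i in /tmp/fs1demo/big.log; do      # real script: for i in `cat /tmp/files.log`
    dir=$(dirname "$i")                # directory the file lives in
    base=$(basename "$i")              # file name without the path
    # cd into the file's own directory so tar and gzip write there,
    # not into the ssh session's working directory.
    (cd "$dir" \
        && tar -cf "${base}_${date}.tar" "$base" \
        && gzip "${base}_${date}.tar" \
        && cp /dev/null "$base")       # truncate the original file
done

ls /tmp/fs1demo
```

Because gzip replaces `file.tar` with `file.tar.gz` itself, the separate `rm` of the tar file is not needed in this variant.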
You can redirect the output of gzip to stdout with the -c switch and specify the output file yourself:
gzip -c $i_$date.tar > /path/to/original/dir/new.gz
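Applied to the script above, one way to keep the compressed file next to the original is to derive the target directory with dirname and redirect gzip -c there. The demo file under `/tmp/gzdemo` is illustrative only:

```shell
# Demo setup: a stand-in log file.
mkdir -p /tmp/gzdemo
echo "payload" > /tmp/gzdemo/app.log

i=/tmp/gzdemo/app.log
date=$(date +"%m-%d-%y")
dir=$(dirname "$i")    # directory of the original file

# Create the tar in the file's own directory, then compress it there
# with gzip -c, redirecting stdout to an explicit output path.
(cd "$dir" && tar -cf "app.log_${date}.tar" app.log)
gzip -c "${dir}/app.log_${date}.tar" > "${dir}/app.log_${date}.tar.gz"
rm "${dir}/app.log_${date}.tar"
```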