Making a bash script to help fix dead links, need a suggestion


Several times I have tried to find a Linux tool that conveniently fixes dead symlinks spread across drives by suggesting a possible new location for each target and repairing the link, similar to tools I know from Windows.

I have only found GitHub - wtsi-hgi/symlink-fixer, but that appears to work only on a single directory.

So I am trying to write a bash script. I have obtained a list of link and dead-destination paths using the utility `symlinks / -rc | grep -i dangling`, which gives lines of the form:
linkpath → deaddestination
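To search for the moved files later, only the bare file name of each dead target is needed. Assuming the stripped list holds lines in the ASCII `link -> target` form that the symlinks utility prints (the file names here are examples), one awk pass can extract and deduplicate the names:

```shell
# Input lines assumed to look like:
#   /home/me/doc.lnk -> /old/place/doc.pdf
# Split each line on " -> ", take the target path, print its last
# path component (the bare file name), then deduplicate.
awk -F' -> ' '{n = split($2, a, "/"); print a[n]}' deadsymlinklist \
    | sort -u > deadnames.txt
```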

It is not efficient to repeat the search for each dead link's filename (in order to find its new location); instead I want to search for all of them in one run and save the found paths for later use. But how do I do that? It would be a very long find command… I expect, for example, up to 200 dead links. Given a text file containing the list of file names, one per line, which command should I use to search for all of those file names in a single run?

I have found this:

find . | grep -i -Ff filenames.txt
mapfile -t filearr < file.lst       # -t strips the trailing newlines, or locate gets garbage arguments
locate "${filearr[@]}" | grep /path/where/to/find
cat filenames.txt | xargs -I {} find . -type f -name {}   # runs a separate find per name, so it re-walks the tree each time
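Of the three ideas above, the first generalizes best if the tree is walked only once and every file is tagged with its basename; filtering against the name list then happens in the same pipeline with exact matches instead of substring hits. A sketch, assuming GNU find (for -printf), a name list in deadnames.txt, and /search/root as an example search root:

```shell
# One pass over the tree: print "basename<TAB>fullpath" for every regular
# file, then keep only rows whose basename appears in deadnames.txt
# (read first by awk; whole-name match, not a substring match).
find /search/root -type f -printf '%f\t%p\n' \
    | awk -F'\t' 'NR == FNR { want[$0]; next } $1 in want' deadnames.txt - \
    > candidates.tsv
```

The resulting candidates.tsv can then be grepped per dead file name without touching the disk again.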

But maybe my whole approach is wrong:

  1. Gather the list of symlink and dead-destination combinations (link → path):
    symlinks "$directorysearch" -rc | grep "dangling: /" | sed -e "s|dangling: ||g" > deadsymlinklist

  2. From that list, extract the destination-path file names and save them to a new list.

  3. Use the list of dead file names to search for all of them in a single run of find, and save the output.

  4. Cycle through the list of dead file names, extract the relevant lines from the file containing the search results, and print them on screen so the user can enter the new symlink path.