1. If you need to check a lot of webshells for validity, first check them by HTTP status code: we need to get an answer telling us whether there is any page at the URL at all. For this task we use bash. PHP is of course more universal, but it would take more time, and you could not run it on shared hosting, where script execution time is limited.
Here is the listing of the ready script:

#!/usr/local/bin/bash

links="/root/links.txt"   # file with all the links to check
result="/root/valid.txt"  # file for links where a page answers (200 OK)

if [ -r "$links" ]; then
    if [ -w "$result" ]; then
        for address in $(cat "$links"); do
            # -t 1: one try; -T 5: 5-second timeout; --spider checks
            # that the page exists without downloading its body
            if wget -t 1 -T 5 --spider "$address" &>/dev/null; then
                echo "$address"              # show live links
                echo "$address" >> "$result"
            fi
        done
    else
        echo "cannot open $result for writing, or the file does not exist."
    fi
else
    echo "cannot open $links for reading, or the file does not exist."
fi

"wget -t 1 -T 5" means we try to connect to each address only once, with a 5-second timeout; otherwise checking the whole list could take days. If you want to see more details, delete "&>/dev/null".
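One weakness of the loop above is that `for address in $(cat $links)` word-splits the whole file, so any URL containing spaces or glob characters would break it. A `while read` loop is the more robust bash idiom. A minimal sketch, using a throw-away links file whose contents are made-up placeholders:

```shell
# build a small links file just for illustration
links=$(mktemp)
printf '%s\n' "http://a.example/shell.php" "http://b.example/r57.php" > "$links"

# read line by line instead of word-splitting $(cat ...)
while IFS= read -r address; do
    echo "would check: $address"
done < "$links"

rm -f "$links"
```

The real check (`wget --spider "$address"`) would simply replace the `echo` inside the loop.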

2. So now we have a list of links where some page answers. Next we need to find out whether it is actually a webshell or something else. Since most of the shells came from RST, the decision was to download each page and grep it for a telltale string (r57shell). Here is the listing:
#!/usr/local/bin/bash

links="/root/valid.txt"   # file with the links that answered
result="/root/result.txt" # file for links with webshells

keyword="r57shell" # keyword

if [ -r "$links" ]; then
    if [ -w "$result" ]; then
        for address in $(cat "$links"); do
            # -O - writes the downloaded page to stdout (lowercase -o
            # would redirect the log instead); -q keeps wget quiet;
            # grep -q -m 1 succeeds as soon as the keyword is found
            if wget -t 1 -T 5 -q -O - "$address" | grep -q -m 1 "$keyword"; then
                echo "$address"              # show good links
                echo "$address" >> "$result"
            fi
        done
    else
        echo "cannot open $result for writing, or the file does not exist."
    fi
else
    echo "cannot open $links for reading, or the file does not exist."
fi

As you can see, it is almost the same script; the only difference is that wget now downloads the page ("-O -" writes it to stdout) and pipes it through grep looking for r57shell.
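The grep step can be isolated and tried without any network access. A minimal sketch, where `page_has_keyword` stands in for the `wget -O - | grep` pipeline and the sample page is an invented stand-in for a real download:

```shell
keyword="r57shell"   # same keyword as in the script above

# succeeds (exit 0) if the page read from stdin mentions the keyword;
# -m 1 stops at the first match, -q suppresses the matched output
page_has_keyword() {
    grep -q -m 1 "$keyword"
}

# invented sample page standing in for wget -O - output
sample_page='<html><title>r57shell 1.24</title></html>'
if printf '%s' "$sample_page" | page_has_keyword; then
    echo "match"
fi
# → match
```

Because only the exit status of grep matters, the same function slots directly into the `if` of the second script.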