I use something like the following for my webserver to try to deny all bots, spiders and crawlers access to my site. This is the second level of defense. Defense in depth is good. Level 1 is robots.txt; Level 2 is user agent filtering; Level 3 is fail2ban monitoring the access log and banning anybody who requests stuff faster than I think people can read.
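For reference, Level 1 is just the usual blanket disallow in robots.txt, and Level 3 is a fail2ban jail watching the access log. The snippet below is only a rough sketch: the jail name, log path, filter and thresholds are placeholders for illustration, not my actual settings.
# robots.txt (Level 1): ask the polite bots to stay away
User-agent: *
Disallow: /
# /etc/fail2ban/jail.local (Level 3 sketch; name, path and numbers are placeholders)
[fast-readers]
enabled  = true
port     = http,https
filter   = fast-readers
logpath  = /var/log/apache2/access.log
findtime = 60
maxretry = 30
bantime  = 3600
# /etc/fail2ban/filter.d/fast-readers.conf: every request counts, so more than
# maxretry requests within findtime seconds gets the address banned
[Definition]
failregex = ^<HOST> .*"(GET|POST|HEAD)
ignoreregex =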
Anyway: user agent filter. These three conditions all need to be true. The first condition makes sure only the sites listed are affected (because some sites are exempt…). The second condition makes an exception for ArchiveBot and Gwene. The third condition matches all self-identified bots, crawlers and spiders. The actual rule tells them that the page is gone and should be deleted (status 410), and it also adds a Location header pointing to https://alexschroeder.ch/nobots, just in case a curious human wants to know why.
The reason is this: For a while it seemed that we all benefited from search engines – authors and readers both. These days, you'll find that search results are full of garbage sites. Big sites with the most flatulent of pages explaining in great detail why the thing you're looking for is important and how to do it, clearly optimized for an ad company and not for a reader. Big sites with a gazillion answers are preferred over small, individual sites. Perhaps that's easier. Perhaps it allows them to diffuse responsibility for the garbage, I don't know. The effect, in any case, is that search engines no longer benefit small-site authors, either. I was unable to find my own pages on the search engines. If you are a small site owner and you think you can find your own pages on Google and Bing, I suspect that's because they track you. Try it on a different computer, anonymously. Perhaps you won't find yourself, either.
In any case, if I can't get anything in return, both as a reader and as an author, I feel that the deal is off. Why let them feed on my words for free? Nay, at a cost, since they are keeping my website busy, producing CO₂ and heating the planet for no benefit at all.
Better to block them all.
RewriteCond "%{HTTP_HOST}" "^(alexschroeder\.ch|…)$" [nocase]
RewriteCond "%{HTTP_USER_AGENT}" "!archivebot|^gwene" [nocase]
RewriteCond "%{HTTP_USER_AGENT}" "bot|crawler|spider" [nocase]
RewriteRule ^ https://alexschroeder.ch/nobots [redirect=410,last]
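To check that the rule fires, request any page with a bot-like user agent; the string below is made up, anything containing "bot" will do:
curl --silent --head --user-agent "ExampleBot/1.0" https://alexschroeder.ch/
The first line of the response should report status 410 (Gone), while a regular browser user agent gets the page as usual.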