RewriteCond %{HTTP_USER_AGENT} ^.*AnnoyingBot1.* [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*AnnoyingBot2.* [OR]
...
RewriteCond %{HTTP_USER_AGENT} ^AnnoyingBotN.*
RewriteRule ^/(.*) /var/www/html/robots.txt [L]
Note that in the example above, instead of dropping the connection, we serve robots.txt in response to ANY request from a matching bot, just to make a point. (Single quotes, by the way, are not quoting characters in the Apache configuration syntax; a pattern like '^.*AnnoyingBot1.*' would match the quote characters literally, so the patterns above are left unquoted.)
But also note that the above will not stop somebody who *really* wants to keep crawling your site, since the User-Agent header is trivial to change on the client side. A large search engine company is unlikely to bother, though.
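If you would rather refuse matching bots outright than serve them robots.txt, mod_rewrite's [F] flag returns a 403 Forbidden instead. A minimal sketch, using the same placeholder bot names as above (the "-" substitution means "leave the URL unchanged"):

RewriteCond %{HTTP_USER_AGENT} AnnoyingBot1 [OR]
RewriteCond %{HTTP_USER_AGENT} AnnoyingBot2
RewriteRule ^ - [F,L]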
Here is a rewrite condition that matches on the client's IP address, or on a whole block of addresses (note the braces: the variable syntax is %{REMOTE_ADDR}, not %(REMOTE_ADDR)):

RewriteCond %{REMOTE_ADDR} ^123\.124\.125\.
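Paired with a rule, this condition works just like the user-agent example: the sketch below sends every request from the (example) 123.124.125.0/24 range to robots.txt.

RewriteCond %{REMOTE_ADDR} ^123\.124\.125\.
RewriteRule ^/(.*) /var/www/html/robots.txt [L]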
Or, if your Apache server is behind a proxy that hides the client's real IP address, you may need to obtain that address from another header. For example, our production server sits behind an AWS load balancer, so our rules look like:
RewriteCond %{HTTP:X-Forwarded-For} ^123\.124\.125\.126
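Keep in mind that X-Forwarded-For may carry a comma-separated chain of addresses (the original client first, then any intermediate proxies). Anchoring the pattern at the start already matches the client entry, but to avoid matching a longer address that merely begins with the same digits (say, 123.124.125.126 vs. 123.124.125.1), you can require a comma or end-of-string after it; the address here is the same example one as above:

RewriteCond %{HTTP:X-Forwarded-For} ^123\.124\.125\.126(,|$)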