If you really must do it, do it client-side and block the script via robots.txt.
It's a form of "soft cloaking", so not risk-free... but it's still preferable to a server-side Geo-IP redirect.
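A minimal sketch of what that client-side script might look like (the /api/geo endpoint, the /it/ localized section, and the /js/geo-redirect.js path are all invented for illustration). The point is that the redirect logic lives in its own file, so a single robots.txt rule can keep Googlebot from ever executing it:

```ts
// geo-redirect.js — hypothetical standalone file served at /js/geo-redirect.js.
// Because the logic is isolated here, one robots.txt rule can block just this file.
(async () => {
  // Hypothetical geolocation endpoint returning e.g. { "country": "IT" }.
  const res = await fetch("/api/geo");
  const { country } = await res.json();

  // Human visitors from Italy get sent to the localized section.
  // Googlebot never runs this, because the script itself is disallowed
  // in robots.txt and is therefore never fetched during rendering.
  if (country === "IT" && !location.pathname.startsWith("/it/")) {
    location.replace("/it" + location.pathname);
  }
})();
```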
Oliver Mason covers it here: https://ohgm.co.uk/esoteric-seo-tips-i-hope-you-dont-already-know/
Good examples of this would be GEO-IP redirects whereby humans can only see one version of a website and bots are able to see every version, or intrusive interstitials, where humans get spammed with popup boxes. GEO-IP redirects are a really shitty thing to do, but blocking them in robots.txt like this means you don’t have to make a stand on them for SEO reasons, just usability reasons.
You can still get in trouble, but most sites that do this don’t, from what I can see. And I think this is because Googlebot is remarkably well behaved.
[...]
So whether you want to use this to hide any client-side bullshit would depend on your appetite for risk. I don’t think it’s that risky, and it may not be the best method, but it’s one that works and is easy.
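As for "blocking them in robots.txt like this", here is a minimal sketch of the rule, assuming the redirect script lives at the hypothetical /js/geo-redirect.js path used above:

```
# robots.txt — block only the redirect script; the pages themselves stay crawlable.
User-agent: Googlebot
Disallow: /js/geo-redirect.js
```

Since Googlebot respects robots.txt even when fetching sub-resources for rendering, it never executes the redirect and can see every version of the site, while human visitors still get redirected. Expect Search Console to flag the file as a blocked resource, which is part of the risk Mason describes.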