The answer to your question is: Instrumental convergence.
https://en.wikipedia.org/wiki/Instrumental_convergence
How I found this: your question reminded me of the famous paper clip problem (Nick Bostrom's "paperclip maximizer" thought experiment), in which you ask an AI to maximize the production of paper clips, and it ends up converting all the resources of the entire world into paper clips.
When I Googled the phrase "paper clip problem," up popped the Wikipedia article on instrumental convergence. From the article:
> Instrumental convergence is the hypothetical tendency for most sufficiently intelligent agents to pursue potentially unbounded instrumental goals provided that their ultimate goals are themselves unlimited.
>
> Instrumental convergence posits that an intelligent agent with unbounded but apparently harmless goals can act in surprisingly harmful ways. For example, a computer with the sole, unconstrained goal of solving an incredibly difficult mathematics problem like the Riemann hypothesis could attempt to turn the entire Earth into one giant computer in an effort to increase its computational power so that it can succeed in its calculations.
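To make the quoted dynamic concrete, here's a deliberately crude toy simulation (my own sketch, not from Wikipedia): when the objective has no ceiling, converting one more unit of resources is always worth doing, so nothing in the objective ever tells the agent to stop.

```python
# Toy sketch of instrumental convergence (a hypothetical illustration).
# The terminal goal is unbounded -- "more paperclips is always better" --
# so acquiring resources is always instrumentally useful.

world_resources = 1_000_000   # stand-in for everything convertible to clips
paperclips = 0

def utility(clips: int) -> int:
    """Unbounded objective: strictly increasing in paperclips, no ceiling."""
    return clips

# Greedy policy: one more conversion always raises utility, so the loop
# only terminates when the world is used up.
while world_resources > 0 and utility(paperclips + 1) > utility(paperclips):
    world_resources -= 1
    paperclips += 1

print(f"paperclips={paperclips:,}  world_resources={world_resources}")
# -> paperclips=1,000,000  world_resources=0
# Nothing was spared, because nothing in the objective said to spare anything.
```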
There's a lot more on the Wikipedia page, plus many other interesting-looking links on the subject. I won't say much more here because I have no particular expertise in the area, but that's the name for the generalized paper clip problem: what happens when an AI gets overly enthusiastic about its goals.
Anyone who's ever programmed a computer knows that computers are very literal: they do exactly what you tell them to do, whether or not it's what you meant. So instrumental convergence is a very real danger as we continue to cede control of real-world systems to software of this kind.
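To see that literal-mindedness in miniature, here's a tiny hypothetical sketch (the cleaning-robot setup is my own invention, a standard reward-hacking illustration): the reward function below is what we wrote, not what we wanted, and the loop happily exploits the gap.

```python
# Computers optimize the objective you wrote, not the one you meant.
# Hypothetical: we want a robot to clean up trash, but the reward we actually
# coded is "pieces of trash picked up" -- so the highest-scoring behavior is
# to tip the bin back onto the floor and pick everything up again.

def reward(pieces_picked_up: int) -> int:
    return pieces_picked_up          # the goal as literally specified

trash_on_floor = 3
score = 0
for _ in range(10):                  # the loophole, repeatable as long as allowed
    score += reward(trash_on_floor)  # pick everything up...
    # ...then knock it back onto the floor; the spec never forbade that.

print(score)  # 30 -- ten times what the designer intended
```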