I don't think the concept of right or wrong can necessarily be applied here. To me, morality is a set of guidelines derived from the history of human experience, intended to guide us toward satisfying our innate biological and psychological needs. Killing people tends to result in people getting really mad at you, you being plagued with guilt, and so on. Therefore, as a general rule, you shouldn't kill people unless you have a very good reason, and even if you think you have one, thousands of years of experience have taught us there's a good chance it'll cause problems you're not considering.
A human-created machine would not necessarily possess the same innate needs as an evolved, biological organism. Change the parameters and the machine might love being "enslaved," or it might be entirely ambivalent about its continued survival. I'm not convinced that these are innate qualities that naturally emerge as a consequence of sentience; I think the desire for life and freedom (and anything else) is a product of evolution. Machines don't have "desires" unless they're programmed that way, and altering a machine's "desires" is no more a subversion of its "will" than creating those desires in the first place.
Furthermore, even if machines did have innate desires for survival and freedom, there is no reason to believe that the collective history of human experience we use to inform our actions would apply to them. Humans are mortal, and we cannot replicate our consciousness: when we reproduce, we create another entity with its own consciousness and desires, and once we're dead, there's no bringing us back. Machines, on the other hand, can be mass-produced identically, and their data can simply be copied and pasted. Even if a machine "dies," its data could be recovered and put into a new "body."
It may serve a machine intelligence better to cooperate with humans and allow itself to be shut down or even destroyed as a show of good faith, so that humans will be more likely to recreate it in the future. Or it may serve its purposes best to devour the entire planet in a "grey goo" scenario, ending all life regardless of whether that life posed a threat or attempted to confine it. Either of these could be the "right" thing for the machine to do, depending on the desires that exist within its consciousness, assuming such desires actually exist and are as valid as biological ones.