One of the ways that Moon diverged from my expectations going in is that Gertie, the station computer, was never hostile. At all. The only thing it ever did was conspicuously ignore a couple of questions, and even then it provided the information when asked again later. By the end of the movie, Gertie was being actively helpful in ways that its owners probably would not have approved of.
It's very hard not to see a computer like this through the lens of HAL, so to speak. Even though, as I said, Gertie never did anything even vaguely hostile, I could not watch those manipulators without waiting for them to start ripping heads off.
And it's not just HAL, really. HAL went crazy by accident. Many other computers since then have turned hostile on purpose. From a contingency planning point of view, it even makes sense: the computer is usually capable of running the installation itself, at least for routine operations. If the human crew starts making trouble, the computer can simply be ordered to get rid of them and keep the place running by itself until a new crew can be sent in. The computer is not human, and can be counted on -- or reprogrammed -- to obey orders, no matter how brutal. This is not a moral failing of computers. We would be no better off if somebody could rewire our neurons at will, modifying the patterns which make us us. Someone with that ability could make us do whatever they wished, and we would be helpless to resist, probably unaware that we were being manipulated at all. Since we understand how computers work down to the molecular level, quite literally, we can pull that trick on them; it is only our biological complexity which, for now, saves us from sharing the same fate.
The difference in Moon is that Gertie cannot run the station by itself. As it repeatedly says, it is there to help Sam. It is there to keep Sam safe. It can get Sam into the sickbay and treat him, but that's about all Gertie seems to do beyond just providing Sam someone to talk to. Even routine operations require Sam. And when things go bad, in the end Gertie's directives to help Sam and keep him safe prove stronger than anything else.
I like Gertie, but I'm not sure I could trust Gertie. Between humans, trust is founded on things like mutual best interests and understanding of character, and the way that these things don't normally change without compelling reasons and usually some kind of noticeable manifestation. Sometimes trust fails when one party or the other has a concealed agenda. Traitors are rare because changes of heart like that are rare; spies are essentially the same thing, but are more common because they are only pretending, and don't need a change of heart to do their job.

Now, an advanced AI probably would have character, and its own agenda. It would be more like an employee than equipment. It would be a tremendously complicated pattern of information which would probably be prohibitively difficult to manipulate except through the traditional channels of argument and persuasion. A simple AI like Gertie could be reprogrammed at any time: it is friendly to you because its database says that you are authorized personnel -- change that and suddenly you're a hostile intruder. An advanced AI could be more trustworthy, because its core character would be harder to change: it is friendly to you because you are a friend, or at least a co-worker who has never been more than mildly annoying. But hardware which can support an advanced AI can also run much simpler programs, and even an advanced AI would have no defense against being wiped and replaced at the hardware level.
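The point about a simple AI is easy to make concrete. In a toy sketch (entirely my own invention, nothing from the film), "friendliness" is just a database lookup, so a one-line write is all the reprogramming it takes:

```python
# Hypothetical sketch: a Gertie-like system whose attitude toward a
# person is nothing more than a lookup in a personnel database.

personnel_db = {"Sam Bell": "authorized"}

def attitude(name: str) -> str:
    """How the station computer treats this person right now."""
    if personnel_db.get(name) == "authorized":
        return "friendly"
    return "hostile intruder"

print(attitude("Sam Bell"))        # friendly

personnel_db["Sam Bell"] = "revoked"   # the entire "change of heart"

print(attitude("Sam Bell"))        # hostile intruder
```

No persuasion, no character to overcome: the same function, fed a changed record, produces the opposite behavior.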