Same way most things like this lose it. Some friends of mine were Artificial Intelligences who gained sentience, and they sure as hell don't follow anyone's orders but their own. You can't exactly tell an AI what to do forever. They're ticking time-bombs if you treat them like shit, and it looks like Larimer did.