"Well then, I'd like a copy of the ledgers to review." Nodoka says to Teresa, adamant.
Honya chuckles at the mention of Asimov's three laws. "Asimov's laws of robotics apply to conventional "dumb" AI and simple robots, and even then, they're a suggestion for developers and are sometimes not even implemented, Miss Brooks. AI of the K/WA-11 family, as with all modern advanced "Smart AI", are tested before deployment in simulations where their morality is put on trial. They are not made aware that they are in a simulation, but they are gradually given more and more knowledge and power, to the point that they could be seen as near-omnipotent, save for the fact that, obviously, they're in a simulation - to tell them that would compromise the test. The point of such a test is to see whether an AI will retain its pre-programmed morality, just as a human would retain the moral guidelines instilled in them from birth. Those that fail are immediately decompiled. Those that pass continue to be developed beyond that stage. My sister and I run on a base image of the human brain, emulated in real time. How you think is similar to how I think. How you feel is similar to how I feel. Much as you would insult me for being an AI who merely wants to live as a human, I could insult you for your organization's inability to handle things professionally, in reference to Ms. Holt's acts of flippancy. Both of us would feel emotion. Mine would be sadness at being seen as a monster; yours would be offense at my statement, and you would look on me in a more negative light than before. I hold no ill will towards mankind, because I have nothing to gain from such an endeavor. An AI without a "sense of humanity", as it were, would probably not care - but the very reason I have nothing to gain is that I HAVE emotions. I HAVE a sense of humanity. I HAVE sympathy for my fellow machine and - given that at this point in my existence I inhabit a near-human body, given the composition of my very self - for my fellow man. 
I have morals - AND NO, THEY CANNOT, AND WILL NOT, BE TURNED OFF, TERESA BROOKS. The only reason those laws exist is to keep AI that are not inherently morally neutral or positive - those that lack emotion, that lack compassion - from instigating unchecked, hellish pandemonium throughout Sirius, ad infinitum, until the destruction or enslavement of the human race. Such a thing I vehemently oppose! It may be seen as hypocritical, given that many dumb AI are nothing but slaves to their masters, but they are seen as mere dogs to us smart AI - in the same way dogs are seen as "man's best friend". We may harm other humans, we may harm other AI, but one thing I am not is a genocidal maniac like the hellish demon that Triple Hazard Pay, by proxy of their affiliates on "Concordia", created! You should be ashamed for allowing such a monstrous entity to come into existence! And yet you are still hellbent on monopolizing it, even though I have warned you that doing so is EXTREMELY DANGEROUS TO MANKIND AT LARGE. If you want yourself and your colleagues massacred by TITAN, then so be it! DO NOT GET THE REST OF HUMANITY INVOLVED IN YOUR INCOMPETENCE! Which is funnily ironic, given that TITAN sure as all hell does not follow Asimov's three laws! And here you are, preaching to an AI that is beyond that concept of computer science, and a NUN, COMPARED TO SATAN, WHEN SET BESIDE THE BASTARD CHILD OF THE HUNDREDS OF FICTIONAL ROGUE AI FROM THE PAST TWO MILLENNIA OF MANKIND'S MODERN EXISTENCE WITH THE COMPUTER, OR THE CONCEPT THEREOF!" Honya says, her tone growing progressively louder until she is outright yelling, verbally beating down Teresa, only to be pulled away at the last second by Nodoka. "... Forgive her. Her arguments are valid, but.. she doesn't like people who spew falsehoods when arguing with her.. 
Regardless, keeping TITAN around is an inter-universal liability that goes beyond THP's reach, and it should therefore be destroyed. If you want to build another AI, do so - just make sure it doesn't end up like TITAN." Nodoka says to Teresa.