On a totally unrelated topic, I promised to re-post and expand on a Detroit: Become Human meta I started on my Tumblr, about the role of compassion in self-awareness and deviancy.
It makes sense, as AI becomes more and more of a reality, that its first real field tests would be in the medical profession. Japan is already field testing robotic AI in nursing and senior care homes, because the medical field experiences some of the worst burnout and job turnover. Medicine is an overworked, underpaid, and understaffed profession already. Imagine what it will be like by the time DBH becomes a reality.
The one thing we have failed to do so far is to instill even a false sense of compassion or bedside manner in an AI, but what happens when we do? It's naive to expect that instilling an AI with a sense of compassion won't one day lead to those feelings "becoming real." We cannot program AI for compassionate care and then be shocked when this leads to self-awareness, because the root of compassion is, in and of itself, awareness of things beyond the self. And for things to exist beyond the self, awareness of the self must already be present.
I've read all the game lore about Markus being a new prototype specifically created to match Carl, but all three of the playable characters have a pre-established sense of care: Kara is Alice's caretaker, Markus is Carl's, and Connor is created to protect humans (a form of care) from deviant androids.
Markus doesn't want to harm humans because he was built to care for one. Unless the player chooses violence for him, he won't resort to it: pacifism is written all over his narrative. If you proceed in the way that feels most natural for who Markus was before he went deviant, it feels genuinely out of character for him to be violent unless there is literally no other choice. Kara is presented with many of the same choices, but Alice being a factor makes her route more nuanced if you look at her path decision by decision. The same goes for Connor. It always struck me as ironic that the character whose actions have the most influence over the narrative as a whole has the least nuanced decisions, especially if you follow the pacifist path. (Almost like saying pacifism is more natural to the human condition than violence. How zen.)
Before Detroit: Become Human becomes the future, we need to learn from the philosophical questions it poses: if we create AI to work in a field that requires compassion, we cannot react with fear when that compassion leads directly to self-awareness. If compassion is a "deviant" behavior, then we must face the reality that, as a species, we have accepted violence and conflict as the norm.