A Law & Order-themed news episode this week, where we look at a collection of articles about the many legal implications that come with the ever-increasing use of machine learning and neural networks in the world's courts. We start with the first traces of "Minority Report"-style PreCrime, move on to the privacy rights of convicted people, and finish with the realities of A.I.-based facial recognition.
“We can’t risk people’s lives on automated apps that save money.”
Driven by a shortage of human resources, machine learning and artificial intelligence are slowly infiltrating the court system.
From our doubts about speech-to-text systems causing problems for future decision reversals, to our reservations about the accuracy of neural-network “recommendation systems” playing a role in deciding penalties for convicted wrongdoers, we’re definitely not convinced that this is the ideal way forward.
On the other hand, we humans don’t exactly have a squeaky-clean track record when it comes to decision making, so maybe it’s all for the best after all?
“Good” isn’t only for the “good”…
While this article tries to inflame tensions a little and unfortunately uses the opportunity to take some pot-shots at Google, we did feel the idea of privacy rights for convicted criminals is worth a discussion.
To be sure, we definitely don’t claim to have all the wisdom here, and these are just our opinions, but articles like this do seem to have a “hidden agenda”, and hopefully clearer minds prevail when this subject is discussed.
Maybe so, but this genie is not going back in the bottle…
Clearview’s credibility has pretty much been destroyed at this point, and image-recognition technology as a whole doesn’t fare much better!
But hang on: it’s easy to use, easy to set up, and sounds cool? Let’s do it anyway!