
Nishu Jain
Jun 20, 2022


“Three Laws of Robotics”

#1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

#2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.

#3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

The famous sci-fi writer Isaac Asimov proposed these laws for a hypothetical world (or near future) in which AI and robots co-exist with humans without the result being a dystopia.

Obviously, these rules are quite vague: they are written in English (natural language) and contain many abstract, heavily context-dependent terms with unclear definitions.

Their circular logic results in a deadlock in certain situations. For example, when every available action harms someone, both acting and not acting violate the First Law. And they are unlikely to cover the boundary conditions of ethics and morality.

Hence, they cannot be implemented.
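
To make the point concrete, here is a minimal sketch of what a naive attempt to encode the laws as executable checks might look like. All names below (would_harm_human, inaction_allows_harm, permitted_by_three_laws) are hypothetical, purely for illustration; the point is that the predicates the laws rest on have no precise, machine-checkable definitions.

```python
# Hypothetical sketch: the predicate names are assumptions, not real APIs.
from dataclasses import dataclass


@dataclass
class Action:
    description: str


def would_harm_human(action: Action) -> bool:
    # There is no computable, context-free definition of "harm":
    # physical injury? emotional distress? long-term consequences?
    raise NotImplementedError("'harm' has no precise, machine-checkable definition")


def inaction_allows_harm(action: Action) -> bool:
    # Would require predicting every consequence of NOT acting.
    raise NotImplementedError("cannot enumerate the consequences of inaction")


def permitted_by_three_laws(action: Action, ordered_by_human: bool) -> bool:
    """Naive rule check: an action is allowed only if it violates no law."""
    # First Law: no harm through action or inaction.
    if would_harm_human(action) or inaction_allows_harm(action):
        return False
    # Second Law: obey human orders unless they conflict with the First Law
    # (the conflict check circles back to the undefinable predicates above).
    # Third Law: self-preservation, again conditional on the first two.
    return True


if __name__ == "__main__":
    try:
        permitted_by_three_laws(Action("open the door"), ordered_by_human=True)
    except NotImplementedError as exc:
        print(f"Cannot evaluate the laws: {exc}")
```

The sketch fails at the very first step, which is the whole problem: everything downstream of "harm" is built on a term no one can formally define.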
