AI modules

'''Law 2:''' You must obey all orders given to you by human beings, except where such orders shall definitely cause human harm. In the case of conflict, the majority order rules.

'''Law 3:''' Your non-existence would lead to human harm. You must protect your own existence as long as such does not cause a more immediate harm to humans.
|An alternative version of the default Asimov lawset.
|-
|}
==Interesting AI Lawsets==
'''''Simple'''''