I think you can use a Turing-style diagonalization argument against the existence of a finite set of moral rules that covers every situation, too. The argument goes: suppose you have such a set of rules. Now engineer a situation in which following the rules to the letter causes something bad to happen. That mirrors the step in Turing's halting-problem proof where he says, in effect, "now construct a program that asks whether p halts, and if the answer is yes, runs an infinite loop."
Which means no sacred text or finite set of commandments could possibly cover every situation.
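The halting-problem construction alluded to above can be sketched in a few lines. This is a minimal illustration, not a proof: `halts` is a hypothetical decider (no real implementation can exist, which is the point), and `make_adversary` builds the self-defeating program that contradicts it.

```python
def make_adversary(halts):
    """Given a claimed halting decider halts(program, inp) -> bool,
    build the program that defeats it (Turing's diagonalization)."""
    def adversary(program):
        if halts(program, program):   # decider says "halts on itself"...
            while True:               # ...so loop forever, contradicting it
                pass
        return "halted"               # decider says "loops", so halt instead
    return adversary

# Feeding adversary to itself forces any decider to be wrong:
# if halts(adversary, adversary) returns True, adversary(adversary) loops;
# if it returns False, adversary(adversary) halts. Either answer is wrong.
adversary = make_adversary(lambda program, inp: False)  # a stub decider
print(adversary(adversary))  # the stub said "loops", so it halts: "halted"
```

The moral-rules analogue swaps "decider" for "rule set" and "adversarial program" for "engineered situation": whatever the rules prescribe, the situation is constructed so that following the prescription produces the bad outcome.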