No, I think that attitude is a result of the Enlightenment, at least in the West. I don't think that historically Christianity held that all men were equal, unless you were an aristocratic white male, and even then you were often subject to the whims of the current king, which should raise issues...