No, I don't think so.
There used to be a greater attempt to teach people to consider how their actions affected others. Americans now live in a society where they're taught that the people around them are obstacles. Where a business has no obligation to anyone but its shareholders. Where family matters and everyone else doesn't.
Look at some of the terms we use as a society: "keeping it real" and "I'm not PC." Can you imagine the culture as a whole in the '60s priding itself on being inconsiderate jerks?
Throughout history, people have been pretty dismissive of damage done to those they can class as "others."