There needs to be a cultural shift in America. I'm not talking about culture-war bullshit; I mean the average American needs to learn to care about their community and the rest of the world instead of being a self-absorbed asshole with a "fuck you, I got mine" attitude.