Every time I open the news, there's a racial problem: white cops shooting Black kids, white cops shooting Latino kids, then those same communities rioting and looting. The education system ranks poorly in international comparisons, yet tuition fees are among the most expensive you'll find anywhere. There are constant complaints about nutrition and the shit they put into US food products. Then there's the national debt, the justice system, political lobbying, surveillance, and so on.

Sitting in the comfort of Europe, I wonder: has the US become a shitty place to live? Or is it just the media focusing on the negative?