Looking at the news, you'd be forgiven for thinking that the US is an all-round terrible place. There's no shortage of people suffering for one reason or another, and that hardly reflects well on the country as a whole. It led u/Homelss_Emperor to ask fellow Redditors whether the USA is really as bad as outside impressions often make out.
Although not everyone was willing to defend the country's portrayal, opinions on the topic varied. While the consensus seemed to be that it's far from perfect, there are definitely many factors to take into consideration.