Been seeing a lot about how the government passes shitty laws, lots of mass shootings, and expensive asf health care. I come from a developing nation and we were always told how America is great and whatnot. Are all states in America bad?
America definitely has its issues, but I think we have historically been good about surfacing problems and making sure they’re at least talked about publicly, even if they’re not fixed. This probably makes it look worse than it is. I feel like even in countries with reasonable free speech, there can be social taboos against talking about certain things.
I’ll echo this. My understanding is that compared to people in other countries, Americans are more willing to share their lives with strangers. Now, apply that to politics. As a country, we’re very open about everything.