Has Obamacare Made Restaurants Partisan?
Politics in the US is discouragingly partisan. National politics has grown increasingly so since at least the late ’60s, when the passage of civil rights legislation pushed many conservative southern Democrats into the Republican Party. Even state politics has become more polarized: famously nice people in Wisconsin have found themselves battling their neighbors across…
