Economics

Has America always been capitalist?

History shows that capitalism isn’t natural or normal, strengthening the belief that we can create something better.

By James Parisot / 17 Feb 2019