All healthcare should be non-profit.
Our healthcare system and the government's view of it are dominated by the pharmaceutical industry, run by people whose intent is not to cure but to make money, and we all seem to think that's fine. In almost every country, people die every day from curable diseases simply because they can't afford the treatment or because their health insurance won't cover it.
How can you actually trust a pharmaceutical company to make you better when its need for ever-bigger profits depends on you continuing to buy its pills? The only way to remove that conflict of interest is to eliminate the profit motive entirely and make this industry about healing people again.
In our society we should help the weaker and less fortunate carry their load, instead of leaving them on the street like that little Chinese girl who was run over. How can we stand by and watch someone die, telling ourselves that's just how it is? The uncivilized, egocentric idea of "me first and f#*k the rest" should be behind us by now.