Americans now believe that it is the federal government's responsibility to ensure that everybody has healthcare. What does that actually mean? Is healthcare a right?

Sources:

Americans believe healthcare should be provided by government:
http://www.pewresearch.org/fact-tank/2017/01/13/more-americans-say-government-should-ensure-health-care-coverage/

Waiting Your Turn: Wait Times for Health Care in Canada, 2016 Report:
https://www.fraserinstitute.org/studies/waiting-your-turn-wait-times-for-health-care-in-canada-2016

(full 100-page report)
https://www.fraserinstitute.org/sites/default/files/waiting-your-turn-wait-times-for-health-care-in-canada-2016.pdf

**I'd like to thank my producer, Bueller, who may or may not agree with the views presented here.