The answer is yes. It's tough to argue with these numbers.

A majority of Americans are liberals; that's not really up for debate anymore.
It is a persistent belief among many in the political and media establishments, fed by decades of right-wing propaganda, that the United States is a "center-right nation" in which progressives are far too liberal for mainstream positions of power.
If you look purely at electoral outcomes, those who assert this appear to have a fairly strong point. The last several decades of federal politics have been dominated by center-right policies, and truly left-wing politicians have been largely marginalized (e.g., Bernie Sanders). Even Clinton and Obama — the last two Democratic presidents, who in theory should be leftists — are corporate-friendly moderates who triangulated in negotiations with Republicans to pass center-right policy compromises (e.g., Obama's Heritage Foundation-inspired ACA, or Clinton's Defense of Marriage Act compromise).
While electoral results may support the idea of a center-right nation, looking beyond electoral politics — which involve a mixture of policy choices, party politics, fundraising and propaganda — and focusing purely upon raw policy preferences leaves us with an entirely different picture.
Here is a compilation of polling data from various reputable American polling organizations, describing the policy preferences of the American people over the last year.
And if your only defense is to attack the source, read it first: it cites a whole lot of far-right polling sources, and they all say the same thing. Libs have taken over.
http://www.salon.com/2014/07/28/gops_30_year_spin_job_is_over_why_we_are_not_a_center_right_nation/