Everyone seems to be throwing the word "fascism" around a lot right now.
So far, I don't see it.
Sure, people don't like certain policies, but I haven't yet seen any fascism here in the US... Not in my entire life.
Everyone likes to point a finger at the other side as the evil fascists, but there's no proof of it. Unless you're willing to admit that it's the same as it's always been: it's about money.
I will accept the idea that most politicians, even the new ones, are megalomaniacs, though.