I find this beyond annoying, and so do many of my friends from around the globe who are currently studying at colleges in various countries.
Professors at those colleges claim that we are living under capitalism and that the bad state of the economy is capitalism's fault.
It happens really often at my college: "capitalism is to blame." The other students in class believe those words, and it makes me sick.
The question is whether these professors even understand that we are not living under capitalism, or whether they just repeat what somebody told them (blame everything on capitalism; the students have no idea either way).
First, if they don't understand that the current system is a mixed economy, they shouldn't have received their diplomas and should never have become professors in the first place.
And second, if they're required to repeat what somebody told them, that says everything about the usefulness of state-controlled education.