Do they make us soft or educate us?
By “they” I mean our universities and high schools. Is it just me, or has education changed a lot? We no longer see teachers as authorities but as failed adults who didn’t manage to succeed in their field and fell back on teaching as plan B. Parents scream at teachers because their kid is struggling, and the kids blame the teachers for everything they themselves do wrong. It’s just horrible. What I…
