Does anyone remember when men were men? When boys looked up to their fathers and other men and wanted to be men just like them in their “manly ways”? Now, with elementary education completely hijacked by women, there’s been a systematic change in how boys are educated and raised. When is the last time you saw a male teacher in an elementary school? Do you really think the self-esteem movement has done anything but diminish real accomplishments? Fathers just want their sons to be men.
Why are boys dropping out of high school at a much greater rate than girls? Why is the percentage of young men entering college now substantially lower than that of young women? Why are boys in elementary school made to read books on subjects that are clearly female-oriented rather than the books they’d organically choose to read?