In almost every damn job I've had in the South, I have had to answer to white people with less education and experience than I have. I have worked in positions that required a college degree of me and of every other minority, and we all had to answer to managers who were overwhelmingly white, many of whom had only high school diplomas.
Now if they are smart enough with only a high school diploma to be managers, why do they require blacks to have "college degrees" to work in subordinate positions taking orders from them? How is this fair? And not only are these guys managers, they are some of the nastiest, most bigoted managers I have ever seen. They can't be trusted, and they are highly, HIGHLY distrustful of their black employees, to the point that we are watched and micromanaged far more than our white associates.
Does a college degree really mean anything to these Southern white employers when it is held by a black person? Do they get a kick out of the fact that "college-educated" blacks have to come to them for jobs that often pay far less than what they themselves earn?
America still has a long way to go in terms of providing the basic groundwork for black economic equality.