So I am in this sociology course titled “Race & Ethnic Relations.” Today the professor gave us her reasoning for why racism persists in America: the corporations want racism to exist. Her argument went something like this (very roughly).
Finance capitalism is to blame for the continuing racism in this country. It works like this: I interview two people for a job, one black and one white (her area of study is racism toward African Americans). Because of racism, I can get away with lowballing the black guy, so I don’t offer him any more money than I have to. That low offer then sets the benchmark, so I only have to offer the white guy the same $. This drags everyone’s wages down and leaves workers with no control over their own lives. So the corporations keep racism alive.
To me, though, it seems like the guys in the corporations would have to be racist in the first place for this theory to work. Someone has to come from a racist background or upbringing to bring racist practices into the workplace.
Basically, I’m saying that the corporations don’t keep racism going in this country; it’s the old-school prejudices that come from people wanting to build themselves up by putting others down.
Anyone have any comments on this? I would like to make a good argument against this professor. She has too much of an agenda for my taste. I want to learn facts, theories, whatever, but not be told how I should think.
She also posted a bogus stat, flatly claiming that “women make 70% of men’s wages.” This, of course, is based on a misleading comparison: the figure comes from raw averages that don’t control for age, experience, or time on the job.
Just looking for thoughts.