Growing up, my sister and I used to go to the pictures and watch films like Escape to Witch Mountain, Herbie Rides Again and Digby, the Biggest Dog in the World (with no Black actors, as far as I can remember). We loved those films with the childlike innocence you would expect from films aimed at children. I also remember watching all of the wonderful Hollywood musicals; they were always so colourful (how ironic), so wholesome and so pure.
But recently I watched classic musicals like A Star Is Born with Judy Garland and Calamity Jane with Doris Day, and noticed, really for the first time after becoming “woke” (in the true meaning of the term), how Black people were represented in these movies. They were usually poor yet happy and smiling in their poverty, mostly servants or in the serving industry: waiting staff, porters, lift operators. Never were Black people portrayed even as “equal yet separate”, let alone “like one of us”. I could say the same for Indigenous people and other people of colour, who were either cast as the enemy or not represented at all.
I now find it difficult to watch the old Hollywood movies I used to love because of their consistent negative stereotypes of Black people and people of colour, which contributed to the steady drip feed of racist ideas that I, a Black person, held against Black people.
Rebecca, I had intended to write an article about how Hollywood has indoctrinated my view of Black people, but I think your article just about covers it.
Thanks for sharing