Random, but how many of you (who are Black) were told by your parents something along the lines of "work isn't for making friends / you don't socialize at work / don't talk about your personal life at work"?
I'm wondering if that hurts us in the workplace in some ways. It's a delicate dance, but at least in my field (law), people want to feel like they have a bond with you beyond just the work.
I've also seen a lot of summer interns not realize that showing up at the firm's planned social activities is part of the evaluation process. Attendance at everything on the social calendar is being tracked.
To be clear, I think racism/bias is the overwhelming reason we don't advance in these spaces. But I also think we aren't hip to the game of office politics.