One thing that's become clear over the past few tumultuous and, for many, traumatic years is that it's easy to feel we have little control over our lives. Control is a basic psychological need that gives people a sense of agency, from how they live to where they work. One area where people have tried to wrest back control is work.
As a Rice University business school professor and author, I've examined the complex relationships between employees and their employers through my research, teaching and readership for nearly two decades. The pandemic's aftermath is the latest iteration of a timeless negotiation between labor and management over control, one that has taken on added significance in recent years.
The pandemic acce…
Read the full article at: https://theconversation.com/americans-are-taking-more-control-over-their-work-lives-because-they-have-to-194036