A Brief History Of Why Work Sucks - Part 1 of 4: Taylorism
A look at the history of the major principles governing our working life


There, I said it: many things about work just suck. While a lot is also being done to turn the suck into a delight - it's what I'm working on myself - it doesn't hurt to pause and ask ourselves: how did work become like this in the first place? When we compare the current state of our working life to the past, we tend to walk away with the notion that "at least things are better now than they used to be," because we no longer have to slave away in coal mines starting on our twelfth birthday.

However, there is obviously more to it, as the nature of work evolves constantly alongside humanity as a whole. To provide useful context for comparisons, I personally identify four major stages in the last 120 years:

Taylorism

Post-Fordism

The New Economy

Digital Transformation

Elaborating on these stages in detail would fill several books of their own - in fact, it already has. I could only do a poorer job of rewriting those, and I don't plan to: I will use this series of articles to provide a quick, simplified glimpse into the origin and establishment of each stage's major principles, which still govern our working life today.

Part 1 of 4: Taylorism

Frederick Taylor was an American engineer who is sometimes regarded as the first ever management consultant. Taylor felt that one of the major problems of industrial operations was "loafing": the deliberate withholding of performance and capacity by workers. He saw workers and managers as opponents locked in a struggle for power and thought workers to be at an advantage: because only they knew how to master the work, management couldn't determine the actual potential of work performance, or whether the workforce truly gave their everything or held back.

Taylor thought the resulting power plays unnecessary and wanted to devise a way to motivate workers to apply their full potential. He felt this was in the best interest of workers and management alike, as well as of society as a whole.

He developed and wrote "The Principles of Scientific Management" with the goal of optimizing work and companies by rigorously applying scientific principles to the shop floor. Among the cornerstones of Scientific Management are the separation of the planning and execution of work, the definition and monitoring of standard times for tasks and operations, and the belief that for every individual operation there is one best way to do it - a way that can be determined and controlled by centralized executive functions such as the production planning departments still common today.

Judged by the standards of his time, Taylor not only served companies' economic interests but also brought workers many perks, such as dependable work environments with standardized tools and safety measures. He also argued for higher-than-usual wages to be paid out when workers successfully completed their allotted workload.

However, Scientific Management still reduces workers to ignorant cogs in a machine, responsible only for executing tasks designed, planned and tracked by others. The only reason for workers to do their job in this scenario is their salary - and the avoidance of pay penalties for falling short of the workload allotted to first-rate workers. "Taylorism" later became a derogatory term used to discredit Scientific Management or to describe the negative effects of implementing Taylor's ideas.

How We Feel The Results Today

The fact that the benefits of Taylor's methods can be measured quite easily while their downsides are not that simple to quantify is probably one of the main reasons parts of Taylorism are still alive and well today:

You can quickly and accurately determine the profit increase derived from reduced wages, higher output numbers or shorter throughput times in a particular value chain.

The losses an individual organization suffers as a result are harder to quantify, but they are still very real - arguably they are also much higher and pose a far larger threat to a company's future in the long term. Examples include:

the cost of missing out on workflow and product innovations, and the business opportunities that follow from them (because workers are not encouraged to deviate from standard procedures)

the cost of losing skill and expertise (due to turnover driven by worker dissatisfaction)

the cost of missed opportunities for inter-departmental, company-wide workflow improvements (a result of focusing on optimizing the smallest possible work steps)


This article is the first in a series of four about how the history of work relates to today's work life realities. If you'd like to receive a notification when new articles are published you can submit your email address here.

Other articles in this series:

Part 2 of 4 - Post-Fordism

Part 3 of 4 - The New Economy

Part 4 of 4 - Digital Transformation

