Study finds 83% of software developers feel burnout (usehaystack.io)
344 points by IcyApril on July 13, 2021 | 289 comments


If you read https://haystack-books.s3.amazonaws.com/Study+to+understand+..., you’ll see

Q1. To what extent, if any, do the following statements apply to you?

I feel burnt out from work.

On a four-point scale, the answers split 21%, 35%, 28%, 17%, with the last group saying the statement doesn't apply to them at all.

So, they took everyone except the 17% who said they didn't feel burnt out at all, and got 83%.

I think asking for feelings of burnout exaggerates the issue of being burnt out by a huge factor.

For me, burnout means not only being tired of your job, but also inability to ‘burn’ outside work. I don't even see how you can be moderately burnt out in that sense, but over a quarter of respondents reported exactly that.


> I think asking for feelings of burnout exaggerates the issue of being burnt out by a huge factor.

I kind of see where you are coming from, but I think this is an important question to ask. Asking someone's opinion on something is a REALLY good indicator of their opinion on that thing: do you hate your job, do you like chocolate, do you love your wife, etc.

While the question "are you burned out" might not predict attrition as well as other questions and it might not be "scientifically assailable" - it still does tell us something about the person.

At work, I frequently ask my team (individually), "do you feel burned out", but I ALSO ask other questions: "do you feel like you are doing good work", "do you enjoy what you are doing", "do you like the people you work with", "am I a good manager" (a notoriously BAD question).

These and others help me keep a pulse on my team and what is going on in their heads.


But that's the point. Burnout is not one of those things where a person's opinion is the fact. People are often completely wrong about how stressed they are at the exact moment you ask them, and burnout is the kind of long-term experience that people are really bad at measuring.

Besides, even "do you like chocolate" is a very bad question to ask if you want to draw any actionable conclusion from your data, or even if you just want to sum your anecdotes up. You can't go around asking people "do you like chocolate" and expect to know how much chocolate they will decide to eat, for example.


Try "how can I be a better manager?" or "do you have any suggestions for me or the team as a whole?"


> I think asking for feelings of burnout exaggerates the issue of being burnt out by a huge factor.

Is there a better scientific method to quantify burnout?


Not sure, but I'd ask questions around symptoms. A doctor also doesn't ask whether you have illness X, but rather whether you have symptoms Y and Z, and tests for them.

Burnout has some pretty severe symptoms associated with it, several of which can be taboo in certain circles. It has strong overlap with depression, for example. If 83% of developers had these symptoms, that would be shocking to me.


This is akin to saying that people who are missing some toes, but still have some toes, should not be counted among people who are missing toes. Burnout, like malnutrition, is not desired in our society, and so even minor amounts of it should be measured and treated.


"For me, burnout means not only being tired of your job, but also inability to ‘burn' outside work."

"burn outside work"? What does that mean?

Burnout is feeling great distress at doing your job, due to factors such as overwork, disillusionment, and depression. Sometimes that adverse feeling is so intense the employee is psychologically unable to continue.

As much as some people would like to fancy themselves coldly logical automatons, human action is actually in great part motivated by feelings. So feelings matter.


If a fire has burnt out, it can’t deliver any warmth anywhere, not just at some things.

Similarly, for me, the term “burnt out” means not only that you can't accomplish anything useful at work, but also that you take that home and stay feeling powerless there. If you feel tired at work, but on the first day of a holiday feel full of energy again, I wouldn't call that burnout.

Wikipedia doesn't seem to fully agree, though. It redirects https://en.wikipedia.org/wiki/Burnout to https://en.wikipedia.org/wiki/Occupational_burnout, and quotes the WHO as saying “Burn-out refers specifically to phenomena in the occupational context and should not be applied to describe experiences in other areas of life.” (https://www.who.int/news/item/28-05-2019-burn-out-an-occupat...)


> If a fire has burnt out, it can’t deliver any warmth anywhere, not just at some things.

You're putting a lot of weight on the mechanics of "burning," but the physics of combustion are an analogy of convenience, not the diagnostic criteria.

One of the reasons that "burnout" applies particularly to work rather than hobbies or recreational pursuits is that people (generally speaking) can't afford to just walk away from their job and return when they feel like it.


> As much as some people would like to fancy themselves coldly logical automatons

If anything, seeing oneself as a logical, emotionless automaton is one of the reasons for burnout and depression.


So they also don't test if people actually are burnt out, just whether they say they are?

My observation is that most people in tech discussing burnout do not know what burnout is. They think it means being bored with your job.


> So they also don't test if people actually are burnt out, just whether they say they are?

You can't shove someone into an MRI and get a 'units of burn out' score complete with confidence intervals.


I wonder if software presents a unique situation, as we generally make a pretty decent salary and transitioning to a new career involves a pretty significant pay cut. So some developers become trapped in jobs they hate by golden handcuffs. I really don't enjoy software development at all, but at this stage, to even think about doing something else would mean a massive pay cut. I did it once when I was younger: quit software and started a new career with a 55% pay cut. Liked the new work and did it for 5 years. Then I had kids and needed more money, so I returned to software; since then I just exist in a constant state of burnout.


> since then I just exist in a constant state of burnout

The worst part is that even that doesn’t seem to matter. Whether I do my best work because I’m 100% into the job, or my worst because I just cannot be bothered any more, people perceive me the same way, and my salary remains the same.

If the effect is nil, why should I even bother?


This is true for many jobs.

Roofer? Car mechanic? Yeah most customers won't know the difference. Burger flipper drops one on the floor. Toss it or put it back? Nobody knows.

This goes right up to CEOs - it's hard to _really_ evaluate performance. Stock price is a lagging indicator of company health.

The reasons to try to do a "whole-ass job" (it's OK to phone it in, just do it solid) for me are:

a) it's what I'd like others to do when they do stuff for me, and b) I think it's psychologically safer, long term.


This is exactly my experience as well. My current company has no clue about the value I provide, and my salary is going nowhere.

It is a social game. People who are outgoing and good looking win.


It's not at all unique; it's a mix of "efficiency wages" and tournament-based compensation.

Efficiency wages just means that the pay is set higher than what it'd take to clear the market. It means that people who'd otherwise have even better career prospects become programmers instead, people who would otherwise quit to work in a different field or for a competitor don't, and that employees generally see jobs at the current compensation level as something relatively scarce and hard to reacquire if they end up unemployed. Employers pay these efficiency wages because getting more out of their programmers is worth the extra expense, but on the flip side it means that they're more able to ask for overtime and otherwise apply stressors to employees.

Lots of places have efficiency wages, and they all tend to be stressful AIUI. Oil workers are notorious for this - massive salaries for dirty, dangerous, and difficult work shifts.

Tournament theory is more about how difficult it is to measure programmer productivity. So, don't even bother; instead get a sense of which programmers are doing better than the others, and make a significant portion of compensation based on "winning" this vaguely stack-rank shaped "tournament". In theory, if employees are risk-neutral, this is as economically efficient as measuring their output and paying piece rates, but without the overhead of measuring output. In practice, this is yet more incentive for overtime, hard work, extracurriculars, networking, etc. Basically, instead of paying for the direct output of giving your job your all, they pay the most to whoever is best at doing that, and let people figure out how that should translate.

The stereotypical jobs for tournament theory compensation are corporate executives and professional athletes - how hard an athlete works in the gym and what statistical performance is worth what are hard questions, but "who's the best 10 players in the NBA" is a lot easier to answer.

Anyhow, that's my understanding of the main factors leading to software developers being so well-compensated in the US and why there are so many unhappy and burnt-out developers. They're part of the same complex, IMO: it's fundamentally a result of programmer productivity being so important to businesses and yet so hard to measure.

The thing I'm most curious about is how pretty much all the example jobs I gave skew male. It's not every male-skewed job that fits the bill, but it's still something. I'm not at all sure what, though.


I was thinking a lot about Golden Handcuff situations, and in a weird sense it's artificially creating the actually bad scenario where someone is stuck in a job because they NEED it to make ends meet. You can be sure people in low-paying jobs with low mobility are also burnt out. The difference is recognizing that golden handcuffs are an artificial situation and money doesn't create happiness (once you cross that base threshold). It's easy to see in others but hard to act on yourself. Ultimately it's one of those things where you just need to be principled -- never stay at a job (career, etc.) because of Golden Handcuffs. It's just like building flexibility or resiliency into an engineering system: you don't need to specifically justify it, it's just a good principle to follow as a general rule.


That's the problem I have. I really want to switch to finance but I make so much money now I dread starting over at $60k or something.


Yup, my salary allowed my wife to recently switch careers and take a 50% salary cut. There is no way we could do the same for my career, though, unless we made drastic changes like selling the house and never retiring.


How much of the workload is just the dev shortage/insufficient staffing?

I am currently the only dev on my team because of departures (there should be 4). Now, the product is not really in production all that much yet, so it hasn't required late nights or waking up at odd hours, but it does linger in my mind that this could become a big issue if it grew a lot and my team didn't. I would hate to support it alone.

A friend of mine is missing two people on his team at another company. It means that they had to shove the interns into the support rotation to have full coverage of their app. They are hiring too, but as things stand, interns are staying up at night to deal with production errors and waking each other up as well; they are interns and don't know what they are doing.

Software seems like something that needs a sufficiently sized team after a certain point and nearly every team I know of anywhere is short people at the moment, even at FAANGs.


> How much of the workload is just the dev shortage/insufficient staffing?

ZERO PERCENT. Listen to me very closely. The quantity of people has ZERO IMPACT on your work quality. Well... maybe like 3%, but hear me out.

Your manager has a HUGH impact on your work quality. The work that is given to you, your PERCEPTION of the work, the urgency of the work, your development opportunities (and your response to them), the people you "hang out with at work" (note, this is about quality and not quantity of work friends).

Of note, hiring more people can help with things like "We need 24 hr support", but that can also be helped by things like "well, actually, we only need 8 hr support, and the customer can pound sand" - that is where a manager comes in.


You're talking about quality, but what about quantity?

If you need to develop:

  - the back end for an application
  - the front end for an application
  - some external APIs for the application
  - a structured data storage layer, say, with an RDBMS
  - an object storage of some sort, for example, on top of S3
  - some sort of reporting functionality
  - mailing list functionality
  - etc.
Then no matter how good your manager is, there'll be more work than 1 person can reasonably manage.

Furthermore, the quality of your work can be stellar, but if there are 5 different problems, all of which need solving, and you tell your clients to pound sand, you might find yourself with no clients.

Understaffing feels like a real problem.


Having many engineers causes changes to your engineering workflows and standards that make them a lot less productive, in order to cope with having so many engineers. Also, some of your many engineers will choose to spend their time creating internal platforms that are much worse than commercial and open-source alternatives, and since you are good collaborative synergistic team players you must always use the thing written internally, no matter how bad it is.

Creating a new backend endpoint requires a multi-week design review process and must be implemented in a straight-jacket of a programming language with 90% test coverage and tons of layers and extensive mocks of each layer.

Creating a new frontend requires the same multi-week design review process, and must use the Nth incarnation of our NIH Javascript framework. The developer who built it got promoted and then left; it's been handed over to India to maintain. So anytime anything goes wrong, post a question to the framework maintainer's slack channel and wait to hear back in the morning. Or don't. They're just as mystified by it as you are.

Creating new external APIs requires, ironically, no design review but you must use the shitty homegrown code generation tool to generate a PR to add your endpoints to the API gateway. The builds on this PR take 4 hours. Then after it's merged you can wait for the API gateway to be deployed later this week. You must repeat this process any time the schema changes.

Creating a new RDBMS table goes through a theoretically self-service workflow hosted by the RDBMS team but probably one step fails so you need to file a ticket and wait for them to fix it before your table is provisioned and ACLs dialed in.

In-house object storage implements an S3 compatible API but you must submit a ticket with a detailed description of your use case and wait a week or two to get provisioned.

If your email is static or contains only simple account properties, it's easy to create in the internal email platform's template builder UI. But if it needs complex data from your application, you'll need to land changes against the email platform to implement fetching of that data from your application. No, you can't push data into the template from the sending application, that would be too easy.

Compare to one or a few people working directly with Django/Rails, React, S3, Postmark, etc. and practicing the "write tests, not too many, mostly integration" thing.


> Creating a new backend endpoint requires a multi-week design review process and must be implemented in a straight-jacket of a programming language with 90% test coverage and tons of layers and extensive mocks of each layer.

Or perhaps an hour or so long meeting, in which the overall design is discussed so everyone is vaguely on the same page and the rest is handled when reviewing the merge request and/or as Slack messages in the appropriate channels, as necessary. Heck, shoot a message to the business analysts and maybe have a 5-10 minute conversation to clear up any of the stuff that you don't understand.

> Creating a new frontend requires the same multi-week design review process, and must use the Nth incarnation of our NIH Javascript framework. The developer who built it got promoted and then left; it's been handed over to India to maintain. So anytime anything goes wrong, post a question to the framework maintainer's slack channel and wait to hear back in the morning. Or don't. They're just as mystified by it as you are.

Or use the exact same approach as above, with whatever was used in the other components of the system, for familiarity's sake. Some vaguely recent version of Vue? Sure, go ahead! Maybe even copy some bits that don't feel like they should be shared code and could change in the future, but that'll be easy to reuse for now.

> Creating new external APIs requires, ironically, no design review but you must use the shitty homegrown code generation tool to generate a PR to add your endpoints to the API gateway. The builds on this PR take 4 hours. Then after it's merged you can wait for the API gateway to be deployed later this week. You must repeat this process any time the schema changes.

Quite the contrary - it's just an API that perhaps needs a bit more attention from the security specialists and some additional scanning here and there. Knock it out, give it to others to test, and fix any of the problems that they find. Builds might take around 5-15 minutes.

> Creating a new RDBMS table goes through a theoretically self-service workflow hosted by the RDBMS team but probably one step fails so you need to file a ticket and wait for them to fix it before your table is provisioned and ACLs dialed in.

Or, you know, the table is integrated as a migration file within the application that will automatically get applied to the testing/staging/prod databases when the changes land in the environment. Maybe even use a local DB instance inside of a Docker container for breaking changes, so that you can discard any of the stuff that you break.

> In-house object storage implements an S3 compatible API but you must submit a ticket with a detailed description of your use case and wait a week or two to get provisioned.

Or just add whatever software you need to a Docker Swarm + Ansible deployment on your company's servers and have it be up in a matter of minutes. Maybe grab a coffee, because rerunning parts of Ansible playbooks is sometimes a bit hard, but procrastinating for 10 minutes never hurt anyone. Then test whether everything works in your development environments and let the people responsible for acceptance testing, security, and prod deployments handle the rest.

> If your email is static or contains only simple account properties, it's easy to create in the internal email platform's template builder UI. But if it needs complex data from your application, you'll need to land changes against the email platform to implement fetching of that data from your application. No, you can't push data into the template from the sending application, that would be too easy.

Or just add a new Java class and some templating logic to generate whatever you need in the application code.

> Compare to one or a few people working directly with Django/Rails, React, S3, Postmark, etc. and practicing the "write tests, not too many, mostly integration" thing.

And possibly end up with an insecure, faulty, and otherwise unmaintainable solution because there wasn't even a modicum of oversight; or alternatively don't ship anything in a timely manner because the scope creep was unmanageable for the number of people that you had. Or maybe just end up with those people using SaaS solutions which are more like SaaSS ( https://www.gnu.org/philosophy/who-does-that-server-really-s... ), which you were legally not allowed to do due to where the data ends up, and then have bunches of fun dealing with the legal implications of that.

Bottom line: if most of your time is spent fighting corporate policies and bureaucratic apparatus, it doesn't really matter whether you have 1, 5 or 10 people on your team, because dysfunctional environments can happen at any scale. I'd say that what you describe is an environment that's actively hostile to getting anything done, which could stem from a lack of decent management and people doing whatever they can to make themselves look better for a performance review, or to justify their own existence. Having too many people would definitely contribute to that, but at this point I feel like this discussion has perhaps devolved into some parody of the Goldilocks story, where the corporation would struggle to find the right amount of people.

Furthermore, I guess it's a matter of attempting to centralize everything vs having a decentralized approach, seeing as different projects and teams might need different solutions, hopefully mostly open source ones. I do think that you're perhaps also bringing up the fact that larger corporations oftentimes attempt to standardize on this stuff, much to everyone's chagrin, but that's not necessarily caused by team size alone, rather the point in the company's lifecycle.

I should know - I've seen both of the scenarios above, and having few focused people can indeed work nicely! For example, I developed the homepage for the contact tracing application in my country alone, a page that saw hundreds of thousands of visitors: https://apturicovid.lv/#en Due to the trust placed in me and the creative freedom, I was able to deliver about 70 different deployments of the site over a few months.

Did it work nicely? Yes. Was there also a pretty big possibility of me messing up? Also yes. Would this approach work for startups, side hustles and other smaller projects? Sure. Would I want to do the same for your typical enterprise project, where battling scope creep and shifting priorities is the everyday reality and the concept of set sprints is a joke? Probably not; that's where having more people makes sense.


This is known as the "mythical man-month." [0] For some reason, in software engineering more programmers do not equate to more productivity. I believe this is because of the theoretical and personal nature of programming. We are constructing in our minds ideas about how a system should exist to solve the problems at hand. By adding more programmers, we are adding more ideas, which, when ineffectively communicated (certainly human nature), compound into a smorgasbord of discord and bug tickets.

There is obviously some validity to your point. A 10 person startup could not run a mature company the size of Twitter. But that same 10 person startup will likely not see a doubling in productivity by hiring another 10 programmers. My experience has shown they will see a mysterious doubling in their miseries, and many more "strategy meetings" to ameliorate the issues.

I hope that you would agree with me that it is feasible for a skilled programmer to accomplish everything in your list in some expedient amount of time, even though it would clearly go faster if he had a buddy to work with.

[0] https://en.wikipedia.org/wiki/The_Mythical_Man-Month


> ... But that same 10 person startup will likely not see a doubling in productivity by hiring another 10 programmers. ...

> I hope that you would agree with me that it is feasible for a skilled programmer to accomplish everything in your list in some expedient amount of time, even though it would clearly go faster if he had a buddy to work with.

Agreed. What you're describing above does sound like Brooks's law, which states that adding more manpower to a late project makes it even later (at least as far as the mindset that you're hinting at goes): https://en.wikipedia.org/wiki/Brooks%27s_law

Or maybe it's simply diminishing returns in your example above, which also seems valid.

For the most part, I've observed that in my own experience; however, that's not exactly what I'm advocating for. Think more in the direction of Amdahl's law, but applied to the management of tasks within a project: https://en.wikipedia.org/wiki/Amdahl%27s_law

There will always be parts of a project which can be done in parallel and therefore should - that's not to say that it's possible to skip out on context for how they'd fit together with the greater solution, assume that all problems are interchangeable or run into any other problematic lines of thinking along the way, yet the possibilities should definitely be covered!

Here's perhaps a better example from my dayjob: there are problems with the Foo view in the application, which are costing X$/month. There are also problems with the Bar reporting functionality in the application, which are costing Y$/month. Furthermore, the system needs a new Baz module developed, which is projected to bring in an additional Z$/month. Also, the metrics show that historically there have been ops related issues, so introducing containers into the mix could help save W$/month, as has already been proven in other projects.

So essentially you have tasks that concern the following functionality: Foo, Bar, Baz and some DevOps stuff. So, if each task takes T amount of time, that'll be 4T in total.

If you have 1 engineer they can probably develop all of that eventually, but all of the tasks will probably be addressed in whatever order they get to them, so it'll therefore also take them 4T.

If you have 4 engineers AND the tasks have little overlap (this is where Amdahl's law makes a difference), they can each work on one of the tasks in parallel, so suddenly 4T becomes 1T.

I think it's clear which option would lead to the highest gain, i.e. X + Y + Z + W.
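
To make that framing concrete, here's a minimal sketch of Amdahl's law applied to the example above (the 25% serial fraction in the second call is a made-up illustration, not a figure from any real project):

   def amdahl_speedup(parallel_fraction, engineers):
       # Classic Amdahl's law: speedup = 1 / ((1 - p) + p / n),
       # where p is the share of the work that can be parallelized.
       return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / engineers)

   # Four fully independent tasks (Foo, Bar, Baz, DevOps), four engineers:
   # 4T of work finishes in ~1T.
   print(amdahl_speedup(1.0, 4))   # 4.0

   # If a quarter of the work is inherently serial (shared integration,
   # reviews, coordination), the same four engineers only get ~2.3x.
   print(amdahl_speedup(0.75, 4))  # ~2.29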

Of course, people always love to talk in absolutes and no one ever considers the particular circumstances of the projects. For example, in the 1M SLoC Java project that I'm currently working on, a single pizza team is pretty unlikely to step on each other's heels. Even entire features can be developed and delivered separately, without merge conflicts down the road. Whereas on another project that I'm also working on, a much smaller one, I get merge conflicts whenever I'm about to make a pull request to merge changes back into the master branch. Figuring out which project is which, and what approaches and design patterns to use to minimize the pains of scaling - that's the hard part.


It is a fundamental truth that there is more work to do than time available to do it. That this pressure lands on developers is really a perpetual management failure.


> Your manager has a HUGH impact on your work quality.

Are you talking a Hugh Grant impact or a Hugh Jackman impact?

Sorry. Couldn't resist. Hugely.



I used to work in a company with 400+ developers and our team barely had any new features to add. We took care of one screen in the app.

However getting anything done was an impossible task and required huge amounts of code because of the complexity of the architecture.

I left after 6 months and a couple weeks because I couldn't bear the stress.

Fred Brooks is still right in 2021. More people, more problems.


> More people, more problems.

My theory is that teams scale exponentially. That means two people working together are probably as productive as two people working on completely separate things. (By completely separate, I mean totally different goals -- your weekend projects vs. my weekend projects. Definitely not employees at the same company, who in theory have some sort of common goal.) But it means that 10,000 people working on the same overarching project are about as productive as 14 people working on their own thing.

This sounds bad, but if you treat it as a law of the universe, it's not so bad. If you want to do a project whose scope is the same as 14 unrelated one-person projects, you just need 10,000 people. It certainly seems to happen in other fields; there are no one-person teams building skyscrapers or new subway systems.

People's workarounds for this are trying to split teams apart so that they can be somewhere lower on that exponential where the actual ratio doesn't feel so bad (1/1 is better than 14/10000). Unfortunately, it doesn't really result in great products and you end up with the classic "big company" feel -- where the company has 3 different chat apps, 2 infrastructure teams, is a gigabit ISP but can't upload or download to its own products at 1Gbps, etc. The individuals working on those projects are slightly more productive, but the company as a whole fails.

So I guess if you work at a company of size N, and you feel like you're 100%∗log2(N)/N as productive at work as you are on your personal projects outside of work, then you're doing great. The problem is that this feels super bad to software engineers, who probably got their start working alone in their spare time. But it's the law of the universe and there is nothing you can do to fight it, except hope that you can figure out a business whose scope is low enough that log2(N)/N is pretty close to 1.

(2 may not be the right exponent, of course; maybe it's something like 1.8. You'll never get around the fact that the number of communication paths grows quadratically. At a 2 person company, there is only 1 1:1 to have. At a 40 person company, there are 40∗39/2 = 780 1:1s to have. That ain't great scalability.)
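
Here's a minimal sketch of that toy model as I read it (the log2(N)/N productivity figure from above, plus the standard N∗(N−1)/2 count of 1:1 pairs; the exact exponent is the commenter's assumption, not established fact):

   import math

   def team_output(n):
       # Toy model: a team of n is as productive as log2(n) independent devs.
       return math.log2(n)

   def per_person_productivity(n):
       # Fraction of a lone developer's output each team member delivers.
       return math.log2(n) / n

   def one_on_one_pairs(n):
       # Distinct 1:1 communication paths on a team of n people: n*(n-1)/2.
       return n * (n - 1) // 2

   for n in (2, 14, 40, 10_000):
       print(f"n={n}: ~{team_output(n):.1f} person-equivalents, "
             f"per-person {per_person_productivity(n):.1%}, "
             f"{one_on_one_pairs(n)} 1:1 pairs")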


This is why FAANG pays so much. Imagine 1000+ engineers working on a single page in a web app. Your entire team is responsible for 1 widget on the page.


Are FAANG developers more productive and efficient?


Coming from a bit of a different situation: I am at a government systems center that runs off customer funding rather than tax money, so we take clients. My group is being run short of people just as you describe.

Almost every project I have been on has been understaffed. My project last year was one year of funding to research the application of deep learning to a problem that had only had some classical engineering algorithms applied to it before. Besides the R&D, we promised to deliver software that the client could use to verify experiments and test models against their systems - basically everything up to, but not including, integration (with integration coming later if results were good). The team was a PM, a domain expert, and me. That's one person to do all the research and experiments and provide basically two working and tested software deliverables. Good times.

My current project is one piece of a government effort shared between two three-letter orgs, a smaller contractor that's kind of PMing plus doing infrastructure, a large contractor handling modeling, and us handling correlation. We already had a correlator written in Java for portability--that is why we were brought on--so it should be easy, right? A bit of data analysis, configure the software accordingly, pass off the jar and config, and we should be good. The effort was funded for about two people plus a PM but has had one full-time and one partial person for the past six months. Our part of the project ballooned to Helm charts, four Kafka-interconnected Docker containers, security scans on all that plus a bunch more stuff required every release, lots of core rewrites to hit their scalability goals, and constant data analysis because every week a performer comes in with a "hey, why didn't this particular example get correlated as expected" question. Basically, one person to cloudify + do data analysis on big data + rewrite Java software to scale better is just crazy. I am on it because the "half time person" left a year ago and wasn't replaced, and the full-time person is now also leaving in August, so they're finally forced to restaff. I was told I'd do data analysis and algorithm work only, but over the first month I've only done security fixes and cloud infrastructure related tasking.

I bring this up to point out that, yeah, it's cool to learn new stuff, but the shortage of people also means double or triple role work, which definitely accelerates burnout.


I worked for a government agency funded by clients as my last job. Same issues.

A big issue with understaffing is knowledge. Like in my prior job, and for you in your current one, you are spread across so much tech. I know that even in my current job I have certain work that is little more than Googling and assuming the tutorial does the job.


It might also be that applicants recognize it's not a healthy place to work -- interns staying up all night to fix production errors is a red flag. Then the problem is a bit self-reinforcing, and the only solution is to reset expectations and deliverables to maintain a healthy pace of work. If that pace is not possible, and the company can't offer competitive compensation for the scenario (ex: "We're asking you to carry the company, so here's some extra meaningful equity"), then it's in a downward spiral and it's merely people abandoning a sinking ship.


Well, that's just it. Apparently WLB is otherwise great at the company, but because they can't hire people, it has now become horrible.

Anecdotally, a heck of a lot of teams are in this state.


How much of the dev shortage is due to excessive gatekeeping? The average interviewing process has become very burdensome these days. Not sure it translates into a higher quality of hires compared to the past.


Using interns like this is unacceptable.

Is there a dev shortage, or is there an expectation that you can hire people for cheap and work them to death?


In economic terms, there are no shortages. In practical terms, though, it means that there are a bunch of projects out there that aren't viable without working people heavily, and employers are incentivized to extract as much as possible from people.


Projects that are not viable should simply not get done.

It always amazes me that people put up with this insanity. The first things I teach junior developers who join my team are: 1) learn to set boundaries; 2) your WLB and mental sanity are more important than any project (if you are struggling you're gonna get fired, i.e. don't kill yourself with work when you are … disposable).


There were a few comments here throwing shade at the findings, and also people who said that software developers are burning out way more than other occupations. I can't speak for the study in question, but the WHO did classify burnout as an occupational phenomenon a couple of years ago [1], and our own research (I work for Asana, and we survey 10K+ knowledge workers yearly [2]) suggests that indeed between 70% and 80% of knowledge workers in general experience burnout two or more times a year! Sad, really. I experienced it quite severely a long time ago, and it can truly rock your world and take a very long time to recover from.

[1] https://www.who.int/news/item/28-05-2019-burn-out-an-occupat...

[2] https://asana.com/resources/anatomy-of-work


This is such a great example of bullshit studies.

Yeah, I'm gonna go out on a limb and say this is a marketing campaign. It's an insult to science to call this a "study".

From the pdf:

> The fieldwork of this study was carried out at remarkable speed, from the 23rd to 24th of June 2021

Data was collected in 1 day. Is this a joke? No it's not!

> The population sampled included Software Engineers aged 18+ living in the United Kingdom. In total we surveyed 258 such people

Survation, the market research agency that did the actual work, interviewed only 258 people. I don't see any definition of or criteria for what that means: years of experience, field, cultural background, or any of the many other very important aspects that anyone would wonder about.

I really don't understand how they can do a half-assed survey of only 258 people, and then have the gall to claim "83% of developers".

What's really sad about this is that I have heard people say they've indeed experienced an increased workload, caused by the side effects of the covid response most governments took, from working from home for extended periods to mental health issues.

This kind of click-bait bullshit marketing campaign is pretty harmful to the very real issue of burnout, because it muddies the waters and dilutes the issue.


Your criticism of the sample size is unfounded. You absolutely can have meaningful results with only 258 people provided you have good sampling. If you want to criticize their sampling, then by all means do it, but that's not what you've done here.


A lot of the COVID19 stuff only has a population size (NOT EVEN sample size) of like, 15. Ex: J&J's blood clot issue.

People make conclusions off of tinier sample sizes. There are plenty of scientific studies with just 50 or 100 people, but you need to keep the small numbers in mind.

As long as the sampling was done correctly, a small sample size won't matter much!


People make conclusions based on tiny sample sizes when the results fit their biases.

Most studies are worthless due to small sample sizes and confounding variables, and most “conclusions” are based on cherry-picked outliers. The fact that it’s common to do this doesn’t make it responsible or correct.


Have you taken intro statistics? All a small sample size does is require your effect to be more pronounced to still be statistically significant.


Ah, then I guess that's why so few "statistically significant" studies end up being contradicted by further research...


You make it sound like those studies were bad because the sample size was too small. In reality, bad study design or an unrepresentative sample are more common issues.


I think they go hand-in-hand. People naively apply intro-level statistics to messy real-world problems and then think they need an orders-of-magnitude smaller sample than they would actually need to even hint at the possibility of an interesting effect.


Neither a bad experiment nor an unrepresentative sample will be helped by a larger sample size, though.


True, though no sample is ever truly representative along every dimension. Even when a lot of effort is put into making a sample as representative as possible, there will be variance in all the potential ways the sample can diverge from the population; this variance seems to be consistently underestimated.


I agree with you broadly, though would the J&J sample size not be about 21.4 million? 15 is just the number of adverse events right?


When we get into "total population" statistics, we kinda leave the realm of statistics lol.

There are 15 documented adverse events, but we don't really know how many adverse events there were in total. If we simply study those 15 adverse events, then your study population is only those 15 events (with a hell of a lot of selection bias).

----------

But I'm mostly thinking about how one person I know thinks: "Did you hear about these 15 cases and even one of them died! Clearly J&J is unsafe to take!!!!"

For that kind of conclusion, it's like they've focused their mind entirely on 15 cases, as if their "population of study" were 15. It's obviously bad logic, but maybe I wasn't very clear in my original post that I was referencing some bad logic there.

---------

But there are plenty of 50-person or 200-person studies that ARE properly done, and the science is solid - especially when we're looking at, say, Phase 1 / Phase 2 style trials of safety and/or efficacy (the earlier-stage trials before a Phase 3 trial is conducted).

It's not a big enough sample to be "certain" of the results, but it's a big enough sample to "provide evidence", possibly in conjunction with other studies.

----------

In many cases, having 5 or 6 studies with 50 random people each and performing a meta-study across them (especially if all of those studies used high-quality sampling) would lead to a more robust conclusion than just one 1000-person study done slightly incorrectly (ex: the 1000-person "study" is a voluntary telephone survey or some other low-quality study).


Sorry, but I disagree. The key point is that over a very large population - millions - only 15 people got it. It's just wrong to say it was computed over only 15 people.

This study referenced up above had the ~285 or whatever people who answered the survey, not that much info. ~285/350 is not that much data.

Anecdotally, and based on my own experience, I'd agree that burnout is high - probably over 50%.


Would you agree that we can conclude that the incidence of burnout is high amongst software developers? The "real" level may be 83% or 93% or 43% but it seems fair to conclude that it is indeed some larger than desirable number that merits a larger and more robust study.

That is the value of a study of this size. The ability to quickly determine if there is any "there" there. The headline is sensationalistic but such is the nature of publish or perish science today.


> When we get into "total population" statistics, we kinda leave the realm of statistics lol.

Not even close to accurate.


When there's an election or census over some particular figure, there's no "+/-" or "standard deviation".

If the election count is, say, 48% for A and 50% for B and 2% other, that's that. You don't have error bars, you don't have Student's t-tests, you don't have random variables, you don't have "confidence intervals". You assume you have the ground truth. You have a precise number assumed to be without error.

It's actually a fundamentally different kind of math than what is typically taught in statistics classes (standard error vs. standard deviation, R-values, random variables, etc.).

When it comes to "total population" style information gathering, the questions of reliability are about whether or not you did, in fact, grab the total population... and nothing really "mathematical" (such as "what is my 95% confidence interval?").

---------

When people say 600,000 died of COVID19 in the USA, there's no error bar, standard deviation, or random variable on that number. It is what it is: the ground truth according to a particular source.

And when you have a 2nd source of information with slightly different numbers (ex: State databases vs US level databases), you have a non-mathematical problem for how to resolve the apparent contradiction.

There's no "meta-study" over other surveys, there's no "resample again", there's no reproducibility. These statistical concepts and tests simply do not apply. If you do in fact decide to conduct a second census, you will find that the numbers will differ in practice (ex: recounting the same set of ballots a second time). But there's no underlying mathematical model for why this happens, as there is in typical sample statistics (the Student t-test assumes a standard error derived from the standard deviation of a hypothetical sample, etc.).


Elections are not statistical tests. The number of people dead is not a statistical test. You said "When we get into "total population" statistics, we kinda leave the realm of statistics lol." That is false. There are many questions that statistics is best suited to answer even if an entire population has been surveyed. Often the term population is poorly defined, and is much wider than the simple base case.

In the case of J&J blood clots, there are billions of additional people who could take the J&J vaccine. 15 of the 21 million people who took the J&J vaccine got blood clots. But if you want to know what percentage of "people who take the J&J vaccine" get blood clots, that's not the same thing. The population includes an effectively unbounded number of people.


Yes that is correct. Perhaps not as a rigorous experiment, but at a high level yes.


>There are plenty of scientific studies with just 50 or 100 people, but you need to keep the small numbers in mind.

The converse also needs to be kept in mind. Very large studies are more likely to display statistically significant findings. Your p-value is really just a test of your sample size, so as the study grows your p-value tends to shrink. Get a large enough sample and you'll almost always find statistical significance. Just referencing a p-value isn't enough; it needs to be read in the context of the study size (whether large or small).

Edit: For those downvoting, here is a much better and more eloquent explanation:

https://twitter.com/daniela_witten/status/131218095960997478...


This is a bad explanation of p-values. Right from the start:

> If you’re testing whether mean=0 and actually the truth is that mean=0.000000001, and if you have a large enough sample size, then YOU WILL GET A TINY P-VALUE.

Think about this for a second. Does it make sense that the results of my statistical test would change if I measured in teragrams vs. micrograms? No, it doesn't, because what matters is not the numerical difference between the null hypothesis and the sample mean; it's the difference relative to the standard deviation.

If the effect size is 0.000000001 and the standard deviation is 0.0000000000000001 then this is an enormous, important effect. If the standard deviation is 0.1, then this is a really small and unimportant effect. It can still be observed correctly with a sufficient sample size though because you've stated it's a real effect to begin with.

You will absolutely not get a tiny p-value just because your sample size is large. That is wrong. You will be able to detect very small effects, but if there is no difference, you are more likely to draw the correct conclusion with 1,000,000 samples than with 100 samples.

The point the author here is trying to convey, but is not getting across clearly, is that a large sample size can detect an effect that is practically useless and call it significant. p-values tell you whether, given your data, there seems to be a difference between the two datasets. They make no claims about whether the difference is material.

Example Python code to show that p-values are not likelier to be small just due to large sample sizes, given no actual difference:

   import random
   from scipy.stats import ttest_ind

   def false_positive_rate(num_tests, n):
       # Fraction of t-tests comparing two samples drawn from the SAME
       # distribution that come out "significant" at p <= 0.05. If a large
       # n alone shrank p-values, this rate would grow with n. It doesn't.
       hits = 0
       for _ in range(num_tests):
           a = [random.random() for _ in range(n)]
           b = [random.random() for _ in range(n)]
           if ttest_ind(a, b)[1] <= 0.05:
               hits += 1
       return hits / num_tests

   # Both stay near the nominal 5%, regardless of sample size (fewer
   # repetitions for the large-n case purely to keep the runtime sane).
   print(f"10,000 tests with n=100 per group: {false_positive_rate(10000, 100)}")
   print(f"1,000 tests with n=100,000 per group: {false_positive_rate(1000, 100000)}")
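
And the flip side, reusing the imports above (a sketch; the 0.01 shift is an arbitrary, genuinely tiny effect on a uniform(0,1) variable): with small samples the test almost never detects it, with huge samples it almost always does. The p-value machinery works either way; a large n just buys the power to see small effects.

   def detection_rate(num_tests, n, shift=0.01):
       # Same setup as above, except b really is shifted by a tiny amount.
       hits = 0
       for _ in range(num_tests):
           a = [random.random() for _ in range(n)]
           b = [random.random() + shift for _ in range(n)]
           if ttest_ind(a, b)[1] <= 0.05:
               hits += 1
       return hits / num_tests

   # Barely above the 5% false-positive floor at n=100; near 100% at n=100,000.
   print(f"n=100 per group: detected {detection_rate(1000, 100):.0%} of the time")
   print(f"n=100,000 per group: detected {detection_rate(200, 100000):.0%} of the time")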


This is a good explanation, thank you.

>You will be able to detect very small effects, but if there is no difference, you are more likely to draw the correct conclusion with 1,000,000 samples than with 100 samples.

I believe this is the tweet author's point, where they stated that in the real world a null hypothesis never exactly holds true. So you can detect even the tiniest of variance with a large enough sample.

>given no actual difference

I think the error here is that there is a priori knowledge of no difference, whereas the author is stating that in real-life scenarios there will always be some difference, even if it is too minute to be of practical significance. We can fabricate "no difference" in simulations, but in real experimental design there will almost always be variance, even if it's just an artifact of the measurement process rather than being caused by the independent variable(s). Whether or not the differences are statistically significant depends on the experiment design, including the sample size.

So while I understand your valid point, I think the author's claim was more about the practical application of statistics than about mathematical precision as it relates to simulated examples. But I could be misinterpreting.


If you're performing a rigorous experiment, then you have a control group, and your null hypothesis is that the experimental group will be the same. In actuality you will find that many null hypotheses hold true and the experimental treatment has no effect on the outcome, at all. Of course in some fields, like psych, most everything has some effect. But it's absolutely not correct to blame p-values for helping you distinguish between no effect and astonishingly small effects. They're functioning correctly. That is a secondary problem to solve, usually by stating a minimum effect size.

A far greater problem is False Discovery Rate related issues, where you test 20 different things at once and by chance identify one of them as significant even though the true effect size is 0. This is another area where increasing your sample size can help avoid problems, but even then you need to acknowledge your tools are imperfect.


>In actuality you will find that many null hypotheses hold true

I'm assuming you mean within the confines of the experiment, correct? I agree. The tweet author was alluding to the fact that "IRL" the null hypothesis is almost never true at the population level, meaning if you grab a large enough sample you will detect very, very small differences. (This was her Lucky Charms ~ blood type example in the tweet.) I also agree with that. I don't think the two claims are mutually exclusive, and the fact that they can coexist is (I believe) precisely her point about sample size.


I mean, there are many real-world examples where A has no effect on B, obviously. The number I'm thinking of has, truly, no effect on the time since you last blinked. No sample size will change that.

Lucky Charms does probably relate to blood type in some impossibly small way. It makes sense that a biological trait has some relation to dietary consumption. I don't think any sort of non-garbage tier journal with peer review would publish it, but good on p-values for helping us detect it. Not bad on sample size for making it possible to discern this effect size with a high degree of confidence.

It's worth noting that we also have many other tools to help us. For example you can test, given an expected effect size and a sample size, what the probability is of getting a statistically significant result, or a non-significant result, or a significant result that erroneously goes in the wrong direction. Or what the range of likely true mean effect size is given a significant sample difference.

We want large samples. They enhance confidence in findings. The author's premise seems to be that it's better not to know that small rocks exist if we're only looking for big rocks. But it fails to mention that the tools that find small rocks also help us identify big rocks with more clarity.
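
For what it's worth, that kind of power calculation is a one-liner with standard tools. A minimal sketch using statsmodels (assuming it's available; the d = 0.2 effect size is an illustrative "small" effect, not a figure from this thread):

   from statsmodels.stats.power import TTestIndPower

   analysis = TTestIndPower()

   # Probability of a significant result for a small effect (Cohen's d = 0.2)
   # with 100 subjects per group at alpha = 0.05: roughly 29%.
   print(analysis.power(effect_size=0.2, nobs1=100, alpha=0.05))

   # Per-group sample size needed to detect the same effect 80% of the time:
   # roughly 394.
   print(analysis.solve_power(effect_size=0.2, power=0.8, alpha=0.05))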


The size of the sample depends on how rare the event being studied is. For something that happens 80% of the time, 300 is a very reasonable number. Something that happens 1/1000 or 999/1000 times would need a much larger sample size.


Yes, I agree. But that's the same point: the p-value is still a measure of whether your sample size is large enough to capture a small-probability event in the context of assessing the null hypothesis.


Normal election polling uses a 1k sample size as a minimum, and 3k is preferred.


Yes. 130 people is sufficient for a good sample.


If we use a 95% confidence interval and we have a sample size of 258 then the margin of error is around 4.5%. If the sample size is 130, then the margin of error will be around 6.5%. That's a little high, but probably still good enough.


The sample size you need also depends on variance and the presence of confounding variables. It’s impossible to say whether n is “big enough” in the abstract; it depends completely on what you are measuring and how you measure it.


We are sampling a Bernoulli distribution for measuring burnout: a software developer is either burnt out or not burnt out.

The variance of a Bernoulli distribution is P(1-P), which is (.83)(.17), since 83% of the respondents answered that they were burnt out.
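
Putting this together with the margin-of-error figures quoted upthread: a quick check with the usual normal approximation, MOE = z ∗ sqrt(P(1-P)/n), with z = 1.96 for 95% confidence (a sketch, using the study's own p = 0.83):

   import math

   def margin_of_error(p, n, z=1.96):
       # Normal-approximation margin of error for a sample proportion.
       return z * math.sqrt(p * (1 - p) / n)

   print(f"n=258: +/- {margin_of_error(0.83, 258):.1%}")  # ~4.6%, the ~4.5% quoted above
   print(f"n=130: +/- {margin_of_error(0.83, 130):.1%}")  # ~6.5%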


But why are they "burned out"? Is it because they're a software developer or because they live in an advanced post-industrial civilization in the midst of a global pandemic? Could they feel burned out today but not tomorrow? Has the sun come out in the UK at any time in the last two weeks? Do all the participants even agree on the definition of "burnout"? Were there five other similar studies done that we don't know about that were discarded because they didn't produce the desired click-baity conclusion?

I could go on and on.


It's important that the 258 were uniformly drawn from the population of software engineers in the UK.

It's not important whether the sun came out in the last two weeks or not, unless we were trying to measure the weather's effect on burnout.

It's true that different people might have somewhat different interpretations of burnout, but they should be pretty similar across people. I'm going to assume all people who took the survey understand the idea behind burnout.

The point of the poll isn't to answer why, but to measure the percentage of burnout with some degree of accuracy. The margin of error in this poll was 4.5%. That's how all yes/no polling works. We don't know that exactly 83% of people are burnt out, but we are reasonably confident that 78% to 88% of people are.


Sorry, but what part of "they only surveyed software developers in the United Kingdom" gives you the impression that their sampling might actually be good?

Worth noting that the study itself doesn't seem to mention their sampling methodology.


Studies are never meant to be an ultimate answer to everything. And just because they're not the ultimate answer doesn't mean that they are therefore worthless, bad, or "bullshit".


It actually does mean that in most cases. Having false confidence in a hypothesis based on non-rigorous studies is much worse (and in some cases, a lot more dangerous) than withholding judgment until the data (all the data, not some cherry-picked slice of it) really is conclusive.


No. You perform studies to validate observations or theories and, if you are able to validate them, offer potential factors as contributing causes. A study where the data does not support the conclusion is worthless, and can in fact be harmful.


I wouldn't be so quick to discard it. From personal experience, the past 14+ months of working from home have been very stressful.

I didn't think I was burnt out, until I started interviewing (due to work "stress", which turned out to be burnout in disguise). One recruiter informed me they have seen a spike: 70% of interviewees canceled scheduled interviews, post-screening. I almost canceled mine due to being overwhelmed/burnt out from preparations + work, and my thesis is that there are a lot of burned out developers out there. Unfortunately, I didn't hear what the cancellation rate baseline was before work-from-home.

If there are tech recruiters/HR folk/hiring managers reading this, I'd love to know the numbers you've been seeing for interview cancelations.


There should be a study on studies being posted to HN. We must be at like 99% of them that get critiqued for being a bad study in one way or another. I would love to see some examples where everyone accepts the study.


To be extremely fair to HN: almost all such studies about human behavior, especially those relying on polling, are garbage.

Often, the statistical methods are inappropriate at best, or downright deceiving at worst; there is nominal, if any, understanding about what "controlling for a variable" means, among many other such things. Not to mention that often the methods themselves are highly questionable in their reproducibility, statistics aside.

(Another classic is to point out flaws in generalizations for mice studies and such, but that is, imo, independent of the study's methods. It's a fair critique, but I don't think it's what you're interested in measuring w.r.t. "bad" studies.)


"Data was collected in 1 day. Is this a joke? No it's not!"

What makes you think data collected over 1 day is worse than the same data collected over multiple days?

"I really don't understand how they can do a half-assed data survey on 258 people only, and then have the gall to claim "83% of DEVELOPERS""

It's called extrapolation.

No study ever interviews everyone. They all have to extrapolate from a subset of the entire population.

Obviously the more participants there are the more confidence you can have in the conclusion, but whether 200, 2000, or 20000 participants would make for a "good" study is completely arbitrary.


> They all have to extrapolate from a subset of the entire population.

This only really works if you have a representative subset of the population you're extrapolating to. Is the distribution of experience roughly equivalent to the general population of "developers"? Age distribution? Company size? # of children? Home life? These kinds of demographics can have a big impact, and I see no discussion of how they were accounted for.

Heck, even the general work ethic culture difference between the UK vs the rest of the world could be a huge confounding factor that I know wasn't really accounted for.


Getting a perfectly representative subset of the population means surveying the whole population of the world. Also, accounting for every imaginable variable is always impossible, because we cannot ethically grow people in an isolated environment.

Sure, a meta-analysis of several studies reveals more than one single study, but that doesn't make a single study bad or worthless - or as said in the parent - "a bullshit study".


There are plenty of ways to account for not having a representative sample distribution, I'm just saying that I didn't see any kind of discussion about demographics or confounding variables in the actual study.

This was a pilot study, a way to open an avenue of discussion, and to be sure I think the study warrants more investigation. They even say so: "In future we would like to conduct larger studies over multiple countries."


"This only really works if you have a representative subset of he population"

Of course. Never said otherwise.


In the context of your comment it is implied that you believe the "study" in question does have a representative subset.

While you may not intend that, most people reading your comment would believe you think the study is fine.


"In the context of your comment it is implied that you believe the "study" in question does have a representative subset."

No. I was replying to the specific objections the parent poster had.

I never said or implied anything about whether I thought it was a good study or not.


Clearly, several people (including me) think that you did imply it - so, regardless of whether or not you intended to do so, that's what you communicated in your post.

In particular, the way in which you mentioned extrapolation in the context of the study (and the conversation in general) carried the implication that the extrapolation was valid.


"Clearly, several people (including me) think that you did imply it"

Wow. Several people on the internet think something.

Is that a large enough sample size?


Nit: the 258 number makes this an inference about the broader population, not an extrapolation, if it's a random sample.

It is, however, an extrapolation to make claims about the entire dev world when looking just at UK devs.


You always want at least a week of data collection, to control for weekly effects. People are often happier on a Friday than a Monday, for instance.


I'm not sure how much of a difference the day of the week would make for people who are really burnt out.


The question is precisely whether they are "really" burnt out or just had a long day (or couple of days) at work and are feeling burnt out at that particular moment. We all experience something akin to burnout at some point or another, but it makes a big difference whether it is "I felt burnt out for a day or two after a tough sprint or major production incident" or "I've been working 80h per week for months". From the former you might be fine after a relaxing weekend, whereas the latter may take months to recover from and can seriously jeopardize your long-term physical and mental health.


True burnout, probably not, but it's always better to avoid problems like this by collecting data for at least a week.


It wouldn't; it would make a difference for the people who aren't burnt out.


Another study finds out that 83% of population are old ladies with a cat that happily respond to surveys by post.


Having a poll in the field for two days is pretty common, and n=258 is definitely enough for a representative sample.


- This was not a poll.

- Being common doesn't mean it's good or acceptable. In this case, it doesn't account for the effects of day of the week, or what sort of issues the person was having on that day specifically.

- n=258 is enough if it was sampled properly, and they provide no details to show that it was, so we have no reason to believe it was. They provided nothing more than that the engineers they surveyed were in the UK. Nothing on experience, age distribution, family size, years of work, company size, or job changes.

- As it stands, this provides 0 valuable information, and I can only assume it's intended for buzz.


> "It doesn't account for... what sort of issues the person was having on that day specifically"

I'm not clear on what you are suggesting here. Are you saying they should contact people multiple times to make sure they weren't having a bad day the first time around?

2-3 days is common because any longer than that and the survey results will no longer be reflective of a specific point in time.

Also, if you check their crosstabs they have breakdowns for gender and age.


Thanks for taking the time to read the details. I am always skeptical when I see "Study finds...".


The thing is, everything in this world is now gauged by engagement, not quality. The fact that this is getting more reactions on HN gives it even more exposure, and actively reinforces this behavior even more.

If you have a really good study, one so thorough that people have nothing to comment on, it will not rise to the top of pretty much any platform, because comments are valued higher than upvotes on most platforms. So you want to engineer some clickbait in there so that your article gets madly commented upon; being incomplete is perhaps the easiest way to do this. (I don't know about HN, but it certainly appears Facebook and Twitter work that way.)


> everything in this world is now gauged by engagement

Nope, not true; only on algorithm-driven websites. Just because this post or comment made it to the top of HN doesn't mean people think it's valuable now; it just means, like you said, it has buzz around it. And it should, because it's bullshit and should be exposed for the bullshit it is. Is that enough to get people to be more skeptical about what they read? I don't know, probably not though.


This is such a great example of bullshit studies.

Lots of social science seems to be confirmation of stuff that people have been suspecting. Not to take away from such things: It's important to verify with data.

However, the really interesting question: Where does the burnout come from? Is it possible to specifically characterize the source? By starting to break it down like that, we start toward finding a solution.


If they asked me to do this survey, their mail would end up in my spam box. I’m not burned out.


Sadly lots of today’s “research” meets this low threshold. Lots of politicized headlines use low quality research to push narratives.

#depressing


I recently came across a US Government funded study [1] that used anecdata from 12 people (just 12 tired souls!) to put a seal of approval on the claim that e-ink screens are better for the human eyes than emissive displays.

Edit: In reality, however, the opposite is true. Only in some scenarios (like reading in direct sunlight) does one type of screen outdo the other. In the most common case, indoor reading, the emissive display is better for the eyes.

[1] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5929099/


Yeah I can do a survey too :)


I’m guilty of posting before reading the article. I’m glad you pointed this out. Sounds like an awful study.

There’s true burnout and then there’s feelings of stress. They aren’t the same thing at all. I know because I’ve experienced both. Burnout cuts you down to your soul. I haven’t programmed professionally in 2 years and the thought of doing so makes me nauseous. That’s burnout.


Let me be 100% honest here - if we cannot clearly pin down the psychological and medical definition of burnout (with accurate diagnosis), it is just fine to discuss it in a non-scientific context. But the moment you call this a study, the fact that no one here is willing to ask hard questions makes me extremely disappointed in HN and the intellectual curiosity that usually drives discussions here. Why do I feel like there is some political angle to this? (Labor rights.)


It would take a lot of mental gymnastics to twist this into a political statement. If we "cannot define clearly the psychological and medical definition of a burn-out", why are we even studying it in the first place? That would be like experimenting without a control group.


Looking back on this comment, I want to apologize for trying to say what is "real" burnout and what isn't.

Thinking about it more, they can all be symptoms of the same thing, to varying degrees. Apologies to anyone I may have upset by minimizing their experience.


Your arguments are specious; perhaps you're simply angry that you didn't understand an industry you've worked in for a while? Perhaps you're in management, or in the 20% who aren't burnt out, and you're embarrassed by proxy?

Psychological experiments and surveys are often done in only one or two days. Why would a wider window of time help when taking a snapshot of a current situation?

Hm, is 258 a small sample size? Let's put in some effort instead of just screaming "bullshit" a bunch. There are approximately 408,000 people in the entire population [0], and plugging everything into a calculator, we get two results: First, they should have tried to poll around 400 people (and perhaps they did!), but more importantly, the 95% confidence interval is only around ±6%.

Let me repeat that more clearly: By typical polling standards (95% confidence interval, random sample from filtered population), we expect that 77-89% of UK software developers are actually burnt out. That is still three out of every four people in the industry!
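
Here's a minimal sketch of both calculations in Python, assuming the worst-case p = 0.5 that pollsters typically quote and treating the [0] figure below as the population size:

    import math

    N = 408_000        # approximate UK developer population [0]
    z, e = 1.96, 0.05  # 95% confidence, +/-5% target margin

    # Sample size needed for a +/-5% margin, with the
    # finite-population correction (negligible at this N).
    n0 = z**2 * 0.25 / e**2
    n = n0 / (1 + (n0 - 1) / N)
    print(f"needed: ~{n:.0f}")         # ~384, i.e. "around 400"

    # Worst-case margin actually achieved with n = 258.
    moe = z * math.sqrt(0.25 / 258)
    print(f"achieved: +/- {moe:.1%}")  # ~6.1%, so 83% -> roughly 77-89%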

Not sure what to say. Take a stats class?

[0] https://www.statista.com/statistics/318818/numbers-of-progra...


All of those calculations assume a random sample which this was not:

"To reduce self-selection bias, invitations to complete surveys were sent out to members of Survation’s panel and the surveys were then conducted by online interview. This approach is far more rigorous than say, running a poll on a website in which anyone can choose to take part."

Are "members of Survation's panel" a random subset of developers? How many invitations were sent out to get 238 responses (i.e. what was the response rate)? Are developers with burnout more likely to respond to a survey about burnout? Was the sample population demographically similar to the overall population of developers?


It's a lot, and I'm not surprised. Recent research in my country, the Netherlands (from a reputable institute), says 1 out of 6 people feel burnout, and ranks the IT industry in the top 3 sectors. This survey articulates it in quite a broad way, however, just asking if respondents have 'feelings of burnout', which most of us have once in a while.

That doesn't mean they are burned out clinically speaking, or even if they have symptoms of burnout. It does point to a large, systematic problem in the industry (and society in general).


Your comment starts with a disingenuous sentence, and continues to show a level of bitterness I've seen before in people who believe themselves to be more knowledgeable/experienced than their surroundings (or they actually are), but never got a chance (or so they believe).

I didn't claim to know everything there is to know about statistics and research in general, but I think I know enough to tell that this is not a study, but a marketing campaign for whatever this usehaystack thing is.

> First, they should have tried to poll around 400 people (and perhaps they did!),

If you had read the article, you would have known that that's not what they did.

> By typical polling standards (95% confidence interval, random sample from filtered population), we expect that 77-89% of UK software developers are actually burnt out. That is still three out of every four people in the industry!

Do I understand correctly that you've worked on this? That certainly explains the hostility, but nevertheless.

> we expect that 77-89% of UK software developers are actually burnt out.

That's not what the link/study says.

> That is still three out of every four people in the industry!

This is absolutely wrong, without a shred of a doubt.

I recommend that you read: How to Lie with Statistics by Darrell Huff.


I didn't produce this survey. I just know how to compute confidence intervals. I'm not going to regurgitate the Wikipedia article on confidence intervals at you; it's clear that you don't really want to learn how statistics work, and you'd rather feel smug because you surely know better than to consider a survey of your fellow people.


Just read the other replies. I'm not going to regurgitate other comments to you. You're clearly way more annoyed by this than average. I don't feel like this is worth engaging with anymore.


If you were a professional painter and every couple of months, someone snuck into your art studio, threw out all your brushes, paint, canvas, and then dumped a huge load of requests for you to "fix" your old paintings, I'm pretty sure you'd get burned out pretty quick.


The difference is that if you were a professional painter you'd also have a second job in order to survive.


As an art-degree holder turned engineer - yup.

While there's a romantic vision of the starving artist, being broke is a whole different type of burnout.


There are plenty of commercial artists (illustrators, designers, etc) that make a good living making art.

Even if you're creating fine art exclusively it's possible to make a good living at it if your art has mass appeal.

The problem is when you make art that's not really geared towards the masses. If you make something widely considered "ugly" or disturbing, that most people wouldn't want to hang in their living room or on the walls of their business then you've got a problem.

The other problem is if you've either got a style that looks too much like everyone else or if you don't have a consistent style. Then people can't distinguish your work as uniquely yours and that makes having a "brand" difficult to impossible. You can still get by this way (especially if your work still has mass appeal -- see the endless impressionist-style paintings out there), but it's still a lot tougher than if you have a distinctive style.

The most successful fine artists have both a distinctive style and mass appeal... or a good marketing team. Often marketing is actually more important than the art itself as far as sales go.

Problem is many artists aren't good at marketing and just want to make art and aren't yet famous/rich enough to afford to hire someone to do the marketing for them.


> The problem is when you make art that's not really geared towards the masses

Well, apparently there’s also a lot of money to be made doing very specific commissions.

Not quite sure where I heard about that, but the concept of ‘whales’ exists there as well.


More painters should take the Thomas Kinkade-pill and create art for the masses!


I also think some of it comes from the expectations of how much time someone is productive in a day. It’s as if the expectation is that you spend all eight hours of the day making brush strokes on the canvas. It’s not practical and it’s exhausting, especially when you’re not feeling particularly motivated or just need time to think about the work ahead.


It's important for managers to plan time for design, testing, feedback, meetings, administrative tasks, and context switching. In places I've worked that planned this well, we usually allocated somewhere around 4 hours of development work per day to someone.

I think most burnout is systematic, and caused by incomplete task breakdowns.


I agree. I previously worked in an XP shop that spent seven hours a day doing pair programming. It was exhausting, and challenging to do non-dev work in the hour a day left. Especially as a team lead, it took a lot of extra time or carving out time away from the pair to get enough done.

I also don’t know if full eight-hour days are good, in general. I feel like 5x6 or 4x8 would be a better schedule. Though with a four-day week, I’d be curious to try “weekend Wednesday” out instead of just having a three-day weekend.


I feel like measuring a workday in hours is more useful for measuring the workload of a non-stop assembly line type work than it is for a knowledge-based job. 8 hours of work where you can grab a coffee and read HN whenever you want could be easier on the mind than 6 hours of work where your attention is in constant demand.

I do wish I could do the same number of hours over 4 days instead of 5, though.


I agree. Ideally, we get to an end-state where it doesn't matter what hours (or how many) you work as long as your team's outcomes are being met (and the outcomes are reasonable). There are a lot of caveats and complexities in that statement, but it's a state we can keep pushing towards as an industry.


I really like this idea but I'm not sure how to implement it. If Bob works twice as fast as Jim, does Bob get to go home at noon or does Jim have to work late?


There are various types of employee pay arrangements other than hourly and salary that do this, like piece work (though some salaried jobs do work the way you describe).

https://en.wikipedia.org/wiki/Piece_work

And it's already really common for non-employee contract work to function that way. The guy who mows my lawn gets paid the same amount no matter what speed he mows it.

That being said, there are plenty of salaried jobs where people are assigned measurable responsibilities to attend to and don't have assigned work hours.


Also worked in an XP shop. Learned a lot, but worst dev job of my life. Full time pair programming is one of the worst ideas in software development in the last decade. It's unbelievably exhausting.


I agree that pair programming is extremely draining when done non-stop. I can't do it 8 hours a day and I don't expect my team to do that either, but I would like them to average 4 hours a day. In my experience, if the rule is pair programming, then the exception is working solo on simple or repetitive tasks that you've done before, which works well.


> I think most burnout is systematic, and caused by incomplete task breakdowns.

As a counterpoint, I think burnout can be contributed to by overly granular task breakdowns that remove individual developer creativity and autonomy.


I definitely agree with that too -- task breakdowns should describe a high-level process -- they shouldn't prescribe or presume anything about the solution.


It's more like you get scrawlings from toddlers on your desk every day, and the boss says "these were supposed to be the blueprints for the next skyscraper; I paid a lot for this, so you can't throw it away" - and then you need to work with crayon scrawlings and somehow get the job done.


Well, at least a toddler would ask “why” every once in a while. I went long stretches where the only “why” asked with regards to the requirements passing my desk was the silent scream inside my head.


I think this is why a lot of developers take on things like woodworking as a hobby or even a new career. The tools just WORK. They are reliable. They don't randomly change. There isn't a constant churn. You don't spend all day fighting with your toolset. They fade into the background. Your time is spent on what you actually want to do.


I build wooden, hand-riveted boats in my spare time and it’s an absolute joy.

One day on the computer when I was fighting with a stupid custom-made build system, I muttered to myself, “anyone who says a good carpenter doesn’t blame their tools never had to use an exploding hammer.”


When I worked for a large corporation I actually did have tools deleted from my own laptop. I left very shortly after that. I've never had this problem in any other company, though. Is this common?


Is maintenance the cause of burnout then?


For me burnout comes from the endless treadmill of external pressures (arbitrary deadlines, changing requirements) and the endless treadmill of development work. There's not really ever a chance to step back and take a breath - just constantly moving from one high priority task to the next.

It's also a bit thankless. Being good at this job, in a lot of cases, means you are moving really fast from one task to the next without any downtime in between. I find it hard to get to a place where I can feel accomplished or proud of a piece of work, because the next fire/high-priority task is already "behind schedule".

I think part of it may also be due to the fact that the job is constant problem solving. Even though it's nice to find solutions to problems, knowing you'll be perpetually in a state of "there is a problem that must be fixed" is taxing on mental health (for me at least).


The best/fastest performers usually get rewarded with... more work.

Something I've seen more and more often is that a lot of developers take a lot of time for every ticket. Much more time than I'd take, but I suppose it's because they are not incentivized to work fast; they'll still get paid the same no matter how many tickets they close. But I don't think paying more for closing more tickets is the solution. I don't have any solution, really.


Yep, sounds like SW development, partly why we get paid some of the highest salaries in the country. I try to keep in mind I have the power to step off the treadmill at regular intervals and go lie on a beach chair or hammock in the mountains.


The problem solving part is perfectly fine for me. Not getting rewarded financially for my hard work is a huge issue. Unfortunately I don't see that changing anytime soon (if ever). The people running companies often don't understand developer value at all in my opinion.


As for the last sentence: I think this varies widely with temperament. I like a mix of problem solving and people-facing work, but I would find it hard to believe a mathematician would be upset by constant problem solving.


Mathematical problems are puzzles. They were talking about defects and new requirements.


That's a great summary of why being a dev is hard and taxing and stressful. My job is all those things. And because I switched jobs 6 months ago, I haven't built up that confidence of previous accomplishment.


This 110% ^^^


In my experience (and I think also in the spirit of the analogy), it's not just maintenance. It's expecting to maintain everything you've ever done while also simultaneously being expected to keep producing new work at the same rate (and then being expected to maintain that).

It's that your workload seems to only increase. As the study found with regard to the causes of burnout, increased workload was cited as the main reason.

I'm still providing updates to stuff from over five years ago (that has officially been retired), while my boss is breathing down my neck about new projects. The stress of doing this, the stress of trying to manage my manager's expectation, the stress of realizing you'll never be able to really put any of your work out of your mind or have a lighter load of responsibilities, etc. is what leads to burnout.


One of the undersold advantages about getting a new job is that all the implicit commitments you built up go away. Try it.


Yes, but then in the new job you need to maintain other people's stuff, which is even harder than maintaining your own. And over time you get to own that, including all the previous tech debt that you didn't create.


And in a new job, you don't have that "bank" of previous accomplishments to fall back on when something runs late. And things always run late. In my new job they said, here's this major piece of work. We already committed to it. Two other teams and outside customers are expecting it. Can you do it in 2 months? I don't know, maybe, I guess? There's a great starting point, they said. Then it turned out to take 5 months. Everyone was mad at me, it was incredibly stressful, and I worked 7 days a week. This is kind of a summary of the modern dev job.


At some point in your career, you gain enough confidence to say no, I can't commit to that. If they fire you for not making unreasonable commitments, that's net benefit to you.

Dev shops don't have to operate like that, and all it takes is one person to have the fortitude to say no.


Yes, and as the list of items to maintain grows, you must perform more and more context switches as bugs or other fixes crop up, spreading you more and more thin. Meanwhile the number of new requests does not slow down.


This is too often ignored. Maintenance, even of applications which do not require new features, can often be non-trivial. In my past job, a lot of challenges were caused by the fact that nobody really budgeted for maintaining the existing stuff but it actually required 2 full time people to deal with.


Heh, it's like having a kid that is your total dependent for 18 years. And then having another, and another, and another. And unlike real children, they don't help take care of each other.


Somebody said that the cause of burnout is working hard and then not seeing results.

Some common ways that happens in software development are working for months on a feature that gets cut (or undercut) at the last minute, or trying to demonstrate your ability for some role and not being recognized for it, or even just putting your heart and soul into a company that is circling the drain.

If those experiences happen over and over, without some intervening successes to recharge you, you get burned out.


Just going through this with some hosted cloud software because some environments were "deprecated". The best possible result from finishing migration will be that it worked as well as before. This is so frustrating since I don't even know why Microsoft or Amazon deprecated these in the first place. It is work purely generated on their whims.

Apparently some AWS tools also got changed, since some scripts seem to be failing.

It is so stupid to host anything in the cloud if you don't serve millions of users... Not really burnout, but I hate it so much right now...


"The best possible result from finishing migration will be that it worked as well as before." Talk about working hard and not seeing results. Having worked in several large enterprises, keeping up with new standards causes this situation constantly. Authentication, security, configuration, static analysis, deployment systems. It feels like eventually you get to a point where you are always working but never add a new feature. This is the most demoralizing thing possible in software IMHO.


That actually aligns nicely with parents comment about maintenance. I work on a maintenance team and I'm constantly toeing the burn-out line. At best the system works as well as it did before, but that's really it. Sure we tweak it here and there, but otherwise it's hard work with no real visible changes.


I personally really like maintenance programming, but I wonder if I've been lucky. All the systems I've ever inherited, I've been able to make impactful changes to, such that they stopped being that one system that causes everyone production headaches. Even the systems that I couldn't make many changes to directly, I was able to isolate and route other systems around, such that new data stopped depending on them. It took a lot of effort, and politicking, but I seem to find the exploration and the juggling to be far more satisfying than greenfield building.


You may have been incredibly lucky, or have a secret sauce I don't understand. I've worked a pair of maintenance positions. The previous one was incredibly creaky and had no other systems, so changes were scary and there was no recourse to move to something else. In the one I'm working now, I'm maintaining client code, so I don't have any agency there either.

Maybe it's agency? Ultimately I don't feel like my contribution wows anyone.


I've done maintenance and didn't even have to wow anyone to get satisfaction.

There's something deeply satisfying to me in cutting out thousands of lines of code and no-one even noticing.

I will even go back and look at that commit over the next few weeks with a feeling of peace and accomplishment.

I've sometimes 'stolen' time from feature work to do that refactoring, and it's the massive reduction of complexity and TLOC that often gives me satisfaction, not delivering the feature.

Don't get me wrong, I've delivered features and applications I'm proud and happy with too, but I think there's a significant minority of us who really enjoy deleting tons of bad code too.


Yeah, maybe agency. I don't know. It helps that I'm fairly social for an introverted engineer. One of the biggest system changes I ever made was related to a legacy URL that the main page had to use. It seems like the type of thing that should be simple to make, but due to the sheer number of redirects and multiple systems involved, it was really painful to change the primary domain. Only like 2 people in the company had any idea of the amount of effort necessary to avoid breaking countless things. But when I finally flipped the switch I spent the entire afternoon running around the company and showing random people I knew in other departments the home page. I got a lot of "huh?"s. Literally no one was impressed. But, I can have an infectious enthusiasm, so as I pointed to the URL and said "look! that was actually really hard to do", people would at least come around and say "ok? good job?". They still weren't particularly impressed, but they at least understood it wasn't trivial. Couple this with a good PM who I knew was also evangelizing some of this stuff, and you can get a lot of agency over time in an org.

So maybe the trick is evangelizing? I really don't know, I just know I've been successful, often in spite of the organization I was in. I'm also constantly aware of the amount of luck involved in everything I do, so it's always hard for me to believe that I was the important factor.


It's probably company attitude towards agency as well - my time is tracked and recorded against client budgets and tasks. I can't spend an afternoon tinkering if it's not based upon a client's requirement.


I actually question if burnout is related to the idea that as software developers we often come in being passion driven, but "work" as defined by other people tends to whittle away at passion.

E.g. there's a common saying, something like "Don't let your passion become work", implying that if you love creating art, it can be a struggle to keep that same love when you start creating for other people. Of course it's all variable, but I hope my explanation suffices.

Programming can often be similar, in my mind. Sometimes I can be at the height of feeling super uninterested in my work - tired, exhausted, all things I'd collectively describe as progressing towards burnout... but then I daydream and get the urge to go code some random idea. The fire is still there, in me, to be creative and produce something in the way I enjoy. It didn't seem inhibited by my work at all. It just seems work follows a path of unfun that accumulates.

It feels similar to side projects. Devs often joke about how easy it is to start something but how difficult it is to finish, because larger projects will often become work themselves. But with side projects you have the luxury of stopping. I suspect if you forced your way through, never stopping on them, you'd burn out in a similar way to real work. If this is at all accurate, the consistent push of uninteresting work is the driving force for burnout in these examples. You're used to doing the work with passion, but an extended period of non-passion feels a lot like burnout, to me at least.


Can you name another profession that throws out its tools so often?


Then... Don't keep throwing out your tools?


Then... Don't plan on lasting more than 10 years in the software industry?


This is exclusively a problem with web development. I work in fintech, writing C and using the gnu toolchain. We are a very successful business.


Mobile, Full Stack and Data Engineering also have similar issues. Not as bad as anything related to javascript though.


Even still, a search for "jQuery" on a jobs board typically still returns plenty of results.


Though I believe you're correct, this is not a problem inherent to web development. There's absolutely nothing stopping you from pinning things to specific versions other than the fact that projects couple security updates with breaking feature additions.


That’s not an answer to their question


Their reply was a non sequitur because they didn't understand the analogy.

But anyway most "fixes" are really just changing systems to adapt to change in other systems. I spent 3+ years at Google just migrating this and that and the other subsystem to a different datastore, and knew dozens if not hundreds of developers stuck on a treadmill of constantly rewriting to adapt to other people's rewrites.


Hah! More like the inability to perform maintenance, due to management continually demanding new features etc on tight deadlines... and once the burnout sets in, efficiency drops and you can't even think about sneaking some maintenance in... until something catches fire! Then you have a huge backlog and people screaming from multiple directions and nobody wants their deadlines to slip...


This is a really great question, I'd love to know how to minimize my own burnout. It's likely not just maintenance as I've felt burned out on brand new projects that have no maintenance.


FAANGs maintain their code? Impossible.


The newbies who haven’t figured out that new projects = promotions will do it.


It's worth mentioning that this is a myth. Invisible maintenance work on old products can be very impactful if done correctly.


I don't really have a horse in this race because I've made my millions and long since quit the FAANG shit, but this is absolutely not a myth. Perhaps you've never worked at a FAANG, or perhaps we've confused being impactful for being promotable.


Do you really think it's appropriate to compare maintenance of one's engineering work with intrusive changes to one's artistic output?

This doesn't invalidate the conclusion, strictly speaking, but the basis for it is flawed in my opinion.


Stop using esoteric tools and trendy frameworks and use more robust solutions. That might prevent the burnout.


Did you retire from engineering? Esoteric tools can certainly be a problem but that's a tiny portion of the pool of people who experience this. Technology in general changes all the time


Part of the issue is the constant churn, not just change. It's change for change's sake, and more broadly, complexity for its own sake.

Take microservices, for example. Classic cargo-cult copying of what Google and Facebook are doing, but usually without a dedicated staff of hundreds. No wonder devs are burning out if they have to learn how to run a k8s cluster where a simple binary would've sufficed.


A long time ago I was reading a text on burnout, and one factor that stood out was lack of closure.

Think of an architect - they design a building, deliver blueprints and the building gets built. They're done. They can drive by the building and say "Look, I've designed that building". There's closure.

A developer's job is rarely done. You do a fantastic job and fix 2 almost intractable bugs in a sprint. What happens the next sprint? You get more bugs to fix. There's rarely closure.


Interesting idea. Here's my anecdote [or in other words, we should be dubious that there is any meaning here at all beyond coincidence].

Most of the work that I've done was for medical technology where the customer tends to deploy more often in Europe. I have seen very few finished products where I had a hand in implementing something.

However, one of my happiest experiences was years ago, when we finished a rewrite of some hardware for a customer. It had to be done because they didn't want to interrupt their sales pipeline (and their customers wanted to be able to continue buying without their purchasing pipeline being interrupted).

Anyway, things had to move fast, and soon after we were done (months later, IIRC) there was a website up featuring the new product we had worked on.

It felt really good (even years later now, thinking back) to be able to actually see, and to show people I knew, the product that we had worked on for over a year.


It's the same in a lot of industries and jobs.

You're a doctor and cure a patient; what do you get tomorrow? More patients. You're an automotive worker and assemble 100 cars today; what do you get tomorrow? More cars to build. You're a mechanical engineer and fix a customer's issue with the machinery; what do you get next week? More customer issues.

Jobs where you can regularly fully close a project and never hear about it again are the exception.


In high school I worked some janitor jobs; every week you're cleaning up a very similar mess to last week's, but the work can be fulfilling if you're striving to do an excellent job and achieve mastery.

That's one area that is soul-crushing in software: the complete disdain for excellence and mastery. Nobody on the business side seems to care about creating masterpieces. They want to throw half-cooked spaghetti against the wall and see what sticks before throwing more spaghetti against more walls, and they strive to be the next Facebook and have tens of thousands of employees throwing spaghetti.

Android has 10-year-old bugs with thousands of watchers, and instead Google is off trying to use Android in cars or televisions or probably tennis shoes. Maybe it's finally fixed, but for years AWS cloud-formation and CDK were unable to be used with a significant and critical set of AWS services and features. Instead, AWS is off creating toy RC cars and creating elastic-cloud-simple-machine-model-shift service, which are both missing cloud-formation bindings. If you're the guy who actually cares about your customers, and is trying to fix these things, your career has been repeatedly hit-and-run by everyone else making career moves and running off to make purpose-built block-chain gamified cloud-native social machine-learned pez dispensers.

And even if you can't find a job that will let you create excellence, using and learning excellent tools can be rewarding. Instead it seems like a downward spiral. There was a brief moment of sanity when the industry dropped all the tangled complexity that was XML, but in general it seems like the industry is going from well-crafted products like PostgreSQL to half-baked databases like Cassandra CQL, which apparently doesn't allow NOT NULL constraints. Once Java, C++, and Javascript all start to have fairly nice feature sets, it suddenly becomes hard not to find oneself stuck in jobs requiring the obtuse language Go. And then there's Helm, which seems to be trying to win a competition to see if it's possible to make a configuration that's more brittle and convoluted than the old M4-based Sendmail configs of yore.

Why should I even bother becoming anything more than slightly proficient in any of these if they're going to be replaced with something even uglier and less ergonomic next year? What's the point of pouring my sweat and tears into a product for a company if the company is just going to strand it and rush off to build something else, which it's going to abandon in turn to rush off to yet something else?


Anecdotally I'd say professions that often double up as a hobby and don't have a significant material or energy cost to exercise will be more susceptible to burn out. If you're _really_ into software engineering and struggle to pace yourself then burn out will often stop you sooner than finger fatigue or your computing costs.

I'm not saying workplace pressure etc isn't a factor, but those are also similar factors in pretty much all careers.


It’s an interesting idea, but I don’t know. Lawyers and doctors also say the workload burns them out. Some doctors have to do 24-hour shifts, which are more than 3x as physically demanding as working one 8-hour day.

I understand what you mean by hobby though. It’s toxic to discount people feeling shitty. But the biggest complainers I know - maybe the shittiest people - get into programming or the gilded professions to make money.


They shouldn’t be forced into 24-hour shifts; I’m sure they’d have an easier time with regular work hours.


I “retired” after 12 years of web development due to extreme burnout. I now work in a completely different industry and could not be happier.

It felt like my workload was forever increasing. I could never rest on any laurels without falling significantly “behind”. I developed a major substance abuse problem as well (sober 14 months now), although untangling cause and effect here is impossible. I think I was abusing substances in part to cope with the stress.


> It felt like my workload was forever increasing. I could never rest on any laurels without falling significantly “behind”.

Sounds very familiar. Also, when I finally put my foot down to demand changes in order to avoid burnout, that was used against me when I was later discussing a pay raise. I learned never to bring up problems again, and crashed hard.


That line about it being used against you rings true. Business is often a toxic stew of keeping secrets from others lest they exploit you. This warps our perception and experience of human intimacy. Yet one more reason I’m glad to have left the industry.


Your very first submission! "Ask HN: how do you avoid burnout and getting taken advantage of?"[0]

[0] https://news.ycombinator.com/item?id=939580


Haha, thanks. That brings back memories.

If I recall, that thread helped me avoid burnout for a while. But it eventually caught up to me again with a ferocity I was unable to defend against, with the catalyst being a parent who died. Fortunately, I was in a position of financial strength and faced the beast head-on. On further reflection... burnout turned out to be a gift in disguise. It was an impetus for necessary change.


I have had similar experiences. Thank you for sharing and good luck.


This is why I have a tough time putting much weight into the soft-sciences. I don't see any operational definition of "Burnout" so this is like asking people if they feel "fast".

How about something like "How many times a week do you feel like not coming into work?"

But these subjective terms are useless.


https://en.wikipedia.org/wiki/Occupational_burnout

Burnout is not just a subjective term. It is a highly researched syndrome and can be tested for.


Yes, but I doubt those questioned knew and used that definition. Reading https://haystack-books.s3.amazonaws.com/Study+to+understand+... it certainly wasn’t tested for.

It’s all self-reported, and the question didn’t even ask whether the subjects thought they were burnt out, just whether they felt like it (for me, that’s a weaker thing).


If that's the definition they're using, put it in the paper. Otherwise, it's just like asking someone if they feel "hot" at work.

And if it can be tested for, why not do a study that tests for it? Otherwise, it's like asking people if they feel like they have cancer.


Studies aren't there to cater to your ignorance. Burnout is well defined, and generally quite well understood by even the general public; this is not news. You switched from "soft science is all subjective nonsense" to "I don't like how it's written". You don't have to like it.


I don't think burnout is well defined at all - at least not as used colloquially. As I understand it there's a specific medical/mental health condition called burnout that is pretty horrendous. But in everyday usage it can mean everything from that medical definition to "I've had a rough week". I don't see any reason to assume which end of that scale a survey respondent might be thinking of when they answer.


Any serious professional or academic study contains a "Definition of Terms" section that explains how the researchers are defining key elements in their study.

https://www.slideshare.net/myotakustyle/definition-of-terms-....

https://www.mvorganizing.org/why-is-a-definition-of-terms-se...


You should be able to measure burnout by watching employee churn. Something software development is notorious for. I don’t think people jump companies every year and a half in most industries.


My last job lost all Sr. developers including me (I think 7 of 12 total developers) in an 8 month period. Everyone was absolutely fried and could not go on. Some of those people had been there between 4 and 10 years.


They would, if they could get a 20% raise without much fuss.

I was reading a Reddit post about an airline host who was ‘on the road’ to a 6-figure income - they just needed to stay with the same airline for another 5-6 years. "If you switch to another airline, you'll start at the bottom again." It's so strange to think that goes on, but apparently it does. Apparently pilots too, to some degree. Even if you had 15 years of flawless piloting for airline X, moving to airline Y you'd start off near the bottom again. What other professions are like this?


It makes sense from the company perspective. Pay for experience with your things, coworkers and processes - reward loyalty.

I think it would make even more sense for software development, since the code base usually is so extremely specific that there are no other experts than those who work with it. Still, the best way to increase your wage is to change jobs into something you'll have no clue about until a year or two of working with it. Saving pennies, losing dollars.


I can see the "keep people tied to the same company" side as having company benefits, sure. And yeah, almost any existing project I've come in to takes months (minimum) to understand the interplay of business/code/company/process/etc.


That's a very interesting insight! I was thinking of leading indicators and wasn't thinking of lagging indicators like churn.


Although I also think some part of that has to do with raise seeking. Lots of confounding variables at play.


I don’t think the soft scientists should take heat because what appears to be a report released by a marketing company was less than rigorous…


Recently I've been feeling more and more detached from work. I thought about taking annual leave but I'm covering for my boss who just went on paternity leave. My employer also coerced (forced) us to take a big chunk of our annual leave already because of concerns there will be a huge clash later in the year. So I feel reluctant to use even more of it and then suffer later in the year. I also thought about asking for unpaid leave but I just moved to a new role and it might be a bad look when I have my review in 6 months.

I'm having a harder and harder time sleeping. Sometimes I only get 2 or 3 hours and I'm up at 4am with what feels like a high level of anxiety, worrying about many kinds of scenarios. Sleeping pills I bought over the counter seem to have no effect.

Would be great if my employer gave us a week to breathe like some companies have, but we're also a startup who just got acquired, so no chance of that either. I'm struggling.


I was in the same boat as you. Definitely talk to someone, see someone, take something, etc. I didn't and it spiraled out of control.


I feel they should clarify what they mean, and what respondents meant, by 'burnout'. Burnout is an extreme state and I am extremely surprised that 83% would feel that way.

From the actual study "A total of 83% of Software Engineers reported feelings of burnout with only 17% reporting no burnout. 55%, a majority, reported “great” or “moderate” levels of burnout."

What is a "moderate level of burnout"? This is self-contradictory, so I'm not convinced that respondents really meant 'burnout' in the strict sense, and the question itself leads to this lack of clarity.


So burnout is a binary state of on or off?

Or is it a continuum where you go from slightly stressed to "I'm walking out that door right now and never coming back"


I'd say it's like a tiny smoldering piece of charcoal, that suddenly ignites when there's too much gasoline in the air. You may be aware of the signs early, and even know what your progression is, but just before the tipping point you may still be a completely productive worker.


I think one of the biggest factors for dev burnout is processes and lack of ownership/input devs are subjected to.

Most teams look at developers as just ticket takers and rarely involve them in the product or the decision making. Eventually you just tire of implementing the same stuff over and over like a code monkey.


I have a job where I get to do a lot of the requirements in addition to the development and trust me, it is just more work and I am more burnt out for it.


So what is the root cause? Is it just the constant (often fake) urgency, the dull and repetitive nature of sprints and working on tickets, or the volume of work?


Probably a combination of those things + for many, poor lifestyle choices will be an issue


A large part of the problem is that management doesn’t realize that writing software is a creative endeavor. You don’t get more creative output by working people harder; you do it by giving them more freedom. I may create more value with one spark of insight than with months of trudging along, yet almost all management is meant to optimize the trudge and not the spark. This also adds to the very low quality of most code (also cited in the study).

We see a lot of VCs blog about how you can’t succeed if you don’t work long, hard hours, but the truth is the complete opposite. You can’t operate at peak coding performance if you’re not rested, motivated and feeling creative.

4 day work weeks (or less), 100% flexible schedules, wfh and far fewer meetings are some ways you can combat burnout.


So true. I see programming as a very similar activity to writing poetry. I’m a filmmaker by trade, but I enjoy doing some code every once in a while, and it’s always felt to me very similar to poetry in the way you have to think about it and structure it.


Managers (MBAs and the like) will optimize only what can be tested and optimized. Can anyone define spark? We can all agree it's needed, but no regular management practice looks at it, because everybody knows it can't be optimized. So this is what we are left with: optimizing butt time. Yes, there's also the discipline of engineering managers, but they seem to always be on the losing side when the time for financial discussions comes.


Taking OP's "spark" comment at face value, if the reality is quality development isn't a steady curve but rather a nearly flat one followed by bursts of sharp peaks, then it does provide some rubric for how to manage the situation. Which is, manage it lightly, look only at very big milestones. Measuring and optimizing day to day incremental improvements might actually be counter productive.


> So this is what we are left with: optimizing butt time.

Hence low quality code bases and developer burnout. It’s up to us as developers to demand more from our employers. Thankfully the shift to wfh seems to have tipped the balance of power away from the MBA types, which is why you see all the pearl clutching blog posts about the return to the office and “working hard”.


I made the mistake early in my career of not taking enough vacation. If I'd just taken time off, I could've avoided partial burnout, coasting to cope with it, and nearly getting fired a few times. Funny how you learn things like this in your early twenties.


I read the linked article, then I read part of the original study the article was based upon. My Statistics 101 interpretation is that the study is bogus. First, you have to properly define what 'burnout' consists of; the study doesn't seem to do that. Second, even a well-designed study, which this is not, can't support strong claims from such a small sample. They interviewed 258 people for this. My conclusion: this "information" is worthless.


Slightly off-topic, but can we just talk about the fact that reading this article required me to load JavaScript from FIVE different sources? The user experience of me scrolling down some formatted text with an image on the top could have been replicated in HTML and CSS easily. And even if it simply HAD to have been done in JS, why so many sources? I don't know who intercom.io or webflow.com are, but I doubt your article needs them to load. And if it does, you need to stop refusing to load pages before you're done plopping slow animations and subtle trackers on me.


You can use your browser's reader mode. The content is all there in the HTML, but there's a bunch of `style="opacity:0"` attributes to hide them until JS removes the attributes.


It could be genetic. RCCX gene mutations may make people more susceptible both to CFS/ME and to being a software developer. For most people, CFS/ME will present as burnout.


It's very weird when people take correlation and present it as causation. I feel there are at least a few missing steps in the statement that people with that gene mutation are more susceptible to burnout. Like: people with that gene mutation "like to be independent", "like variety", and therefore the lack of those in regular 9-5 software development leads to burnout. Presenting it as causation has only one practical application: gene xyz is bad, don't hire people with gene xyz.


And I think it’s weird when people assume that nature would conform to morality. That would be convenient and I’m suspicious of convenience.

It’s a proposed causation hence the ‘may’ qualifier. I’m suggesting people should check it out further. They may be surprised.

Also, it has huge implications beyond employment. My particular RCCX mutations yield hEDS which I can now treat. I wish I would have known 20 years ago. That’s one of many treatable genetic disorders that people barely know about.

It would also mean burnout is curable, so instead of not hiring people susceptible to burnout we can instead cure it for everyone.


Only time I felt burned out was through horrible management.


I'd like separate breakdowns based on how long people have worked at the same company and same job.

I'm not sure who is more cynical: the contractor who has seen all companies operate similarly, or the "lifer" who has to put up with the revolving door of thankless projects.

I personally can go from burned out from my data job to having fun programming embedded projects.

All bets are off if I'm working more than 40 hours a week.


My pet theory is that software never ends and thus we don’t get closure. Which then exacerbates the feelings of burnout.


This is just what it's like to work a repetitive desk job. I imagine it's similar for lawyers and accountants.


There is some misery inherent to sitting on the dark side of a sunny window for the privilege of doing it all again the next day.


I've had fun doing it and been miserable doing it. It depends on the work and opportunity for creativity/impact.


As far as I can tell, neither the article nor the linked report actually defines the term “burnout”. Additionally, it is possible that the (258, U.K.-based) developers who bothered to fill out the questionnaire online were more likely to answer in the affirmative.


Shocking - when we're perpetually working in "sprints" - that we're burnt out and overworked.

Sample size of 1 obviously, but I recall feeling less work stress in the days before agile took over the world.


I went through a period of really horrible depression during the pandemic and I doubt that’s unique. Basic work tasks were harder and it felt meaningless. I’m better now but isolation was difficult


Burnout is a clinical condition and there are tests for it. You can't just ask people if they "feel" burned-out, as the term is often employed outside of the condition.


Poor developers in their safe, respected, well paid jobs. Perhaps they should quit and find something easier on their fragile little souls.


Developers are getting too spoiled. Did someone do the same research for teachers, for example?


If it turned out that teachers (and everyone else) also have an ~80% burnout rate, then that only shows a more fundamental problem. Not that anyone is being spoiled.

And it's okay if that starts by finding out that developers have a ~80% burnout rate. To be fair, you are at least correct that we should run the same study against teachers. (And really as many other people as we can find the time/money for.)


The thing is that these bare numbers say nothing without a "compared with". There should be a comparison with other job types, or at least with historical figures, before drawing any conclusions. Also, developers often get some type of compensation for such burnout, while other jobs often don't. Compare, for example, developers' problems with Amazon delivery drivers' problems with toilet access. A lot of jobs simply lack minimal comfort and compensation, and those workers can hardly complain.


This myth of "spoiled developers" definitely feeds into the problem. People feel like they cannot complain about their work conditions because they have nice toys and benefits at the workplace.


As a long time developer, I think we are pretty good at complaining about our work conditions, and are spoiled in many ways. We also have some very valid complaints about the bullshit we have to deal with, interfering micromanagers making us less productive and less happy, etc. All these things can be true at once. You don’t need to dismiss any of them as myths.


I think we learned this past year that at least a few teachers are quite exceptionally spoiled (disappointing, and for me a bit surprising). Maybe I'm biased by witnessing the antics of the California teachers union, but their ranks have been an embarrassment to the profession. Their continued push-back against returning to regular classes has grown tiresome (California has decided, under union pressure, to defy CDC guidelines and require masking and distancing despite vaccines and low risk -- "follow the science" is no longer spoken by them).

https://www.nbcnews.com/news/us-news/california-school-board...

https://meaww.com/alissa-piro-california-teacher-zoom-rant-h...


Yeah, teachers benefit from a social cachet that is outsized relative to their true efficacy.

They are not nearly as effective or as important as people make them out to be.


They are very important, but not always for the reasons that people make them out to be important.


Mind explaining? As institutional day care? I believe it.


I wouldn’t go that far. I do think that, particularly for primary education, some people tend to overinflate the importance of teachers as academic experts and discount their social importance.

The value that school provides kids, parents, and the community is multifaceted.

If anything, primary education is very important, not because kids learn basic arithmetic, but because they learn the foundational social concepts that will permanently shape their ideas about justice, respect, cooperation, etc.


I would go as far as to say school is industrial daycare. Asserting that schools provide value to society because people learn things within their boundaries is, in my opinion, like saying that caffeine pills provide value to tired people by helping them stay awake. It's not wrong per se, but it's a hackneyed solution that doesn't address the root cause of the problem. "Our citizens are illiterate and don't understand mathematics? I know; let's kidnap their children and store those children in warehouses for 18 years while they learn morals and social values from each other instead of from productive adults."

The "foundational social concepts" children learn in school are taught to them by full-time teachers, who are only capable of repeating various myths about society that they themselves believe. The effective method to learn foundational social concepts is to live in and contribute to society, not to sit in a sterile, age-segregated social environment and listen to quarter-truths spoken by apathetic non-experts, and then be tested on arbitrary collections of unrelated fact.

I am not arguing against the concept of education in general, but the notion that it can be taught in school is, I think, one of the most damaging problems in society. You simply cannot learn quickly and efficiently as a numbered head in a sea of competitors; good, proper learning takes time, personal attention, discipline, and does not pace itself to a standardized schedule. I offer this essay, not as the foundation of my anti-school beliefs (which are supported by dozens of studies, hundreds of essays, and thousands of personal experiences), but as the same literary aperitif that allowed me to ask myself "what if school isn't good for society?"

https://www.cantrip.org/gatto.html


> The "foundational social concepts" children learn in school are taught to them by full-time teachers

I disagree. Many (most) of the social concepts that children learn in school are not taught by teachers, they're taught through interactions with other students, with the teachers providing guard rails. They're completely orthogonal to the curriculum.

There are certainly plenty of myths repeated in classrooms by teachers, but I don't think the ratio is worse than the number of myths you hear in any other place. In fact, I would bet professional educators probably do a better job of not repeating myths than the general population, since they study formal pedagogical methods. Go read a public comment section on any news story -- you'll find plenty of examples of egregious myths repeated by average "productive adults".


I in turn disagree with you. Children learn social norms and practices through interaction with other children, sure. Social concepts, like "what is a country", "what is the most ethical legal system", "what is the history of our nation", "what is the right way to interact with other people", "should I go to school", "should I respect authority" are taught by teachers and administrators, who provide the explanation of the concepts and the punishment for when the kids disagree with the concepts, which they do often because the "concepts" pushed by school systems are ridiculous to the point of being inhumane.

The public comments section on any news story is inundated with people who spent a large portion of their lives in school. When they were in school, they were taught to respect authority, keep their heads down, specialize in a certain field, and accept wage labor conditions for the rest of their lives. They were also taught that proper education starts with an authority and is handed down via lectures and tests, and other kinds of knowledge are not valuable unless certified by another authority endorsed by the school system. This is true whether they went to a public school or to a private religious one- the "authority" the children are supposed to listen to just changes. Kids don't teach each other these lessons, the school system does.


I've been a teacher and a manager, every level of software engineer up to principal, a construction worker, an inner-city high school math teacher in an area with a heavy gang presence, and a graphic designer; I've also worked under a yard maintenance guy from Mexico and held several other small jobs. The software work has been among the easiest jobs I've experienced (even with mission-critical systems failing in the middle of the night, affecting millions of dollars in revenue).

Software developers at many, many companies are absolutely spoiled. I know there have to be negative stories, but on the whole, pay, benefits, perks, and even hours are better than a great many industries. Even game shop crunch hours seem about on par for doctors in residency, some (most?) finance shops, and I'm sure other less notable roles like my friend who works in water distribution who puts in 90 hour weeks regularly and just got off a 39 hour shift.


On the same logic, you can find tons of other 'easier' jobs that still experience burnout or boreout. What you're unconsciously doing is a form of the `fallacy of relative privation`. Sure, software engineering is an easier job than being a resident doctor or an inner-city math teacher or whatever. That doesn't mean it doesn't come with its own baggage/challenges/frustrations/mental-health issues, or that engineers don't get to complain about them just because they're the least fucked in your view of what an 'easy' job is.

Software engineers are ranked by the CDC higher with respect to suicide incidence than a lot of the jobs you're mentioning there (8th place out of 22). I guess being spoiled comes with the price of offing yourself more often than doctors, lawyers or teachers.


Your life isn't just your job. And your other qualities influence your career choice. Software attracts different people than law and medicine and teaching. If I had to bet, for example, I would guess that loneliness is more common among software engineers than lawyers, doctors, and teachers.


I've had many trades as well, before ending up as a software engineer. My hardest, most exhausting job was in the food industry. Entering office life was easy, the work a picnic, and I had plenty of energy coming home after work. With my previously attained 'work mentality' I could get way more done! ... and last year I had an extreme burnout event (I was hospitalised). In my experience burnout is a combination of steadily (over a long period of time) putting in more energy than you have available, not taking enough rest, and, most importantly, emotional disturbance (going against your nature/will).

For example: working hard while having a bad relationship at home or at work can easily lead to burnout.

Furthermore, software engineering seems to have a combination of traits which sets it apart:
1. A creative process that is hard to manage and easily leads to misunderstandings and conflict with the customer or employer.
2. Screen work enables stretching concentration well beyond human limits.
3. Modern western society tends to make people identify with their work too much (an easy trigger for emotional disturbance).
4. Programming is fun! Devs are treated so well! So why not endure those problems a bit longer...


Sure, but that has little or nothing to do with life satisfaction. Which is related to suicide rate.

In fact, being spoiled can lead to dissatisfaction. It's part of the problem I imagine.

Somebody doing something really hard for a tangible benefit to people they know has really healthy life satisfaction. Like the plow truck driver in winter, folks like that. For the obvious reasons.

We're all social animals. Suicide has to do with that. Not how easy your job is!


You are considering only anecdata from one person.

Only in one country.

And comparing software development only with stressful jobs.

And if you still think that burnout is not a bigger issue in the software world, go around and look at how many 40+ or 50+ year olds work in software.


Given the turnover in teaching in many places, they are also burned out.


The first sentence is false and the second is whataboutism.


Are software developers any different from other professions with regards to burnout?


Yes, we have the disposable income to take a few months off work when we want to, and can spend that extra time writing blog posts about how hard life is.

Everyone else just sucks it up.


Sure. Everyone else has their estimates planned as badly as devs do.


It might make us feel good to think we're unique and different, but realistically I can't imagine software development being much better or worse than being a mechanic, a carpenter, a real estate agent, etc.

I sometimes wonder how real estate agents avoid burnout. All of the ones I've worked with work most weekends and are on call 24/7 (because they give you their personal cell phone number to call and text any time).


Getting your fair share of the transaction helps.


Can we talk for a second about the fact that to even read this article I needed to enable JavaScript from no fewer than FIVE different sources? I just want some plain HTML, and maybe a little CSS. I don't want your fancy animations and 50 trackers. The modern web is doomed.


Study finds 17% of software developers can't feel anything anymore.


8/10 data scientists working on this article were experiencing burnout.


How do we compare to workers in other industries?


If it helps: I feel burnout.


What is N?


A bit clickbaity, as it's referring to only the first question in their study (https://haystack-books.s3.amazonaws.com/Study+to+understand+...), which further breaks down into:

I feel burnt out from work:

- 21% to a great extent
- 34% to a moderate extent
- 28% to a small extent
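For what it's worth, the 83% headline is just those three buckets summed. A quick sanity check (the bucket names are mine, not the study's):

    // Reconstructing the headline figure from the reported breakdown.
    const buckets = { greatExtent: 21, moderateExtent: 34, smallExtent: 28 };
    const total = Object.values(buckets).reduce((a, b) => a + b, 0);
    console.log(total); // 83, i.e. everyone except "not at all"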


Being even slightly burnt out should be alarming. Some people don't even acknowledge that they're burning out until they become unable to work and fall into depression. It's not normal to feel this way. Being burnt out is not just regular tiredness; it's a mental health issue.


Does everyone interpret the question as something that's headed towards being so depressed you're unable to work? Or would a lot of people interpret "I feel somewhat burnt out" as "I would enjoy a vacation and not needing to be productive".


But do the study's participants understand that definition of burnout? Burnout, the way it's thrown around these days, seems to be synonymous with regular tiredness in casual conversation.


Very likely not. There's no such thing as a small extent of burnout. 'A small extent of burnout' means 'I can feel my own boundaries and need to do something for self-care'.

I've often flirted with burnout to the point of taking an unwanted hiatus and failing to complete what I wanted to do. Better awareness of boundaries and self-care made that better, and STOPPING overcommitment worked best of all. It meant there are things I can't say about myself now, even as ambitions of mine. That didn't used to be true: I lived in the space between owning my ambitions and executing on 'em.

Stopping burnout means YOU ARE LESS. It's not a method of learning to work more efficiently and accomplish more. It's getting more right-sized, including your expectations.

Those who are going to be in actual burnout, who are going to begin falling into hurdles and breaking themselves, will either never acknowledge 'small amount of burnout', or the opposite: that's what they call it, no matter how great the burning wreckage surrounding them. They'll be legless and bleeding out and trying to drag themselves on by one good fingernail, and if you asked them, they would say 'this is a small amount of burnout'.

It's still a problem, but I'm betting some of the folks in that 83% total are willing and able to scale back their commitments and will then not be in burnout. Burnout is when you break.

Physical exercise counterpart: muscle building and strain, vs. rhabdo.


Honestly, as somebody not working in software development, these posts seem ridiculous to me. The pay is great, the physical work environment is good, and if you work for a FANG company you might get free food, access to a corporate gym, and other amenities.

Can somebody give an example of another job that is better?

I do agree, though, that America's work culture is bad as a whole and does not afford sufficient work-life balance in most jobs. But SWEs have one of the best jobs out there, it seems to somebody looking in from the outside. There is a huge lack of understanding of what other people do for work. Many jobs are much harder than an Amazon pick-and-pack position; I bring that up because Amazon is frequently mentioned as a bad employer here. See, for example, a junior investment banker working 80 hours a week making PowerPoint decks against tight deadlines.


> Can somebody give example of another job that is better?

Dentistry. I know many dentists and rarely if ever meet one who works more than 4 days a week. But I don't think that's a real retort to your point :p

> But SWE have one of the best jobs out there, it seems from somebody looking in on the outside. There is a huge lack of understanding what other people do for work.

Generally speaking, anyone working in a situation where their efforts at improvement are frustrated will feel burnt out. Of course this is common in crappy jobs too, and often manifests in highly managed employees who are driven with very strict metrics and work environments, because a constant threat of unemployment is the only method to sustain output. So it's a real phenomenon, and no amount of money or perks is the answer, in the same way that most people know that being rich isn't a silver bullet towards happiness. To oversimplify: you can want both riches and fulfillment, and lament the lack of either independent of the other.

More constructively, as an employer, if you are paying and treating your employees well and they are still unhappy, it might be that they are spoiled brats and you should fire them and find others. OR it might be that there is an actual problem that needs to be fixed. IME it's always been the latter, and failing to fix that problem has always been a missed opportunity for the places I've worked.


My feeling is that software development is made more complicated than it needs to be because of a bunch of churning frameworks and work processes that basically re-invent the exact same wheel, spokes and all, over and over again. The field desperately needs standardised work. Once the development process is standardised, the pay and the entitled blog posts will drop. I am quite confident, for example, that you could build a web app with the same functionality as today using 2015 technology, but it wouldn't work today because of churn-inspired obsolescence.



