EA is not Talent constrained or bottlenecked (archived)
Introduction
I am currently trying to figure out what sort of career I should go into: whether I should do global priorities research (GPR), earn to give (ETG), or get a Master's and build up skills in AI. I am really unsure of what the market has to offer, what the market needs, and where I can contribute.
As a result, I was trying to assess which skills were constrained, with the intention that I could potentially build up those skills and reduce the bottleneck. I primarily looked at 80khours posts and surveys. At that time I found the article "After one year of applying for EA jobs…", which led me into a rabbit hole of articles and evidence that ended up changing my view completely. I am writing this post in an attempt to clarify my thoughts and pool all the data there is, for others to see clearly.
In this post I start by exploring what "talent constrained" means. Next, I look into the different claims of TC, and then check whether those claims hold up, with at least one example each. At the end, I extrapolate what this means for me.
During this essay I hope to uncover the following:
- Talent constraint (bottleneck, funding constraint, etc.)
- How easy it is to get a job in EA (and how I compare to these people)
- What value you add working at an EA organization (EAO), and how to think about replaceability
- Whether working in EA is definitely better than ETG in most cases (stretch)
Definitions
We are going to be dealing primarily with the term "Talent Constrained" (TC). To avoid confusion, let us first understand what it means. 80khours defines TC in "Why you should work on Talent gaps" (Nov 2015):
For some causes, additional money can buy substantial progress. In others, the key bottleneck is finding people with a specific skill set. This second set of causes are more “talent constrained” than “funding constrained”; we say they have a “talent gap”.
OK! A cause is TC if finding people with a specific skill set proves difficult. The difficulty, I assume, lies in the lack of specifically skilled people and not in some process/management constraint. EA Concepts clears this confusion up with a better-worded ("actively hiring") example:
Organization A: Has annual funding of $5m, so can fund more staff, and has been actively hiring for a year, but has been unable to find anyone suitable… Organization A is more talent constrained than funding constrained…
Note: in this article, I discuss "orgs that are TC" and not "causes that are TC". I am unable, at this moment, to act on being told that the AI strategy cause is TC for lack of "Disentanglement Research" (DR). But if I know that FHI and many other orgs are TC in DR, then I can plausibly get skilled in DR and apply, closing the gap of people with that skill set. So looking at causes is less helpful and less concrete for me, and is not what I set out to uncover.
Different definitions and my thoughts in the footnotes:
Evidence for TC
80khours claims EA is TC, has claimed so since 2015, and claims the EA causes listed below are TC. Note: there has been some confusion with this term, and 80khours set out to clear it up. I am not sure life is any different today as a result of this clearing up.
EA has been, and is, talent constrained according to surveys run by 80khours and CEA since 2017. Several organizations seem to think so in these surveys: 2017 survey, 2018 survey, 2019 survey. In all of them, EAOs on average claim to be more talent constrained than funding constrained. For example, in 2019 EAOs reported feeling more talent constrained (3 out of 5 rating) and less funding constrained (1 out of 5 rating)1. More details in footnote 2.
80khours doesn't seem to have changed its position on this matter in its posts. Since 2015, 80khours has been suggesting that we should focus on providing talent to the community rather than ETG, in "Why you should focus on talent gaps and not funding gaps". They make the case that if someone can set up a charity that meets GiveWell's criteria, they seem to have access to tens of millions of dollars. Another example 80khours gave was about AI safety: the funds were sufficient as per Open Phil's evaluation, and there were people ready to donate even more, but they thought there wasn't enough of a "talent pool" (back in 2015)3. This continues through June 2017 in "Working at EAO" and "The world desperately needs AI strategists"4.
In August 2018, 80k can be seen saying that we need people to work on AI safety, Biorisk, EA, GPR, Nuclear security and institutional decision making.
Why did we choose these categories (Research, Govt policy, eff non profits, ETG)5? Why do we especially highlight research, policy and non-profit jobs; deprioritize earning to give; and omit advocacy and entrepreneurship?
In brief, we think our list of top problems (AI safety, biorisk, EA, GPR, nuclear security, institutional decision-making) are mainly constrained by research insights, either those that directly solve the issue or insights about policy solutions. — High Impact Careers Aug 2018
In Nov 2019, 80khours tries to clear up the "confusions" that arise when we talk about "talent gaps".
Rather than funding vs. talent gaps, we propose that people aim to identify specific bottlenecks facing the field and the skills needed to resolve them. A ‘bottleneck’ is the resource that a field most needs in order to make progress.
OK, so what if a cause is bottlenecked by a specific skill? Let me think… ah, talent constrained. It appears the term is not confusing under their initial definition, but somewhere along the way something happened. Let's read that article again and see where it gets us…
Today we usually recommend that people who are a good fit for filling these bottlenecks treat them as their first priority. This usually means initially considering relevant jobs in research, top non-profits and policy, and if you’re willing to consider something especially competitive, our list of priority paths.
In contrast, we rarely think that earning to give should be the top priority for people who could be a good fit for these other roles. This is another idea we hoped to highlight by talking about ‘talent constraints’.
However, we also recognize that our priority problems aren’t ‘talent constrained’ in general, and our priority paths require a fairly narrow set of skills. So, we continue to recommend building career capital and earning to give as a high impact option for people whose skills don’t match the particular constraints currently faced by our priority problems.
What I think Talent Constrained means
My initial mental model: EAOs are talent constrained when there are not enough capable people to work at them. There is a lot of demand and the supply is really low.
80khours defines it:
A lack of people with a specific skill set.
I am going to argue here that, for the majority of people, EA is not constrained by talent.
First we need to be more precise than 80khours, so we talk about the lack of a talent constraint in 80khours' suggested top career paths:
- AI strategy and Policy research
- AI safety technical research
- Grant maker focused on top areas
- Work in effective Altruism orgs
- Global priorities researcher
- Bio-risk strategy and policy
- China Specialists
- Earning to give in quant trading
- Decision making psychology research and roles
People sometimes act as if the main alternative to earning to give is working at an ‘effective altruism non-profit’. However, this misses many types of high impact roles including those in academia, policy and relevant companies, which could absorb far more people. Our recent survey showed that roles in policy are highly valued, as are research positions that could be done within academia.
(Footnote added at the end, still needs to be corrected: 6)
EA seems to be Talent Constrained (EA orgs say)
Rethink Charity, Open Phil Gen Researcher, TLYCS, EAF’s hiring round
I don't have a whole lot of evidence for every single one of the career paths. But looking broadly at "working at an EAO" or "global priorities research", there seems to be evidence about the claimed lack of highly skilled personnel, especially in the cases of operations people, senior hires in places like Rethink Charity, and generalist researchers.
Open Philanthropy (OP) hired 5 Generalist Research Analysts (GRs) in 2018, wrote about their experience, and provided some key numbers. Apparently there were hundreds of applicants with strong resumes who seemed quite aligned with OP's mission. 58 of them performed best on the work tests, 17 were offered 3-month trials, and 5 were selected. Furthermore, OP acknowledges that it doesn't have the capacity to deploy such a pool of available talent.
I don't know why GR is one of the most talked-about roles (top 3) in all the surveys done by 80khours. Maybe there is demand, but there is plenty of supply, as seen above. Calling global priorities research talent constrained seems like a joke, considering the lack of capacity to absorb talent, and yet it is one of the priority paths. I can hear 80k's defense: "but we will need them in the coming years". Really? You are going to absorb 50 people, and who knows how many more will be available in the coming years? All this alongside the "startups should hire slow" philosophy. Maybe there are plenty of these orgs and everyone gets a job? Unlikely.
EAF (Germany), similar to OP, gave a detailed account of their hiring round for Operations and Research Analyst roles. Without further research, I assume the RA position is similar to GiveWell's. Within a period of 2 weeks they got 66 applicants: 66 applicants → 17 work tests → 10 interviews → 4 trial weeks → 2 offers. They didn't want a longer application window due to organizational constraints. In the end they appear to have hired about 4% of their initial applicants. Considering that this is Germany (not really an EA hub), they at least had 17 potential candidates they wanted to test. It appears there are a lot of talented people out there for roles like these, even outside an EA hub.
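EAF's funnel can be laid out as simple arithmetic. A minimal sketch (the counts are the ones quoted above; the stage labels are my own):

```python
# EAF's reported hiring funnel, as quoted in the write-up above.
funnel = [
    ("applications", 66),
    ("work tests",   17),
    ("interviews",   10),
    ("trial weeks",   4),
    ("offers",        2),
]

# Pass rate at each stage relative to the previous one.
for (stage, n), (_, prev) in zip(funnel[1:], funnel[:-1]):
    print(f"{stage}: {n}/{prev} = {n/prev:.0%} pass rate")

overall = funnel[-1][1] / funnel[0][1]
print(f"overall offer rate: {overall:.1%}")  # prints "overall offer rate: 3.0%"
```

Note the offer rate computes to roughly 3% of initial applicants; the "4%" figure above presumably rounds differently or counts a later hire.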
Peter Hurford holds various titles, such as Vice Chairman of the Board, President of the Board, and Co-executive Director, across Charity Science, Rethink Charity and Rethink Priorities. He seems to have no problem finding talented people to work for him. In fact, he says (in 2019) that he finds it hard to turn down strong applicants, and still has to do it anyway.
I’ve certainly had no problem finding junior staff for Rethink Priorities, Rethink Charity, or Charity Science (Note: Rethink Priorities is part of Rethink Charity but both are entirely separate from Charity Science)… and so far we’ve been lucky enough to have enough strong senior staff applications that we’re still finding ourselves turning down really strong applicants we would otherwise really love to hire.—Peter Hurford says in the 2019 survey
Peter expresses his concern about the bemoaning of ETG. The message I had absorbed was: unless I can be a quant trader I am a loser, and it is better to focus on working at an EAO instead. And now even that is turning out to be brutally hard. And the one org we turn to for career advice sucks.
The Life You Can Save's Jon Behar agrees with Peter that this whole "talent bottleneck" framing doesn't match their experience: for them, money is what is most needed. They are more strapped for funding than for talent.
First, it took me a while, but now I understand what AI research splits into: (technical) AI safety research (done by ML engineers at OpenAI, MIRI, etc.), AI strategy and policy research (strategy and policy seem to be in the same bracket), and AI policy practice/implementation. This is roughly laid out in 80khours' policy guide, their technical safety research profile, and this post by Carrick Flynn. Now, for the presence or lack of AI safety and policy jobs. Miles Brundage (previously at FHI, now at OpenAI on policy), in the June 2017 podcast "The world desperately needs AI strategists", implies that the policy research space is growing and there will be more jobs, but also tells Rob that getting a job is pretty competitive and likely to remain so. Carrick Flynn, who managed hiring for FHI in strategy (in 2017), writes about the desperate need for people, but not now: currently there are very few positions, due to the problems with "disentanglement". That, apparently, is the bottleneck; he suggests developing skill (somehow) and waiting for the opportunities, with people from policy implementation and policy research on hold until disentanglement progresses. He also lists some broad bottlenecked areas, such as knowledge of Chinese politics, international law, etc., though I am unsure how that is expected to work. So AI strategy and policy, right now, is also not "talent constrained". Will it be in the future? There are claims to that effect, and possibly so, but how far into the future no one knows. It's Feb 2020 now, and most of the articles on 80khours in the strategy space are still from 2017, so I don't know if things have changed or whether there has been success with disentanglement. There is no sign yet that people are needed in the policy space. As Carrick Flynn points out, there are probably not many jobs and they are going to be super competitive; if I come in once disentanglement is done, it could be great.
The AI strategy space is currently bottlenecked by entangled and under-defined research questions that are extremely difficult to resolve, as well as by a lack of current institutional capacity to absorb and utilize new researchers effectively.—Carrick Flynn FHI June 2017
Although this is not clear in numbers, I get the impression that there are "very few" jobs and that the demand is mainly for disentanglement research. He even said "FHI is hiring but in limited capacity". I don't have data on how many people applied, but where data is available I look at the evidence. So there seems to be some serious demand for disentanglement; still, I want to take a stance and say that beyond this we are not TC'ed.
You don't expect Harvard to say they are talent constrained when they take in about 5% of applicants. The same with Y Combinator: they have on the order of 10,000 applicants and accept only 100-150 (~1%). There may be some places where EA is genuinely talent constrained; disentanglement research might be one.
So far we have seen that AI safety, the Rethink orgs, TLYCS and EAF all seem to be full of people, without much capacity to take in new players. Except for the one concrete case of disentanglement research, the jobs appear scarce and heavily oversubscribed.
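For comparison, here are the acceptance rates mentioned above side by side. A rough sketch; OP's applicant count is an assumed illustrative figure, since the post only says "hundreds":

```python
# Rough acceptance-rate comparison using the figures cited in the text.
# OP's denominator (400) is an assumption for illustration, not a reported number.
rates = {
    "Harvard":             5 / 100,      # ~5% of applicants admitted
    "Y Combinator":        125 / 10000,  # ~100-150 accepted of ~10,000 applicants
    "Open Phil GR (2018)": 5 / 400,      # 5 hires; applicant count assumed
}

# Print from most to least selective.
for org, rate in sorted(rates.items(), key=lambda kv: kv[1]):
    print(f"{org}: {rate:.1%}")
```

The point of the comparison: on these (partly assumed) numbers, EAO hiring rounds look at least as selective as famously oversubscribed institutions.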
Of course this is a narrow view, but my point is: I don't want to slog away for years and find at the end that it's too competitive and there are no jobs. If, in the final selection, I am just slightly better than the next person, what is the point? I don't know. And what if that person can't get a job anywhere else?
Below are some more specific options that are among the most promising paths we know. Many of them are difficult to enter – you may need to start by investing in your skills for several years, and there may be relatively few positions available. However, if you have potential to excel in any of these paths, we encourage you to seriously consider it, as it may be one of your highest-impact options.—High Impact Careers April 2019
So these are not bottleneck options, OK! But then why does 80khours want us to get better and work towards them? I thought focusing on bottlenecks was the thing to do! This is just too confusing for me.
Number of vacancies in a year? (~50?) Aaron Gertler:
https://forum.effectivealtruism.org/posts/pzEJmc5gRskHjGTak/many-ea-orgs-say-they-place-a-lot-of-financial-value-on?commentId=r3vv6HWvZ8riBfLEM
Here Jon Behar and Peter Hurford seem to be hitting the nail on the head. Please have a look.
EA is not talent constrained (people say)
This is also a proxy for: will I get a job?
I stumbled upon this post by accident; it is the most popular post (most upvotes ever) in the EA Forum, all time. When I started getting into it, my reality was completely shaken. And for the first time I had evidence, not just words (80k), about what this godforsaken TC meant. It sent me down a crazy spiral of evidence, which let me take two steps back, and has probably taught me to test 80khours' claims from now on. At least 80khours' claims; they have been misleading me for far too long. First it was CC (career capital), now it is TC. Anyway, taking 80khours apart properly is for another post.
Many people have applied. Many good, intelligent people seem to have applied and not made it, with a lot of effort going into each application. If you ask why, the answer mostly seems to be "ah, but EA is TC" (the great term coined by 80k). And 80khours distancing itself from this feedback, when it could offer feedback and shave years of wasted time off people's searches, is not nice. They market themselves as a career-advice institution for the whole world, but they are not; they are all about the elite. Then market yourselves that way: "if you don't have two Oscars, this advice is not for you." Up until two years ago, they wouldn't even say what they meant by "young" and "old". They love to keep it vague.
User "EA Applicant" wrote a post in Feb 2019 which garnered the most upvotes ever for a post on the EA Forum. That at least suggests this is an important topic. He applied to 20 positions and barely got a job, one year and some 500 hours later. He seems to have gotten a lot of feedback from other EAs at EAOs that he is "worthy" of being at an EAO. I very much agree with his take on what 80khours is saying:
“Hey you! You know, all these ideas that you had about making the world a better place, like working for Doctors without Borders? They probably aren’t that great. The long-term future is what matters. And that is not funding constrained, so earning to give is kind of off the table as well. But the good news is, we really, really need people working on these things. We are so talent constraint… (20 applications later) … Yeah, when we said that we need people, we meant capable people. Not you. You suck.”
Of course they didn't say it in those words, but that is the message received. Let's not digress.
So this EA Applicant from Germany has a couple of scholarships, 8 publications, many internships in the field of medicine, has led medical refugee camps managing 50 people, has taught university mathematics classes of 150 students at a time, and was ranked 16th out of around 6000 people in medical school. He applied to 20 positions of the "long-termism, EA movement building, grant-making" type and got rejected from most of them (3 he didn't pursue past the work trial due to "visa issues"). Apparently he spent around 400-800 hours over one year on this and is completely dejected to have got none of the jobs. He seems to have reached at least the 2nd stage in most interviews. He also claims to know "several" people who went to a great university like Oxford, were in the top 5%, led local EA chapters, were EA aligned and motivated, and, 5 applications later, faced 100% rejection. "Several people" messaged him after the post saying they had similar experiences.
Some may argue that 5 is a small number of jobs to apply for, but the pattern is consistent: EA is not really TC'ed in these kinds of talents, people ready to do research, program management, operations, you name it. In fact there are more people in the comments and in other posts who decided, possibly based on 80khours, that EAOs were going to be their life, only to find it's not really TC'ed. There are enough "good" applicants to pick from.
Saying "we need even better talent, and that is what we meant by TC" is not quite right, I think, because you can always say, in every case, that you want even more output from people. Another student reporting getting burned:
I’ve recently graduated from one of the top ~10 universities worldwide, after investing heavily in EA throughout my studies. While a student, EA was the biggest thing in my life. I read a lot, and several of my EA peers told me I stood out as particularly well-informed about EA topics, especially long-termist ones. Eventually I contributed some of my own research too. I also invested enormous amounts of time in student EA projects. Many people, including ones I thought well-informed about the talent landscape, fully expected that I would go work for an ‘EA organisation’. Naively, I believed it too.
Over the last seven months, I’ve made over 20 unsuccessful job applications (I keep a spreadsheet). This has increased the severity of my depression and anxiety. — Anonymousthrowaway
I tread carefully, as we are mostly dealing with words here, but if people who were well informed about the talent landscape didn't predict his outcome across 20 applications, then is EA still TC'ed?
Such investment seems to favor the elite, as pointed out by Milan Griffes. Only people who have the time to spend 500 hours in a year, without the stress of holding down a job, seem able to succeed (next section).
Something more brutal in all this subtext is that you suck if you ETG in anything other than quant trading. You are not worth a call from 80khours. As pointed out by this guy:
So instead I earn-to-give, and am constantly hit with messages (see above caveat! messages may not be real!) of “Why are you doing this? Nobody’s funding-constrained! Money isn’t real! Only talent constraints matter!” while knowing that if I tried to help with talent constraints, I would get “Sorry, we have 2,000 applicants per position, you’re imposing a huge cost on us by even making us evaluate you”.
https://physticuffs.tumblr.com/post/183108805284/slatestarscratchpad-this-post-is-venting-it
Joel Miller on applying to operations in this facebook post: https://www.facebook.com/groups/473795076132698/permalink/1077231712455695/
“wow, these people are really impressive, and I find it surprising that they could not find a job” — Max Daniel
Reply to Max Daniel's epistemic status:
But the point is that people like these can't contribute. That is it. And we are really unsure what on earth 80khours means, considering there are organizations that clearly state they don't have capacity. How can there be a talent constraint when there is no space to absorb talent? I don't trust it one bit. I think they try to generalize information, things go wrong there, and then they don't use examples; they just write long lecture-style notes. I need examples for every single claim I make. That does not seem to be 80khours' attitude.
Sending out personalized invitations…
The kind of people who do get in: Max Daniel, Aaron Gertler, lukeprog, etc., and what their credentials are (from the previous article). And someone who didn't get there? EA Applicant, Aaron Gertler, Max Daniel, all from:
https://forum.effectivealtruism.org/posts/jmbP9rwXncfa32seH/after-one-year-of-applying-for-ea-jobs-it-is-really-really?commentId=Aic5bcvLmunfhmnhr
Is there any bottleneck? (evidence)
There appears to be no bottleneck at this moment. In that case, consider roles chosen for personal fit, or ones with potential future bottlenecks?
EA Applicant talks about his friends, and Max Daniel about people he knows, who collected several rejections despite credentials such as:
However, I don’t think I am a very special case. I know several people who fulfil all of the following criteria:
They studied/are studying at postgraduate level at a highly competitive university (like Oxford) or in a highly competitive subject (like medical school)
They are within the top 5% of their course
They have impressive extracurricular activities (like leading a local EA chapter, having organised successful big events, peer-reviewed publications while studying, …)
They are very motivated and EA aligned
They applied for at least 5 positions in the EA community and got rejected in 100% of the cases.
I think I also fulfil all these criteria. Here is my CV roughly at the time when I was doing the applications. It sports such features as ranking 16th out of around 6000 German medical students, and 8 peer-reviewed publications while studying.
People working at EA organisations, sometimes in senior positions, were surprised when they heard I didn’t get an offer (from another organisation). But he did get several interviews.
Regarding me being a bit of an outlier: Yes, I think so as well. I personally don’t know anyone who applied for quite as many positions. I still don’t think I am a very special case. I also got several private messages in response to this post, of people saying they had made similar experiences. — EA Applicant
I know at least 2 people who unsuccessfully applied to a large number of ‘EA jobs’. (I’m aware there are many more.) I feel confident that they have several highly impressive relevant skills, e.g. because I’ve seen some of their writing and/or their CVs. I’m aware I don’t know the full distribution of their relevant skills, and that the people who made the hiring decisions are in a much better position to make them than I. I’m still left with a subjective sense of “wow, these people are really impressive, and I find it surprising that they could not find a job”.—Max Daniel
But 80khours continues to argue that there is a talent constraint. Where exactly is this talent constraint?
People working here don't seem to have the funds to grow; where is the talent shortage in that?
My biggest qualm is that these people from 80khours come out and say "oh, you misunderstood what we said", never "what we said was misleading". They think publishing one post of explanation is all they need to do. Meanwhile I, and several other EAs, keep spending enormous amounts of time trying to find a job that will add value.
The lack of thought that seems to go into loose terms like "career capital", "bottleneck" and "talent constrained" is worrying.
https://forum.effectivealtruism.org/posts/2XfiQuHrNFCyKsmuZ/max_daniel-s-shortform?commentId=nRciXmjukddKadzwZ
From another article
2019
https://forum.effectivealtruism.org/posts/TpoeJ9A2G5Sipxfit/ea-leaders-forum-survey-on-ea-priorities-data-and-analysis
“The 2019 results were very similar to those of 2018, with few exceptions. Demand remains high for people with skills in management, prioritization, and research, as well as experts on government and policy.”
Policy seems to have risen quite a bit since last year again, but only for EA as a whole, not for "My Org".
2018
https://80000hours.org/2018/10/2018-talent-gaps-survey/#appendix-2-answers-to-open-comment-questions
If I claim to look only at the "My Org" answers, since they capture what those organizations actually need, then at least the orgs listed the things they want:
Operations, management, GR, MI, and other vague hustle-type roles.
Although gov and policy experts are rated high on what EA needs as a whole.
https://forum.effectivealtruism.org/posts/TpoeJ9A2G5Sipxfit/ea-leaders-forum-survey-on-ea-priorities-data-and-analysis
Funding constraints are always rated lower than talent constraints, but close behind, on the 5-point scale.
It is pretty clear that software engineers are in low demand (6 vs 33), whereas ML engineers seem to be on par with GRs, both last year and this year!
Generalist researcher exit opportunities. All this maybe amounts to: if you do get a job, then you are worth a lot.
What are the high impact careers?
https://80000hours.org/articles/high-impact-careers/#4-apply-an-unusual-strength-to-a-needed-niche
80khours sucks
https://forum.effectivealtruism.org/posts/7bp9Qjy7rCtuhGChs/survey-of-ea-org-leaders-about-what-skills-and-experience#comments
Peter Hurford suggests that the whole discussion of the EA world being "talent constrained" seems bogus, since he has had no problem whatsoever finding people to hire. But then the question stands: why are these orgs reporting that they are talent constrained? He also seems to question the dollar figures orgs quote for trading a marginal employee, and 80khours suggests in this article that those numbers include some extra fluff and might not represent the true value of your worth.
I think we need to look at this more clearly. My god! This bears directly on what I should be doing, direct work vs ETG, the prime foundation. "An organisation reporting being 'talent constrained' doesn't necessarily indicate that they are about to hire a large number of people." — what?! https://forum.effectivealtruism.org/posts/pzEJmc5gRskHjGTak/many-ea-orgs-say-they-place-a-lot-of-financial-value-on
And that if you do get a job, then it is probably worth it. Man, the wording infuriates me. Give me examples, dammit. I don't trust reasoning built on wordplay.
AGB seems to point out that in an already-full talent pool with serious competitiveness, if I go and fight for a spot there are two outcomes: I get the job, and the rest of the overflowing pool (who did not make the cut) go find other jobs; or I don't get it, and whoever does is nearly as good as me, so the counterfactual difference is small either way.
To rewrite: the discussion on replaceability, and the claim that people are worth more than what they are paid.
I think it is a discussion relevant to whether I should work at an EAO or do DS, you know what I'm saying…
80k’s rebuttal to talent gaps Nov 2018 article
80khours continues to say, even today, that there is a bottleneck. Great. I don't know what they are talking about. I am unable to trust their findings, especially the ones without examples, as the onus lands on me to determine whether what they say is true. I really need a reality check, and the burden should not be on me; it should be on 80khours to provide examples of what they are saying and why. Otherwise we are again lost in the CC/TC word game of who understands what 80khours means. God help us.
We believe many of our top problem areas are highly constrained by specific skill sets, as we outlined for AI safety earlier. What’s more, there was an important shift in this direction from 2014 as additional large funders entered these areas, especially the Open Philanthropy Project. This increase in funding created a spike in demand for certain key positions that couldn’t be quickly matched by an equivalent increase in people able to fill them. This led to a bottleneck of people with these types of skills, which persists today.
I spent the last 2 years writing my ass off to follow their principles on CC and the rest, only to end up now (just by accident) in a position where EAOs are not going to be an option for me for a long time.
Another way to see the problem is that a typical job application process only accepts 1-10% of applicants. This means that even if an organization is 3-times keener to hire than average, its acceptance rate would still only be 3-30%, and most applicants will still not get the job.
(rolling my eyes! OK!) Then that fucking thing is not TC. There is no skill bottleneck. My problem is not with 3-30% (there could always be fluff applicants). My problem is with the way it is made to appear as though there are many jobs that are hard to fill because the right people don’t exist, whereas the reality is completely different. If you look at OP’s hiring round, you get the picture. They had 50 candidates of good stature to choose from, several of whom they would consider hiring in the future. So I am not really sure how this is a constraint in skill. Anyways, fuck 80khours and their definitions.
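The acceptance-rate arithmetic in 80khours' quote is at least checkable. A minimal sketch (the 1-10% base rate and the 3x keenness multiplier are their figures; the function name is my own invention):

```python
# Sanity-check 80khours' acceptance-rate arithmetic.
# Their claim: typical processes accept 1-10% of applicants, so an org
# "3-times keener to hire" still only accepts 3-30% of applicants.

def keener_acceptance_rate(base_rate: float, keenness: float = 3.0) -> float:
    """Acceptance rate for an org `keenness` times keener than average, capped at 1."""
    return min(base_rate * keenness, 1.0)

for base in (0.01, 0.10):
    rate = keener_acceptance_rate(base)
    print(f"base {base:.0%} -> keener org accepts {rate:.0%}, rejects {1 - rate:.0%}")
```

Even at the generous end, 70% of applicants are still rejected, which is consistent with their point; but note the arithmetic says nothing about whether the unfilled demand reflects a genuine skill shortage.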
My worries are further exploding because they think the proxy for their success is VIEWS. OMG! Are you serious, BT? Are these the kind of people you trust to change the world? Views! When you don’t care about the people who view it and all you care about is the super, super elite. Fuck off.
My qualms are that 80khours did a poor job with it. They think English is the best way to communicate, not numbers. They did this with CC being the focus; now it’s TC. My biggest fear is blindly listening to more of their English without being able to test their claims. That is the danger of the situation. Most people (at least me and the EA applicant) seem to rely heavily on their “research”, but I am not sure it is that good, or at least not the way they talk about it. Look at how loosely they use words like MANY (without numbers or actual examples of what they mean) in the 2018 survey, and TC, and CC. Where do they get their info from? More explanation is needed. Maybe they are trying to generalize; please stop that… your OK is not my OK. (Everything with examples.)
Conclusion
-
The surveys are wrong, pointless, or not trustworthy
-
Other than Disentanglement I do not know of any other bottleneck for the long-termism gang.
-
GR, Operations, AI policy and strategy seem to be saturated and full, without many new jobs, including at Open Phil and FHI
-
Working outside the EA, in Policy
-
I don’t trust 80khours to communicate right
-
For me, its al
Reflections on my life.
I think I am at a crossroads and am very, very unsure what I can excel at. ML seems lucrative, as at least I would potentially have a job even if I am not good enough for a technical AI safety job.
The problem is that, with the way the bottlenecks have been framed by 80khours, I don’t really know what to do. I am unsure how to proceed.
Does the world need managers? How do I become one? Is that also NOT TC?
Also, I want evidence for whether something is TC or not. I don’t want to trust words. I want someone to talk in numbers.
Wrap it up… and prepare to post on EA blogs and get feedback. How is one supposed to decide where to work? Don’t just tell me “based on personal fit”, whatever that advice is…
Bemoaning ETG
https://80000hours.org/articles/high-impact-careers/#global-priorities-researcher
ETG last spot (if you have a personal fit dialogue)
Working at EAO
talent gaps 2015 article
peter hurford
EA applicant
and several others
MIRI wanting money
Number of people doated from less wrong?
EA is competitive
In conclusion
I don’t know what all those orgs in the surveys were talking about. Does anyone, though?
Considering the supposed need for a particular type of worker (as in the survey) might not be ideal; i.e., looking at the DEMAND alone is not useful, as there are 100, or at least 58, other GRs for 5 positions, and there is no evidence that demand is suddenly going to grow like crazy and absorb them. Hence my claim of non-bottlenecks.
It appears useful to consider how to work in AI safety due to the claim of potential bottleneck in that region too, but place caution on how bottlenecked it becomes.
Additionally, the value of being 1% better than the previous hire (difficult to measure) needs to be estimated to give me the drive to actually try and beat all these GRs. There is also a fear that I might not be working for an EAO anytime soon (possibly ever in my life? nah).
I always thought I was going to become a GR, but I am strongly considering against it. I need to check the value of what I could be, or maybe there are some REAL places where the supply is shit, which might need me.
More organizations should publish data like OP or EAF did for each hiring round, to let people understand how bad or good the scene is and how strong the candidates were (some quantitative measure to compare).
The market is so competitive, and there don’t seem to be examples of TC except in Disentanglement, as far as I know or have examples for.
points about 80khours
-
Waste of time of people
-
English, English, English… “many”… how many, though?
-
squishy terms, career capital talent constrained
-
Inability to apologize for the possible miscommunication
-
Lack of ability to provide a proper reply when asked about the time spent by EAs
-
Let’s not lose focus. We want to identify what we should be doing. Surprisingly, this post was useful with its output on GR and the lack of demand, or at least the abundance of supply.
-
0 accountability (e.g., TC has 0 examples). It’s like they say something and we have to take it at face value.
-
Making claims without evidence (all with regard to TC). I need statements that I can test. 80khours is not reliable.
-
Trying to generalize, goddammit.
Is there anything that is talent constrained?
All this appears to me like really fake hype. And further, to cover what they actually meant, they said they were talking about even smarter talent. I think the one and only example I have is AI safety, where “Disentanglement” is a big deal and not many people seem able to do it, according to one of the operations guys from a certain EAO.
80k’s explanation
“An organisation reporting being ‘talent constrained’ doesn’t necessarily indicate that they are about to hire a large number of people.” — WTF https://forum.effectivealtruism.org/posts/pzEJmc5gRskHjGTak/many-ea-orgs-say-they-place-a-lot-of-financial-value-on
Benjamin Todd, in the same post, continues to say he feels they are TC.
Takeaways
Certain jobs seem to be not constrained at all. There are not many jobs available; EAOs are hiring slowly.
Rob Wiblin should take questions from listeners about what topics they want to hear more about. keiran@80000hours.org
Examples
And also, in the same article, they include shit like,
Interestingly, many of the organizations report being neither heavily constrained by funding or talent,…
Talk about being vague. WTF is “many”! Jesus.
The naming of posts, my god. Aug 2018: “Highest impact career paths”.
April 2019: “Our list of high-impact careers”.
I find this confusing; if something is no longer current, put a note on it and deprecate it. God dang it.
And they look so similar in the index. Jesus! Fuck you guys!
Diff definitions
There seems to be more than one definition:
Another factor is that hiring takes up a lot of senior staff time – you need to source candidates, train them, test them out, and many won’t work out. Moreover, a bad hire could easily harm morale, take up a lot of time, or damage the reputation of the organisation, so there is a lot of risk. This means that it takes a long time to convert funding into good new staff, creating a talent bottleneck. But if a potential hire takes a short amount of time to evaluate, train and manage, they often wouldn’t get replaced for a long time. — Working at EAO June 2017.
Confusing fuckers.
Another definition being:
A cause is constrained by a type of talent, X, if adding a (paid) worker with talent X to the cause would create much more progress than adding funding equal to that person’s salary. — Focus on talent gaps
80khours defines TC in “Why you should work on Talent gaps” (Nov 2015):
For some causes, additional money can buy substantial progress. In others, the key bottleneck is finding people with a specific skill set. This second set of causes are more “talent constrained” than “funding constrained”; we say they have a “talent gap”.
In the same article there is a “slightly more precise definition”:
A cause is constrained by a type of talent, X, if adding a (paid) worker with talent X to the cause would create much more progress than adding funding equal to that person’s salary.
This was initially confusing and made me wonder, “what does this have to do with being talent constrained?”. There are not many people with the specific skill to do Disentanglement research; it is thus TC. Adding a good paid worker to FHI in Disentanglement research seems vital, as it could allow the many areas in policy and strategy currently on hold to continue making progress. Whereas adding the researcher’s salary (say $300k per year) to FHI (the AI strategy and policy cause) doesn’t seem to do much, as it appears to have enough money from Elon Musk, Open Phil, etc. to fund all the high-quality proposals it needs. The above quote seems to check out. The statement also holds when we look at Global Health being funding constrained, as in the case of AMF.
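At least this “slightly more precise definition” is testable in principle. A minimal sketch of the test, with entirely made-up progress numbers for the two examples (the margin and all figures are my assumptions, not anything 80khours published):

```python
# Hedged sketch: is a cause talent-constrained in skill X?
# Per the 2015 definition, adding one worker with skill X must create
# much more progress than adding funding equal to their salary.
# All numbers below are hypothetical placeholders.

def is_talent_constrained(progress_from_worker: float,
                          progress_from_equal_funding: float,
                          margin: float = 2.0) -> bool:
    """True if the worker beats salary-equivalent funding by at least `margin` times."""
    return progress_from_worker >= margin * progress_from_equal_funding

# Disentanglement researcher at FHI: funding is plentiful, so an extra
# $300k of salary-equivalent money buys little; a capable researcher
# unblocks downstream policy work. (Illustrative units of "progress".)
print(is_talent_constrained(10.0, 1.0))   # -> True: looks TC

# Global health at AMF: money converts straight into bednets.
print(is_talent_constrained(1.0, 1.2))    # -> False: funding constrained
```

The point of the sketch is only that the definition forces you to estimate two marginal quantities; 80khours never supplies those estimates, which is exactly my complaint.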
In Nov 2018 80khours clarifies the TC description,
An organisation is talent constrained when, for someone who could take (a reasonably important) job at that organisation, they would typically contribute more to that organisation by taking the job than earning to give.
Why the definition can’t be the simple one at the top, and needs to keep having variants, is puzzling to me. This re-definition also stands the test of the examples: Disentanglement research (at FHI) being TC and global health work (at AMF) being FC.
EA Concepts defines TC similarly to 80k’s 2015 post,
Organization A: Has annual funding of $5m, so can fund more staff, and has been actively hiring for a year, but has been unable to find anyone suitable… Organization A is more talent constrained than funding constrained…
Very clear. A definition you would expect from an org like 80khours, but…! When there is a lack of talent X in a particular cause and people are actively looking for it, then the cause is TC in X.
Definitions that I don’t want to accept, as its just getting broad and pointless
I don’t think what 80khours says in “Working at EAO” matches the definition we started off with earlier.
Another factor (for orgs being TC) is that hiring takes up a lot of senior staff time – you need to source candidates, train them, test them out, and many won’t work out. Moreover, a bad hire could easily harm morale, take up a lot of time, or damage the reputation of the organisation, so there is a lot of risk. This means that it takes a long time to convert funding into good new staff, creating a talent bottleneck. But if a potential hire takes a short amount of time to evaluate, train and manage, they often wouldn’t get replaced for a long time. — Working at EAO June 2017.
The definition of TC is that an org is actively looking for people and is unable to find them, like in the case of Disentanglement research. I don’t know where the time taken to hire plays a role. OP set out to find 5 people and found more than 5. The hiring round took months. People would laugh if anyone said OP is TC in GR. Sure, they would like to hire more people without spending any management or operations time. Sure, but such people don’t exist. Saying that OP is TC because it wants new people but is unable to hire them sounds to me like a misunderstanding of TC, or an attempt to bloat the definition of TC just like they did with CC. I am really getting frustrated by the lack of thought that goes into these writings. At best this is something like management constrained, not TALENT constrained.
Moving on.
40000 definitions of TC
Sentences that add little or no value (highly circular)
We try to highlight how our views depend on problem selection in our recent article and the survey. For instance, global health is significantly more funding constrained than global catastrophic risks, so earning to give is a relatively more attractive path if you’re focused on health — though as per point 2, additional funding is useful in both.— Nov 2019 Clarifying talent gaps
Today we usually recommend that people who are a good fit for filling these bottlenecks treat them as their first priority. This usually means initially considering relevant jobs in research, top non-profits and policy, and if you’re willing to consider something especially competitive, our list of priority paths.
In contrast, we rarely think that earning to give should be the top priority for people who could be a good fit for these other roles. This is another idea we hoped to highlight by talking about ‘talent constraints’.
However, we also recognize that our priority problems aren’t ‘talent constrained’ in general, and our priority paths require a fairly narrow set of skills. So, we continue to recommend building career capital and earning to give as a high impact option for people whose skills don’t match the particular constraints currently faced by our priority problems.
How about some decisive advice: “Apply to us; if we don’t pick you, the priority paths are not for you, so just ETG.” Something I can test based on your “advice”.
Lack of accountability: no reply to people who have wasted years based on their advice.
Bloating a definition so that it becomes untestable or useless
Another factor is that hiring takes up a lot of senior staff time – you need to source candidates, train them, test them out, and many won’t work out. Moreover, a bad hire could easily harm morale, take up a lot of time, or damage the reputation of the organisation, so there is a lot of risk. This means that it takes a long time to convert funding into good new staff, creating a talent bottleneck. But if a potential hire takes a short amount of time to evaluate, train and manage, they often wouldn’t get replaced for a long time. — Working at EAO June 2017.
They did the same thing with CC as well.
Unclear motherfuckers
It appears that another way to arrive at these “priority career paths” is to look at the problem profiles and check what the bottlenecks are. For example, in the profile on AI (March 2017), we see that 80khours calls for people to help with AI technical research, AI strategy and policy, complementary roles, and advocacy and capacity building. Here again, they strongly discourage ETG, as there are enough funds. So, basically, EVERYTHING IN AI. I guess it is the same with each problem profile, resulting in the above career paths.
Personal fit
And “if I am a good fit for these”, then I should apply. There is no way for me to determine that I am a good personal fit, especially considering that there are several other people and that it is competitive as shit.
Another piece of evidence would be the problem profiles and their alleged bottlenecks, such as in AI safety, strategy, and complementary roles, where again they say TC, TC, TC. I am tired of how loosely they use the word TC, and then they have the audacity to say it applies not in general but only in specific areas. So to me, all this screams that there are several places where I am needed. And “if I have a good personal fit for these”, then I should focus on them (identifying whether I have a good personal fit is a problem beyond me).
Until I stumbled on the EA forum (by accident), about 2 weeks back, I didn’t know how to get advice on these things, or how to look up people similar to me to see who is a personal fit and who is not. Until then, the word of 80khours was all I had. Things like these make me think there is a huge demand.
Clarification on talent constrained
Why aren’t there many more EA’s then?
Why aren’t EA hiring like crazy ?
Is bottleneck and talent constrained the same thing?
EA is funding constrained
They have been making quite a fuss about working at EAOs and not ETGing (as in “Why you should focus more on talent gaps”).
ETG, and that too as a quant in trading, is ranked 8th in their top priority paths, behind AI safety, policy and strategy, China specialists, working at EAOs, and doing global priorities research. 80khours suggests against ETG unless it is as a quant in trading or hedge funds (their 8th priority path), probably due to the potential $300k average donations per year.
If you’re able to take a job where you earn more than you need, and you think none of the categories above are a great fit for you, we’d encourage you to consider earning to give. It’s also worth considering this option if you have an unusually good fit for a very high-earning career.
In brief, we think our list of top problems (AI safety, biorisk, EA, GPR, nuclear security, institutional decision-making) are mainly constrained by research insights, either those that directly solve the issue or insights about policy solutions.
On the other hand, there is currently more funding available than is being spent in these areas, so earning to give doesn’t seem like the key bottleneck. —high impact careers(Aug 2018)
How easy is it to get into an EAO?
80khours makes it sound like it is very easy to get into an EAO:
What are the predictors of success? Based on our experience, the people most likely to excel at EA organisations tend to have the following traits:
They really care about effective altruism, and are happy to talk about it all day. This is one of the main things the organisations look for, and it can be hard if you don’t share the same level of enthusiasm about effective altruism as other staff. They’re excited and enthusiastic about the mission of the specific organisation they work for. You get a lot of responsibility in these roles, and it can be hard to sustain the intensity and effort required to succeed without being excited by the mission and strategy of the organisation.
They’re self-directed, able to manage their time well, they can create efficient and productive plans, and keep track of complex projects. — Working at EAOs
More shit like this in “How can I get these jobs?” at Working at EAOs, such as: volunteer at EAGx, help run a local EA group, participate in feedback and reviews, etc. (Why I say “shit” will hopefully be clear later.)
vague
So the survey is supposed to inform us of what we need more of. When I see that GRs are needed, I know GRs like those at Open Phil or any of the other EA orgs are what is needed. If they say operations people are missing, then I think they need people like Tanya to run EA orgs. Simple.
But when people say “one-on-one skills”, it could mean many things (as AG pointed out): talking to politicians about causes (policy people, maybe); people who are good at convincing others to change their career path (career counselors); it could also mean fundraisers on the front line bringing in the big bucks. I am sure we can think of other things as well. So what did the people in the survey mean? Everything? Why not just say “we need policy people with good social skills” instead? At least this way the community knows exactly what is missing, rather than leaving it open to interpretation. This way, people can act on it. What does “good social skills” mean? Well, let’s not get into that discussion as well. If I and the surveyors were given one example, like Tanya for FHI, it would make things concrete, and people would know exactly what the survey is talking about.
I am not a big fan of these broad terminologies as they don’t allow ME to act on them. Case in point: “Best ways to gain Career Capital (CC) are: Work at a growing organisation that has a reputation for high performance; Getting a graduate degree; Working in Tech sector; Taking a data science job; Working in think tanks; Making “good connections”, Having runway etc… “ Literally everything under the sun.
I am unable to act on it. I could in theory pursue everything. I don’t know how to compare which option has higher CC and which lower. The definition says: “CC puts you in a better position to make a difference in the future, including skills, connections, credentials and runway.” When I work in Data Science at a FAANG company, do I have higher CC than when I do a computer science degree? I don’t know.
Economists routinely measure the impact of high-school dropout vs. high-school diploma vs. some years of college vs. undergrad degree vs. grad degree, in different fields, using variables like “median weekly earnings” or “lifetime earnings”. So when someone says “you need a degree to get ahead in life”, I can imagine what they mean: a $470 weekly wage increase. Whereas when someone says “a Computer Science PhD is good CC”, I am lost. Contrast that with saying “the best way to compare CC is by looking at earnings”. Then I could look at the median earnings for a Data Science FAANG job vs. a PhD in computer science at, say, a top-20 university (based on my capability) and get ahead in life.
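That economist-style comparison becomes mechanical once a measurable proxy is fixed. A sketch with placeholder figures I invented purely for illustration (real median-earnings data would go here):

```python
# Comparing "career capital" options by a measurable proxy:
# median annual earnings. All figures below are hypothetical placeholders.
options = {
    "DS at FAANG": 160_000,
    "CS PhD (top-20 school, stipend)": 40_000,
    "Think tank analyst": 70_000,
}

def rank_by_proxy(opts: dict) -> list:
    """Sort options by the chosen proxy value, best first."""
    return sorted(opts.items(), key=lambda kv: kv[1], reverse=True)

for name, earnings in rank_by_proxy(options):
    print(f"{name}: ${earnings:,}/yr")
```

Earnings are obviously an imperfect proxy for CC (they ignore connections, credentials, and runway), but any quantified proxy would at least make 80khours’ rankings testable.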
Replaceable
https://80000hours.org/career-reviews/working-at-effective-altruist-organisations/#but-wont-i-be-replaceable
read it
Also some statements about value… https://80000hours.org/career-reviews/working-at-effective-altruist-organisations/#but-wont-i-be-replaceable
Value
EA people are valued super high :https://forum.effectivealtruism.org/posts/pzEJmc5gRskHjGTak/many-ea-orgs-say-they-place-a-lot-of-financial-value-on
Milan Griffes’ estimate on 80khours says he could contribute $97k tomorrow if he worked at GiveWell https://80000hours.org/2016/08/reflections-from-a-givewell-employee/
Calculating value is very tricky
ETG vs EA (https://80000hours.org/2015/11/why-you-should-focus-more-on-talent-gaps-not-funding-gaps/)
Milan Griffes
How much value 80khours thinks you have, based on asking orgs what they would pay for their last hire.
What did I think before, what changes now to my future?
How hard is it to get a job in EA now?
There seems to be money; even trying to earn to give seems pointless, considering that at max I can be a data scientist. Entrepreneurship and starting a non-profit could well be things in the bank. –> based on the article on talent gaps vs funding gaps
References
EA is TC (footnote)
EA has been and is talent constrained according to surveys by 80khours and CEA since 2017. Several organizations seem to think so in these surveys: the 2017 survey, 2018 survey, and 2019 survey. In all the surveys, EAOs on average claim to be more talent constrained than funding constrained. For example, in 2019 EAOs reported feeling more talent constrained (3 out of 5) and less funding constrained (1 out of 5)1. More details in footnote 2.
Since as early as 2015, 80khours seems to have been suggesting that we should focus on providing talent to the community rather than ETG, in “Why you should focus on talent gaps and not funding gaps”. They make the case that if someone can set up a charity that meets GiveWell’s criteria, then they seem to have access to tens of millions of dollars. Another example 80khours gave was about AI safety: that the funds were enough per Open Phil’s evaluation, and that there are people ready to donate even more, but there isn’t enough of a “talent pool” (back in 2015)3.
In June 2017, in “Working at EAO”, they quote the survey to inform people that EAOs are TC. Around the same time, a podcast is titled “The world desperately needs AI strategists”, where Miles Brundage talks to Rob about AI strategists. Here Miles makes the claims that AI strategy is a growth area and that jobs are currently “few” and “pretty competitive”. Just as with the growth of AI safety, there is expected to be growth in this space as well in a couple of years. Whether the adjective “desperately” was warranted is a separate debate, but the message seems to be that there is a LOT OF TC (at least based on the title).
As late as August 2018, 80k can be seen saying that we need people to work on AI safety, biorisk, EA, GPR, nuclear security, and institutional decision-making.
Why did we choose these categories (Research, Govt policy, eff non profits, ETG)5? Why do we especially highlight research, policy and non-profit jobs; deprioritize earning to give; and omit advocacy and entrepreneurship?
In brief, we think our list of top problems (AI safety, biorisk, EA, GPR, nuclear security, institutional decision-making) are mainly constrained by research insights, either those that directly solve the issue or insights about policy solutions. — High Impact Careers Aug 2018
In Nov 2019, 80khours tries to clear up the “confusions” we create when we talk about “talent gaps”.
Rather than funding vs. talent gaps, we propose that people aim to identify specific bottlenecks facing the field and the skills needed to resolve them. A ‘bottleneck’ is the resource that a field most needs in order to make progress.
Ok, what if a cause is bottlenecked by a specific skill? Let me think… ah, talent constrained. It appears the word is not confusing per their initial definition. But somehow, somewhere along the way… wait, I am not sure what happened. Let’s read that article again and see where that gets us…
Today we usually recommend that people who are a good fit for filling these bottlenecks treat them as their first priority. This usually means initially considering relevant jobs in research, top non-profits and policy, and if you’re willing to consider something especially competitive, our list of priority paths.
In contrast, we rarely think that earning to give should be the top priority for people who could be a good fit for these other roles. This is another idea we hoped to highlight by talking about ‘talent constraints’.
However, we also recognize that our priority problems aren’t ‘talent constrained’ in general, and our priority paths require a fairly narrow set of skills. So, we continue to recommend building career capital and earning to give as a high impact option for people whose skills don’t match the particular constraints currently faced by our priority problems.
What I think Talent Constrained means
EAOs are talent constrained when there are not enough capable people to work at them. There is a lot of demand and the supply is really low; that was my thought process.
80khours defines it:
A lack of people with a specific skill set.
I am going to argue here that, for the majority of people, EA is not constrained by talent.
First, we need to be clearer than 80khours, so we talk about the lack of a talent constraint in 80khours’ suggested top career paths.
- AI strategy and Policy research
- AI safety technical research
- Grant maker focused on top areas
- Work in effective Altruism orgs
- Global priorities researcher
- Bio-risk strategy and policy
- China Specialists
- Earning to give in quant trading
- Decision making psychology research and roles
People sometimes act as if the main alternative to earning to give is working at an ‘effective altruism non-profit’. However, this misses many types of high impact roles including those in academia, policy and relevant companies, which could absorb far more people. Our recent survey showed that roles in policy are highly valued, as are research positions that could be done within academia.
Footnotes
about ETG as in the later paragraph of the quotes.
from is ea bottlenecked 2
Qualms with 80khours
I am really frustrated with 80khours. At least to my knowledge, they have made the following claims:
- replaceability
- TC
- Pushing CC
- how easy getting a job in EA is
- They write in English (not numbers)
- They don’t think they need to apologise
-
Very bad explanations: Think Twice, Talent Gaps, the 2019 survey, EA work about cash
-
Lack of decisive advice (saying just about everything)
-
bloating a definition to make it useless
- Contradiction: the profile on AI says basically everything, whereas it is somehow TC only in specific areas of AI.
TC not in general, but for AI it is in general; for working at EAOs it is in general.
-
Lack of updating of posts.
- Several definitions for TC (god help me)
-
Repeat posts (not sure what the difference is)
- Mistakes mistakes mistakes
recent hire cost,
- useless advice
“Go to the organisations and ask them how much donation they would take instead of hiring me”: irrelevant. BS.
-
Not in one place do they seem to mention the EA forum (my savior)
-
how useless ETG is
-
untestable claims unless for EA forum:
In brief, we think our list of top problems (AI safety, biorisk, EA, GPR, nuclear security, institutional decision-making) are mainly constrained by research insights, either those that directly solve the issue or insights about policy solutions.
Look at is-ea-bottlenecked.markdown
- One of the hardest things to do is to estimate the impact, and they conveniently leave that out. They get very creative with their example-giving.
Other
- They expect English to convey shit… not numbers.
When someone asked that 80khours should consider the cost to us of applying and finding out how much life sucks, they literally made a shitty comment.
Peter expresses his concern about the bemoaning of ETG. As evidence for his claim: I believe that unless I can be a quant trader I am a loser, and that it is therefore better to focus on working at an EAO. And now even this is turning out to be hard as fuck. And the one org we turn to for career advice sucks.
https://80000hours.org/career-reviews/working-at-effective-altruist-organisations/#choosing-which-organisation-to-work-at
useless advice
https://80000hours.org/key-ideas/#most-pressing-problems contradictory advice in the 2019 article: operations staff are still a bottleneck
How replaceable is a GR, actually?
I tried to get data, but that didn’t work out in this case. But is this what the world is undergoing? Is it true that whoever doesn’t get the job has to do something significantly worse? According to Peter Hurford, yes; according to his friends, yes.
So now that EA is not TC, at least for the type of jobs I was planning to upskill myself in (GR, AI safety guy), I need another plan.
I was thinking of working at GiveWell, maybe, and going further from there. Or doing a master’s in AI and finding my way in from there.
You won’t get into EA
Start with 80khours telling people they shouldn’t ETG (as it is pointless). They should focus on
and maybe list all the people who didn’t get in
Orgs are hiring slowly; they remain small…
How I think 80khours misleads everybody
-
definitions
-
no examples
-
working for the elite
Or try to see if I can make the things I have written earlier into a post.
Or maybe it is just better to take mild digs at them… here and there.
Furthermore, 80khours has a history of mistakes. They have gone back and forth on replaceability since 2012; they realized they put too much stress on career capital; they downplayed how hard it is to get a job in EA. I am having a hard time understanding how several people
-
1 = how much things cost is never a practical limiting factor for you; 5 = you are considering shrinking to avoid running out of money
1 = you could hire many outstanding candidates who want to work at your org if you chose that approach, or had the capacity to absorb them, or had the money; 5 = you can’t get any of the people you need to grow, or you are losing the good people you have
-
In the 2017 survey they (80khours) say,
On a 0-4 scale EA organisations viewed themselves as 2.5 ‘talent constrained’ and 1.2 ‘funding constrained’ (average),…
In 2017, surveyed orgs seemed to want forecasting/priorities capabilities, GRs, management, and operations as the top 4.
2017 survey includes:
The survey includes (number of respondents in parentheses): 80,000 Hours (3), AI Impacts (1), Animal Charity Evaluators (1), Center for Applied Rationality (2), Centre for Effective Altruism (3), Centre for the Study of Existential Risk (1), Charity Science: Health (1), Foundational Research Institute (2), Future of Humanity Institute (3), GiveWell (2), Global Priorities Institute (1), Leverage Research (1), Machine Intelligence Research Institute (2), Open Philanthropy Project (5), Rethink Charity (1), Sentience Institute (1) and Other (6) (who were mainly researchers).
In the 2018 survey they (80khours) say,
On a scale of 0 to 4, respondents saw themselves as 2.8 constrained by talent and 1.5 by funding, similar to last year and consistent with the donation trade-off figures.
“The effective altruism community’s greatest talent needs are in the fields of operations, management, generalist research, government and policy expertise, and AI/machine learning expertise… Leaders thought the key bottleneck for the community is to get more dedicated people (e.g. work at EA orgs, research in AI safety/biosecurity/economics, ETG over $1m) converted from moderate engagement. The second biggest is to increase the impact of existing dedicated people through e.g. better research, coordination, decision-making.” — 2018 survey
In 2018, surveyed orgs ranked Operations, Management, GRs, and AI technical expertise as the top 4.
2018 survey includes:
80,000 Hours (3), AI Impacts (1), Animal Charity Evaluators (2), Center for Applied Rationality (2), Centre for Effective Altruism (2), Centre for the Study of Existential Risk (1), Berkeley Center for Human-Compatible AI (1), Charity Science: Health (1), DeepMind (1), Foundational Research Institute (2), Future of Humanity Institute (2), GiveWell (1), Global Priorities Institute (2), LessWrong (1), Machine Intelligence Research Institute (1), Open Philanthropy Project (4), OpenAI (1), Rethink Charity (2), Sentience Institute (1), SparkWave (1), and Other (5)
-
Well, if you think of OpenAI, yes, they seem to have billions in investment to burn, but just last month MIRI came to me asking for money in December as they could not meet some $1m goal or something… ↩ ↩2
-
“The world desperately needs AI strategists”, where Miles Brundage talks to Rob about AI strategists. Here Miles claims that AI strategy is a growth area and that jobs are currently “few” and “pretty competitive”. Just as with the growth of AI safety, growth is expected in this space as well in a couple of years. Whether the adjective “desperate” was warranted is a separate debate, but the message seems to be that there is a LOT of TC (at least based on the title). ↩
-
I think it is an error and that they were clearly not talking ↩ ↩2
-
EA has been and is talent constrained according to surveys made by 80khours and CEA since 2017. Several organizations seem to think so in these surveys: 2017 survey, 2018 survey, 2019 survey. In all the surveys, EAOs on average claim to be more Talent Constrained than Funding Constrained. For example, in 2019 EAOs reported feeling more (3 out of 5) Talent Constrained and less (1 out of 5) Funding Constrained¹. More details in footnote 2.
Already since 2015, 80khours has been suggesting that we should focus on providing talent to the community rather than ETG, in “Why you should focus on talent gaps and not funding gaps”. They make the case that if someone can set up a charity that meets GiveWell’s criteria, they would have access to tens of millions of dollars. Another example 80khours gave was AI Safety: the funds were sufficient per Open Phil’s evaluation, and there were people ready to donate even more, but they thought there wasn’t enough of a “talent pool” (back in 2015)³.
In June 2017, in “Working at EAO”, they quote the survey to inform people that EAOs are TC. Around the same time, a podcast is titled “The world desperately needs AI strategists”, where Miles Brundage talks to Rob about AI strategists. Here Miles claims that AI strategy is a growth area and that jobs are currently “few” and “pretty competitive”. Just as with the growth of AI safety, growth is expected in this space as well in a couple of years. Whether the adjective “desperate” was warranted is a separate debate, but the message seems to be that there is a LOT of TC (at least based on the title).
As late as August 2018, 80k can be seen saying that we need people to work on AI safety, biorisk, EA, GPR, nuclear security, and institutional decision-making.
Why did we choose these categories (Research, Govt policy, effective non-profits, ETG)⁵? Why do we especially highlight research, policy and non-profit jobs; deprioritize earning to give; and omit advocacy and entrepreneurship?
In brief, we think our list of top problems (AI safety, biorisk, EA, GPR, nuclear security, institutional decision-making) are mainly constrained by research insights, either those that directly solve the issue or insights about policy solutions. — High Impact Careers Aug 2018
In Nov 2019, 80khours tries to clear up the “confusions” we create when we talk about “talent gaps”.
Rather than funding vs. talent gaps, we propose that people aim to identify specific bottlenecks facing the field and the skills needed to resolve them. A ‘bottleneck’ is the resource that a field most needs in order to make progress.
Ok, what if a cause is bottlenecked by a specific skill? Let me think… ah, Talent Constrained. It appears the term is not confusing as per their initial definition. But somehow, somewhere along the way… wait, I am not sure what happened. Let’s read that article again and see where that gets us…
Today we usually recommend that people who are a good fit for filling these bottlenecks treat them as their first priority. This usually means initially considering relevant jobs in research, top non-profits and policy, and if you’re willing to consider something especially competitive, our list of priority paths.
In contrast, we rarely think that earning to give should be the top priority for people who could be a good fit for these other roles. This is another idea we hoped to highlight by talking about ‘talent constraints’.
However, we also recognize that our priority problems aren’t ‘talent constrained’ in general, and our priority paths require a fairly narrow set of skills. So, we continue to recommend building career capital and earning to give as a high impact option for people whose skills don’t match the particular constraints currently faced by our priority problems.
What I think Talent Constrained means
My thought process was: EAOs are talent constrained when there are not enough capable people to work at them; there is a lot of demand and the supply is really low.
80khours defines it:
Lack of people with specific skill set.
I am going to try and argue here that, for the majority of people, EA is not constrained by talent.
First, we need to be clearer than 80khours, so we talk about the lack of a talent constraint in 80khours’ suggested top career paths:
- AI strategy and Policy research
- AI safety technical research
- Grant maker focused on top areas
- Work in effective Altruism orgs
- Global priorities researcher
- Bio-risk strategy and policy
- China Specialists
- Earning to give in quant trading
- Decision making psychology research and roles
People sometimes act as if the main alternative to earning to give is working at an ‘effective altruism non-profit’. However, this misses many types of high impact roles including those in academia, policy and relevant companies, which could absorb far more people. Our recent survey showed that roles in policy are highly valued, as are research positions that could be done within academia.