The people I know who work in life sciences R&D (basically anything bio) have had their funding absolutely annihilated. PhDs with 20 years of experience working second jobs as substitute high school teachers, lab workers taking up tech support positions paying a fraction of what was already terrible pay.
What's worse is that in most of these fields, you don't really even start working until after your PhD.
4 years is going to be a long time to underfund what's basically 4 entire classes of researchers coming out of Doctorate programs. It might take decades to recover our research programs.
This is why I became a teaching professor. My employment and promotion are not conditioned on how much money I bring in and what I publish. But I still get to spend 4 months of the year doing research that's important to me. I don't publish as often but when I do, it's substantive work.
I've seen too many promising academic careers torched at the six-year mark because they had unfundable ideas. With this new administration, we see how "fundability" and "good, important research" are often at odds and can change as quickly as the political winds.
When I was in grad school, it was over drones, and the politics was within the FAA and its shifting definitions of what an "unmanned aerial vehicle" technically was. Recently you wouldn't get funding if you didn't have the word "equity" in your proposal. Now you don't get funding if you do have the word "equity" in your proposal. New boss, same as the old boss.
Heaven forbid you were researching a suddenly-verboten topic; your entire career is torched. I just didn't want to tie my career to that kind of capriciousness.
This was true when I was a grad student, decades ago. It was true when I worked in a lab as an undergraduate before that.
Specifics of the current environment aside, welcome to academic life. Unless you are one of the exceptionally fortunate few to have a permanent fellowship of some sort (e.g. Howard Hughes), your primary job as a research professor is to raise funding.
It really depends on what you mean by "decades", but I've been in the system for a generation and what you're saying doesn't match what I see on the ground.
During the doubling of the NIH budget under Clinton and Bush the younger, times were great. Afterward, budgets stagnated and things were harder, but there was still funding out there. The disruption we're seeing now is a completely different animal: program officers are gone, fewer and less detailed summary statements go out, some programs are on hiatus (SBIR/STTR), and if you had something in the pipeline it was wasted time, &c. NSF is a complete train wreck.
My startup had an STTR in for the last cycle and we can't talk to the program officer about our summary statement, nor can we resubmit, nor are we likely to be funded. That's a lot of lost time and money for a startup that, since we're atoms and not bits, is funded on a shoestring budget. The only time something like this happened in my memory was the shutdown in 2013 and that wasn't even close to the disruption we're seeing now.
I was also in science during Clinton, and what I’m saying was true then. The increase in funding went hand in hand with a massive increase in people seeking funding. So maybe there was some golden era of happy times when nobody had to chase grants, but it hasn’t been in my lifetime.
But again, I explicitly said that my point was independent of recent changes in funding. I am no longer in science, but it seems to be true that funding has declined. That doesn’t mean that chasing grants is something unprecedented for scientists to be doing.
The Clinton era was the golden age for life sciences (can't speak to others), and it's been a decline since then, either stagnation or a sharper downturn. Now? Complete operational collapse, a completely different animal altogether, and it's not one agency, it's all agencies. You seem to be saying that chasing grants is not unprecedented, which has been true since Galileo and the patron system, but that isn't a profound observation; it's the status quo. What I and others on the ground are saying is that this is a sudden and profound shift, with committed funding pulled, applications in process effectively frozen, and new awardees simultaneously decimated, in a way that makes it impossible to sustain the basic and translational research enterprise. And outside of the feds, there isn't a viable source of patient capital to turn to at the scale we've been operating.
Yes, I understand your claim that things are tighter now; I've repeatedly acknowledged that fact, and in any case, I have no personal basis to dispute the argument. But again, that's not related to the point I'm making.
One last time: OP was complaining that the group has to spend all of its time raising funding, but that's always been true in my lifetime. There's never been a magical age where being a PI (or even a senior lab member) wasn't a perpetual process of raising funds, and anyone going into science should know this. Hence my comment: welcome to academia.
For whatever it's worth, this is basically reason #1 that most PhD grads I know voluntarily jumped off the hamster wheel. Anyone who gets a PhD and expects to be doing labwork as a PI is deeply deluded, and it needs to be shouted for the folks in the back: you are signing up for a lifetime of writing grants, teaching classes, and otherwise doing bureaucratic schleps. The current administration did not suddenly make this true.
I read SubiculumCode's post in the same context as bane's, speaking to the current environment.
You're saying that a group having to spend all of its time fundraising has always been true in your lifetime and you link it to your time as a grad student decades ago and earlier when you were an undergrad. Do I have that right? The dominance of fundraising might have been true for your specific experience and viewpoint, but I don't understand your basis for claiming it was universal: it certainly wasn't my experience (R1 engineering, not software) nor my colleagues around that time.
Complaints about fundraising and administrivia have always been plentiful, but actual time spent on teaching, service, and research was dominant, with the expected proportions of the three-legged stool varying by role and institution. What SubiculumCode and bane and I are reacting to now is the dramatic shift in how dominant (because funding has been pulled and funding allocation methods have suddenly shifted) and unproductive (fewer summary statements, less or no feedback from SROs and POs, eliminated opportunities for resubmission) that work has become. The closest I can remember to the current situation was the aftermath of the 2008 recession and the 2013 government shutdown, and that pales in comparison to the disruption now.
> You're saying that a group having to spend all of its time fundraising has always been true in your lifetime and you link it to your time as a grad student decades ago and earlier when you were an undergrad. Do I have that right?
I mean, yes...but everyone on this thread admits that it's still true (in fact, worse today), so I'm not sure what point you're making with this. Y'all are arguing that it's worse now, which is not a claim I am disputing [1]. The entire point of citing my "old" experience is that, in fact, we were all doing the same stuff back in the stone ages. I also haven't forgotten or misremembered due to my advancing age [2].
> The dominance of fundraising might have been true for your specific experience and viewpoint, but I don't understand your basis for claiming it was universal: it certainly wasn't my experience (R1 engineering, not software) nor my colleagues around that time.
OK. I never said my experience was universal. I was in the biological sciences, not engineering. To be clear, I'm not claiming experience in economics or English literature, either.
Again, I don't dispute that things might be worse today, but the situation is absolutely not new, and any grad student in the sciences [3] who expects otherwise has been seriously misled. That is my point.
[1] To be clear, I'm not saying it is or isn't worse today. I am making no claim with regard to the severity of the fundraising market. The market can be a bajillion times worse than when I came up, and my point is still valid -- back then, professors spent nearly all of their time chasing money! Today, professors spend nearly all of their time chasing money!
[2] This is a joke. I'm not old, and my experience is not as ancient as you're implying. I understand that every generation clings to the belief that their struggles are unique in time, but it's probably a bad idea to take that notion seriously.
[3] Yes, I made the general claim "in the sciences". Because insults about age aside, and even though the specifics will vary from year to year and topic to topic, it's very important to realize that if you become a professor in the sciences, this is what you will be doing. You will not be in the lab making gadgets or potions or whatever -- you will be filling out grants, making slide decks, reviewing papers, and giving talks. If you cannot handle this life, quit now. It will not get better.
There are certainly ways to go work in a lab and do "fun stuff" forever, but a) you often don't need a graduate degree for these, and b) you shouldn't be deluded about which path you're on.
But clearly there was some science going on. Any time spent writing grants rather than doing research feels wasteful, but it's the way to get funding. The percentage of time spent doing that is changing, and the percentage of grant applications that get funded is going way down, demonstrating a big change in the amount of effort that goes directly to waste. Unfunded grants are not evidence of bad research that does not get funded, but merely of the funding level.
Science gets done by the people you hire with the money you raise. And yes, everyone in a group is always thinking about the next grant.
I’m not joking. I’m not exaggerating. This is the job, and it’s always been this way (at least in my lifetime). Maybe it’s worse because of the current administration, but complaining that academic life is mostly about grant writing is like a fish complaining about water.
I really wish people would stop trying to gaslight all of us into believing the current crisis is just business as usual.
Yes, previous US presidents told some lies.
Yes, previous US presidents and politicians had some unsavory associations or potential conflicts of interest.
Yes, previously some labs spent too much time writing grants and not enough actually doing research.
The problem is, these things are becoming the norm now, and your anecdotal memory of "aw, man, we spent all our time doing that back in the day!" is not a reliable indicator that nothing has really changed and that we should just stop complaining. Especially since we know that human memory is not only fallible, it is specifically better at remembering the exceptional and the unpleasant.
Actually a PhD is a con, not a bonus if you want normal jobs.
If a private lab needs a chemist or biologist for, say, quality assurance, one of the most common jobs in the field, then private companies prefer fresh graduates:
- they cost much less
- even if the PhD would be fine with the pay, he/she will still be passed over in favor of a fresh graduate, because the person is overqualified and will jump to something more related to his/her field as soon as possible.
Thus these people's CVs are genuinely worse for anything unrelated to their skill set.
I haven't been on the job market as a new PhD in (my god) nearly 20 years now, but at the time I was looking for work, having a PhD on my resume was the only reason I was able to snag interviews at Apple/Google/McKinsey/Bain/Twitter/etc. I never did anything related to my actual degree, but it certainly opened doors for me.
You picked an example that supports your conclusion by mentioning QA jobs, which typically don't require a PhD. There are still plenty of other jobs that do require a PhD, so I don't see what the point is there.
More fundamentally this mentality of looking at education only through the lens of financial return is just so disappointing. Of course your country is self-sabotaging its science system if it's full of people who think that way.
I can pretty safely say that for me and most people around me, when we got our PhDs, what job we'd later get really wasn't the primary concern.
We wanted to work on interesting problems at the frontier of what's known (and maybe also get a job doing that later).
I'm just talking about my experience as a former researcher.
If you spend 10 years of your life working on dye-sensitized solar cells and perovskites, the number of positions for those roles in your area/country might be limited or nonexistent, and at the same time you may no longer find any funding at your current position.
Thus you need to look for jobs outside your sphere of competence, and for those your PhD may not be that useful, if not an outright negative.
I have a friend who has a PhD in applied mathematics and has spent the last 5 years of his life on deep learning and machine learning problems. He's applied to several positions as an ML researcher, and his CV is often not considered due to the lack of professional, non-academic experience.
And we're talking about the booming ML sector, for someone who understands the ins and outs of the math and architecture behind the models (area: UK and northern Europe).
> Actually a PhD is a con, not a bonus if you want normal jobs.
Depends on the market, which is true for any field. In places where there's a lot of technical work to be done, employers can hire PhDs and will do so if there's a local supply.
>4 years is going to be a long time to underfund what's basically 4 entire classes of researchers coming out of Doctorate programs. It might take decades to recover our research programs.
It's very optimistic to think that this madness is going to end in four years.
An average NIH R01 grant is about $600,000 per year for ~5 years. Forgoing a $100M student center would net you 33 projects. For reference, Stanford had 1,000 ongoing projects for FY 2025.
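Back-of-the-envelope, taking those figures at face value (this is my own arithmetic, not anything from NIH reporting):

$$ \frac{\$100\text{M}}{\$600\text{K/yr} \times 5\ \text{yr}} \;=\; \frac{\$100\text{M}}{\$3\text{M per project}} \;\approx\; 33 \text{ projects} $$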
Most of that "grift" goes to salaries for professors, staff, for the very expensive lab space, pensions and health care for the professors, etc.
These rates are all highly negotiated and highly justified down to details. The average professor may not know how much overhead goes into actually running lab space and paying for all the infrastructure that's necessary for research, but it's not insubstantial.
People who know nothing about that side of the business, even professors at universities, say "that's outrageous, let's cut it" without even understanding where the money goes. It's a very DOGE view, and a disastrous one to act on without first understanding the particulars.
I don't know how to comment on this, considering you don't seem to know that the majority of research staff salaries in highly successful labs are paid entirely through grant money.
"administrative grift" as you call it is on top of awarded amounts, not a part of it. If the University is forced to spend all $3M themselves and also forego the operating overhead, what you'll get isn't more projects but fewer projects and also smaller, less capable research organizations.
Which is what some people want, but other people recognize that more research, bigger projects, and large, world-class academic organizations capable of conducting it are part of maintaining strong national security. Such activities are not cheap, and they are not profitable either, but because they are crucial for national security, it's the government's prerogative and obligation to help fund them, even if you consider it grift.
The increase in F&A rates is due to the facilities portion, which in the "before times" was negotiated every 4 years with DHHS and had concrete data in the negotiation process to help ensure it was fair. The admin portion for universities has been capped at 26% since 1991.
I see comments like this where destructionists put their simplistic bullshit on full display, and it reminds me to go back and upvote the article. HN is one of the few places where this feel-good nonsense actually gets rejected, giving us the possibility of discussing how to move past this societal mental illness.
> HN is one of the few places where this feel-good nonsense actually gets rejected
Something I learned a long time ago is that it doesn't matter how well you argue a point with a nincompoop, they will simply shrug and repeat their horseradish verbatim in the next thread, hoping that next time they don't attract an audience with as much critical thinking. Unless you are willing to waste as much time as they are arguing on the internet, it's a fruitless endeavor.
It's really up to the moderators of a social space to keep bad-faith nincompoops out, and Hacker News has shown itself to be complicit and unwilling to do what is necessary to prevent its own enshittification. At this point, this place is just Reddit with tone policing and a nuclear downvote button.
The way I think about it is that the person I'm arguing with online is not really the person I'm trying to persuade; I'm trying to persuade the rest of the people reading.
The tech community was the source of the largest threat to American science in a century. As cheesy as it sounds, I think it's my duty to counter the lazy talking points that otherwise go unaddressed in these circles.
> I'm trying to persuade the rest of the people reading.
That does help, and is part of the reason I myself engage with these folks from time to time, but it requires discipline to recognize when you're throwing good effort after bad.
You want to give your voice the greatest chance of being seen. Strategically responding to upvoted bad faith in a highly visible thread is a good idea. Keeping an argument alive 5-6 levels deep in a subthread that was already flagged is less so.
The mods here are worse than complicit. Dang in the past has allowed threats of violence while warning/deleting/banning petty name calling in the responses. It’s frankly disgusting.
Hacker News is Reddit with a tech-supremacy mindset.
More nonsense - indirect costs fund shared facilities, equipment, supplies, and data resources. To the extent that there is bloat, it funds the compliance that they are required by law to do. I would support simplifying this to reduce regulatory cost; I do not support paranoid whining.
The ratio has just been going up and up and up, and to suggest it pays for "equipment, supplies and data resources" is a bad joke, considering the people doing the work end up saddled with yet more administrative bloat in the form of hostile, complicated processes for accessing the funds to buy the very equipment and supplies that enable the research.
That's because organizations get bigger as projects become more complicated and varied. Larger organizations require more overhead as a percentage of the operating costs. 30 years ago many schools didn't even have Computer Science departments. Today schools are now starting to stand up Artificial Intelligence departments. It's not cheap to maintain these organizations.
Anyway, it really comes down to a simple tension: you can have big science, good science, or cheap science. Choose two.
For a long time we've optimized for big and good. This has yielded dividends in terms of science and technology output, but it's very expensive. Yet, the ROI is decidedly, emphatically positive.
For some reason people seem to think we can do this all cheaper, somehow, by pulling funding and making all these organizations smaller. I don't see how this is possible, because it relies on an uncanny ability to predict which projects will succeed and fail ahead of time.
What I think will happen is the money will dry up, the talent will go to places that want to spend the money, and the remaining programs will be cheap, small-stakes research better suited for the 20th century, unable to compete with countries that actually want to invest in the future.
I don't believe the feeling is that it can be done "cheap", but rather that there is an inevitable pull toward privatizing functions previously handled by government. This drives toward a more immediate profit motive, which inevitably pushes research toward a more applied focus.
Who's doing the research? Who's benefiting?
VC backed companies are often slotted in as the for profit version of academic R&D without the "encumbrances" of non-profitable blue-sky pure research.
Which tuition are you referring to? Nameplate tuition is like the sticker price on a new car; few to no people pay it. Net tuition is the number that actually matters, and it's been largely flat the last 8 years.
I don't know the figures for large universities, but at the small liberal arts college I graduated from and the one I've worked at for the last 15 years, the average figure for "full pay" students—which, as the name suggests, is the students who pay, or whose families pay, the full sticker price, either directly or through loans—has generally been between 46% and 53%.
Now, if you have figures showing that what you claim is true on the whole across all of US higher education, please, by all means, post the links. I'm genuinely interested to know just how different it is with the larger universities.
So you're saying academics use the same opaque market practices as, e.g. health insurance? Yeah all the more reasons to cut funding. If they have nothing to hide they have nothing to fear with transparency.
You seem to have no interest in transparency or understanding, but answer everything with "cut the universities" no matter what.
If differential pricing based on ability to pay is a reason to destroy something, then we had better destroy 90% of B2B. But it's not a reason, you're just parroting the same desired end result no matter what is actually said about universities.
Overseas students are not immigrants. They are on student visas (and most likely from very wealthy families... at least most of the ones I knew at Purdue were).
It is in the United States' best interest to retain the best students as they graduate and to create a system that promotes the path from student visa to green card to naturalization, but only a very few manage it.
Mostly, foreign students are price gouged by our universities to prop up a failing business model and make it more difficult for citizens to afford higher education.
Sure, it's in the United States' interest to retain the best foreign students (and in many students' interest to study in a country which will permit them to live and work there after their studies). That doesn't mean the current administration is necessarily inclined to act this way.
International student enrollment is down 17% this year, because the administration chose to take a broadly similar approach to student visas as it did to immigration, with a "pause" on interviews and lots of revocations, plus of course the concern that their lawful student visa status isn't a guarantee they won't get taken off to processing centres by ICE thugs with quotas to hit. Other bright ideas the administration has proposed include a four-year student visa limit to rule out the possibility of completing a PhD in a normal time frame. That's gonna hurt universities using foreign students to prop up their business, and citizens who'll have to pick up their tab instead if they want their courses to continue...
One reason foreign enrollment is declining is concern about (mainly) Chinese espionage. That’s entirely reasonable, given the vast amount of stolen engineering and research…
That is the mind hack. People will always assume that the administration has the United States' best interest in mind. If people can drop that assumption, they might begin to understand the firehose of seemingly erratic policy.
The US is a resource to be stripped, the interest in mind is self-interest. "Make us great again!" Back to the gilded age, whatever it takes.
> It is in the United States best interest to retain the best students
Yeah? Tell that to the US government.
As it stands, foreign student enrollment has dropped precipitously year-on-year. The international students are scared, and with good reason.
If ICE happens to roll up to campus, do you really think they'll be checking each student's visa status? Not on your life. They'll just round up everyone who doesn't look white enough, and if they're very, very lucky, they might just get sent back home in a speedy manner. If they're not, they'll get put in camps for indeterminate amounts of time, denied any access to the legal system, and treated worse than animals.
They need to cut funding until academia stops gamifying the research process. Aka cheating. It's bizarre to hear the stories that come out of this twisted world and then seeing them expect to keep getting paid the same.
Whenever I have dug into views like these, I have found that they are not rational views based on first principles; they are about carrying out a culture war, something best captured by a very odd phrase I first heard here on Hacker News: "elite conflict."
Destruction of scientific research is viewed as a positive win for the culture war. The particulars, what's actually happening with science, is completely secondary to discrediting the institution as a whole.
It's bizarre to hear the words that come out of this administration's mouth on... Almost any topic, and then see an actual person actually arguing that anything those people say or do needs to be defended.
Have you considered holding it to the same standard you want to hold your enemies to?
Not everyone is a tribalist. I don't have to agree with everything Trump does. Do you not know that? Or does the world feel safer when you split people into simple categories?
Academia in particular loves to push one-track thought and cancel culture. Hard to believe it used to be a place for diverse thought and DIScourse. Now if you disagree with the groupthink you are a racist. It's a very 2001 George Bush "you are either with us or against us" culture that absolutely deserves to have its funding cut.