Blood Music was way cooler than this, just saying.
I begin to believe that some people literally do not have senses of humor with which to distinguish impossible statements meant nonseriously from seriously.
“It’s everyone else’s fault they don’t recognize me as a genius,” said the dork ass loser
Ah yes, my favourite joke structure: all setup with no discernible punchline. Especially if the humour requires forensics.
I like the way he thinks the lack of punctuation in his “joke” is the tell that it’s a joke.
He’s also apparently never heard the aphorism that if you have to explain the joke, it’s probably not that funny.
I like deadpan humor and often have to clarify that some quip was a pun, a reference or sarcasm, but I don’t blame the listeners whenever they don’t get them.
If I were a self-identified contrarian habitually posting controversial hot takes in flowery prose, I’d hope to be a little less belligerent and defensive if people mistake an ironic joke for a sincere belief.
It sometimes hurts that people believe you’d actually mean the dumb joke you said but you either have to suck it up and take the L or start marking up your irony.
If you’re reading this, here’s a reminder to give your eyes a break from screens. If you like, you can do some eye stretches. Here’s how:
- Read any of Yud’s tweets
- Close your eyes
- Let your eyes roll from processing whatever drivel he’s written. Try for about 30 seconds.
To unpack the post a bit:
So my understanding is that Yud is convinced that the inscrutable matrices (note: just inscrutable to him) in his LLM have achieved sentience. In his near-future world where AI can exert itself in the physical world at will and, in particular, transfer data into your body, what possible use does it have for a bitcoin? What possible benefit would come from reprogramming human DNA beyond the intellectual challenge? I’ve recently been thinking about how Yud is supposedly the canonical AI-doomer, but his (and the TESCREAL community in general’s) AI ideation is rarely more than just third-rate, first-thought-worst-thought sci-fi.
also:
“people keep on talking about… the near-term dangers of AI but they never come up with any[thing] really interesting”
Given the current public discourse on AI and how it might be exploited to make the working class redundant, this is just Yud telling on himself for the gazillionth time.
also a later tweet:
right that’s the danger of LLMs. they don’t reason by analogy. they don’t reason at all. you just put a computer virus in one end and a DNA virus comes out the other
Well, consider my priors adjusted: Yud correctly identifies that LLMs don’t reason. Good job, my guy. Yet somehow he believes today’s LLMs can still spit out viable genetic viruses. Last I checked, no one on stack overflow has cracked that one yet.
Actually, if one of us could write that as a stack overflow question, maybe we can spook Yud. That would be fun.
Did he always show his ass so much when tweeting or is this a recent development?
I love how “what if LLM but it makes your DNA mine bitcoin” is the culmination of untold amounts of dollars in MIRI research grant money. Real effective altruism is when you tithe 80% of your income in perpetuity just so sneerclub can have more content.
He always had a tendency to be wrong, but this is going right into not even wrong territory.
The LLM modifies your DNA to create a biological wifi transmitter! (He isn’t saying this, but that is what would be required, and not just that but so much more: a whole DNA equivalent of a network stack, cryptography, getting rid of waste heat, etc. He just believes AGI is magic and that LLMs will become AGI; he has lost his mind. I mean, look at how he dismisses somebody going ‘nice science fiction story brah’ with IT IS LLMS!)
Ignore, for a moment, the implication that the virus could rewire my guts into an LTE modem or brainwash me into reading and typing out entire bitcoin transaction blocks. Yud posits the ability to freely mutate humans to an arbitrary extent, and the supervillain plan he comes up with is a fucking cryptocoin miner?
How does someone this creatively bankrupt produce 660 thousand words of fanfic?
Not to dehumanize, but are we sure Yudkowsky isn’t an LLM himself?
God, you sneerclubbers are never satisfied. Last time he invented biological bacteria factories creating diamondoids which killed every human alive at the same moment and it wasn’t realistic enough, and now he creates a bitcoinminerbrain and suddenly he isn’t creative. Whaddayawant?!
But yeah, it really is not that creative, but then again, most ‘agi kills everybody’ stories are already a bit done.
The concept is just another grey goo scenario rehash but I grant that “diamondoid bacteria” is a striking name for it.
It just reminds me of the “diamondilium” in the Futurama movie that proved the Futurama writers didn’t know how to sustain a whole movie.
The new one I put up has him whining about how little they got!!
I’m not even sure I’d trust those numbers, both because he’s not a reliable narrator and because artificially keeping NPO numbers low and moving the bulk of the money through another channel is a fairly well-known game.
Also, the math doesn’t really check out for… checks notes… SFBA existence, so there would be even more questions to ask there.
Small detail: biological viruses are not even remotely similar to computer “viruses”.
that’s where the LLM comes in! oh my god check your reading comprehension
Uh-huh, and an LLM trained on video game source code and clothing patterns can invent real-life Gauntlets of Dexterity.
Why exactly is he so convinced LLMs are indistinguishable from magic? In the reality where I live, LLMs can only sometimes produce a correct function on their own and are not capable of reliably transpiling code even for well-specified and well-understood systems, let alone doing comic book mad scientist ass arbitrary code execution on viral DNA. Honestly, they’re hardly capable of doing anything reliably.
Along with the AI compiler story he inflicted on Xitter recently, I think he’s simply confused LLM and LLVM.
For decades he has built a belief system where high intelligence is basically magic. That belief is needed to power his fears of AGI turning everything into paperclips, and it has become such a load-bearing belief (one of the reasons for it is a fear of death and grief over people he lost, so not totally weird) that other assumptions get stacked on top of it. For example, we know that computers today are pretty limited by matter; especially the higher-end ones need all kinds of metals which must be mined, etc. So that is why he switches his fears to biology, because biology is ‘cheap’, ‘easy’ and ‘everywhere’. The patterns in his reasoning are not that hard to grok. That is also why he thinks LLMs will lead to AGI (to him they are clearly at the start of their development, not the end, “it is like the early internet!”; personally I think we are mostly at the end and will just see a few relatively minor improvements, no big revolutionary leap): on some level he needs this.
Men will nuke datacenters before going to therapy for grief and their mid life crisis.
What I don’t get is, ok, even granting the insane Eliezer assumption that LLMs can become arbitrarily smart and learn to reverse hash functions or whatever because it helps them predict the next word sometimes… humans don’t entirely understand biology ourselves! How is the LLM going to acquire the knowledge of biology to know how to do things humans can’t do when it doesn’t have access to the physical world, only things humans have written about it?
Even if it is using its godly intelligence to predict the next word, wouldn’t it only be able to predict the next word as it relates to things that have already been discovered through experiment? What’s his proposed mechanism for it to suddenly start deriving all of biology from first principles?
I guess maybe he thinks all of biology is “in” the DNA and it’s just a matter of simulating the ‘compilation’ process with enough fidelity to have a 100% accurate understanding of biology, but that just reveals how little he actually understands the field. Like, come on dude, that’s such a common tech nerd misunderstanding of biology that xkcd made fun of it. Get better material.
What’s his proposed mechanism for it to suddenly start deriving all of biology from first principles?
He considers deriving stuff from first principles to be much more versatile than it actually is. That, and he really believes you can simulate your way to anything.