• 1 Post
  • 31 Comments
Joined 1 year ago
Cake day: December 16th, 2023




  • This guy has like a billion videos that are all some variation of “Here’s a tech-bro startup making a gadgetbahn, and here’s why it wouldn’t work and trains are a thousand times better”. Great that they exist, but since these startups never learn from each other’s mistakes and keep making the same missteps over and over, the videos get very samey after a while. Not sure what I would do in his position.



  • I left a comment that made a similar point with some data:

    4: Please stop sharing conspiracy theories

    5: Higher wages are useless if your country’s infrastructure and tax system is so piss poor that you need to spend more on basic necessities. We have economic metrics that account for some of this, such as the difference between income and discretionary income. Free-market propagandists always point to the US having high income, but the same cannot be said for discretionary income. For example, if we compare the US to the Netherlands, we see that the US median disposable income is 41k while in the Netherlands it’s 36k. But let’s compare how much you have to spend in your day-to-day life and calculate discretionary income from that:

    |                          | US    | Netherlands |
    |--------------------------|------:|------------:|
    | income                   | 41k   | 36k         |
    | food                     | 5.1k  | 3.7k        |
    | shelter                  | 13.2k | 13k         |
    | clothing                 | 1.2k  | 1.5k        |
    | transport                | 6.3k  | 3.4k        |
    | health                   | 3.2k  | 1.8k        |
    | student debt             | 2.1k  | 0.8k        |
    | **discretionary income** | **9.9k** | **11.8k** |

    As we see, the free-market capitalist’s case falls apart once we look at discretionary income, which is higher in the Netherlands thanks to its collectivist and social policies.

    EDIT: Scott has edited the post to make 4 seem less like an endorsement and more an ironic share. This is better, but I still prefer it if these things aren’t spread at all.

    EDIT 2: Source for the 2021 US-Dutch disposable income vs discretionary income (as well as a lot of other comparisons between median US and Dutch expenditure): https://www.moneymacro.rocks/2021-07-02-dutch-vs-america-middle-class/
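The arithmetic behind the table is just income minus the listed necessities. A quick sketch (figures in thousands of dollars, taken from the table above; the dictionary layout and function name are my own, purely for illustration):

```python
# Figures in thousands of USD, copied from the table above.
FIGURES = {
    "US": {"income": 41.0, "food": 5.1, "shelter": 13.2, "clothing": 1.2,
           "transport": 6.3, "health": 3.2, "student debt": 2.1},
    "Netherlands": {"income": 36.0, "food": 3.7, "shelter": 13.0, "clothing": 1.5,
                    "transport": 3.4, "health": 1.8, "student debt": 0.8},
}

def discretionary(country: str) -> float:
    """Income minus the listed necessities, rounded to the nearest 0.1k."""
    d = FIGURES[country]
    necessities = sum(v for k, v in d.items() if k != "income")
    return round(d["income"] - necessities, 1)

for c in FIGURES:
    print(c, discretionary(c))
```

Running it reproduces the bottom row of the table: 9.9k for the US and 11.8k for the Netherlands, despite the US starting with the higher income.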





  • I actually don’t find this a bad post, but I do want to point out that it got way more karma than any of titotal’s more critical posts, even though I find many of those better. This once again points to how the EA Forum’s voting-power-by-popularity karma system creates groupthink: being critical nets you less voting power than being laudatory, and it disincentivizes calling out bullshit in general.

    When Ives Parr of “Effective Altruism is when you want to spend money on genetic engineering for race-and-IQ theories” fame made a separate post complaining that that post got downvoted despite nobody giving a good counterargument, I wanted to comment and call him out on his bullshit. But why bother, when the karma system allows him and his buddies to downvote it off the frontpage while leaving you with less voting power? A lot of EA’s missteps are one-off blunders, but what makes the EA Forum’s “epistocratic” voting system so much worse is that it’s systematic: every post and comment is now affected by this calculus of how much you can criticize the people with a lot of power on the forum without losing power of your own, making groupthink almost inevitable. Given that people who have been on the forum longer have, on average, more voting power than newer voices, I can’t help but wonder if this is by design.









  • It made me think of epistemic luck in the rat-sphere in general. Him inventing and then immediately fumbling “Gettier attack” is just such a perfect example, but there are others in there, such as Yud saying:

    Personally, I’m used to operating without the cognitive support of a civilization in controversial domains, and have some confidence in my own ability to independently invent everything important that would be on the other side of the filter and check it myself before speaking. So you know, from having read this, that I checked all the speakable and unspeakable arguments I had thought of, and concluded that this speakable argument would be good on net to publish[…]

    As @200fifty points out:

    Zack is actually correct that this is a pretty wild thing to say… “Rest assured that I considered all possible counterarguments against my position which I was able to generate with my mega super brain. No, I haven’t actually looked at the arguments against my position, but I’m confident in my ability to think of everything that people who disagree with me would say.” It so happens that Yudkowsky is on the ‘right side’ politically in this particular case, but man, this is real sloppy for someone who claims to be on the side of capital-T truth.



  • While the writer is wrong, the post itself is actually quite interesting and made me think more about epistemic luck. I think Zack does correctly point out cases where I would say rationalists got epistemically lucky, although his views on the matter seem entirely different. I think this quote is a good microcosm of this post:

    The Times’s insinuation that Scott Alexander is a racist like Charles Murray seems like a “Gettier attack”: the charge is essentially correct, even though the evidence used to prosecute the charge before a jury of distracted New York Times readers is completely bogus.

    A “Gettier attack” is a very interesting concept I will keep in my back pocket, but he clearly doesn’t know what a Gettier problem is. In a Gettier case, a belief is both true and justified, but still not knowledge, because the usually solid justification fails unexpectedly. The classic example: you look at your watch, see it says 7:00, believe it’s 7:00, and it really is 7:00, yet this isn’t knowledge, because the usually solid justification “my watch tells the time” failed unexpectedly: the watch broke the last time it hit 7:00 and has been stuck there ever since. You got epistemically lucky.

    So while this isn’t a “Gettier attack” Zack did get at least a partial dose of epistemic luck. He believes it isn’t justified and therefore a Gettier attack, but in fact, you need justification for a Gettier attack, and it is justified, so he got some epistemic luck writing about epistemic luck. This is what a good chunk of this post feels like.