• 5 Posts
  • 17 Comments
Cake day: July 4th, 2023


  • I don’t disagree, but Windows’ built-in screen casting is hard to find and clunky to use. Linux is even worse off: until earlier this year there was no real support from any Linux desktop environment. There’s a GNOME project that’s supposed to add support, announced to ship with GNOME 46, but I’m not a GNOME user, so I just tried installing the flatpak on my Kubuntu machine. It detects my TV but fails to connect to it. Definitely still needs work.


  • Some of that focus involves adding features that have become table-stakes in other browsers.

    Speaking of this, does anyone else feel like Firefox’s inability to wirelessly screencast is a major problem when it comes to convincing others to switch away from Chromium browsers? I know Chromecast and AirPlay are both proprietary, and therefore counter to Firefox’s open-source philosophy, but they could at least implement first-party support for Miracast (or DLNA?). A surprising number of smart TVs work well with those protocols; they just tend not to advertise it because most people don’t know what they are.

    I admit I haven’t looked much into this since I first switched to Firefox as my main browser some years ago, but at the time I found there weren’t even any decent add-ons for screen casting. I’ve learned to live without it, but I know a lot of people who use that functionality daily and could (quite justifiably) never be convinced to switch without an equivalent.



  • I agree. The concept is simple, and while it’s not perfect, it isn’t dumb either. This is basically recreating how coal and oil got into the ground in the first place: plants absorbed carbon from the air as they grew, then got buried in a way that prevented them from decomposing and re-releasing it into the atmosphere. My main question is whether burying it only 10 feet underground is really enough for long-term storage.

    The other elephant in the room with carbon capture is that it can be a convenient excuse for companies to avoid actually decarbonizing their operations. If, as the article suggests, it’s used primarily by industries like cement making that don’t currently have a path to carbon neutrality, then it’s a good thing. If it’s just used as cynical greenwashing by companies that could be doing better, then it’s at best a wash, and arguably a net negative.






  • Out of curiosity, what software is normally run on your clusters? From my reading, it seems some companies run clusters for business purposes, e.g. an engineering company might use one for structural analysis of its designs, or a pharmaceutical company might simulate the interactions of new drugs. I assume in those cases they’ve bought a license for some kind of high-end software that’s been specifically written to run in a distributed environment. I also found references to software libraries meant to support writing programs for this environment. I assume those are used more by academics who have a very specific question to answer (and may not have funding for commercial software), so they write their own code that’s hyper-focused on their area of study (rough sketch of the kind of thing I mean below).

    Is that basically how it works, or have I misunderstood?
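
    To make that concrete, here’s a minimal sketch of the rank-based style those libraries encourage. I’m using mpi4py purely as a stand-in; the library choice and the toy workload are my own illustration, not anything from this thread:

    ```python
    # Minimal MPI-style sketch: every node runs this same script, and each
    # process uses its rank to claim an independent slice of the work.
    # Launch across the cluster with e.g.: mpirun -n 16 python sum_squares.py
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()   # this process's ID (0..size-1)
    size = comm.Get_size()   # total number of processes

    # Each rank sums the squares of its own stride of 0..999,999.
    local = sum(i * i for i in range(rank, 1_000_000, size))

    # Combine the partial sums on rank 0; this step is where the cluster's
    # fast interconnect actually matters.
    total = comm.reduce(local, op=MPI.SUM, root=0)
    if rank == 0:
        print(f"sum of squares: {total}")
    ```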


  • This actually came up in my research. Folding@Home is considered a “grid computer.” According to Wikipedia:

    Grid computing is distinguished from … cluster computing in that grid computers have each node set to perform a different task/application. Grid computers also tend to be more heterogeneous and geographically dispersed (thus not physically coupled) than cluster computers.

    The primary performance disadvantage is that the various processors and local storage areas do not have high-speed connections. This arrangement is thus well-suited to applications in which multiple parallel computations can take place independently, without the need to communicate intermediate results between processors.
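
    As a toy illustration of that last point, here’s a sketch of an embarrassingly parallel workload: fully independent tasks with no intermediate results passing between workers, which is exactly what a grid tolerates well. The workload itself is made up for the example:

    ```python
    # Grid-style workload sketch: each task is self-contained, so it makes
    # no difference whether the workers share a fast interconnect or sit
    # on different continents. Nothing passes between tasks mid-flight.
    from multiprocessing import Pool

    def work_unit(seed: int) -> int:
        """Stand-in for one independent job (think: one protein trajectory)."""
        x = seed
        for _ in range(100_000):
            x = (x * 6364136223846793005 + 1442695040888963407) % 2**64
        return x % 1000

    if __name__ == "__main__":
        with Pool() as pool:
            # Results can come back in any order; no task depends on another.
            results = pool.map(work_unit, range(32))
        print(results)
    ```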



  • I’m not sure what you’d want to run in a homelab that would use even 10 machines, but it could be fun to find out.

    Oh yeah, this is absolutely a solution in search of a problem. It all started with the discovery that these old (but not ancient; most of them are Intel 7th gen) computers were being auctioned off for like $20 apiece. From there I started working backwards toward something I could do with them.


  • I was looking at HP mini PCs. The ones for sale used 7th-gen i5s with a 35 W TDP. They’re sold with a 65 W power brick, so presumably the whole system never draws more than that. I could run a 16-node cluster flat out on a little over 1 kW, which is within the rating of a single residential circuit breaker. I certainly wouldn’t want to keep it running all the time, but it’s not like I’d have to get my electrical system upgraded just to set one up and run it for a couple of hours as an experiment. The back-of-envelope math is below.
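
    Here’s that math spelled out. The 15 A / 120 V breaker and the 80% continuous-load derating are my assumptions about a typical US residential circuit, not numbers from the listings:

    ```python
    # Worst-case power check for the proposed 16-node mini-PC cluster.
    NODES = 16
    NODE_MAX_W = 65                 # ceiling implied by each node's 65 W brick
    BREAKER_W = 15 * 120            # assumed 15 A / 120 V circuit -> 1800 W
    CONTINUOUS_W = BREAKER_W * 0.8  # common 80% rule of thumb for sustained loads

    cluster_w = NODES * NODE_MAX_W  # 16 * 65 = 1040 W
    print(f"cluster worst case: {cluster_w} W")
    print(f"under breaker rating ({BREAKER_W} W): {cluster_w <= BREAKER_W}")
    print(f"under 80% derating ({CONTINUOUS_W:.0f} W): {cluster_w <= CONTINUOUS_W}")
    ```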







  • I read something a while ago that really put all these “ancient mysteries” into perspective: modern humans with modern brains have existed in our current form for at least tens of thousands of years. During that time we’ve made huge advances as a society thanks to the accumulation and sharing of scientific knowledge, but any individual human today has no more brainpower than one living 10,000 years ago.

    In other words, if we can sit around today and brainstorm a dozen different ways to build a pyramid with nothing but ramps and levers, there’s absolutely no reason to think that the smartest builders in ancient Egypt couldn’t have come up with the same ideas or better.

    Attributing these achievements to aliens, or divine intervention, or anything other than raw human ingenuity is a disservice to our ancestors.



  • I’m a big fan of upgradable hardware, but lately I’ve found that the bigger problem with Android phones is the lack of software support. I had my last phone for 5 years and finally upgraded not because of any major hardware problems, but because the Android version was so far out of date that I was missing out on major improvements, and some apps were actually starting to break. I picked my current phone specifically because Samsung was promising four major version upgrades, which is, unfortunately, industry-leading among Android OEMs despite lagging far behind Apple’s software support for its older models.

    Fairphone seems to have a mixed track record here. According to their website, the Fairphone 2 got five major updates (great!), but the Fairphone 3 got only one (bad), and the Fairphone 4 has received one update so far, with a second promised. After that they say they’ll try to provide two more, but they’re not making any promises because Qualcomm will have dropped support for the processor by then.

    This is, unfortunately, a very understandable position to take. The fact that Android OEMs rely on third parties like Qualcomm to design and support their processors is the major problem here. Big players like Samsung and Google can throw their weight around and squeeze out a year or two of extra support, but it’s not surprising that a small player like Fairphone finds itself in this position.

    The fact is that any sane company would rather make money selling new chips than spend it supporting old ones. This problem will persist until consumers start demanding longer software support and making it a major part of their buying decisions.