

I’m in the exact same position.



I would say Chocolatey and Scoop are pretty much interchangeable. I don’t remember why I landed on Scoop. Agreed that until recently there have been no package managers on Windows whatsoever.


I mean kinda. You have to use both WinGet and Scoop to cover all the use cases…


Well, in this case I think it’s a remnant of n++ predating any package manager on Windows. I do think that an embedded self-updater is better than having to download a new version through the browser.
It wasn’t entirely clear to me if the compromise affects those of us who installed it through scoop/winget, as the package manager should pull directly from the correct source, so the compromised updater shouldn’t matter. Reinstalled to be sure.


I think what sem is talking about is that Bungie first made “Marathon” in 1994. The new game is a pretty much unrelated game, reusing the name.


Absolutely! I’m certainly not going to insist that things should only be published on RSS feeds! Anything is better than the authoritative source for evacuation warnings being a Twitter account. Every platform has its place, I’m just a big fan of always starting with an RSS feed for announcements as the “root source”, then pushing changes to the feed to all other platforms.


Not really. RSS feeds are better for announcements in my opinion, as there’s no account associated, and the ways of viewing them are even more flexible and simple than the Fedi infrastructure.


My god the headline of the Fudzilla article is misleading.
Yes, they froze the samples of polymer, but the actual change to the technique is to increase the post-emulsion bake!
Well, first of all China does make lithography equipment (for instance, Shanghai Micro Electronics Equipment, who are currently at 28 nm). There are a couple of others iirc, and they typically got started by licensing lithography technology from Japanese companies and then building on it.
The issue is mostly one of economics – fabs want higher-resolution lithography as soon as possible, and they only buy it once, which means that the first company to develop a new litho technology takes the lion’s share of the revenue. If you’re second to the technology, or are more than half a dozen nodes behind like SMEE is, there’s not a lot of demand, because there are already fabs full of litho machines from when that node was new, and there isn’t as much demand for them anymore.
The issue with a new company making leading-edge nodes is the incredible R&D cost involved. Nikon, Canon, and ASML shared the market when they all started developing EUV tech, and it took ASML 15+ years to develop it! Canon and Nikon teamed up, spent tens of billions of dollars on R&D, and dropped out once they realized they couldn’t beat ASML to market, because there wouldn’t be enough market left for them to make their money back.
If you want to learn more about the history of the semiconductor industry, I recommend the Asianometry YouTube channel!


Yup, and the price of the Xbox Ally is ridiculous, as expected!


Openness is great, but there’s no financial reason to make specialized hardware to operate an open platform.
Historically, consoles have been sold near cost, and profits have been made on game sales after the fact. If you can just buy your games from Steam on console, the price of the console will go up. At some point, it no longer makes sense to buy the specialized hardware.
But we’ll get to see how that goes! It’s looking more and more like the next Xbox is going to run Windows.
If you are truly starting from scratch, shooting for Raspberry Pi performance isn’t starting small – that’s a huge goal. It’s a complex chip built on a fairly modern process node (28 nm for the 4B) using the second-best-established architecture.
A more reasonable goal would be an 8086-like chip, then perhaps something like a K6-II or early Pentium, then slowly working your way up from there.
There are a couple of further questions to answer here. First, when you say using only tech that is in the open, nothing proprietary, how strictly do you mean that? Historically, what Chinese foundries have done is buy a fab line far enough from the leading edge to not be questioned, then use that as a starting point for working towards smaller nodes. If that’s allowed, it would be fairly trivial – 40 nm doesn’t perform that badly.
If you want the equivalent of “open-source” fab equipment, as far as I know that has never existed. In better news, if you go back to DUV/immersion lithography, it wasn’t just ASML making the machines – Nikon and Canon were still in the game, so power was less centralized.
Second, what is the actual goal? If it’s just compute, no big deal. As long as you can write a C compiler for your architecture (or use RISC-V as other folks have mentioned) getting the Linux kernel running shouldn’t be too hard. However, you’re going to have to deal with manually modifying the firmware of any peripherals you want to run – PCIe devices, USB, I2C, etc. Not a firmware engineer, so I have no idea how hard it would be, but this is one of the things that’s been holding back Linux on Arm over the years.
All in all, depending on how strict you want to be, it could be anywhere between slightly difficult and effectively impossible.


Given the state of that trailer, it’s a shame it’s anywhere. We hate to see an IP being milked so hard.


Yes, I’m also surprised it’s so low, if only because during sales you can get like 3-5 older indie games for $15. Those games are often shorter and have more controlled scope as well, meaning more folks would actually have time to play them.
On the other hand, it means folks are only buying games they’ll actually play, which is good.
Depends on how much power is being transmitted to each base station, but it would have to be a colossal satellite to be “we’re all going to die”.
I pointed that out mostly as a limitation on how much power could be transmitted to each base station.
Microwave scattering is an absolute nightmare over that kind of distance. Even for much shorter distances, microwaves are only practical to transport over a couple of meters in a waveguide.
If it’s transmitting to a base station, we can assume it’s in geosynchronous orbit, about 22,000 miles from the surface. With a fairly large dish on the satellite, you could probably keep the beam fairly tight until it hit the atmosphere, but that last ~100 miles of air would scatter it like no tomorrow. Clouds and humidity are also a huge problem – water is an exceptionally good absorber across most of the microwave band.
I saw numbers reported for the transmission efficiency somewhere (will update this if I can find it again), and they were sub-30%. The other 70% is either boiling clouds on its way down, or missing the receiver on the ground and gently cooking the surrounding area.
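For a sense of scale, the beam-spread point can be sanity-checked with the diffraction limit: the spot diameter on the ground is roughly λ·d/D even before atmospheric scattering gets involved. A minimal back-of-the-envelope sketch, assuming a 2.45 GHz ISM-band carrier and a 100 m transmit dish (both hypothetical numbers, not from any particular proposal):

```python
# Rough diffraction-limited spot size for a microwave power beam from GEO.
# Assumed numbers (hypothetical): 2.45 GHz carrier, 100 m transmit aperture.

C = 3.0e8              # speed of light, m/s
FREQ = 2.45e9          # carrier frequency, Hz (common ISM-band assumption)
WAVELENGTH = C / FREQ  # ~0.12 m

GEO_ALTITUDE = 35_786_000  # m, geosynchronous altitude (~22,000 miles)
DISH_DIAMETER = 100        # m, transmit aperture (assumed)

# Diffraction-limited spot diameter on the ground: roughly lambda * d / D.
spot_m = WAVELENGTH * GEO_ALTITUDE / DISH_DIAMETER
print(f"Ground spot diameter: ~{spot_m / 1000:.0f} km")
```

Even under these generous assumptions the spot is tens of kilometres across, and that’s an idealized lower bound before the atmosphere makes things worse – which is why the ground receiver has to be enormous.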


My understanding is that they offer them all, but publishers haven’t been able to reliably get 8 or 16 GB cards. Whether that’s Nintendo being shady or some legitimate supply issue, I don’t know.


Well, I doubt they’ll release one for my clippers since they’re discontinued, so that inspired me to go ahead and model a variable-depth one for myself. Based on some of the comments here, I thickened the comb blades to make them print more easily.

Consider me enlightened! Recontextualized, the statement makes a lot more sense. As with many things, being swept into the tech sphere robbed it of meaning. I also think that the sentiment of “the tools used to build something are not the same tools which can effectively dismantle it” is true in many senses, not just in the context of social/political/institutional change.
I was confused by the section “How Taking the ‘Master’s Tools’ Seriously Can Serve Enshittification”. It transitions from an argument that
(which is true), then links this group directly to Facebook. While these descriptors apply to both the founders of the internet and the founders of the tech giants, Facebook is at least 15 years younger than the foundation of the public internet, and the two groups are both mutually exclusive and ideologically at odds. The author then goes on to use the social harms of big tech to push back against Doctorow’s first stage of enshittification, when the companies are “good”.
I think this is a fundamental misreading of Doctorow. He has spent his career as a free software advocate, and claiming that the first stage of corporate capture of the internet is the ideal would be anathema to his more general arguments. What he means by “good” here – and he says this frequently in public discussions of enshittification – is that the product does what it says on the box, with no BS. People are tempted to use it because it lets them access the internet without running up against the sharp edges of the technology itself, and that is a reasonable compromise for many people at first, because it allows more people to get online.
The article argues that the internet “getting worse” is an incomplete view of the experience of all stakeholders, and that to understand the impacts outside the white, male, etc. perspective, we should use the tools of decolonialism. That would be true if Doctorow’s project were a thorough sociological analysis of the impacts of technology. But it isn’t – it’s a rallying cry. The goal of his book is to make a coherent narrative of the change in experience for consumers of technology over the era of Big Tech, and it does that. This is far from the only place where it leaves out strong tie-ins to other philosophical or sociological concepts, but there is strength in a focused argument as well.
It’s unsurprising that Doctorow misappropriated Audre Lorde’s words in their meme form, because that’s what the book is – an abbreviated, digestible approach to the topic. However, I’m glad that someone made those connections.