You will find many more at feddit.nu
Data loss is never fun. File systems in general need a long time to iron out all the bugs. Hope it is in a better state today. I remember when ext4 was new and crashed on a laptop. Ubuntu was too early to adopt it, or I did not use LTS.
But as always, make sure to have a proper backup in a different physical location.
I am more looking into Btrfs for backup because:
- I run Linux and not BSD
- ZFS requires more RAM
- I only have one disk
- I want to benefit from snapshots, compression and deduplication
So Google is a monopoly, and removing funding from Firefox will help them not be a monopoly? That does not sound right. Rather the opposite.
Nothing has been decided or done yet. Most likely they will just be forced to stop abusing their position, for example no longer advertising Chrome on www.google.com, not bundling Chrome with Android, and such things.
I believe there will always be an alternative to Chrome available as the Open Source community will find a way together.
Block Chrome and use anything not Chrome based. In other words use Firefox.
If the government did it, I would be very angry. Do you know how much the internet helps with democracy in the world? Think of it as shutting down all libraries full of books and forbidding any kind of communication.
I would of course try to set up some communication and gather people. I currently only have one IRC server. I guess people would want something web-based, as they probably don't have the client software.
I would probably lose my job as well, as it is a software-as-a-service company that requires the internet to function.
A general protocol standard just for sync would be nice. But then there is the problem of getting all the big players on board. There are open protocols, like the ones Syncthing and Seafile use, but they are not compatible with each other.
Just stop supporting the biggest actor in the market.
Putin, please stop this war without demands.
If there is a problem measuring at low power usage, you can easily solve it by temporarily adding a known load. Let's say add a 40 watt lamp or something, and later subtract it from the calculation.
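A minimal sketch of that subtraction, with made-up readings (nothing here is a real measurement):

```python
# Cheap power meters are often inaccurate at very low loads, so measure with a
# known reference load plugged in and subtract it afterwards.
# These numbers are hypothetical readings, just to show the arithmetic.
reference_load_w = 40.0      # e.g. a 40 watt incandescent lamp
reading_with_lamp_w = 43.5   # meter reading with the lamp added
device_power_w = reading_with_lamp_w - reference_load_w
print(f"Estimated device power: {device_power_w:.1f} W")  # -> 3.5 W
```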
Yes, you can write bad code, and that matters most. Rust is more of a low-level than a high-level language. Rust is new, so not many bloated libraries have been written yet :) So far I have seen many lean Rust applications in the open source world. Please note I used the word "should" - no guarantees.
SQLite makes the minimum memory usage much lower than MySQL. Many who would self-host this do it for just one single user and don't need a standalone database.
I can imagine that the application itself would not require much RAM at all, but requiring MySQL adds orders of magnitude more RAM usage.
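To illustrate why SQLite is so much lighter: it runs in-process, so there is no separate database server at all. A minimal sketch (file and table names are just placeholders):

```python
# SQLite is an embedded, in-process database: the whole "server" is one file on
# disk, opened by the application itself. No standalone daemon, no extra RAM.
import sqlite3

conn = sqlite3.connect("app.db")  # just a file next to the application
conn.execute("CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, body TEXT)")
conn.execute("INSERT INTO notes (body) VALUES (?)", ("hello",))
conn.commit()
print(conn.execute("SELECT body FROM notes").fetchall())
conn.close()
```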
I love that it is written in Rust. Should mean fast and efficient, with low memory usage.
Edit: It uses MySQL as the database, so it is heavy.
By save in terms of load I mean CPU usage. The question is how much money they will save by us utilizing this instead. I don't think it is as heavy as ChatGPT or anything like that.
Personally I don't save passwords in the browser. I use KeePassXC and its web extension in the browser. But I use Firefox Sync for all Firefox settings, sending tabs, etc.
How much will we save on the production Mozilla Firefox servers in terms of load?
KeePassXC + Syncthing to my phone in read-only mode and to another machine. So 3 copies on different machines, while one of them is on me.
I use very simple software for this. My firewall can do route monitoring and failover and use policy-based routing. I just send all traffic to another machine that handles the diagnosis part. It pings through the firewall and fetches some info from the firewall. The page itself is not pretty but says what is wrong. Enough for my parents to read what the error is. I also send DNS traffic to a special DNS server that responds with the same static IP address - enough for the browser to continue with an HTTP GET that the firewall forwards to my landing page. The sad part is that I haven't had any problems since I changed ISP.
Had a scenario where the page said the gateway was reachable but nothing more. ISP issue. The DHCP lease slowly ran out. There was a fiber cut between our town and the next. Not much I could do about it. Just configured the IP statically and could reach some friends in the same city through IRC so we could talk about it.
The webpage itself was written in PHP; it read the ICMP logs and showed the relevant up and down entries. Very simple.
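For anyone curious about the general idea, here is a rough sketch in Python (the original page was PHP reading ICMP logs; this one just pings live instead). The hop names and addresses are placeholders, not the actual setup:

```python
# Ping a few hops in order (LAN gateway, ISP gateway, a public address) and
# report each one as up or down, so a non-technical user can see where it stops.
import subprocess

HOPS = [
    ("LAN gateway", "192.168.1.1"),
    ("ISP gateway", "10.0.0.1"),
    ("Internet (public resolver)", "1.1.1.1"),
]

def reachable(ip: str) -> bool:
    """One ping with a one second timeout; True if the host answered."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", "1", ip],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0

for name, ip in HOPS:
    print(f"{name:28s} {ip:15s} {'up' if reachable(ip) else 'DOWN'}")
```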
The DVDs are fine for offline use. But I don't know how to keep them updated. It would probably take loads of space, as I guess they are equivalent to a repo mirror.
I use it with Kubuntu. Doing apt update is now much faster. I did some testing and found a good public mirror, so I could max out my connection (100 Mbit) with about 15 ms latency to the server. But I think the problem was that there are so many small files. Running nala to fetch the files in parallel helps, of course. With apt local ng I don't need nala at all. The low latency and having the files on a gigabit connection to my server lead to fast access. Just need to find a good way to fill it with new updates.
A second problem is figuring out whether something can be done to speed up apt upgrade itself, which I guess is not possible. A workaround with snapshots and sending diffs does not sound efficient either, even on older hardware.
apt update: 4 seconds vs 16 seconds.
apt upgrade --download-only: 10 seconds vs 84 seconds.
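If you want to sanity-check a cache like this outside of apt, a rough sketch is below. The cache address is a placeholder and port 3142 is an assumption (apt-cacher-ng's default); the timings above came from the apt commands themselves, not from this script:

```python
# Time one repository index fetch directly from a mirror vs through a local
# caching proxy. URL and proxy address are examples only; adjust to your setup.
import time
import urllib.request

URL = "http://archive.ubuntu.com/ubuntu/dists/noble/Release"  # example index file
PROXY = "http://my-cache-server:3142"                          # placeholder address

def fetch_seconds(url, proxy=None):
    handlers = [urllib.request.ProxyHandler({"http": proxy})] if proxy else []
    opener = urllib.request.build_opener(*handlers)
    start = time.monotonic()
    opener.open(url, timeout=10).read()
    return time.monotonic() - start

print(f"direct : {fetch_seconds(URL):.2f} s")
print(f"cached : {fetch_seconds(URL, PROXY):.2f} s")
```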
I am wondering if it is that good to have a single instance for the Fediverse. Does it hurt Fediverse servers to have to send to yet another location, or is it more like P2P so it scales well?