American
I use a DNS server on my local network, and then I also use Tailscale.
I have my private DNS server configured in Tailscale, so whether I’m on or off my local network, everything uses my DNS server.
This way I don’t have to change any DNS settings no matter where I am and all my domains work properly.
And my phone always has DNS ad blocking, even on cell data or public Wi-Fi.
The other advantage is that you can configure the reverse proxy of some services to only accept connections originating from your Tailscale network, effectively making them privately accessible, or to behave differently when accessed from specific devices.
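With nginx, for instance, that restriction is only a couple of lines; a minimal sketch, assuming Tailscale’s standard CGNAT range (100.64.0.0/10) and a hypothetical backend on port 8080:

```nginx
server {
    listen 80;
    server_name internal.example.com;  # hypothetical domain

    # Tailscale assigns every node an address in 100.64.0.0/10,
    # so only connections arriving over the tailnet get through.
    allow 100.64.0.0/10;
    deny all;

    location / {
        proxy_pass http://127.0.0.1:8080;  # hypothetical backend service
    }
}
```

Note this only works if nginx sees the tailnet source address directly (i.e., it isn’t sitting behind another proxy that rewrites the client IP).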
Another cool trick is using Tailscale to ensure your portable devices can always access your Pi-hole(s) from anywhere, and then setting those servers’ Tailscale addresses as your DNS servers in Tailscale.
This way you can always use your DNS from anywhere, even on cell data or on public networks
I keep a third instance of Pi-hole running on a VPS and use it as the first DNS server in Tailscale, so it resolves a bit faster than my local DNS servers when I’m away from home.
And as many others have mentioned, it can be self-hosted as well.
Also fun side note:
As long as you are logged into a GitHub account and using a desktop browser, you can press the `.` key on your keyboard while viewing any GitHub repo to open it in VS Code web.
Yeah this is what I do.
Putting Cloudflare as my secondary would allow some requests to get through and then often the device whose requests went to Cloudflare would continue using Cloudflare for a while.
The best solution I found was to run a second Pi-hole and use it as the secondary.
You can use something like Orbital Sync to keep them synchronized.
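For reference, a minimal sketch of running Orbital Sync under Docker Compose; the environment variable names are from my memory of the project’s README (double-check them against the current docs), and the hostnames/passwords are placeholders:

```yaml
services:
  orbital-sync:
    image: mattwebbio/orbital-sync:latest
    environment:
      PRIMARY_HOST_BASE_URL: "http://pihole-primary.lan"       # placeholder
      PRIMARY_HOST_PASSWORD: "changeme"                        # placeholder
      SECONDARY_HOSTS_1_BASE_URL: "http://pihole-secondary.lan"
      SECONDARY_HOSTS_1_PASSWORD: "changeme"
      INTERVAL_MINUTES: "30"
```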
It depends what I’m backing up and where it’s backing up to.
I do local/LAN backups at a much higher rate because there’s more bandwidth to spare and effectively free storage. So for those, as often as every 10 minutes if there are changes to back up.
For less critical things and/or cloud backups I have a less frequent schedule as losing more time on those is less critical and it costs more to store on the cloud.
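In Kopia that split shows up as per-path policies; a sketch, assuming a repository is already connected (the paths are placeholders, and the flag names are worth double-checking against `kopia policy set --help` for your version):

```shell
# Frequent snapshots for fast local targets; Kopia only uploads changes
kopia policy set /home/me/projects --snapshot-interval 10m

# Sparser schedule and shorter retention for bulkier, less critical data
kopia policy set /home/me/media --snapshot-interval 24h --keep-daily 14
```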
I use Kopia for backups on all my servers and desktop/laptop.
I’ve been very happy with it. It’s FOSS, and it saved my ass when Windows Update corrupted my BitLocker disk and I lost everything. That was also the last straw that put me on Linux full-time.
It’s not a Windows app.
You can run it on Windows with Docker, but I would suggest a Linux server and a reverse proxy for the best experience (like most self-hosted solutions).
Definitely Immich.
There are a lot of these kinds of services, hosted or self-hosted, that are labeled as a “Google Photos replacement.”
But very few of them have features like face matching and object recognition alongside automatic backups.
IMO it’s not a legitimate replacement for Google Photos without those features and Immich really delivers on that without compromising your privacy.
I use and love Kopia for all my backups: local, LAN, and cloud.
Kopia creates snapshots of the files and directories you designate, then encrypts these snapshots before they leave your computer, and finally uploads these encrypted snapshots to cloud/network/local storage called a repository. Snapshots are maintained as a set of historical point-in-time records based on policies that you define.
Kopia uses content-addressable storage for snapshots, which has many benefits:
Each snapshot is always incremental. This means that all data is uploaded once to the repository based on file content, and a file is only re-uploaded to the repository if the file is modified. Kopia uses file splitting based on rolling hash, which allows efficient handling of changes to very large files: any file that gets modified is efficiently snapshotted by only uploading the changed parts and not the entire file.
Multiple copies of the same file will be stored once. This is known as deduplication and saves you a lot of storage space (i.e., saves you money).
After moving or renaming even large files, Kopia can recognize that they have the same content and won’t need to upload them again.
Multiple users or computers can share the same repository: if different users have the same files, the files are uploaded only once as Kopia deduplicates content across the entire repository.
There are a ton of other great features, but that’s what’s most relevant to what you asked.
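The dedup behavior described above can be sketched in a few lines of Python: a toy content-addressable store where chunks are keyed by their hash, so identical or renamed files cost nothing extra. (This is just an illustration, not Kopia’s actual implementation — Kopia’s real splitter uses a rolling hash so chunk boundaries survive insertions mid-file, whereas fixed-size chunks keep this sketch short.)

```python
import hashlib

class ToyRepository:
    """Toy content-addressable store: chunks are keyed by their SHA-256 hash."""

    def __init__(self, chunk_size=4):
        self.chunks = {}      # hash -> chunk bytes (the "repository")
        self.snapshots = {}   # path -> list of chunk hashes
        self.chunk_size = chunk_size

    def snapshot(self, path, data):
        """Record a file as a list of chunk hashes; duplicate chunks stored once."""
        hashes = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            digest = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(digest, chunk)  # "upload" only if unseen
            hashes.append(digest)
        self.snapshots[path] = hashes

    def restore(self, path):
        return b"".join(self.chunks[h] for h in self.snapshots[path])

repo = ToyRepository()
repo.snapshot("a.txt", b"hello world!")
n_after_first = len(repo.chunks)
repo.snapshot("renamed.txt", b"hello world!")  # same content, different name
assert len(repo.chunks) == n_after_first       # nothing new was "uploaded"
assert repo.restore("renamed.txt") == b"hello world!"
```

Because the store is keyed by content rather than by filename, the rename costs zero new chunks — the same property that lets multiple users share one Kopia repository.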
You are right, I dunno why I thought it wasn’t actually proxying all the traffic.
I can see how that could potentially be expensive for them if you were using it to stream video or something
I would disagree.
Particularly on the cost/beta stuff.
Tailscale has long supported DNS names that point to your tailnet. Typically those only accept connections from devices within your tailnet, but there isn’t anything particularly complex about Funnel accepting connections from any address.
Further, like most of Tailscale’s operations, Funnel doesn’t require them to host or even proxy any significant amount of data; it just directs incoming connections on that domain to a device on your tailnet.
The hosting cost to Tailscale is insignificant and really no different from what they do on a basic tailnet.
I don’t think it will become a paid only option and I don’t think it’s too beta to use for a home server.
Personally I don’t bother using it because I’m comfortable exposing my IP address and opening a port to my home server using direct DNS.
But there are some advantages to using Tailscale Funnel: your IP will be obfuscated, and the traffic will be routed through WireGuard, so it’s potentially more secure.
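For anyone curious, turning it on is roughly this; the exact CLI syntax has changed between Tailscale releases, so check `tailscale funnel --help` on your version:

```shell
# Recent Tailscale CLI: expose a local port to the public internet via Funnel
# (Funnel must also be enabled for your tailnet in the admin console / ACLs)
tailscale funnel 3000
```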
Ah yeah that’s the one, sorry
Most of these I use at least regularly, quite a few I use constantly.
I can’t imagine living without SearXNG, Vaultwarden, Immich, Jellyfin, and CryptPad.
I also wouldn’t want to go back to using the free ad-supported services out there for things like Memos, Kutt, and Lenpaste.
Also, LibreChat is underappreciated, I think. Even just using it for GPT with an API key is infinitely better for your privacy than the free ChatGPT service that collects/owns all your data.
But it’s also great for using GPT-4 to generate an image prompt, sending it through a prompt refiner, and then sending it to Stable Diffusion to generate an image, all via a single self-hosted interface.
You’re right, for some reason I thought Firebase was allowed.
Yeah, ntfy is a FOSS notification service.
As for drop-in replacements, I don’t think such a thing really exists on the user side; it’s fully up to the app developer how they implement notifications.
To use ntfy instead of FCM, your app would need to be designed to do so or support it as an alternative option.
I think unless they use ntfy or a similar alternative, then yes.
The vast majority of apps will be using GCM or FCM for notifications.
Now, whether or not those push messages are encrypted and free of private data is up to the app developer, so how much is exposed can certainly vary.
That’s true, signal is pretty good about that.
I wasn’t saying Signal required them necessarily, just that even it uses them. But now reading back through my comment I can see how that could be easily misinterpreted. My bad
That’s cool, but also doesn’t sound all that useful.
A fairly significant number of apps depend on Firebase and the like and don’t even have the option to pull notifications otherwise. And virtually every app at least uses them.
When’s the last time you’ve seen a chat app that didn’t require push notifications to function? Even Signal uses them. (Though they do so in a way that doesn’t expose any private data)
You just can’t disable push without severely crippling the experience.
Further, I’m not even sure disabling them on-device changes anything about governments being able to surveil them server-side. AFAIK you are only stopping your phone from receiving them; they would still be sent to the Firebase servers from the app’s cloud servers.
I don’t think this issue is avoidable other than app developers not using (or using in a secure manner) Firebase, FCM/GCM, APNs, etc.
Sandboxed Google Play services can be used, if needed.
I don’t see how that would prevent this at all.
What is being discussed here is governments compromising the push notification service on Apple’s servers (and presumably Google’s as well)
Sandboxing Google services on your phone does nothing to change the fact that virtually all apps that receive messages/notifications are going to be using the push notification APIs that are compromised.
Whether or not private data is sent in those pushes and whether or not they are encrypted is up to the app developers.
It’s common for push messages to simply be used as a triggering mechanism to tell the device to download the message securely, so much of what is compromised in those cases will simply be delivery metadata, or even just “a new message is available.”
But even so, that information could be used to link your device to data they acquired using other methods based on the timing of the push and subsequent download or “pull”
The problem is that if you disable push notifications (or only use apps that let you), you are going to have abysmal battery life and increased data use, because your phone will have to constantly poll cloud servers asking if new messages/notifications are available.
I use AWS S3 🤷
Now do Tucker Carlson next