thejoker8814

joined 1 year ago
[–] [email protected] 2 points 1 year ago

Even if it's a bit different, it's always nice to see what's out there. I will definitely look into it.

[–] [email protected] 2 points 1 year ago (3 children)

Sure - but that would be another thing to self-host, because I have at least 5 machines which need to send mail, and I have a dynamic IP address - so it would involve updating the MX records via a DNS API for at least 5 subdomains.

To be honest, I'm a KISS kind of guy - not everything technically possible or imaginable is worthwhile, especially for such a crucial part as alert monitoring. I want it done simply, securely, without caveats, and with complexity kept at the lowest level possible.

[–] [email protected] 2 points 1 year ago

Many people underestimate it. My go-to for a fast and reliable file share service - it does just that, is production ready, has great client software and uses few resources:

Seafile

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

Biometric authentication methods are in general not very secure. Besides the fact that there are a whole lot of ways to fake/spoof biometric data, once breached you cannot change your biometrics that easily. Keys and passwords can easily be replaced.

Further, if you use biometric auth for a lot of services you open yourself up to a single point of attack - pretty much like using the same password for many services. And we don’t do that!

[–] [email protected] 2 points 1 year ago

Thanks for the info about the .zip domain. Totally missed that.

[–] [email protected] 3 points 1 year ago

True, but for playing around with Lemmy and doing some tests it's ideal - and it's free! In case you are serious about hosting a Lemmy instance, there should at least be some sort of backup/disaster recovery strategy in place.

[–] [email protected] 8 points 1 year ago (3 children)

I’m not sure if it’s still valid, but Oracle Cloud Infrastructure (OCI) had a 4 vCPU, 24 GB RAM, 200 GB HDD free tier. No costs, ever! You could sign up there and set up an even bigger instance.

[–] [email protected] 5 points 1 year ago

I know it’s been mentioned before - but plain WireGuard is my way to go. KISS - keep it simple, stupid! The setup might be a bit of a learning curve, but once you’ve got it working for one device, the others aren’t a big issue.

I had a CA with OpenVPN, but that’s too much for a small setup like remote access to your home network.

I use it on iOS, Ubuntu and Windows to access my home services and DNS (split-tunnel).

It’s a pretty easy setup on OpenWrt. A quick look into the FreshTomato wiki tells me that it shouldn’t be too complicated to achieve on your router (firmware) either. If you need help setting WireGuard up, let me know - I’m happy to help out.
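For reference, a minimal split-tunnel client config could look like the sketch below - the addresses, subnets and endpoint are placeholders, not values from this thread:

```
[Interface]
# Client key pair - generate with: wg genkey | tee private.key | wg pubkey
PrivateKey = <client-private-key>
Address = 10.0.0.2/32
# Home DNS server, queried through the tunnel (split-tunnel DNS)
DNS = 10.0.0.1

[Peer]
PublicKey = <server-public-key>
Endpoint = vpn.example.com:51820
# Split tunnel: only the VPN and home LAN subnets are routed through WireGuard
AllowedIPs = 10.0.0.0/24, 192.168.1.0/24
# Helps clients behind NAT/dynamic IPs keep the session alive
PersistentKeepalive = 25
```

Save it as /etc/wireguard/wg0.conf and bring it up with `wg-quick up wg0`, or import the same file into the mobile apps.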

[–] [email protected] 3 points 1 year ago

Codeberg is running Forgejo, basically Gitea. If you host Forgejo or Gitea yourself, you can switch to another editor if you like, and features like CI/CD can be deactivated.
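For example, a sketch of deactivating the built-in CI/CD (Actions) in a self-hosted Gitea/Forgejo via app.ini - section and key as documented for recent versions, so double-check against your release:

```
; app.ini - turn off the integrated Actions (CI/CD) support
[actions]
ENABLED = false
```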

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

Funny 😄 I pretty much asked myself the same thing the day before yesterday.

Specifically, I have been looking for encrypted mail hosters that support your own domain, host in Europe on dedicated hardware (or at least a guaranteed European VPS), are GDPR compliant, and offer some sort of certification/verification of said requirements and claims!

What I came up with:

  • mailbox.org (never heard of it before, but it pretty much has your requirements covered) <- Tor nodes, anonymous accounts (no personal data at all!)
  • Proton Mail
  • Tutanota (pretty young - but an interesting concept)

I won’t cite their individual plans - that’s for you to figure out in detail.

The thing that bugs me with Proton Mail and Tutanota: to effectively make use of their threat model/encryption you have to use their apps/software. EDIT: I’m currently using Microsoft 365 - with it you are pretty much locked in, and I fear it’s the same with Proton or Tutanota. Migrating is a pain.

I’m trying mailbox.org at the moment - they offer a 30-day free trial.

[–] [email protected] 2 points 1 year ago

I have to agree, RAID has only one purpose - keeping your data/storage available during a disk failure. It doesn’t matter which RAID level or software. Thank god you mentioned it before.

There can be additional benefits depending on RAID level and layout, for example read & write speed or more IOPS than an individual disk (either SSD or HDD). However, the main purpose is still to eliminate a single disk as a single point of failure!
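As a sketch, a simple two-disk mirror (RAID1) with Linux mdraid already covers that - the device names are just examples:

```
# Mirror two disks so the array keeps running if one of them dies
mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sda1 /dev/sdb1
mkfs.ext4 /dev/md0
# Check array state and which member disks are active
mdadm --detail /dev/md0
```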

Back to topic - if you have a strong requirement that the services relying on the SSD storage keep running even if a disk fails, then SSD RAID: yes.

For example: I have a server running production instances of Seafile, Gitea, and some minor services. I use them for business, therefore those services have to stay available even if one disk fails. I can’t afford to wait for a backup restore and a replacement disk and tell a client, “Hey, sorry, my server disk failed” (unprofessional).

For protection against data loss - backups: one local on another NAS, one in the cloud. 👌🏼

[–] [email protected] 2 points 1 year ago

Thanks for letting me know.

Damn shame - why do humans have to ruin places, like a plague of locusts?
