[–] [email protected] 2 points 1 month ago

Interoperability is a big job, but how much it matters varies widely with the use case. There are layers of standards stacked atop other standards, some new, some near deprecation. Some extremely large and complex datasets need a shit-ton of metadata just to decipher or even extract. More modern dataset standards bake that metadata into the file itself, but even then there are corner cases. And the standards for zero-trust security enclaves, discoverability, non-repudiation, attribution, multidimensional queries, notification and alerting, and pub/sub are all relatively new, so we occasionally hit operational situations the standards authors didn't anticipate.
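
To make the self-describing point concrete, here's a minimal sketch of reading embedded metadata from a Parquet file with pyarrow. Parquet is just one example of a format that carries its own schema; the file name and any key/value pairs it prints are hypothetical:

```python
import pyarrow.parquet as pq

# Hypothetical path; any Parquet file carries its schema in the footer.
pf = pq.ParquetFile("measurements.parquet")

# Column names and types travel with the data, no external data dictionary needed.
print(pf.schema_arrow)

# Arbitrary key/value metadata (units, provenance, etc.) can also be embedded.
embedded = pf.schema_arrow.metadata or {}
for key, value in embedded.items():
    print(key.decode(), "=", value.decode())
```

Formats without that embedded layer are where the interop pain usually starts, because the metadata has to live somewhere else and stay in sync.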