{ "@context":[ "https://www.w3.org/ns/activitystreams", {"Hashtag":"as:Hashtag"} ], "published":"2024-01-11T22:16:56.476Z", "attributedTo":"https://k.matthias.org/actors/relistan", "to":["https://www.w3.org/ns/activitystreams#Public"], "cc":["https://k.matthias.org/actors/relistan/followers"], "content":"

I've been writing an event archiving service for the event-based environment at my new gig. It's a second chance to iterate on what we built at Community, which I talked about in my blog post on the subject. This time I'm writing it in #Golang because I'm on my own on this stuff at the moment. I've taken a different approach by archiving events to a local #DuckDB database as they come off the wire. Then I use DuckDB's native Parquet and S3 support to write the event batches out to S3, where they can be queried with Athena.
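
Here's roughly what the core flow looks like. This is a minimal sketch rather than the actual service: it assumes the go-duckdb database/sql driver, and the table schema, bucket name, and event shape are made up for illustration.

// Sketch of the archive-locally-then-flush-to-S3 flow. Assumes the
// go-duckdb driver; the bucket and schema here are hypothetical.
package main

import (
	"database/sql"
	"fmt"
	"log"
	"time"

	_ "github.com/marcboeker/go-duckdb"
)

func main() {
	// Local DuckDB file that receives events as they come off the wire.
	db, err := sql.Open("duckdb", "events.duckdb")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	mustExec(db, `CREATE TABLE IF NOT EXISTS events (
		id VARCHAR, received_at TIMESTAMP, payload VARCHAR)`)

	// Archive one incoming event locally; in the real service this
	// would be the consumer loop pulling events off the wire.
	mustExec(db, `INSERT INTO events VALUES (?, ?, ?)`,
		"evt-123", time.Now(), `{"type":"example"}`)

	// Flush the batch out to S3 as Parquet via DuckDB's httpfs extension.
	// S3 credentials are assumed to be configured through DuckDB's S3 settings.
	mustExec(db, `INSTALL httpfs`)
	mustExec(db, `LOAD httpfs`)
	dest := fmt.Sprintf("s3://example-event-archive/events/%s.parquet",
		time.Now().UTC().Format("2006-01-02T15-04-05"))
	mustExec(db, fmt.Sprintf(`COPY (SELECT * FROM events) TO '%s' (FORMAT PARQUET)`, dest))
	mustExec(db, `DELETE FROM events`) // batch is durable in S3 now
}

func mustExec(db *sql.DB, query string, args ...any) {
	if _, err := db.Exec(query, args...); err != nil {
		log.Fatal(err)
	}
}

In the real service the flush would presumably run on a timer or a batch-size threshold rather than after every insert, which keeps the local DuckDB file small between uploads.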

This approach seems to be working well so far. I'll learn more once I get it into production. I feel another blog post coming later this year...

", "mediaType":"text/html", "attachment":[], "tag":[ {"type":"Hashtag","name":"#Golang","href":"https://k.matthias.org/tags/Golang"}, {"type":"Hashtag","name":"#DuckDB","href":"https://k.matthias.org/tags/DuckDB"} ], "type":"Note", "id":"https://k.matthias.org/objects/hhW21K_ZzGY" }