The backend uses Laravel to parse and clean raw exports before sending structured data to GPT-5 for categorization.
Nothing fancy; just a pragmatic way to turn raw bank data into something understandable.
I'm working on Tailstream (https://tailstream.io/), which turns logs into real-time visual data streams. I built the web application, the website, and an open-source Go CLI agent, and am now pivoting slightly to make it more log-focused.
Right now I'm working on faceted search for logs and the CLI client, and trying to share my progress on X.
It's been a while since I've used CloudWatch myself. How would you expect this to work? I.e., would you lean more towards a Lambda/Firehose setup that forwards events to the API (which is [public](https://tailstream.io/docs/api), by the way!), or would you expect some kind of agent/connector to run that automatically pulls the logs from CloudWatch?
I don't have a strong opinion on how this is best done. Being able to run it inside a closed environment might be preferable, since logs do contain pretty sensitive stuff. Perhaps a container that pulls from CloudWatch could be an option?
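For the push-based approach mentioned above, a minimal sketch of a Lambda handler subscribed to a CloudWatch Logs log group might look like this. Note the ingest URL, request shape, and environment variable names are assumptions for illustration, not Tailstream's actual API:

```python
import base64
import gzip
import json
import os
import urllib.request


def decode_awslogs(event):
    """CloudWatch Logs delivers subscription data as base64-encoded, gzipped JSON."""
    payload = base64.b64decode(event["awslogs"]["data"])
    data = json.loads(gzip.decompress(payload))
    return data.get("logEvents", [])


def handler(event, context):
    """Forward decoded log events to a (hypothetical) Tailstream ingest endpoint."""
    log_events = decode_awslogs(event)
    body = json.dumps({
        "events": [
            {"timestamp": e["timestamp"], "message": e["message"]}
            for e in log_events
        ]
    }).encode()
    req = urllib.request.Request(
        # Hypothetical endpoint and token; check the real API docs.
        os.environ.get("TAILSTREAM_INGEST_URL", "https://tailstream.io/api/ingest"),
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ.get('TAILSTREAM_TOKEN', '')}",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return {"forwarded": len(log_events), "status": resp.status}
```

The pull-based container alternative would invert this: poll `FilterLogEvents` (or use a subscription to Kinesis) from inside the closed environment, so credentials and log data never leave it except for what you explicitly forward.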
They're nowhere near the quality of the original post, but these two old YouTube channels do CSS drawings and are pretty fun to watch.
Perhaps a silly question, but if it's so broken, why do you keep working there? Is it purely about the money? Is there still hope that things can improve?
Not judging, but I am genuinely curious about what drives engineers to stay.
(Yes, this is sarcasm)