Hacker News | thatxliner's comments

I've always wondered: why not just use this "IRC" I hear about a lot? Apparently it was popular back in the day.

I see Shadcn and Hero Icons

The empathy section is quite interesting


well advent of code also needs an account


An account isn’t necessary just to see the problems, though


It's not clear that you need an account to see the problems. I logged in with my account and it's exactly the same page. It's not Dec 1st everywhere yet, so they might open the problems up for everyone once it is.


This also has a paid account and a business account.


And if you have a paid account, you get extra time to complete the challenge!

Somehow, SadServers seems to have entirely missed the concept of a "puzzle".


Just wondering, how did you get that domain name? I’ve been looking for registrars offering .hu



whois data points to https://www.domain.hu.


Doesn’t GitButler require a specific workflow?


I was looking at PEP 690 and I saw

> A world in which Python only supported imports behaving in a lazy manner would likely be great...we do not envision the Python language transitioning to a world where lazy imports are the default...this concept would add complexity to our ecosystem.

Why can't lazy be the default, with an `eager` syntax proposed instead? The only argument I can imagine is that some APIs run side effects on import, but perhaps keeping imports eager for modules with side effects would be a sufficient stopgap?
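For what it's worth, an opt-in form of this already exists in the stdlib: `importlib.util.LazyLoader` defers executing a module's body until first attribute access. A minimal sketch (the `lazy_import` helper name is my own, adapted from the `importlib` docs recipe):

```python
import importlib.util
import sys

def lazy_import(name):
    # Resolve the module's spec without executing its body yet
    spec = importlib.util.find_spec(name)
    # Wrap the real loader so execution is deferred
    loader = importlib.util.LazyLoader(spec.loader)
    spec.loader = loader
    module = importlib.util.module_from_spec(spec)
    sys.modules[name] = module
    # Registers the module; the body still hasn't run
    loader.exec_module(module)
    return module

json = lazy_import("json")        # nothing executed yet
print(json.dumps({"a": 1}))       # first attribute access triggers the real import
```

This is roughly the machinery a lazy-by-default Python would need to apply everywhere, which hints at why the PEP authors worry about side-effectful imports: with `LazyLoader`, those side effects fire at an unpredictable later point instead of at the `import` statement.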


How can TOML be too minimal? Also, what about Apple Pkl?


A friend of mine introduced TOML to a reasonably big open source project and he mentioned there were some unexpected downsides to it. I've asked him to chime in here, because I think he's more qualified to reply (note that I pointed him to a sibling comment that's also asking about TOML, here https://news.ycombinator.com/item?id=45295715).

As for Apple Pkl, I think we share the goal of robustness and "toolability", but pkl seems way more complex. I personally think it's more useful to keep things simple, to avoid advancing in the configuration complexity clock (for context, see https://mikehadlow.blogspot.com/2012/05/configuration-comple...).


Which LLM (or LLM API) were you using?


Cloudflare Workers AI


Whisper-level transcription accuracy should be sufficient

