Honestly, having an AI agent that interviews an interviewee's agent sounds like a great "first filter" for certain tech jobs if you do it right. As in "here are the API specs, build an agent that can receive questions and reply with information based on your resume." It would be vibe-codeable by anyone with skills in an hour. I remember seeing a company a while back that switched to only accepting resumes through a weirdly formatted API, and they said it cut down on irrelevant spam for software jobs immensely.
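Something like this minimal sketch is roughly what I have in mind; the /ask endpoint, the payload shape, and the keyword matching are all made up for illustration, and the real spec would come from the employer:

```python
# Hypothetical "resume agent": an HTTP endpoint that answers questions using
# only facts from a structured resume. Endpoint name and payload shape are
# assumptions; a real one would follow the employer's published API spec.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

RESUME = {
    "languages": "Python, Go, SQL",
    "experience": "5 years of backend work at two startups",
    "education": "BSc in Computer Science",
}

class ResumeAgent(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/ask":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        question = json.loads(self.rfile.read(length))["question"].lower()
        # Dumb keyword lookup stands in for whatever model you'd actually call.
        answer = next((v for k, v in RESUME.items() if k in question),
                      "Not covered by my resume.")
        body = json.dumps({"answer": answer}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8080), ResumeAgent).serve_forever()
```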
The problem is that if every other employer buys some AI interview solution as the initial screening, then instead of a 15-minute application to most likely be ghosted, it becomes 15 minutes plus 45 minutes of AI slop conversation to be ghosted, while taking up no extra time from the recruiters' or hiring managers' schedules.
I agree with the author, but I would also say there is something above goals and constraints: values. A set of things that, when comparing multiple options, make the choice clear. Some examples of values I frequently use: "What will give me the most enjoyment the furthest into the future?" "What will result in the world being a better place?" "What will make me become someone who resembles Jesus more?" They are different from constraints in that they don't knock out any options by default. Instead, they make triaging much easier when there are many different things I could be doing, and they circumvent my messy intuition, which is based on hormones, hunger, weather, etc.
I think values, goals, and constraints are all valuable, but it's a hierarchy. We should create constraints that help us become more aligned with our values. We should create shorter-term goals that make it easy to stay within our constraints.
To support both my point and the author's, here is Benjamin Franklin's "Thirteen Virtues," which seem to be a mix of constraints and values (zero goals): https://fs.blog/the-thirteen-virtues/
> I think values, goals, and constraints are all valuable, but it's a hierarchy. We should create constraints that help us become more aligned with our values.
Thank you for saying it so well.
I have found difficulty in finding my values. Writing my obituary helped: https://www.jjude.com/my-obituary/. I wrote that almost 16 years back (published only in 2020). It helped me choose my pursuits well.
I don't live in the biggest house in town, or own a sports car. But I work 3 days a week, homeschool two kids, have breakfast and dinner together as a family, work out at home or swim as a family, preach in two churches, and enjoy my work. I consistently feel I am living in a dream.
Thanks for sharing the 13 virtues. It was a bit dense to read, so here's an (LLM assisted) friendlier version.
Temperance (Practice Self-Control): Don't overeat, and don't drink just to get drunk. Practice moderation in your habits.
Silence (Speak with Purpose): Only speak if you have something meaningful or helpful to say. Avoid gossip and pointless chatter.
Order (Be Organized): Keep your belongings organized and manage your time effectively. Have a place for everything, and a dedicated time for each task.
Resolution (Be Decisive and Committed): Figure out what you need to do, and then follow through. Do what you say you're going to do.
Frugality (Be Mindful of Your Money): Spend money only on things that truly benefit you or others. Be resourceful and avoid waste.
Industry (Work Hard and Be Productive): Use your time wisely. Always be engaged in a useful activity and eliminate distractions.
Sincerity (Be Genuine and Honest): Don't deceive people. Be sincere in your thoughts and words and speak with good intentions.
Justice (Be Fair and Responsible): Don't harm anyone. Fulfill your responsibilities and be fair in all your dealings.
Moderation (Avoid Extremes): Practice balance in all things. Don't overreact, and learn to let go of grudges.
Cleanliness (Be Clean and Tidy): Maintain good personal hygiene and keep your clothes and living space clean.
Tranquility (Stay Calm and Composed): Don't get upset by small things or events you can't control.
Chastity (Practice Sexual Responsibility): Treat sexuality with respect, in a way that isn't harmful to your well-being or anyone else's peace of mind and reputation.
Humility (Be Humble): Learn from others. Prioritize listening and learning over ego.
I fight to record any presentations I do as often as possible. When I am asked for the slides I send the full recording instead as the way to manage this exact issue.
Few things are as frustrating as finding slides from what seems to be an insightful talk strongly pertaining to what you are working on, but no recording of that talk to watch.
I applaud the effort to record such talks, especially in the current age where you know few people will actually watch it and appreciate your effort (but some big LLM provider will certainly lift it as part of a mass scrape and charge a few bucks for access to your findings without crediting you).
If information is important enough to bother someone asking for a copy of it, but not important enough to spend an hour ingesting, I'm not sure what to tell you.
The thing is: when working with the material afterwards, the important part is the small details. The talk/recording is good for the high-level overview and for following along with the big picture, but for details it is annoying, as one has to jump around for specific words and phrases. Something written, or an image/diagram, is a lot better to study in depth.
And there lies the trouble with slides: During a talk they should support what is being said, but they are often abused as also being the handout for afterwards.
It sounds like you want detailed documentation. That’s fine, but that’s not what a talk is. A good talk isn’t a reference. And good documentation isn’t an engaging talk.
If people want that, produce two artifacts. Don't try to shoehorn a talk into being documentation. That's just a recipe for bad work.
It depends on what the talk is about. Of course Steve Jobs' oft-cited iPhone introduction didn't have any details for in-depth research later on; it was a high-level product introduction.
A technical talk, however, explains a concept or a tool and thus contains technical information to follow up on, but for that I need the words and phrases stated so I even know what to look for in the manual. And I probably want to follow it in the order they presented it (I hope they thought about that order!), whereas the manual is arranged more like a reference.
So yeah, if you do a high level marketing talk it doesn't matter, but then I also won't spend the time on watching a second time. If it has technical depth, then being able to follow the depth is good.
I have dealt with this issue before as well. If folks need something more in depth, I will use an LLM plus some massaging of my own to create a supporting document. Here is an example of a very disorganized conversation and the supporting document I made with it: https://www.danielvanzant.com/p/what-does-the-structure-of-l... It has clear definitions of the key terms, timestamps for the important moments, and links to external resources to learn more about any of the topics.
For most talks, I would say no. If I were going to a lecture by Pynchon (ha!) I would want to listen at 1x. For 99% of talks at conferences which are mostly just a way of communicating technical data, a text transcription that is then reduced in word count by 50% is probably only a very small loss (if that), and a 90%+ time savings.
This gives me an idea for a website. All of the talks of a conference, audio transcribed and LLM summarized into 3-minute reads.
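The core pipeline is simple enough to sketch. Assuming openai-whisper for the transcription and an OpenAI chat model for the summary (the model name and prompt are my own placeholders):

```python
# Sketch: transcribe a conference talk and compress it into a ~3-minute read.
# Assumes the openai-whisper and openai packages are installed and that
# OPENAI_API_KEY is set; the model names here are placeholders.
import whisper
from openai import OpenAI

def summarize_talk(audio_path: str) -> str:
    transcript = whisper.load_model("base").transcribe(audio_path)["text"]
    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{
            "role": "user",
            "content": "Summarize this conference talk into a ~3-minute read, "
                       "keeping the concrete technical details:\n\n" + transcript,
        }],
    )
    return response.choices[0].message.content

print(summarize_talk("talk_01.mp3"))
```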
It might be worth doing the whole INFOCON archive…
The slides of a good presentation are worthless without the presentation itself. If the deck is valuable in and of itself, it could have just been an email or word doc in the first place.
Well, it's not the reality of most slides I've seen. Most of them seem to be a pretty good summary of the talk. Weirdly, some of them contain more information than the talk.
I do believe most presentations I've seen could've been an email or an article. So I guess I agree with you?
> I do believe most presentations I've seen could've been an email or an article. So I guess I agree with you?
Yeah, I really should have said that in my original post. Most presentations could have been a one-pager, and for any presentation worth sitting through, the slides alone aren't worth having.
Always recording is a good practice I think. It's so cheap with video conferencing that you might as well. Even if nobody uses it later, it didn't cost much. And if you get that one presentation that provides stellar value it's a gift that keeps on giving.
I don't really agree that a recording is always better than the slides. Slides are a text medium, and as such can be searched. You can also go through them much, much faster than through a recording (even if you can listen at 2x). If you're just looking for something specific, slides can be much better.
And sometimes you need to get the whole experience. And then the recording is much better.
Yeah, recordings are fine for those who missed it. And with video conferencing recording is so easy that you might as well do it, living the motto "better to have it and not need it, than to need it and not have it".
But when someone specifically asks for slides, it just feels like a dick move to say "you don't want the slides, rewatch the whole presentation instead".
Sometimes you're just looking for the link on slide 45, the pithy problem description on slide 5, or, y'know, you just want to quickly go through the main points again.
Why would I want to listen or watch a presentation (even sped up), when I can read a transcript many times faster, can scan through for the bits that are most relevant, and can quickly jump back to review something if I want to?
It's only when you read the transcript of pretty much any presentation or podcast that you realise how superficial most are and how low the information density actually is.
There exists a slider at the bottom of most videos you can click and drag to your preferred location /s
A video of the presentation is pretty much always better than just the slides. Even if you got the slides you'd have to click through them to find the one you were looking for. Your argument could just as easily be phrased:
"What do you expect people to do with that? Click through and read every slide?"
And it would make about as much sense as the original argument (none).
> Your argument could just as easily be phrased: "What do you expect people to do with that? Click through and read every slide?"
I've had considerable practice at reading. Learned it at a young age, and I got to be pretty good at it over the years. I can get through a slide deck much faster reading it than watching a presentation.
Thank you for pointing out that watching the presentation and clicking through the slides takes you just as long. I assumed most people were at my level of reading speed. It must've been hard coming forward like that. I'm sorry I made you go through that. In the future I will check my privilege.
I'm glad someone else had the same thought. I have been wondering what their "secret sauce" is for a while given how their model doesn't degrade for long-context nearly as much as other LLMs that are otherwise competitive. It could also just be that they used longer-context training data than anyone else though.
Is there any effort to organize scientific literature like this? I know journals often generate tags for papers, but those can often be quite poor and restricted to the field the journal is in. I would happily join a volunteer effort to create tags and do some tag-wrangling for scientific literature in my research area.
Have been searching for a deep research tool that I can hook up to both my personal notes (in Obsidian) and the web, and this looks like it has those capabilities. Now the only piece left is to figure out a way to export the deep research outputs back into my Obsidian somehow.
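Since an Obsidian vault is just a folder of Markdown files, the export half could be as small as this sketch; the vault path and frontmatter are my own assumptions, not anything the tool prescribes:

```python
# Sketch: save a deep-research output as a new note in an Obsidian vault.
# The vault path and the frontmatter fields are assumptions for illustration.
from datetime import date
from pathlib import Path

VAULT = Path.home() / "Obsidian" / "MyVault" / "Research"

def save_report(title: str, markdown_body: str) -> Path:
    VAULT.mkdir(parents=True, exist_ok=True)
    note = VAULT / f"{date.today()} {title}.md"
    note.write_text(f"---\ntags: [deep-research]\n---\n\n{markdown_body}\n",
                    encoding="utf-8")
    return note
```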
Sometimes I want to do a little coding to automate things with my personal productivity tool, so I feel the programmatic interface that an open source implementation like this provides is very convenient.
Being able to control how many tokens are spent on thinking is a game-changer. I've been building fairly complex, efficient systems with many LLMs. Despite the advantages, reasoning models have been a no-go due to how variable the cost is, and how hard that makes it to calculate a final per-query cost for the customer. Being able to say "I know this model can always solve this problem in this many thinking tokens," and thus limit the cost for that component, is huge.
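For example, with an Anthropic-style API the cap looks roughly like this; the parameter names and model ID reflect my understanding of that one API and are not a universal feature:

```python
# Sketch: cap how many tokens the model may spend "thinking", so the
# worst-case cost of this component is known up front. Parameter names follow
# the Anthropic Messages API as I understand it; the model ID is a placeholder.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
response = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=2000,  # ceiling on the visible answer
    thinking={"type": "enabled", "budget_tokens": 1024},  # ceiling on reasoning
    messages=[{"role": "user", "content": "Classify this support ticket: ..."}],
)
print(response.content[-1].text)  # the final text block, after any thinking block
```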
Is stochastic calculus something that requires a computer to simulate many possible unfoldings of events, or is there a more elegant mathematical way to solve for some of the important final outputs and probability distributions if you know the distribution of dW? This is an awesome article. I've seen stochastic calculus before, but this is the first time I really felt like I started to grok it.
In case the other responses to your question are a little difficult to parse, and to answer your question a little more directly:
- Usually, you will only get analytic answers for simple questions about simple distributions.
- For more complicated problems (either because the question is complicated, or the distribution is complicated, or both), you will need to use numerical methods.
- This doesn't necessarily mean you'll need to do many simulations, as in a Monte Carlo method, although that can be a very reasonable (albeit expensive) approach.
More direct questions about certain probabilities can be answered without using a Monte Carlo method. The Fokker-Planck equation is a partial differential equation which can be solved using a variety of non-Monte Carlo approaches. The quasipotential and committor functions are interesting objects which come up in the simulation of rare events that can also be computed "directly" (i.e., without using a Monte Carlo approach). The crux of the problem is that applying standard numerical methods to the computation of these objects faces the curse of dimensionality. Finding good ways to compute these things in the high-dimensional case (or even the infinite-dimensional case) is a very hot area of research in applied mathematics. Personally, I think unless you have a very clear physical application where the mathematics map cleanly onto what you're doing, all this stuff is probably a bit of a waste of time...
Thanks for the explanation, this was very helpful. You've given me a whole new list of stuff to Google. The quasipotential/committor functions especially seem quite interesting, although I'm having a bit of trouble finding good resources on them.
They are pretty advanced and pretty esoteric. They will be very difficult to get into without a solid graduate background in some of this stuff, or unless you're willing to roll up your sleeves and do some serious learning. The book "Applied Stochastic Analysis" by Weinan E, Tiejun Li, and Eric Vanden-Eijnden is probably a decent place to start. I took a look at this book a while ago, and it's probably decent enough to get a foothold on the literature in order to figure out if this stuff will be useful for you. These guys are all monsters in the field.
It depends a bit on exactly what you want to calculate, but in general things like the probability density function of the solution of a stochastic differential equation (SDE) at time t satisfies a partial differential equation (PDE) that is first order in time and second order in space [0]. (This PDE is known to physicists as the Fokker-Planck equation and to mathematicians as the Kolmogorov forward equation.) Except in special examples, the PDE will not have exact analytical solutions, and a numerical solution is needed. Such a numerical solution will be very expensive in high dimensions, however, so in high-dimensional problems it is cheaper to solve the SDE and do Monte Carlo sampling, rather than try to solve the PDE.
Edit: sometimes people are interested in other types of questions, for example the solution when certain random events occur. Analogous comments apply. Also, while stochastic calculus is very useful for working with SDEs, if your interest is other types of Markov (or even non-Markov) processes you may need other tools.
Edit again: as another commenter mentioned, in special cases the SDE itself may also have exact solutions, but in general not.
[0] This statement is specific to stochastic differential equations, i.e., a differential equation with (gaussian) white noise forcing. For other types of stochastic processes, e.g., Markov jump processes, the evolution equation for distributions have a different form (but some general principles apply to both, e.g., forms of the Chapman-Kolmogorov equation, etc).
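To make that concrete, here is the standard one-dimensional form (textbook notation, nothing specific to this thread): the SDE and the forward equation its density satisfies.

```latex
% 1D SDE and the Fokker-Planck / Kolmogorov forward equation for its density.
dX_t = b(X_t)\,dt + \sigma(X_t)\,dW_t
\qquad\Longrightarrow\qquad
\frac{\partial p}{\partial t}(x,t)
  = -\frac{\partial}{\partial x}\bigl[b(x)\,p(x,t)\bigr]
  + \frac{1}{2}\,\frac{\partial^2}{\partial x^2}\bigl[\sigma^2(x)\,p(x,t)\bigr]
```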
Certain simple stochastic differential equations can be solved explicitly analytically (like some integrals and simple ordinary differential equations can be solved explicitly), for example the classic Black Scholes equation. More complicated ones typically can't be solved in that way.
What one often wishes to have is the expectation of a function of a stochastic process at some point, and what can be shown is that this expectation obeys a certain (deterministic) partial differential equation. This then can be solved using numerical PDE solvers.
In higher dimensions, though, or if the process is highly path-dependent (not Markovian), one resorts to Monte Carlo simulation, which does indeed simulate "many possible unfolding of events".
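As a minimal picture of that last option, here is an Euler-Maruyama Monte Carlo sketch for a toy Ornstein-Uhlenbeck SDE; the parameters are arbitrary:

```python
# Sketch: Monte Carlo for the toy SDE dX = -theta*X dt + sigma dW using
# Euler-Maruyama. Each path is one "possible unfolding of events"; averaging
# over many paths estimates statistics of X at time T.
import numpy as np

theta, sigma, x0 = 1.0, 0.5, 1.0
T, n_steps, n_paths = 1.0, 1_000, 10_000
dt = T / n_steps

rng = np.random.default_rng(0)
x = np.full(n_paths, x0)
for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), size=n_paths)
    x += -theta * x * dt + sigma * dW

print("estimated E[X_T]   =", x.mean())  # exact: x0 * exp(-theta*T)
print("estimated Var[X_T] =", x.var())   # exact: sigma^2/(2*theta) * (1 - exp(-2*theta*T))
```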
It has been a while since I studied along these lines (stochastic chemical reaction simulations in my case), but I think the answer is often yes, though not always. A random walk, for example, will be a normal distribution (you know the mean, and you know the variance is going to infinity), so in that case I do think you end up with an elegant analytical solution, if I'm understanding correctly, since the inputs determine the function the variance follows through time.
But often no, you need to run a stochastic algorithm (e.g. Gillespie's algorithm in the case of simple stochastic chemical kinetics) as there will be no analytical solution.
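For flavor, a tiny sketch of Gillespie's algorithm on a birth-death process; the rates are made up:

```python
# Sketch: Gillespie's stochastic simulation algorithm for a birth-death
# process, 0 --k_birth--> X and X --k_death--> 0. Rates are arbitrary.
import random

def gillespie_birth_death(k_birth=10.0, k_death=0.5, x0=0, t_max=20.0):
    t, x = 0.0, x0
    trajectory = [(t, x)]
    while t < t_max:
        a_birth, a_death = k_birth, k_death * x   # reaction propensities
        a_total = a_birth + a_death
        t += random.expovariate(a_total)          # exponential waiting time
        if random.random() < a_birth / a_total:   # choose which reaction fires
            x += 1
        else:
            x -= 1
        trajectory.append((t, x))
    return trajectory

# The steady-state mean copy number is k_birth / k_death (20 for these rates).
print(gillespie_birth_death()[-1])
```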
For normal distributions I think you do - Black-Scholes is an analytical solution to option pricing. It's been a while since I studied stochastic calculus.
I question why this is the second-highest article on Hacker News currently. I can't imagine many people reading this website are REALLY in this field or a related one; maybe it's just signaling, like saying you have a copy of Knuth's books or that famous Lisp one.
This is one of those archetypal submissions on HN: mathematics (preferably pure, using the word "calculus" outside of integrals/derivatives gives additional points), moderately high number of upvotes, very few comments. Pretty much the opposite of political posts, where everyone can "contribute" to the discussion.
I upvote so it sticks around longer, so it has a better chance of generating interesting comments.
I also upvote because I find it interesting to learn about stuff I didn't know about. I might not understand it, but I do like the exposure regardless.
Depends on what you want to know. If you want to get some trajectories then simulation of the stochastic differential equation is required. But if you just want to know the statistics of the paths, then in many cases you can write and try to solve the Fokker-Planck equation, which is a partial differential equation, to get the path density.
I'm curious from anyone who has done it. Is there any "pleasure" to be had in learning or implementing assembly (like there is for LISP or RISC-V) or is it something you learn and implement because you want to do something else (like learning COBOL if you need to work with certain kinds of systems). It has always piqued my interest but I don't have a good reason in my day-to-day job to get into it. Wondering if it is worth committing some time to for the fun of it.
I did the first 27 chapters of this tutorial just because I was interested in learning more and it was thoroughly enjoyable: https://mariokartwii.com/armv8/
I actually quite like coding in assembly now (though I haven’t done much more than the tutorial, just made an array library that I could call from C). I think it’s so fun because at that level there’s very little magic left - you’re really saying exactly what should happen. What you see is mostly what you get. It also helped me understand linking a lot better and other things that I understood at a high level but still felt fuzzy on some details.
Am now interested to check out this ffmpeg tutorial bc it’s x86 and not ARM :)
This looks to be very cool, will check it out. Wild to see it on a Mario Kart Wii site, but I guess modders/hackers are one of the groups of people who still need to work with assembly frequently.
Learning at least one assembly language is very rewarding because it puts you in touch with the most primitive forms of practical programming: while there are theoretical models like Turing machines or lambda calculus that are even more simplistic, the architectures that programmers actually work with have some forgiving qualities.
It isn't a thing to be scared of - assembly is verbose, not complex. Everything you do in it needs load and store, load and store, millions of times. When you add some macros and build-time checks, or put it in the context of a Forth system (which wraps an interpreter around "run chunks of assembly", enabling interactive development and scripting) - it's not that far off from C, and it removes the magic of the compiler.
I'm an advocate for going retro with it as well; an 8-bit machine in an emulator keeps the working model small, in a well-documented zone, and adds constraints that make it valuable to think about doing more tasks in assembly, which so often is not the case once you are using a 32-bit or later architecture and you have a lot of resources to throw around. People who develop in assembly for work will have more specific preferences, but beginners mostly need an environment where the documentation and examples are good. Rosetta Code has some good assembly language examples that are worth using as a way to learn.
One “fun” thing about it is that it’s higher level than you think, because the actual chip may do things with branch prediction and pipelining that you can only barely control.
I remember a university course where we competed on who could have the most performant assembly program for a specific task; everyone tried various variants of loop unrolling to eke out the best performance and guide the processor away from bad branch predictions. I may or may not have hit Ballmer Peak the night before the due date and tried a setup that most others missed, and won the competition by a hair!
There’s also the incredible joy of seeing https://github.com/chrislgarry/Apollo-11 and quipping “this is a Unix system; I know this!” Knowing how to read the language of how we made it to the moon will never fade in wonder.
Learning assembly was profound for me, not because I've used it (I haven't in 30 years of coding), but because it completed the picture - from transistors to logic gates to CPU architecture to high-level programming. That moment when you understand how it all fits together is worth the effort, even if you never write assembly professionally.
While I think that learning assembly is very useful, I think that one must be careful about applying assembly language concepts in an HLL (C/C++/Zig...).
For example, an HLL pointer is different from an assembly pointer(1).
Sure the HLL pointer will be lowered to an assembly language pointer eventually but it still has a different semantic.
1: because you're relying on the compiler to use the registers efficiently, HLL pointers must be restricted; otherwise programs would be awfully slow as soon as you'd use one pointer.
This, out of everything, convinced me. The more I get the "full picture", the more I appreciate what a wondrous thing computers are. I've learned all the way down to Forth/C, and from the bottom up to programming FPGAs with Verilog, so assembly may be just what I need to finally close that last gap.
I have spent the last ~25 years deep in assembly because it's fun. It's occasionally useful, but there's so much pleasure in getting every last byte where it belongs, or working through binaries that no one has inspected in decades, or building an emulator that was previously impossible. It's one of the few areas where I still feel The Magic, in the way I did when I first started out.
Learning assembly is really valuable even if you never write any. Looking at the x64 or ARM64 assembly generated by, e.g., the C or C# you write can help you understand its performance characteristics a lot better, and you can optimize based on that knowledge without having to drop down to a lower level.
Of course, most applications probably never need optimization to that degree, so it's still kind of a niche skill.
If you're working with C++ (and I'd imagine C), knowing how to debug the assembly comes up. And if you've written assembly it helps to be aware of basic patterns such as loops, variables, etc. to not get completely lost.
Compilers have debug symbols, you can tune optimization levels, etc. so it's hopefully not too scary of a mess once you objdump it, but I've seen people both use their assembly knowledge at work and get rewarded handsomely for it.
If you want to get the ultimate performance out of a processor, understanding assembly is paramount. Writing it by hand is less critical today than it was in the days of old 8- and 16-bit CPUs when memory was at a premium, instruction cycle counts were known constants, and sequential execution was guaranteed. But being able to read your compiler's output and understand what the optimizer does is a huge performance win.
Yes, it is definitely worth it. You get a much better understanding of CPU architectures. Also, most of your knowledge will be applicable to any platform.
Depends on whether you have a suitable problem, and the problem domains have shrunk greatly over the last 40 years. I've used it for bit-twiddling in real-time data acquisition; for implementing a GUI (on OS/2 1.0, which didn't come with one) and for acquiring a handy character set for the same from DOS; for accessing some obscure features of the PC architecture (thanks, Thom Hogan); for trying unsuccessfully to access paged screen memory with an RSX on the Z80 based Amstrad; and for successfully implementing an RSX to quash high-bit characters from the keyboard on the same to deal with an editor which had an array of only 128 characters for deciding what to do with input.
All of them apart from the screen memory thing were fun, but the only one which could be useful these days is the bit-twiddling. All the rest have been made obsolete by improved operating systems, so that the domain of useful assembler programs shrinks ever further. OTOH, debugging them is vastly easier than in the old days where all you got was random lines drawn across your screen as the system crashed, and you couldn't even single-step because you had to bank-switch out.
I learned 8086 (not x86) assembly in a university course during my bachelor's degree and won a contest to create the first correct implementation that would play "Jingle Bells" on the PC speaker[0] attached to the custom-built computer.
That was very fun and I kept playing around with assembly a bit afterwards, but never got around to learning any of the extensions made in x86 assembler and beyond.
In my master's degree, there was another course where one built their own computer PCB in Eagle, got it fabbed, and then had to make a game for the 8052 CPU on there. 8052 assembly is very fun! The processor has a few bytes of RAM where every bit is individually addressable and testable. I built the game Tetris on three attached persistence-of-vision LED matrices[1]. Unfortunately, the repository isn't very clean, but I used expressive variable names, so it should be readable. I did create my own calling convention for performance reasons and calculated how many CPU cycles were available for game logic between screen refreshes. Those were all very fun things to think about :)
Reading assembly now has me look up instruction names here and there, but mostly I can understand what's going on.
It was cool back in the day, when the alternative was BASIC, and also during the demoscene's early days.
Nowadays most of that can be done with intrinsics, which were already present in some 1960s system programming languages, predating UNIX by a decade.
Modern Assembly is too complex, it is probably easier to target retrogaming, or virtual consoles, if the purpose is having fun.
There's still a lot of reasons to learn it to apply your skills, not just because you want to do it for fun. It's quite helpful when debugging, critical in fields like binary security or compilers, and basically the whole game if you're writing (say) SIMD algorithms.
I'm about 60% of the way through learning RISC-V, I'm enjoying it, and my use case is being able to embed some assembly in ESP32 code.
A few years ago I embarked on learning ARM assembly, I also got far, but I found it more laborious somehow. x64 is just too much for me to want to learn.
Depends on the ISA. ARM32 is a lot more enjoyable to work with than x86-64. In-order VLIW architectures like TileGX and Blackfin (IIRC) are fun if you like puzzles. Implementing tight loops of vectorized operations on most any ISA is similarly entertaining.
Given there’s a mini genre of games that emulate using assembly to solve puzzles the answer is clearly yes. Not sure if any of them teach a real language.
The most popular are the Zachtronics games and Tomorrow Corp games. They’re so so good!
I took a course in it in college. Extreme fun. Currently, Python microservices don't have much need of this exact skill, but it gave me a significant confidence boost at the time that I actually knew what was going on.
I mean, some people are interested in computers. Some people are interested in performance. Some people like to understand, at a very fundamental level, how the things they work with and use regularly actually work; it's not like understanding assembly is like trying to understand computing via physics, it is directly a part of the process. I think there was a time when many people found it exciting to learn, and there still are some, but now there are so many non-technical programmers working in the field, making web pages, etc., that it's a minority percentage compared to earlier times.