Hacker News | sclangdon's comments

Isn't this the case no matter who wrote the code? How do you ever run anything if you're worried about bugs?

When I write the code myself, I'm not worried that I snuck a `git reset --hard` somewhere.

Do you only run code you wrote yourself?

I depend on the community to vet libraries that I add to my stack. The community of people.

When I use AI to write code, I have absolutely no guarantee about what it just did, so I have to read through it all carefully.


Different type of creator, different types of bugs. I'd assume a human giving me a way to delete merged branches has probably had the same issue, solved the same problem, and understands the unspecified context around it (e.g. protecting local data). They probably run the tool themselves, so bugs are most likely to occur in edge cases around non-standard use, since it works for them.

AIs are giving you what they get from common patterns, parsed documentation, etc. Depending on what you're asking, this might be an entirely novel combination of commands never run before. And depending on the model/prompt, it might solve the problem in a way any human would balk at (push main to origin, delete .git, re-clone from origin. Merged local branches are gone!)
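For contrast, the conventional human-written approach to "delete merged local branches" is a short, inspectable one-liner. This is only a sketch: it assumes the default branch is named main, and `xargs -r` is a GNU extension.

```shell
# Delete local branches that are already merged into main.
# --merged lists only branches whose commits are reachable from main,
# and the lowercase -d flag refuses to delete unmerged work, so no
# local data can be lost. xargs -r (GNU) skips the delete entirely
# when nothing matches.
git branch --merged main \
  | grep -v -e '^\*' -e ' main$' \
  | xargs -r git branch -d
```

Note how the dangerous step is simply absent: `branch -d` fails loudly on anything unmerged instead of improvising around it.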

It's like the AI art issues: people struggle with relative proportions and tones and with making it look real. AI has no issues with tones, but will add extra fingers or arms, which humans rarely struggle with. You have to look for different things, and AI bugs are definitely more dangerous than (most) human bugs.

(It depends a little; it's pretty easy to tell if a human knows what they're talking about. There are for sure humans who could write super destructive code, but other elements usually make you suspicious and worried about the code before that.)


> ...a way any human would balk at (push main to origin, delete .git, re-clone from origin. Merged local branches are gone!)

Ahem... But yeah, then I -- or we, I and my team at the time -- (somewhat, at least rudimentarily) learned git.


It makes a difference whether an AI or a human wrote it. AIs make more random, inconsistent errors or omissions that a human wouldn't make. AIs also don't dogfood their code the way human developers of tools usually do, which catches many errors and unfit or missing logic.

He may be referring to the fact that it could be pronounced Auschwitz. I must admit, my immediate thought on reading the name was "why would someone name their app after a concentration camp?"


wow, I learned something new today


I had the same reaction. Even just a hyphen, like aws-viz, would probably do wonders.


Yes, I think any minor tweak would remove the effect: awsvizul, vizaws, awsvista.

It’s just a bit too on the nose at present.


This is also what I immediately saw and heard.


Most games don't have the time budget for that. Visuals take precedence, and it's not easy to get to 60fps as it is, especially if you're doing a lot of other processing.

And stealth games especially will probably rely a lot on shadows and other visual things, which make rendering more expensive.


That's true, I just like it when stealth games go beyond player visibility. Some of my inspirations do that:

- Thief Gold: sound was important; each surface made a certain amount of noise, and you could damp the noise by covering the ground in moss, or reduce it by walking slower.

- Splinter Cell Chaos Theory: it had a sound meter with two indicators, one for the noise you are making and one for the background noise. I especially like this system because of how it allows you to make much better-informed decisions.

- Sniper Elite: you can cover the sound of your rifle with loud background noises on some maps; they are regular, and they have cue moments before the loudest part.


Not PvE, but Hunt Showdown's use of sound is worth mentioning. Like you mentioned, every surface has a sound and a level, animals react to your sight and sound and make their own noises, weather and environmental effects cover or uncover sounds, and you can use items, gunfire, or anything else as a decoy. Really great. It's all PvP, though, so you're trying to sneak up on actual people.


I realise for casual users Windows is always going to be the OS of choice if for no other reason than it comes pre-installed and most people don't know how to reinstall an operating system.

However, Windows may be in trouble with more tech-literate people who do know how to change it. I can only speak for myself, but I've been a Windows user since 95. All but one of my programming jobs over the last 20 years have also been working on Windows. But I really dislike the direction Microsoft are taking and I find Windows to be terribly slow these days, with each version seemingly worse than the previous one. So I decided to look elsewhere.

A couple of months ago I bought a new laptop with the express intention of running Linux on it and giving it a good college try (I didn't want to mess around with dual-booting and I still need Windows on my main PC for work... for now). I know very little about Linux, but I've decided I'm not going to use Windows past 10 so it's time to find something else.

I went with Debian running dwm (Debian because I value stability over everything else, and dwm because I like the suckless philosophy) and it's honestly surprised me how good it's been. It's SO snappy. Everything is instant. It's really been a breath of fresh air.

I was especially dreading programming since I've solely used Visual Studio since Visual C++ 4.0 and don't really know anything else. Anyway, I went all-in and started learning Vim, GDB, and Make, and boy do I feel like I've been missing out. I'm really enjoying programming again, which for me has just become a job over the years.

Anyway, my point is, if tech-literate people are willing to give Linux a try, I wonder how many of them would be as surprised as I was and may make the switch permanently. With Windows getting worse, and Linux getting better, maybe more than ever.


People occasionally ask me what my best advice is for becoming a great programmer, and they are surprised when I say vim, bash (including for and while loops), and core tools like sed, grep, and awk/cut.

When you know enough bash to (without having to look it up every time) write a command that filters (sed) and parses (awk, cut) and then loops (while, for) you will be amazed at what you can do and how quickly you can do it. Then add Vim and you can fly through tasks near the speed of thought.
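As a concrete (and entirely hypothetical, the log file and format are invented) illustration of that grep/awk/loop combination, here is a pipeline that finds which clients are hitting missing pages in a web server log:

```shell
# Hypothetical access.log where field 1 is the client IP and the
# HTTP status code appears surrounded by spaces.
# grep filters to 404 responses, awk parses out the IP,
# sort | uniq -c counts per IP, and the while loop formats the result.
grep ' 404 ' access.log \
  | awk '{ print $1 }' \
  | sort | uniq -c | sort -rn \
  | while read -r count ip; do
      printf '%s requested %s missing pages\n' "$ip" "$count"
    done
```

Each stage is a tiny, testable transformation, which is exactly why this style composes so quickly once the tools are in muscle memory.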

In summary, I think you've made a great choice!


Never felt that text replacement is among the tasks I do often.


Oops typo, where I said "filters (sed)" I actually meant "filters (grep)"

Yeah, text replacement isn't super common. I do use it every so often, but grep I use probably dozens of times per day


Interesting to know that my Z80, Amiga and PC demoscene skills were worthless until I got to use Xenix.


It's not that we think it's arcane or that we are in our own "bubbles of thought", it's that we aren't doing math. We're programming a computer. And a competent programmer would know, or at least suspect, that doing it with logarithms will be slower and more complicated for a computer. The author even points out that even he wouldn't use his solution.

P.S. Please look up the word literally.


I'm having a hard time imagining a situation where "printing out the number in a human-readable format" is more time-consuming than "figuring out what the number is".

I think a competent programmer might also ask themselves "am I prematurely optimizing?" if their first instinct is to pick the method that only works on a computer. I've operated in this space long enough that bit shifting is synonymous with doing the logarithm in my mind, but if I had to explain how my code works, I would use the logarithm explanation. I would be sure to point out that the computer does log (base 2) of a number much much MUCH faster than any other base.

It's probably excessive to say that literally everyone is taught logarithms as the ideal solution to this problem, but logarithms are almost universally introduced by explaining how the log (base 10) of a number relates to the number of digits in that base-10 number. So if you completed a high school education in the United States, you have almost certainly heard that much at least.

edit: printing out the number is almost always gonna be faster than figuring out the value of the number, if the speed of the operation matters. My original post implied the opposite. Part of being a competent programmer is recognizing that optimizing is sometimes bikeshedding.
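The relationship both sides of this thread lean on can be sketched in a couple of lines of shell (using awk for the math, since POSIX shell has no floating point). One caveat worth a comment: floating-point log can round the wrong way right at exact powers of ten, which string length never does.

```shell
n=2023
# Digit count by string length: always exact.
echo "${#n}"    # prints 4
# Digit count via the textbook identity digits = floor(log10(n)) + 1,
# computed as ln(n)/ln(10). Beware rounding at exact powers of ten.
awk -v n="$n" 'BEGIN { print int(log(n) / log(10)) + 1 }'
```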


The author's final suggested solution at the bottom of the article still relies on logarithms.

> doing it with logarithms will be slower and more complicated for a computer

This is a fascinating point of view, and while it isn't wrong from certain "low-level optimization golf" viewpoints, it is in part based on old assumptions from early chipsets that haven't been true in decades. Most FPUs in modern computers will do basic logarithms in nearly as many cycles as any other floating-point math. It is marvelous technology. That many languages wrap these CPU features in what look like library function calls, like Math.log(), instead of having some sort of "log operator" is largely a historic accident of mathematical notation, and of the fact that logarithms used to be extremely slow for a human.

Logarithms used to be the domain of lookup books (you might have one or more volumes, if not a shelf-full), and they were one of the keys to the existence of slide rules, and the reason an engineer would actually have a set of slide rules in different logarithmic bases. Mathematicians would spend lifetimes doing the complex calculations to fill a lookup book of logarithmic data.

Today's computers excel at it. Early CPU designs saved transistors and made logarithms a domain of application/language design. Some of the most famous game designs did interesting hacks, pre-computing logarithm tables for a specific set of needs and embedding them in ROM as a useful memory-versus-CPU-time trade-off. Today's CPU designs have plenty of transistors, and logarithm support in hardware is just about guaranteed. (And that's just CPU designs; GPU designs can be logarithmic monsters in how many logarithms they can compute and how fast.)

Yesterday's mathematicians envy the speed at which a modern computer can calculate logarithms.

In 2023, if you are trying to optimize an algorithm away from logarithms to some other mix of arithmetic, you are either writing retro games for a classic chipset like the MOS 6502, stuck by your bosses in a history-challenged backwards language such as COBOL, or massively prematurely optimizing what the CPU can already optimize better for you. I wish that were something any competent programmer would know, or at least suspect. It's 2023; it's okay to learn to use logarithms like a mathematician, because you aren't going to need the "optimization" of bit shifts and addition/subtraction/multiplication/division that obscures your actual high-level algorithmic need and complexity.


Maybe, but a circuit designer is pretty high up the list of things that need to be boring and functional.


Kim Justice has a good documentary on Sensible Software, which I highly recommend if you're at all interested in Sensible Soccer or their other games like Cannon Fodder or Mega Lo Mania.

https://youtu.be/lJWro7NGBKo


Things such as function overloading, operator overloading, dynamic dispatch, etc., are all convenient but also obscure what is actually happening.

To take C++ as an example (I love C++ btw, this isn't another C++ bashing), a line as simple as "variable1 = variable2;" could be doing all kinds of things (including not even assigning) because of the ability to overload the assignment operator. As someone reading this line of code for the first time, you don't really know for sure what is actually happening unless you also go and read the code for "=".


Though COBOL infamously has the ALTER statement, which allows you to rewrite code at run time ;)


My first job was as a COBOL programmer on an IBM mainframe (OS/370 I believe) for Xerox. The system was absolutely massive and incredibly difficult to navigate. There were thousands of files with names that only had 6 characters. So everything was a two-letter system code, followed by four digits. LP0456, OP0234, SB1245, etc.

Then, in each of those files the variables had similar names in order to cut down on unique identifiers. WP100, WP101, up to the max number you needed at any given time. It was a nightmare. There was as much external documentation to keep track of all of this stuff as there was code.

On top of that, you could only build programs overnight in a batch process with the rest of the system, which was run by a different team in a different country (Spain, I believe). So you would write your program in the day, phone up the guys in Spain and explain that there was a new program going in and you were responsible for it. Problem was, if your program failed to build it would hold up the whole system, which HAD to be up and running and live again by the morning. This meant that every time you made a change that went into the nightly build you also had to be on call.

I remember eagerly watching the build queue at 1AM, waiting for my file's turn. Then immediately shutting everything down and going to sleep the moment it was successful. If it wasn't successful, well, then you had to fix it on the spot and phone Spain again to tell them to give it another try. Unfortunately, you rarely knew why it was failing.


Filenames? We used to DREAM of having filenames!

Writing COBOL on the UNIVAC 490 series (30 bit word, FIELDATA character set), our files were on tapes. A tape request was submitted to the tape library, they would pull the tape(s) containing your file(s), and deliver them to the operations floor.

The customer would submit hand-written transactions, which would be sent to the keypunchers, who would punch them onto cards.

The tapes and transaction cards would be sent to the ops floor (where there was the actual computer). The operator would load the first tape in each file onto a UNISERVO tape drive, and go to the patch panel to run wires to assign logical devices to physical devices. ( https://www.computerhistory.org/collections/catalog/10266686... see page 57)

Then they would put the job deck into the card reader and enter the "UR" (read the Unit Record device) command into the console, which had a spool of paper, not a screen. When the program was awaiting input, they would load the transaction cards and give the console command to continue the program.

We did have a fearsome beast of a FASTRAND drum system for random-access storage, but that was just for executable code and temporary files.

Then you might well get called in the middle of the night, because one of the programs in the job aborted. They called it "aborted" then. Or ABEND (ABnormal END), if you were an IBM person. You would come in, read the core dump (reading core dumps was a valuable skill), and determine that there was a data problem in a transaction (e.g. an alpha character in a numeric field). You would find the offending card, pull it, tear it in half and set it aside to be addressed in the morning as a missing transaction, and tell ops to rerun the job. With luck, there were no bad cards after that one.

Tell that to kids today, and they won't believe you.


> ... ABEND ...

Sorry, I don't know German.


Luxury.


I shall never complain about resolving merge conflicts or dependency issues ever again.


Good God that's awful lol. Say what you will about the state of software development today but at least we aren't there lmao


I don't really understand this idea of never taking your hands off the keyboard. Maybe people program differently to me, but most of the time I'm not typing anything. Most of my time is spent thinking. When my thoughts are clear and the problem is solved, then I type. And when I do, it's usually no more than a dozen lines at a time.

I get the impression from these people that they are constantly typing things. In fact, they're typing so much that they can't possibly waste valuable seconds using a mouse. I must be misunderstanding what they mean because that just can't be right.

And what's with the "you can achieve the same thing faster, without breaking your concentration" in regard to using a spell-checker or a calculator or whatever. Are you being serious? I can achieve the same thing faster? I mean how long do you think it takes to check the spelling of a word? Even if I must look it up in a physical dictionary, how long are we talking here?

Guys, seriously, slow down. You're going to burn out. I don't want to judge because I don't know you. Maybe you're a rockstar, but I'd guess that if you're really going this fast, the quality of your code is suffering.


> In fact, they're typing so much that they can't possibly waste valuable seconds using a mouse. I must be misunderstanding what they mean because that just can't be right.

Yes, you are misunderstanding. First, it's not only about typing; it's generally about doing whatever you are trying to do without unnecessary delays, including navigation between functions, files, windows, etc. Second, it's not just about saving seconds here and there: the end goal is to stay in the flow state, avoiding unnecessary interruptions and context switches. Every time you reach for the mouse and move the cursor or scroll a document, you stop thinking about the problem at hand because you're focusing on the motion, and when you're done with the mouse you waste mental energy resuming your previous train of thought, maybe even forgetting something.

> Even if I must look it up in a physical dictionary, how long are we talking here?

This is actually a good example because it seriously disrupts your thinking by forcing you to pause for some seconds and completely focus on something else. I am sure proof-reading a long document in this way is much more mentally exhausting (and slower, but again speed is not the point).


Your experience is very different from mine!

Using a mouse does not distract me at all. I don't think about it, consciously, any more than I think about the motions my fingers make as I operate the keyboard. I'm not thinking about the tools, I'm thinking about what I'm doing through the tools; my hands move automatically.

I suppose it is like learning a musical instrument. At first you have to learn how to operate the instrument, practicing the motions to build up muscle memory. Then you start learning to play notes through the instrument. Eventually you stop thinking about the instrument, or the notes, because all that has become habit, and you just think about the music you are making; the instrument feels like an extension of your body.

If you have had a long-standing preference to use only the keyboard, and not the mouse, perhaps the mouse feels distracting for you because it is not part of the instrument you have learned to play. Of course this could be a self-perpetuating tendency.


> Every time you reach for the mouse and move the cursor or scroll a document you stop thinking at the problem at hand because [...]

... because I have well over 200 of my own keybindings in my spacemacs dotfile already, and quite a few of them don't do what I need in some particular situation, or some underlying package got broken, or, or, or...

> you're focusing on the motion, and when you're done with the mouse you waste mental energy resuming your previous train of thought, maybe even forgetting something.

That's because it's already half past 3 p.m. and, except for having two or three coffees, I haven't eaten anything yet.

Besides, I highly doubt that if I don't get a Nobel Prize or Turing Award, it will be because I don't keep my fingers on the home row.


If you’re thinking, you don’t even need to be at the computer.

I can’t speak for others, but I keep my hands on the keyboard a lot of the time because I’m reading. As a (neo)vim geek, the key is to be really good at moving through the code. Being able to jump from where you are to where you want to be in a file far away without stopping allows you to read the code in a more linear way than it’s written.


It's also about doing less movement with the arm. I just got a big screen and my desire to avoid the mouse increased a lot (increasing the cursor speed mitigates this a bit, however). An external keyboard (one that has a numpad) may force you to spread your arm a bit more than is ideal.

It's also less cognitive load when the keyboard shortcuts are internalized. This may be worth it for operations you do a lot.

> Even if I must look it up in a physical dictionary, how long are we talking here?

This, however, takes an infinite time.


It's not really about speed, it's more about preferences. I'm a keyboard type of guy, I just find using a keyboard nicer, that's it. I don't like having to move windows around with a mouse, I find the keyboard experience better. Same with scrolling, I find using up and down keys much nicer than a wheel as well. It doesn't mean that one way is the right way and the other is wrong, it's like taste I guess.


The Magic Mouse's continuous-scroll experience was a game changer; if I had to go back to basic mice, I would prefer the keyboard for everything.


I have long wondered the same. What are people doing which makes them so concerned with the efficiency of their editors? I spend the great majority of my work time reading and thinking. When it seems that solving a problem will require me to change a great deal of code, that's generally a sign that I haven't done enough thinking yet. On the rare occasion that I do need to write a lot of new code at once, it's not that big a deal to just... type it out!

I have had a few co-workers who seemed to spend a great deal of time typing, but I have not generally been impressed with the quality of their work. In fact, the most incompetent developer I have ever met, an enthusiastic proponent of his favorite editor and its automation features, was also the most prolific, routinely churning out hundreds or even thousands of lines of awful, bloated, bug-ridden code a day.

The field of software development is large and varied, so of course it is possible that there are competent people doing solid work which really does involve a tremendous amount of fiddly editing, thereby justifying the otherwise inexplicable degree of attention given to sophisticated editors; but I cannot imagine what their working lives are like, and I hope I never have to find out.


The whole thing actually goes back to smoking weed. The "no mouse" ethos goes at least as far back as the ratpoison window manager and this historic post: https://www.nongnu.org/ratpoison/inspiration.html Tiling window managers, living in terminals, vi, that Firefox extension with vi keybindings: all part of this THC cult. I recall some other ideologically foundational work where the author talked about being able to work one-handed with this type of setup; in this case I think performance is taking a back seat to joint-smoking ergonomics.


Did you miss the part in the link you posted where he says that he’s joking?


I am similar to you with regard to process and how much I output at a time, but in my case having to fuss with the mouse a whole bunch makes it that much easier for me to lose my train of thought.

That said, I don't optimize heavily against this the way some folks do. I use emacs and org-mode and GNOME (well, whatever the System76 folks are calling their reskinned GNOME desktop :)) which I think provide a nice balance. Emacs lets me switch between files with a couple keystrokes rather than having to dig through 10 tabs, and GNOME I think encourages an alt-tab based workflow. I still use the mouse for most other things though.

