
In the linked blog, the author describes himself: “I am not a mathematician or physicist and am definitely not credible at all”.


The author responded to this exact statement in this Reddit comment:

https://old.reddit.com/r/math/comments/1b5s32x/the_case_agai...

You can also read further discussion between the author and others in the Reddit post:

https://old.reddit.com/r/math/comments/1b5s32x/the_case_agai...

A few Reddit comments appear to agree that non-mathematicians (e.g. amateurs and game programmers) use geometric algebra non-rigorously.

I suspect the issue is not about mathematicians who know how to be rigorous. Both posts are complaining about how mathematical outsiders are using and teaching geometric algebra non-rigorously.


Well, the real issue is that GA proponents do this weird bait-and-switch: they talk about caring a lot about being intuitive and simple, and then they do a bunch of manipulations with the geometric product that make no sense. It's all at least possibly rigorous (depends on who you're reading), but the thing I want people to question is: is it good? Is the GA way better?

When GA has clean formulas for calculating stuff, sure. But should you write all your linear algebra formulas in terms of the geometric product? Heck no. For most people GA is their first exposure to wedge products, though, and those actually are great, so they think that's GA. No, that's just ordinary, well-known material that should be in linear algebra classes already. What GA adds on top of that is mostly really weird, although there is a kernel of quality inside it somewhere.
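
To make that concrete, here's a toy Python sketch (my own, not from any GA library) of the 2D geometric product, with a multivector stored as (scalar, e1, e2, e12) components. For two plain vectors the scalar part comes out as the dot product and the e12 part as the wedge product, which is exactly the part I'd call good:

    # Toy 2D geometric product; multivectors are (scalar, e1, e2, e12) tuples,
    # with e1*e1 = e2*e2 = 1 and e12 = e1*e2.
    def geometric_product(a, b):
        a0, a1, a2, a12 = a
        b0, b1, b2, b12 = b
        return (
            a0*b0 + a1*b1 + a2*b2 - a12*b12,   # scalar part
            a0*b1 + a1*b0 - a2*b12 + a12*b2,   # e1 part
            a0*b2 + a2*b0 + a1*b12 - a12*b1,   # e2 part
            a0*b12 + a12*b0 + a1*b2 - a2*b1,   # e12 (bivector) part
        )

    a = (0, 1.0, 2.0, 0)               # the vector 1*e1 + 2*e2
    b = (0, 3.0, 4.0, 0)               # the vector 3*e1 + 4*e2
    print(geometric_product(a, b))     # (11.0, 0.0, 0.0, -2.0): dot = 11, wedge = -2*e12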


What does it even mean to use imaginary numbers or quaternions "rigorously"? It's geometry. The point is points, lines, and applying transformations. If anything the field needs less rigor and more friendly intuition.
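
Case in point: you can use a unit complex number purely as a rotation without worrying about rigor at all. A quick Python sketch (mine, just for illustration):

    import cmath, math

    # Multiplying x + iy by e^{i*theta} rotates the point (x, y) by theta.
    def rotate(point, theta):
        z = complex(*point) * cmath.exp(1j * theta)
        return (z.real, z.imag)

    print(rotate((1.0, 0.0), math.pi / 2))   # roughly (0.0, 1.0): a quarter-turn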


Right, complex numbers and the delta function lacked rigour for years, and physicists and engineers didn't care because it turned out they captured the logic sufficiently well that useful stuff could be gained from them. Then the mathematicians did the rigour bit and everyone was happy again.


1 + 1 didn't need rigor for years. Then people did it in the Principia Mathematica and no one was happy.

i + j is just as silly as 1 + 1.


I see, that makes sense. Thank you!


Yeah, oops, that was supposed to be tongue-in-cheek. No, I'm just a guy who likes math, and I try to write things that seem correct. I just don't want anyone, like, citing me on something... like, check the work yourself if you want to use it; I'm not a reliable source.

Anyway, go read a bunch of GA books and papers and you'll see exactly what I'm talking about. At some point in the past (like ~8 years ago) I think I had read, or at least skimmed, everything that had ever been published on the subject. And some of it's good! If, at times, unnecessary. The rest, though... yikes.



