
You think that people learning math will find it less confusing if they have no context (which, in mathematics, is largely equivalent to "assumptions") for what they are learning and have to build their mathematical knowledge from first principles? Or am I misunderstanding your post?


I don't think anything; I was recalling what I heard Sussman say at a conference, a view he also makes clear in http://en.wikipedia.org/wiki/SICM. Though I saw him a good 8 years after the book, so I'm sure he'd grown some new opinions by then.

He is of the opinion that if you can't write the algorithm, you don't understand it; forcing students to write the algorithm aids in teaching them a real understanding. He spent quite a bit of time complaining about how vague mathematical notation is for teaching purposes, because of its implicit assumptions, and arguing for the superiority of executable code for the task.

On a side note, the talk was aimed at both engineers and programmers, and the audience was a mix of both. He did a wow demo with a circuit on the overhead, calculating all the voltages and resistances on the fly from chosen starting values, which made all the programmers smile really big and all the engineers' jaws drop. Everyone was impressed because he flew through it, but the programmers much less so, because what he did was just how a programmer thinks, albeit very fast. He was demonstrating to the engineers the benefits of thinking like a programmer. As he is both and teaches both, he clearly connected some dots that they generally don't.


> He is of the opinion that if you can't write the algorithm, you don't understand it; forcing students to write the algorithm aids in teaching them a real understanding.

Absolutely.

I had memorized the notation for derivatives. Until I realized how the notation tied to finite numerical approximations, I never really understood it.

If you don't know what I'm talking about, the right way to approximate (d^2/dx^2)(f(x)) is to write an operator d defined as d(f)(x) = f(x + dx) - f(x). And now your numerical approximation for the second derivative is (d(d(f))/(dx * dx))(x).
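(For concreteness, composing that operator with itself gives d(d(f))(x) = f(x + 2*dx) - 2*f(x + dx) + f(x), the usual forward second difference.)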

Now replace your d operator with the much better d(f)(x) = f(x + dx/2) - f(x - dx/2), and see how good your numerical approximations become!
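Here is a minimal Python sketch of both operators written as higher-order functions; the step size dx = 1e-3 and math.sin as the test function are my own choices for illustration:

    import math

    def d_forward(f, dx):
        # d(f)(x) = f(x + dx) - f(x)
        return lambda x: f(x + dx) - f(x)

    def d_central(f, dx):
        # d(f)(x) = f(x + dx/2) - f(x - dx/2)
        return lambda x: f(x + dx / 2) - f(x - dx / 2)

    dx = 1e-3
    f, x = math.sin, 1.0                # exact second derivative at x is -sin(1)

    fwd = d_forward(d_forward(f, dx), dx)(x) / (dx * dx)
    ctr = d_central(d_central(f, dx), dx)(x) / (dx * dx)

    print(abs(fwd - (-math.sin(x))))    # truncation error O(dx), roughly 5e-4
    print(abs(ctr - (-math.sin(x))))    # truncation error O(dx^2), roughly 1e-7

The composed central operator works out to f(x + dx) - 2*f(x) + f(x - dx), which is why its error shrinks like dx^2 instead of dx.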

The notation actually means something. When we switched from infinitesimals to limits, we lost sight of that. When you try to write the algorithm, you're going to be reminded of its meaning, fast.


That is fine for you, but not everyone needs to know about the Wiener path integral formulation of quantum mechanics ... which is built on functions that are continuous everywhere but (almost surely) nowhere differentiable (the set of differentiable functions you are familiar with has measure zero, i.e. it is comparable to using only integers instead of real numbers). The limit process, which you discard as not meaning anything, is essential to understanding the Wiener path integral formalism. Fortunately, mathematicians and physicists did not "lose sight of that" ... and are able to go beyond what your limited understanding would limit you to. I don't mean this in a derogatory way - you obviously don't need to understand at a deeper level than you do; few people do ... but to those who need this deeper level, the mathematical formalism that you dismiss is, in fact, essential.

Edit: my apologies for making an unwarranted assumption about your level of understanding.


> I don't mean this in a derogatory way - you obviously don't need to understand at a deeper level than you do; few people do ...

Do you like to assume much?

I have a master's in mathematics. I likely understand a lot more than you give me credit for.

I do not "discard" the limit process. I'm saying that the d/dx notation, which we get from the old infinitesmal approach, has insights buried in it that most students miss. I certainly missed it for a long time.

Limits won for good reason - they were the first consistent approach to defining the derivative that mathematicians understood, and were a key step on the way to resolving the inconsistencies in previous definitions that Fourier series had revealed.

But limits are not actually "essential". I am personally aware of multiple completely rigorous formalizations of Calculus, the second most famous of which is Robinson's non-standard analysis, which has absolutely nothing resembling a limit in sight.


> When you try to write the algorithm, you're going to be reminded of its meaning, fast.

Exactly, and forced to replace implicit assumptions with explicit instructions to make it work.


But that said, what about pure existence proofs?

There are algorithms that we can prove exist, but we have no idea how to find them.


Programming is a tool to aid learning, not a replacement for all the other tools at your disposal. Use it when it helps.



