This is like reading a modernized version of Shakespeare. You can still kind of tell the story is good, but it just feels lacking. It doesn't convey the same kind of weird magic, even if you're never 100% sure what they're on about.
It is a perfect example of Rich Hickey's "Simple Made Easy" talk from a few days ago.
The very notion seems strange to me, because when I read SICP it was primarily a "learn LISP's model of computation" book, not a "learn how to program, and we just happen to be using a LISP" book. Because of that, it doesn't make much sense to me to port SICP to a non-LISP language.
The way it was explained when I took the class was that the language is irrelevant for such an intro class and he (harvey, the guy who taught in scheme) just used scheme because SICP is so good.
We actually went through tons of different programming paradigms that scheme and lisp probably weren't really designed for, like OO (closures upon closures, all the way down). Throughout the class harvey and the TAs emphasized that everything we were learning could be done in any language, and didn't actually require scheme or lisp (yes, we never got to macros).
I can understand why it might seem strange to port SICP away from scheme if you read SICP with the intention of learning lisp, but at berkeley the point of using SICP was to iterate through many different programming paradigms and cover basic computer science. unless they've found a better manual for this, it makes a lot of sense to just translate SICP instead of starting from scratch.
I don't know, I found the way they did OO actually made more sense than what I was taught earlier in a class using Java. It also seems more logical and elegant than Python's OO implementation, even if Python's isn't quite as arbitrary and complicated as Java's.
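For anyone curious what that "closures upon closures" style of OO looks like, here is a toy sketch in Python, modeled loosely on SICP's bank-account example (`make_account` and the message names are mine, not the course's actual code):

```python
def make_account(balance):
    """A message-passing 'object' built entirely from closures."""
    def withdraw(amount):
        nonlocal balance
        if amount > balance:
            return "Insufficient funds"
        balance -= amount
        return balance

    def deposit(amount):
        nonlocal balance
        balance += amount
        return balance

    def dispatch(message):
        # the closure over `balance` plays the role of instance state
        if message == "withdraw":
            return withdraw
        if message == "deposit":
            return deposit
        raise ValueError("Unknown message: " + message)

    return dispatch

acc = make_account(100)
print(acc("withdraw")(30))  # 70
print(acc("deposit")(50))   # 120
```

The `dispatch` procedure is the "object"; sending it a message returns the corresponding method, exactly as in the Scheme version.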
I think the language was only irrelevant because Scheme is so simple. We covered it in a couple of lectures and then it stayed in the background.
I really don't see the case for switching to Python at all--why "fix" something that isn't broken? Everybody seems to just assume that Python is a better choice.
None of the professors want to teach in scheme anymore, and a vocal subset of the eecs students think scheme is a waste of time because it has no real world applications.
That said, I think scheme is one of the best languages--if not the best--to begin programming in, and agree with you that none of the reasons given for switching were very compelling.
Though scheme handles OO paradigms, the way it does so is kind of annoying below the abstraction. however, scheme wasn't designed to handle OO, it was designed for functional programming. if i remember correctly, mutability is a hacked-on addition (one of the reasons you use set! instead of define).
most of the arguments that i heard about moving away from scheme were about how it wasn't worth teaching to people who had never programmed before because it wasn't practical/real world enough or too abstract/LISPy.
i think scheme is still great if you already know some programming because you can appreciate it even more.
If you are MIT/Stanford/UCB and you are worried that you have to teach an industrially useful language because the students need it to get jobs--then you should close down and send them to Phoenix or ITT-Tech
My understanding was that MIT switched to python because in later courses they could actually use it to get things done (robotics, image proc, etc).
Whereas if they taught Scheme, at the end of the course they'd pretty much have to say: well, that was fun; now we have to teach you something useful to get the rest of the coursework done.
i meant it wasn't real world/practical enough for other classes, though a certain subset of students only really seemed to care about languages and concepts that they might use in industry.
mit and berkeley had/have the exact same "problem" with scheme (ie they don't/can't use it in any other class except the intro-cs class). This usually resulted in upper-div classes spending a class or two (basically a week of instruction, or a week of lab) teaching the basics of whatever language the course used. Though when i took the compilers course a few semesters later, the second slide of the entire course was "RTFM": the professor dryly noted that we would be compiling a subset of python down to x86 assembler using c++, and although he didn't expect any of us to know those languages at all, we didn't have time to spend a lecture or discussion/lab on learning them, so we would have to pick them up on our own time.
I suppose that switching to a language that can be used in many classes is nice, but i don't think the students should have that much trouble switching if the professors supply supplementary material and they (students) understand the intro courses well enough. Plus, I think it's a good idea to get exposed to as many languages as possible, so you learn to think in a way that you can code well in any language.
Maybe at MIT an "introductory course" means something a lot different than the school I went to, but SICP covered a lot of stuff in more depth than my entire CS degree did. I'm referring mostly to the 4th and 5th chapters, which cover language implementation right down to the hardware. Are those chapters included in MIT's "introductory" computer science course?
My impression, and I recall this from reading the introduction, is that the fourth chapter is included--just like at Berkeley--but the fifth chapter isn't.
The way the Berkeley intro course series is structured, it actually makes sense to not teach the fifth chapter in the first class: the idea is that you go from the top down, starting with really high abstractions and working your way down to building a computer out of logic gates. The fifth chapter fits in with the latter portion more than with the rest of SICP.
Also, MIT doesn't teach SICP at all in its intro course any more, as far as I know. They've also switched to Python, but I don't think they've kept the book, structure or material from the Scheme course (unlike what Berkeley is doing here).
My experience was quite the opposite. SICP was a "learn about computation" book with a lisp-like language as the least-unnecessary-complexity way of expressing it.
This is ok, for a CS 101 class. I'm sure the students will turn out just fine. Some of them will likely be intrigued by the purple book their class name references. Looking at the schedule, it looks like it just covers the "Structure" and not the "Interpretation". My favorite parts of SICP are the 4th and 5th chapters, on metacircular evaluation, interpreters, lazy Scheme, Prolog, and compilation to byte code. I guess it's probably too much for CS 101 though. It would also be a lot more work to do those chapters in Python, since you'd need to cover scanning and parsing, instead of just using (read).
I think that the metacircular evaluator and friends are perfect for a CS 101 class--they really changed my perception of how software works. After all, this is a CS class rather than just a programming class, and SICP is a good overview of both while this new class is probably less so.
Oh, I definitely agree. SICP was the beginning of the rabbit hole for me. But I had spent a lot of time programming before I started it, so I'm not sure if it really is the best text for CS 101, where some students maybe have never written a line of code. Peter Norvig's review sort of sums up this regarding SICP: http://www.amazon.com/review/R403HR4VL71K8
I've never been involved in teaching CS or programming, and while I really do think SICP should be required reading for all CS students (and for all programmers) just because it's so good, I respect that teaching is very challenging, and fitting it all into a single semester might be tough. There are some very smart and experienced people who have spent a lot of time thinking about this (see "The Structure and Interpretation of the Computer Science Curriculum"), and I hesitate to put SICP on a pedestal where it can't be touched.
As I said, hopefully some (or many!) of the students taking this class at Berkley will be curious about the original book and will seek it out on their own.
One thing to note is that this class is really more like CS101.5--if you have never programmed, you're highly encouraged to take a very introductory class for a semester before this one. Apparently about 90% (or something) of the people taking it have had at least some programming experience.
I also don't think SICP is perfect; however, I think it is inherently much better than anything--even itself--done in Python, simply because Python basically forces less breadth. I recently read an interesting paper[1] that I think could lead to an even better class; if you're interested in this sort of thing, it's worth checking out.
It's also not really designed for a completely introductory class, but I hope that most of the people coming into Berkeley CS aren't completely unprepared.
On a slightly unrelated note, a completely different approach to an intro class could be cool too: instead of focusing on learning, maybe the first semester should be dedicated to just building something cool on a computer and letting the learning happen naturally. The more formal classes can always come later. For something like that, I could see using Python--although it wouldn't be my first choice by far--but that's a completely different story.
I took CS61A in the spring of 2008, when it was still done using Scheme. Looking through the first few lectures and the titles of later lectures, I'm struck by how much the material is the same. Most notably, in the first several lectures, they've made a point of stripping down Python to a subset that makes its evaluation near-identical to Lisp - by using only named functions (no '1+2', only 'add(1, 2)'), they've kept the emphasis in those lectures on the environment model of evaluation.
Looking through the later lecture titles, it's quite clear that (despite what other commenters are saying) the "deeper" parts of the course have been preserved.
There's a bit that's been lost in translation; in particular, while the emphasis on metalinguistic abstraction has been kept (students are going to be implementing an object-class system, in a rather elegant way that uses Python's ability to redefine attribute getters), the more exotic models of computation pursued in the old 61A have been abandoned. No more amb (nondeterministic) evaluator. This is perhaps inevitable--it's just impractical to create a metacircular evaluator for a language as complex as Python. Still, the core of the class has remained, which is a testament to the fundamental similarity of the Lisp model of computation to that of many modern scripting languages.
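I don't know the course's actual implementation, but the attribute-getter trick presumably looks something like this toy sketch (`Dispatcher`, the counter methods, and the dict-based state are all invented names for illustration):

```python
class Dispatcher:
    """Toy object system: attribute access is routed through __getattr__,
    so 'methods' live in a plain dictionary instead of a real class."""
    def __init__(self, methods, state):
        self._methods = methods
        self._state = state

    def __getattr__(self, name):
        # called only when normal attribute lookup fails, i.e. for methods
        try:
            method = self._methods[name]
        except KeyError:
            raise AttributeError(name)
        # bind the instance state as the implicit first argument
        return lambda *args: method(self._state, *args)

def increment(state, n=1):
    state["count"] += n
    return state["count"]

def value(state):
    return state["count"]

c = Dispatcher({"increment": increment, "value": value}, {"count": 0})
c.increment()
print(c.increment(5))  # 6
```

The point is that method lookup and binding become ordinary, inspectable code rather than language magic.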
I'm simultaneously disappointed and thrilled about this. As I've suspected, teaching SICP in a language other than Scheme would require dropping chapters 4 and 5. They're simply too hard to do in any other language. Which of course is a bummer, since chapters 4 and 5 are huge amounts of fun.
All the same, I'm thrilled that SICP is being taught.
SICP has been taught here for a long time, in Scheme. Unfortunately, the professor--Dr. Brian Harvey, who apparently contributed significantly to the second edition of the book and has taught this course for decades--has decided to retire. Now that he is gone, I guess the people running the course have decided to move away from Scheme to Python, for some odd and unknown reason, but still use the book (at least for now).
The reason they moved is that no one else wants to teach it in scheme. When I took the class in 2007, he mentioned that after he retired the class would switch to python, because he was the only one who thought SICP was worth the headache of dealing with scheme.
However, they still use SICP behind the scenes because they have a bunch of lecture notes for it, and because it really is a great book.
btw, i don't think harvey actually retires until 2013 or 14. i think he's just doing behind the scenes stuff now.
Well, at least they're still using Emacs. That being said, they have a weird way of describing the hotkeys. For example, for C-x C-f, they tell you to press C-x, release all keys, then press C-f. You don't need to release the control key, and doing so adds at least half a second.
"I felt a great disturbance in the Force, as if millions of voices cried out in terror and were suddenly silenced. I fear something terrible has happened."
Won't this end in disaster considering the use of tail recursion in SICP versus Guido's flat out refusal to allow tail recursion optimization in Python?
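To make the concern concrete, a sketch: CPython caps recursion depth (about 1000 frames by default) and never eliminates tail calls, so even a perfectly tail-recursive procedure has to be rewritten as a loop:

```python
def countdown_rec(n):
    # tail-recursive in form, but CPython still pushes a frame per call
    if n == 0:
        return "done"
    return countdown_rec(n - 1)

def countdown_iter(n):
    # the mechanical rewrite a Scheme compiler would do for us
    while n != 0:
        n -= 1
    return "done"

print(countdown_iter(10**6))    # fine
try:
    countdown_rec(10**6)        # far past the default recursion limit
except RecursionError:
    print("no tail-call optimization in CPython")
```

A conforming Scheme implementation is required to run the first version in constant space; CPython raises `RecursionError` instead.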
While I think that this will end in disaster, tail-call recursion is almost inconsequential---this is a very introductory course, just an overview of CS, so performance isn't important.
What is important are the big ideas. For example, the course covers several paradigms: functional, oo and logic programming. The book uses Scheme for the first two and a language with very Scheme-like syntax for the last; Python is completely unsuitable for all but oop, and even there is too complex compared to Scheme.
I took this course in its old iteration last year, and it was a brilliant course, probably the single best course I've taken on any topic. That was partly because of the professor, who has retired, and partly because of the book and language. Now that only the book is the same, I suspect the course is very far from brilliant.
I should amend this by saying that the new professor, who is actually from Google, is probably good--he created a set of well-regarded AI projects used in a bunch of AI classes. The previous professor, however, was exceptional specifically as a teacher--his research involved CS education. So far, he has been the best professor I've had, and I've had some very good ones.
> tail-call recursion is almost inconsequential---this is a very introductory course, just an overview of CS, so performance isn't important.
> What is important are the big ideas.
Go back and read section 1.2 ("Procedures and the processes they generate") again. As far as its authors are concerned, this is one of the big ideas. Basic concepts of time and space complexity and recursive vs. iterative processes are so important that they appear as soon as the basics of Scheme have been introduced. It's a core theme of the book (Why is this n-queens program slow? How can we eliminate two extra stack saves in this register machine program?), and the main questions in chapter 5 that the evaluators in chapter 4 don't answer fully (but look closely at the CPS evaluator in section 4.3) are "how do procedures return values to their callers?" and the related "how can we write an evaluator that doesn't grow the stack when executing tail-recursive procedures?"
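For reference, the section 1.2 distinction transcribed into Python--both procedures are recursive in form, but only the first generates a recursive process:

```python
def factorial_rec(n):
    # recursive *process*: a chain of deferred multiplications builds up
    if n == 1:
        return 1
    return n * factorial_rec(n - 1)

def factorial_iter(n, product=1):
    # iterative *process*: all state lives in the arguments, nothing is
    # deferred; a Scheme system runs this in constant space, CPython won't
    if n == 1:
        return product
    return factorial_iter(n - 1, n * product)

print(factorial_rec(6), factorial_iter(6))  # 720 720
```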
Ultimately, the beauty of SICP isn't in the paradigms covered but in the understanding of how programs execute, and the beauty of Scheme is the simplicity of its control flow. It's far easier to understand a Scheme program than one written in Python, Haskell, Prolog, or any other high-level language. Do most people understand (modulo sophisticated optimizations) how the Python interpreter actually handles, say, comprehensions, iterators, generators, or its complex object-oriented features? Could a first-year student add these features by hand (forgetting completely about macros and first-class continuations) to an interpreter herself?
It amazes me that Berkeley professors don't want to teach Scheme (and worries me, as they're much cleverer than me). The whole magic of the language is that in the end there isn't any magic at all.
Any reason for that conclusion? I personally have known the instructor of the course for the better part of a decade and he's one of the finest CS educators out there and has numerous awards to show for it.
I've talked with him about this choice and he has a lot of sound reasons for making this move backed by his years of experience teaching in various languages.
What's your line of thought? I use Clojure many hours every day and while I think that LISP languages are awesome, I don't think they're a good choice for a 1st programming course either.
One advantage Lisps have is that they tend to be much more inclusive of functional programming. Python supports (limited) lambdas and some higher-order functions, but it really feels like the language tries to steer you away from too much functional programming.
The best time to learn about different paradigms is right when you're starting out; if you basically only learn how to program imperatively, you're liable to start believing that that's the only way to go. I know because this happened to me (I was self-taught and learned a different set of languages, but it had the same effect); ultimately it took me longer to come to and start using functional programming properly than it would have had I learned about it at the very beginning.
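A small illustration of that steering (my own toy example, not from the course): the functional spellings work, but the language nudges you toward comprehensions and builtins, and `reduce` was even demoted from a builtin to `functools` in Python 3:

```python
from functools import reduce  # no longer a builtin in Python 3

nums = [1, 2, 3, 4, 5]

# the higher-order-function style works...
squares_fp = list(map(lambda x: x * x, nums))
total_fp = reduce(lambda a, b: a + b, nums, 0)

# ...but idiomatic Python prefers comprehensions and builtins,
# and lambdas are limited to a single expression anyway
squares_py = [x * x for x in nums]
total_py = sum(nums)

print(squares_fp == squares_py, total_fp == total_py)  # True True
```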
Full disclosure: I've known John (the instructor) for 3 years, and I'm a TA for this course right now. On the other hand, I was also 'born' into Scheme at MIT (which has also switched to Python).
I can confirm that moving to Python was a very well-reasoned decision. Having TA'd the Scheme version of the same course 2 or 3 times, I can say that for all the talk of Scheme having very simple syntax, it's surprisingly hard for students to get used to.
- The code is hard to read, and, as humans, we're not built for nested expressions.
- Useful data structures, like key-value stores and indexable lists, are limited or introduced late.
- Having recursion thrown at you in week 1, before you've learned basic debugging, the concept of abstraction, or how to write readable code, gets in the way of learning to program.
How does the course get around the problem of functional programming being a complete pain in the arse in Python? For me the value of SICP was to teach me how easy, beautiful and powerful it can be to program that way, because everything else before it failed to communicate that. I can't see how an SICP in python could have the same effect.
Python has some powerful elements too. Sure, it lacks elements and features of historical and current pure functional[1] programming languages, but I find it elegant to use python as a set-theory-powered language. Often I can solve complex problems in a few lines by thinking about the sets of elements I am manipulating, constructing, and transforming. Instead of imperative, python code becomes very descriptive. Indexing a dictionary with frozensets, and making use of a proper definition of __hash__ to put various objects inside sets or use them as dictionary keys, allows for tremendous power, terseness, and explicitness.
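A toy example of the frozenset-keyed-dictionary pattern described above (the data and names are invented):

```python
# unordered pairs as dictionary keys: how many times each pair collaborated
collaborations = {
    frozenset({"alice", "bob"}): 3,
    frozenset({"alice", "carol"}): 1,
}

# frozenset({"bob", "alice"}) hashes the same as frozenset({"alice", "bob"}),
# so lookup is order-independent
print(collaborations[frozenset({"bob", "alice"})])  # 3

# set-theoretic transforms read like math:
# everyone mentioned in some collaboration, other than alice
partners = {p for pair in collaborations for p in pair} - {"alice"}
print(sorted(partners))  # ['bob', 'carol']
```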
The solution given by yairchu in [2] can be written succinctly via generators, which read remarkably like a mathematical set definition.
from functools import reduce  # Python 3

def grandKids(generation, kidsFunc, val):
    return reduce(lambda a, _: (x for v in a for x in kidsFunc(v)), range(generation), [val])
I often find myself thinking through a problem and working it out with pen and paper in pure mathematical set notation, then implementing it in Python.
Mind: I said "a pain in the arse", not "impossible". And i think not only about the writing (writing such things regularly is not what any programmer would want to do), but also the debugging perspective.
What you have done there is cute and commendable as a mental exercise, but if i ever encountered that in production code, someone would have a close encounter with my chainsaw. It is basically undebuggable (i don't know the line/statement debuggers python has available, so i'm guessing here at what happens): either the debugger will step over the whole thing in one step because it's one statement, or it'll just keep stepping on the same line again and again, which is also entirely useless.
Lastly, what you did there is basically golf the living hell out of that code to bring it down from multiple statement lines to a single one. If i have to reach to such means i might as well use Perl. Only Perl actually does allow me to put multiple statement subs into a lambda, so i don't have to golf there.
Weird days when i have to golf in Python but don't need to in Perl.
That's definitely cool, but as I mentioned in another comment, check out what they're missing:
- metacircular Scheme interpreter
- lazily-evaluated Scheme interpreter
- non-deterministic Scheme interpreter
- pseudo-Prolog in Scheme
- register machine simulator
- compiler to bytecode for the same machine
I took this class while it was still taught in Scheme, and I loved it. Now I'm teaching it in Python. I am also sad to see Scheme go, but I think we gained as much as we have lost in our switch to Python. For example, I argue that Python dictionaries are more intuitive to use than the old "association lists" implementation in Scheme (we still taught the implementation). Concepts like MapReduce and concurrency that we cover later on in the course are also cleaner and more elegant than the Scheme implementation. The above-the-line OO syntax is also much, much easier to use.
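For readers who haven't seen them: an association list is just a linearly searched list of key-value pairs. A rough Python sketch of the comparison (`assoc` here is my own transcription of Scheme's `assoc`, not the course's code):

```python
# an association list is a list of (key, value) pairs, searched linearly
def assoc(key, records):
    for k, v in records:
        if k == key:
            return (k, v)   # return the whole pair, like Scheme's assoc
    return None

alist = [("pi", 3.14159), ("e", 2.71828)]
print(assoc("e", alist))    # ('e', 2.71828)
print(assoc("phi", alist))  # None

# the built-in dict gives the same mapping with constant-time lookup
table = dict(alist)
print(table["e"])           # 2.71828
```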
We are still covering interpreters as our last unit. It will be for a "calculator" language, with conditionals and assignments. We are hoping to cover a lot of the same concepts, but we decided that a metacircular interpreter is obviously too difficult. We are keeping project 4 the same (Logo interpreter) but we have ported it to Python and wrote some new questions. Personally, I think the OO-centric interpreter is an improvement over the old Logo interpreter in Scheme (we can have Environment objects that now contain Frame objects, for example. Before, environments were just a list of association lists).
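I don't know the actual project code, but the shape of such a calculator interpreter--recursive evaluation over nested expressions, with assignment and a conditional--can be sketched like this (all names and the tuple representation are assumptions of mine):

```python
def calc_eval(exp, env):
    """Evaluate a tiny calculator language over nested tuples.
    Forms: numbers, variable names, ('=', name, exp),
    ('if', test, then, else), and ('+', '-', '*', '/') applications."""
    if isinstance(exp, (int, float)):
        return exp
    if isinstance(exp, str):
        return env[exp]                  # variable lookup
    op, *args = exp
    if op == "=":                        # assignment
        name, value_exp = args
        env[name] = calc_eval(value_exp, env)
        return env[name]
    if op == "if":                       # conditional
        test, then_exp, else_exp = args
        return calc_eval(then_exp if calc_eval(test, env) else else_exp, env)
    values = [calc_eval(a, env) for a in args]
    if op == "+":
        return sum(values)
    if op == "-":
        return values[0] - sum(values[1:])
    if op == "*":
        result = 1
        for v in values:
            result *= v
        return result
    if op == "/":
        return values[0] / values[1]
    raise ValueError("unknown operator: " + str(op))

env = {}
calc_eval(("=", "x", ("+", 1, 2, 3)), env)              # x = 6
print(calc_eval(("if", "x", ("*", "x", "x"), 0), env))  # 36
```

Using tuples sidesteps scanning and parsing entirely, much like `(read)` does for Scheme.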
Yeah, the interpreters were really cool. Being able to write a program, have it run and do stuff, and know that you're the one who made it run is really a great feeling which I'm probably failing to describe properly.
The lazy interpreter was also brilliant--realizing that you can change a language's behavior drastically with a small change in the interpreter is very empowering. Coincidentally, that's what pushed me over the edge to learning Haskell, so I'm extra grateful there.
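The delay/force trick behind that kind of laziness can even be sketched in Python with closures (a memoizing promise; `delay` is my own name, after SICP's special form):

```python
def delay(thunk):
    """Return a memoizing promise: calling it forces the thunk once,
    then returns the cached value forever after (like delay/force)."""
    cache = []
    def force():
        if not cache:
            cache.append(thunk())
        return cache[0]
    return force

def loud_square(x):
    print("computing...")
    return x * x

p = delay(lambda: loud_square(7))
# nothing has run yet; the first force computes, the second hits the cache
print(p())  # computing... then 49
print(p())  # 49, no recomputation
```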
Overall, the amount of magic that class showed me definitely made it worthy of the wizard on the cover.
The interpreter for Logo is actually extremely similar to an interpreter for Scheme; additionally, we went over several variations on the Scheme interpreter in lecture.
We didn't do the registers and bytecode, which was too bad although fair to students taking the class with less programming experience. One happy side-effect of the switch to Python is that there is now a self-paced version that allows the students to do all that if they want.
Most of the reasons cited for moving to Python apply even more to Haskell and Erlang. Additionally, both languages--and I say this as somebody addicted to Haskell--are too narrow: the perfect introductory language should accommodate functional programming better than Python does and imperative programming better than Haskell does.
Having a statically typed language would add unnecessary complexity to the course; those languages come later anyhow.
Finally, some of the particularly brilliant insights that Scheme gives (code as data and an extremely elegant interpreter in Scheme) are absent in both Haskell and Erlang.
That said, we really should have more functional programming in other classes. I think CMU does this with ML throughout the CS program, and I envy them in that regard. However, this is a different issue; I don't think a language like that would fit any better to SICP than Python.