It depends on where you took computer science. I took a few foundational classes at community college.
It very much felt like a Wikipedia article on the history of computers somehow stretched out over an entire summer.
I have my own issues with the way college is generally set up. Do students really need a massive amusement park when self-study along with 3 or 4 exams would provide the same value? Will spending $70k per year in total cost of attendance at said amusement park serve them?
I don't really like boot camps either; personally, I'd like companies to be more open to actually training people again. I doubt it'll happen though.
Well, yeah. That's true for any field of study. Every college has strengths and weaknesses; it's the opposite of a franchise.
>> I took a few foundational classes at community college.
A few foundational classes are somewhat different from the classes you take in prep for a major. I did a foundational class in astronomy, designed for students who were just looking for an introduction. It was very different from my comp sci classes in tone and style.
Yes, there was some math involved, but not much in the computer science classes themselves. Math was a prerequisite, though, so we got our math in, well, math.
This is one of the only skills you can learn for practically nothing. A cheap laptop is all you need. I taught myself enough to get a middle-class job with nothing but free time and $3 iced coffees.
I just don't like the idea of gatekeeping it behind an expensive degree. The source code for most popular frameworks and tools is free for anyone to read.
It's not like medicine or something where you need to drop $300k on education.
No, it's certainly not like medicine or law. And you can certainly acquire skills on your own.
Of course, in this field, learning is continuous. You're not going to use just one language (much less one framework) over a decades-long career. It's also likely that your domain, focus area, and so on will change.
A good college course doesn't prepare you for programming in one language, but for all of them. (In the sense that once you understand the theory of programming, language is just syntax.)
You get exposure to different types of languages (imperative, functional, etc.).
I think the critical takeaways for me, though, were research, critical thinking, and communication. The "skills" are easy to learn yourself, but the formal structure you place that learning in is harder to build on your own.
Which is not to say a degree is a requirement; it's clearly not. But it's helpful because it builds a strong foundation on which the self-learning can rest.