Manual work is underrated. Especially the kind of work where you can see an immediate end result from your efforts. I worked in a kitchen for two years and I can tell you slicing food is way more satisfying than coding frontend.
Not the average software engineer and definitely not at the entry level. I always had a feeling that "engineer shortage" articles are only made to push people into studying CS so employers can drive wages (even) lower.
>I always had a feeling that "engineer shortage" articles are only made to push people into studying CS so employers can drive wages (even) lower.
And it's working. In the last 10 years, CS has gone from a niche pursuit for nerds to the number one major on many campuses. I literally can't imagine having to compete with the sheer volume of new grads today for a junior position as a self-taught developer.
I don't know if those types of articles have any effect. Salaries are much higher in SE than in other fields. That has a far larger impact on driving people towards CS than any article could.
That's been my impression all these years too, and the quality is lower as well. I'd rather have fewer engineers of higher quality, but that doesn't work with the bean counters.
I graduated undergrad in 2006, when many of my peers were passionate about their courses, had side projects, etc. Now it's just a means to a job, with no urge to learn beyond what is taught - and the kids (that my company hires, at least) expect the same of their mentors.
When I personally interview, I check whether they have a GitHub. If they do, I look at whether they actually have commits or just uploaded all their school work without really knowing what version control is. It's not the sole basis on which I judge candidates, but it is something I consider. If they don't have a GitHub to show off their work, they'd better nail the technical part of the interview. But not everyone at my company interviews like that.
I graduated undergrad a few years before you, in 2002, in the Bay Area. My time there included the run-up and bursting of the dot-com bubble. There were _plenty_ of people in my program at that time who were in it for the money or whose parents wanted them in it for the money.
Since I had trouble finding a full-time software job at the time, I ended up going to graduate school for a few years instead. That period (2002-2006) was a kind of golden era of people studying CS because they were interested in it. A popular mythology had set in for those few years that "tech was over" and studying CS was no longer worthwhile. Enrollments cratered.
By about 2006 or so, things had already begun to pick up and were accelerating before the financial crisis hit in 2008. There was a blip there with some layoffs and no raises, but things really started accelerating after that. More students eventually flooded back into CS programs, with renewed dreams of striking it rich.
For what it's worth, we've been running about 15 TP-Link EAP225 units in a warehouse without any hiccups so far. Most importantly, they don't randomly die or lose the controller pairing like some low-end Ubiquiti units we tried in the past. The only quirk is that on Windows Server you have to configure the service manually, but it's no big deal. [0]
Just wondering: is job hopping as an engineering manager as easy as it is as a coder/individual contributor?
It seems like the market for people with a focus on soft skills is way more saturated (we get many more applications for project management roles than for software developer roles, for example).
Project management is a pretty saturated field, you are correct. Engineering team management is considerably less saturated and EMs are just as in-demand as ICs.
However, much of the value of a good EM is derived as someone who builds and grows a team over a long period of time, so you don't see nearly as much short-term job hopping as you do with ICs.
+1
Author here: the other thing is (at least for me) that it becomes really hard to leave my team after a while because they just grow on me. I feel responsible for their wellbeing and want to make sure they are set up for success. When I was an IC it was much easier (in my head).
It's a sales job, and a hard one, and most people are crap at it, hence the churn. I've spoken to a few in their early 20s who were very good, but then they aren't your usual "Java is the same as JavaScript" type of people.
It was me! Thanks for the tip about M -> H. Sorry I deleted the comment - I wasn't 100% sure about the models (I don't have the laptops near me at the moment and I didn't want to spread misinformation).
I've quit toxic competitive gaming (League of Legends, Overwatch) and replaced it with Netflix and shortwave radio listening. I'm still procrastinating because I work a bullshit job, but at least I don't rage as much and I always have some topics for small talk and conversation, which I feel is boosting my mood by a lot.
Similar but less drastic -- I traded League of Legends, Counter-Strike, and Valorant for games like Minecraft, Rust, and Valheim. I can play at my own pace, and even doing "nothing" in those games is calming and fun. I'm definitely a much less anxious and angry person because of it.
Sure. Unless there's a generous bonus on quality of work or project completion, the only rational way to play the game as a random employee is to maximize the gain and minimize the effort.
This is a consequence of society breaking down into a collection of selfish individuals, which I think is largely a result of economic theories turning into self-fulfilling prophecies. Nobody is going to take pride in their work when it's viewed as nothing more than a selfish and cynical bargain by society at large.
But what if your company ultimately acts as a selfish individual, too? Perhaps it even has a fiduciary duty to do so?
There's a good argument that most venture backed software companies are exactly "a selfish and cynical bargain".
It's been a good few years since I've met a senior technical staffer at a 'successful' Bay Area company who genuinely thinks they're "making the world a better place".
EDIT: Not that I disagree with your broader point that selfishness is an undesirable characteristic for an individual, society or ideology. But it seems inappropriate to attribute the problem to ICs or their managers.
This isn't actually true, but there are a lot of people invested in interpreting Dodge v. Ford that way. Pretty much the only thing you can't do is directly hand investor money to rank-and-file employees. Everything else can be done just fine if management wants to and is prepared to put up a decent cover story.
Yeah, the companies are what kicked this into overdrive, I think. I mean if you want to get at the historical and philosophical roots of this problem it's an enormous discussion that won't fit here, but a few broad strokes worth bringing up:
1. We're obviously materially better off than any point in history.
2. Much of modern economic theory is built on the idea of selfish individuals pursuing their own interest. It's clearly been effective in a material sense (see #1).
3. Modern companies are obviously better than some of the obscene abuses of early capitalism.
All of that said, it seems like there was a time when a person's work was connected to and respected by the society around them. You could argue that was always a sucker's game, but I don't think so. Human beings are wired to operate in social communities, and we take a lot of our cues from the people around us respecting and admiring our work.
Many companies at one point followed that model. You were "part of the team/family/whatever." What you contributed was valued. That made it possible to take pride in it. You might work your entire career at one company.
That model is obviously totally dead, and I think the companies fired the first shot. Once workers became chips in a game instead of fellow human beings, they were bound to play the game right back. So here we are, trying to scrounge out some sort of structural meaning to the work we do, when it's obvious to most people that nobody around us really cares or respects what we do. It's just "let's churn this out so we can all get paid." That's a poor way to motivate human beings and I think contributed to a lot of our malaise.
> That model is obviously totally dead, and I think the companies fired the first shot.
I'm not sure it was "companies" in general. I think a major cause was the shift in stock ownership in public companies from individuals to mutual funds and other financial institutions, over the decades after WWII.
From the standpoint of a company that actually wants to build lasting value, you want your public stock owners to be individuals, investing on long time horizons for things like their retirement. Then you can implement longer term plans and strategies without having to worry as much about immediate returns.
But if most of your stock is owned by mutual funds, then the fact that the individuals whose retirement savings are in those mutual funds are investing on a long time horizon doesn't help you; the funds themselves are looking at your short term returns, and if those don't measure up, they'll sell your stock and buy some other company's.
In short, a system that was set up with the best of intentions, to help people diversify their retirement savings and earn better average returns on them, has had the unintended consequence of putting Darwinian selection pressure on individual companies to prioritize short term returns over everything else. Which in turn has led to the demise of the "work for the company your whole career and the company will take care of you" model; no company can afford to do that in the new selective environment.
> All of that said, it seems like there was a time when a person's work was connected to and respected by the society around them. You could argue that was always a sucker's game, but I don't think so.
I think you are idealizing historical societies here. Yes, societies and the values of the people in them change. But that supposed connection and respect was never guaranteed to all that many people. No matter which period you talk about, past societies had large social, interpersonal, and cultural problems of exactly this kind.
you would need to overcome an Occam's-razor hypothesis like:
- management faces a big information asymmetry when it comes to remote knowledge-worker output, especially from high performers (contrast an assembly line: easy to flag slackers)
- we don't have equity or input on direction (contrast worker cooperatives)
- and we're getting paid a lot of money to "hack" on problems (contrast e.g. whalers, dangerous as hell)
so yeah, i'm gonna let y'all rationalize all this, but i think it's just a great time to be a US computer laborer. won't last forever, but you can use your immense savings (you ARE saving, right?) for all that good stuff whenever you want or are forced out.
I would add that the reflexive turn in consumer culture from the late 60s on (Bob Dylan, The Graduate, Cool Hand Luke, Easy Rider, Apocalypse Now, etc.) manufactured the emotional appeal of existential individualism and the absurdity of believing in institutions and collectivity.
We're basically in the "beautiful soul" phase of Hegel's Phenomenology of Spirit
Don't you see the other end? It's not game-theoretically optimal group cooperation, it's memes-as-agents, caring about individuals as much as we care about individual cells in our bodies.
I've noticed that used video card prices have skyrocketed. I put up an eBay auction for a GTX 1060 6GB and it ended up selling for twice what I paid for it, used, two years ago. Lots of people are also DMing me asking if I have more to sell. Is there some mining craze going on with old cards?
> Is there some mining craze going on with old cards?
Combination of the mining craze and massive demand for entertainment electronics and GPUs, since everybody has been stuck at home due to the pandemic and decided to build/upgrade their computers.
Which has also led to scalping outfits extending their services from covering mostly designer apparel to now also include electronics like GPUs and consoles. The popularity of this also attracts a lot more people to use these kinds of services as a form of investment, which most certainly does not help an already supply-constrained market.
And mining. Ethereum is still often mined on video cards, and in the wake of Bitcoin's skyrocketing, ETH prices are way up too. So it pays more (in dollars) to mine now.
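To put rough numbers on that, here's a back-of-the-envelope profitability check in Python. Every figure in it (hashrate, payout rate, prices) is a hypothetical placeholder, not real market data, but it shows why a rising ETH price flips old cards back to profitable while the power bill stays put:

    # Rough GPU mining profitability sketch. All figures below are
    # hypothetical placeholders, not current market data.
    hashrate_mh = 20.0        # Ethash MH/s for an older mid-range card (assumed)
    power_watts = 90.0        # wall draw while mining (assumed)
    eth_per_mh_day = 0.00002  # network payout per MH/s per day (hypothetical)
    eth_price_usd = 1500.0    # ETH spot price (hypothetical)
    power_cost_kwh = 0.12     # electricity price in USD/kWh (assumed)

    revenue = hashrate_mh * eth_per_mh_day * eth_price_usd
    power_cost = power_watts / 1000 * 24 * power_cost_kwh
    print(f"daily revenue: ${revenue:.2f}")
    print(f"daily power:   ${power_cost:.2f}")
    print(f"daily profit:  ${revenue - power_cost:.2f}")

    # Revenue scales linearly with the ETH price while the power bill
    # doesn't, so a price spike makes even old cards worth buying.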
My bet is that the improvement in integrated graphics (think Apple M1 or Vega 11) will solve the "gaming on a low end laptop" problem far before game streaming.
Games will simply use more power as the average user's rig becomes more powerful. Do you really think we'll solve low power ray tracing before we solve streaming? Streaming also scales better. Rendering double the frames requires double the power but streaming double the frames does not because frames are mostly similar and compression helps a great deal.
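You can see that intuition in a toy delta-compression experiment. This is just a sketch with synthetic frames I made up for illustration; real codecs like H.264 do far more (motion compensation, DCT), but inter-frame deltas are the core idea:

    import zlib
    import numpy as np

    # Two synthetic 720p grayscale "frames": a noisy gradient, then the
    # same image with one small 50x50 region changed, mimicking how little
    # actually differs between consecutive frames of a video stream.
    rng = np.random.default_rng(0)
    gradient = np.tile((np.arange(1280) % 256).astype(np.uint8), (720, 1))
    noise = rng.integers(0, 32, size=(720, 1280), dtype=np.uint8)
    frame1 = gradient + noise       # uint8 addition wraps; fine for a demo
    frame2 = frame1.copy()
    frame2[100:150, 200:250] = 255  # the only part that "moved"

    # Sending the second frame whole vs. sending just its difference from
    # the first (uint8 subtraction wraps, so the delta is exact).
    whole = len(zlib.compress(frame2.tobytes()))
    delta = len(zlib.compress((frame2 - frame1).tobytes()))
    print(f"second frame compressed on its own: {whole} bytes")
    print(f"second frame sent as a delta:       {delta} bytes")

The delta frame is almost all zeros, so it compresses to a tiny fraction of the standalone frame, which is why doubling the frame rate doesn't come close to doubling the bandwidth.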
This is certainly moving the goal posts... is everything in the future going to be VR? I doubt it. There will be a market for streaming 4K@60 for a long time.
Over on /r/oculusquest, Shadow's streaming PC is somewhat popular for running full PC VR games on the portable Quest (as not all the owners have full gaming desktops). So it's not like it's impossible.
I’ve actually experimented with this a bit, and while it can be done, it isn’t at all playable for 90% of games (anything requiring even moderate response times).
Oculus has been working on desktop-to-Quest WiFi streaming and says the experience isn’t yet good enough for release, and that's all local.
If you want to play mobas or competitive FPS the actual amount of grunt required is going to be flat.
If you want to play the latest Assassin's Creed then it's harder to do, but still, Assassin's Creed Odyssey on a 1070 is stunningly good and gets decent frames. The requirements will always increase, but the amount of power required to hit a given fidelity shouldn't be too bad.
Streaming also does away with the hassle of downloading updates and managing storage space. I would happily avoid loading up my macbook with games even if it could run them.