Hacker News | new | past | comments | ask | show | jobs | submit | irrigation's comments

"If every neighborhood had its own vertical farm, how many fewer semis would be choking up the metro area?"

In my area -- Southern Ontario -- farmland is cheap and plentiful, much of it turning to brush or "hobby" farms (which usually means farms that don't really produce). Just outside the metro area, properties -- with long growing seasons and excellent soil -- are measured in acres, yet grow only decorative plants.

Our family recently started considering the notion of mixing up life and moving somewhere else in Canada -- 100+ acre farms in PEI, NB, and Nova Scotia can be had for under $100,000. Many are now considered vacant land, having been abandoned as farmsteads.

Whenever these sorts of articles talk about the "solution" to locally grown food or a food crisis, I compare these two facts and something isn't meshing.


In your considerations, what did you plan to do if you did end up buying a farmstead? Take up farming or livestock? Just live in the countryside on a big property, without animals, while living and working as you presumably do now in a city? Or a combination?

I've been really interested in the idea of moving out of the city for some time, but I don't know many people who've done so. They usually just moved to a smaller town (i.e. a village) without any real farmland, but with a decent backyard (something pretty rare here in Amsterdam).

Thanks


Professionally I can work essentially anywhere with a high-speed connection, and my wife is in healthcare and can work anywhere near even a somewhat built-up area. We would like to do low-intensity farming, have some chickens (primarily for eggs), ideally have some woodlands, and so on. My wife really wants to try keeping bees and making the honey products that come from that.

I'd love to have enough land for a small nanny-suite house, a bunkie off in the woods somewhere, etc. With connectivity (which is starting to appear in even really rural areas via wireless options), I have to imagine that more of an exodus out of urban areas will happen among the people who want more remote living and had only moved to urban areas out of proximity needs.


Man that sounds lovely. (I assume you do independent development, freelancing/consulting, or work remotely in dev?)

Just the other day I was checking out the blog at http://foodcyclist.com/farm-blog/

He's not a super techie -- he has a few websites, and they're not always super professionally structured -- but check it out. He's basically a dude who got into farming later in life, first did an apprenticeship, and then started his own community supported agriculture (CSA) business.

His initial focus was chickens, which you mentioned wanting to do. It looks like it's pretty easy for him. He orders chicks online; they have enough food in their bellies to survive the trip in the mail (sounds crazy, but apparently it's a normal thing). He has a heat lamp and a basic food/water installation. He designed his own pens for the older chickens, for which he has a blueprint online for free -- they're actually very neat. And then he has to do the butchering, which is the worst part, but he has about 60 customers for his CSA who paid him upfront for the season (I think about 20 weeks), and he delivers one whole chicken to each per week, so about 50 per week, and he actually charges a very high chicken price at about $25 or so. Anyway, all these people basically paid him in advance, so he starts with $30k and can make the investments he needs to (seller discretionary cashflow is about $8k per year).
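The back-of-the-envelope numbers in that paragraph actually check out; a quick sketch (all figures are the rough ones stated above, not verified against his site):

```python
# Rough CSA chicken economics, using the approximate numbers quoted above.
customers = 60           # CSA members who pay upfront
weeks = 20               # approximate length of the season
price_per_chicken = 25   # USD per delivered whole chicken

upfront_revenue = customers * weeks * price_per_chicken
print(upfront_revenue)   # 30000 -- matches the ~$30k he starts the season with
```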

I can easily see how producing 1 chicken per week for your family, plus eggs, is a piece of cake. He's now also moving into other things: eggs, crops, hogs, vineyards, and his own brewery. You can see some of his financial plans for 2015 here: http://www.farmmarketingsolutions.com/about/income-reports/2...

The cool thing is he's trying to be 100% transparent. Very interesting insight into small-scale farming. Providing for yourself is pretty easy -- not trivial, but very, very doable. Providing for a CSA also looks like a very decent business; it's hard work, but you can compete because people pay a premium for this stuff. On a larger scale I'm skeptical: selling wholesale really sucks, and it just doesn't seem worth it unless you largely automate (which only seems economical with economies of scale, i.e. a large scale), or go big on certification and find a bio/eco/sustainability/local niche that wholesalers themselves are seeing increased demand for. Anyway, I know that's not really your goal here, but I thought I'd share the link :) All the best


That is a fantastic blog, thank you very much. I love reading stuff like that.

I have the luxury, of sorts, that the farm doesn't really need to support anyone financially (the primary income still coming from "traditional" sources), and ultimately just having a selection of foods available for my own immediate and extended family would be awesome. I've done the home vegetable garden thing for years, and would really like to take it to the next level. One of the nice things about a lot of crops is that minimal effort often gets you 70% of the way to the best outcome, so while a dedicated farmer carefully tends everything to fully optimize, I find the bounty and selection of just a variety of lazily planted tomato plants incredible.


The article may have made this unclear, but just being happy about good news for you isn't self-promotion.

The article quite clearly states that such self-promotion is what is often termed "humble bragging" (and in case the article misunderstood the study, the actual study itself uses very neutral language, like "sharing accomplishments"), which unfortunately is essentially just sharing any good news at all -- the examples given are a new car and a promotion, but the gist of the pretty limited study was sharing accomplishments.

From which I have to say that nostrademons is somewhat on the mark. It is unfortunate that many people project their own negative feelings (jealousy, resentment) onto other people's statements, to the point where our lives have to be presented as a shit sandwich if we ever really want to share good news without "bragging" -- "Pretty surprised that a dumb jerk like me got a promotion. I'll probably end up getting myself fired." It's easy enough to do, but ultimately it's usually lying to yourself and the audience just to pander to other people's insecurities, and it just adds noise to the whole world.

And one of the best examples of this is the infamous Facebook baby update: To people who care about the person, those are wonderful bits of news about one of the most important events in the lives of someone you care about. To other people it's some asshole trying to lord over them just because they can procreate... I mean, any asshole can go and have a kid, and really, do you need three pictures of a little shit-and-noise machine in a whole week? Who do they think they are...

"Bragging" is in the eye of the beholder more often than not, and generally it's a case where people increasingly have to filter facts about their lives lest they cross the listener's "bragging" line in the sand. Many people love to hear your bad news, but they resent your good news.


Rather incredibly, the comments and moderation throughout this very story are a microcosm of what is being discussed -- people drag so much baggage into these discussions, with pekk just loading their comment with a mountain-sized chip on their shoulder.

My comment above has been down"voted" yet is absolutely accurate -- this rather weak study has zero to do with someone saying they're a 10x programmer or the "best" PHP coder. It specifically asks about sharing accomplishments. Cue loads of baggage and justifications for people's entirely negative reactions.


Not worth getting excited about. I started this sub-thread because I thought it was an intriguing, somewhat counter-intuitive explanation for the experiment's results. I think that the discussion in the sub-thread has provided pretty ample evidence of at least the meta-point I was trying to explore. But I don't bear any personal ill-will toward other commenters here; other peoples' perspectives are their own, and part of the point of the Internet is to share them.


There's a large retailer here in Canada -- Canadian Tire -- that I personally think should be legally forbidden from selling a good portion of the products they sell: Tools that break on first use. Toys that have a very short path to the landfill (I feel a pang of guilt when a relative gifts one of my children a "New Bright" or whatever utter junk brand toy it is that they picked up at CT, knowing it won't make it through the night, immediately gauging just how large a garbage bag it will need). Ultra-low-quality outdoor wear and tents. Cheaply made bicycles. BBQs that rust out 3 months into their life.

They are purveyors of low-satisfaction, garbage-dump-filler products that only barely fulfill their stated purpose. Most consumers have become so accustomed to this that they don't even realize there's an issue.

It's a serious problem. A minimum level of durability for a given purpose is one of those things that benefits the commons -- it is good for the entire planet.


There's a reason they have the nickname Crappy Tire, eh?


A Dell R920 with four E7-8880L v2 15-core processors (i.e. 60 physical cores, 120 threads with HT) and 1024GB / 1TB of memory costs about $50,000 USD. Going to 1.5TB of memory pushes you to $60,000 USD.

Expensive relative to a low-end server or a month of cloud usage, but that's an absurd amount of computational power.
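For a rough sense of scale, here's the per-core and per-GB math implied by the quoted configuration (a back-of-the-envelope sketch using only the prices stated above):

```python
# Unit-cost arithmetic for the quoted Dell R920 configuration.
price_usd = 50_000
cores = 60        # 4 x 15-core E7-8880L v2
ram_gb = 1024     # 1TB of memory

print(round(price_usd / cores))   # ~833 USD per physical core
print(round(price_usd / ram_gb))  # ~49 USD per GB of RAM
```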


As someone who extensively uses pgsql and SQL Server, I don't think it's fair to say that PostgreSQL is "more advanced" because it has one particular, somewhat trendy feature. There are many critical features that pgsql either lacks (query parallelization) or got many years after SQL Server (index-only scans, true materialized views, etc.).

And really, JSON in the database is just the more modern version of XML in the database, which of course SQL Server has supported for years. The problem with XML in the database was not a fundamental one, but rather an implementation issue -- where the JSON model in most solutions is some variation of toss-some-structured-data-in and then work with it, the XML model of SQL Server required significant, poorly documented, confusing-to-implement configuration to work with in any meaningful way. It really killed the feature.

Personally I still think XML is superior to JSON, but it got usurped by the architectural astronauts who kept layering noise on it to the point of being unusable.


I thought it was implied by the context, but "more advanced" obviously meant "more advanced in terms of JSON support". Sorry if clarification was needed.

What SQL Server 2016 will have next year seems to be more limited than what PostgreSQL 9.2 had (by 2012). PostgreSQL at least had a native datatype ("json") to store text that is syntactically valid JSON, while in SQL Server, 4 years later, you may have invalid JSON data in a column expected to hold JSON (unless you set up the validation as a constraint, which is prone to error and cannot be used in functions that expect or return JSON data anyway). Plus, PostgreSQL had by 9.2 many functions to query JSON types in a very easy manner, while I don't see the same functionality coming to SQL Server 2016.

Regarding indexing, by 9.2's time you could use functional indexes to index arbitrary JSON paths (one index per path). In SQL Server you'd only be able to use computed (persisted) columns to index JSON paths, which seems a less elegant and less performant solution (more storage required) to the same problem.

So, all in all, I honestly think that in terms of JSON support SQL Server 2016 will be less advanced than 2012's PostgreSQL 9.2, and definitely way less advanced than the current 9.4, let alone this year's 9.5. But that's, of course, only my opinion :)


JSON's been around for over ten years and is the default serialization format for most web frameworks. To call it trendy in the same breath as singing the praises of XML is a bit gauche.


JSON is used for persistence in a limited number of projects, and still lacks fundamental things like a validation schema or even a date type. Yes, it absolutely is "trendy", and it absolutely pales compared to XML. They both fill virtually the identical requirement, one just slowly repeating all of the mistakes of the other, as history tends to repeat itself.


JSON is an improvement overall, even though there are certainly issues. It is easier to read, easier to parse, and usually fewer bytes for the same data. A validation schema can be provided by JSON Schema [1]. The lack of a date type and comments is annoying, but not that big of a deal. I think parsing and reading are far more important to get right, which XML didn't.

1. http://json-schema.org/
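The "easier to parse" point is concrete: with only the standard library, JSON deserializes straight into native data structures, while XML yields element trees of strings you convert yourself. A minimal sketch (the `user` record is an invented example):

```python
import json
import xml.etree.ElementTree as ET

json_doc = '{"user": {"name": "Ada", "age": 36}}'
xml_doc = '<user><name>Ada</name><age>36</age></user>'

# JSON maps directly onto dicts, lists, strings, and numbers.
user = json.loads(json_doc)["user"]
print(user["name"], user["age"])  # Ada 36

# XML parses into elements of text; type conversion is on you.
root = ET.fromstring(xml_doc)
print(root.findtext("name"), int(root.findtext("age")))  # Ada 36
```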


> but not that big of a deal. I think parsing and reading are far more important to get right which XML didn't.

...in your opinion, for your use cases. I think XML is eminently more readable because it has named types, and because I never have to read the following and figure out what goes where:

    }}}}}]]}}]]}}
JSON is great for sending data to Javascript though and I'm pretty sure that's the only reason anyone is using it.


You have the same problem in XML with that amount of nesting. Then you pretty print the XML or JSON and problem solved.


Worth noting that there is an Instagram Hyperlapse-type app on Android called "Gallus". It's made by some guy (i.e. it lacks polish), and apparently isn't perfect on all devices (1), but my experience has been that it yields as good if not better results on my Nexus 5 than Instagram's app on my iPhone.

It has gone almost entirely unnoticed, but its output is a world better than this Microsoft app's. I get that they're going for 'the videos you already have', but if I have a mobile device with so many sensors, why wouldn't I use them?

(1) - I mention that because whenever something pushes the edge, if someone finds that it doesn't work on their own Android handset, they tend to be the absolute loudest about it, declaring that it therefore must not work on anyone's handset. It is the most bizarre aspect of apps in Android land.


Any discussion about agile invariably yields the predictable flurry of personal hangups and misrepresentations.

Most common of all, though, are the claims that if you just have a great team full of great developers, you don't need a process. Agile and waterfall both be damned.

But most teams aren't great. Most of the teams of people participating on HN statistically aren't great. But those teams still need to develop something -- banks, big cos, and countless firms are churning out code by the millions of lines, written by millions of completely average developers -- and agile is an honest conclusion that the classic method of software development doesn't work, most critically because it bites off work in such enormous chunks that implementation or design issues -- the fit to the problem -- aren't discovered until enormous effort has been wasted.

So talk about how agile failed, in your estimation, at some shop or another. But the truth is that almost anything would be a "failure" versus the idealized notion of a perfect design built expertly by great developers. Agile is trying to make the best of a normal situation.


"But most teams aren't great"

Most teams work hard enough to become great. Most teams work long enough hours to become great. Most teams are smart enough to become great. Most teams even have the desire to become great.

So what is stopping them?

At the developer level: Lack of autonomy, lack of purpose, and lack of mastery. These issues aren't fixed by Agile.

At the management level: High turnover. Lack of attention/focus on things that aren't easily measurable. Lack of leadership. Constant arbitrary deadlines. A short term focus. Politics. Again, none of it fixed by Agile.


This is a great insight, and describes some of the projects I've worked on exactly. I think a fix to these two problems is to hire great developers that you can trust, and give them much bigger tasks. I find that the management breaks things up into too small of tasks so that they can measure it, which creates a bottom up approach, and that's just not how good software works. I prefer the top down approach, which means design the code from a high level (architecture, components, overall structure) then drill down and build the different parts.


I'm not sure what your argument is here. Because the classical model failed agile can't fail too?

I think you're also misunderstanding "great teams". I at least do not mean a team of great developers, but a team that works well together. It's a great team, not a great group of developers. You can have the best developers in the world, it won't work if they can't stand each other.


The "argument" is that making software is an ugly, brutish affair in virtually every case, with virtually every approach. There are extraordinarily few cases throughout history where a team converged, looked at the problem, built a solution, and voila, everyone emerged happy. In the real world there is always discord between requirements and reality, skillsets and the problem space, change management and the need for rapid change, scope creep, and on and on.

It is the story of every software project everywhere throughout time.

But invariably the comments will fill with tales of woe with agile (usually from people betraying a complete lack of understanding of it -- as seen with most uses of the term "scrum" throughout these discussions; it serves as the canary in the coal mine when noxious gases are afoot), as if troubled projects are some unique reality of agile. They're a reality of every single methodology. Agile merely tries to reduce a few of the bigger and most deadly issues, which are friction with the business (the ability to adapt to change) and enormous investments of time and money in a project that ends up being a solution to the wrong problem. It doesn't suddenly make everyone cooperative and great.

As to my "misunderstanding", so the solution as you see it is just to have developers that work well together. Easy peasy solution.


You've created a strawman to attack by citing waterfall.

Stop citing waterfall as the reason for agile methodologies. It's like citing the lack of evidence for a Christian God as the reason science exists, instead of pointing out how effective science has been in helping us control our environment.

And if you can't describe how effective agile has been, then you have no reason for your opinion outside of religious views. Which is OK if it works for you, but don't attack a strawman as a way of dismissing other people's experiences.


Speaking of strawmen, where did I cite waterfall? The entire foundation of your post is a fiction.

Software is tough. It is always, and has always been tough. The same flexibility that is the great benefit of the domain is also its curse. This is true across any methodology, so try not to project your own hangups into that statement.

Agile is something that some shops try to do, imperfectly, against that reality. Every one of these angry anti-agile comments seem to opine that agile is terrible versus some mythical alternative that is shapeless and amorphous, boiled down to "have a great team". That is, quite simply, nonsense. It is the pat solution of the bottom feeder.

Religious views? Again, the entire foundation of the anti-agile screed, which is generally by people with a chip on their shoulder (and they probably had a chip on their shoulder about everything that people with more influence over them got adopted) is that it is deficient compared with an unspoken, unstated alternative.

And I'm the one bringing religion in this? Christ.


I may have responded to the wrong person (or you've edited your post) as the post I meant to respond to was most definitely comparing waterfall to agile.

It's very common for agile proponents to push Waterfall as the necessary alternative to agile, but that's a logical fallacy.


> In the real world there is always a discord between requirements and reality, skillsets and the problem space, change management and the need for rapid change, scope creep, and on and on.

We follow agile. At the beginning of iteration planning, the dev team has to task out the items and score their complexity, and then decide the cut-off point as to what can and cannot be achieved within the 2-week iteration.

It's taken us several years, but this ruthless cycle of feedback, and responsibility has led us to a point, where we scope it right about 75% of the time.

Having also been involved in projects that have overrun deliverable dates by years - I wouldn't have believed it possible for a software-management process to work so well.

Of course it helps that there is genuine philosophical buy-in, and that our revenue is derived from our software (i.e. we're not a cost centre).


Once again.

> Because the classical model failed agile can't fail too?

> so the solution as you see it is just to have developers that work well together. Easy peasie solution.

No one said it's easy. No one said you can just have it. No one claims it's the solution to how we build software.

> usually with people betraying a complete lack of understanding of it, as an aside, such as seen with most uses of the term "scrum" throughout these discussions

Why don't you just tell me you think I completely lack understanding of the subject matter?


The beauty of Agile is: if you question or criticize it, you don't understand it; if it didn't work for you, you didn't apply it properly; if you applied it rigorously and it failed, you didn't have a deep understanding of it. Was it Sartre who said "it is what it is not"?


Agile is not an "it"; agile is not static. Application of agile principles means guiding and improving development processes. This is not a damned-if-you-do, damned-if-you-don't scenario. As the article suggested, you have to "inspect and adapt". If something did not work for you, then capture the results, perform a lessons-learned, and try again.

To say "apply agile methods", is to apply a nimbleness of the mind. This is where not everyone "gets it" right away. If someone suggests a paradigm shift from what you as a developer are accustomed to, you may very well fall flat on your face. But to apply agile methods means to be able to get back up and try a different approach.


The very definition of agile seems itself to be very "agile" and constantly changing depending on the discussion.


None of the people who complain about agile offer any viable alternatives. I get the impression that these are developers who simply hate being managed period, and would prefer that management never spoke to them at all.


I certainly wouldn't mind if the middle manager who held an hour-long meeting today never spoke to me again. The first slide was MLK with "I have a dream", which then continued onwards with 15 minutes talking about how to move post-it stickers on a whiteboard and 15 minutes spent on a failed analogy between a GPS and continuous re-evaluation of when the next release will be done.

At the same time, management has decided on a release cycle that is 10 months (or in reality, after the inevitable delays: 12).


Because there isn't a one-size-fits-all approach that works well. The most effective project I have worked on had ~15 hours of meetings every week.


This terrible article is actually a great example of a submarine. Note the mention of a completely irrelevant "security" company halfway down, and then a link to their blog post.

In any case, it's terrible because it takes testing criteria under extreme conditions (don't store your SSDs at 55C) and then fearmongers them as the norm.


submarine?


It's a reference to a PG essay where he talks about how PR agencies often give press releases and news stories that are little more than hidden adverts by reporting selective truths: http://www.paulgraham.com/submarine.html


Sounds similar to the 'churnalism' concept.

https://en.m.wikipedia.org/wiki/Churnalism


"Patio11, back when he still did consulting, charged 30k/wk"

He says, adding a heavy draw to an initiative. Absolutely nothing and no one can validate those incredible claims, though. Every engagement was, to us, like the girlfriend from Canada.


If you said you could increase my bottom line by at least $300k by working on some low-hanging fruit that I've missed so far, I wouldn't hesitate to give you $30k... Of course, I'd pay you after I have results, though.


I'm a cynical sort with a lot of experience consulting in the industry, and I can't help but view a lot of these pieces as heavily... embellished. Akin to "fake it till you make it". A la "I'm so overwhelmed with business at $30k per week... but would you like to buy some tupperware?"

I understand the author is here, and hopefully people don't embarrassingly downvote this purely to try to act civil -- I think the embellishment is doing a massive disservice to the entire industry. It is often complete fantasy. I will also say that I found the claims the other day about the $30k+ per week consulting engagements, apparently chosen at whim, completely ludicrous (again, having worked with countless organizations). They have zero verification or believability, but are a nice draw for someone's new initiative.


i can't speak for anyone else - but by the same token, i'm not embellishing anything. verifiable? no, because i'm not sharing my bank account statements. believable? definitely - most of my friends make about what i make a year freelancing, so it's not totally out of scope or outside the norm (i also never said i make $30k/week freelancing - some weeks i do, but others, i definitely don't).

i've built a career and an audience by being very honest, even about my epic failures and shortcomings. i know you don't know me from adam - but luckily my audience and client-base does, and really they're the ones i cater to. my pricing is listed publicly on my website, and i wouldn't do that if clients weren't willing to pay for it (i don't have a sugar daddy or sugar mama to support me).

i also appreciate that you voiced your opinion in an actually civil way! i hope it doesn't get down voted.


The 30K thing was related to a post on here a few days back where an author claimed that they had their pick of "30K / week" engagements. As an interesting contrast with your policy, they claimed that they kept their rates hidden, thus allowing them to magically and endlessly ramp it up.

In any case, I've long come to learn in the freelance industry that the more people talk about their success (which can be "so many clients I turn them away", or "I am now charging 50K+"), the more unlikely that success actually is. e.g. the guy who puts "published author" at the top of his blog likely has an Amazon Digital Delivery PDF book. The guy who talks about "top clients" had someone from Apple visit their blog once. And so on.

The more bombastic about success someone is, the more incredible the claims, the more likely they're struggling for relevance, using the escalating accomplishments to create it. Again, this has been my experience in the industry, where when you look behind the curtain you find Lenny asking you not to tell anyone how he lives.

In no way am I saying that about you, but in general HN has been awash in "so much success I have to wear shades... now would you like to buy some tupperware?" type posts (I'm not being facetious with the tupperware thing -- it's people claiming boundless success, and then pitching something so petty that it puts it all in stark contrast), and I have to think it does a massive disservice to most of the readership, leaving them with the notion that there are endless lines of clients throwing money around for vague needs.


Why do you find it ludicrous? The organization I work for just spent $50,000 for work that many of its employees could have done in about a week. However, since we're all booked up with work for the next 5 years, it made sense to farm it out.

Various flavors of this are probably playing out in those "ludicrous" $30k/week engagements.


Well, I thought like you for a long time, because the idea that you can make more by turning down some work and marketing yourself in a niche is very counterintuitive.

Before trying this approach, I lost a lot of time racing to the bottom.


I didn't say that it was counterintuitive, or that you can't turn down clients.

I was observing the reality that most freelancers are more likely to find themselves absolutely starving for business, and shouldn't look at this and wonder what they're doing wrong. Telling a client to keep their money because it's an "ego project", when they're your only client over a month period, is a quick recipe for bankruptcy. It doesn't magically make other clients come knocking.

Once you get to the point where you're reaching saturation, sure you start to evaluate engagements for most reward/utility/etc. Many if not most freelancers will never come anywhere close to that.


One alternative is to live below your means and build up some savings so that you are not living paycheck to paycheck as a freelancer because that is even more of a recipe for disaster.

Look, $30k/week does sound ridiculous to me too, but Patrick doesn't strike me as a bullshitter either. He's just a great combination of programmer, businessman and marketer. As far as I remember, he never claimed to have a continuous stream of $30k/week projects lined up, those are words you're putting in his mouth to make it appear more embellished than it actually is.

The very terminology you use—"saturating" one's freelance schedule—is a mindset I've been in before, and I now think is dangerous. If you are overworked and stressed about money and billable hours, there is no free time or headspace to move up the food chain. Even worse, the quality of your work can slip due to minor hiccups like scope creep, unforeseen rabbit holes, or just getting sick for a few days; then you are undermining your future pitches and word-of-mouth!

Admittedly it's not easy to start charging more, you need to find the right clients, and you need to shift to a value-based billing where you take more risk on yourself. This is phenomenally difficult for anyone with expenses and wage-earner's mindset. However I wouldn't write off Patrick as being too privileged, lucky, or any other excuse—those things may be true, but that doesn't mean there aren't lessons that can be applied to people grinding it out at a lower level.


Saturated means that you've filled (or can fill at will) the time you've allotted and targeted, not 100% of your possible time. e.g. 100% of 50% of available time.

>he never claimed to have a continuous stream of $30k/week projects lined up, those are words you're putting in his mouth

He claimed it was his so-called "rack rate" -- his standard rate -- over a year-long period. The prior year his "rack" rate was $20k/week. He didn't say "I put out a ridiculous rate and wouldn't you believe it, this one guy bit!", but claimed that it was an ongoing, day-to-day rate that saw continued success. That was sort of the whole, rather odd, basis of that part.

Extraordinary claims, as they say, require extraordinary proof. Yet we don't know a single client, or a single project, that paid these grossly out-of-the-ordinary rates, and I can tell you that I don't know a single organization that would even consider such a rate for an individual freelancer (respected consulting shops -- the sort that you hire because no one ever gets fired for hiring them -- with armies of bodies yield less on their contracts). People can wave their hands and say that it's because it saved so much, but that sort of argument leaves me agape, wondering if a bunch of people just discovered the business world.

It just doesn't work like that.

I have absolutely no fundamental reason to believe Patrick a "bullshitter", or to accuse him of being so. On the flip side, I have absolutely no reason not to consider that he might be. Because it turns out that a remarkable number of people are -- particularly when they're trying to get attention for something -- somewhat aggrandizing. And we all know how important signalling is, and how, the myth goes, if you act the part, maybe you'll become the part. It plays a part in quite a few HN front pagers.


> The prior year his "rack" rate was 20K/week. He didn't say "I put out a ridiculous rate and wouldn't you believe it this one guy bit!", but claimed that it was a ongoing day to day rate that saw continued success.

But how many weeks do you have to work at that rate to constitute "continued success"? He came off a salaryman job, if he kept his expenses in check he would only need a couple of those jobs per year.

> (respected consulting shops -- the sort that you hire because no one ever gets fired for hiring them -- with armies of bodies yield less on their contract)

The type of companies that hire those consultancies are generally big, slow, and bureaucratic. They have plenty of money, but they don't have the agility to hire a single consultant and effect quick changes that move the needle. A corollary is that they also have established tech staffs and probably have some reasonable level of optimization.

By contrast there are a lot of mid-tier companies that have revenues in the 6-8 figure range which are generally not operating their tech with an enterprise level of sophistication. They fly under the radar of the tech establishment because they aren't tech-focused and they aren't really attractive to the big consultancies, but they are profitable because of domain knowledge and relationships in small niches. In these companies a hybrid technical/business consultant can move the needle far more than he can in a large company.

