Much like search engines combat black hat SEO, Amazon and other review systems can do the same.
It wouldn't be too hard for Amazon. Use reviews that are a 'Verified Purchase' and highly 'helpful' as a training/test set, then learn a weighting for every review based on features of the product, the reviewer, the other reviews, etc.
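A minimal sketch of the idea in pure Python: label trusted (verified + helpful) reviews as positives, fit a logistic regression by SGD, and use the model's probability as the review's weight. Amazon's actual features and model are unknown; the feature set here (scaled account age and review count) and the synthetic data are assumptions for illustration.

```python
import math
import random

random.seed(0)

def features(trusted):
    # Hypothetical features, scaled to roughly 0-1:
    # [account_age, review_count]. Real systems would use far more.
    if trusted:
        return [random.uniform(0.5, 1.0), random.uniform(0.3, 1.0)]
    return [random.uniform(0.0, 0.3), random.uniform(0.0, 0.2)]

# Synthetic training set: 1 = trusted review, 0 = suspect review.
X = [features(True) for _ in range(100)] + [features(False) for _ in range(100)]
y = [1.0] * 100 + [0.0] * 100

# Logistic regression trained with plain SGD on log-loss.
w, b, lr = [0.0, 0.0], 0.0, 0.5
for _ in range(1000):
    for xi, yi in zip(X, y):
        z = sum(wj * xj for wj, xj in zip(w, xi)) + b
        p = 1 / (1 + math.exp(-z))
        g = p - yi  # gradient of log-loss w.r.t. z
        w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
        b -= lr * g

def review_weight(x):
    """Model probability that a review is trustworthy -> its weight."""
    z = sum(wj * xj for wj, xj in zip(w, x)) + b
    return 1 / (1 + math.exp(-z))

print(review_weight([0.9, 0.8]))    # established reviewer: weight near 1
print(review_weight([0.05, 0.05]))  # brand-new account: weight near 0
```

Fake reviews from throwaway accounts would land near zero weight, so they'd barely move a product's score.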
The outcome could actually hinder tactics like this for products that game the system, much like how link farms ended up hurting the sites they were propping up once Google figured out how to stop sites from gaming search.
In this case, the positive fake reviews are for verified purchases of ebooks that are relatively cheap... I'd suggest that reviews from accounts more than 2 years old, with over 5 reviews, and more than $500 spent would probably make a better baseline for training.
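The suggested baseline is easy to state as a filter. The thresholds come from the comment above; the field names and record shape are made up for illustration.

```python
from datetime import date, timedelta

def good_baseline_reviewer(account_created, num_reviews, dollars_spent,
                           today=date(2015, 1, 1)):
    """Suggested baseline filter: account older than 2 years,
    more than 5 reviews, more than $500 spent."""
    return (today - account_created > timedelta(days=2 * 365)
            and num_reviews > 5
            and dollars_spent > 500)

# A long-standing, active account passes; a fresh throwaway does not.
print(good_baseline_reviewer(date(2011, 6, 1), 40, 1200.0))  # True
print(good_baseline_reviewer(date(2014, 6, 1), 2, 10.0))     # False
```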
It's always a cat and mouse game, however. Once they employ machine learning, I'm assuming the spammers will adapt and create reviews that fool the algorithm (though at an increased initial cost).
Trusting "verified purchase" reviews is a pretty easy way to allow a lot of fake reviews. On a 99c ebook your royalty is 35%, which means a verified fake review only costs you about 65c: you buy your own book for 99c and collect the 35% royalty back.
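The arithmetic behind that figure, spelled out (the 99c price and 35% royalty rate are from the comment above):

```python
# Net cost of a "verified" fake review: pay the list price for your
# own ebook, collect the royalty back, and you're out the remainder.
price = 0.99
royalty_rate = 0.35
net_cost = price * (1 - royalty_rate)
print(round(net_cost, 2))  # about 64-65 cents per verified review
```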
To me, running gevent's monkey-patches in production is a sign of desperation for speed over all else. And the desperation isn't necessary, because there are better options now.
True, its comparisons do a lot better with more features.
This paper just looks to show the major winning aspect of using ConvNets: they don't need many hand-engineered features, since the deep net learns its own representations of the training data. It's more to show that ConvNets work on more than just vision.
But architecting the pooling layers IS adding complexity to the simple input feature set. Therefore the comparison should be against only state-of-the-art ML.
Uh, this metric is nuts, because it views anything that isn't 9-5 on a payroll as 'not a real job'. That ignores everything from independent contractors to seasonal work. These are huge components of the economy and always have been.
So what does a movement in the metric even mean? Does it mean that people are working less, or that more people are self-employed? Or staying students longer?
I think wages are a more insightful way of estimating labor supply and demand. It's not great news:
"U.S. real (inflation adjusted) median household income was $51,939 in 2013 versus $51,759 in 2012, statistically unchanged. The 2011 level was $51,842 and the 2010 level was $52,646.
In 2013, real median household income was 8.0% lower than the 2007 pre-recession level of $56,436.[1]" (Wikipedia)
There is no 1GB limit. I think you might be confused about compute units? Compute units are just a measure of RAM usage over time -- a compute unit is 1GB of RAM used for one hour. An app can use more than 1GB of RAM; it will just consume compute units faster. E.g. an app using 2GB will consume a compute unit in 30 minutes.
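The metering described above is just RAM multiplied by time. A one-line sketch, using the 1 GB-hour definition from this comment:

```python
# A compute unit is 1 GB of RAM used for one hour,
# so units consumed = GB in use * hours running.
def compute_units(ram_gb, hours):
    return ram_gb * hours

print(compute_units(1, 1))    # 1 GB for an hour -> 1 unit
print(compute_units(2, 0.5))  # 2 GB burns units twice as fast -> 1 unit
```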
This all relates to our upcoming managed hosting. Self-hosted installs are not metered since it's your own hardware.
Hi there, I hope I have replied to your query with my answer above. Please don't call this a 'scam'; as a self-taught coder, I had to work pretty hard to launch this app. Most of the reviews you see are from friends and family. They know me, trust me, and know that what I am doing is not a scam. Nothing shady about it, buddy.
I understand your defensiveness at his blasting, but it's feedback (albeit cloaked in antagonistic language). By responding you draw attention to it. I'd ignore it and see if others say the same thing.
I do agree with him on one thing: "feed poor children" can send a negative message. For him it's scammy; for me it's "ugh, another poorly-thought-through TOMS knockoff?" Because feeding poor children isn't at all connected to first-world people learning new vocab.
Unless you really believe in feeding poor children, have a good story to back it up, and can quickly communicate that, something related to literacy would make more sense. "Learn new vocab and donate books to poor children" sounds a lot better.