Question for the author: how do you actually deploy your model? Do you have a dependency on Spark in your production system?


Since it's just a dot product between the learned weights and the feature vector, we do this in the application layer, as lacksconfidence surmised.
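Not the author, but for anyone curious, here's a minimal sketch of what that application-layer scoring could look like, assuming the trained weights and bias are exported as plain arrays (names here are illustrative, not their actual code):

  # Score a feature vector with a linear model: a dot product plus a bias term.
  # "weights" and "bias" would be loaded from wherever the training job writes the model.
  def score(weights, bias, features):
      assert len(weights) == len(features)
      return sum(w * x for w, x in zip(weights, features)) + bias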


Since this is a linear SVM, which is evaluated as a simple weighted sum of features, I imagine they reimplemented that in the application layer. I'd be curious how they handled the normalization steps (did they reimplement those as well?)


Yep. We normalize our features as part of training, and the stdevs of each feature are part of the resulting model, along with the weights. (The means are always 0 because of the way we construct our training set.) The weights we use in production are actually normalized_weight / stdev.
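For concreteness, a sketch of that folding step (hypothetical names; the point is that dividing each weight by its feature's stdev lets production score raw features with a single dot product, since the means are zero):

  # Offline, after training: fold each feature's stdev into its weight.
  production_weights = [w / s for w, s in zip(normalized_weights, stdevs)]

  # Online: scoring raw (un-normalized) features with the folded weights matches
  # scoring normalized features with the original weights, because the means are 0:
  #   sum((w / s) * x) == sum(w * (x / s)) == sum(w * x_normalized)
  def score_raw(production_weights, bias, raw_features):
      return sum(w * x for w, x in zip(production_weights, raw_features)) + bias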



