machine learning - How to serve a Spark MLlib model?


I'm evaluating tools for production ML-based applications, and one of our options is Spark MLlib, but I have some questions about how to serve a model once it is trained.

For example, in Azure ML, once trained, the model is exposed as a web service that can be consumed from any application, and it is a similar case with Amazon ML.

How do you serve/deploy ML models built with Apache Spark?

On the one hand, a machine learning model built with Spark can't be served the way you serve models in Azure ML or Amazon ML in the traditional manner.

Databricks claims to be able to deploy models using its notebooks, but I haven't tried that yet.

On the other hand, you can use a model in three ways:

  • Training on the fly inside the application and then applying the prediction. This can be done in a Spark application or in a notebook.
  • Train the model and save it if it implements an MLWriter, then load it in an application or a notebook and run it against your data (a rough sketch of this follows after the list).
  • Train the model with Spark and export it to PMML format using jpmml-spark. PMML allows different statistical and data mining tools to speak the same language; this way a predictive solution can be moved among tools and applications without the need for custom coding, e.g. from Spark ML to R (see the second sketch below).
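For the second option, here is a minimal sketch of saving a fitted pipeline and loading it back in another application or notebook. The dataset (columns f1, f2, label) and the paths are hypothetical placeholders, not anything prescribed by Spark:

    import org.apache.spark.ml.{Pipeline, PipelineModel}
    import org.apache.spark.ml.classification.LogisticRegression
    import org.apache.spark.ml.feature.VectorAssembler
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("train-and-save").getOrCreate()

    // Hypothetical training data with feature columns "f1", "f2" and a "label" column.
    val training = spark.read.parquet("/data/training.parquet")

    val assembler = new VectorAssembler()
      .setInputCols(Array("f1", "f2"))
      .setOutputCol("features")
    val lr = new LogisticRegression().setLabelCol("label").setFeaturesCol("features")
    val pipeline = new Pipeline().setStages(Array(assembler, lr))

    // Fit and persist the whole pipeline; a fitted PipelineModel supports write/save.
    val model = pipeline.fit(training)
    model.write.overwrite().save("/models/lr-pipeline")

    // Later, in a different Spark application or notebook:
    val reloaded = PipelineModel.load("/models/lr-pipeline")
    val scored = reloaded.transform(spark.read.parquet("/data/new-records.parquet"))
    scored.select("f1", "f2", "prediction").show()

Saving the whole pipeline rather than just the final model keeps the feature preparation and the estimator together, so the serving side only needs to call transform on raw input.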

Those are the three possible ways.
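For the third option, the JPMML-SparkML project converts a fitted pipeline into a PMML document. A minimal sketch is below, reusing the training DataFrame and fitted model from the previous sketch; the exact class names (PMMLBuilder here) depend on the jpmml-sparkml version you use, so treat this as an assumption rather than the definitive API:

    import java.io.File
    import org.jpmml.sparkml.PMMLBuilder

    // "training" and "model" are the DataFrame and the fitted PipelineModel from
    // the sketch above; buildFile writes the PMML document to the given location.
    new PMMLBuilder(training.schema, model)
      .buildFile(new File("/models/lr-pipeline.pmml"))

The resulting .pmml file can then be scored by any PMML-aware engine (JPMML-Evaluator, R's PMML tooling, and so on), which is what makes this route tool-agnostic.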

Of course, you can also think of an architecture in which you have a RESTful service in front, which you could build using spark-jobserver for example to train and deploy, but it needs some development; it's not an out-of-the-box solution.

You might also use projects like Oryx 2 to create your full lambda architecture to train, deploy and serve a model.

Unfortunately, describing each of the solutions mentioned above is quite broad and doesn't fit in the scope of SO.

