How to submit batch jar Spark jobs via the Livy Programmatic API
I want to submit batch jar Spark jobs using the Livy Programmatic API, the way the REST API's /batches endpoint does. I have this JSON payload:

{
  "className": "org.apache.spark.examples.SparkPi",
  "queue": "default",
  "name": "SparkPi by Livy",
  "proxyUser": "hadoop",
  "executorMemory": "5g",
  "args": [2000],
  "file": "hdfs://host:port/resources/spark-examples_2.11-2.1.1.jar"
}

but I cannot find any documentation about this. Is this possible? How?
java scala apache-spark livy
asked Nov 21 '18 at 3:34 by 寒江雪
1 Answer
Yes, you can submit Spark jobs via the REST API using Livy. Follow these steps:
- First, build the Spark application, create the assembly jar, and upload it to the cluster storage (HDFS) of the Hadoop cluster.
- Submit the job using curl for testing (see the sample request below), then implement it with an HTTP client API.
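For example, assuming Livy runs at http://livy-host:8998 (hostname and port are placeholders for your cluster), a test submission of the question's payload might look like this; note that Livy expects "args" as a list of strings:

curl -s -X POST -H "Content-Type: application/json" \
  -d '{
        "className": "org.apache.spark.examples.SparkPi",
        "queue": "default",
        "name": "SparkPi by Livy",
        "proxyUser": "hadoop",
        "executorMemory": "5g",
        "args": ["2000"],
        "file": "hdfs://host:port/resources/spark-examples_2.11-2.1.1.jar"
      }' \
  http://livy-host:8998/batches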
Sample code to submit a Spark job using the Apache HttpClient in Scala:
import org.apache.http.client.methods.{CloseableHttpResponse, HttpPost}
import org.apache.http.entity.StringEntity
import org.apache.http.impl.client.{CloseableHttpClient, HttpClientBuilder}
import scala.util.parsing.json.{JSONArray, JSONObject}

def submitJob(className: String, jarPath: String, extraArgs: List[String]): JSONObject = {
  // POST to Livy's /batches endpoint; clusterConfig.livyserver holds the Livy server URL
  val jobSubmitRequest = new HttpPost(s"${clusterConfig.livyserver}/batches")
  val data = Map(
    "className" -> className,
    "file" -> jarPath,
    "driverMemory" -> "2g",
    "name" -> "LivyTest",
    "proxyUser" -> "hadoop")
  // Map is immutable, so keep the result of the addition (the original discarded it);
  // wrap the args in a JSONArray so they serialize as a proper JSON list
  val payload =
    if (extraArgs != null && extraArgs.nonEmpty) data + ("args" -> JSONArray(extraArgs))
    else data
  val json = new JSONObject(payload)
  println(json.toString())
  val params = new StringEntity(json.toString(), "UTF-8")
  params.setContentType("application/json")
  jobSubmitRequest.addHeader("Content-Type", "application/json")
  jobSubmitRequest.addHeader("Accept", "*/*")
  jobSubmitRequest.setEntity(params)
  val client: CloseableHttpClient = HttpClientBuilder.create().build()
  val response: CloseableHttpResponse = client.execute(jobSubmitRequest)
  // HttpReqUtil is a helper from the linked sample project; it parses the HTTP
  // response and returns the body as a JSONObject
  HttpReqUtil.parseHttpResponse(response)._2
}
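A minimal usage sketch with the jar from the question (assuming clusterConfig.livyserver points at your Livy server):

val result = submitJob(
  "org.apache.spark.examples.SparkPi",
  "hdfs://host:port/resources/spark-examples_2.11-2.1.1.jar",
  List("2000")) // pass arguments as strings
println(result) // Livy's JSON response, e.g. the new batch id and its state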
Please refer to this post for more details:
https://www.linkedin.com/pulse/submitting-spark-jobs-remote-cluster-via-livy-rest-api-ramasamy/
There is a sample project at the following link:
https://github.com/ravikramesh/spark-rest-service
answered Nov 29 '18 at 16:34 by Ravikumar
@寒江雪: if you are okay with the answer, please consider accepting it as the question owner! – Ram Ghadiyaram, Nov 30 '18 at 2:46