How to submit batch jar Spark jobs via the Livy Programmatic API


I want to submit batch jar Spark jobs using the Livy Programmatic API, similar to how the REST API's batches endpoint is used. I have this JSON data:



    {
      "className": "org.apache.spark.examples.SparkPi",
      "queue": "default",
      "name": "SparkPi by Livy",
      "proxyUser": "hadoop",
      "executorMemory": "5g",
      "args": [2000],
      "file": "hdfs://host:port/resources/spark-examples_2.11-2.1.1.jar"
    }


but I cannot find any documentation about this. Is it possible, and if so, how?


java scala apache-spark livy

asked Nov 21 '18 at 3:34 by 寒江雪

1 Answer

Yes, you can submit Spark jobs via the REST API using Livy. Follow the steps below:




• First, build the Spark application, create the assembly jar, and upload the application jar to the cluster storage (HDFS) of the Hadoop cluster.

• Submit the job using curl (for testing), then implement it with an HTTP client API (a sample curl request is sketched below).

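A minimal curl sketch of the batch submission, assuming a Livy server at http://livy-host:8998 (placeholder host and port) and the jar path from the question; note that Livy expects "args" as a list of strings:

    curl -s -X POST "http://livy-host:8998/batches" \
      -H "Content-Type: application/json" \
      -d '{
            "className": "org.apache.spark.examples.SparkPi",
            "file": "hdfs://host:port/resources/spark-examples_2.11-2.1.1.jar",
            "name": "SparkPi by Livy",
            "proxyUser": "hadoop",
            "executorMemory": "5g",
            "args": ["2000"]
          }'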

Sample code to submit a Spark job using an HTTP client in Scala:



    import org.apache.http.client.methods.{CloseableHttpResponse, HttpPost}
    import org.apache.http.entity.StringEntity
    import org.apache.http.impl.client.{CloseableHttpClient, HttpClientBuilder}

    import scala.util.parsing.json.{JSONArray, JSONObject}

    def submitJob(className: String, jarPath: String, extraArgs: List[String]): JSONObject = {

      // POST to the Livy /batches endpoint; clusterConfig.livyserver holds the Livy base URL
      val jobSubmitRequest = new HttpPost(s"${clusterConfig.livyserver}/batches")

      val data = Map(
        "className" -> className,
        "file" -> jarPath,
        "driverMemory" -> "2g",
        "name" -> "LivyTest",
        "proxyUser" -> "hadoop")

      // Map is immutable, so keep the result of adding the optional "args" entry;
      // wrap the list in a JSONArray so it serializes as a JSON array
      val payload =
        if (extraArgs != null && extraArgs.nonEmpty) data + ("args" -> JSONArray(extraArgs))
        else data

      val json = new JSONObject(payload)
      println(json.toString())

      val params = new StringEntity(json.toString(), "UTF-8")
      params.setContentType("application/json")

      jobSubmitRequest.addHeader("Content-Type", "application/json")
      jobSubmitRequest.addHeader("Accept", "*/*")
      jobSubmitRequest.setEntity(params)

      val client: CloseableHttpClient = HttpClientBuilder.create().build()
      val response: CloseableHttpResponse = client.execute(jobSubmitRequest)

      // HttpReqUtil.parseHttpResponse is a project helper that returns the parsed JSON body
      HttpReqUtil.parseHttpResponse(response)._2
    }
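
A usage sketch of the submitJob helper above; the class name, jar path, and arguments are placeholders borrowed from the question, and clusterConfig.livyserver is assumed to point at your Livy server:

    // Submit SparkPi as a Livy batch and print the response (id, state, appId, ...)
    val result = submitJob(
      className = "org.apache.spark.examples.SparkPi",
      jarPath   = "hdfs://host:port/resources/spark-examples_2.11-2.1.1.jar",
      extraArgs = List("2000"))

    println(result)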


Please refer to this post for more details:
https://www.linkedin.com/pulse/submitting-spark-jobs-remote-cluster-via-livy-rest-api-ramasamy/



A sample project is available at the following link:
https://github.com/ravikramesh/spark-rest-service






answered Nov 29 '18 at 16:34 by Ravikumar

• @寒江雪: if you are okay with the answer, please accept it as the owner! – Ram Ghadiyaram, Nov 30 '18 at 2:46