This topic describes how to configure dependencies for Spark-1.x and provides related Spark-1.x examples.

Configure dependencies for Spark-1.x

To submit applications through the Spark client provided by MaxCompute, add the following dependencies to the pom.xml file.
<properties>
    <spark.version>1.6.3</spark.version>
    <cupid.sdk.version>3.3.3-public</cupid.sdk.version>
    <scala.version>2.10.4</scala.version>
    <scala.binary.version>2.10</scala.binary.version>
</properties>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>com.aliyun.odps</groupId>
    <artifactId>cupid-sdk</artifactId>
    <version>${cupid.sdk.version}</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>com.aliyun.odps</groupId>
    <artifactId>hadoop-fs-oss</artifactId>
    <version>${cupid.sdk.version}</version>
</dependency>
<dependency>
    <groupId>com.aliyun.odps</groupId>
    <artifactId>odps-spark-datasource_${scala.binary.version}</artifactId>
    <version>${cupid.sdk.version}</version>
</dependency>
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>${scala.version}</version>
</dependency>
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-actors</artifactId>
    <version>${scala.version}</version>
</dependency>
The scopes in the preceding code are defined as follows:
  • Use the provided scope for all packages released by the Apache Spark community, such as spark-core and spark-sql.
  • Use the default compile scope for odps-spark-datasource.
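Each submission section below assumes a properly configured spark-defaults.conf. A minimal sketch follows; the project name, AccessKey values, and endpoint are placeholders that you must replace with your own (check the MaxCompute Spark documentation for the endpoint of your region):

```properties
# Placeholder values -- replace with your own project and credentials.
spark.hadoop.odps.project.name = your_project_name
spark.hadoop.odps.access.id = your_access_key_id
spark.hadoop.odps.access.key = your_access_key_secret
spark.hadoop.odps.end.point = http://service.odps.aliyun.com/api
```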

WordCount example

  • Detailed code
  • How to submit
    Step 1. build aliyun-cupid-sdk
    Step 2. properly set spark-defaults.conf
    Step 3. bin/spark-submit --master yarn-cluster --class \
          com.aliyun.odps.spark.examples.WordCount \
          ${path to aliyun-cupid-sdk}/spark/spark-1.x/spark-examples/target/spark-examples_2.10-version-shaded.jar
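For reference, a minimal Spark 1.x WordCount looks roughly like the following. This is a generic sketch against the Spark 1.x RDD API, not the exact code shipped in aliyun-cupid-sdk, and the input path is a placeholder:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WordCount")
    val sc = new SparkContext(conf)

    // Split each line into words, emit (word, 1), and sum the counts per word.
    val counts = sc.textFile("path/to/input")   // placeholder input path
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    counts.collect().foreach(println)
    sc.stop()
  }
}
```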

Spark-SQL on MaxCompute Table example

  • Detailed code
  • How to submit
    # A Table Not Found exception may be thrown if the table specified in the code does not exist in your MaxCompute project.
    # Refer to the interfaces in the code to implement a Spark SQL application for your own tables.
    Step 1. build aliyun-cupid-sdk
    Step 2. properly set spark-defaults.conf
    Step 3. bin/spark-submit --master yarn-cluster --class \
          com.aliyun.odps.spark.examples.sparksql.SparkSQL \
          ${path to aliyun-cupid-sdk}/spark/spark-1.x/spark-examples/target/spark-examples_2.10-version-shaded.jar
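The shape of such an application is roughly as follows. This is a generic Spark 1.x SQL sketch: the actual example relies on odps-spark-datasource so that MaxCompute tables resolve by name (see the repository for the exact entry point), and mc_test_table is a placeholder table name:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object SparkSQLSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("SparkSQL")
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)

    // With odps-spark-datasource on the classpath, MaxCompute tables can be
    // queried by name. mc_test_table is a placeholder and must exist in your
    // project; otherwise the job fails with Table Not Found.
    val df = sqlContext.sql("SELECT * FROM mc_test_table LIMIT 10")
    df.show()

    sc.stop()
  }
}
```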

GraphX PageRank example

  • Detailed code
  • How to submit
    Step 1. build aliyun-cupid-sdk
    Step 2. properly set spark-defaults.conf
    Step 3. bin/spark-submit --master yarn-cluster --class \
          com.aliyun.odps.spark.examples.graphx.PageRank \
          ${path to aliyun-cupid-sdk}/spark/spark-1.x/spark-examples/target/spark-examples_2.10-version-shaded.jar
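A PageRank job against the standard Spark 1.x GraphX API can be sketched as follows (a self-contained toy graph, not the example's actual input):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.graphx.{Edge, Graph}

object PageRankSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("PageRank"))

    // Build a tiny four-vertex graph: 1 -> 2, 2 -> 3, 3 -> 1, 4 -> 1.
    val edges = sc.parallelize(Seq(
      Edge(1L, 2L, 1), Edge(2L, 3L, 1), Edge(3L, 1L, 1), Edge(4L, 1L, 1)))
    val graph = Graph.fromEdges(edges, defaultValue = 1)

    // Run PageRank until the ranks change by less than the tolerance.
    val ranks = graph.pageRank(0.0001).vertices
    ranks.collect().foreach { case (id, rank) => println(s"$id: $rank") }

    sc.stop()
  }
}
```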

MLlib KMeans-on-OSS example

  • Detailed code
  • How to submit
    # Fill in your OSS account information in the code, then compile and submit.
    conf.set("spark.hadoop.fs.oss.accessKeyId", "***")
    conf.set("spark.hadoop.fs.oss.accessKeySecret", "***")
    conf.set("spark.hadoop.fs.oss.endpoint", "oss-cn-hangzhou-zmf.aliyuncs.com")
    Step 1. build aliyun-cupid-sdk
    Step 2. properly set spark-defaults.conf
    Step 3. bin/spark-submit --master yarn-cluster --class \
          com.aliyun.odps.spark.examples.mllib.KmeansModelSaveToOss \
          ${path to aliyun-cupid-sdk}/spark/spark-1.x/spark-examples/target/spark-examples_2.10-version-shaded.jar
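Training a model and saving it to OSS can be sketched with the standard MLlib 1.x API as follows. The OSS credentials mirror the snippet above, and the bucket path is a placeholder:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.clustering.KMeans
import org.apache.spark.mllib.linalg.Vectors

object KmeansToOssSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("KmeansModelSaveToOss")
    // OSS credentials, as shown above; replace the placeholders.
    conf.set("spark.hadoop.fs.oss.accessKeyId", "***")
    conf.set("spark.hadoop.fs.oss.accessKeySecret", "***")
    conf.set("spark.hadoop.fs.oss.endpoint", "oss-cn-hangzhou-zmf.aliyuncs.com")
    val sc = new SparkContext(conf)

    // Train KMeans on a few toy points (k = 2, 20 iterations).
    val points = sc.parallelize(Seq(
      Vectors.dense(0.0, 0.0), Vectors.dense(0.1, 0.1),
      Vectors.dense(9.0, 9.0), Vectors.dense(9.1, 9.1)))
    val model = KMeans.train(points, 2, 20)

    // Persist the model to OSS; the bucket name is a placeholder.
    model.save(sc, "oss://your-bucket/kmeans-model")
    sc.stop()
  }
}
```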

OSS UnstructuredData example

  • Detailed code
  • How to submit
    # Fill in your OSS account information in the code, then compile and submit.
    conf.set("spark.hadoop.fs.oss.accessKeyId", "***")
    conf.set("spark.hadoop.fs.oss.accessKeySecret", "***")
    conf.set("spark.hadoop.fs.oss.endpoint", "oss-cn-hangzhou-zmf.aliyuncs.com")
    Step 1. build aliyun-cupid-sdk
    Step 2. properly set spark-defaults.conf
    Step 3. bin/spark-submit --master yarn-cluster --class \
          com.aliyun.odps.spark.examples.oss.SparkUnstructuredDataCompute \
          ${path to aliyun-cupid-sdk}/spark/spark-1.x/spark-examples/target/spark-examples_2.10-version-shaded.jar
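Reading unstructured data from OSS can be sketched as follows. With hadoop-fs-oss on the classpath, oss:// paths are readable as ordinary Hadoop paths; the bucket and path here are placeholders:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object OssTextSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("SparkUnstructuredDataCompute")
    // OSS credentials, as shown above; replace the placeholders.
    conf.set("spark.hadoop.fs.oss.accessKeyId", "***")
    conf.set("spark.hadoop.fs.oss.accessKeySecret", "***")
    conf.set("spark.hadoop.fs.oss.endpoint", "oss-cn-hangzhou-zmf.aliyuncs.com")
    val sc = new SparkContext(conf)

    // Read text files directly from OSS and run a simple aggregation.
    val lines = sc.textFile("oss://your-bucket/path/to/data")
    println(s"line count: ${lines.count()}")

    sc.stop()
  }
}
```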