This topic describes how to quickly get started with the Ganos spatio-temporal engine in Data Lake Analytics (DLA).

Prerequisites

Submit a ticket to obtain the DLA Ganos SDK, which is required for local compilation and debugging.

Procedure

  1. Prepare the data
    Upload the TIFF files to the specified OSS directory, for example:
    oss://your-bucket-name/raster
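    As an illustration of this step, the following Scala sketch uploads a local TIFF to the raster directory with the aliyun-sdk-oss client (the same SDK declared in the pom.xml used in the next step). The endpoint, bucket name, credentials, and file name are placeholders that you must replace with your own values:

    ```scala
    import com.aliyun.oss.OSSClientBuilder
    import java.io.File

    object UploadTiff extends App {
      // Placeholders: replace with your region's endpoint and your credentials
      val endpoint = "oss-cn-hangzhou.aliyuncs.com"
      val client = new OSSClientBuilder().build(endpoint, "your-access-key-id", "your-access-key-secret")
      try {
        // Upload sample.tif under the raster/ prefix of the target bucket
        client.putObject("your-bucket-name", "raster/sample.tif", new File("sample.tif"))
      } finally {
        client.shutdown() // release the connection pool
      }
    }
    ```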
  2. Load the OSS data
    1. Create a Maven project named dla-ganos-quickstart and edit its pom.xml file:
      <?xml version="1.0" encoding="UTF-8"?>
      <project xmlns="http://maven.apache.org/POM/4.0.0"
               xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
               xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
          <modelVersion>4.0.0</modelVersion>
      
          <groupId>com.aliyun.ganos.dla</groupId>
          <artifactId>dla-ganos-quickstart</artifactId>
          <version>1.0</version>
          <properties>
              <scala.version>2.11.12</scala.version>
              <scala.binary.version>2.11</scala.binary.version>
              <scala.xml.version>1.0.6</scala.xml.version>
              <scala.parsers.version>1.0.6</scala.parsers.version>
              <scalalogging.version>3.8.0</scalalogging.version>
              <spark.version>2.4.3</spark.version>
              <kryo.version>3.0.3</kryo.version>
          </properties>
          <dependencies>
              <dependency>
                  <groupId>com.aliyun.ganos</groupId>
                  <artifactId>dla-ganos-sdk</artifactId>
                  <version>1.0</version>
                  <scope>system</scope>
                  <systemPath>
                      Path to the downloaded dla-ganos-sdk-1.0.jar
                  </systemPath>
              </dependency>
              <dependency>    
                  <groupId>io.spray</groupId>    
                  <artifactId>spray-json_2.11</artifactId>    
                  <version>1.3.5</version>
              </dependency>
              <dependency>
                  <groupId>org.apache.spark</groupId>
                  <artifactId>spark-core_${scala.binary.version}</artifactId>
                  <exclusions>
                      <exclusion>
                          <groupId>com.fasterxml.jackson.core</groupId>
                          <artifactId>jackson-databind</artifactId>
                      </exclusion>
                  </exclusions>
                  <version>${spark.version}</version>
                  <scope>provided</scope>
              </dependency>
              <dependency>
                  <groupId>org.apache.spark</groupId>
                  <artifactId>spark-sql_${scala.binary.version}</artifactId>
                  <version>${spark.version}</version>
                  <scope>provided</scope>
              </dependency>
              <dependency>
                  <groupId>org.apache.spark</groupId>
                  <artifactId>spark-catalyst_${scala.binary.version}</artifactId>
                  <version>${spark.version}</version>
                  <scope>provided</scope>
              </dependency>
               <!-- GeoTools -->
              <dependency>
                  <groupId>org.geotools</groupId>
                  <artifactId>gt-geojson</artifactId>
                  <version>23.0</version>
              </dependency>
              <dependency>
                  <groupId>org.geotools</groupId>
                  <artifactId>gt-metadata</artifactId>
                  <version>23.0</version>
              </dependency>
              <dependency>
                  <groupId>org.geotools</groupId>
                  <artifactId>gt-referencing</artifactId>
                  <version>23.0</version>
              </dependency>
              <dependency>
                  <groupId>org.geotools.jdbc</groupId>
                  <artifactId>gt-jdbc-postgis</artifactId>
                  <version>23.0</version>
              </dependency>
              <dependency>
                  <groupId>org.geotools</groupId>
                  <artifactId>gt-epsg-hsql</artifactId>
                  <version>23.0</version>
              </dependency>
              <dependency>
                  <groupId>com.aliyun.oss</groupId>
                  <artifactId>aliyun-sdk-oss</artifactId>
                  <version>3.9.0</version>
              </dependency>
          </dependencies>
      
          <build>
              <plugins>
                  <plugin>
                      <groupId>org.apache.maven.plugins</groupId>
                      <artifactId>maven-compiler-plugin</artifactId>
                      <configuration>
                          <source>1.8</source>
                          <target>1.8</target>
                      </configuration>
                  </plugin>
                  <plugin>
                      <groupId>net.alchim31.maven</groupId>
                      <artifactId>scala-maven-plugin</artifactId>
                      <executions>
                          <execution>
                              <id>scala-compile-first</id>
                              <phase>process-resources</phase>
                              <goals>
                                  <goal>add-source</goal>
                                  <goal>compile</goal>
                              </goals>
                          </execution>
                          <execution>
                              <id>scala-test-compile</id>
                              <phase>process-test-resources</phase>
                              <goals>
                                  <goal>testCompile</goal>
                              </goals>
                          </execution>
                      </executions>
                      <configuration>
                          <compilerPlugins>
                              <compilerPlugin>
                                  <groupId>org.spire-math</groupId>
                                  <artifactId>kind-projector_2.11</artifactId>
                                  <version>0.9.4</version>
                              </compilerPlugin>
                          </compilerPlugins>
                      </configuration>
                  </plugin>
              </plugins>
          </build>
           <repositories>
              <repository>
                  <id>osgeo</id>
                  <name>OSGeo Release Repository</name>
                  <url>https://repo.osgeo.org/repository/release/</url>
                  <snapshots><enabled>false</enabled></snapshots>
                  <releases><enabled>true</enabled></releases>
              </repository>
              <repository>
                  <id>osgeo-snapshot</id>
                  <name>OSGeo Snapshot Repository</name>
                  <url>https://repo.osgeo.org/repository/snapshot/</url>
                  <snapshots><enabled>true</enabled></snapshots>
                  <releases><enabled>false</enabled></releases>
              </repository>
          </repositories>
      </project>
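      Note: a system-scoped dependency ties the build to a machine-specific path. As an optional alternative (a sketch, assuming you keep the same Maven coordinates), you can install the downloaded jar into your local Maven repository and then declare it as a regular dependency without scope or systemPath:

      ```shell
      # Install the downloaded SDK jar into the local Maven repository.
      # The -Dfile path is a placeholder; point it at your downloaded jar.
      mvn install:install-file \
        -Dfile=/path/to/dla-ganos-sdk-1.0.jar \
        -DgroupId=com.aliyun.ganos \
        -DartifactId=dla-ganos-sdk \
        -Dversion=1.0 \
        -Dpackaging=jar
      ```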
    2. Create a Scala file named OSSTest.scala:
      import com.aliyun.ganos.dla._
      import com.aliyun.ganos.dla.raster._
      import com.aliyun.ganos.dla.oss._
      import com.aliyun.ganos.dla.geometry._
      import com.typesafe.config.ConfigFactory
      import org.apache.log4j.{Level, Logger}
      import org.apache.spark.SparkConf
      import org.apache.spark.sql.SparkSession
      
      object OSSTest extends App {
      
        Logger.getLogger("org").setLevel(Level.ERROR)
        Logger.getLogger("com").setLevel(Level.ERROR)
        val spark: SparkSession = {
          val session = SparkSession.builder
            .withKryoSerialization
            .config(additionalConf)
            .getOrCreate()
          session
        }
      
        spark.withGanosGeometry
        spark.withGanosRaster
      
        val uri = new java.net.URI("oss://your-bucket-name/raster") // Specify the directory path
      
        val options = Map(
          "crs" -> "EPSG:4326",
          "endpoint" -> "your OSS endpoint",
          "accessKeyId" -> "your AccessKey ID",
          "accessKeySecret" -> "your AccessKey Secret")
        
        val rf = spark.read.ganos.oss(options).loadLayer(uri)
        rf.show
        def additionalConf = new SparkConf(false)
      }
    3. Compile the project:
      mvn clean package
  3. Submit the job
    Log on to the Data Lake Analytics console and submit the Spark job. For more information, see Create and run Spark jobs.
    After you save and run the job successfully, check the job status. You can see that the original TIFF files have been loaded into Spark as tiles, and you can then perform operations on these tiles.
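    When you submit through the console, a DLA serverless Spark job is described by a JSON configuration. The following is a minimal sketch in which the job name, OSS paths, and resource specifications are placeholder assumptions; because the SDK jar is system-scoped and therefore not packaged into your artifact, you may also need to reference it separately, for example via a jars field:

    ```json
    {
      "name": "dla-ganos-quickstart",
      "file": "oss://your-bucket-name/jars/dla-ganos-quickstart-1.0.jar",
      "className": "OSSTest",
      "jars": "oss://your-bucket-name/jars/dla-ganos-sdk-1.0.jar",
      "conf": {
        "spark.driver.resourceSpec": "medium",
        "spark.executor.instances": 2,
        "spark.executor.resourceSpec": "medium"
      }
    }
    ```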