GeoSparkViz Visualization
In the previous section we performed some basic spatial operations with GeoSpark SQL. In this section we continue with GeoSpark SQL, adding GeoSparkViz to render the results. Below are the six new SQL functions we will use today:
- CAST (expression AS type): a standard SQL function (also provided natively by PostgreSQL) that converts a column to the given type.
- ST_Point (X:decimal, Y:decimal, UUID1, UUID2, ...): creates a Point geometry from the given X and Y.
- ST_Pixelize (A:geometry, ResolutionX:int, ResolutionY:int, Boundary:geometry): converts the given geometry into pixels of an image.
- ST_Colorize (weight:Double, maxWeight:Double, mandatory color: string (Optional)): given a pixel's weight, returns the corresponding color. The weight can be any meaningful numeric value, such as an observed temperature or humidity.
- ST_Render (A:pixel, B:color): given pixels and their colors, produces a Java BufferedImage object representing an image.
- ST_Envelope_Aggr (A:geometryColumn): returns the minimum bounding rectangle of A.
Loading the Data
SparkSession spark = SparkSession.builder().
config("spark.serializer","org.apache.spark.serializer.KryoSerializer").
config("spark.kryo.registrator", "org.datasyslab.geospark.serde.GeoSparkKryoRegistrator").
master("local[*]").appName("Learn05").getOrCreate();
GeoSparkSQLRegistrator.registerAll(spark);
GeoSparkVizRegistrator.registerAll(spark);
// Load the CSV file; the first two columns are the X and Y coordinates
String inputCSVPath = Learn04.class.getResource("/checkin.csv").toString();
Dataset<Row> rawDF = spark.read().format("csv").
option("delimiter", ",").
option("header", "false").
load(inputCSVPath);
rawDF.createOrReplaceTempView("pointtable");
Building the Geometry
// Create the geometry column
String sqlText = "select ST_Point(cast(_c0 as Decimal(24,20)), cast(_c1 as Decimal(24,20))) AS shape, _c2 from pointtable";
Dataset spatialDf = spark.sql(sqlText);
spatialDf.createOrReplaceTempView("pointtable");
spatialDf.show();
+--------------------+----------+
| shape| _c2|
+--------------------+----------+
|POINT (-88.331492...| hotel|
|POINT (-88.175933...| gas|
|POINT (-88.388954...| bar|
|POINT (-88.221102...|restaurant|
+--------------------+----------+
Rendering
Next we render the four points above. The geographic coordinates must first be converted into pixel coordinates on the screen using ST_Pixelize. However, ST_Pixelize requires a boundary to be supplied, so we first use ST_Envelope_Aggr to generate the bound.
sqlText = "SELECT ST_Envelope_Aggr(shape) as bound FROM pointtable";
spatialDf = spark.sql(sqlText);
spatialDf.createOrReplaceTempView("boundtable");
spatialDf.show();
+--------------------+
| bound|
+--------------------+
|POLYGON ((-88.388...|
+--------------------+
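Conceptually, ST_Envelope_Aggr just tracks the minimum and maximum X and Y over all geometries and returns the resulting rectangle. A minimal plain-Java sketch of that aggregation (the X values are the four sample longitudes from the output above; the Y values are placeholders, and this is an illustration, not the GeoSparkViz implementation):

```java
public class EnvelopeSketch {
    // Compute the minimum bounding rectangle of a set of (x, y) points.
    public static double[] envelope(double[][] points) {
        double minX = Double.POSITIVE_INFINITY, minY = Double.POSITIVE_INFINITY;
        double maxX = Double.NEGATIVE_INFINITY, maxY = Double.NEGATIVE_INFINITY;
        for (double[] p : points) {
            minX = Math.min(minX, p[0]); maxX = Math.max(maxX, p[0]);
            minY = Math.min(minY, p[1]); maxY = Math.max(maxY, p[1]);
        }
        return new double[]{minX, minY, maxX, maxY};
    }

    public static void main(String[] args) {
        double[][] pts = {
            {-88.331492, 32.324142}, {-88.175933, 32.360763}, // Y values are placeholders
            {-88.388954, 32.357073}, {-88.221102, 32.351075}
        };
        double[] env = envelope(pts);
        System.out.printf("envelope: minX=%.6f maxX=%.6f%n", env[0], env[2]);
    }
}
```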
Generating Pixels
sqlText = "SELECT pixel, shape FROM pointtable " +
"LATERAL VIEW ST_Pixelize(ST_Transform(shape, 'epsg:4326','epsg:3857'), 256, 256, (SELECT ST_Transform(bound, 'epsg:4326','epsg:3857') FROM boundtable)) AS pixel";
spatialDf = spark.sql(sqlText);
spatialDf.createOrReplaceTempView("pixels");
spatialDf.show();
+--------------------+--------------------+
| pixel| shape|
+--------------------+--------------------+
|Pixel(x=69.0, y=0...|POINT (-88.331492...|
|Pixel(x=255.0, y=...|POINT (-88.175933...|
|Pixel(x=0.0, y=23...|POINT (-88.388954...|
|Pixel(x=201.0, y=...|POINT (-88.221102...|
+--------------------+--------------------+
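At its core, ST_Pixelize performs a linear mapping from the bound's coordinate range onto the pixel grid: the westernmost point lands at x=0 and the easternmost at x=255, which matches the output above. Since longitude maps linearly to Web Mercator x, the x mapping survives the ST_Transform step (y does not, so only x is sketched here). A rough sketch of that mapping, not the actual GeoSparkViz code:

```java
public class PixelizeSketch {
    // Map a coordinate within [min, max] onto a pixel index in [0, resolution - 1].
    public static int toPixel(double value, double min, double max, int resolution) {
        return (int) Math.round((value - min) / (max - min) * (resolution - 1));
    }

    public static void main(String[] args) {
        double minX = -88.388954, maxX = -88.175933; // bound from boundtable
        System.out.println(toPixel(-88.388954, minX, maxX, 256)); // westernmost -> 0
        System.out.println(toPixel(-88.175933, minX, maxX, 256)); // easternmost -> 255
        System.out.println(toPixel(-88.221102, minX, maxX, 256)); // -> 201, as in the output
        System.out.println(toPixel(-88.331492, minX, maxX, 256)); // -> 69, as in the output
    }
}
```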
Generating Colors
Here we are only displaying points, so every point can get the same fixed color; we simply pass 1 for both the weight and the max weight.
sqlText = "SELECT ST_Colorize(1, 1, 'red') as color, pixel FROM pixels";
spatialDf = spark.sql(sqlText);
spatialDf.createOrReplaceTempView("pixelaggregates");
spatialDf.show(false);
+------+----------------------------------------------------------------------------+
|color |pixel |
+------+----------------------------------------------------------------------------+
|-65536|Pixel(x=69.0, y=0.0, width=256, height=256, isDuplicate=false, tileId=-1) |
|-65536|Pixel(x=255.0, y=255.0, width=256, height=256, isDuplicate=false, tileId=-1)|
|-65536|Pixel(x=0.0, y=230.0, width=256, height=256, isDuplicate=false, tileId=-1) |
|-65536|Pixel(x=201.0, y=186.0, width=256, height=256, isDuplicate=false, tileId=-1)|
+------+----------------------------------------------------------------------------+
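The -65536 shown in the color column is simply opaque red packed as a signed 32-bit ARGB integer: 0xFFFF0000 interpreted as an int. You can verify this with java.awt.Color from the JDK:

```java
import java.awt.Color;

public class ColorCheck {
    public static void main(String[] args) {
        // getRGB() packs the color as 0xAARRGGBB in a signed int
        int red = new Color(255, 0, 0).getRGB();
        System.out.println(red);               // -65536
        System.out.println(red == 0xFFFF0000); // true: same bit pattern
    }
}
```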
Rendering the Image
With the pixels and colors in hand, we can call ST_Render to produce the image.
sqlText = "SELECT ST_Render(pixel, color) AS image, (SELECT ST_AsText(bound) FROM boundtable) AS boundary FROM pixelaggregates" ;
spatialDf = spark.sql(sqlText);
spatialDf.createOrReplaceTempView("images");
spatialDf.show();
+--------------------------+--------------------+
| image| boundary|
+--------------------------+--------------------+
|Image(width=256height=256)|POLYGON ((-88.388...|
+--------------------------+--------------------+
Saving the Image
// Pull the single result row back to the driver
Dataset<org.apache.spark.sql.Row> images = spark.table("images");
Row[] take = (Row[]) images.take(1);
// The image column holds an ImageSerializableWrapper around a BufferedImage
ImageSerializableWrapper image = (ImageSerializableWrapper) take[0].get(0);
// Write ~/point.png (the file extension is appended based on the image type)
new ImageGenerator().SaveRasterImageAsLocalFile(image.getImage(), System.getProperty("user.home") + "/point", ImageType.PNG);
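SaveRasterImageAsLocalFile essentially writes the wrapped BufferedImage to disk in the requested format, much like the JDK's javax.imageio.ImageIO would. A self-contained sketch of the same idea, where the hand-built image below is only a stand-in for image.getImage():

```java
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

public class SavePng {
    public static void main(String[] args) throws Exception {
        // Stand-in for image.getImage(): a 256x256 canvas with one red pixel
        BufferedImage img = new BufferedImage(256, 256, BufferedImage.TYPE_INT_ARGB);
        img.setRGB(69, 0, 0xFFFF0000); // the same packed red that ST_Colorize produced

        File out = new File(System.getProperty("java.io.tmpdir"), "point.png");
        ImageIO.write(img, "png", out); // write the PNG to a temporary location
        System.out.println(out.exists());
    }
}
```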

(Output image: the rendered 256x256 PNG with the four points drawn in red.)