Hive UDTF Custom Functions

Keywords: Hive UDTF development example

Hive can run user-defined functions to process data. You can run show functions to list the functions Hive currently supports; the output looks like this:

hive> show functions
    > ;
OK
!
!=
%
&
*
+
-
/

Hive supports three types of user-defined functions:

  • Regular UDF
    Operates on a single row and produces a single value as output (e.g., mathematical and string functions).
  • Aggregate UDF (UDAF)
    Accepts multiple rows and produces a single row as output (e.g., COUNT, MAX).
  • Table-generating UDF (UDTF)
    Accepts a single row and produces multiple rows (a table) as output.
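For orientation, the three kinds can be illustrated with Hive built-ins; these queries are sketches only (table1 and its name column are placeholders, not part of the function developed below):

```sql
-- UDF: one value in, one value out
select upper(name) from table1;
-- UDAF: many rows in, one value out
select count(*) from table1;
-- UDTF: one row in, many rows out
select explode(array(1, 2, 3));
```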

Implementing a custom UDTF:

Implementation:
A UDTF is implemented by extending the abstract class GenericUDTF and overriding the initialize, process, and close methods.

  • initialize implementation:
package com.jd.risk.hive.UDTF;

import com.alibaba.fastjson.JSON;
import com.alibaba.fastjson.JSONObject;

import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDTF;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorFactory;
import org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;

import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class FeatureParseUDTF extends GenericUDTF {

    private PrimitiveObjectInspector stringOI = null;

    @Override
    public StructObjectInspector initialize(ObjectInspector[] objectInspectors) throws UDFArgumentException {

        // argument validation: exactly one string argument is expected
        if (objectInspectors.length != 1) {
            throw new UDFArgumentException("FeatureParseUDTF() takes exactly one argument");
        }

        if (objectInspectors[0].getCategory() != ObjectInspector.Category.PRIMITIVE
                || ((PrimitiveObjectInspector) objectInspectors[0]).getPrimitiveCategory() != PrimitiveObjectInspector.PrimitiveCategory.STRING) {
            throw new UDFArgumentException("FeatureParseUDTF() takes a string as a parameter");
        }

        // input: keep the inspector used to read the string argument
        stringOI = (PrimitiveObjectInspector) objectInspectors[0];

        // output: two string columns
        List<String> fieldNames = new ArrayList<String>(2);
        List<ObjectInspector> fieldOIs = new ArrayList<ObjectInspector>(2);

        // output column names
        fieldNames.add("name");
        fieldNames.add("value");
        fieldOIs.add(PrimitiveObjectInspectorFactory.javaStringObjectInspector);
        fieldOIs.add(PrimitiveObjectInspectorFactory.javaStringObjectInspector);
        return ObjectInspectorFactory.getStandardStructObjectInspector(fieldNames, fieldOIs);
    }
}

Hive calls initialize to learn the argument types the UDTF expects, and the method returns an ObjectInspector describing the UDTF's output rows. Here initialize uses a PrimitiveObjectInspector to read the string input, and defines the fields (name, value) required by the output ObjectInspector.

  • process implementation:
    @Override
    public void process(Object[] record) throws HiveException {
    
        final String feature = stringOI.getPrimitiveJavaObject(record[0]).toString();
        // each input row may produce multiple output rows
        for (Object[] r : parseInputRecord(feature)) {
            forward(r);
        }
    }
    /**
     * Parse a JSON-formatted string into multiple (name, value) rows.
     * @param feature the JSON string to parse
     * @return one Object[] {name, value} per leaf feature
     */
    public ArrayList<Object[]> parseInputRecord(String feature){
        // initialize up front so the caller never receives null on a parse error
        ArrayList<Object[]> resultList = new ArrayList<Object[]>();
        try {
            JSONObject json = JSON.parseObject(feature);
            for (String nameSpace : json.keySet()) {
                JSONObject dimensionJson = json.getJSONObject(nameSpace);
                for (String dimensionName : dimensionJson.keySet()) {
                    JSONObject featureJson = dimensionJson.getJSONObject(dimensionName);
                    for (String featureName : featureJson.keySet()) {
                        String property_name = nameSpace + ":" + dimensionName + ":" + featureName;
                        Object[] item = new Object[2];
                        item[0] = property_name;
                        item[1] = featureJson.get(featureName);
                        resultList.add(item);
                    }
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
        return resultList;
    }

The process method implements the actual parsing. It reads the input field through stringOI, then uses parseInputRecord to split the JSON string into multiple records, returning a List and thereby turning one row into many. Finally, forward emits each row as the UDTF's output.
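The core of parseInputRecord is flattening a three-level nested structure into (namespace:dimension:feature, value) pairs. The same logic can be sketched with plain JDK maps, without the fastjson dependency; the class and method names here are illustrative and not part of the UDTF above:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class FlattenDemo {

    /** Flatten namespace -> dimension -> feature -> value into "ns:dim:feature" rows. */
    public static List<String[]> flatten(Map<String, Map<String, Map<String, String>>> json) {
        List<String[]> rows = new ArrayList<String[]>();
        for (Map.Entry<String, Map<String, Map<String, String>>> ns : json.entrySet()) {
            for (Map.Entry<String, Map<String, String>> dim : ns.getValue().entrySet()) {
                for (Map.Entry<String, String> feat : dim.getValue().entrySet()) {
                    // column 1: colon-joined path, column 2: leaf value
                    rows.add(new String[] {
                            ns.getKey() + ":" + dim.getKey() + ":" + feat.getKey(),
                            feat.getValue() });
                }
            }
        }
        return rows;
    }

    public static void main(String[] args) {
        Map<String, String> features = new LinkedHashMap<String, String>();
        features.put("feature1", "0");
        features.put("feature2", "1");
        Map<String, Map<String, String>> dims = new LinkedHashMap<String, Map<String, String>>();
        dims.put("ordering_date", features);
        Map<String, Map<String, Map<String, String>>> root =
                new LinkedHashMap<String, Map<String, Map<String, String>>>();
        root.put("rcm", dims);

        for (String[] row : flatten(root)) {
            System.out.println(row[0] + "\t" + row[1]);
        }
    }
}
```

Each leaf of the nested structure becomes one output row, which is exactly what forward does per element in the UDTF.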

  • close implementation:
    @Override
    public void close() throws HiveException {
        // nothing to clean up for this UDTF
    }
  • Maven dependencies:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.jd.udf</groupId>
    <artifactId>featureParse</artifactId>
    <version>1.0-SNAPSHOT</version>


    <dependencies>

        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-exec</artifactId>
            <version>0.12.0</version>
            <scope>provided</scope>
        </dependency>
        <!-- JSON -->
        <dependency>
            <groupId>com.alibaba</groupId>
            <artifactId>fastjson</artifactId>
            <version>1.1.31</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <configuration>
                    <!-- a UDTF jar needs no main class, so no manifest entry is configured -->
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                </configuration>
            </plugin>
        </plugins>
    </build>

</project>

Packaging command:

mvn assembly:assembly

How to use the UDTF:
Without the UDTF:

hive> select features from table1 where dt = '2017-07-18'
OK
{"rcm": {"ordering_date": {"feature1": "0","feature2": "1","feature3": "2"}}}
Time taken: 505.014 seconds, Fetched: 1 row(s)
hive>   

With the UDTF:

hive> select featureParseUDTF(features)from table1 where dt = '2017-07-18'
OK
rcm:ordering_date:feature3 2
rcm:ordering_date:feature2 1
rcm:ordering_date:feature1 0
Time taken: 505.014 seconds, Fetched: 3 row(s)
hive>   

Registering the featureParseUDTF function:

hive> add jar  /home/udtf/featureParse-1.0-SNAPSHOT-jar-with-dependencies.jar
    > ;
Added [/home/udtf/featureParse-1.0-SNAPSHOT-jar-with-dependencies.jar] to class path
Added resources: [/home/udtf/featureParse-1.0-SNAPSHOT-jar-with-dependencies.jar]
hive> create temporary function featureParseUDTF as 'com.jd.risk.hive.UDTF.FeatureParseUDTF';
OK
Time taken: 0.024 seconds
hive> select featureParseUDTF(features)from table1 where dt = '2017-07-18'
OK
rcm:ordering_date:feature3 2
rcm:ordering_date:feature2 1
rcm:ordering_date:feature1 0
Time taken: 505.014 seconds, Fetched: 3 row(s)
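Note that Hive does not allow a UDTF call to be mixed with other expressions in a plain select list; to combine the generated rows with ordinary columns, use LATERAL VIEW. A sketch (the aliases f, name, and value are illustrative):

```sql
select t.dt, f.name, f.value
from table1 t
lateral view featureParseUDTF(t.features) f as name, value
where t.dt = '2017-07-18';
```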


最后編輯于
?著作權(quán)歸作者所有,轉(zhuǎn)載或內(nèi)容合作請聯(lián)系作者
【社區(qū)內(nèi)容提示】社區(qū)部分內(nèi)容疑似由AI輔助生成,瀏覽時(shí)請結(jié)合常識(shí)與多方信息審慎甄別。
平臺(tái)聲明:文章內(nèi)容(如有圖片或視頻亦包括在內(nèi))由作者上傳并發(fā)布,文章內(nèi)容僅代表作者本人觀點(diǎn),簡書系信息發(fā)布平臺(tái),僅提供信息存儲(chǔ)服務(wù)。

相關(guān)閱讀更多精彩內(nèi)容

友情鏈接更多精彩內(nèi)容