Hadoop RPC (Part 2): Implementing Your Own Protocol on Hadoop RPC

In the previous post, Hadoop RPC (Part 1), we took a first look at Protocol Buffers. Now, building on the classes generated there, we implement our own protocol on top of Hadoop RPC. This also makes it convenient to step through the Hadoop RPC code in an IDE.

Step 1: Define an interface that extends rpc.proto.MyResourceTracker.MyResourceTrackerService.BlockingInterface

rpc.proto.MyResourceTracker.MyResourceTrackerService.BlockingInterface is the class protoc generates when compiling MyResourceTracker.proto.
The purpose of this step is to attach the @ProtocolInfo annotation, which declares the protocol's name and version. The version is mandatory; without it you get an error at runtime.

import org.apache.hadoop.ipc.ProtocolInfo;

@ProtocolInfo(protocolName = "rpc.proto.MyResourceTracker", protocolVersion = 1)
public interface MyResourceTracker extends rpc.proto.MyResourceTracker.MyResourceTrackerService.BlockingInterface {
}
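For reference, the generated BlockingInterface comes from a service definition roughly like the sketch below. The field names are inferred from the getters used later in this post, and the exact file and package layout from Part 1 is an assumption, not the verbatim original (proto2 syntax, matching protobuf 2.5.0 in the POM):

```protobuf
// Sketch of the service and messages behind the generated classes.
// Assumed layout; your MyResourceTracker.proto from Part 1 may differ.
option java_package = "rpc.proto";
option java_generic_services = true;

message MyResourceTrackerRequestProto {
  required int32 cpu = 1;
  required int32 memory = 2;
  required string hostId = 3;
}

message MyResourceTrackerResponseProto {
  required bool flag = 1;
}

service MyResourceTrackerService {
  rpc registerNodeManager(MyResourceTrackerRequestProto)
      returns (MyResourceTrackerResponseProto);
}
```

`java_generic_services = true` is what makes protoc emit the BlockingInterface and the newReflectiveBlockingService factory used below.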

Step 2: Write the implementation class for the protocol

import com.google.protobuf.RpcController;
import com.google.protobuf.ServiceException;

public class MyResourceTrackerService implements MyResourceTracker {
    @Override
    public MyResourceTrackerProtos.MyResourceTrackerResponseProto registerNodeManager(
            RpcController controller, MyResourceTrackerProtos.MyResourceTrackerRequestProto req) throws ServiceException {
        int cpu = req.getCpu();
        int memory = req.getMemory();
        String hostId = req.getHostId();
        System.out.println(String.format("cpu: %d, memory: %d, hostId: %s", cpu, memory, hostId));
        // Report a successful registration back to the client.
        return MyResourceTrackerProtos.MyResourceTrackerResponseProto.newBuilder().setFlag(true).build();
    }
}

第三步:生成server并啟動

注意的點在代碼中進行注釋

import com.google.protobuf.BlockingService;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.ipc.ProtobufRpcEngine;
import org.apache.hadoop.ipc.RPC;

import java.io.IOException;

public class Server {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // Without this setting, the deprecated WritableRpcEngine is used by default.
        // The second argument must be the same class passed to setProtocol() below.
        RPC.setProtocolEngine(conf, MyResourceTracker.class, ProtobufRpcEngine.class);
        // This is required: it builds the server-side stub. blockingService is
        // really just a decorator; the actual work is done by the
        // implementation class MyResourceTrackerService.
        BlockingService blockingService = rpc.proto.MyResourceTracker.MyResourceTrackerService
                .newReflectiveBlockingService(new MyResourceTrackerService());
        RPC.Server server = new RPC.Builder(conf)
                .setProtocol(MyResourceTracker.class)
                .setInstance(blockingService)
                .setBindAddress("127.0.0.1")
                .setPort(2222)
                .setNumHandlers(10)
                .build();
        server.start();
    }
}
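The comment above calls blockingService a decorator: it only holds a reference to the real implementation and forwards each incoming call to it. The plain-Java sketch below illustrates that forwarding idea with a reflective proxy. It is an analogue for intuition only, using made-up names (`Tracker`, `decorate`), not Hadoop's or protobuf's actual classes:

```java
import java.lang.reflect.Proxy;

public class StubSketch {
    // Stand-in for the protocol interface.
    interface Tracker {
        boolean register(int cpu, int memory, String hostId);
    }

    // Stand-in for the implementation class that does the actual work.
    static class TrackerImpl implements Tracker {
        public boolean register(int cpu, int memory, String hostId) {
            System.out.printf("cpu: %d, memory: %d, hostId: %s%n", cpu, memory, hostId);
            return true;
        }
    }

    // A reflective decorator, analogous in spirit to newReflectiveBlockingService:
    // it wraps the implementation and forwards every method call to it.
    static Tracker decorate(Tracker impl) {
        return (Tracker) Proxy.newProxyInstance(
                Tracker.class.getClassLoader(),
                new Class<?>[]{Tracker.class},
                (proxy, method, args) -> method.invoke(impl, args));
    }

    public static void main(String[] args) {
        Tracker stub = decorate(new TrackerImpl());
        System.out.println(stub.register(1, 2, "localhost:5006"));
    }
}
```

In the real server, the RPC layer plays the role of the caller: it decodes each request and dispatches it through the blocking service to your MyResourceTrackerService instance.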

Step 4: Build the client

The points to watch are noted in the code comments.

import com.google.protobuf.ServiceException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.ipc.ProtobufRpcEngine;
import org.apache.hadoop.ipc.RPC;

import java.io.IOException;
import java.net.InetSocketAddress;

public class Client {
    public static void main(String[] args) throws IOException, ServiceException {
        Configuration conf = new Configuration();
        // Required; otherwise WritableRpcEngine is used.
        RPC.setProtocolEngine(conf, MyResourceTracker.class, ProtobufRpcEngine.class);
        // Match the server-side settings; the version must equal the
        // protocolVersion declared in @ProtocolInfo.
        MyResourceTracker proxy = RPC.getProxy(MyResourceTracker.class, 1,
                new InetSocketAddress("127.0.0.1", 2222), conf);
        MyResourceTrackerProtos.MyResourceTrackerRequestProto req =
                MyResourceTrackerProtos.MyResourceTrackerRequestProto.newBuilder()
                        .setCpu(1).setMemory(2).setHostId("localhost:5006").build();
        MyResourceTrackerProtos.MyResourceTrackerResponseProto response = proxy.registerNodeManager(null, req);
        System.out.println(response.getFlag());
    }
}

Running

Start the server first, then run the client. (The original screenshots of the console output are omitted here.) The server prints the registered resources, i.e. `cpu: 1, memory: 2, hostId: localhost:5006`, and the client prints `true`.

POM file

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>org.example</groupId>
    <artifactId>hadoop-learn</artifactId>
    <packaging>pom</packaging>
    <version>1.0-SNAPSHOT</version>

    <modules>
        <module>learn-hadoop-yarn</module>
        <module>java-analysis</module>
        <module>learn-antlr4</module>
        <module>learn-zookeeper</module>
    </modules>

    <properties>
        <maven.compiler.source>11</maven.compiler.source>
        <maven.compiler.target>11</maven.compiler.target>
        <hadoop.version>3.2.2</hadoop.version>
        <antlr4.version>4.8</antlr4.version>
        <junit.version>4.12</junit.version>
        <protobuf.version>2.5.0</protobuf.version>
        <slf4j.version>1.7.25</slf4j.version>
        <guava.version>27.0-jre</guava.version>
        <commons-collections.version>3.2.2</commons-collections.version>
        <commons-lang3.version>3.7</commons-lang3.version>
        <htrace3.version>3.1.0-incubating</htrace3.version>
        <htrace4.version>4.1.0-incubating</htrace4.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <scope>compile</scope>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-auth</artifactId>
            <scope>compile</scope>
            <version>${hadoop.version}</version>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>${junit.version}</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>com.google.protobuf</groupId>
            <artifactId>protobuf-java</artifactId>
            <version>${protobuf.version}</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.woodstox</groupId>
            <artifactId>woodstox-core</artifactId>
            <version>5.0.3</version>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
            <version>${slf4j.version}</version>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
            <version>${slf4j.version}</version>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>jul-to-slf4j</artifactId>
            <version>${slf4j.version}</version>
        </dependency>
        <dependency>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
            <version>${guava.version}</version>
        </dependency>
        <dependency>
            <groupId>commons-collections</groupId>
            <artifactId>commons-collections</artifactId>
            <version>${commons-collections.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-configuration2</artifactId>
            <version>2.1.1</version>
            <exclusions>
                <exclusion>
                    <groupId>org.apache.commons</groupId>
                    <artifactId>commons-lang3</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-lang3</artifactId>
            <version>${commons-lang3.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.htrace</groupId>
            <artifactId>htrace-core</artifactId>
            <version>${htrace3.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.htrace</groupId>
            <artifactId>htrace-core4</artifactId>
            <version>${htrace4.version}</version>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.8.1</version>
                <configuration>
                    <encoding>UTF-8</encoding>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>