Overview
The socket-based implementation in the previous article is very simple, but in real production systems Netty is generally used instead.
For Netty's advantages, see:
Why Netty?
houbb.github.io/2019/05/10/…
Code implementation
Introduce the Maven dependency:
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-all</artifactId>
<version>${netty.version}</version>
</dependency>
This imports the Maven package for Netty; the version used here is 4.1.17.Final.
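The `${netty.version}` placeholder is assumed to be defined in the pom's `<properties>` block; a minimal sketch of that property, using the version mentioned above:

```xml
<properties>
    <!-- Resolves the ${netty.version} placeholder used in the dependency -->
    <netty.version>4.1.17.Final</netty.version>
</properties>
```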
Server-side code implementation
Netty’s server startup code is relatively fixed.
package com.github.houbb.rpc.server.core;

import com.github.houbb.log.integration.core.Log;
import com.github.houbb.log.integration.core.LogFactory;
import com.github.houbb.rpc.server.constant.RpcServerConst;
import com.github.houbb.rpc.server.handler.RpcServerHandler;
import io.netty.bootstrap.ServerBootstrap;
import io.netty.channel.*;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.nio.NioServerSocketChannel;

/**
 * RPC server
 * @author binbin.hou
 * @since 0.0.1
 */
public class RpcServer extends Thread {

    private static final Log log = LogFactory.getLog(RpcServer.class);

    /**
     * Port number
     */
    private final int port;

    public RpcServer() {
        this.port = RpcServerConst.DEFAULT_PORT;
    }

    public RpcServer(int port) {
        this.port = port;
    }

    @Override
    public void run() {
        // Start the server
        log.info("RPC service starts server");
        EventLoopGroup bossGroup = new NioEventLoopGroup();
        EventLoopGroup workerGroup = new NioEventLoopGroup();
        try {
            ServerBootstrap serverBootstrap = new ServerBootstrap();
            serverBootstrap.group(bossGroup, workerGroup)
                    .channel(NioServerSocketChannel.class)
                    .childHandler(new ChannelInitializer<Channel>() {
                        @Override
                        protected void initChannel(Channel ch) throws Exception {
                            ch.pipeline().addLast(new RpcServerHandler());
                        }
                    })
                    // This parameter affects connections that have not yet been accepted
                    .option(ChannelOption.SO_BACKLOG, 128)
                    // After a period with no traffic, TCP sends a keep-alive probe to check whether the client is still alive
                    .childOption(ChannelOption.SO_KEEPALIVE, true);
            // Bind the port and start accepting incoming connections
            ChannelFuture channelFuture = serverBootstrap.bind(port).syncUninterruptibly();
            log.info("RPC server startup completed, listening on port [" + port + "]");
            channelFuture.channel().closeFuture().syncUninterruptibly();
            log.info("RPC server shutdown complete");
        } catch (Exception e) {
            log.error("RPC service exception", e);
        } finally {
            workerGroup.shutdownGracefully();
            bossGroup.shutdownGracefully();
        }
    }
}
For simplicity, the server port number is fixed. The RpcServerConst constant class is as follows:
public final class RpcServerConst {

    private RpcServerConst(){}

    /**
     * Default port
     * @since 0.0.1
     */
    public static final int DEFAULT_PORT = 9527;
}
RpcServerHandler
Of course, the more central class is RpcServerHandler:
public class RpcServerHandler extends SimpleChannelInboundHandler<Object> {

    @Override
    protected void channelRead0(ChannelHandlerContext ctx, Object msg) throws Exception {
        // do nothing now
    }
}
Currently the implementation is empty; the corresponding log output and business logic can be added later.
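As a sketch of what a later iteration might look like (this is not the article's code; `LoggingRpcServerHandler` is a made-up name), a handler that logs each inbound message can be exercised with Netty's `EmbeddedChannel`, which drives the pipeline without any real network I/O:

```java
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.SimpleChannelInboundHandler;
import io.netty.channel.embedded.EmbeddedChannel;

public class LoggingRpcServerHandler extends SimpleChannelInboundHandler<Object> {

    @Override
    protected void channelRead0(ChannelHandlerContext ctx, Object msg) {
        // Hypothetical: print each inbound message instead of silently dropping it
        System.out.println("server received: " + msg);
    }

    public static void main(String[] args) {
        // EmbeddedChannel pushes the message through the pipeline in-process
        EmbeddedChannel channel = new EmbeddedChannel(new LoggingRpcServerHandler());
        channel.writeInbound("ping"); // prints "server received: ping"
        channel.finish();
    }
}
```

Because `SimpleChannelInboundHandler` consumes the message, nothing is forwarded further down the pipeline.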
Test
The code to start the test is very simple:
/**
 * Service startup test code
 * @param args parameters
 */
public static void main(String[] args) {
    new RpcServer().start();
}
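To sanity-check that the server actually bound its port, a plain JDK socket probe works; this is a sketch (`PortCheck` and `isListening` are hypothetical names), and the demo below probes a throwaway JDK `ServerSocket` instead of the RPC server so it is self-contained:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.ServerSocket;
import java.net.Socket;

public class PortCheck {

    /** Returns true if something is accepting connections on host:port. */
    static boolean isListening(String host, int port) {
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress(host, port), 500);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) throws IOException {
        // Demo against a throwaway listening socket on an ephemeral port
        try (ServerSocket server = new ServerSocket(0)) {
            System.out.println(isListening("localhost", server.getLocalPort()));
        }
    }
}
```

The same probe pointed at the RPC server's port would confirm the bind succeeded.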
Overview
Above we implemented the server side; this section looks at the client-side code.
Code implementation
RpcClient
/*
 * Copyright (c) 2019. houbinbin Inc.
 * rpc All rights reserved.
 */
package com.github.houbb.rpc.client.core;

import com.github.houbb.log.integration.core.Log;
import com.github.houbb.log.integration.core.LogFactory;
import com.github.houbb.rpc.client.handler.RpcClientHandler;
import io.netty.bootstrap.Bootstrap;
import io.netty.channel.Channel;
import io.netty.channel.ChannelFuture;
import io.netty.channel.ChannelInitializer;
import io.netty.channel.ChannelOption;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.nio.NioSocketChannel;
import io.netty.handler.logging.LogLevel;
import io.netty.handler.logging.LoggingHandler;

/**
 * <p> RPC client </p>
 *
 * <pre> Created: 2019/10/16 11:21 PM </pre>
 * <pre> Project: rpc </pre>
 *
 * @author houbinbin
 * @since 0.0.2
 */
public class RpcClient extends Thread {

    private static final Log log = LogFactory.getLog(RpcClient.class);

    /**
     * Port of the server to connect to
     */
    private final int port;

    public RpcClient(int port) {
        this.port = port;
    }

    public RpcClient() {
        this(9527);
    }

    @Override
    public void run() {
        // Start the client
        log.info("RPC service starts client");
        EventLoopGroup workerGroup = new NioEventLoopGroup();
        try {
            Bootstrap bootstrap = new Bootstrap();
            ChannelFuture channelFuture = bootstrap.group(workerGroup)
                    .channel(NioSocketChannel.class)
                    .option(ChannelOption.SO_KEEPALIVE, true)
                    .handler(new ChannelInitializer<Channel>() {
                        @Override
                        protected void initChannel(Channel ch) throws Exception {
                            ch.pipeline()
                                    .addLast(new LoggingHandler(LogLevel.INFO))
                                    .addLast(new RpcClientHandler());
                        }
                    })
                    .connect("localhost", port)
                    .syncUninterruptibly();
            log.info("RPC client startup completed, connected to port: " + port);
            channelFuture.channel().closeFuture().syncUninterruptibly();
            log.info("RPC client has shut down");
        } catch (Exception e) {
            log.error("RPC client encountered an exception", e);
        } finally {
            workerGroup.shutdownGracefully();
        }
    }
}
`.connect("localhost", port)` specifies the server address the client connects to; the port must match the one the server is listening on.
RpcClientHandler
The client-side handler is also simple and is left blank for now.
/*
 * Copyright (c) 2019. houbinbin Inc.
 * rpc All rights reserved.
 */
package com.github.houbb.rpc.client.handler;

import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.SimpleChannelInboundHandler;

/**
 * <p> Client processing class </p>
 *
 * <pre> Created: 2019/10/16 11:30 PM </pre>
 * <pre> Project: rpc </pre>
 *
 * @author houbinbin
 * @since 0.0.2
 */
public class RpcClientHandler extends SimpleChannelInboundHandler<Object> {

    @Override
    protected void channelRead0(ChannelHandlerContext ctx, Object msg) throws Exception {
        // do nothing.
    }
}
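A later iteration will need the client to actually send something. As a sketch (not the article's code; `GreetingClientHandler` and the message are made up for illustration), a handler could write a message as soon as the connection becomes active, which is easy to verify with `EmbeddedChannel` since it fires `channelActive` on construction:

```java
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.SimpleChannelInboundHandler;
import io.netty.channel.embedded.EmbeddedChannel;

public class GreetingClientHandler extends SimpleChannelInboundHandler<Object> {

    @Override
    public void channelActive(ChannelHandlerContext ctx) {
        // Hypothetical: send a greeting as soon as the connection is established
        ctx.writeAndFlush("hello-rpc");
    }

    @Override
    protected void channelRead0(ChannelHandlerContext ctx, Object msg) {
        // Server responses would be handled here in a later iteration
    }

    public static void main(String[] args) {
        // channelActive fires immediately, so the greeting is already
        // sitting in the outbound buffer
        EmbeddedChannel channel = new EmbeddedChannel(new GreetingClientHandler());
        System.out.println((String) channel.readOutbound()); // hello-rpc
        channel.finish();
    }
}
```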
Start the test
Server
Start the server first.
Client
Then start the client to connect to the server, as follows:
/**
 * Client startup test code
 * @param args parameters
 */
public static void main(String[] args) {
    new RpcClient().start();
}
Summary
To make learning easier, the source code above has been open-sourced:
github.com/houbb/rpc
I am Lao Ma, looking forward to meeting you next time.