Introduction to Netty
Netty is a high-performance, highly scalable, asynchronous, event-driven network application framework that greatly simplifies TCP and UDP client and server development. It is an NIO framework that wraps Java NIO well. As an asynchronous NIO framework, all of Netty's I/O operations are asynchronous and non-blocking; through the Future-Listener mechanism, users can obtain the results of I/O operations either actively or via notification.
The characteristics of Netty
- Unified API for different protocols
- Based on a flexible, extensible event-driven model
- Highly customizable threading model
- Better throughput, lower latency
- Lower resource consumption and minimal unnecessary memory copying
- Full SSL/TLS and STARTTLS support
- Works well in applets and Android’s restricted environments
- No more OutOfMemoryError caused by fast, slow, or overloaded connections
- No more unfair read/write ratios that NIO applications often suffer from on high-speed networks
Netty core content
The core content of Netty mainly includes the following four aspects:
- Reactor Thread Model: A high performance multithreaded programming approach
- Channel: Netty's enhanced version of the Java NIO Channel concept
- ChannelPipeline responsibility chain design pattern: event handling mechanism
- Memory management: Enhanced ByteBuf buffers
Netty overall structure diagram
Netty core components
EventLoop: An EventLoop maintains a thread and a task queue and supports asynchronous submission of tasks. EventLoop itself implements the Executor interface: when execute() is called to submit a task, it checks whether the backing thread has been started, and if not, it asks the built-in Executor to create a new thread, which in turn triggers the run() method. For the general flow, see SingleThreadEventExecutor in the Netty source.
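A minimal sketch of that lazy-start flow, assuming a plain BlockingQueue and an injected Executor (the real SingleThreadEventExecutor is considerably more involved, with a state machine, wake-ups, I/O polling, and rejection handling):

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.Executor;
import java.util.concurrent.LinkedBlockingQueue;

// Simplified sketch of the SingleThreadEventExecutor idea, not the actual Netty source.
class SingleThreadEventExecutorSketch implements Executor {
    private final BlockingQueue<Runnable> taskQueue = new LinkedBlockingQueue<>();
    private final Executor threadFactoryExecutor;      // creates the single backing thread
    private volatile boolean started;

    SingleThreadEventExecutorSketch(Executor threadFactoryExecutor) {
        this.threadFactoryExecutor = threadFactoryExecutor;
    }

    @Override
    public void execute(Runnable task) {
        taskQueue.offer(task);                          // enqueue the submitted task
        if (!started) {                                 // thread not started yet?
            started = true;
            threadFactoryExecutor.execute(this::run);   // let the built-in Executor start it
        }
    }

    // The event loop: in real Netty this also multiplexes I/O events on a Selector.
    private void run() {
        try {
            for (;;) {
                taskQueue.take().run();                 // block until a task arrives, then run it
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}
```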
EventLoopGroup: An EventLoopGroup mainly manages the life cycle of its EventLoops. It can be regarded as a thread pool that maintains a group of EventLoops. Each EventLoop serves multiple Channels, while a Channel is bound to exactly one EventLoop.
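A minimal sketch of that relationship using Netty's public API (the group size and printed message are arbitrary):

```java
import io.netty.channel.EventLoop;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.nio.NioEventLoopGroup;

public class EventLoopGroupDemo {
    public static void main(String[] args) {
        // A group backed by 4 event loops (threads); the no-arg constructor defaults to 2 * CPU cores
        EventLoopGroup group = new NioEventLoopGroup(4);

        // next() hands out EventLoops round-robin; several Channels may end up sharing one loop
        EventLoop loop = group.next();

        // An EventLoop is itself an Executor, so tasks can be submitted to it directly
        loop.execute(() -> System.out.println("running on " + Thread.currentThread().getName()));

        group.shutdownGracefully();
    }
}
```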
Bootstrap: Bootstrap is used by the client to connect to a remote host and holds a single EventLoopGroup. When bind() or connect() is called on a Bootstrap, it creates a single Channel with no parent Channel to carry out all network exchanges.
ServerBootstrap: ServerBootstrap is the server-side bootstrap class. It is mainly used by the server to bind to a local port and holds two EventLoopGroups. When bind() is called on a ServerBootstrap, a ServerChannel is created to accept connections from clients, and that ServerChannel manages multiple child Channels for communication with the clients.
Channel: A Channel in Netty is an abstract concept that can be understood as an enhancement and extension of the Java NIO Channel, adding many new properties and methods, such as bind(), etc.
ChannelFuture: A ChannelFuture can have one or more ChannelFutureListener instances registered on it, which are notified when an operation completes, whether it succeeds or fails. A ChannelFuture holds the result of an operation whose completion time cannot be predicted in advance; operations submitted to a Channel are executed in the order in which they were submitted.
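For example, instead of blocking with sync(), a ChannelFutureListener can be registered. This fragment assumes a configured Bootstrap instance named bootstrap, as in the client example later in this section:

```java
// Assumes io.netty.channel.ChannelFuture / ChannelFutureListener imports and a configured Bootstrap `bootstrap`
ChannelFuture future = bootstrap.connect("127.0.0.1", 8080);
future.addListener((ChannelFutureListener) f -> {
    if (f.isSuccess()) {
        System.out.println("Connection established");
    } else {
        System.err.println("Connection failed");
        f.cause().printStackTrace();
    }
});
```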
ChannelHandler: ChannelHandler is used to handle business logic, with both inbound and outbound implementations.
ChannelPipeline: The ChannelPipeline provides a container for the ChannelHandler chain and defines an API for propagating a stream of inbound and outbound events on that chain.
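A minimal sketch of a ChannelInitializer that assembles such a chain; the String codecs and handler names are illustrative and not part of the echo examples below:

```java
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInitializer;
import io.netty.channel.ChannelPipeline;
import io.netty.channel.SimpleChannelInboundHandler;
import io.netty.channel.socket.SocketChannel;
import io.netty.handler.codec.string.StringDecoder;
import io.netty.handler.codec.string.StringEncoder;

class EchoPipelineInitializer extends ChannelInitializer<SocketChannel> {
    @Override
    protected void initChannel(SocketChannel ch) {
        ChannelPipeline pipeline = ch.pipeline();
        // Inbound events (channelRead, ...) travel head -> tail; only inbound handlers see them
        pipeline.addLast("decoder", new StringDecoder());   // ByteBuf -> String
        // Outbound events (write, ...) travel tail -> head; only outbound handlers see them
        pipeline.addLast("encoder", new StringEncoder());   // String -> ByteBuf
        // Business handler added last: it receives decoded Strings, and its writes pass the encoder on the way out
        pipeline.addLast("handler", new SimpleChannelInboundHandler<String>() {
            @Override
            protected void channelRead0(ChannelHandlerContext ctx, String msg) {
                ctx.writeAndFlush("echo: " + msg);
            }
        });
    }
}
```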
Netty threading model
Netty’s threading model is based on the Reactor pattern. For background on the Reactor model, see [Reactor Model](2.1.3 Reactor model.MD). Depending on how it is configured, Netty supports the single-threaded Reactor model, the multi-threaded Reactor model, and the master-slave (multi-Reactor) model. Which model you get is determined by how the EventLoopGroups are configured, as sketched below.
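A minimal sketch of how the three models are typically selected through EventLoopGroup configuration (the group sizes are illustrative):

```java
import io.netty.bootstrap.ServerBootstrap;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.nio.NioServerSocketChannel;

public class ReactorModels {
    public static void main(String[] args) {
        // Single-threaded Reactor: one thread both accepts connections and handles all I/O
        EventLoopGroup single = new NioEventLoopGroup(1);
        ServerBootstrap singleReactor = new ServerBootstrap()
                .group(single)
                .channel(NioServerSocketChannel.class);

        // Multi-threaded Reactor: one group with several threads shared by accepting and I/O
        EventLoopGroup pool = new NioEventLoopGroup();
        ServerBootstrap multiReactor = new ServerBootstrap()
                .group(pool)
                .channel(NioServerSocketChannel.class);

        // Master-slave Reactor: a boss group accepts connections, a worker group handles I/O
        EventLoopGroup boss = new NioEventLoopGroup(1);
        EventLoopGroup worker = new NioEventLoopGroup();
        ServerBootstrap masterSlave = new ServerBootstrap()
                .group(boss, worker)
                .channel(NioServerSocketChannel.class);
    }
}
```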
Netty startup code example
Example server-side code:
```java
import io.netty.bootstrap.ServerBootstrap;
import io.netty.buffer.ByteBuf;
import io.netty.buffer.Unpooled;
import io.netty.channel.*;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.SocketChannel;
import io.netty.channel.socket.nio.NioServerSocketChannel;
import io.netty.handler.logging.LogLevel;
import io.netty.handler.logging.LoggingHandler;

import java.nio.charset.Charset;

public class EchoServer {
    public static void main(String[] args) {
        EventLoopGroup bossGroup = new NioEventLoopGroup(1);    // accept (boss) thread group
        EventLoopGroup workerGroup = new NioEventLoopGroup(1);  // I/O (worker) thread group
        try {
            ServerBootstrap b = new ServerBootstrap();
            b.group(bossGroup, workerGroup)                     // bind the two thread groups
             .channel(NioServerSocketChannel.class)             // specify the channel type
             .option(ChannelOption.SO_BACKLOG, 128)             // queue length for pending connections
             .handler(new LoggingHandler(LogLevel.INFO))        // set the log level
             .childHandler(new ChannelInitializer<SocketChannel>() {
                 @Override
                 protected void initChannel(SocketChannel socketChannel) throws Exception {
                     ChannelPipeline pipeline = socketChannel.pipeline(); // get the handler chain
                     pipeline.addLast(new EchoServerHandler());           // add a new handler
                 }
             });
            ChannelFuture f = b.bind(8080).sync();              // bind the port
            f.channel().closeFuture().sync();                   // block the main thread until the server channel is closed
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            workerGroup.shutdownGracefully();
            bossGroup.shutdownGracefully();
        }
    }
}

class EchoServerHandler extends ChannelInboundHandlerAdapter {
    // Called every time new data is received from the client
    @Override
    public void channelRead(ChannelHandlerContext ctx, Object msg) throws Exception {
        System.out.println("Received data:" + ((ByteBuf) msg).toString(Charset.defaultCharset()));
        ctx.write(Unpooled.wrappedBuffer("Server message".getBytes()));
        ctx.fireChannelRead(msg);
    }

    @Override
    public void channelReadComplete(ChannelHandlerContext ctx) throws Exception {
        ctx.flush();
    }

    @Override
    public void exceptionCaught(ChannelHandlerContext ctx, Throwable cause) throws Exception {
        cause.printStackTrace();
        ctx.close();
    }
}
```
Example client code:
```java
import io.netty.bootstrap.Bootstrap;
import io.netty.buffer.ByteBuf;
import io.netty.buffer.Unpooled;
import io.netty.channel.*;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.SocketChannel;
import io.netty.channel.socket.nio.NioSocketChannel;

import java.nio.charset.Charset;

public class EchoClient {
    public static void main(String[] args) {
        EventLoopGroup group = new NioEventLoopGroup();
        try {
            Bootstrap b = new Bootstrap();
            b.group(group)
             .channel(NioSocketChannel.class)
             .option(ChannelOption.TCP_NODELAY, true)
             .handler(new ChannelInitializer<SocketChannel>() {
                 @Override
                 public void initChannel(SocketChannel ch) throws Exception {
                     ChannelPipeline p = ch.pipeline();
                     p.addLast(new EchoClientHandler());
                 }
             });
            ChannelFuture f = b.connect("127.0.0.1", 8080).sync();  // connect to the server
            f.channel().closeFuture().sync();                        // block until the channel is closed
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            group.shutdownGracefully();
        }
    }
}

class EchoClientHandler extends ChannelInboundHandlerAdapter {
    private final ByteBuf firstMessage;

    public EchoClientHandler() {
        // Prepare an initial message of 256 bytes
        firstMessage = Unpooled.buffer(256);
        for (int i = 0; i < firstMessage.capacity(); i++) {
            firstMessage.writeByte((byte) i);
        }
    }

    @Override
    public void channelActive(ChannelHandlerContext ctx) {
        // Send the initial message as soon as the connection becomes active
        ctx.writeAndFlush(firstMessage);
    }

    @Override
    public void channelRead(ChannelHandlerContext ctx, Object msg) {
        System.out.println("Received data:" + ((ByteBuf) msg).toString(Charset.defaultCharset()));
        ctx.write(Unpooled.wrappedBuffer("Client message".getBytes()));
    }

    @Override
    public void channelReadComplete(ChannelHandlerContext ctx) {
        ctx.flush();
    }

    @Override
    public void exceptionCaught(ChannelHandlerContext ctx, Throwable cause) {
        cause.printStackTrace();
        ctx.close();
    }
}
```