On the Netty ByteBuf Memory Leak Problem

Author: 贾亦真亦贾 | Published 2017-03-17 09:29 · Read 3,609 times

The Donghua vehicle-administration data collection platform I built earlier kept losing data. Not frequently, but the cause still deserved a look, so today I raised Netty's log level to find where the problem was. The code that raises the level:

ServerBootstrap b = new ServerBootstrap();
b.group(bossGroup, workerGroup)
 .channel(NioServerSocketChannel.class)
 .option(ChannelOption.SO_BACKLOG, 2048)
 .handler(new LoggingHandler(LogLevel.DEBUG))
 .childHandler(new ChildChannelHandler());

Setting the log level to DEBUG is all it takes.
Then I sat back and watched the logs:

2017-01-19 10:04:46  [ nioEventLoopGroup-1-0:1625429 ] - [ INFO ]  消息主体:60160308049620860021010707190117020453395443491162627407087d081f00002e37008801008c00f9
2017-01-19 10:04:49  [ nioEventLoopGroup-1-0:1628830 ] - [ ERROR ]  LEAK: ByteBuf.release() was not called before it's garbage-collected. Enable advanced leak reporting to find out where the leak occurred. To enable advanced leak reporting, specify the JVM option '-Dio.netty.leakDetectionLevel=advanced' or call ResourceLeakDetector.setLevel() See http://netty.io/wiki/reference-counted-objects.html for more information.
2017-01-19 10:04:49  [ nioEventLoopGroup-1-0:1628845 ] - [ INFO ]  入缓存队列操作结果:9
2017-01-19 10:04:49  [ nioEventLoopGroup-1-0:1628845 ] - [ INFO ]  消息主体:601603080496208600210107071901170204573954434611626262170f88091f00002e37008801008c00fa
2017-01-19 10:04:53  [ nioEventLoopGroup-1-0:1632839 ] - [ INFO ]  入缓存队列操作结果:9
2017-01-19 10:04:53  [ nioEventLoopGroup-1-0:1632839 ] - [ INFO ]  消息主体:60160308049620860021010707190117020501395443581162624817108a091f00002e37008801008c00fb
2017-01-19 10:04:55  [ nioEventLoopGroup-1-0:1634196 ] - [ INFO ]  入缓存队列操作结果:9
2017-01-19 10:04:55  [ nioEventLoopGroup-1-0:1634196 ] - [ INFO ]  消息主体:601603080496208600210107071901170205023954436011626244571288091f00002e37008801008c00fc
2017-01-19 10:04:56  [ nioEventLoopGroup-1-0:1635288 ] - [ INFO ]  入缓存队列操作结果:9
2017-01-19 10:04:56  [ nioEventLoopGroup-1-0:1635288 ] - [ INFO ]  消息主体:60160308049620860021010707190117020503395443651162624107118a091f00002e37008801008c00fd
2017-01-19 10:04:57  [ nioEventLoopGroup-1-0:1636443 ] - [ INFO ]  入缓存队列操作结果:9
2017-01-19 10:04:57  [ nioEventLoopGroup-1-0:1636443 ] - [ INFO ]  消息主体:601603080496208600210107071901170205053954437111626234671088091f00002e37008801008c00fe

Note this line:

LEAK: ByteBuf.release() was not called before it's garbage-collected. Enable advanced leak reporting to find out where the leak occurred. To enable advanced leak reporting, specify the JVM option '-Dio.netty.leakDetectionLevel=advanced' or call ResourceLeakDetector.setLevel() See http://netty.io/wiki/reference-counted-objects.html for more information.

From this message we learn that adding

ResourceLeakDetector.setLevel(ResourceLeakDetector.Level.ADVANCED);

sets the leak-detection level to ADVANCED, which makes the report detailed enough to locate the leak. Checking the log again:

2017-01-19 10:35:59  [ nioEventLoopGroup-1-0:665092 ] - [ ERROR ]  LEAK: ByteBuf.release() was not called before it's garbage-collected. See http://netty.io/wiki/reference-counted-objects.html for more information.
Recent access records: 5
#5:
    io.netty.buffer.AdvancedLeakAwareByteBuf.readBytes(AdvancedLeakAwareByteBuf.java:435)
    com.dhcc.ObdServer.ObdServerHandler.channelRead(ObdServerHandler.java:31)
    io.netty.channel.ChannelHandlerInvokerUtil.invokeChannelReadNow(ChannelHandlerInvokerUtil.java:84)
    io.netty.channel.DefaultChannelHandlerInvoker.invokeChannelRead(DefaultChannelHandlerInvoker.java:153)
    io.netty.channel.PausableChannelEventExecutor.invokeChannelRead(PausableChannelEventExecutor.java:86)
    io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:389)
    io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:243)
    io.netty.channel.ChannelHandlerInvokerUtil.invokeChannelReadNow(ChannelHandlerInvokerUtil.java:84)
    io.netty.channel.DefaultChannelHandlerInvoker.invokeChannelRead(DefaultChannelHandlerInvoker.java:153)
    io.netty.channel.PausableChannelEventExecutor.invokeChannelRead(PausableChannelEventExecutor.java:86)
    io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:389)
    io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:956)
    io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:127)
    io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:514)
    io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:471)
    io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:385)
    io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:351)
    io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
    io.netty.util.internal.chmv8.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1412)
    io.netty.util.internal.chmv8.ForkJoinTask.doExec(ForkJoinTask.java:280)
    io.netty.util.internal.chmv8.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:877)
    io.netty.util.internal.chmv8.ForkJoinPool.scan(ForkJoinPool.java:1706)
    io.netty.util.internal.chmv8.ForkJoinPool.runWorker(ForkJoinPool.java:1661)
    io.netty.util.internal.chmv8.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:126)
#4:
    Hint: 'ObdServerHandler#0' will handle the message from this point.
    io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:387)
    io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:243)
    io.netty.channel.ChannelHandlerInvokerUtil.invokeChannelReadNow(ChannelHandlerInvokerUtil.java:84)
    io.netty.channel.DefaultChannelHandlerInvoker.invokeChannelRead(DefaultChannelHandlerInvoker.java:153)
    io.netty.channel.PausableChannelEventExecutor.invokeChannelRead(PausableChannelEventExecutor.java:86)
    io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:389)
    io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:956)
    io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:127)
    io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:514)
    io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:471)
    io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:385)
    io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:351)
    io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
    io.netty.util.internal.chmv8.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1412)
    io.netty.util.internal.chmv8.ForkJoinTask.doExec(ForkJoinTask.java:280)
    io.netty.util.internal.chmv8.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:877)
    io.netty.util.internal.chmv8.ForkJoinPool.scan(ForkJoinPool.java:1706)
    io.netty.util.internal.chmv8.ForkJoinPool.runWorker(ForkJoinPool.java:1661)
    io.netty.util.internal.chmv8.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:126)
#3:
    io.netty.buffer.AdvancedLeakAwareByteBuf.release(AdvancedLeakAwareByteBuf.java:721)
    io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:237)
    io.netty.channel.ChannelHandlerInvokerUtil.invokeChannelReadNow(ChannelHandlerInvokerUtil.java:84)
    io.netty.channel.DefaultChannelHandlerInvoker.invokeChannelRead(DefaultChannelHandlerInvoker.java:153)
    io.netty.channel.PausableChannelEventExecutor.invokeChannelRead(PausableChannelEventExecutor.java:86)
    io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:389)
    io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:956)
    io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:127)
    io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:514)
    io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:471)
    io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:385)
    io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:351)
    io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
    io.netty.util.internal.chmv8.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1412)
    io.netty.util.internal.chmv8.ForkJoinTask.doExec(ForkJoinTask.java:280)
    io.netty.util.internal.chmv8.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:877)
    io.netty.util.internal.chmv8.ForkJoinPool.scan(ForkJoinPool.java:1706)
    io.netty.util.internal.chmv8.ForkJoinPool.runWorker(ForkJoinPool.java:1661)
    io.netty.util.internal.chmv8.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:126)
#2:
    io.netty.buffer.AdvancedLeakAwareByteBuf.retain(AdvancedLeakAwareByteBuf.java:693)
    io.netty.handler.codec.DelimiterBasedFrameDecoder.decode(DelimiterBasedFrameDecoder.java:277)
    io.netty.handler.codec.DelimiterBasedFrameDecoder.decode(DelimiterBasedFrameDecoder.java:216)
    io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:316)
    io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:230)
    io.netty.channel.ChannelHandlerInvokerUtil.invokeChannelReadNow(ChannelHandlerInvokerUtil.java:84)
    io.netty.channel.DefaultChannelHandlerInvoker.invokeChannelRead(DefaultChannelHandlerInvoker.java:153)
    io.netty.channel.PausableChannelEventExecutor.invokeChannelRead(PausableChannelEventExecutor.java:86)
    io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:389)
    io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:956)
    io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:127)
    io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:514)
    io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:471)
    io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:385)
    io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:351)
    io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
    io.netty.util.internal.chmv8.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1412)
    io.netty.util.internal.chmv8.ForkJoinTask.doExec(ForkJoinTask.java:280)
    io.netty.util.internal.chmv8.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:877)
    io.netty.util.internal.chmv8.ForkJoinPool.scan(ForkJoinPool.java:1706)
    io.netty.util.internal.chmv8.ForkJoinPool.runWorker(ForkJoinPool.java:1661)
    io.netty.util.internal.chmv8.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:126)
#1:
    io.netty.buffer.AdvancedLeakAwareByteBuf.skipBytes(AdvancedLeakAwareByteBuf.java:465)
    io.netty.handler.codec.DelimiterBasedFrameDecoder.decode(DelimiterBasedFrameDecoder.java:272)
    io.netty.handler.codec.DelimiterBasedFrameDecoder.decode(DelimiterBasedFrameDecoder.java:216)
    io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:316)
    io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:230)
    io.netty.channel.ChannelHandlerInvokerUtil.invokeChannelReadNow(ChannelHandlerInvokerUtil.java:84)
    io.netty.channel.DefaultChannelHandlerInvoker.invokeChannelRead(DefaultChannelHandlerInvoker.java:153)
    io.netty.channel.PausableChannelEventExecutor.invokeChannelRead(PausableChannelEventExecutor.java:86)
    io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:389)
    io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:956)
    io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:127)
    io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:514)
    io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:471)
    io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:385)
    io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:351)
    io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
    io.netty.util.internal.chmv8.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1412)
    io.netty.util.internal.chmv8.ForkJoinTask.doExec(ForkJoinTask.java:280)
    io.netty.util.internal.chmv8.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:877)
    io.netty.util.internal.chmv8.ForkJoinPool.scan(ForkJoinPool.java:1706)
    io.netty.util.internal.chmv8.ForkJoinPool.runWorker(ForkJoinPool.java:1661)
    io.netty.util.internal.chmv8.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:126)
Created at:
    io.netty.buffer.PooledByteBufAllocator.newDirectBuffer(PooledByteBufAllocator.java:250)
    io.netty.buffer.AbstractByteBufAllocator.directBuffer(AbstractByteBufAllocator.java:155)
    io.netty.buffer.AbstractByteBufAllocator.directBuffer(AbstractByteBufAllocator.java:146)
    io.netty.buffer.AbstractByteBufAllocator.ioBuffer(AbstractByteBufAllocator.java:107)
    io.netty.channel.AdaptiveRecvByteBufAllocator$HandleImpl.allocate(AdaptiveRecvByteBufAllocator.java:104)
    io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:113)
    io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:514)
    io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:471)
    io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:385)
    io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:351)
    io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
    io.netty.util.internal.chmv8.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1412)
    io.netty.util.internal.chmv8.ForkJoinTask.doExec(ForkJoinTask.java:280)
    io.netty.util.internal.chmv8.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:877)
    io.netty.util.internal.chmv8.ForkJoinPool.scan(ForkJoinPool.java:1706)
    io.netty.util.internal.chmv8.ForkJoinPool.runWorker(ForkJoinPool.java:1661)
    io.netty.util.internal.chmv8.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:126)

This locates the leak in my code, at:

ByteBuf buff = (ByteBuf) msg;
byte[] req = new byte[buff.readableBytes()];

So it was confirmed that the problem was a ByteBuf memory leak. Investigating from that angle, I found that Netty 5 allocates ByteBufs through PooledByteBufAllocator by default, so they must be released explicitly or the memory leaks.
Releasing the ByteBuf fixes it:

ReferenceCountUtil.release(buff);

Here is one reader's explanation of what that line does:

ReferenceCountUtil.release() is really a wrapper around ByteBuf.release() (inherited from the ReferenceCounted interface). ByteBuf in Netty 4 uses reference counting (Netty 4 also implements an optional ByteBuf pool): every newly allocated ByteBuf has a reference count of 1; each additional reference to the ByteBuf requires a call to ByteBuf.retain(), and each dropped reference requires a call to ByteBuf.release(). When the reference count reaches 0, the object can be reclaimed. ByteBuf is just the example here; other classes also implement ReferenceCounted and work the same way.
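The retain/release life cycle described above can be sketched with a small, self-contained example. This is plain Java with no Netty dependency; the RefCountedBuf class and its process() caller are invented for illustration and are not Netty's actual implementation:

```java
// Minimal sketch of Netty-style reference counting (illustrative only).
final class RefCountedBuf {
    private int refCnt = 1;          // newly allocated buffers start at 1
    private boolean deallocated = false;

    int refCnt() { return refCnt; }

    RefCountedBuf retain() {         // one more holder of this buffer
        if (refCnt == 0) throw new IllegalStateException("already released");
        refCnt++;
        return this;
    }

    boolean release() {              // one holder is done with the buffer
        if (refCnt == 0) throw new IllegalStateException("already released");
        if (--refCnt == 0) {
            deallocated = true;      // here the memory would go back to the pool
            return true;
        }
        return false;
    }

    boolean isDeallocated() { return deallocated; }
}

public class RefCountDemo {
    // Mirrors the fix in the article: release in finally, so the buffer is
    // freed even if processing throws.
    static void process(RefCountedBuf buf) {
        try {
            // ... read bytes, decode the message ...
        } finally {
            buf.release();
        }
    }

    public static void main(String[] args) {
        RefCountedBuf buf = new RefCountedBuf();
        buf.retain();                            // refCnt: 2
        System.out.println(buf.refCnt());        // prints 2
        buf.release();                           // refCnt: 1
        process(buf);                            // refCnt: 0 -> deallocated
        System.out.println(buf.isDeallocated()); // prints true
    }
}
```

In the real handler, the same pattern is simply ReferenceCountUtil.release(buff) in a finally block, so the ByteBuf is released even when processing throws.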

While investigating, I also wondered whether the data loss came from my Netty service using UDP, so here is how to tell whether a Netty service is using TCP or UDP:

About TCP and UDP
A socket can be built on either TCP or UDP. The difference is that UDP does not guarantee that every packet arrives intact, so it performs better but is less reliable; TCP guarantees delivery, at some cost in performance.
UDP is basically only suitable for things like live video streaming; our use case should be TCP.

So how do the two differ in code? I found this explanation online:

For the ChannelFactory, UDP communication uses NioDatagramChannelFactory, while for TCP we chose NioServerSocketChannelFactory;
for the Bootstrap, UDP uses ConnectionlessBootstrap, while TCP uses ServerBootstrap. (Note these are class names from the old Netty 3.x API.)

For the decoder and encoder, and for the ChannelPipelineFactory, UDP development is no different from TCP, so they are not covered in detail here.

The ChannelHandler is where UDP and TCP really differ. UDP is connectionless: you can still obtain the current session's channel through the MessageEvent parameter's getChannel() method, but its isConnected() always returns false.
In UDP development, once the message-received callback has the channel object, you send data to the peer directly with channel.write(message, remoteAddress): the first argument is still the message object to send, and the second is the peer's SocketAddress.
The point to watch most closely is the SocketAddress: in TCP communication we can get it from channel.getRemoteAddress(), but in UDP we must obtain the peer's SocketAddress by calling getRemoteAddress() on the MessageEvent.
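The underlying point above — that UDP is connectionless, so the peer address travels with each datagram rather than with a connection — can be seen with plain JDK sockets as well. The following loopback sketch is independent of Netty; the UdpLoopback class name is mine, not from the article:

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

public class UdpLoopback {
    // Sends msg to a local "server" socket over UDP and returns what arrived.
    static String exchange(String msg) throws Exception {
        try (DatagramSocket server = new DatagramSocket(0);   // OS picks a free port
             DatagramSocket client = new DatagramSocket()) {
            server.setSoTimeout(2000);                        // don't block forever

            byte[] payload = msg.getBytes(StandardCharsets.UTF_8);
            // Connectionless: the destination address must accompany every send.
            client.send(new DatagramPacket(payload, payload.length,
                    InetAddress.getLoopbackAddress(), server.getLocalPort()));

            byte[] buf = new byte[512];
            DatagramPacket received = new DatagramPacket(buf, buf.length);
            server.receive(received);
            // The peer's address is read from the datagram itself
            // (received.getSocketAddress()), not from any connection state --
            // the plain-JDK analogue of getting it from the MessageEvent.
            return new String(received.getData(), 0, received.getLength(),
                    StandardCharsets.UTF_8);
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(exchange("ping"));   // prints "ping"
    }
}
```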


Reader comments

  • 948a51ed9ccd: "So it was confirmed that the problem was a ByteBuf memory leak... Netty 5 allocates ByteBufs through PooledByteBufAllocator by default, so they must be released explicitly or the memory leaks."
    Why does PooledByteBufAllocator require manual release? Isn't PooledByteBufAllocator on-heap memory, and isn't the heap managed by the JVM?
    陈森来: 1. Netty reference-counts any ByteBuf implementing ReferenceCounted: a new ByteBuf starts at 1, retain() adds 1, release() subtracts 1, and if release() is called while the count is 1 (nothing else references the ByteBuf) it is deallocated.
    2. Most of the time you don't need to call release() by hand. For example, for inbound channelRead events, if you use a ByteToMessageDecoder, Netty calls release after decoding; for outbound write events, the final write in HeadContext calls release inside unsafe.
    3. A ByteBuf allocated by PooledByteBufAllocator must be released/deallocated to hand its memory block back to the pool; otherwise the pool keeps the block marked as allocated. A direct ByteBuf likewise needs release/deallocate to free the off-heap memory.
    4. To guard against leaks, Netty wraps direct and pooled ByteBufs in an AdvancedLeakAwareByteBuf that performs leak detection.
    That's the background.

    The leak in this post, at a guess, comes from implementing the frame decoder by hand rather than extending ByteToMessageDecoder, and never calling release in it.

    Why does PooledByteBufAllocator require manual release?
    -- It does need releasing, but not necessarily by your own code: Netty's ChannelHandlers may already release it for you; it depends on how data flows through the ChannelPipeline.
    Isn't PooledByteBufAllocator on-heap memory, managed by the JVM?
    -- Not necessarily on-heap; it can be off-heap too. The point of returning pooled memory is that it can be allocated out again.
    贾亦真亦贾: Have a look at https://zhuanlan.zhihu.com/p/21741364 and see whether it helps.
    贾亦真亦贾: PooledByteBufAllocator is implemented on top of thread-local context, so it cannot be used across threads. Once you cross threads you are no longer operating on the same memory region, which leads to many serious problems, memory leaks among them.
    Source:
    final ThreadLocal<PoolThreadCache> threadCache = new ThreadLocal<PoolThreadCache>() {
        private final AtomicInteger index = new AtomicInteger();

        @Override
        protected PoolThreadCache initialValue() {
            final int idx = index.getAndIncrement();
            final PoolArena<byte[]> heapArena;
            final PoolArena<ByteBuffer> directArena;
            if (heapArenas != null) {
                heapArena = heapArenas[Math.abs(idx % heapArenas.length)];
            } else {
                heapArena = null;
            }
            if (directArenas != null) {
                directArena = directArenas[Math.abs(idx % directArenas.length)];
            } else {
                directArena = null;
            }
            return new PoolThreadCache(heapArena, directArena);
        }
    };
    Of course, my analysis here may not be right either; it later turned out the problem wasn't coming from here after all.

Original: https://www.haomeiwen.com/subject/faasnttx.html