
Upgrade Netty dependency and fix Thread shutdown #7

@tgpfeiffer

Description


I am working on a Scala project that uses both Apache Spark (http://spark.apache.org/) and msgpack-rpc. Spark (transitively) depends on io.netty:netty:3.6.6.Final while msgpack-rpc depends on org.jboss.netty:netty:3.2.1.Final.

If I just include the dependencies on Spark and msgpack-rpc in my build.sbt file, then Spark will refuse to start with:

java.lang.VerifyError: (class: org/jboss/netty/channel/socket/nio/NioWorkerPool, method: createWorker signature: (Ljava/util/concurrent/Executor;)Lorg/jboss/netty/channel/socket/nio/AbstractNioWorker;) Wrong return type in function
at akka.remote.transport.netty.NettyTransport.(NettyTransport.scala:282)
at akka.remote.transport.netty.NettyTransport.(NettyTransport.scala:239)
...

So I edited my sbt file and excluded the transitive dependency on org.jboss.netty:netty from the msgpack-rpc dependency, so that the (newer) Netty version 3.6.6.Final is used. msgpack-rpc still runs fine, but has problems with shutdown: client.getEventLoop.shutdown() will not stop the Netty threads; many "New I/O worker" threads and one "New I/O boss" thread stay alive.
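For reference, the exclusion described above can be written in build.sbt roughly like this (the msgpack-rpc coordinates and version numbers here are illustrative guesses, not the exact ones from my build):

```scala
// build.sbt -- coordinates and versions are illustrative, adjust to your build
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "0.9.0-incubating",
  // drop msgpack-rpc's transitive dependency on the old org.jboss.netty:netty
  // so that Spark's io.netty:netty:3.6.6.Final wins
  ("org.msgpack" % "msgpack-rpc" % "0.7.0").exclude("org.jboss.netty", "netty")
)
```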

This is a problem, for instance, when using sbt run, because sbt waits for all non-daemon threads to exit before returning to the sbt shell. With the newer Netty version, those threads never exit, so sbt run never returns.
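The hang mechanism can be reproduced with a plain JDK thread pool, which behaves like Netty's non-daemon I/O threads: the JVM (and therefore sbt run) only exits once every non-daemon thread has terminated. A minimal JDK-only sketch (names are illustrative, not msgpack-rpc API):

```scala
import java.util.concurrent.{Executors, TimeUnit}

object ShutdownDemo {
  def main(args: Array[String]): Unit = {
    // Executors.newFixedThreadPool creates non-daemon threads by default,
    // just like Netty 3.x's "New I/O worker" threads.
    val pool = Executors.newFixedThreadPool(2)
    pool.submit(new Runnable { def run(): Unit = () })
    // Without an explicit shutdown, main() would return here but the JVM
    // would keep running, which is exactly the sbt run symptom.
    pool.shutdown()
    val stopped = pool.awaitTermination(5, TimeUnit.SECONDS)
    assert(stopped, "pool threads did not terminate")
  }
}
```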

So I was wondering if it's possible to update the msgpack-rpc code so that thread shutdown also works with a newer Netty version. Or is this in fact a Netty bug?
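One plausible explanation on the Netty side: in Netty 3.5+, ChannelFactory.shutdown() explicitly does not release external resources, so the boss/worker executor threads keep running until releaseExternalResources() is called as well. A sketch of that sequence against plain Netty 3.6 (this is raw Netty, not msgpack-rpc's EventLoop API, which may or may not expose the factory):

```scala
import java.util.concurrent.Executors
import org.jboss.netty.channel.socket.nio.NioClientSocketChannelFactory

object NettyShutdownSketch {
  def main(args: Array[String]): Unit = {
    val factory = new NioClientSocketChannelFactory(
      Executors.newCachedThreadPool(), // boss pool
      Executors.newCachedThreadPool()) // worker pool
    // ... connect, exchange messages ...
    // shutdown() alone leaves the "New I/O" threads running on 3.6.x:
    factory.shutdown()
    // releasing external resources is what actually terminates the
    // executors and their non-daemon threads:
    factory.releaseExternalResources()
  }
}
```

If msgpack-rpc's EventLoop.shutdown() only calls shutdown() on its channel factory, adding a releaseExternalResources() call there might be all that's needed.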
