Apache Spark job fails with InvalidClassException, class version mismatch, in Azure HDInsight

This article describes troubleshooting steps and possible resolutions for issues when using Apache Spark components in Azure HDInsight clusters.

Issue

When you try to create an Apache Spark job in a Spark 2.x cluster, it fails with an error similar to the following:

18/09/18 09:32:26 WARN TaskSetManager: Lost task 0.0 in stage 1.0 (TID 1, wn7-dev-co.2zyfbddadfih0xdq0cdja4g.ax.internal.chinacloudapp.cn, executor 4): java.io.InvalidClassException:
org.apache.commons.lang3.time.FastDateFormat; local class incompatible: stream classdesc serialVersionUID = 2, local class serialVersionUID = 1
        at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:699)
        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1885)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1751)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2042)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)

Cause

This error can be caused by adding an extra jar to the spark.yarn.jars configuration, specifically a shaded jar that includes a different version of the commons-lang3 package and therefore introduces a class version mismatch. By default, Spark 2.1, 2.2, and 2.3 use version 3.5 of commons-lang3.
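For example, the mismatch can appear when the cluster's spark.yarn.jars setting references a custom shaded jar in addition to the stock Spark jars. The paths and jar name below are illustrative only; the values on your cluster will differ:

    spark.yarn.jars  local:///usr/hdp/current/spark2-client/jars/*,wasbs:///example/custom-appender-shaded.jar

If the custom jar bundles a different commons-lang3 release than the 3.5 version that Spark provides, one side of the shuffle can serialize a definition of FastDateFormat that the other side cannot deserialize, producing the InvalidClassException shown above.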

Tip

To shade a library is to put its contents into your own jar and rename its packages. This differs from packaging the library, which puts the library into your own jar without changing its package names.
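One way to tell which case you have is to list the contents of the custom jar and look for the commons-lang3 classes; the jar name here is a placeholder for your own artifact:

    jar tf custom-appender-shaded.jar | grep org/apache/commons/lang3/time/FastDateFormat

If the output shows org/apache/commons/lang3/time/FastDateFormat.class under its original package, the library was only packaged, not shaded, and its copy of the class can collide with the one Spark already provides.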

Resolution

Either remove the jar, or recompile the customized jar (AzureLogAppender) and use the maven-shade-plugin to relocate the conflicting classes.
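The following maven-shade-plugin snippet is a minimal sketch of such a relocation; the plugin version and the com.example.shaded prefix are illustrative choices, not values required by HDInsight:

    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>3.2.4</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <relocations>
              <relocation>
                <!-- Move the bundled commons-lang3 classes into a private package
                     so they no longer clash with the commons-lang3 3.5 that Spark ships. -->
                <pattern>org.apache.commons.lang3</pattern>
                <shadedPattern>com.example.shaded.org.apache.commons.lang3</shadedPattern>
              </relocation>
            </relocations>
          </configuration>
        </execution>
      </executions>
    </plugin>

After rebuilding with this relocation, the custom jar calls its own renamed copy of FastDateFormat, and the executors continue to use the commons-lang3 version that ships with Spark.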

Next steps

If you didn't see your problem or are unable to solve your issue, visit one of the following channels for more support:

  • If you need more help, you can submit a support request from the Azure portal. Select Support from the menu bar or open the Help + support hub.