Error when setting spark.plugins to com.nvidia.spark.SQLPlugin

Hello,

While running the Spark example, I tried to configure spark.plugins as specified in the instructions. Here is how I set it in the Spark configuration:

import time
import os
from pyspark.sql import SparkSession
from pyspark.sql.window import Window
from pyspark.conf import SparkConf

conf = SparkConf()
conf.setAppName("MortgageETL+XGBoost")

# Enable the RAPIDS Accelerator and register its plugin
conf.set("spark.rapids.sql.enabled", "true")
conf.set("spark.plugins", "com.nvidia.spark.SQLPlugin")

# Create the Spark session
spark = SparkSession.builder.config(conf=conf).getOrCreate()

I get the following error:

To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
22/11/02 14:40:25 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: Could not find Spark Shim Loader for 3.0.1
	at com.nvidia.spark.rapids.ShimLoader$.$anonfun$detectShimProvider$18(ShimLoader.scala:290)
	at scala.Option.getOrElse(Option.scala:189)
	at com.nvidia.spark.rapids.ShimLoader$.detectShimProvider(ShimLoader.scala:290)
	at com.nvidia.spark.rapids.ShimLoader$.findShimProvider(ShimLoader.scala:301)
	at com.nvidia.spark.rapids.ShimLoader$.initShimProviderIfNeeded(ShimLoader.scala:103)
	at com.nvidia.spark.rapids.ShimLoader$.getShimClassLoader(ShimLoader.scala:196)
	at com.nvidia.spark.rapids.ShimLoader$.loadClass(ShimLoader.scala:329)
	at com.nvidia.spark.rapids.ShimLoader$.newInstanceOf(ShimLoader.scala:335)
	at com.nvidia.spark.rapids.ShimLoader$.newDriverPlugin(ShimLoader.scala:368)
	at com.nvidia.spark.SQLPlugin.driverPlugin(SQLPlugin.scala:29)
	at org.apache.spark.internal.plugin.DriverPluginContainer.$anonfun$driverPlugins$1(PluginContainer.scala:44)
	at scala.collection.TraversableLike.$anonfun$flatMap$1(TraversableLike.scala:245)
	at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
	at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
	at scala.collection.TraversableLike.flatMap(TraversableLike.scala:245)
	at scala.collection.TraversableLike.flatMap$(TraversableLike.scala:242)
	at scala.collection.AbstractTraversable.flatMap(Traversable.scala:108)
	at org.apache.spark.internal.plugin.DriverPluginContainer.<init>(PluginContainer.scala:43)
	at org.apache.spark.internal.plugin.PluginContainer$.apply(PluginContainer.scala:163)
	at org.apache.spark.internal.plugin.PluginContainer$.apply(PluginContainer.scala:146)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:530)
	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
	at py4j.Gateway.invoke(Gateway.java:238)
	at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
	at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
	at py4j.GatewayConnection.run(GatewayConnection.java:238)
	at java.lang.Thread.run(Thread.java:750)
22/11/02 14:40:25 WARN MetricsSystem: Stopping a MetricsSystem that is not running

Here are the details of the packages I am using:

pyspark == 3.0.1

Jar files (a sketch of how we attach them follows the list):
com.nvidia:rapids-4-spark_2.12:22.10.0
ai.rapids:cudf:22.10.0
ml.dmlc:xgboost4j-spark-gpu_2.12:1.6.2
ml.dmlc:xgboost4j-gpu_2.12:1.6.2
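
For reference, here is a minimal sketch of how these jars can be attached when building the session, assuming all four coordinates are published to a repository Spark's resolver can reach. Note that spark.jars.packages must be set before getOrCreate() is called:

# Comma-separated Maven coordinates, resolved when the session starts
conf.set(
    "spark.jars.packages",
    ",".join([
        "com.nvidia:rapids-4-spark_2.12:22.10.0",
        "ai.rapids:cudf:22.10.0",
        "ml.dmlc:xgboost4j-spark-gpu_2.12:1.6.2",
        "ml.dmlc:xgboost4j-gpu_2.12:1.6.2",
    ]),
)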

Do let me know if any extra information is needed.

#inception #inceptionprogram

The reason my team and I hit this issue was our PySpark version, 3.0.1. The stack trace says it directly: the rapids-4-spark 22.10.0 jar could not find a shim provider for Spark 3.0.1, i.e. that release no longer ships a shim for the 3.0.x line. Upgrading to PySpark 3.1.1 solved the problem.
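
Since the shim loader matches on the exact Spark version string, a quick sanity check before creating the session can save a round of debugging. A minimal sketch, assuming PySpark has been upgraded as described above:

import pyspark
from pyspark.sql import SparkSession
from pyspark.conf import SparkConf

# The RAPIDS jar must ship a shim for this exact version (3.1.1 in our case)
print(pyspark.__version__)

conf = SparkConf()
conf.setAppName("MortgageETL+XGBoost")
conf.set("spark.rapids.sql.enabled", "true")
conf.set("spark.plugins", "com.nvidia.spark.SQLPlugin")

spark = SparkSession.builder.config(conf=conf).getOrCreate()
print(spark.version)  # the version the shim loader actually sees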

