Joseph Jean pierre asked:

Getting an sbt error: not able to compile, it gives me an slf4j error

I'm getting this sbt error. I'm new to sbt and I'm trying to compile the Build.scala for my existing project. When I searched Google for the issue below, the advice was to add the slf4j jars to the dependencies or classpath and it would work, but I keep getting the same error. I tried adding them in the Build.scala file (sketched just below, before the sbt output), but the issue remains. I'm not sure whether I have put in the right version, etc.
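
For reference, the kind of dependency line those search results describe amounts to something like the one below in an sbt build. This is only a minimal sketch: the binding artifact and version mirror the slf4j-log4j12 entry already present in the Build.scala further down, and any other SLF4J binding (logback-classic, slf4j-simple) would serve the same purpose.

// Sketch only: make an SLF4J binding available so that
// org.slf4j.impl.StaticLoggerBinder can be found. Artifact and version are
// taken from the existing Build.scala; note that a "provided" scope keeps
// the jar off the runtime classpath.
libraryDependencies += "org.slf4j" % "slf4j-log4j12" % "1.6.6"
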
sbt
Detected sbt version 0.13.16
Starting sbt: invoke with -help for other options
Using /home/rsa-key-20190625/.sbt/0.13.16 as sbt dir, -sbt-dir to override.
[info] Loading global plugins from /home/rsa-key-20180725/.sbt/0.13.16/plugins
[info] Loading project definition from /devops/iot/installs/trendalyze-spark-job-server/project
[info] Set current project to Trendalyze spark job server (in build file:/devops/iot/installs/trendalyze-spark-job-server/)
> compile
[info] Compiling Templates in Template Directory: /devops/iot/installs/trendalyze-spark-job-server/src/main/webapp/WEB-INF/templates
Failed to instantiate SLF4J LoggerFactory
Reported exception:
java.lang.NoClassDefFoundError: org.slf4j.impl.StaticLoggerBinder
	at org.slf4j.LoggerFactory.bind(LoggerFactory.java:121)
	at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:111)
	at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:268)
	at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:241)
	at org.fusesource.scalate.util.Log$$anon$1.log$lzycompute(Logging.scala:31)
	at org.fusesource.scalate.util.Log$$anon$1.log(Logging.scala:31)
	at org.fusesource.scalate.util.Log$class.debug(Logging.scala:142)
	at org.fusesource.scalate.util.Log$$anon$1.debug(Logging.scala:30)
	at org.fusesource.scalate.util.ClassFinder$.discoverCommandClasses(ClassFinder.scala:69)
	at org.fusesource.scalate.util.ClassFinder$$anonfun$discoverCommands$1$$anonfun$apply$1.apply(ClassFinder.scala:36)
	at org.fusesource.scalate.util.ClassFinder$$anonfun$discoverCommands$1$$anonfun$apply$1.apply(ClassFinder.scala:36)
	at org.fusesource.scalate.util.ClassLoaders$.withContextClassLoader(ClassLoaders.scala:86)
	at org.fusesource.scalate.util.ClassFinder$$anonfun$discoverCommands$1.apply(ClassFinder.scala:35)
	at org.fusesource.scalate.util.ClassFinder$$anonfun$discoverCommands$1.apply(ClassFinder.scala:34)
	at scala.collection.immutable.List.flatMap(List.scala:327)
	at org.fusesource.scalate.util.ClassFinder$.discoverCommands(ClassFinder.scala:34)
	at org.fusesource.scalate.TemplateEngine.<init>(TemplateEngine.scala:261)
	at com.mojolly.scalate.Generator.engine$lzycompute(Generator.scala:21)
	at com.mojolly.scalate.Generator.engine(Generator.scala:20)
	at com.mojolly.scalate.Generator.execute(Generator.scala:51)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:90)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:55)
	at java.lang.reflect.Method.invoke(Method.java:508)
	at com.mojolly.scalate.ScalatePlugin$$anonfun$generateScalateSource$1$$anonfun$apply$1.apply(ScalatePlugin.scala:101)
	at com.mojolly.scalate.ScalatePlugin$$anonfun$generateScalateSource$1$$anonfun$apply$1.apply(ScalatePlugin.scala:73)
	at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
	at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
	at scala.collection.immutable.List.foreach(List.scala:318)
	at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:251)
	at scala.collection.AbstractTraversable.flatMap(Traversable.scala:105)
	at com.mojolly.scalate.ScalatePlugin$$anonfun$generateScalateSource$1.apply(ScalatePlugin.scala:73)
	at com.mojolly.scalate.ScalatePlugin$$anonfun$generateScalateSource$1.apply(ScalatePlugin.scala:72)
	at com.mojolly.scalate.ScalatePlugin$.withScalateClassLoader(ScalatePlugin.scala:129)
	at com.mojolly.scalate.ScalatePlugin$.generateScalateSource(ScalatePlugin.scala:72)
	at com.mojolly.scalate.ScalatePlugin$$anonfun$scalateSourceGeneratorTask$1.apply(ScalatePlugin.scala:52)
	at com.mojolly.scalate.ScalatePlugin$$anonfun$scalateSourceGeneratorTask$1.apply(ScalatePlugin.scala:52)
	at scala.Function6$$anonfun$tupled$1.apply(Function6.scala:35)
	at scala.Function6$$anonfun$tupled$1.apply(Function6.scala:34)
	at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
	at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
	at sbt.std.Transform$$anon$4.work(System.scala:63)
	at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
	at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
	at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
	at sbt.Execute.work(Execute.scala:237)
	at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
	at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
	at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
	at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
	at java.util.concurrent.FutureTask.run(FutureTask.java:277)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:522)
	at java.util.concurrent.FutureTask.run(FutureTask.java:277)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1160)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
	at java.lang.Thread.run(Thread.java:812)
Caused by: java.lang.ClassNotFoundException: org.slf4j.impl.StaticLoggerBinder
	at java.net.URLClassLoader.findClass(URLClassLoader.java:610)
	at java.lang.ClassLoader.loadClassHelper(ClassLoader.java:937)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:882)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:865)
	... 56 more
[trace] Stack trace suppressed: run last trendalyze-spark-job-server/compile:managedSources for the full output.
[error] (trendalyze-spark-job-server/compile:managedSources) java.lang.NoClassDefFoundError: org.slf4j.impl.StaticLoggerBinder
[error] Total time: 1 s, completed Jun 12, 2019 10:16:04 AM


Build.scala:
import sbt._
import Keys._
import org.scalatra.sbt._
import org.scalatra.sbt.PluginKeys._
import com.earldouglas.xwp.JettyPlugin
import com.mojolly.scalate.ScalatePlugin._
import scoverage.ScoverageKeys._
import sbtassembly.AssemblyKeys._
import sbtassembly.{PathList,MergeStrategy}
import ScalateKeys._
//import org.apache.spark.streaming.kafka010._

object TrendalyzeSparkJobServerBuild extends Build {
  val Organization = "com.trendalyze"
  val Name = "Trendalyze spark job server"
  val Version = "1.5.18-sprint-17-11-17"
  val ScalaVersion = "2.11.8"
  val ScalatraVersion = "2.4.1"
  val SparkVersion = "2.3.1"

  //Joseph added on 7th June 2019
//  lazy val excludeJpountz = ExclusionRule(organization = "net.jpountz.lz4", name = "lz4")
 // lazy val kafkaClients = "org.apache.kafka" % "kafka-clients" % "0.10.0.0" excludeAll(excludeJpountz) // add more exclusions here

  lazy val project = Project (
    "trendalyze-spark-job-server",
    file("."),
    settings = ScalatraPlugin.scalatraSettings ++ scalateSettings ++ Seq(
      organization := Organization,
      name := Name,
      version := Version,
      scalaVersion := ScalaVersion,
      coverageEnabled := false,
      coverageMinimum := 1,
      coverageFailOnMinimum := true,
      resolvers += Classpaths.typesafeReleases,
      resolvers += "Scalaz Bintray Repo" at "http://dl.bintray.com/scalaz/releases",
      resolvers += Resolver.mavenLocal,
      libraryDependencies ++= Seq(
        "junit" % "junit" % "4.12" % "test",
        //"com.github.eirslett" % "sbt-slf4j_2.10" % "0.1",
        //"org.slf4j" %  "slf4j-log4j12" % "1.5.6" % "provided",
        "org.slf4j" % "slf4j-log4j12" % "1.6.6" % "provided",
	"org.scalatra" %% "scalatra" % ScalatraVersion,
        "org.scalatra" %% "scalatra-scalate" % ScalatraVersion,
        "org.scalatra" %% "scalatra-specs2" % ScalatraVersion % "test",
        "ch.qos.logback" % "logback-classic" % "1.1.5" % "runtime",
        "org.eclipse.jetty" % "jetty-webapp" % "9.2.15.v20160210" % "container",
        "javax.servlet" % "javax.servlet-api" % "3.1.0" % "provided",
        "org.apache.spark" %% "spark-core" % SparkVersion,
        "org.apache.spark" %% "spark-hive" % SparkVersion,
        "org.apache.spark" %% "spark-mllib" % SparkVersion,
        "org.apache.spark" %% "spark-yarn" % SparkVersion,
        "com.sun.jersey" % "jersey-bundle" % "1.19.2",
        "org.scalatra" %% "scalatra-json" % "2.4.0-RC2-2",
        "org.json4s"   %% "json4s-jackson" % "3.3.0.RC2",
        "org.scala-lang" % "scala-library" % ScalaVersion,
        "org.scala-lang" % "scala-reflect" % ScalaVersion,
        "org.scala-lang" % "scala-compiler" % ScalaVersion,
        "com.typesafe" % "config" % "1.3.0",
        "mysql" % "mysql-connector-java" % "5.1.38",
        "com.microsoft.sqlserver" % "mssql-jdbc" % "6.1.0.jre8",
        "net.liftweb" %% "lift-json" % "2.6",
        "org.mongodb" %% "casbah" % "3.1.1",
        "org.mongodb.spark" %% "mongo-spark-connector" % "2.0.0",
        //"org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.3.0" % "provided",
        //"org.apache.spark" % "spark-core_2.11" % "2.4.0" % "provided",
        "org.apache.kafka" %% "kafka" % "2.2.1" % "provided",
        //"kafka" % "kafka_2.8.2" % "0.8.0-beta1" % "provided",
        "org.apache.spark" % "spark-streaming_2.11" % "2.4.0" % "provided",
        "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % "2.4.0" % "provided",
        //"org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.0.1",
        "org.apache.commons" % "commons-csv" %"1.3",
        //"iflux.jdbc.driver" % "IfluxDriver" % "1.2",
        "org.scalamock" %% "scalamock-scalatest-support" % "3.2.2" % "test",
        "com.databricks" %% "spark-xml" % "0.4.1",
        "org.apache.hadoop" % "hadoop-aws" % "2.6.0",
        "org.apache.hadoop" % "hadoop-azure" % "2.7.3",
        "com.microsoft.azure" % "azure-storage" % "2.1.0"
        //excludeAll(ExclusionRule(organization = "net.jpountz.lz4")
        //excludeAll(
        //ExclusionRule(organization = "net.jpountz")
       // )
        ),

      //excludeDependencies ++= Seq(ExclusionRule(organization = "net.jpountz")),
      testOptions in Test += Tests.Argument("-oDF"),

      scalateTemplateConfig in Compile <<= (sourceDirectory in Compile){ base =>
        Seq(
          TemplateConfig(
            base / "webapp" / "WEB-INF" / "templates",
            Seq.empty,  /* default imports should be added here */
            Seq(
              Binding("context", "_root_.org.scalatra.scalate.ScalatraRenderContext", importMembers = true, isImplicit = true)
               ),  /* add extra bindings here */
            Some("templates")
          )
        )
      }
    )
  ).enablePlugins(JettyPlugin)

  //excludeDependencies ++= Seq(ExclusionRule(organization = "net.jpountz"))

  //http://www.java2s.com/Code/JarDownload/kafka/kafka_2.8.2-0.8.0-beta1.jar.zip

 
  lazy val sparkDistJar = Project(
    "spark-dist-jar",
    file("spark-dist-jar"),
    settings = Seq(
      organization := Organization,
      version := Version,
      scalaVersion := ScalaVersion,
      scalaSource in Compile := baseDirectory.value / "src",
      libraryDependencies ++= Seq(
        "junit" % "junit" % "4.12" % "provided",
        //"com.github.eirslett" % "sbt-slf4j_2.10" % "0.1",
	//"org.slf4j" %  "slf4j-log4j12" % "1.5.6" % "provided",
	//"org.slf4j" % "slf4j-log4j12" % "1.6.6" % "provided",
        "org.scalatra" %% "scalatra" % ScalatraVersion % "provided",
        "org.scalatra" %% "scalatra-scalate" % ScalatraVersion % "provided",
        "org.scalatra" %% "scalatra-specs2" % ScalatraVersion % "provided",
        "javax.servlet" % "javax.servlet-api" % "3.1.0" % "provided",
        "org.apache.spark" %% "spark-core" % SparkVersion % "provided",
        "org.apache.spark" %% "spark-hive" % SparkVersion % "provided",
        "org.apache.spark" %% "spark-mllib" % SparkVersion % "provided",
        "com.sun.jersey" % "jersey-bundle" % "1.19.2" % "provided",
        "org.scalatra" %% "scalatra-json" % "2.4.0-RC2-2" % "provided",
        "org.json4s"   %% "json4s-jackson" % "3.3.0.RC2" % "provided",
        "org.scala-lang" % "scala-library" % ScalaVersion % "provided",
        "org.scala-lang" % "scala-reflect" % ScalaVersion % "provided",
        "org.scala-lang" % "scala-compiler" % ScalaVersion % "provided",
        "com.typesafe" % "config" % "1.3.0" % "provided",
        "net.liftweb" %% "lift-json" % "2.6" % "provided",
        "org.mongodb" %% "casbah" % "3.1.1",
        "org.apache.commons" % "commons-csv" %"1.3" % "provided",
        "org.mongodb.spark" %% "mongo-spark-connector" % "1.0.0" % "provided",
        //"org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.0.1" % "provided",
        //"org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.3.0" % "provided",
        //"org.apache.spark" % "spark-core_2.11" % "2.4.0" % "provided",
        "org.apache.kafka" %% "kafka" % "2.2.1" % "provided",
        //"kafka" % "kafka_2.8.2" % "0.8.0-beta1" % "provided",
        "org.apache.spark" % "spark-streaming_2.11" % "2.4.0" % "provided",
        "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % "2.4.0" % "provided",
        "org.scalamock" %% "scalamock-scalatest-support" % "3.2.2" % "provided"
      ),
      //excludeDependencies ++= Seq(ExclusionRule(organization = "net.jpountz")),
      test in assembly := {},
      assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false),
      assemblyMergeStrategy in assembly := {
        case PathList("ScalatraBootstrap.class") => MergeStrategy.discard
        case x =>
          val oldStrategy = (assemblyMergeStrategy in assembly).value
          oldStrategy(x)
      }
    )
  )
}


krishna kishore mellacheruvu venkata replied:

Can you check whether this class is present in your slf4j jars?

org.slf4j.impl.StaticLoggerBinder

Also, please check carefully whether the jar that contains it is actually on the classpath. One way to double-check is to open a command prompt, go to the folder from which you are running sbt, and inspect the classpath there.
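
To make that check concrete, a small Scala probe like the one below can be run from a Scala REPL or from sbt's console on the classpath you want to verify. This is only a sketch, not something from the original thread; it simply asks the current classloader whether an SLF4J binding is visible and, if so, which jar it came from.

// Minimal sketch: probe the current classpath for an SLF4J binding.
// If this prints "not found", no binding jar (slf4j-log4j12, logback-classic,
// slf4j-simple, ...) is visible to this classloader.
object Slf4jBindingCheck {
  def main(args: Array[String]): Unit = {
    try {
      val cls = Class.forName("org.slf4j.impl.StaticLoggerBinder")
      // The CodeSource shows which jar the class was actually loaded from.
      val location = Option(cls.getProtectionDomain.getCodeSource).map(_.getLocation)
      println(s"StaticLoggerBinder found, loaded from: ${location.getOrElse("unknown")}")
    } catch {
      case _: ClassNotFoundException =>
        println("org.slf4j.impl.StaticLoggerBinder not found on this classpath")
    }
  }
}

Where the check is run matters: judging from the stack trace above, the class is missing while the Scalate sbt plugin is generating sources, i.e. inside sbt's own JVM, so the classpath to verify is sbt's build/plugin classpath rather than the project's runtime classpath.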
ASKER CERTIFIED SOLUTION
Joseph Jean pierre
(The accepted solution is only available to Experts Exchange members.)