sbt.compiler.EvalException: Type error in expression on dockerAliases #1353

Closed
dzlab opened this issue Jul 18, 2020 · 4 comments
dzlab commented Jul 18, 2020

Expected behaviour

sbt docker:publish or sbt docker:publishLocal generates a Docker image and publishes it to a private local registry

Actual behaviour

sbt.compiler.EvalException: Type error in expression on dockerAliases

$ sbt docker:publish
[info] Loading global plugins from /Users/dzlab/.sbt/0.13/plugins
[info] Loading project definition from /Users/dzlab/Projects/spark-k8s/project
/Users/dzlab/Projects/spark-k8s/build.sbt:39: error: not found: value dockerAliases
    dockerAliases ++= Seq(
    ^
sbt.compiler.EvalException: Type error in expression
[error] sbt.compiler.EvalException: Type error in expression
[error] Use 'last' for the full log.
Project loading failed: (r)etry, (q)uit, (l)ast, or (i)gnore? q

Information

Here is my build.sbt

val sparkVersion = "2.4.5"

scalaVersion in ThisBuild := "2.12.0"


val sparkLibs = Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion
)

// JAR build settings
lazy val commonSettings = Seq(
  organization := "dzlab",
  version := "0.1",
  scalaSource in Compile := baseDirectory.value / "src",
  scalaSource in Test := baseDirectory.value / "test",
  resourceDirectory in Test := baseDirectory.value / "test" / "resources",
  javacOptions ++= Seq(),
  scalacOptions ++= Seq(
    "-deprecation",
    "-feature",
    "-language:implicitConversions",
    "-language:postfixOps"
  ),
  libraryDependencies ++= sparkLibs
)

// Docker Image build settings
dockerBaseImage := "gcr.io/spark-operator/spark:v" + sparkVersion

lazy val root = (project in file("."))
  .enablePlugins(
    DockerPlugin,
    JavaAppPackaging
  )
  .settings(
    name := "spark-k8s",
    commonSettings,
    dockerAliases ++= Seq(
      dockerAlias.value.withRegistryHost(Some("localhost:5000"))
    ),
    mainClass in (Compile, run) := Some("dzlab.SparkJob")
  )
  • What sbt-native-packager version are you using? addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.0.0")
  • What sbt version? 0.13.18
  • What is your build system (e.g. Ubuntu, MacOS, Windows, Debian)? MacOS
  • What package are you building (e.g. docker, rpm, ...)? docker
  • What version has your build tool (find out with e.g. rpm --version)?
  • What is your target system (e.g. Ubuntu 16.04, CentOS 7)?

dzlab commented Jul 19, 2020

I was using an old version of the packager in which dockerAliases didn't exist yet. I updated to 1.7.2 and the error disappeared.
Now I can build the image, but it does not look like the image is available on the registry! (I'm using a minikube registry.)
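The fix above amounts to bumping the plugin dependency in project/plugins.sbt; a minimal sketch, assuming the standard plugin coordinates:

```scala
// project/plugins.sbt
// dockerAliases does not exist in sbt-native-packager 1.0.0, which is what
// produced the "not found: value dockerAliases" error during project loading.
// Updating to a recent release (1.7.2 here, per the comment above) makes the
// setting available once DockerPlugin is enabled.
addSbtPlugin("com.typesafe.sbt" % "sbt-native-packager" % "1.7.2")
```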

muuki88 added the docker label Jul 20, 2020

muuki88 commented Jul 20, 2020

Thanks for your question 😃 Your configuration looks solid.

Did you try docker:publishLocal? This should push the image to your local Docker registry.
That said, you are already setting the alias to localhost:5000, which I find a bit odd, but I have no
real production experience with Docker. So 🤷‍♂️ 😂

Also, the output of the sbt run would be interesting.
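For reference, the two tasks behave differently; a rough sketch of the distinction, assuming a registry is already listening on localhost:5000:

```shell
# docker:publishLocal builds the image and tags it in the local Docker
# daemon only -- nothing is pushed to any registry.
sbt docker:publishLocal

# docker:publish builds, tags, and runs `docker push` for each configured
# alias, including the localhost:5000/spark-k8s:0.1 alias from build.sbt.
sbt docker:publish

# The registry's HTTP API can confirm whether the push actually landed.
curl http://localhost:5000/v2/_catalog
```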


dzlab commented Jul 20, 2020

@muuki88 thanks for confirming that the build.sbt looks good. I think the problem is with my registry configuration: the image seems to be built properly (I can run it), but it is not published to the registry.

sbt docker:publishLocal
[info] Loading global plugins from /Users/dzlab/.sbt/0.13/plugins
[info] Loading project definition from /Users/dzlab/Projects/spark-k8s/project
[info] Set current project to spark-k8s (in build file:/Users/dzlab/Projects/spark-k8s/)
[info] Wrote /Users/dzlab/Projects/spark-k8s/target/scala-2.12/spark-k8s_2.12-0.1.pom
[warn] [1] There are no exposed ports for your docker image
[warn]  Configure the `dockerExposedPorts` or `dockerExposedUdpPorts` setting. E.g.
[warn] 
[warn]  // standard tcp ports
[warn]  dockerExposedPorts ++= Seq(9000, 9001)
[warn] 
[warn]  // for udp ports
[warn]  dockerExposedUdpPorts += 4444
[warn]           
[success] All package validations passed
[info] Sending build context to Docker daemon  103.9MB
[info] Step 1/19 : FROM gcr.io/spark-operator/spark:v2.4.5 as stage0
[info]  ---> 775e46820946
[info] Step 2/19 : LABEL snp-multi-stage="intermediate"
[info]  ---> Running in bbc5b5b8e7d1
[info] Removing intermediate container bbc5b5b8e7d1
[info]  ---> 39620874f5ad
[info] Step 3/19 : LABEL snp-multi-stage-id="28f7a183-ff2d-49a9-a7f4-8bf0922cc14a"
[info]  ---> Running in 52724730b27f
[info] Removing intermediate container 52724730b27f
[info]  ---> f09c6c35fe2e
[info] Step 4/19 : WORKDIR /opt/docker
[info]  ---> Running in 57917f47244e
[info] Removing intermediate container 57917f47244e
[info]  ---> ddd5bf51b6f8
[info] Step 5/19 : COPY 1/opt /1/opt
[info]  ---> 89f9087f61e1
[info] Step 6/19 : COPY 2/opt /2/opt
[info]  ---> 2ebbff0a92f1
[info] Step 7/19 : USER root
[info]  ---> Running in c978e3160b49
[info] Removing intermediate container c978e3160b49
[info]  ---> 16a0a7467ee6
[info] Step 8/19 : RUN ["chmod", "-R", "u=rX,g=rX", "/1/opt/docker"]
[info]  ---> Running in eb2c169a5984
[info] Removing intermediate container eb2c169a5984
[info]  ---> fb2828e0e3e5
[info] Step 9/19 : RUN ["chmod", "-R", "u=rX,g=rX", "/2/opt/docker"]
[info]  ---> Running in 7f135d6c8b47
[info] Removing intermediate container 7f135d6c8b47
[info]  ---> 1954043d6805
[info] Step 10/19 : RUN ["chmod", "u+x,g+x", "/1/opt/docker/bin/spark-k8s"]
[info]  ---> Running in 1973d6e1ce5c
[info] Removing intermediate container 1973d6e1ce5c
[info]  ---> 222cfb1efd3d
[info] Step 11/19 : FROM gcr.io/spark-operator/spark:v2.4.5 as mainstage
[info]  ---> 775e46820946
[info] Step 12/19 : USER root
[info]  ---> Running in a40600f3498c
[info] Removing intermediate container a40600f3498c
[info]  ---> a86cc4e1fed5
[info] Step 13/19 : RUN id -u demiourgos728 1>/dev/null 2>&1 || (( getent group 0 1>/dev/null 2>&1 || ( type groupadd 1>/dev/null 2>&1 && groupadd -g 0 root || addgroup -g 0 -S root )) && ( type useradd 1>/dev/null 2>&1 && useradd --system --create-home --uid 1001 --gid 0 demiourgos728 || adduser -S -u 1001 -G root demiourgos728 ))
[info]  ---> Running in 678ff1b1a486
[info] Removing intermediate container 678ff1b1a486
[info]  ---> 410b180d0973
[info] Step 14/19 : WORKDIR /opt/docker
[info]  ---> Running in 88bc2ff2a37d
[info] Removing intermediate container 88bc2ff2a37d
[info]  ---> 6eab9db7dc8a
[info] Step 15/19 : COPY --from=stage0 --chown=demiourgos728:root /1/opt/docker /opt/docker
[info]  ---> 2322c55c1b43
[info] Step 16/19 : COPY --from=stage0 --chown=demiourgos728:root /2/opt/docker /opt/docker
[info]  ---> 34f27fe0790d
[info] Step 17/19 : USER 1001:0
[info]  ---> Running in 5003593f3696
[info] Removing intermediate container 5003593f3696
[info]  ---> acaa9a126038
[info] Step 18/19 : ENTRYPOINT ["/opt/docker/bin/spark-k8s"]
[info]  ---> Running in 9c78002cfa36
[info] Removing intermediate container 9c78002cfa36
[info]  ---> eb79e083769b
[info] Step 19/19 : CMD []
[info]  ---> Running in e1c068a7d142
[info] Removing intermediate container e1c068a7d142
[info]  ---> 684766b9e996
[info] Successfully built 684766b9e996
[info] Successfully tagged spark-k8s:0.1
[info] Successfully tagged localhost:5000/spark-k8s:0.1
[info] Removing intermediate image(s) (labeled "snp-multi-stage-id=28f7a183-ff2d-49a9-a7f4-8bf0922cc14a") 
[info] Deleted Images:
[info] deleted: sha256:222cfb1efd3d9d4b7f51c3367b0622f43ef88f30150cc62b95e5770926240a53
[info] deleted: sha256:612a0fa67059f9ab6cb34bc0b9cf59025b7d6470dd326b1747599610dfb2dbe5
[info] deleted: sha256:1954043d68051ff59b260890e0fa3f42914ea06aaa0fda99715fbca8d5936b8c
[info] deleted: sha256:7aa225573b8863caf132d98142d4209f562e6c77a9b84a42a009647aeee64f15
[info] deleted: sha256:fb2828e0e3e57879651a71151d2965aed43c8e0f43afd4c4e3646812450fd7f8
[info] deleted: sha256:ae7e0b06ff423029c1940557aed93bba09547df3553fb80534de09aa3efe199a
[info] deleted: sha256:16a0a7467ee6f039de96ab4da6f2261a17d55b08989fe607224207392447d75b
[info] deleted: sha256:2ebbff0a92f14fedd9baf02ae157a391e4d78e124519c855d342d441390b93ef
[info] deleted: sha256:0d62c91bf58ed079dc4f4752d1657691c040b36b81db2f661e849fbb20fa8e7a
[info] deleted: sha256:89f9087f61e1968fb4d0521de0a23d7b55deefe81ce69d8e660b9f0d834d8e37
[info] deleted: sha256:aea6cf1ed953e8fcddf08be76cf637aabdc86663f0328596bcf6c3f0f70087a2
[info] deleted: sha256:ddd5bf51b6f8a941fccb81c49bb497f027fffa5bc53a6c3d452f010dc29f8bc6
[info] deleted: sha256:d60fb71c699434417efefef2e731a80f266316ee4292e291b46be3b426b18d6d
[info] deleted: sha256:f09c6c35fe2e5d98d489d58f1fecae5a9b4c2308e701c7049540bd6a5beb6ee0
[info] deleted: sha256:39620874f5adefe3ffdfd73a7572a043fa0b247ab496d7fd18b079f88a7e3a54
[info] Total reclaimed space: 207.6MB
[info] Built image spark-k8s with tags [0.1, 0.1]
[success] Total time: 38 s, completed Jul 20, 2020 12:04:32 PM
$ docker images
REPOSITORY                                     TAG                   IMAGE ID            CREATED             SIZE
spark-k8s                                      0.1                   684766b9e996        2 minutes ago       658MB

I will investigate the registry configuration and let you know.
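Since the comment above mentions a minikube registry, one hedged sketch of wiring it up so that a push to localhost:5000 reaches the cluster (assuming the minikube registry addon and a default kube-system namespace):

```shell
# Enable minikube's built-in registry addon, which runs a registry
# inside the cluster.
minikube addons enable registry

# Forward the in-cluster registry service to localhost:5000 so that
# `docker push localhost:5000/...` from the host reaches it.
kubectl port-forward --namespace kube-system service/registry 5000:80 &

# Publish, then verify the repository shows up in the registry catalog.
sbt docker:publish
curl http://localhost:5000/v2/_catalog
```

Without a port-forward (or an equivalent tunnel), the push succeeds against whatever is on the host's port 5000, or fails silently from the cluster's point of view, which matches the symptom described here.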


dzlab commented Jul 26, 2020

I got this working; it was an issue with the registry setup.
