Issue at compilation time when required with dep #417

Closed
seblegall opened this issue May 23, 2018 · 28 comments
Labels
lifecycle/rotten Denotes an issue or PR that has aged beyond stale and will be auto-closed.

Comments

@seblegall

Hi,

It seems that there is a compilation error when client-go is required via dep.

The dep ensure command goes well, but go build then outputs this error:

# github.com/seblegall/test-k8s-go-client/vendor/k8s.io/client-go/discovery
vendor/k8s.io/client-go/discovery/restmapper.go:42:75: undefined: meta.VersionInterfacesFunc
vendor/k8s.io/client-go/discovery/restmapper.go:176:19: undefined: meta.VersionInterfacesFunc

Steps to reproduce this error:

I originally had this trouble when trying to do a dep update on a project that uses this dependency: https://github.com/Meetic/blackbeard

I'm using: go version go1.9.2 darwin/amd64

dep version output (I'm using dep built from source and I'm up to date with master):

dep:
 version     : devel
 build date  :
 git hash    :
 go version  : go1.9.2
 go compiler : gc
 platform    : darwin/amd64
 features    : ImportDuringSolve=false
@seblegall changed the title from "Issue at compilation when required with dep" to "Issue at compilation time when required with dep" on May 23, 2018
@liggitt
Member

liggitt commented May 23, 2018

what branches are you using?

@seblegall
Author

My Gopkg.toml file contains:

[[constraint]]
  name = "k8s.io/client-go"
  version = "7.0.0"

The Gopkg.lock file contains:

[[projects]]
  name = "k8s.io/client-go"
  packages = [
    "discovery",
    "kubernetes",
    "kubernetes/scheme",
    "kubernetes/typed/admissionregistration/v1alpha1",
    "kubernetes/typed/admissionregistration/v1beta1",
    "kubernetes/typed/apps/v1",
    "kubernetes/typed/apps/v1beta1",
    "kubernetes/typed/apps/v1beta2",
    "kubernetes/typed/authentication/v1",
    "kubernetes/typed/authentication/v1beta1",
    "kubernetes/typed/authorization/v1",
    "kubernetes/typed/authorization/v1beta1",
    "kubernetes/typed/autoscaling/v1",
    "kubernetes/typed/autoscaling/v2beta1",
    "kubernetes/typed/batch/v1",
    "kubernetes/typed/batch/v1beta1",
    "kubernetes/typed/batch/v2alpha1",
    "kubernetes/typed/certificates/v1beta1",
    "kubernetes/typed/core/v1",
    "kubernetes/typed/events/v1beta1",
    "kubernetes/typed/extensions/v1beta1",
    "kubernetes/typed/networking/v1",
    "kubernetes/typed/policy/v1beta1",
    "kubernetes/typed/rbac/v1",
    "kubernetes/typed/rbac/v1alpha1",
    "kubernetes/typed/rbac/v1beta1",
    "kubernetes/typed/scheduling/v1alpha1",
    "kubernetes/typed/settings/v1alpha1",
    "kubernetes/typed/storage/v1",
    "kubernetes/typed/storage/v1alpha1",
    "kubernetes/typed/storage/v1beta1",
    "pkg/apis/clientauthentication",
    "pkg/apis/clientauthentication/v1alpha1",
    "pkg/version",
    "plugin/pkg/client/auth/exec",
    "rest",
    "rest/watch",
    "tools/auth",
    "tools/clientcmd",
    "tools/clientcmd/api",
    "tools/clientcmd/api/latest",
    "tools/clientcmd/api/v1",
    "tools/metrics",
    "tools/reference",
    "transport",
    "util/cert",
    "util/flowcontrol",
    "util/homedir",
    "util/integer"
  ]
  revision = "23781f4d6632d88e869066eaebb743857aa1ef9b"
  version = "v7.0.0"

So, I'm using the v7.0.0 tag.

@liggitt
Member

liggitt commented May 23, 2018

client-go v7.0.0 depends on k8s.io/apimachinery 31dade610c053669d8054bfd847da657251e8c1a:

https://github.com/kubernetes/client-go/blob/v7.0.0/Godeps/Godeps.json#L373-L376

The referenced type exists at that revision: https://github.com/kubernetes/apimachinery/blob/31dade610c053669d8054bfd847da657251e8c1a/pkg/api/meta/restmapper.go#L90-L92
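
As a rough, untested sketch (not something dep generates on its own), a consumer's Gopkg.toml could pin apimachinery to that exact revision with an override:

[[override]]
  name = "k8s.io/apimachinery"
  revision = "31dade610c053669d8054bfd847da657251e8c1a"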

@sttts
Contributor

sttts commented May 23, 2018

With dep you have to pin all dependencies (especially those under the k8s.io org) to the right versions. For client-go 7.0.0 use the kubernetes-1.10.0 tags on the other k8s.io repos.
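
For example, a minimal Gopkg.toml sketch along those lines (untested; assuming client-go 7.0.0 and the matching kubernetes-1.10.0 tags, adjust to the release you target):

[[constraint]]
  name = "k8s.io/client-go"
  version = "7.0.0"

[[constraint]]
  name = "k8s.io/api"
  version = "kubernetes-1.10.0"

[[constraint]]
  name = "k8s.io/apimachinery"
  version = "kubernetes-1.10.0"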

@seblegall
Author

@sttts It works when specifying the version instead of a branch in the Gopkg.toml.

However, isn't it weird that the default use case with dep doesn't work?

I'm not a dep expert, but couldn't we find a way to make client-go explicitly require version kubernetes-1.10.0 of apimachinery?

@sttts
Contributor

sttts commented May 29, 2018

@seblegall compare kubernetes/publishing-bot#55. It looks like dep's SAT solver falls over. So far, though, nobody has worked out whether there is a mistake in our (experimentally) shipped Gopkg.toml files or whether dep just explodes.

@seblegall
Author

@sttts Well, did you also try to migrate to vgo? 🤣

@sttts
Contributor

sttts commented May 29, 2018

Well, did you also try to migrate to vgo?

I think nobody dared to try that yet for the whole kube. Maybe just for client-go and friends it's feasible.

@seblegall
Author

I may take some time in the next few days to test vgo on my own project (which requires client-go), and then maybe on client-go itself. I'll keep you updated.

@ilackarms

also running into this issue with dep

@aalubin

aalubin commented Jun 3, 2018

+1

@cheynewallace

Also having this issue. Can anyone suggest a fix? I'm blocked right now.

@seblegall
Author

You have to put this explicitly in your Gopkg.toml file:

[[constraint]]
  name = "k8s.io/api"
  version = "kubernetes-1.9.0"

[[constraint]]
  name = "k8s.io/apimachinery"
  version = "kubernetes-1.9.0"

[[constraint]]
  name = "k8s.io/client-go"
  version = "6.0.0"

In this example I have set the version to kubernetes-1.9.0 because I'm using the client-go version 6.0.0. But you should put the right version of k8s.io/* packages depending on the client-go you are using.

@aalubin

aalubin commented Jun 5, 2018

@cheynewallace setting the client-go to reference master solves the problem as well:

[[constraint]]
  name = "k8s.io/client-go"
  branch = "master"

@sttts
Contributor

sttts commented Jun 5, 2018

setting the client-go to reference master solves the problem as well:

Not a good idea. We have no guarantees whatsoever about compatibility with former or future Kube versions.

@aalubin

aalubin commented Jun 5, 2018

@sttts, you are absolutely right. I should have added a warning for this workaround.

@cheynewallace

Thanks @seblegall, that one solved it for me.

@sigma
Contributor

sigma commented Jun 8, 2018

I think nobody dared to try that yet for the whole kube. Maybe just for client-go and friends it's feasible.

Actually, I did try, and indeed I ended up in a situation where all staging repositories were behaving, but no solution could be found for the main repo.
As far as I can tell, the combination of the excessive v0.0.0-* fake versions (for what are really v2+ releases) and the dependency cycles in the implicit vgo modules was deadly.

@sttts
Contributor

sttts commented Jun 11, 2018

@sigma I'm curious, can you upload your experiments to a branch?

@nikhita
Member

nikhita commented Jul 5, 2018

vgo and kubernetes: kubernetes/kubernetes#65683

@idealhack
Member

About dep, I found that it works well to use the same tag for client-go, api and apimachinery. Could this be the recommended way when pinning dependencies?

ref: kubernetes/api#5 (comment)
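
For example, a sketch of that approach (untested; assuming the kubernetes-1.10.0 tags, which all three repos carry):

[[constraint]]
  name = "k8s.io/client-go"
  version = "kubernetes-1.10.0"

[[constraint]]
  name = "k8s.io/api"
  version = "kubernetes-1.10.0"

[[constraint]]
  name = "k8s.io/apimachinery"
  version = "kubernetes-1.10.0"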

@sttts
Contributor

sttts commented Jul 10, 2018

Could this be the recommended way when pinning dependencies?

Yes, that's recommended.

@fejta-bot

Issues go stale after 90d of inactivity.
Mark the issue as fresh with /remove-lifecycle stale.
Stale issues rot after an additional 30d of inactivity and eventually close.

If this issue is safe to close now please do so with /close.

Send feedback to sig-testing, kubernetes/test-infra and/or fejta.
/lifecycle stale

@k8s-ci-robot added the lifecycle/stale label on Oct 8, 2018
@808codist

@seblegall wrote:

you should put the right version of k8s.io/* packages depending on the client-go you are using

How does one go about finding the "right version of k8s.io/*", given a certain client-go version, e.g. v9.0.0?

@ash2k
Member

ash2k commented Oct 31, 2018

@808codist Give this tool a try: https://github.com/ash2k/kubegodep2dep

@fejta-bot

Stale issues rot after 30d of inactivity.
Mark the issue as fresh with /remove-lifecycle rotten.
Rotten issues close after an additional 30d of inactivity.

If this issue is safe to close now please do so with /close.

Send feedback to sig-testing, kubernetes/test-infra and/or fejta.
/lifecycle rotten

@k8s-ci-robot added the lifecycle/rotten label and removed the lifecycle/stale label on Nov 30, 2018
@fejta-bot

Rotten issues close after 30d of inactivity.
Reopen the issue with /reopen.
Mark the issue as fresh with /remove-lifecycle rotten.

Send feedback to sig-testing, kubernetes/test-infra and/or fejta.
/close

@k8s-ci-robot
Contributor

@fejta-bot: Closing this issue.

In response to this:

Rotten issues close after 30d of inactivity.
Reopen the issue with /reopen.
Mark the issue as fresh with /remove-lifecycle rotten.

Send feedback to sig-testing, kubernetes/test-infra and/or fejta.
/close

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.
