
Unable Xray an target image #693

Open
gnought opened this issue Jun 13, 2024 · 13 comments
gnought commented Jun 13, 2024

Expected Behavior

Running slim x nginx:latest --debug --verbose should produce xray reports.


Actual Behavior

It fails with an API error (500):

cmd=xray state=started
cmd=xray info=params target='nginx:latest' add-image-manifest='false' add-image-config='false' rm-file-artifacts='false'
cmd=xray state=image.api.inspection.start
cmd=xray info=image size.human='68 MB' architecture='arm64' id='sha256:0f04e4f646a3f14bf31d8bc8d885b6c951fdcf42589d06845f64d18aec6a3c4d' size.bytes='67668418'
cmd=xray info=image.stack index='0' name='nginx:latest' id='sha256:0f04e4f646a3f14bf31d8bc8d885b6c951fdcf42589d06845f64d18aec6a3c4d' instructions='17' message='see report file for details'
cmd=xray info=image.exposed_ports list='80/tcp'
cmd=xray state=image.api.inspection.done
cmd=xray state=image.data.inspection.start
cmd=xray info=image.data.inspection.save.image.start
time="2024-06-13T21:03:44+08:00" level=error msg="dockerutil.SaveImage: dclient.ExportImage() error = API error (500): invalid repository name (0f04e4f646a3f14bf31d8bc8d885b6c951fdcf42589d06845f64d18aec6a3c4d), cannot specify 64-byte hexadecimal strings"
time="2024-06-13T21:03:44+08:00" level=fatal msg="slim: failure" error="API error (500): invalid repository name (0f04e4f646a3f14bf31d8bc8d885b6c951fdcf42589d06845f64d18aec6a3c4d), cannot specify 64-byte hexadecimal strings" stack="goroutine 1 [running]:\nruntime/debug.Stack()\n\truntime/debug/stack.go:24 +0x64\ngithub.com/slimtoolkit/slim/pkg/util/errutil.FailOn({0x102b5ff98, 0x140005220a8})\n\tgithub.com/slimtoolkit/slim/pkg/util/errutil/errutil.go:32 +0x38\ngithub.com/slimtoolkit/slim/pkg/app/master/command/xray.OnCommand(0x140004c0d68, 0x140004c8480, 0x140004c0d88, {0x16f252d37, 0xc}, 0x1, {0x0, 0x0}, {0x0, 0x0}, ...)\n\tgithub.com/slimtoolkit/slim/pkg/app/master/command/xray/handler.go:374 +0x21e0\ngithub.com/slimtoolkit/slim/pkg/app/master/command/xray.glob..func1(0x140000bfa80)\n\tgithub.com/slimtoolkit/slim/pkg/app/master/command/xray/cli.go:337 +0x1710\ngithub.com/urfave/cli/v2.(*Command).Run(0x103c731c0, 0x140000beac0)\n\tgithub.com/urfave/cli/v2@v2.3.0/command.go:163 +0x4a8\ngithub.com/urfave/cli/v2.(*App).RunContext(0x140002c51e0, {0x102b7e098?, 0x103cf1e60}, {0x14000116050, 0x5, 0x5})\n\tgithub.com/urfave/cli/v2@v2.3.0/app.go:313 +0x808\ngithub.com/urfave/cli/v2.(*App).Run(...)\n\tgithub.com/urfave/cli/v2@v2.3.0/app.go:224\ngithub.com/slimtoolkit/slim/pkg/app/master.Run()\n\tgithub.com/slimtoolkit/slim/pkg/app/master/app.go:15 +0x4c\nmain.main()\n\tgithub.com/slimtoolkit/slim/cmd/slim/main.go:15 +0x194\n" version="darwin/arm64|Transformer|1.40.11|latest|latest"

Specifications

  • Version: slim version darwin/arm64|Transformer|1.40.11|latest|latest
  • Platform: macOS 14.5
  • Docker info
$ docker info
Client:
Version:    26.1.3
Context:    orbstack
Debug Mode: false
Plugins:
buildx: Docker Buildx (Docker Inc.)
  Version:  v0.14.1
  Path:     /Users/devlocal/.docker/cli-plugins/docker-buildx
compose: Docker Compose (Docker Inc.)
  Version:  v2.27.1
  Path:     /Users/devlocal/.docker/cli-plugins/docker-compose

Server:
Containers: 4
Running: 1
Paused: 0
Stopped: 3
Images: 51
Server Version: 26.1.4
Storage Driver: overlayfs
driver-type: io.containerd.snapshotter.v1
Logging Driver: json-file
Cgroup Driver: cgroupfs
Cgroup Version: 2
Plugins:
Volume: local
Network: bridge host ipvlan macvlan null overlay
Log: awslogs fluentd gcplogs gelf journald json-file local splunk syslog
Swarm: inactive
Runtimes: io.containerd.runc.v2 runc
Default Runtime: runc
Init Binary: docker-init
containerd version: 3a4de459a68952ffb703bbe7f2290861a75b6b67
runc version: 51d5e94601ceffbbd85688df1c928ecccbfa4685
init version: de40ad0
Security Options:
seccomp
 Profile: builtin
cgroupns
Kernel Version: 6.9.3-orbstack-00146-g1a8d02c90788
Operating System: OrbStack
OSType: linux
Architecture: aarch64
CPUs: 12
Total Memory: 15.59GiB
Name: orbstack
ID: ed3c0974-9b58-465e-906c-ea982b257a80
Docker Root Dir: /var/lib/docker
Debug Mode: false
Experimental: true
Insecure Registries:
127.0.0.0/8
Live Restore Enabled: false
Product License: Community Engine
Default Address Pools:
 Base: 192.168.215.0/24, Size: 24
 Base: 192.168.228.0/24, Size: 24
 Base: 192.168.247.0/24, Size: 24
 Base: 192.168.207.0/24, Size: 24
 Base: 192.168.167.0/24, Size: 24
 Base: 192.168.107.0/24, Size: 24
 Base: 192.168.237.0/24, Size: 24
 Base: 192.168.148.0/24, Size: 24
 Base: 192.168.214.0/24, Size: 24
 Base: 192.168.165.0/24, Size: 24
 Base: 192.168.227.0/24, Size: 24
 Base: 192.168.181.0/24, Size: 24
 Base: 192.168.158.0/24, Size: 24
 Base: 192.168.117.0/24, Size: 24
 Base: 192.168.155.0/24, Size: 24
 Base: 192.168.147.0/24, Size: 24
 Base: 192.168.229.0/24, Size: 24
 Base: 192.168.183.0/24, Size: 24
 Base: 192.168.156.0/24, Size: 24
 Base: 192.168.97.0/24, Size: 24
 Base: 192.168.171.0/24, Size: 24
 Base: 192.168.186.0/24, Size: 24
 Base: 192.168.216.0/24, Size: 24
 Base: 192.168.242.0/24, Size: 24
 Base: 192.168.166.0/24, Size: 24
 Base: 192.168.239.0/24, Size: 24
 Base: 192.168.223.0/24, Size: 24
 Base: 192.168.164.0/24, Size: 24
 Base: 192.168.163.0/24, Size: 24
 Base: 192.168.172.0/24, Size: 24
 Base: 172.17.0.0/16, Size: 16
 Base: 172.18.0.0/16, Size: 16
 Base: 172.19.0.0/16, Size: 16
 Base: 172.20.0.0/14, Size: 16
 Base: 172.24.0.0/14, Size: 16
 Base: 172.28.0.0/14, Size: 16
kcq commented Jun 19, 2024

@gnought are you using Orbstack instead of Docker Desktop?

gnought commented Jun 20, 2024

@kcq yes, you are right.

kcq commented Jun 21, 2024

@gnought Still need to verify this... but it looks straightforward to fix. It appears OrbStack rejects image IDs that are missing the sha256: prefix, at least in some of the Docker API calls. Will need your help verifying the fix.
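To illustrate the kind of fix being described, here is a minimal sketch with a hypothetical normalizeImageID helper (not the actual slim code): a bare 64-character hex ID gets the sha256: prefix added before it is passed to Docker API calls that would otherwise reject it as an invalid repository name.

```go
package main

import (
	"fmt"
	"regexp"
)

// bareHexID matches a bare 64-character hex image ID without the "sha256:" prefix.
var bareHexID = regexp.MustCompile(`^[0-9a-f]{64}$`)

// normalizeImageID is a hypothetical helper illustrating the fix:
// ensure the image ID carries the "sha256:" prefix; tags and
// already-prefixed IDs pass through unchanged.
func normalizeImageID(ref string) string {
	if bareHexID.MatchString(ref) {
		return "sha256:" + ref
	}
	return ref
}

func main() {
	fmt.Println(normalizeImageID("0f04e4f646a3f14bf31d8bc8d885b6c951fdcf42589d06845f64d18aec6a3c4d"))
	// → sha256:0f04e4f646a3f14bf31d8bc8d885b6c951fdcf42589d06845f64d18aec6a3c4d
	fmt.Println(normalizeImageID("nginx:latest")) // → nginx:latest
}
```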

gnought commented Jun 22, 2024

@kcq sure, tag me and let me know how to test the fix. :)

kcq commented Jun 24, 2024

@gnought try this new release and see if you get different results with it

gnought commented Jun 25, 2024

Hi @kcq, I get a similar error with v1.41.4:

gnought:~/work/bin/dist_mac_m1 ?1 % ./docker-slim --version
mint version darwin/arm64|Aurora|1.41.4|b59472b8cf16b26575df9960d8ede8f6aa4fb6a9|2024-06-23_08:28:41AM
gnought:~/work/bin/dist_mac_m1 √ % ./docker-slim xray nginx
cmd=xray state=started
cmd=xray info=cmd.input.params add-image-manifest='false' add-image-config='false' rm-file-artifacts='false' target='nginx'
cmd=xray state=image.api.inspection.start
cmd=xray info=image size.human='68 MB' architecture='arm64' id='sha256:9c367186df9a6b18c6735357b8eb7f407347e84aea09beb184961cb83543d46e' size.bytes='67669989'
cmd=xray info=image.stack message='see report file for details' index='0' name='nginx:latest' id='sha256:9c367186df9a6b18c6735357b8eb7f407347e84aea09beb184961cb83543d46e' instructions='17'
cmd=xray info=image.exposed_ports list='80/tcp'
cmd=xray state=image.api.inspection.done
cmd=xray state=image.data.inspection.start
cmd=xray info=image.data.inspection.save.image.start
time="2024-06-25T11:43:29+08:00" level=error msg="dockerutil.SaveImage: dclient.ExportImage() error = API error (500): invalid repository name (9c367186df9a6b18c6735357b8eb7f407347e84aea09beb184961cb83543d46e), cannot specify 64-byte hexadecimal strings"
time="2024-06-25T11:43:29+08:00" level=fatal msg="slim: failure" error="API error (500): invalid repository name (9c367186df9a6b18c6735357b8eb7f407347e84aea09beb184961cb83543d46e), cannot specify 64-byte hexadecimal strings" stack="goroutine 1 [running]:\nruntime/debug.Stack()\n\truntime/debug/stack.go:24 +0x64\ngithub.com/mintoolkit/mint/pkg/util/errutil.FailOn({0x1042f2718, 0x140005d5fc8})\n\tgithub.com/mintoolkit/mint/pkg/util/errutil/errutil.go:32 +0x38\ngithub.com/mintoolkit/mint/pkg/app/master/command/xray.OnCommand(0x140005beaa8, 0x1400037a000, 0x140005beac8, {0x16dd5ad1b, 0x5}, 0x1, {0x0, 0x0}, {0x0, 0x0}, ...)\n\tgithub.com/mintoolkit/mint/pkg/app/master/command/xray/handler.go:375 +0x220c\ngithub.com/mintoolkit/mint/pkg/app/master/command/xray.glob..func1(0x140004d8740)\n\tgithub.com/mintoolkit/mint/pkg/app/master/command/xray/cli.go:340 +0x1754\ngithub.com/urfave/cli/v2.(*Command).Run(0x105574480, 0x140004d8740, {0x14000400960, 0x2, 0x2})\n\tgithub.com/urfave/cli/v2@v2.27.1/command.go:279 +0x754\ngithub.com/urfave/cli/v2.(*Command).Run(0x140004fc160, 0x140004d8140, {0x140001161b0, 0x3, 0x3})\n\tgithub.com/urfave/cli/v2@v2.27.1/command.go:272 +0x964\ngithub.com/urfave/cli/v2.(*App).RunContext(0x140004e4200, {0x104314b48?, 0x1055f6240}, {0x140001161b0, 0x3, 0x3})\n\tgithub.com/urfave/cli/v2@v2.27.1/app.go:337 +0x534\ngithub.com/urfave/cli/v2.(*App).Run(...)\n\tgithub.com/urfave/cli/v2@v2.27.1/app.go:311\ngithub.com/mintoolkit/mint/pkg/app/master.Run()\n\tgithub.com/mintoolkit/mint/pkg/app/master/app.go:15 +0x4c\nmain.main()\n\tgithub.com/mintoolkit/mint/cmd/mint/main.go:15 +0x194\n" version="darwin/arm64|Aurora|1.41.4|b59472b8cf16b26575df9960d8ede8f6aa4fb6a9|2024-06-23_08:28:41AM"
gnought:~/work/bin/dist_mac_m1 ?1 %

gnought commented Jun 25, 2024

The error comes from
https://github.com/mintoolkit/mint/blob/b59472b8cf16b26575df9960d8ede8f6aa4fb6a9/pkg/docker/dockerutil/dockerutil.go#L313
It looks like it uses the github.com/fsouza/go-dockerclient library to talk to the Docker API, so the error originates upstream.
Is it time to switch to the official Docker client, https://pkg.go.dev/github.com/docker/docker/client#readme-go-client-for-the-docker-engine-api ?

kcq commented Jun 25, 2024

Thanks for checking, @gnought! It's odd that the fix got lost... need to restore it.

kcq commented Jul 1, 2024

@gnought try this new version

gnought commented Jul 2, 2024

Hi @kcq,
the 500 API error is gone, but a new "invalid tar header" error occurs:

gnought:~/tmp/dist_mac_m1 √ % ./slim --version
mint version darwin/arm64|Aurora|1.41.5|4cc2b185c9cdcd6d0586246e21d8aecc5d847feb|2024-07-01_04:36:47AM
gnought:~/tmp/dist_mac_m1 √ % ./slim x nginx:latest --debug --verbose
cmd=xray state=started
cmd=xray info=cmd.input.params add-image-config='false' rm-file-artifacts='false' target='nginx:latest' add-image-manifest='false'
cmd=xray state=image.api.inspection.start
cmd=xray info=image id='sha256:1aaa8180df68200fd41f9066cf62155e3b71183c04b2895a7388d5fd84ef3c8b' size.bytes='67669989' size.human='68 MB' architecture='arm64'
cmd=xray info=image.stack name='nginx:latest' id='sha256:1aaa8180df68200fd41f9066cf62155e3b71183c04b2895a7388d5fd84ef3c8b' instructions='17' message='see report file for details' index='0'
cmd=xray info=image.exposed_ports list='80/tcp'
cmd=xray state=image.api.inspection.done
cmd=xray state=image.data.inspection.start
cmd=xray info=image.data.inspection.save.image.start
cmd=xray info=image.data.inspection.save.image.end
cmd=xray info=image.data.inspection.process.image.start
time="2024-07-03T01:08:22+08:00" level=error msg="layerFromStream: error reading layer(3239ea84a00d153b5b8f81b47548275701d325537ee211a15719a09589e90ec0) - archive/tar: invalid tar header"
time="2024-07-03T01:08:22+08:00" level=error msg="dockerimage.LoadPackage: error reading oci layer from archive(/Users/gnought/tmp/dist_mac_m1/.mint-state/images/1aaa8180df68200fd41f9066cf62155e3b71183c04b2895a7388d5fd84ef3c8b/image/1aaa8180df68200fd41f9066cf62155e3b71183c04b2895a7388d5fd84ef3c8b.tar/blobs/sha256/3239ea84a00d153b5b8f81b47548275701d325537ee211a15719a09589e90ec0) - archive/tar: invalid tar header"
time="2024-07-03T01:08:22+08:00" level=fatal msg="slim: failure" error="archive/tar: invalid tar header" stack="goroutine 1 [running]:\nruntime/debug.Stack()\n\truntime/debug/stack.go:24 +0x64\ngithub.com/mintoolkit/mint/pkg/util/errutil.FailOn({0x1023e9df8, 0x103650430})\n\tgithub.com/mintoolkit/mint/pkg/util/errutil/errutil.go:32 +0x38\ngithub.com/mintoolkit/mint/pkg/app/master/command/xray.OnCommand(0x140004d0aa8, 0x140001a3860, 0x140004d0ac8, {0x16fc62c39, 0xc}, 0x1, {0x0, 0x0}, {0x0, 0x0}, ...)\n\tgithub.com/mintoolkit/mint/pkg/app/master/command/xray/handler.go:412 +0x2398\ngithub.com/mintoolkit/mint/pkg/app/master/command/xray.glob..func1(0x14000614780)\n\tgithub.com/mintoolkit/mint/pkg/app/master/command/xray/cli.go:340 +0x1754\ngithub.com/urfave/cli/v2.(*Command).Run(0x1036704a0, 0x14000614780, {0x140006147c0, 0x4, 0x4})\n\tgithub.com/urfave/cli/v2@v2.27.1/command.go:279 +0x754\ngithub.com/urfave/cli/v2.(*Command).Run(0x14000626160, 0x14000614140, {0x140001d2000, 0x5, 0x5})\n\tgithub.com/urfave/cli/v2@v2.27.1/command.go:272 +0x964\ngithub.com/urfave/cli/v2.(*App).RunContext(0x1400061c200, {0x10240cbe8?, 0x1036f2280}, {0x140001d2000, 0x5, 0x5})\n\tgithub.com/urfave/cli/v2@v2.27.1/app.go:337 +0x534\ngithub.com/urfave/cli/v2.(*App).Run(...)\n\tgithub.com/urfave/cli/v2@v2.27.1/app.go:311\ngithub.com/mintoolkit/mint/pkg/app/master.Run()\n\tgithub.com/mintoolkit/mint/pkg/app/master/app.go:15 +0x4c\nmain.main()\n\tgithub.com/mintoolkit/mint/cmd/mint/main.go:15 +0x194\n" version="darwin/arm64|Aurora|1.41.5|4cc2b185c9cdcd6d0586246e21d8aecc5d847feb|2024-07-01_04:36:47AM"
gnought:~/tmp/dist_mac_m1 ?1 % ./mint x nginx:latest --debug --verbose
cmd=xray state=started
cmd=xray info=cmd.input.params target='nginx:latest' add-image-manifest='false' add-image-config='false' rm-file-artifacts='false'
cmd=xray state=image.api.inspection.start
cmd=xray info=image id='sha256:1aaa8180df68200fd41f9066cf62155e3b71183c04b2895a7388d5fd84ef3c8b' size.bytes='67669989' size.human='68 MB' architecture='arm64'
cmd=xray info=image.stack index='0' name='nginx:latest' id='sha256:1aaa8180df68200fd41f9066cf62155e3b71183c04b2895a7388d5fd84ef3c8b' instructions='17' message='see report file for details'
cmd=xray info=image.exposed_ports list='80/tcp'
cmd=xray state=image.api.inspection.done
cmd=xray state=image.data.inspection.start
cmd=xray info=image.data.inspection.process.image.start
time="2024-07-03T01:08:34+08:00" level=error msg="layerFromStream: error reading layer(3239ea84a00d153b5b8f81b47548275701d325537ee211a15719a09589e90ec0) - archive/tar: invalid tar header"
time="2024-07-03T01:08:34+08:00" level=error msg="dockerimage.LoadPackage: error reading oci layer from archive(/Users/gnought/tmp/dist_mac_m1/.mint-state/images/1aaa8180df68200fd41f9066cf62155e3b71183c04b2895a7388d5fd84ef3c8b/image/1aaa8180df68200fd41f9066cf62155e3b71183c04b2895a7388d5fd84ef3c8b.tar/blobs/sha256/3239ea84a00d153b5b8f81b47548275701d325537ee211a15719a09589e90ec0) - archive/tar: invalid tar header"
time="2024-07-03T01:08:34+08:00" level=fatal msg="slim: failure" error="archive/tar: invalid tar header" stack="goroutine 1 [running]:\nruntime/debug.Stack()\n\truntime/debug/stack.go:24 +0x64\ngithub.com/mintoolkit/mint/pkg/util/errutil.FailOn({0x103109df8, 0x104370430})\n\tgithub.com/mintoolkit/mint/pkg/util/errutil/errutil.go:32 +0x38\ngithub.com/mintoolkit/mint/pkg/app/master/command/xray.OnCommand(0x14000698aa8, 0x14000720460, 0x14000698ac8, {0x16ef42c39, 0xc}, 0x1, {0x0, 0x0}, {0x0, 0x0}, ...)\n\tgithub.com/mintoolkit/mint/pkg/app/master/command/xray/handler.go:412 +0x2398\ngithub.com/mintoolkit/mint/pkg/app/master/command/xray.glob..func1(0x140000a25c0)\n\tgithub.com/mintoolkit/mint/pkg/app/master/command/xray/cli.go:340 +0x1754\ngithub.com/urfave/cli/v2.(*Command).Run(0x1043904a0, 0x140000a25c0, {0x140000a2600, 0x4, 0x4})\n\tgithub.com/urfave/cli/v2@v2.27.1/command.go:279 +0x754\ngithub.com/urfave/cli/v2.(*Command).Run(0x140002adce0, 0x1400014fa80, {0x14000152000, 0x5, 0x5})\n\tgithub.com/urfave/cli/v2@v2.27.1/command.go:272 +0x964\ngithub.com/urfave/cli/v2.(*App).RunContext(0x14000522200, {0x10312cbe8?, 0x104412280}, {0x14000152000, 0x5, 0x5})\n\tgithub.com/urfave/cli/v2@v2.27.1/app.go:337 +0x534\ngithub.com/urfave/cli/v2.(*App).Run(...)\n\tgithub.com/urfave/cli/v2@v2.27.1/app.go:311\ngithub.com/mintoolkit/mint/pkg/app/master.Run()\n\tgithub.com/mintoolkit/mint/pkg/app/master/app.go:15 +0x4c\nmain.main()\n\tgithub.com/mintoolkit/mint/cmd/mint/main.go:15 +0x194\n" version="darwin/arm64|Aurora|1.41.5|4cc2b185c9cdcd6d0586246e21d8aecc5d847feb|2024-07-01_04:36:47AM"
gnought:~/tmp/dist_mac_m1 ?1 %

kcq commented Aug 24, 2024

It's possible that OrbStack's docker save API call behaves differently... need to repro the steps manually to confirm.

@rfilgueiras

I can reproduce it when trying to x-ray:

❯ slim --version
slim version linux/amd64|Transformer|1.40.11|1b271555882eacdfb4e6598d6d0552e9b9b1449b|2024-02-02_01:36:22PM
❯ slim xray nvcr.io/nvidia/clara/bionemo-framework:1.8 --debug --verbose
cmd=xray state=started
cmd=xray info=params target='nvcr.io/nvidia/clara/bionemo-framework:1.8' add-image-manifest='false' add-image-config='false' rm-file-artifacts='false' 
cmd=xray state=image.api.inspection.start
cmd=xray info=image size.human='13 GB' architecture='amd64' id='sha256:57f09db9857da64e3054622608ff03284eafae68b46031d4c080d1c662c3fe99' size.bytes='13386288121' 
cmd=xray info=image.stack index='0' name='nvcr.io/nvidia/clara/bionemo-framework:1.8' id='sha256:57f09db9857da64e3054622608ff03284eafae68b46031d4c080d1c662c3fe99' instructions='198' message='see report file for details' 
cmd=xray info=image.exposed_ports list='8888/tcp,6006/tcp' 
cmd=xray state=image.api.inspection.done
cmd=xray state=image.data.inspection.start
cmd=xray info=image.data.inspection.process.image.start
time="2024-09-09T10:19:27+02:00" level=error msg="layerFromStream: error reading layer(0dd472f69204c68b47e676a9c50febca0a431a40420fc94e1a01a7af84e5f770) - archive/tar: invalid tar header"
time="2024-09-09T10:19:27+02:00" level=error msg="dockerimage.LoadPackage: error reading oci layer from archive(/tmp/slim-state/.slim-state/images/57f09db9857da64e3054622608ff03284eafae68b46031d4c080d1c662c3fe99/image/57f09db9857da64e3054622608ff03284eafae68b46031d4c080d1c662c3fe99.tar/blobs/sha256/0dd472f69204c68b47e676a9c50febca0a431a40420fc94e1a01a7af84e5f770) - archive/tar: invalid tar header"
time="2024-09-09T10:19:27+02:00" level=fatal msg="slim: failure" error="archive/tar: invalid tar header" stack="goroutine 1 [running]:\nruntime/debug.Stack()\n\truntime/debug/stack.go:24 +0x5e\ngithub.com/slimtoolkit/slim/pkg/util/errutil.FailOn({0x2329020, 0x338eb30})\n\tgithub.com/slimtoolkit/slim/pkg/util/errutil/errutil.go:32 +0x4b\ngithub.com/slimtoolkit/slim/pkg/app/master/command/xray.OnCommand(0xc000476a98, 0xc00055c090, 0xc000476ab8, {_, _}, _, {_, _}, {0x0, 0x0}, ...)\n\tgithub.com/slimtoolkit/slim/pkg/app/master/command/xray/handler.go:411 +0x2dbf\ngithub.com/slimtoolkit/slim/pkg/app/master/command/xray.glob..func1(0xc000518700)\n\tgithub.com/slimtoolkit/slim/pkg/app/master/command/xray/cli.go:337 +0x1d98\ngithub.com/urfave/cli/v2.(*Command).Run(0x33a8f00, 0xc000518700, {0xc000518740, 0x4, 0x4})\n\tgithub.com/urfave/cli/v2@v2.27.1/command.go:279 +0x9dd\ngithub.com/urfave/cli/v2.(*Command).Run(0xc00055a160, 0xc000518140, {0xc000138000, 0x5, 0x5})\n\tgithub.com/urfave/cli/v2@v2.27.1/command.go:272 +0xc2e\ngithub.com/urfave/cli/v2.(*App).RunContext(0xc00052c200, {0x23470c8?, 0x341e0a0}, {0xc000138000, 0x5, 0x5})\n\tgithub.com/urfave/cli/v2@v2.27.1/app.go:337 +0x5db\ngithub.com/urfave/cli/v2.(*App).Run(...)\n\tgithub.com/urfave/cli/v2@v2.27.1/app.go:311\ngithub.com/slimtoolkit/slim/pkg/app/master.Run()\n\tgithub.com/slimtoolkit/slim/pkg/app/master/app.go:15 +0x45\nmain.main()\n\tgithub.com/slimtoolkit/slim/cmd/slim/main.go:15 +0x187\n" version="linux/amd64|Transformer|1.40.11|1b271555882eacdfb4e6598d6d0552e9b9b1449b|2024-02-02_01:36:22PM"

kcq commented Sep 9, 2024

@rfilgueiras thanks for sharing your report! I'll use nvcr.io/nvidia/clara/bionemo-framework:1.8 to repro. Couldn't repro the original condition with OrbStack, so this ticket got stuck and hopefully now with the accessible repro image it'll be unstuck :-)
