Not allowing to upload a single file, only dir #6

Open
justinask7 opened this issue Jun 18, 2020 · 7 comments

When I try to upload only a single file through the source_dir variable, I get the following error: Error: ENOTDIR: not a directory, even though README.md describes source_dir as "(Required) The local directory (or file) you wish to upload to S3."


corestackdev commented Jun 22, 2020

I can confirm that file upload is not working.
As a workaround, until this action gets a fix, here is the direct approach I use in my workflow:

- name: copy file to  s3
  shell: bash
  env:
    aws_key_id: ${{ secrets.S3_ACCESS }}
    aws_secret_access_key: ${{ secrets.S3_SECRET }}
    aws_s3_bucket: ${{ secrets.S3_BUCKET }}
  run: |
    sudo apt-get update && sudo apt-get -y install awscli
    aws configure set aws_access_key_id $aws_key_id
    aws configure set aws_secret_access_key $aws_secret_access_key 
    aws configure set default.region us-east-1
    aws s3 cp <file_to_upload> s3://$aws_s3_bucket/

shallwefootball (Owner) commented Jul 18, 2020

Could I see your workflow file (e.g. .github/workflows/upload-s3.yml)? @justinask7

RDeluxe commented Aug 27, 2020

Same issue here.

- uses: shallwefootball/s3-upload-action@master
  env:
    REGION: fr-par
    HOST: https://s3.fr-par.scw.cloud
  with:
    aws_key_id: ${{ secrets.S3_ACCESS_KEY }}
    aws_secret_access_key: ${{ secrets.S3_SECRET_KEY }}
    aws_bucket: ${{ secrets.S3_BUILDS_BUCKET }}
    source_dir: ./Doc.zip

@shallwefootball (Owner)

Yep. @RDeluxe, can you make a directory and put Doc.zip inside it? Like ./upload/Doc.zip
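For reference, a minimal sketch of that workaround applied to the snippet above: stage the single file into a directory first, then point source_dir at that directory. The upload/ directory and the step names are illustrative, and the secrets are the same ones used in the earlier snippet.

      - name: Stage Doc.zip into a directory
        run: |
          mkdir -p upload
          cp Doc.zip upload/

      - name: Upload staged directory to S3
        uses: shallwefootball/s3-upload-action@master
        with:
          aws_key_id: ${{ secrets.S3_ACCESS_KEY }}
          aws_secret_access_key: ${{ secrets.S3_SECRET_KEY }}
          aws_bucket: ${{ secrets.S3_BUILDS_BUCKET }}
          source_dir: ./upload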


jonatino commented Nov 30, 2020

Error: ENOTDIR: not a directory, scandir 'D:\a\pvphero-launcher\pvphero-launcher\build\Release\PvpHero.exe'
    at Object.readdirSync (fs.js:854:3)
    at klawSync (D:\a\_actions\shallwefootball\s3-upload-action\master\dist\index.js:8458:25)
    at Object.104 (D:\a\_actions\shallwefootball\s3-upload-action\master\dist\index.js:2451:15)
    at __webpack_require__ (D:\a\_actions\shallwefootball\s3-upload-action\master\dist\index.js:22:30)
    at startup (D:\a\_actions\shallwefootball\s3-upload-action\master\dist\index.js:37:19)
    at D:\a\_actions\shallwefootball\s3-upload-action\master\dist\index.js:41:18
    at Object.<anonymous> (D:\a\_actions\shallwefootball\s3-upload-action\master\dist\index.js:44:10)
    at Module._compile (internal/modules/cjs/loader.js:959:30)
    at Object.Module._extensions..js (internal/modules/cjs/loader.js:995:10)
    at Module.load (internal/modules/cjs/loader.js:815:32) {
  errno: -4052,
  syscall: 'scandir',
  code: 'ENOTDIR',
  path: 'D:\\a\\pvphero-launcher\\pvphero-launcher\\build\\Release\\PvpHero.exe'
}
      - name: Upload windows to s3
        uses: shallwefootball/s3-upload-action@master
        with:
          aws_key_id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws_secret_access_key: ${{ secrets.AWS_ACCESS_KEY_SECRET}}
          aws_bucket: 'pvphero'
          destination_dir: 'launcher/windows/'
          source_dir: 'build/Release/PvpHero.exe'

@shallwefootball


el1f commented Feb 8, 2021

Same problem here.
It wouldn't be much of an issue if the docs didn't state that the action works with a single file:

(Required) The local directory (or file) you wish to upload to S3. The directory will replace to key generated by shortid in S3

If file uploads are out of scope for this action, do you think you could update the readme?

@loganfarr

I ran into this same problem but got around it by creating a directory and uploading that directory instead. Oddly enough, the action uploads the contents of the directory but not the directory itself.

So the following code

      - name: Zip folder
        run: |
          cd ../
          tar -czf ${{ env.GITHUB_REF_SLUG }}.tar.gz store/
          mkdir artifacts/
          mv ${{ env.GITHUB_REF_SLUG }}.tar.gz artifacts/${{ env.GITHUB_REF_SLUG }}.tar.gz

      - name: Upload
        uses: shallwefootball/s3-upload-action@master
        with:
          aws_key_id: ${{ secrets.AWS_KEY_ID }}
          aws_secret_access_key: ${{ secrets.AWS_SECRET_ACCESS_KEY}}
          aws_bucket: ${{ secrets.AWS_BUCKET }}
          source_dir: '../artifacts'
          destination_dir: 'store'

will actually create s3://[bucket-name]/store/main.tar.gz instead of s3://[bucket-name]/store/artifacts/main.tar.gz.
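Given that flattening behavior, one possible adjustment is to fold the desired prefix into destination_dir. This is only a sketch inferred from the behavior described above, not something verified against the action itself:

      - name: Upload
        uses: shallwefootball/s3-upload-action@master
        with:
          aws_key_id: ${{ secrets.AWS_KEY_ID }}
          aws_secret_access_key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws_bucket: ${{ secrets.AWS_BUCKET }}
          source_dir: '../artifacts'
          # assumption: the contents of source_dir land directly under destination_dir,
          # so this should end up at s3://[bucket-name]/store/artifacts/main.tar.gz
          destination_dir: 'store/artifacts'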
