
Support an alternative to $-imports #3062

Open · lefou opened this issue Mar 3, 2024 · 10 comments

@lefou (Member) commented Mar 3, 2024

We inherited the various $-imports like import $ivy or import $file from Ammonite. It turns out the Scala compiler isn't very fond of us using the $ symbol in object and class names, and newer versions, starting from 3.4.0, enforce this more strictly.
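
For reference, this is roughly what the magic imports look like in a build.sc today (the coordinate and file name are just illustrations):

import $meta._
import $ivy.`com.lihaoyi::pprint:0.8.1`
import $file.versions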

See also:

I think a nice alternative to these magic imports could be something similar to the using directives known from Scala CLI.

From a compiler perspective, they are just comments, so we wouldn't need to mangle scripts just to replace the directives with something regular, as we currently have to do for the $-imports.

My preferred usage would look like this:

//> mill version 0.11.7
//> mill dep g:a:v

But since my proposal to make them tool/context-specific wasn't accepted, the "standard" way to support tool-specific key-value pairs would be to encode the tool/context in the key. It would look like this:

//> using mill.version 0.11.7
//> using mill.dep g:a:v

The upside of following this standard is that IDEs, formatters and syntax highlighters might handle the directives better. (This is already the case at the time of writing: GitHub renders my first example in plain grey, but the second one nicely colored.)
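
To illustrate, such a header could sit on top of an otherwise ordinary build.sc and stay invisible to the compiler (the dependency coordinate and the module are just examples):

//> using mill.version 0.11.7
//> using mill.dep com.lihaoyi::pprint:0.8.1

import mill._, mill.scalalib._

object foo extends ScalaModule {
  def scalaVersion = "2.13.12"
}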

@lefou (Member, Author) commented Mar 3, 2024

FTR, here is the proposal I mentioned.

@lefou wrote on Sept 8, 2023:

I've been playing around with using directives for a while, mostly in the context of Mill build scripts, and have also had many discussions, including those at the first Tooling Summit this year.

I suggest splitting the "using directives" (in their current form) into three parts: the comment marker, the context marker and the directive, and discussing each part, its purpose and its potential for standardization separately.

  1. The comment marker //>. This part is undisputedly nice: it's a comment, so it's transparent to the compiler, it's easy to recognize, and it survives most code formatting attempts. This is a good candidate for a "standard".

  2. The context marker, currently the "using" keyword. There was lots of discussion about the "using" keyword, its name, and whether it's needed at all. I'd like to associate the context, i.e. the interpreting tool, with this marker. Although a more explicit "scala-cli" would be clearer, "using" is by now associated with Scala CLI and will do. For other tools, there should be a different marker like "mill" or "ammonite"/"amm". That way, tool-specific configuration can co-exist and a script can be compatible with multiple tools/runners.

  3. The context-specific directive or configuration, typically some key-value pair of tool/context-specific configuration. Due to its tool-specific nature, a broader standard doesn't make much sense and would only make the evolution of the various tools harder.

Why do we need context-specific configuration?

Different tools have different features and properties. They also have different requirements, different defaults and different usability concepts. It's best to leave control over these configuration details to the respective tools. Instead, let's make sure that tools can co-exist and scripts can be used in various environments simultaneously. With my proposal to introduce a context marker (or to treat "using" as the marker for Scala-CLI only), it is possible to apply different configurations for different tools to the same script.

Showcase

Take the build.sc for example. It is a Scala script file and many different tools (Ammonite, Scala-CLI, Mill, xxx worksheet) can edit, format and execute it, but all of them interpret it slightly differently. Let's say I want to use the os-lib dependency: in Ammonite or Mill I can just use it, as it is already provided, whereas in Scala CLI I need to declare it first (with //> using dep). If the script requires a minimal version of Ammonite or Scala-CLI to function, I could easily configure that thanks to the distinct context selectors (//> using version vs //> ammonite version).

Example: A script with configuration for Scala-CLI and Ammonite

//> using dep "com.lihaoyi::os-lib:0.9.1"
//> ammonite version 3.0.0

println(s"Current path: ${os.pwd}")

At first glance, when opening this file, I can see that it is supposed to run with Scala-CLI and/or Ammonite. It uses the os-lib dependency and requires Ammonite version 3.0.0.

See also my other post where I presented this idea: com-lihaoyi/Ammonite#1372 (comment)

@lefou (Member, Author) commented Jun 25, 2024

I think support for a minimal required version would be nice. Mill should fail if the running version does not satisfy it.

//> using mill.minVersion 0.11.8
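
A rough sketch of the kind of check this could boil down to (the names are hypothetical, and real version handling would need more care, e.g. for milestone or snapshot suffixes):

// Hypothetical helper: does the running Mill version satisfy the declared minimum?
def satisfiesMinVersion(running: String, required: String): Boolean = {
  def nums(v: String): Seq[Int] =
    v.takeWhile(c => c.isDigit || c == '.').split('.').filter(_.nonEmpty).map(_.toInt).toSeq
  val len = math.max(nums(running).length, nums(required).length)
  val (r, m) = (nums(running).padTo(len, 0), nums(required).padTo(len, 0))
  // compare segment by segment; equal prefixes fall through to the next segment
  r.zip(m).find { case (a, b) => a != b }.forall { case (a, b) => a > b }
}

// satisfiesMinVersion("0.11.8", "0.11.8") == true
// satisfiesMinVersion("0.11.7", "0.11.8") == false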

@gamlerhart (Contributor) commented:
+1 for moving away from the $ivy imports. I tripped over this a few times with IntelliJ, where it decided to 'clean up' imports and messed up the $ivy imports.

@lefou (Member, Author) commented Jun 28, 2024

> +1 for moving away from the $ivy imports. I tripped over this a few times with IntelliJ, where it decided to 'clean up' imports and messed up the $ivy imports.

A tip: I recently discovered that if you place the $-imports after the other regular imports, IntelliJ seems not to remove them when organizing imports. In my case, it was constantly removing the import $meta._ from a build.sc whenever a new import was about to be added, and I could prevent that by placing the import $meta._ after all other imports.
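
For example, an ordering like the following (the regular imports are just illustrative) seems to survive IntelliJ's import organizing:

import mill._
import mill.scalalib._
// the $-import goes last, after all regular imports
import $meta._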

@bishabosha (Contributor) commented:
I am also in favour of this - it would mean no custom parser is needed for the Scala 3 port (unless we keep the old way specifically for deprecation messages).

@lihaoyi (Member) commented Aug 13, 2024

I'd like to preserve the magic imports for backwards compatibility, at least for the first Scala 3 release. It's fine to break stuff in a breaking release, but leaving the ivy imports in place for at least one or two releases should help smooth the migration considerably.

Long term, I think we should consider alternatives to the magic imports, but let's decouple that from the Scala 3 upgrade.

@lefou (Member, Author) commented Aug 14, 2024

We could introduce using directives and deprecate imports in 0.12, so that we could start clean in Mill 0.13 with Scala 3.

@lihaoyi (Member) commented Aug 14, 2024

Possible, but I would like to spend a bit more time exploring the design space for this area.

Directives work great as a replacement for ivy imports, but are kind of awkward as a replacement for file imports. It's plausible we could find some solution, e.g. by making file imports go away through auto-discovering files, but that may require other changes, e.g. changing the file extension from .sc to .mill to avoid picking up random scripts. And I think there's some interesting stuff we could do with a more structured JSON/YAML header format, which could precisely match our JSON serialization format and generalize to arbitrary configs in a way that directives don't.
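
Purely as a hypothetical illustration (neither the keys nor the comment syntax are anything proposed here), such a structured header might be YAML carried in comment lines at the top of the file:

//> mill:
//>   version: 0.11.7
//>   deps:
//>     - com.lihaoyi::os-lib:0.9.1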

That's not to say I think we should never do directives, but from a timeline perspective I think we should not couple them to the Scala 3 upgrade, as this area will take more time to work out the details.

@lefou (Member, Author) commented Aug 14, 2024

@lihaoyi Could you elaborate on the difficulties you see with $file imports? Since we already preprocess them and replace them with some dummy import statement, it should actually become easier, as we would no longer need to change the script at all. The compiled binary representation of the imported script is on the classpath, and the actual import of the package or class should work without any change.
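
For illustration, a directive-based replacement might look roughly like this (the mill.file key and the resulting import path are made up here, nothing is decided):

// today: magic import pulling in versions.sc from the same directory
import $file.versions
// hypothetical: a directive tells Mill to compile the script instead...
//> using mill.file versions.sc
// ...and a regular import refers to the already-compiled script object
import versions._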

@lefou (Member, Author) commented Aug 14, 2024

Regarding the more complex encodings like JSON/YAML: I think this was discussed many times for using directives and was finally dropped because of its complexity. There is still no good editor support for such file headers, and as a result it's harder to properly edit, format and get the header right. In contrast, the current using directives are simple key-value pairs, and that's all. Everything else is parsing the value, which is analogous to e.g. how CLI args, sysprops or env values work.

I also envision using a simple

//> using mill.version 0.12.0

and to parse it from millw, so that a .mill-version file is no longer needed. As long as the format stays that trivial, parsing it in bash, cmd or PowerShell is easy.
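
As a sketch of how trivial that extraction is (shown in Scala here; the same one-liner translates directly to grep/sed or a PowerShell match):

import scala.io.Source

// Take the last whitespace-separated token of the first matching directive line.
val millVersion: Option[String] =
  Source.fromFile("build.sc").getLines()
    .map(_.trim)
    .collectFirst { case l if l.startsWith("//> using mill.version ") => l.split("\\s+").last }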

Also, it works for Scala CLI, which doesn't want to be a build tool, yet its rich build capabilities are completely controlled via this simple format. We would pick users up with something they're already familiar with, or that is at least easy to learn.
