Add auto-import for the package.json imports field #55015
Conversation
Thank you for adding this!
src/compiler/moduleSpecifiers.ts
Outdated
if (typeof cachedPackageJson !== "object" && cachedPackageJson !== undefined) {
    return undefined;
}
const packageJsonContent = cachedPackageJson?.contents.packageJsonContent || readJson(packageJsonPath, host as { readFile(fileName: string): string | undefined });
Use JSON.parse for package.json files. readJson uses the TypeScript parser to construct an AST and then converts that to an object 😵 I’ve been meaning to audit existing uses of readJson and then rename it; I know some can be replaced.
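For context, a rough sketch of the difference (assuming readJson is essentially a thin wrapper over the compiler's public JSON AST utilities; the real implementation may differ):

import * as ts from "typescript";

// Roughly what readJson does today: parse the text into a full JsonSourceFile
// (AST nodes, positions, diagnostics) and then convert that AST back into a
// plain object.
function readJsonViaAst(fileName: string, text: string): any {
    const sourceFile = ts.parseJsonText(fileName, text);
    const errors: ts.Diagnostic[] = [];
    return ts.convertToObject(sourceFile, errors);
}

// The native alternative: one call, no AST, no diagnostics.
function readJsonViaParse(text: string): any {
    return JSON.parse(text);
}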
Yeah, I saw that 😅. I was aiming to be consistent with the implementation of auto-import for the exports field. I was thinking maybe that should be left for a separate PR to avoid more ad-hoc handling of package.json reading?

Sorry, I was thinking of some other code, I've updated it to JSON.parse.
Use JSON.parse for package.json files. readJson uses the TypeScript parser to construct an AST and then converts that to an object 😵 I’ve been meaning to audit existing uses of readJson and then rename it; I know some can be replaced.
I've actually checked btw and all the usages of readJson are for reading package.jsons, and the readJsonOrUndefined function that's used by readJson is only used for build info, so all of them can really be replaced with JSON.parse.
I'm not sure that replacing everything with JSON.parse is generally correct, because it'll throw if the file has malformed JSON. The services project has a helper, tryParseJson, which catches this, and that's used by its createPackageJsonInfo (which then makes its package.json cache).
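A minimal sketch of that kind of guard, under the assumption that the services helper behaves roughly like this (the actual tryParseJson signature may differ):

// Returns undefined instead of throwing when the text is malformed JSON,
// e.g. a package.json that's mid-edit in the editor.
function tryParseJsonText(text: string | undefined): object | undefined {
    if (text === undefined) return undefined;
    try {
        return JSON.parse(text);
    }
    catch {
        return undefined;
    }
}

Callers can then treat a parse failure the same as a missing file and bail out of the auto-import path.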
Yes, JSON.parse should be in a try/catch, particularly when reading a local package.json as opposed to one in node_modules. It’s virtually impossible for a node_modules package.json to be malformed so I don’t care as much there. My point was only that we shouldn’t be creating TypeScript AST nodes and issuing parse diagnostics just to get an object representation of a JSON file.
diff --git a/src/compiler/moduleSpecifiers.ts b/src/compiler/moduleSpecifiers.ts
index 4dcf24ea4b..d6022afe4b 100644
--- a/src/compiler/moduleSpecifiers.ts
+++ b/src/compiler/moduleSpecifiers.ts
@@ -90,7 +90,6 @@ import {
     pathIsBareSpecifier,
     pathIsRelative,
     PropertyAccessExpression,
-    readJson,
     removeExtension,
     removeFileExtension,
     removeSuffix,
@@ -894,11 +893,11 @@ function tryGetModuleNameFromPackageJsonImports(moduleFileName: string, sourceDi
     }
     const packageJsonPath = combinePaths(ancestorDirectoryWithPackageJson, "package.json");
     const cachedPackageJson = host.getPackageJsonInfoCache?.()?.getPackageJsonInfo(packageJsonPath);
-    if (typeof cachedPackageJson !== "object" && cachedPackageJson !== undefined) {
+    if (typeof cachedPackageJson !== "object" && cachedPackageJson !== undefined || !host.fileExists(packageJsonPath)) {
         return undefined;
     }
-    const packageJsonContent = cachedPackageJson?.contents.packageJsonContent || readJson(packageJsonPath, host as { readFile(fileName: string): string | undefined });
-    const imports = (packageJsonContent as any).imports;
+    const packageJsonContent = cachedPackageJson?.contents.packageJsonContent || JSON.parse(host.readFile(packageJsonPath)!);
+    const imports = packageJsonContent?.imports;
     if (!imports) {
         return undefined;
     }
src/compiler/moduleSpecifiers.ts
Outdated
if (typeof cachedPackageJson !== "object" && cachedPackageJson !== undefined || !host.fileExists(packageJsonPath)) {
    return undefined;
}
const packageJsonContent = cachedPackageJson?.contents.packageJsonContent || JSON.parse(host.readFile(packageJsonPath)!);
I don't think this is a new problem as this file already has code that looks exactly like this, but I'm somewhat certain that we should be using readJson for this; if the JSON is malformed, we'll throw here and then I don't know what will happen. readJson is what's used to get stuff into the cache, and generally speaking other raw JSON.parse calls all appear to at least try/catch.
I see that there's another thread about this at #55015 (comment); if we want to leave this JSON.parse in here given we already do that, that's fine, but overall if we're avoiding readJson because it calls the wrong JSON parser, but the alternative is to call JSON.parse unguarded... that feels spooky to me.
Reading the tests, I'm a little confused about the import conditions which map to .ts files...
None of the new tests test conditional imports - those probably need to be checked. Complex examples of nested conditions like

{
    "imports": {
        "#thing": {
            "types": { "import": "./types-esm/thing.d.mts", "require": "./types/thing.d.ts" },
            "default": { "import": "./esm/thing.mjs", "require": "./dist/thing.js" }
        }
    }
}

are actually more common in the wild than you'd think for exports, and likely have just as much utility for imports.
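For reference, with a nested map like that, the specifier auto-import needs to generate is the same in both module formats; it's the file the conditions resolve to that differs (assuming --module nodenext and the illustrative paths above; the named import is illustrative too):

// In an ESM file, the "import" branches apply:
//   types   -> ./types-esm/thing.d.mts
//   runtime -> ./esm/thing.mjs
// In a CJS file, the "require" branches apply instead:
//   types   -> ./types/thing.d.ts
//   runtime -> ./dist/thing.js
import { thing } from "#thing";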
Just wanted to say, I'm excited about this 😍
@emmatown Almost there!
I would love to get this into 5.3. @emmatown, are you interested in continuing? If not, someone on the team can probably address the remaining feedback and get it ready to merge.
See reasons at https://github.com/epicweb-dev/epic-stack/blob/main/docs/decisions/031-imports.md and vuejs/core#9919
Pending vuejs/core#9919 to be fixed
Pending microsoft/TypeScript#55015 to be released
This shows a failure mode around a package which (oddly!) publishes an `index.ts` file in its root but also specifies both the `types` and `exports.types` keys in `package.json` as pointing to an emitted `.d.ts` file in a `dist` directory. If the `index.ts` is removed or renamed, `tsc` correctly uses `types` to resolve the location. This appears to have been introduced by microsoft/TypeScript#55015.
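As described, the published layout presumably looks something like this (package and file names are illustrative, not taken from the actual package) — an `index.ts` at the package root alongside a `package.json` of roughly this shape:

{
    "name": "some-package",
    "types": "./dist/index.d.ts",
    "exports": {
        ".": {
            "types": "./dist/index.d.ts",
            "default": "./dist/index.js"
        }
    }
}

With the stray `index.ts` present, resolution lands on it instead of the declared `./dist/index.d.ts`; removing or renaming it makes `tsc` fall back to the declared `types` entry.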
Inspired by and in anticipation of this feature being released, I wrote https://www.webpro.nl/articles/using-subpath-imports-and-path-aliases (please let me know if there are mistakes). Thanks @emmatown and @Andarist for the hard work!
// @Filename: /a.ts
//// something/**/

verify.importFixModuleSpecifiers("", ["#something.js"]);
@andrewbranch should this autocomplete #something.js? I'm investigating some stuff and I came to the conclusion that perhaps this should actually be just #something. @module: nodenext doesn't prescribe the specifier ending - and since this package.json doesn't have a type it's an implied CJS file so the ending is optional.

note for myself: fixing this would likely depend on using getModuleSpecifierEndingPreference appropriately
Hmm, I still observe some inconsistencies elsewhere but now I've concluded that this one is correct - imports/exports always require extensions (regardless of the module format).
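A small illustration of that rule with a hypothetical imports map: the keys are matched literally, so there is no extension dropping the way there is for relative CJS specifiers.

// package.json (hypothetical):
//   { "imports": { "#something.js": "./dist/something.js" } }

// Matches the key exactly, so this resolves (and is what auto-import should emit):
import { x } from "#something.js";

// Does not resolve - there is no "#something" key, even though a relative
// "./something" without an extension would have been fine from a CJS file:
// import { x } from "#something";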
Fixes #49492

This treats package.json#imports as roughly equivalent to compilerOptions#paths in terms of deciding which specifier to use; it also prefers package.json#imports over compilerOptions#paths.

While doing this, I found some broken behaviour around extension replacement which was already observable with the package.json#exports field auto-import but is more visible with package.json#imports, so this fixes that as well. autoImportPackageJsonExportsSpecifierEndsInTs shows an example of what's fixed for package.json#exports. That fix and test is in a separate commit if that's easier to review.
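To illustrate the precedence described above (all names and paths here are made up, not taken from the PR's tests): given a project where both an imports entry and a paths alias can reach the same file, auto-import now suggests the package.json#imports specifier.

// package.json (hypothetical):
//   { "imports": { "#utils": "./src/utils.ts" } }
//
// tsconfig.json (hypothetical):
//   { "compilerOptions": { "paths": { "@utils": ["./src/utils.ts"] } } }

// Auto-importing a symbol declared in ./src/utils.ts now suggests:
import { helper } from "#utils";      // package.json#imports, preferred
// rather than the paths-based alias:
// import { helper } from "@utils";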