Project compiles OK but hangs when --watch is added #25023

Open
benjaminjackman opened this issue Jun 16, 2018 · 33 comments

Labels
Bug A bug in TypeScript
Domain: Declaration Emit The issue relates to the emission of d.ts files

@benjaminjackman

TypeScript Version: 3.0.0-dev.20180616 and 2.9.2

Search Terms: typescript watch --watch hangs stalls slow

Hi, I have minimized one of my projects down to an example (which I have attached [1]) that:

  1. Compiles fine when not using watch mode (takes about 5 seconds to compile on my machine).
  2. Hangs the compiler when using watch mode (it never completes its compile).

To compile the example:

  1. Download the zip, unzip it, and enter that directory.
  2. Run yarn install to add the project's deps.
  3a. Run yarn tsc-ok for the example where it does compile.
  3b. Run yarn tsc-bad for the example where it doesn't.

BTW: Is there some way for me to tell the compiler to dump out the source files it is processing as it processes them? It would be extremely helpful for tracking this down to the specific issue.

1: Source Code: tsc-watch-hangs.zip

@devuo

devuo commented Jun 18, 2018

This is potentially related to #25018

@sheetalkamat
Member

In the attached repro, the declaration emit takes forever (declaration emit is used in --w mode to determine the set of files to emit). This seems to repro as well when doing tsc --d.
@weswigham can you please take a look and see why declaration emit is taking forever.

@benjaminjackman
Author

@sheetalkamat If it's not too much trouble to explain (or if there is somewhere you could point me, I'd really appreciate it), how were you able to see that? Are you running the compiler in a debugger, or are there flags I can enable that dump out compiler phases / the files being processed?

@mhegazy mhegazy added the Bug A bug in TypeScript label Jun 19, 2018
@mhegazy mhegazy added this to the TypeScript 3.0 milestone Jun 19, 2018
@sheetalkamat
Member

@benjaminjackman I just ran your code with node tsc.js --inspect-brk --d, since I suspected that was what was causing the issue. I paused the debugger and it was stuck in declaration emit.

@sheetalkamat sheetalkamat added the Domain: Declaration Emit The issue relates to the emission of d.ts files label Jun 20, 2018
@sheetalkamat sheetalkamat removed their assignment Jun 20, 2018
@weswigham
Member

I'm not sure it's hanging so much as taking (close to) forever, because of the number of types it needs to print structurally with import types (mixin pattern + inferred conditional types = nonlocal aliases for all the things). I cut the sample down in size and it actually completes (after an inordinate amount of time). A telltale sign is that the memory usage doesn't really go up (or it does, but incredibly slowly), meaning neither stack space nor heap are in heavy overuse (and breaking randomly, you're usually never more than ~5 types into an anonymous type stack, at most). I'm guessing that getModuleSpecifiers results need to be cached when so many inferred types are going to be printed using (the same) specifiers, since to do its calculation it needs to traverse the program's files/the filesystem, which is rather expensive to be repeating.

@weswigham
Member

weswigham commented Jun 20, 2018

While caching is probably still a good idea, given how often I was breaking into that code, on further inspection that isn't the root problem, sadly - just an aggravating factor. T.T

@weswigham
Member

If you wait for them (which is actually faster with specifier caching, though still a few minutes at least), the types for that project look like, for example:
[screenshot of the emitted type]

If I may be so bold: Thems some large types. 1300 lines for one type argument.

@weswigham
Member

Most of it is repetitions of the same structural type, unfortunately:

{
        good: import("h-mst/src/BetterMobxStateTreeTs").ModelReferenceFactory<string | number, import("h-mst/src/BetterMobxStateTreeTs").ModelInstanceForAllIn<{
            id: import("h-mst/src/BetterMobxStateTreeTs").IndentifierFactory<string>;
            color: string;
            tags: import("h-mst/src/BetterMobxStateTreeTs").OptionalFactory<string[] | import("mobx/lib/types/observablearray").IObservableArray<string>, string[], import("mobx/lib/types/observablearray").IObservableArray<string> & import("h-mst/src/BetterMobxStateTreeTs").IToJson<string> & import("h-mst/src/BetterMobxStateTreeTs").IStateTreeNode>;
            transient: import("h-mst/src/BetterMobxStateTreeTs").OptionalFactory<boolean, boolean, boolean>;
        }> & import("h-mst/src/BetterMobxStateTreeTs").IToJson<import("h-mst/src/BetterMobxStateTreeTs").ModelSnapshotForAllIn<{
            id: import("h-mst/src/BetterMobxStateTreeTs").IndentifierFactory<string>;
            color: string;
            tags: import("h-mst/src/BetterMobxStateTreeTs").OptionalFactory<string[] | import("mobx/lib/types/observablearray").IObservableArray<string>, string[], import("mobx/lib/types/observablearray").IObservableArray<string> & import("h-mst/src/BetterMobxStateTreeTs").IToJson<string> & import("h-mst/src/BetterMobxStateTreeTs").IStateTreeNode>;
            transient: import("h-mst/src/BetterMobxStateTreeTs").OptionalFactory<boolean, boolean, boolean>;
        }>> & import("h-mst/src/BetterMobxStateTreeTs").IStateTreeNode & {
            readonly abbrev: string;
            readonly isFood: boolean;
        }>;
        amount: import("h-mst/src/BetterMobxStateTreeTs").NumberFactory;
    }

@benjaminjackman is there some kind of alias we should be striving to use instead of these verbose structural decompositions?

@weswigham
Member

If anyone is curious, it seems like engine/store/protos/tech.ts is the first file that takes a long time to build declarations for. We produce 150000 lines (approx 20MB) of declarations for it (but we do emit its declarations!). I think engine/store/protos/structure.ts takes even longer (and will apparently run OOM depending on the exact node version you're using, once the specifier caching is applied to speed it up).

@weswigham
Member

On a slightly different tack:
@benjaminjackman do you intend for your project to have declaration on at all? I know @sheetalkamat was able to repro the issue outside of watch using the declaration flag, but since you don't actually set it, I imagine you'd be hoping for it not to come into play. @sheetalkamat do you know why declaration diagnostics are being generated under watch even when declaration isn't set?

@benjaminjackman
Author

@weswigham So basically I am trying to structurally type mobx-state-tree using a .d.ts file, which, as you are seeing, is fairly complex. mobx-state-tree wraps objects into enhanced versions (called models) that add a lot of features (they can be transformed to and from JSON / plain old JavaScript object literals, they may be referred to by other objects via a string/numeric identifier, they can emit patches when things are changed, and so on).

So for each model (what would be a class in normal TypeScript land) I need to calculate 3 separate types, each generated by walking the properties of the model. The first is the type that is legal to use in its construction: some fields have default values, so they do not need to be specified in this CreationType. Those fields will, however, be emitted in the SnapshotType (the type of the value produced when toJson is called). Finally there is the instantiated type, Type, which is what is created by a call of .create(init: CreationType) => Type. In that type, fields that are references can be assigned either a numeric/string id or an actual instance of the object held by the reference (however, calls to the get accessor will only ever produce the Type of the reference; see #2521 which blocks typing this correctly).

Additionally, methods can be added onto the model with .views(self => {...}) and .actions(self => {...}) enhancers (which basically & their own return value with the calling type to produce an enhanced type).
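
To make that concrete, here is a minimal sketch of the three types with a made-up Item model (not from the attached repro; the commented shapes are approximate):

import { types } from "mobx-state-tree"

const Item = types
  .model("Item", {
    id: types.identifier(types.string),      // required in all three types
    amount: types.optional(types.number, 0), // optional in CreationType, always present in SnapshotType
  })
  .views(self => ({
    get doubled() {
      return self.amount * 2                 // only exists on the instance Type
    },
  }))

type ItemCreation = typeof Item.CreationType // roughly { id: string; amount?: number }
type ItemSnapshot = typeof Item.SnapshotType // roughly { id: string; amount: number }
type ItemInstance = typeof Item.Type         // snapshot fields + views/actions + the tree-node API

const item: ItemInstance = Item.create({ id: "a" }) // create takes a CreationType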

When a model references or nests another model, a seemingly big branching factor kicks in, because how a model is created depends, for example, on the creation type of all its children (which can themselves be models).

I could keep going but... phew! That's enough, I am sure you get the point...

I'll go over some hacks I have used to speed things up, but I don't have a good enough handle on how the TypeScript compiler caches stuff to know whether they're actually a valid approach or I am just making random noise against a complex space.

@benjaminjackman
Author

benjaminjackman commented Jun 21, 2018

@weswigham

do you intend for your project to have declaration on at all?

Not at all.

I imagine you'd be hoping for it not to come into play. @sheetalkamat do you know why declaration diagnostics are being generated under watch even when declaration isn't set?

Maybe it gets used as a cache by watch to prevent recompiling? If it's not needed for that, then I'd rather turn it off. I am able to hack together decent compile times by just running tsc without watch (using something like nodemon -x to trigger compiles on file changes).

However, that obviously seems silly.

@weswigham
Member

Ah, I see, right, because the hash of the declaration output is used for checking up-to-dateness, so you get them built even if you yourself aren't using them.

@benjaminjackman
Author

If anyone is curious, it seems like engine/store/protos/tech.ts is the first file that takes a long time to build declarations for. We produce 150000 lines (approx 20MB) of declarations for it (but we do emit its declarations!). I think engine/store/protos/structure.ts takes even longer (and will apparently run OOM depending on the exact node version you're using, once the specifier caching is applied to speed it up).

I hope this doesn't come across as super annoying, but having a couple of flags that would cause the compiler to dump out what it is doing as it is doing it, and what file it's processing (optionally with timestamps), would be extremely helpful for folks who don't know the compiler internals at all, to at least get a handle on which of their files is causing problems. I toyed around with --traceResolution (which at least sort of lets me know the compiler is making progress), because whittling my codebase down to this example took a while of iteratively erasing stuff and waiting until I reduced it to what I uploaded here.

Getting something that even just said what phase the compiler was in would have been very useful for that process (even if it wasn't at a file-by-file level).

There are --diagnostics and --extendedDiagnostics, but when it's hanging (or just stalling out for a really long time) those never get printed.

Also it would be a nice feature to have for attempting to work around the issue: for example, now that I know those files are causing problems (which I had suspected), I can at least try to do more 'caching' there or cut them down if possible.

BTW I used to have the mobx-state-tree version of a union type ... which is basically unusable due to the compile-time explosion it induces.

@benjaminjackman
Author

@benjaminjackman is there some kind of alias we should be striving to use instead of these verbose structural decompositions?

@weswigham

So I have a couple of hacks that make things a bit better:

  1. As someone found out in mobx-state-tree land, you can set a type = to a typeof and then extend that type in an interface (which seems to improve performance).
    That looks like this:
export const Player = types.model({
  id : types.identifier(types.string),
  inventory: Inventory,
  progress : ProgressModel,
  popPool : PopPool,
  laborOrders : LaborOrders,
})
type PlayerTypeOf = typeof Player.Type
//Use this type elsewhere
export interface Player extends PlayerTypeOf {}

It seems to improve performance somewhat.

Also I have noticed that using a class instead of a type literal in calls to model seems to improve things as well, for example writing Player like this:

export const Player = types.model(new class Player {
  id = types.identifier(types.string)
  inventory = Inventory
  progress = ProgressModel
  popPool = PopPool
  laborOrders = LaborOrders
})

When I do that, performance seems to improve once the models start to have a decent amount of nesting.

@weswigham
Member

Hm, I can make Tech.ts compile relatively fast with some slight refactorings to make the name of the internal class type visible so it's not printed structurally; but Structure.ts will not yield in such a way. Likely because while I can adjust it to use its names for itself at the top-level, it proceeds to print Tech structurally within the mapped bits, and Tech is 100000 lines of type (apparently) when it can't self-reference. The long and the short of it is that we need to somehow make names for these anonymous things available in our declaration emit and reuse those names, rather than printing similar structures over and over again.

@benjaminjackman
Author

@weswigham Sounds like we are working down similar paths.

Can you by chance show me what it means to make the name of the internal class type visible?

I was just about to post that I am able to get watch to work on a really simplified example by basically throwing the kitchen sink at things for the compiler to cache on, then making sure those are used externally.

I made a snippet that basically does the type XTypeOf = typeof XModel; interface X extends XTypeOf {} trick, but on all the types for the model (its static type, plus the creation, snapshot and instance types). It's really verbose, but it's actually able to emit .d.ts files now and do watches (at least on the simplified example I was using).

Here is what I am talking about:

const TimesGoodAmountsModel = augment(
  xtypes
    .clsModel(
      class TimesGoodAmounts {
        times = types.number
        goodAmounts = GoodAmounts
      },
    )
    .views(self => {
      return {
        withTimes(times: number) {
          return {
            ...self.toJSON!(),
            times,
          }
        },
        get totalGoodAmounts(): GoodAmounts {
          if (self.times == 1) {
            return GAS(self.goodAmounts)
          } else {
            return TIMES_GAS(self.times, self.goodAmounts)
          }
        },
      }
    }),
  self => {
    return {
      empty() {
        return {
          times: 0,
          goodAmounts: [],
        }
      },
    }
  },
)

type TimesGoodAmountsModelTypeOf = typeof TimesGoodAmountsModel
type TimesGoodAmountsTypeTypeOf = typeof TimesGoodAmountsModel.Type
export interface TimesGoodAmountsType extends TimesGoodAmountsTypeTypeOf {}
type TimesGoodAmountsSnapshotTypeTypeOf = typeof TimesGoodAmountsModel.SnapshotType
export interface TimesGoodAmountsSnapshotType extends TimesGoodAmountsSnapshotTypeTypeOf {}
type TimesGoodAmountsCreationTypeTypeOf = typeof TimesGoodAmountsModel.CreationType
export interface TimesGoodAmountsCreationType extends TimesGoodAmountsCreationTypeTypeOf {}
export interface TimesGoodAmounts extends TimesGoodAmountsModelTypeOf {
  Type: TimesGoodAmountsType
  CreationType: TimesGoodAmountsCreationType
  SnapshotType: TimesGoodAmountsSnapshotType
}
export const TimesGoodAmounts: TimesGoodAmounts = TimesGoodAmountsModel

Thanks for pointing me in the right direction in terms of the .d.ts size explosions. I have a feeling it's a good thing to check for the other times I have seen slowness from the compiler when I have been doing a lot of branching with nested types.

One suggestion that would cut this approach down by half: maybe TypeScript should allow interfaces to extend typeof X directly, since it appears to work OK otherwise.
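
Right now that means going through an intermediate alias. A tiny sketch of the difference, with a made-up Config object (not from my project):

const Config = { retries: 3, verbose: false }

type ConfigShape = typeof Config
interface IConfig extends ConfigShape {}        // works today, via the intermediate alias

// interface IConfig2 extends typeof Config {}  // error: an interface can only extend an
//                                              // identifier/qualified-name with optional type arguments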

I will try just exporting type = typeof aliases next, without using the interface extends trick, since that will cut down a lot of the boilerplate, though I think the interface hack had other side benefits in terms of what the compiler was able to cache.

I have heard that it can cache an interface but not a type declaration (hence the desire to have an interface extend the type before exporting).

@weswigham
Member

And it is definitely ultimately the structural decomposition of classes that ends up as the issue here (aggravated by the combinatoric expansion masked via the conditionals) - I experimentally disabled that feature in our declaration emitter (it is just one of the flags set at the top), and while it introduced some errors (as some types became unprintable), it also let it build pretty much instantly with declaration on. 😦

@weswigham
Member

I have heard that it can cache an interface but not a type declaration (hence the desire to have an interface extend the type before exporting).

Aliases are cached in a best effort kinda way for intersections and unions; so type something = string | number will work, but a second type something2 = string | number won't override the alias provided by the first, and type Something = Another won't allow for caching anything (since it's not a union or intersection). It's a bit of a known limitation in how things are named in emit today.
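
A small sketch of what that means (input side only; the exact emitted declarations may differ):

export type Numeric = string | number   // a union: this alias can be picked up and reused during emit
export type Numeric2 = string | number  // a second alias for the same union: it won't replace the first
export type Renamed = Numeric           // not itself a union/intersection, so nothing new is cached for it

export let pick = Math.random() > 0.5 ? "a" : 1 // widens to string | number; emit may print it as Numeric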

@weswigham
Member

And it is definitely ultimately the structural decomposition of classes that ends up as the issue here

Here is what I am talking about:

And writing out the type annotation like that allows the compiler to name the type without decomposing it, conveniently working around the expansion issue. I'm hoping there's some way to generalize it though, so the compiler can apply this kind of optimization itself when generating declarations. The issue I see, though, is that even if I'm free to introduce new local names, if the type ends up used in another class in another file, that local name won't be available (since it's in a different scope), and it'll be back to square one; meaning I'd have to export all the generated names, but that'd affect the file's API, which is undesirable (though it is what you've done in the example you wrote above). 😡

@benjaminjackman
Author

benjaminjackman commented Jun 21, 2018

@weswigham OK, that makes sense. For what it's worth, this is basically the simplest snippet-able boilerplate version I have come up with:

//Able to reduce .d.ts emits considerably
//-------------
export type TimesGoodAmountsType = typeof TimesGoodAmountsModel.Type
export type TimesGoodAmountsSnapshotType = typeof TimesGoodAmountsModel.SnapshotType
export type TimesGoodAmountsCreationType = typeof TimesGoodAmountsModel.CreationType
export type TimesGoodAmounts = typeof TimesGoodAmountsModel & {
  Type: TimesGoodAmountsType
  SnapshotType: TimesGoodAmountsSnapshotType
  CreationType: TimesGoodAmountsCreationType
}
export const TimesGoodAmounts: TimesGoodAmounts = TimesGoodAmountsModel

This did not work and is still causing the massive emits:

//Does NOT reduce .d.ts emits significantly enough
//-------------
export type TimesGoodAmountsType = typeof TimesGoodAmountsModel.Type
export type TimesGoodAmountsSnapshotType = typeof TimesGoodAmountsModel.SnapshotType
export type TimesGoodAmountsCreationType = typeof TimesGoodAmountsModel.CreationType
export type TimesGoodAmounts = typeof TimesGoodAmountsModel
export const TimesGoodAmounts: TimesGoodAmounts = TimesGoodAmountsModel

I am guessing that's because those internal types need to be cacheable by the compiler as well.

edit: I realize in example 2 it's kind of silly to expect that those types above would help, but typically they are used by other mobx-state-tree classes as aliases to avoid having to type out typeof type ... (being able to nest types within types would be a nice addition here)

e.g.

const X = {
  type S : string
}
type S = X.S

instead of having to do

const X = {
  //Dummy value, don't use! Only here to carry the type.
  S: (null as never) as string,
}
type S = typeof X.S

@mhegazy mhegazy modified the milestones: TypeScript 3.0, TypeScript 3.1 Jul 2, 2018
@weswigham
Member

weswigham commented Aug 1, 2018

At present - not really. (Other than annotating all your complex types which would otherwise cause the combinatoric explosion in type output, or not using the watch or declaration flags.)

This is really complex, because fixing it without disabling some declaration emit features which are important for making, e.g., mixins work likely requires new syntax in the language itself. So I'm looking into it myself, but it'll probably be a bit, unfortunately.

@pelotom

pelotom commented Aug 2, 2018

@weswigham I'm sure I don't understand the subtleties of the problem, but this seems like a pretty bad defect. In my project a build from scratch is taking 15 seconds, whereas an incremental build is taking ~4 minutes. Until this is fixed I'm probably going to have to avoid --watch and use a hand-rolled watch process which just reruns tsc whenever a typescript file changes 😕

@benjaminjackman
Author

@weswigham I had a couple of ideas for mitigating this in the meantime:

  1. Once a file's declaration output hits a certain size limit, the compiler could at least warn that the file is too large and give up rather than effectively hanging, perhaps similarly to how structural typing just stops at a certain depth. At least then watch mode would work, and narrowing down the offending .d.ts would be a lot easier as well.

  2. Perhaps a polling watch mode that always starts fresh could be added as an option. It would be a bit slower. I am going to experiment with rigging something together to do this using chokidar: basically, when a file changes this mode will call tsc with --noEmit (since I am just using this for compiler errors).

The way I am combating this in practice is to write code, observe that watch stops showing errors (which can be a pretty frustrating process), then stop what I am doing, emit .d.ts files, and trial-and-error guess where the explosions are coming from.

It just stinks, because if this were smooth then working with the mobx-state-tree library would flow a lot faster, though I understand it's not an easy thing to fix.

@benjaminjackman
Author

I made a gist that describes how to use chokidar to call tsc --noEmit whenever a file changes in case anyone stumbles into this ticket:
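
For reference, here is a minimal sketch of that approach (not the gist itself; it assumes chokidar and typescript are installed locally and sources live under src/):

import * as chokidar from "chokidar"
import { execSync } from "child_process"

function check() {
  try {
    // Type-check only; no emit, so the declaration-emit slowness never comes into play.
    execSync("npx tsc --noEmit", { stdio: "inherit" })
    console.log("No compile errors.")
  } catch (e) {
    // tsc exits non-zero when there are errors; they were already printed via stdio: "inherit".
  }
}

check()
chokidar.watch("src/**/*.ts", { ignoreInitial: true }).on("all", () => check())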

@weswigham
Member

Self-note: Like @benjaminjackman proposed, the best way to provide a short term fix for this is to provide a flag which tells watch mode to skip checking declaration emit.

@sheetalkamat
Member

@weswigham You would need to handle the builder correctly in that scenario, because the builder relies on the generated declaration files to decide which files to emit.

@Bnaya

Bnaya commented Jan 11, 2019

Hey @benjaminjackman @pelotom
I'm struggling with similar issues.

Something that really helped with spinning up watch is to mask MST models behind interfaces,
especially for composed models (in a similar manner to your examples).

I didn't check how it affects declaration emit,
just tsc -w startup times (from forever to 10 secs), and I'm still playing with it.

A quick and minimal example:

export type ForDirectExtend<T> = T;

const MyAModelInferred = types.model("MyAModel", {
  items: types.array(types.string)
});

interface MyAModelType extends ForDirectExtend<typeof MyAModelInferred> {}

// now MyAModel is fast!
const MyAModel: MyAModelType = MyAModelInferred;


// Now even MyBModelInferred is not very slow, because it's referencing the "fast" MyAModel and not the inferred thing
const MyBModelInferred = types.model("MyBModel", {
  somethingsomething: types.map(MyAModel)
});

interface MyBModelType extends ForDirectExtend<typeof MyBModelInferred> {}

const MyBModel: MyBModelType = MyBModelInferred;

And if we could hint TypeScript to create intermediate type names instead of all of this, that would be great!

@liquidprogrammer

I'm also experiencing this issue with typescript 3.7.2 and typescript@next.
Compilation works fine, but --watch hangs until it dies with OOM. In my case TypeScript cannot infer the return type of a method (there are very complex types written there, I know). If I specify the return type explicitly, then watch mode works.
BTW: with typescript 3.0.1 everything works.

Steps to reproduce:

  1. Unpack demo.zip and jump to the folder
  2. Run npm install
  3. Run npm run watch -> hangs until it dies with OOM
  4. Break watch
  5. Go to the file APISomeClass
  6. Comment out the method marked with NOTE: ts watch dies
  7. Uncomment the method marked with NOTE: works (they are the same, except one has the return type specified)
  8. Run npm run watch -> works now

I'm on macOS Mojave 10.14.5 (18F203), npm version is 6.13.0, node version is 12.7.0 (not sure if that matters)
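
The workaround here boils down to annotating the problematic method's return type so the checker never has to print the huge inferred type. Roughly (hypothetical names, not the actual code in demo.zip):

interface QueryResult { rows: string[] }   // hypothetical stand-in for the real, very complex type

class APISomeClass {
  // With no annotation, the inferred return type is complex enough to stall --watch;
  // spelling it out explicitly avoids the problem.
  fetchAll(): Promise<QueryResult> {
    return Promise.resolve({ rows: [] })
  }
}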

@joost00719

Issue still persists in version 4.1.2.
Had it compiling during lunch and came back to no result (still compiling).
Without watch it compiles in no more than 4 seconds.

@Bnaya

Bnaya commented Mar 5, 2021

I believe "smarter type alias preservation" feature
https://devblogs.microsoft.com/typescript/announcing-typescript-4-2/#smarter-type-alias-preservation
should have positive effect here

@bwinklesky

bumping issue.
