Create DSL to Ease Creation of End User DSLs #47

Open
3 tasks
erikareads opened this issue Jun 21, 2023 · 24 comments
Labels
enhancement New feature or request

Comments

@erikareads
Contributor

erikareads commented Jun 21, 2023

Is your feature request related to a problem? Please describe.
Currently, to create a DSL with Spark, you need to manually construct Entity and Section structs.

Describe the solution you'd like
I propose creating a DSL in Spark to ease the creation of Entities and Sections.
This proposal:

  • Is in the spirit of Spark
  • Demonstrates Spark in Spark
  • Allows for compile-time validation of types and helpful error messages for DSL creators

Express the feature either with a change to resource syntax, or with a change to the resource interface

The highest-level syntax I propose would look like this:

defmodule MyCustomDsl do
  defmodule Dsl do
    use Spark
    entity do
      ...
    end
    
    section do
      ...
    end
  end
  sections = Spark.load(Dsl)
  use Spark.Dsl.Extension, sections: sections
end

Which replaces/augments the current syntax:

defmodule MyCustomDsl do
  @entity %Spark.Dsl.Entity{
     ...
   }

  @section %Spark.Dsl.Section{
    ...
  }
  use Spark.Dsl.Extension, sections: [@section]
end

Additional context
Thanks to @jimsynz for the idea!

I will comment on this issue with ideas for the DSL for specific fields so that we can discuss specifics as they arise.

Work remaining

  • Add options fields to the Field entity
    • Determine types for each field in the Field entity
    • Determine required status for each field in the Field entity
@erikareads
Contributor Author

In the WIP I have this API for Entities so far:

entity do
  name :my_entity
  target MyEntity
  schema do
    field :field_name do
      type :any
      required true
    end
    field :other_field do
      type :any
      required true
    end
  end
end
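
For reference, once the transform described below runs, this declaration should resolve to roughly the same struct as the manual version:

%Spark.Dsl.Entity{
  name: :my_entity,
  target: MyEntity,
  schema: [
    field_name: [type: :any, required: true],
    other_field: [type: :any, required: true]
  ]
}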

To enable the declaration above, I use sub-entities for Schema and Field,
and then use a transform to turn them back into the keyword list of keyword lists that t:schema/0 expects.
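
A minimal sketch of that approach, assuming hypothetical Field structs as the sub-entity targets (the module and function names here are illustrative, not the actual names in the PR):

defmodule Field do
  # Illustrative target struct for the `field` sub-entity, with defaults.
  defstruct [:name, type: :any, required: false]
end

defmodule SchemaTransform do
  # Illustrative transform: collapse Field structs into the keyword list
  # of keyword lists that t:schema/0 expects.
  def fields_to_schema(fields) do
    for %Field{name: name, type: type, required: required} <- fields do
      {name, [type: type, required: required]}
    end
  end
end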

Since they are entities in this implementation, it's easy to have default values that are defined in the defstruct.
These defaults make it nice to do something like:

schema do
  field :my_field
  field :my_other_field do
    required true
  end
end

Which would work by setting every option field in the Field Entity to required: false.
A field :my_atom invocation without a type would be typed as :any and required: false(?).
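
Concretely, with defstruct defaults like the ones sketched above, those two invocations would resolve to something like:

# field :my_field
%Field{name: :my_field, type: :any, required: false}

# field :my_other_field do
#   required true
# end
%Field{name: :my_other_field, type: :any, required: true}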
The question is whether we want to allow this kind of definition, or force a user to set a type and required state for each field.

Working from nimble_schema:

field :my_field do
  type :my_type # required true?
  required true # required true?
  default :any # required false
  keys nil # required false (what does keys do)
  deprecated "a string" # required false (is this a version number?)
  doc "my cool field" # required false (does this map to the `describe` field in an Entity? if so, do we want to rename it to `describe` in the DSL?)
  subsection "" # required false (What is a subsection in a nimble_schema?)
  type_doc false # required false (What does this do?)
  rename_to :another_field_name # required false (Is this relevant when we're defining the field name already?)
end

Do we want to make any of these fields private?
Do we want to require or not require any of them?

@erikareads
Contributor Author

The type of type would be an enumeration of all the types here. Is there an effective way to encode all the Option types into an Option type?
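
One possible approach, purely as a sketch (the module is hypothetical and the type terms are the NimbleOptions-style ones): accept the literal type terms and validate them against a recursive whitelist, rather than trying to express the whole enumeration as a single schema type:

defmodule TypeCheck do
  # Hypothetical validator for the `type` field: accept known scalar type
  # atoms directly and recurse into compound types.
  @scalar_types [:any, :atom, :string, :boolean, :integer, :keyword_list]

  def validate(type) when type in @scalar_types, do: {:ok, type}

  def validate({:list, inner}) do
    with {:ok, _} <- validate(inner), do: {:ok, {:list, inner}}
  end

  def validate({:in, choices}) when is_list(choices), do: {:ok, {:in, choices}}

  def validate(other), do: {:error, "unsupported option type: #{inspect(other)}"}
end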

@erikareads
Contributor Author

erikareads commented Jun 21, 2023

Consideration: the current syntax requires (lightweight) dependency resolution.

In the current syntax, I allow entities and sections to be declared at the top level, and then referred to by their atom name:

entity do
  name :my_entity
  ...
  entities [entity_field: :my_other_entity]
end

entity do
  name :my_other_entity
  ...
end

section do
  name :my_section
  ...
  entities [:my_entity]
end

section do
  name :my_other_section
  sections [:my_section]
  ...
end

This has the benefit of allowing re-use of entities and sections, but at the cost of needing to resolve all the atom names into their respective entities and sections.
This isn't a hard problem, since a transformer has access to all of the entities and sections at once, and can therefore simply find and replace atoms with their respective values, raising an error if an entity or section is missing.

But an alternative syntax might look like:

section do
  name :my_section
  sections do
    section do
      name :my_other_section
      entities do
        entity do
          ...
        end
      end
     end
  end
end

This would avoid the resolution process, since the dependencies are explicit, at the cost of needing to define sections and entities recursively in the implementation. Further, this syntax leads to significant nesting, which hurts legibility.

@erikareads
Contributor Author

My understanding of the resolution process:

  1. Collect all entities and sections
  2. Globally replace all entity name atoms in Entities with their respective Entities
  3. Globally replace all entity name atoms in Sections with their respective Entities
  4. Globally replace all section name atoms in Sections with their respective Sections

The difficulty comes from the "Global" part of the replace (a simplified resolution sketch follows the example below).

For example:

entity do
  name :entity1
  entities [:entity2]
end

entity do
  name :entity2
  entities [:entity3]
end

entity do
  name :entity3
end

Would generate as:

[
  %Entity{name: :entity1, entities: [:entity2]},
  %Entity{name: :entity2, entities: [:entity3]},
  %Entity{name: :entity3}
]

Which would need to resolve to:

%Entity{
  name: :entity1,
  entities: [
    %Entity{
      name: :entity2,
      entities: [%Entity{name: :entity3}]
    }
  ]
}
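
A simplified sketch of that resolution, assuming entities is a flat list of name atoms as in this example (the names here are hypothetical; the real structs carry more fields):

defmodule Resolve do
  # Build a lookup by name, then recursively swap name atoms for structs.
  # Map.fetch!/2 raises if a referenced entity is missing.
  def resolve_all(entities) do
    by_name = Map.new(entities, fn entity -> {entity.name, entity} end)
    Enum.map(entities, &resolve(&1, by_name))
  end

  defp resolve(%{entities: nested} = entity, by_name) when is_list(nested) do
    resolved =
      Enum.map(nested, fn
        name when is_atom(name) -> resolve(Map.fetch!(by_name, name), by_name)
        other -> other
      end)

    %{entity | entities: resolved}
  end

  defp resolve(entity, _by_name), do: entity
end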

@erikareads
Contributor Author

When using @module_attributes, as we've done previously, Elixir has handled this substitution for us.
Here, we would need to write a replacement system ourselves to handle the nested cases.

@erikareads
Contributor Author

So I wrote one. As of the most recent commit, #48 has support for nested entities and nested sections.

I added Spark.load/2 which takes a list of section names (atoms) as its second argument, and returns only the Section structs that match. Otherwise load/1 will pull both the outer and inner section, creating ambiguity when trying to instantiate the inner section.
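
Usage would look something like this (the section name is illustrative):

# Pull only the named top-level section; its nested sections stay embedded
# in it rather than also appearing as top-level results.
sections = Spark.load(Dsl, [:my_top_level_section])

use Spark.Dsl.Extension, sections: sections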

@zachdaniel
Contributor

I think the atom resolution works, but that there may be some interesting challenges there. For example, if two entities have a nested entity w/ the same name but a different definition. That sounds solvable (like maybe we actually use “id” as a reference instead of the name? Or allow an id to be set and referred to optionally?)

@erikareads
Contributor Author

I think the atom resolution works, but that there may be some interesting challenges there. For example, if two entities have a nested entity w/ the same name but a different definition. That sounds solvable (like maybe we actually use “id” as a reference instead of the name? Or allow an id to be set and referred to optionally?)

I see what you mean. When using module attributes, we avoid this by the ability to name the module attribute anything, not just what the name of the entity or section ends up being.

id would be tricky as a field name, since we already have identifier for adding a unique identifier to each instantiation of an entity.

Perhaps alias?
We could simplify the atom resolution by using a transform on the entity Entity that sets alias to name if alias is nil.
Then we always do atom resolution based on the alias.

Thoughts?

@erikareads
Contributor Author

Just to get a sense of what that would look like:

entity do
  name :my_entity
  alias :entity1
end

entity do
  name :my_entity
  alias :entity2
end

entity do
  name :other_entity
  entities [sub_entity: :entity1]
end

section do
  name :my_section
  entities [:entity2, :other_entity]
end

Okay, alias won't work because it's a reserved word; you can tell from the syntax highlighting.
We would also want validation logic so that two entities with the same name aren't used in the same scope.
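
That validation could be a check along these lines (a hypothetical helper, not part of the PR):

defmodule ValidateNames do
  # Hypothetical check: entities used in the same scope must have unique names.
  def ensure_unique_names!(entities) do
    entities
    |> Enum.frequencies_by(& &1.name)
    |> Enum.filter(fn {_name, count} -> count > 1 end)
    |> case do
      [] -> :ok
      dupes -> raise ArgumentError, "duplicate entity names in scope: #{inspect(Keyword.keys(dupes))}"
    end
  end
end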

@erikareads
Contributor Author

Perhaps defalias?

@zachdaniel
Contributor

hmm.... dsl_alias?

@erikareads
Contributor Author

I like that better than defalias, but we're defining an alias/alternative for an entity or section, not the whole DSL.
Here are some alternatives that might work:

  • tag
  • label
  • handle

@erikareads
Contributor Author

erikareads commented Jun 22, 2023

alias feels closest to what we want, which puts dsl_alias ahead of those alternatives for me.

Alternatively: entity_alias and section_alias.
dsl_alias would have the benefit of being the same field for both sections and entities, though.

@zachdaniel
Contributor

We could also use a reference, like referenced_as :name, which defaults to its :name. I'm open to any of the options you just listed as well as referenced_as.

@erikareads
Contributor Author

Let's do referenced_as. I dismissed reference for not being specific enough, but referenced_as is exactly what we're doing with the field.

I'll move forward with that, but feel free to change your mind before we merge the feature. :)

@erikareads
Contributor Author

To make my plans concrete:

  1. Add a referenced_as field to the Entity and Section structs.
  2. Add it as a field to the DSL with type atom.
  3. Add a transform to both Entity and Section that at least sets referenced_as to the value of name if missing (sketched below).
  4. Redo the logic in the dependency resolver to use referenced_as instead of name.
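
A sketch of step 3 (the exact transform contract aside, the idea is just an ok-tuple-returning default):

defmodule ReferencedAs do
  # Hypothetical transform: default referenced_as to the struct's name when unset.
  def default(%{referenced_as: nil, name: name} = struct),
    do: {:ok, %{struct | referenced_as: name}}

  def default(struct), do: {:ok, struct}
end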

@erikareads
Contributor Author

With my latest commit #48 now supports:

  • referenced_as on both Entities and Sections
  • All public Entity fields with no type validation
  • All public Section fields with no type validation

That's assuming that docs is the only internal-use-only field on both structs.
Theoretically, as the PR stands, it should be possible to implement any Spark DSL with the new DSL.
However, that hasn't been tested.

There isn't really validation yet, but that's not significantly different than manually constructing the Entity and Section structs.

@erikareads
Contributor Author

I'm working on re-implementing the tests for Spark using the meta-DSL, and I came across this:

@step %Spark.Dsl.Entity{
      name: :step,
      target: Spark.Test.Step,
      args: [:name],
      recursive_as: :steps,
      entities: [
        steps: []
      ],
      schema: [
        name: [
          type: :atom,
          required: true
        ],
        number: [
          type: :integer
        ]
      ]
    }

I want to specifically draw attention to the entities: [steps: []] construction.
Is that a valid construction?
It seems to be necessary for the recursive_as to work, but in my implementation of the meta-DSL for entity, I assumed that entities needed to be named, specific entities.
Do I need to add the "missing" case here?

@jimsynz
Contributor

jimsynz commented Aug 7, 2023

Yeah, it is a valid construction. You're right that it's used for recursive_as. To be clear, the format of the keyword list for the entities key is [target_struct_field_to_store_them_in: [child_entity, another_child_entity]].

@erikareads
Contributor Author

I see, that's a good clarification.
It disagrees with the typespec of Entity, which needs to be updated to Keyword.t([Entity.t()]).
With that in mind, I will change entities to expect a one-to-many relationship, instead of the one-to-one relationship I had assumed.
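
With that clarified, the shape I need to validate for the entities key is roughly (the second child here is illustrative):

# Keyword list: each key is the target-struct field to store the children in,
# each value is the list of child entities (one-to-many).
entities: [
  steps: [@step, @another_step]
]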

@erikareads
Contributor Author

Question: rather than:

sections = Spark.load(Impl)

use Spark.Dsl.Extension,
  sections: sections

Is it possible to do something like this instead?

use Spark.Dsl.Extension do
  entity ...
end

Originally posted by @jimsynz in #48 (comment)


We probably can't do use Spark.Dsl.Extension that way for backward-compatibility reasons.
However, since we're introducing use Spark for the meta-DSL anyway,
there's nothing stopping us from doing:

defmodule MyNewDSL do
  use Spark

  entity ...

  section ...

  transformer ...

  patch ...
end

Since use Spark is new, there shouldn't be any backwards compatibility concerns.

I would need to investigate what would be required to provide that API.

Originally posted by @erikareads in #48 (comment)

@erikareads
Contributor Author

I think it's worth asking what a radically improved API for Spark can look like.

Right now:

defmodule MyThing.Dsl do
  @entity %Spark.Dsl.Entity{...}

  @section %Spark.Dsl.Section{...}

  @moduledoc """
  #{Spark.Dsl.Extension.doc_index([@section])}
  #{Spark.Dsl.Extension.doc([@section])}
  """

  use Spark.Dsl.Extension,
    sections: [@section],
    transformers: @transformers,
    verifiers: @verifiers
end

defmodule MyThing do
  use Spark.Dsl,
    single_extension_kinds: [:data_layer],
    many_extension_kinds: [
      :authorizers,
      :notifiers
    ],
    default_extensions: [
      extensions: [MyThing.Dsl]
    ],
    opt_schema: [
      validate_api_inclusion?: [
        type: :boolean,
        default: true
      ]
    ]
end
defmodule AnotherApp do
  use MyThing
  ...
end

with the current meta-DSL in #48:

defmodule MyThing.Dsl do
  defmodule Dsl do
    use Spark

    entity ...

    section ...
  end

  sections = Spark.load(Dsl)
  use Spark.Dsl.Extension, sections: sections
end

defmodule MyThing do
  use Spark.Dsl, default_extensions: [extensions: [MyThing.Dsl]]
end
defmodule AnotherApp do
  use MyThing
  ...
end

This version of the API works with the constraints of how the macros currently work, but we could have:

defmodule MyThing do
  use Spark

  entity ...

  section ...

  sections [:my_section]
  transformers [...]
  option_schema do
    ...
  end
end
defmodule AnotherApp do
  use MyThing
  ...
end

This seems like a better use of the use Spark namespace,
but it would require a significant amount of additional work to handle the simultaneity of compiling the declared DSL, reading that declaration, and providing a simple use macro for end users, as well as an expansion of the DSL to handle transformers, patches, and other features of Spark.

Thus, I propose moving my current work to use Spark.SectionDsl:

defmodule MyThing.Dsl do
  defmodule Dsl do
    use Spark.SectionDsl
  end
  sections = Spark.SectionDsl.load(Dsl)
  use Spark.Dsl.Extension, sections: sections
end

@joshprice
Contributor

What's it going to take to get this merged?

This meme seemed appropriate


@zachdaniel
Contributor

Really it's just a matter of it being championed, completed, and tested. I'd love for it to happen but I myself likely won't be able to do it.
