
Conan 2.0 requirements effect in package_id #36


Conversation

memsharded
Member

Conan 2.0 proposes a new package_id that takes into account the different scenarios that can happen when building and linking C and C++ native binary artifacts like executables, static libraries, shared libraries, etc.

This will be done differentiating 2 major scenarios:

  • "embed" cases, when the dependency artifact will be embedded in the consumer artifact, like when an executable links a static library. This case will default to use a full_mode policy (old recipe_revision_mode), which guarantees that every change in the dependency requires a re-build in the consumer.
  • "non embed" cases, where the dependency artifact is in general not embedded/inlined in the consumer artifact, like when a static library depends on another static library or a shared library depends on another shared library. This case will default to use a minor_mode policy, which fully allows the developer to define what needs to be done (not bumping the version or bumping the patch, means no need to rebuild the consumers, bumping the minor means consumers should be rebuilt and bumping the major means that there are breaking changes requiring the consumers to change their code to adapt).

  • Upvote 👍 or downvote 👎 to show acceptance or rejection of the proposal (other reactions will be ignored)
    • Please use 👀 to acknowledge you've read it; it doesn't affect your workflow
  • Comment and review to suggest changes to all (or part) of the proposal.

@FnGyula

FnGyula commented Jun 17, 2022

As long as both cases are default configurable, I'm 👍

@memsharded
Member Author

As long as both cases are default configurable, I'm

Sure, if you check the full text, there are 5 different core.package_id configurations (covering the embed, non-embed, unknown, build and python cases) that can be configured in global.conf.
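
For illustration, a global.conf could set all five defaults in one place. This is only a sketch: the first two keys and their default values are quoted later in this thread, while the remaining three keys and values are assumed here from the same naming pattern.

core.package_id:default_embed_mode=full_mode
core.package_id:default_non_embed_mode=minor_mode
core.package_id:default_unknown_mode=semver_mode
core.package_id:default_build_mode=None
core.package_id:default_python_mode=minor_mode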

@DoDoENT

DoDoENT commented Jun 17, 2022

I'm strongly in favor of this proposal, but I think the chosen "use-case" should also affect the visibility of the public headers, as described in this old issue.

Notably, let's have 3 static library packages Aaa, Bbb and Ccc with dependency graph Ccc -> Bbb -> Aaa.

Let's also say that public headers of Bbb include and use public headers of Aaa. This means that if some package Ddd, which depends on Ccc, overrides the version of Aaa to a new, breaking version, Conan needs to rebuild both Bbb and Ccc, as implementation files (.cpp files) of Ccc may have inlined code from public headers of Aaa, even though they never directly included them: the include was transitive via public headers of Bbb.

My suggestion in the mentioned issue was to also let Conan generators be aware of that, and prevent Ccc from even seeing headers from Aaa unless Bbb declares that it uses Aaa as in the "embed use-case". Even though the "embed use-case" is meant mostly for applications and dynamic/shared libraries, I would argue that it should also be used for header-only and static libraries whose public headers are used/"embedded" by the public headers of the consuming library.

So, by using syntax from the proposal, in the example above, the recipe of Bbb would state something like this:

from conan import ConanFile

class BBBConan(ConanFile):

    def requirements(self):
        # package_id_mode here defines how the Aaa dependency affects Bbb's package_id
        self.requires('Aaa/1.0.0@user/channel', package_id_mode='minor_mode')

Without stating that, Ccc (the user of Bbb) would not get the header search path for Aaa, thus making the compilation of Ccc fail if Bbb mistakenly uses public headers of Aaa in its public headers.

@a4z

a4z commented Jun 17, 2022

Is there / will there be a default mode if you set/select nothing?

(I hope the answer is yes, and full_mode is the default :-)

@memsharded
Member Author

@DoDoENT

Without stating that, Ccc (the user of Bbb) would not get the header search path for Aaa, thus making the compilation of Ccc fail if Bbb mistakenly uses public headers of Aaa in its public headers.

I am not sure I follow. This was already approved in https://github.com/conan-io/tribe/blob/main/design/026-requirements_traits.md, and it has been implemented. Now in Conan 2.0 (released alpha), the ccc package does not have visibility over the aaa headers unless bbb explicitly defines it with the appropriate trait: self.requires("aaa/1.0", transitive_headers=True).
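
For reference, a minimal sketch of what that looks like in bbb's recipe (the package names follow the lowercase example in this thread; the rest is standard Conan 2.0 recipe boilerplate):

from conan import ConanFile

class BbbConan(ConanFile):
    name = "bbb"
    version = "1.0"

    def requirements(self):
        # Without transitive_headers=True, consumers of bbb (like ccc) do not
        # get aaa's include directories propagated to them.
        self.requires("aaa/1.0", transitive_headers=True)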

@memsharded
Member Author

@a4z

Yes, it is in the text of the proposal:

  • core.package_id:default_non_embed_mode, default=minor_mode
  • core.package_id:default_embed_mode, default=full_mode

The default for the non-embed mode is selected to be minor_mode, for several reasons:

  • It allows user control, based on versioning for the non-embed case
  • It allows ConanCenter to be operational; otherwise, every single change to any recipe in ConanCenter would render all the dependent recipes' binaries obsolete, requiring everything to be rebuilt all the time, and that is not viable. ConanCenter running in 1.X with "semver_direct_mode" for all cases has proved that it is not strictly necessary to always rebuild.
  • Users that want to change it just need 1 line in their global.conf

Values that these configurations can take (given the full version is ``1.2.3``):
- ``full_mode``: including ``dep/1.2.3@user/channel#recipe_revision:package_id``
- ``major_mode``: include ``dep/1.Y.Z@user/channel``
- ``minor_mode``: include ``dep/1.2.Z@user/channel``

I think minor_mode should not be affected by the user/channel combination. This is currently the case, but I think it's confusing. Notably, if you switch opencv/4.6.0 with opencv/4.6.0@company/stable, your packages should not get rebuilt: it's still the same version of OpenCV, just a binary from a different provider (possibly built with different flags, but the API and ABI should remain the same).

I still think we need to have a mode that "locks" to the user/channel combination or maybe user only, but in general that should be a more specific case for specific company rules.

I'm aware that something like this can be done at the individual package level or even shared via python_requires, but, at least from the documentation, I'm not sure that this is as flexible as the built-in package modes (i.e. how would you define a custom "semantic version-like" mode that behaves just like the built-in one, but ignores the user/channel combination?).

Maybe I'm just misunderstanding something?

Member Author


We explicitly added this recently after discussing it. We considered a few things:

  • Replacement (override) of an upstream dependency based on user/channel is something that shouldn't be common, but an exceptional thing. If an opencv/4.6.0@company/stable dependency is to be used, the recommended approach is to make it explicit in recipes with requires = "opencv/4.6.0@company/stable".
  • The idea, once packages are considered "immutable" and the conan copy command has been removed in 2.0, is that something under a different user/channel can be a completely different thing, not something to be transparently swapped. Conan 2.0 even enables pkg/version@user, to drop only the channel.
  • Once we consider packages under a different user/channel to be a completely different thing, we thought that the least that should happen when that changes is to force, by default, a rebuild of the consumers.

Adding new modes that discard the user/channel part is definitely possible; let's see what others think about this default.


Adding new modes that discard the user/channel part is definitely possible; let's see what others think about this default.

In our company, we use snapshot/unstable/stable as the channel depending on the maturity of the package. Having a mode that only considers pkg/version@user would be quite useful for us to avoid unnecessary rebuilds.

@DoDoENT

DoDoENT commented Jun 17, 2022

I am not sure I follow. This was already approved in https://github.com/conan-io/tribe/blob/main/design/026-requirements_traits.md, and it has been implemented. Now in Conan 2.0 (released alpha), the ccc package does not have visibility over the aaa headers unless bbb explicitly defines it with the appropriate trait: self.requires("aaa/1.0", transitive_headers=True).

@memsharded , I'm aware of that. I'm just saying that that proposal should come hand-in-hand with this proposal here.

Notably, if you write self.requires('aaa/1.0', transitive_headers=True), its package_id_mode should default to minor_mode (or whatever is chosen for the "embedded use-case"). Of course, it would still be possible to write self.requires('aaa/1.0', transitive_headers=True, package_id_mode='semver_mode') when you know what you are doing, but at least it would not be the default.

My guess is that, by default, when you opt-in for transitive_headers, you automatically get into the "embed use-case" scenario.

I could be wrong, though.

@a4z

a4z commented Jun 17, 2022

But how is the selection between embed and non-embed mode done?


Conan 2.0 proposes a new default ``package_id`` that will be dynamic, based on two main cases:

- When an application is linking a static library, or a shared library is linking a static library, or any library or application is linking a header-only library as an implementation detail, the resulting artifacts will be fully embedding/inlining the dependency artifacts, and they require in the general case to be re-built from sources when something in the dependency changes. This will be the “embed” case, and by default will use a “full_mode” policy, which means that the full dependency reference ``pkg/version@user/channel#recipe_revision:package_id`` will be used and factored in the consumer ``package_id``. Recall that ``package_revision`` can no longer be part of the ``package_id`` as removed in proposal https://github.com/conan-io/tribe/pull/30

@puetzk Jun 17, 2022


Suggested change
- When an application is linking a static library, or a shared library is linking a static library, or any library or application is linking a header-only library as an implementation detail, the resulting artifacts will be fully embedding/inlining the dependency artifacts, and they require in the general case to be re-built from sources when something in the dependency changes. This will be the “embed” case, and by default will use a “full_mode” policy, which means that the full dependency reference ``pkg/version@user/channel#recipe_revision:package_id`` will be used and factored in the consumer ``package_id``. Recall that ``package_revision`` can no longer be part of the ``package_id`` as removed in proposal https://github.com/conan-io/tribe/pull/30
- When an application is linking a static library, or a shared library is linking a static library, or any compiled library or application is linking a header-only library as an implementation detail, the resulting artifacts will be fully embedding/inlining the dependency artifacts, and they require in the general case to be re-built from sources when something in the dependency changes. This will be the “embed” case, and by default will use a “full_mode” policy, which means that the full dependency reference ``pkg/version@user/channel#recipe_revision:package_id`` will be used and factored in the consumer ``package_id``. Recall that ``package_revision`` can no longer be part of the ``package_id`` as removed in proposal https://github.com/conan-io/tribe/pull/30

One header-only library should be allowed to depend on another header-only library without triggering full_mode.

The next paragraph acknowledges that with "or a header-only library is depending on another library" but as written the two rules conflict.

Member Author


You are right; better to clarify that this is for compiled libraries.

@memsharded
Member Author

@DoDoENT

My guess is that, by default, when you opt-in for transitive_headers, you automatically get into the "embed use-case" scenario.

Not really. If you have both bbb and aaa as static libraries and bbb requires aaa with transitive_headers=True, that doesn't imply that bbb will be inlining/embedding the aaa artifacts. In fact, it could mean the opposite: that aaa is less of an implementation detail of bbb, being just transitive over the public headers but not used in the .cpp files at all.

And for ccc, the consumer of bbb, it depends on the type of consumer. If it is an application, it will certainly use the embed case over both bbb and aaa.

Also, most important: the non-embed case doesn't imply that nothing will ever be embedded. It implies that changes are not always embedded, and the user has the control to specify it. Bumping the patch (or not bumping at all) means that only .cpp files were modified, so it doesn't matter that headers could be embedded in consumers. If a change is made to a public header, users can bump the minor version to indicate it, and that will signal that something that might be embedded by the consumers has changed, and as such, the consumers need to be rebuilt.

@puetzk

puetzk commented Jun 17, 2022

My other concern also relates to the combinations of embed vs transitive_headers requirements, plus compatible_packages. I find it quite common that a shared library has a lot of implementation details which are embedded inside of it (via static or header libraries) but are not exposed in its public API. Today that's not much of a problem; such requirements just capture the major version and that rarely changes. This of course means the package_id is quite inadequate to achieve reproducibility, which is presumably what this proposal is aimed at - it will make package_ids much more specific than they currently are.

I'd like to view that as a good thing, because in terms of reproducibility it definitely is. But as-is, exposing all these implementation details is either going to cause a lot of proliferation of package binaries, or (more likely) it's just going to result in recipes routinely defeating the new mechanism by deleting hidden+embedded requirements from their package_id to restore the status quo.

This feels like the use case compatible_packages was intended for. But it will be very difficult for a recipe to list every possible permutation of full_mode package_ids for all of its requirements, and outright impossible for it to cover future versions of those requirements produced after the recipe was uploaded.

What I think would work well is if the package_id had all the details for reproducibility, but the recipe could use self.compatible_packages.append to offer additional package_ids where some of these requirements (that are private, not subject to one-definition-rule, and thus interchangeable) were deleted. The problem is that currently compatible_packages only operates in the consumer, and so no "reduced" package_id like this would ever match anything in the cache or server. What could really be neat is if compatible_packages offered by the recipe while producing a package were also recorded, allowing the consumer<->producer roles (of the same recipe) to try and meet in the middle at some reduced package_id that they agree constitutes a compatible substitution. This would likely require the binary cache to have some kind of alias package concept (like recipes already do).

Another important detail is that some requirements might frequently not even be resolvable in the consumer's profile - e.g. a mingw-gcc or C# executable which requires a shared library built in msvc, which in turn requires some other static/header-only library that doesn't support gcc (or vice-versa). But if the interface is suitably tight, the compatible_packages might meet in the middle at "os=Windows, arch=...". This is the same kind of question that comes up all the time for tool_requires, where today's solution is generally to produce a very minimized package_id (that is thus not at all sufficient for reproducibility of that tool package).

But I think compatible_packages can already handle that pretty well - it's OK to have hypothetical options in the compatible_packages list that come out as "INVALID"; you just skip to the next option and see if it works. And presumably a requirement whose validate() failed just makes that requirement's package_id "INVALID", and full_mode propagates this, but nothing fails the whole graph. So if any of the compatible_packages options delete that requirement (or reduce it to minor_mode, or whatever), that should allow the overall package_id for this alternative to be valid again.

@KerstinKeller

In general, I fully support this proposal. It gives better control over package ID computation of dependencies.

However, it still doesn't address a use case which I find more important (or did I overlook something?):
Why can't a package define its own mode?

At the moment there is a global definition, and consumers can override the global defaults. This proposal keeps that the same, just with finer granularity.
Even with this approach, I will still have to modify each and every package which uses library X, if package X might make changes that require full_version_mode.

Why can't a package require "minimum" compatibility for itself?
E.g. each library knows best what versioning scheme it uses, e.g. semver, ...
So each package should be able to say: I don't follow any versioning, please always use full version mode. Or: I use semver, major mode is fine (though, can Conan model the case of only using newer artifacts at runtime?).

The integrator can then globally specify whether they use the package's default, or something "stricter".

Then, if a package could specify its minimum requirement type, there would not be so much need for its consumers to specify it at all.


Conan will be using the ``package_type`` information to define which of the above scenarios apply.

Conan 2.0 will allow ``tool_requires`` (or any other ``requires`` with trait ``build=True``) to affect the ``package_id`` of their consumers. By default the effect will be ``None``, but users will be able to define it, globally via ``global.conf`` configuration, or per recipe in the ``requires(...., build=True, package_id_mode=xxxx)`` trait (where ``xxxx`` is any valid mode like ``minor_mode``, ``patch_mode``, ``full_mode``, etc.)

I am still confused. Does it mean that conan-io/conan#7876 will be solved by moving the NDK dependency from a build_requires to a tool_requires section?

And if I add that, for example in the profile, does this ensure all packages I use come from the same NDK build?

And does the global setting win over the recipe setting, or vice versa?

Member Author


tool_requires is mostly an alias for build_requires, to better illustrate that it is something to be used for "tools" and not for regular libraries (a typical abuse in Conan 1.X).

Yes, conan-io/conan#7876 will be solved by this, by different means:

  • Specifying it directly in the recipe: self.tool_requires("ndk/version", package_id_mode="minor_mode")
  • Specifying the general policy for tool_requires in global.conf: core.package_id:default_build_mode=minor_mode. This approach affects all the packages that are built with a tool_requires, even if it comes defined in a profile.

At the moment the global.conf setting is explicitly called a "default", so it will have less priority than the recipe ones. It seems that when a recipe decides to make the modes of its dependencies explicit, it is very aware of what it is doing, and that should be respected in general. If there is a case for a global configuration that also overrides the recipe-defined ones, that might be doable.
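
To make the first option concrete, here is a minimal recipe sketch; the consumer name and the ndk reference are illustrative placeholders, and the tool_requires trait is the one quoted above:

from conan import ConanFile

class ConsumerConan(ConanFile):
    name = "consumer"
    version = "1.0"

    def build_requirements(self):
        # Let the NDK tool_requires affect this package's package_id: a new
        # NDK minor version will then produce a different consumer binary id.
        self.tool_requires("ndk/25.1", package_id_mode="minor_mode")

The equivalent global policy would be the single global.conf line core.package_id:default_build_mode=minor_mode mentioned above.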


Thank you for your patience in explaining the details to me!


This proposal is about the effect that the dependencies have on the current package ``package_id``, but not about other factors, such as the package settings, options, configuration, or other types of binary compatibility resulting from factors other than dependencies.

Conan 2.0 proposes a new default ``package_id`` that will be dynamic, based on two main cases:


I'm thinking about an alternative wording to emphasize how the dependency affects the binary, something like:

... based on whether the dependency affects binaries in the package, or is merely a dependency which needs to be carried to a further stage.

Examples of the former are:

  1. Header-only libraries with inline or template code used in the package.
  2. A static library linked into a shared library or an application.

In these cases the dependency directly affects the resulting binary, and potentially any change in the dependency may produce a different binary. Therefore "full_mode" is a reasonable default in this case.

However, if the dependency is just carried to the next stage, for example:

  1. An application or a shared library linked to another shared library. Then the binary is affected only after the dynamic linker has loaded everything at runtime.
  2. A static library depending on another static library. Then the binary is affected only when the linker produces a shared library or an application.
  3. A header-only library depending on another library. Then the binary is affected only when the compiler instantiates the code.

In this case it is possible that the binary from the upstream package does not affect the binary of its direct downstream dependent. And by default Conan will use a "minor_mode" policy, which means requiring a rebuild if the major or minor version of the dependency changes, but not if only the patch changes.


Conan 2.0 will allow ``tool_requires`` (or any other ``requires`` with trait ``build=True``) to affect the ``package_id`` of their consumers. By default the effect will be ``None``, but users will be able to define it, globally via ``global.conf`` configuration, or per recipe in the ``requires(...., build=True, package_id_mode=xxxx)`` trait (where ``xxxx`` is any valid mode like ``minor_mode``, ``patch_mode``, ``full_mode``, etc.)

Conan 2.0 ``package_id`` will not depend by default on the dependencies' ``options``. Their influence should be represented in most cases by the scenarios above, with the ``full_mode`` factoring the dependency ``package_id`` into the consumer ``package_id``.


I'm thinking about one case (which could be presented as an example): when an option affects compile-time definitions which result in a different ABI. For example (I may be wrong, but still), in ASIO one can choose between the io_uring implementation or the epoll_reactor, which in the end will provide different type aliases, therefore making the consumer incompatible with one built with different options (even if the consumer is a static library).

I think another case, when options affect dependencies like with_xxx, is covered indirectly by always requiring the whole graph, so flipping such a flag will produce another package_id, but only because the graph has changed.

(Now I started wondering about the whole of boost or openssl...)

Otherwise I think not requiring options by default is a good change - I think I filed 2 or 3 issues myself when options resulted in some problems.

Conan 2.0 will implement a new ``package_id`` default strategy which is dynamic, and can take different values for the versions of the dependencies, depending on the context and usage. The three main cases are:

- "embed" case: this mode will be used whenever some binary is "inlining" or "embedding" native binary code or resources of its dependency in its own artifacts. This happens when a shared library links a static library, when an executable links a static library, or when a translation unit of a library includes a header file from another library and inlines it. This mode typically means that when the dependency changes, the consumer has to be rebuilt to embed the changed dependency in its binary.
- "non embed" case: this mode happens when the resources and artifacts from the dependency are not inlined nor embedded in the current artifacts. An executable using a shared library or a static library "linking" (it actually doesn’t link, but this terminology can be used in build systems, to express a link time dependency) another static library. This mode means that in many situations when the interfaces of the dependency don't change, the binaries of the consumer don't need to be rebuilt when the binaries of the dependency change. If the interfaces of the dependency change, then bumping its version patch, minor, major, can define to its consumers if they need to rebuild (bumping the minor) or not (bumping the patch), or if they should expect breaking changes (bumping the major).


static library "linking"

I tried to express this in my first comment, but was afraid of using the word "linking". Maybe it's a good idea to give some vocabulary/definitions at the beginning: what "linking" means, and the different kinds of dependency nesting. We are talking about at least 3 levels now: the package in question (for which the package_id is computed) and its up- and downstream dependencies. Maybe it's also a good idea to define "package" (for example, as the binary files produced by building the Conan package) or just "binaries".

@memsharded
Member Author

@KerstinKeller

We already have a PR with a preliminary PoC for what you are describing: conan-io/conan#11441. It seems doable, but we think it is important to first outline the basic behaviors, defining in which situations we are in embed or non-embed mode, etc. That is because conan-io/conan#11441 implies there should probably be 2 or more definitions in the recipes for the consumers, at least the "embed" and the "non-embed" one. Even if a package doesn't follow semver, the case where it is compiled as a static library and consumed by another static library might be totally different from the one where that static library is consumed by an application. If you want the two cases to be the same, it is easy, just assign both the same policy, but in general it seems that defining the policies for the different scenarios should be part of this feature.

@mmatrosov

Sorry for the late reply.

The configuration for defaults is global, and managed in global.conf. The idea is that changing these defaults should be a team- or company-wide policy, defined for most projects and teams.

Conan's global config has always been frustrating to me. What if I am a part of several "teams or companies"? My team at my job, other teams at my job, several open source projects, my own setup, educational setups?

The idea of any global state influencing the build result smells a lot. Is it totally necessary? Can we get rid of it?

@memsharded
Member Author

Hi @mmatrosov

Conan's global config has always been frustrating to me. What if I am a part of several "teams or companies"? My team at my job, other teams at my job, several open source projects, my own setup, educational setups?

The idea of any global state influencing the build result smells a lot. Is it totally necessary? Can we get rid of it?

How can we get rid of it? This is the reality we see, and you just described it. For example, companies will use custom settings, and they will use those custom settings in their recipes to affect their builds. They will have very different profiles, and most likely very different hooks, and surely completely different remote repositories. The packages they build from those recipes with those profiles will not work without all of that. And as you have suggested, the same happens for other setups. It is not just the configuration: everything for a setup is connected.

The recommended way to deal with that is to use a different Conan home for each different setup. That home will contain the settings, the profiles, the remotes, the hooks, for that specific setup, and of course the configuration for that specific setup.

@a4z

a4z commented Jun 29, 2022

Conan's global config has always been frustrating to me. What if I am a part of several "teams or companies"? My team at my job, other teams at my job, several open source projects, my own setup, educational setups?

The idea of any global state influencing the build result smells a lot. Is it totally necessary? Can we get rid of it?

Sorry for being OT, but it might be helpful in this context

I simply use a different CONAN_USER_HOME for different team/project setups:
https://docs.conan.io/en/latest/reference/env_vars.html?highlight=conan_user_home#conan-user-home
so there is no global state for me, since I never use ~/.conan

It is often set automatically via direnv, so just by changing into a directory I have the team/project environment, and only the download cache, which is also set that way, is shared.
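
A minimal sketch of that kind of setup, assuming a per-project home directory next to the project (the path and file layout are illustrative; the shared download cache configuration is left out):

# .envrc picked up by direnv when entering the project directory
export CONAN_USER_HOME="$PWD/.conan-home"   # Conan 1.x variable; CONAN_HOME in Conan 2.0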

@mmatrosov

mmatrosov commented Jun 30, 2022

Thanks for letting me know about the Conan home. I completely forgot about this feature. I have lived with ~/.conan my whole life and did not have many problems with it. But I have a feeling that the situation might change with the core.package_id:default_* modes.

Let us consider two global settings mentioned above: settings and remotes. We indeed have a custom settings.yaml file, but we append to it; we do not remove stuff. Hence, every recipe from CCI behaves the same way as in the CCI CI. We also indeed have our own remotes. But again, we do not remove conan-center, we simply add our Artifactory host first. Furthermore, if you assume all remotes contain properly built binaries, it does not matter which remote you choose for the build, from the result's perspective. Again, every recipe from CCI behaves the same way as in the CCI CI.

But what about the core.package_id:default_* modes? It seems like if you build the same dependency graph on two hosts with different default modes, then update some dependencies and build again, you can end up with completely different binaries. Is this correct, or am I missing something?

@foundry-markf

Late to the party on this, but an observation.

As much as I appreciate the correctness of identifying downstream packages as out-of-date when making a change to 'my package', to me this suggests that making small changes in common packages could cause a cascade of many package rebuilds in CI systems?

Of course, configuring this to suit your level of 'correctness' and 'comfort' is probably the solution to this.

I'm just wondering if any testing in CI on this proposal has revealed a significant cost overhead in the most correct mode? If I make a minor change, how much effort does it take to get an entire dependency tree back into sync by CI systems?

For example, the library we often speculate about here is zlib, and what needs changing if we need to patch it, as it's often at the very bottom of the pile of dependencies. Certainly, I can see that all of the downstream packages that link static zlib into a shared library would be marked out-of-date, and then there is wherever that cascades to from there.

Is this a reasonable concern?

@memsharded
Member Author

The feature conan-io/conan#11441, which allows a package to define the package_id_embed_mode, package_id_non_embed_mode and package_id_unknown_mode that its consumers will use for it, has been merged, and it will be available in Conan 2.0-beta.2.

This definition will have priority over the conf definition of the default package_id modes, which, as the name implies, are only "defaults", and only effective when nothing else defines that value.
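
A minimal sketch of what that recipe-side definition could look like, based on the attribute names above (the package name and the chosen mode values are illustrative):

from conan import ConanFile

class MyLibConan(ConanFile):
    name = "mylib"
    version = "1.2.3"

    # How consumers factor this package into their own package_id,
    # taking priority over the core.package_id:default_* conf values.
    package_id_embed_mode = "full_mode"       # consumers that embed mylib (e.g. static linking)
    package_id_non_embed_mode = "patch_mode"  # consumers that only depend on it non-embedded
    package_id_unknown_mode = "semver_mode"   # when the relationship cannot be deduced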

@memsharded
Member Author

Regarding the definition of CONAN_USER_HOME (CONAN_HOME in 2.0): a new feature has been merged in Conan 2.0, a .conanrc file that defines such a variable, making it possible to define the location of the Conan home differently and automatically for different projects.
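
As a sketch, a .conanrc placed at the root of a project might contain a single line like the following (the path is illustrative, and the exact file format is assumed from the merged feature rather than quoted here):

conan_home=./.conan2-home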

@a4z

a4z commented Jul 26, 2022

Regarding the definition of CONAN_USER_HOME (CONAN_HOME in 2.0): a new feature has been merged in Conan 2.0, a .conanrc file that defines such a variable, making it possible to define the location of the Conan home differently and automatically for different projects.

awesome!

@memsharded
Member Author

For example, the library we often speculate about here is zlib, and what needs changing if we need to patch it, as it's often at the very bottom of the pile of dependencies. Certainly, I can see that all of the downstream packages that link static zlib into a shared library would be marked out-of-date, and then there is wherever that cascades to from there.

@foundry-markf
In our experience, many companies and teams have been opting in for the strictest modes, including package_revision_mode, requiring tons of rebuilds (some of them probably unnecessary). So apparently, there is a larger concern about correctness than about resources or speed.

The approach in this proposal is accordingly more on the correctness side, but not extremely so, leaving space for user control over what and when needs to be rebuilt using versioning schemas. It is a matter of probabilities: if a shared library is linking a static library, the likelihood of embedding it is very high, so recipe_revision_mode will be used, meaning that even a cosmetic change to a recipe, like a comment or docstring, will trigger the rebuild of the consumer shared libraries. This is up to the user: if they have the discipline and tooling to do the correct version bumps when something in the code changes, then it can be relaxed to patch_mode for example, but as the proposed default we chose something that is closer to what we see in practice from users.

For static libraries, we also need to find some balance that works as a default for ConanCenter. With now +1300 packages, built for +100 configurations, it is not possible to rebuild everything every time one library recipe changes one comment. So we went for something stricter than the current default, and chose minor_mode. This can work quite well in practice for environments like ConanCenter, in which third-party packages are being packaged, as the source code of the package doesn't actually change when you create new revisions. If you apply this default to your own libraries, it can work if the developers have enough discipline to bump the patch/minor versions correctly when necessary, but otherwise you might need to go stricter too.

@memsharded
Member Author

Closing as outdated. Still, this was super useful, thanks again for all the feedback.

@memsharded closed this Jul 11, 2024