This section gathers the most common questions from the community related to packages and usability of this repository.

- What is the policy on recipe name collisions?
- What is the policy on creating packages from pre-compiled binaries?
- Should reference names use `-` or `_`?
- Why are CMake find/config files and pkg-config files not packaged?
- Should recipes export a recipe's license?
- Why don't recipes that use build tools (like CMake) available in Conan Center use them as a build require by default?
- How are rare build systems without generators packaged?
- Are python requires allowed in the `conan-center-index`?
- What version should packages use for libraries without official releases?
- Is the Jenkins orchestration library publicly available?
- Why not x86 binaries?
- Do static libraries tend to be compiled as PIC by default?
- Why are PDB files not allowed?
- Can I remove an option from a recipe?
- Can I split a project into an installer and library package?
- What license should I use for Public Domain?
- What license should I use for a custom project-specific license?
- How do I flag a problem to a recipe consumer?
- Why is a `build.check_min_cppstd` call not enough?
- What is the policy for adding older versions of a package?
- What is the policy for removing older versions of a package?
- Can I install packages from the system package manager?
- Why does ConanCenter not build and execute tests in recipes?
- Why not add an option to build unit tests?
- What is the policy for supported python versions?
- How to package libraries that depend on proprietary closed-source libraries?
- How to protect my project from breaking changes in recipes?
- What's the policy on version ranges?
- How to consume a graph of shared libraries?
- How to watch only specific recipes?
- Is it possible to disable Pylint?
- How long can I be inactive before being removed from the authorized users list?
- Can we add packages which are parts of bigger projects like Boost?
- Can I add options that do not affect `package_id` or the package contents?
- Can I use full_package_mode for a requirement in my recipe?
Packages generated by the build service and uploaded to ConanCenter follow the `<name>/<version>` structure for the reference. Although the ecosystem of C/C++ open-source libraries is not as big as in other languages, there is still a risk of name collisions between packages.
This repository tries to follow the most well-known name for each contributed recipe, paying attention to all contributions and checking for collisions with other popular libraries beforehand. When disambiguation is needed (because different libraries share the same name), we look at other sources and seek a consensus.
However, if that is not possible and a new recipe produces a name collision, the first recipe contributed takes precedence. In general, recipes contributed to the repo won't change their names, in order not to break users.
For example, `GSL` is the name of the Guidelines Support Library from Microsoft and of the GNU Scientific Library from GNU. Both libraries are commonly known as `gsl`; however, to disambiguate (if there is already a `gsl` package in this repo) we could use `ms-gsl` in the first case and `gnu-gsl` in the second.
The policy is that in the general case recipes should build packages from sources, because of reproducibility and security concerns. The implication is that the sources must be publicly available, and in a format that can be consumed programmatically.
See Picking Sources for more information.
Recipes should stick to the original name of a library as much as possible. For example, `libjpeg-turbo`, `expected-lite` and `optional-lite` have a `-` in their original names.
In the case of spaces in the name, the most common approach is to use `_`, as done in `xz_utils`.
For libraries with a too generic name, like `variant`, the name of the organization can be used as a prefix separated by a `-`, as in `mpark-variant`, `tl-expected` or `taocpp-tuple`.
We know that using `find_package()` and relying on CMake's behavior to find the dependencies is something that should be avoided in favor of the information provided by the package manager.
Conan has an abstraction over the packages build system and description by using generators. Those generators translate the information of the dependency graph and create a suitable file that can be consumed by your build system.
In the past, we have found that the logic of some of the CMake's find/config or pkg-config files can lead to broken scenarios due to issues with:
- Transitive dependencies: The find logic of CMake can lead to link libraries with system libraries instead of the ones specified in the conanfile.
- Different build type configurations: Usually those files are not prepared to handle multi-configuration development while switching between release/debug build types for example.
- Absolute paths: Usually, those files include absolute paths that would make the package broken when shared and consumed.
- Hardcoded versions of dependencies as well as build options that make overriding dependencies from the consumer not possible.
We believe that the package manager should be the one responsible for handling this information in order to achieve deterministic and controlled behavior. Regarding the integration with CMake, Conan already provides ways to consume those packages in the same way by using generators like `cmake_find_package*` or `cmake_find_package_multi`, and features like components, to define internal libraries of a package and generate proper CMake targets, or `build_modules`, to package build-system utilities like CMake macros.
Defining the package information in the recipe is also useful for consuming those packages from a different build system, for example using pkg-config with the `pkg_config` generator.
Finally, by not allowing these files we make packages agnostic to the consumer as the logic of those files is not in the package but in the way the consumer wants the information.
If you really think this is an issue and there is something missing to cover the use case of a library you want to contribute to ConanCenter, please do not hesitate to open an issue and we will be happy to hear your feedback.
* Take a look at the integrations section to learn more: https://docs.conan.io/1/integrations/build_system/cmake/cmake_find_package_generator.html
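As an illustration of describing a package instead of shipping its config files, a recipe's `package_info()` can declare components and properties that Conan generators turn into CMake targets and pkg-config files. This is only a sketch: the `LibFoo` library, its `core` component and the `foo_core` library name below are hypothetical, and in a real recipe this method belongs to a `ConanFile` subclass.

```python
def package_info(self):
    # Hypothetical "libfoo" package: generators such as CMakeDeps or
    # PkgConfigDeps recreate the find/config and .pc files from this
    # description, so the package does not need to ship them itself.
    self.cpp_info.set_property("cmake_file_name", "LibFoo")
    self.cpp_info.components["core"].libs = ["foo_core"]
    self.cpp_info.components["core"].set_property("cmake_target_name", "LibFoo::core")
    self.cpp_info.components["core"].set_property("pkg_config_name", "libfoo-core")
```

A consumer would then link against the `LibFoo::core` target generated by Conan rather than a file installed by the library itself.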
No, recipes do not need to export a recipe license. Recipes and all files contributed to this repository are licensed under the license in the root of the repository. Using any recipe from this repository or directly from conan-center implies the same licensing.
Why don't recipes that use build tools (like CMake) available in Conan Center use them as a build require by default?
We generally consider tools like CMake to be standard tools to have installed on your system. Having the `cmake` package as a build requirement in every recipe that uses it would be overkill, as every build requirement is installed like a requirement and takes time to download. However, `cmake` can still be useful in your profile:

```
[tool_requires]
cmake/3.17.2
```
Other packages using more unusual build tools should refer to the Dependencies - Adding Build Requirements section for more information.
The C++ ecosystem has a lot of rare, unique and obscure build systems. Some of these are available in ConanCenter but they do not have built-in generators from the main Conan client.
The recipe is expected to encode the specifics of the build system, mapping the `settings` and `options` for the binary configuration, and also mapping `self.dependencies` so the build system can locate the dependency libraries as required.
For these cases, contributors are asked to help reviewers as much as possible as it's likely we will not have expertise.
TODO: Add a link to docs.conan.io which explains how to write a custom generator in the 2.0 sense
For quality assurance the build service is expected to be green and the hooks will ensure the package contents match what is expected given the options. These recipes are more likely to have inconsistency with other recipes but make for excellent contributions.
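As a sketch of what such a recipe might do, a `build()` method can translate Conan's model into the tool's own command line. Everything tool-specific here is hypothetical: the `mk` tool, its `--mode`, `--dynamic` and `-I` flags, and the exact mapping are invented for illustration.

```python
def build(self):
    # Hypothetical obscure build tool "mk" with no built-in Conan generator:
    # map settings/options to its flags and point it at each dependency's
    # include directories taken from self.dependencies.
    args = ["--mode=debug" if self.settings.build_type == "Debug" else "--mode=release"]
    if self.options.shared:
        args.append("--dynamic")
    for dep in self.dependencies.values():
        for includedir in dep.cpp_info.includedirs:
            args.append(f"-I{includedir}")
    self.run("mk " + " ".join(args))
```

The important part is that the recipe, not the consumer, owns this mapping, so consumers see a normal Conan package regardless of the underlying build system.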
Unless they are a general and extended utility in recipes (in which case we should study their inclusion in the Conan tools module), python requires are not allowed in the `conan-center-index` repository.
This happens for a number of reasons: some projects "live on main", while others are less maintained but still merge pull requests. Read about the ConanCenter specific version format for more information.
Currently, the Jenkins orchestration library for this build service is not available. We believe this solution is too specific for this purpose, as we are massively building binaries for many configurations, while the main purpose of a CI system with Conan in an organization should be to rebuild only the needed packages. However, we know this could be interesting for organizations in order to learn new approaches for CI flows. We will release this information and CI-flow recommendations as soon as possible.
As described in the Supported platforms and configurations, only the x86_64 architecture is available for download, the rest must be built from sources. The reasons behind this decision are:
- Few users need different pre-built packages that are not x86_64 packages, this number is less than 10% of total users (data obtained through the download counter from Bintray), and tends to decrease over the years;
- Some operating systems are treating x86 as obsolete, for example macOS and Ubuntu 20.04;
- For security reasons, most companies build their own packages from sources, even if they already have a pre-built version available, which further reduces the need for extra configurations;
- Each recipe results in around 130 packages, and this is only for x86_64; not all packages are used, and some configurations see zero downloads throughout their life. So, imagine adding more settings that will rarely be used but that consume more resources, such as time and storage; this leaves us in an impractical situation.
As stated earlier, any increase in the number of configurations will result in an impractical scenario. In addition, more validations require more review time for a recipe, which would increase the time for all PRs, delaying the release of a new package. For these reasons, x86 is not validated by the CCI.
We often receive new fixes and improvements to the recipes already available for x86_64, including help for other architectures like x86 and ARM. In addition, we also receive new cases of bugs, for recipes that do not work on a certain platform, but that are necessary for use, which is important to understand where we should put more effort. So we believe that the best way to maintain and add support for other architectures is through the community.
Yes! You can learn more about default options in Packaging Policy.
The project initially decided not to support PDB files primarily due to the size of the final package, which could grow considerably for files that may not even be used by users. In addition, PDB files need the source code to perform debugging, and they even follow the path where they were created rather than the one used by the user, which makes them difficult to use compared to the regular development flow with an IDE.
However, there are ways to work around this. One of them is the `/Z7` compilation flag, which can be passed through environment variables; you can use your profile to customize your compiler's command line.
Adding one more common option may seem the simplest and most obvious solution, but it carries a side effect already seen with `fPIC`: it has to be managed across the entire recipe and becomes boilerplate. So, adding a PDB option would be one more point to review in each recipe. In addition, new options could arise in the future, such as sanitizers or benchmarks, further inflating the recipes. For this reason, a new option will not be added. However, the inclusion of PDB files is discussed in issue #1982 and there are some ideas for making this possible through a new feature. If you want to comment on the subject, please visit that issue.
No. The PDBs are only needed to debug dependency code. By providing the libraries you are able to link and build your application and debug your own code. This is by far the more common scenario which we want to enable.
It's preferable to leave all options (ie. not removing them) because it may break other packages which require those deleted options. Prefer the deprecation path with a mapping from old options to new ones:
- Add "deprecated" as option value
- Set "deprecated" as default option
- Check the option value, if the value is different from "deprecated", raise a warning
- Remove the option from Package ID
```python
options = {"foobar": [True, False, "deprecated"]}
default_options = {"foobar": "deprecated"}

def configure(self):
    if self.options.foobar != "deprecated":
        self.output.warning("foobar option is deprecated, do not use anymore.")

def package_id(self):
    del self.info.options.foobar
```
This is the safest way: users will be warned of the deprecation and their projects will not risk breaking. As additional examples, take a look at the following recipes: dcmtk, gtsam and libcurl.
However, if the logic is too complex (this is subjective and depends on the Conan review team), then just remove the option. One month after the deprecation, we will welcome a PR removing the deprecated option.
No. Some projects provide not only libraries but also applications. For those projects, both libraries and executables should be kept together in the same Conan package. In the past, we tried to separate popular projects, like Protobuf, and it proved to be a complex task that was hard to maintain, requiring custom patches to disable parts of the build. Also, with the context feature, we can use the same package as a build requirement, for the build platform, and as a regular requirement, for the host platform, when cross-building. It's recommended to use two profiles in that case: one for the build platform (where the compilation tools are executed) and one for the host platform (where the generated binaries will run).
See License Attribute for details.
See License Attribute for details.
Regardless of the reason, if the recipe detects a problem where binaries might not be generated correctly, an exception must be raised. This prevents publishing incorrect packages which do not work as intended. Use `ConanInvalidConfiguration`, which is specially supported in ConanCenter.

```python
raise ConanInvalidConfiguration(f"The project {self.ref} requires liba.enable_feature=True.")
```

You should not use `self.output.warn` for this: it is not enough to alert consumers, nor does it stop the build service.
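For instance, a sketch of a `validate()` method raising this exception (the Windows/shared restriction is a made-up example; `ConanInvalidConfiguration` comes from `conan.errors` in a real recipe and is stubbed below only so the snippet runs standalone):

```python
try:
    from conan.errors import ConanInvalidConfiguration
except ImportError:  # stub so this sketch runs without Conan installed
    class ConanInvalidConfiguration(Exception):
        pass

def validate(self):
    # Hypothetical restriction: fail before any build is attempted when
    # the configuration cannot produce working binaries.
    if self.settings.os == "Windows" and self.options.shared:
        raise ConanInvalidConfiguration(f"{self.ref} does not support shared builds on Windows.")
```

Raising in `validate()` marks the configuration as invalid instead of letting the CI publish a broken binary.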
Very often C++ projects require a minimum standard version, such as 14 or 17, in order to compile. Conan offers tools to check that the relevant setting is present and meets a given minimum; otherwise, the compiler's default is used.
```python
def configure(self):
    build.check_min_cppstd(self, 14)  # Wrong!
```

This fails to cover a vast number of use cases, for the following reasons:
- `cppstd` is not configured in the profiles generated by Conan's `--detect`; the majority of users simply do not have this setting.
- A shocking number of projects override this setting within their respective build scripts, so it does not get applied in those cases.
- Conan-Center-Index does not manage the `cppstd` setting for the compilers it supports to generate binaries.
```python
def validate(self):
    # Correct
    if self.settings.compiler.cppstd:
        build.check_min_cppstd(self, 14)
```
As a result, all calls to `build.check_min_cppstd` must be guarded by a check for the setting, and the only way to otherwise ensure the C++ standard is to check the compiler's version to know if it offers sufficient support. An example of this can be found here.
See Adding older versions for details.
See Removing older versions for details.
It depends. You cannot mix regular projects with system packages, but you can provide package wrappers for system packages. However, Conan cannot track system packages, such as their versions and options, which creates a fragile situation: it affects the libraries and binaries built in your package and cannot be fully reproduced. Also, system package managers require administrator permission to install packages, which is not always possible and may break limited users. Moreover, more than one Conan package may require the same system package, and there is no way to track their mutual usage.
The hook KB-H032 does not allow `system_requirements` nor `SystemPackageTool` in recipes, to avoid mixing regular projects with system packages in the same recipe.
There are exceptions where some projects are closer to system drivers or hardware and packaging as a regular library could result in an incompatible Conan package. To deal with those cases, you are allowed to provide an exclusive Conan package which only installs system packages, see the How-to for more.
There are different motivations:
- time and resources: adding the build time required by the test suite plus execution time can increase our building times significantly across the 100+ configurations.
- ConanCenter is a service that builds binaries for the community for existing library versions, this is not an integration system to test the libraries.
- Adding a testing option will change the package ID, but will not provide different packaged binaries
- Use the configuration skip_test to define the testing behavior.
`Python 2.7` and earlier is not supported by ConanCenter, as it is already EOL.
`Python 3.6` and earlier is also not supported by ConanCenter, as it is already EOL.
Versions from `Python 3.7` onwards are currently supported by the infrastructure and the recipes.
Our docker images use `Python 3.7.13+` at the moment. Windows agents currently use `Python 3.7.9+`, and macOS agents use `Python 3.7.12+`.
The versions run by our agents and docker images are subject to change as security updates to Python are released or versions enter EOL.
Additional concerns about supported versions within the Conan ecosystem (not just ConanCenter, but the client itself and other tools) are documented in the Conan tribe.
For ConanCenter, besides security, there are various concerns about critical features provided by the Python interpreter, including its syntax and the standard library, e.g.:
- LZMA compression support
- Unicode awareness
- long-path awareness
Right now, only the CPython flavor of the interpreter is supported (e.g. we have never tested that recipes work with IronPython, Jython, Cython, etc.).
In addition, we support only 64-bit builds of the interpreter (amd64/x86_64 architecture) - 32-bit builds (x86) are not supported and not installed on the agents.
There are no guarantees that recipes will work correctly with future Python versions that introduce breaking changes to the interpreter, as we don't test all the possible combinations (and probably never will). Patches are welcome if problems are found.
There are several popular software libraries provided by Intel:
- Intel Math Kernel Library (MKL)
- Intel Integrated Performance Primitives (IPP)
- Intel Deep Neural Networking Library (DNN)
These Intel libraries are widely used by various well-known open-source projects (e.g. OpenCV or TensorFlow).
Unfortunately, these Intel libraries cannot be accepted into ConanCenter due to several important reasons:
- they are closed-source and commercial products, ConanCenter cannot redistribute their binaries due to the license restrictions
- registration on the Intel portal is required in order to download the libraries, there are no permanent public direct download links
- they use graphical installers which are hard to automate within a Conan recipe
Instead, libraries that depend on MKL, IPP or DNN should use the following references:
- `intel-mkl/<version>`, e.g. `intel-mkl/2021`
- `intel-ipp/<version>`, e.g. `intel-ipp/2021`
- `intel-dnn/<version>`, e.g. `intel-dnn/2021`
Note: These references are not available in ConanCenter and will likely never be! It is the consumer's responsibility to provide the recipes for these libraries.
Since these references will never be available in ConanCenter, they are deactivated in the consuming recipes by default:
```python
options = {
    "shared": [True, False],
    "fPIC": [True, False],
    "with_intel_mkl": [True, False],
}
default_options = {
    "shared": False,
    "fPIC": True,
    "with_intel_mkl": False,
}

def requirements(self):
    if self.options.with_intel_mkl:
        self.requires("intel-mkl/2021")
```
If consumers activate the option explicitly (`with_intel_mkl=True`), Conan will fail because of the unknown reference.
Consumers may use an override facility in order to use their own private references for Intel MKL, IPP or DNN libraries.
For instance, if you have a private reference `intel-mkl/2021@mycompany/stable`, then you may use the following override in your `conanfile.txt`:

```
[requires]
intel-mkl/2021@mycompany/stable
```
This repository and the CI building its recipes are continuously moving to new Conan versions, sometimes adopting new features as soon as they are released (see the Conan client changelog).
You should expect that the latest revisions of recipes can introduce breaking changes and use new features that will not work unless you also upgrade the Conan client (and sometimes you will need to modify your project if the recipe changes the binaries, flags, etc. it provides).
To isolate from these changes there are different strategies you can follow. Keep reading in the consuming recipes section.
Version ranges are currently allowed on a handful of dependencies, but not for general use. See Dependencies Version Ranges for additional details.
When the CI builds packages with `shared=True`, it applies the option only to the package being created, but not to its requirements. As the default value for the `shared` option is usually `False`, you can expect that the dynamic library that has just been generated has linked all of its requirements as static libraries.
It is important to remark on the default package ID mode used by Conan (which is the same default used by ConanCenter): `semver_direct_mode`. With this default, only the `major` version of the requirements is encoded in the package ID.
The two previous behaviors together can lead to unexpected results for a user that wants to consume a graph of dependencies as shared libraries from ConanCenter. They might think that using `*:shared=True` in their profile is enough, and indeed Conan will retrieve from ConanCenter all the dynamic libraries for the whole dependency graph, but all of them will contain the logic of their respective requirements embedded in the dynamic library. This logic is embedded at build time, so it might not match the version of the requirements that was resolved by Conan, and the other dynamic libraries won't be used at all, only the ones linked directly by the consumer project. See a more detailed example here.
In order to consume all those libraries as shared ones, building from sources is needed. This can be easily achieved by using `*:shared=True` in the host profile and `--build` in the install command. With these inputs, Conan will build all the packages from sources and use the shared libraries when linking.
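For example, using Conan 1.x syntax (the `libfoo/1.0` reference is a placeholder):

```
conan install libfoo/1.0@ -o "*:shared=True" --build
```

This rebuilds the whole dependency graph from sources with `shared=True` applied to every package, so the shared libraries actually link against each other.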
Note: If you are hosting your own recipes, the proper solution would be to use something like `shared_library_package_id`, which encodes this information in the package ID and ensures that any change in the static libraries embedded into a shared one is taken into account when computing the package ID. In this repository we are not using it, because it would lead to many missing packages, making it impossible for the CI to actually build consumers in PRs.
The Code Owners feature requires write permission for any user listed in the `.github/CODEOWNERS` file, which makes it impossible to be accepted by Conan. However, that file is still important, as it can be re-used in a future GitHub Action to parse and notify users. Meanwhile, there is the project https://app.github-file-watcher.com/, which is able to notify users, but only after merging to the master branch. Feel free to contribute a new GitHub Action that implements a file-watcher feature.
No. Pylint has the important role of keeping recipes prepared for the Conan v2 migration. If you are having difficulty understanding a linter error, please comment on your pull request about the problem to receive help from the community.
Please, read Inactivity and user removal section.
Sadly, no. There have been many efforts in the past and we feel it's not sustainable given the number of combinations of libraries and versions. See #14660 for recent discussions. There is one main boost recipe with many versions maintained. Adding Boost libraries with no dependencies just opens the door to graph-resolution problems, and once available, it allows dependent libraries to be added as well.
In order to avoid this, the sole exception which is permissible is when the project does not package any headers under the `boost/` folder, does not use the `boost` namespace, and does not install libraries with the `boost` prefix.
Yes, but make sure it does not have Boost in the name. Use the `author-name` convention so there are no conflicts, in addition to following the rules outlined above.
Generally, no: these sorts of options can most likely be set from a profile or by downstream recipes. However, if the project supports the option in its build script and would otherwise dynamically embed it into the CMake config files or generated pkg-config files, then it should be allowed.
Doing so requires deleting the option from the `package_id`.
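A minimal sketch of that deletion (`with_docs` is a hypothetical option that changes only the generated metadata, not the binaries):

```python
def package_id(self):
    # Remove the option from the package ID so both of its values map to
    # the same binary package and no duplicate binaries are created.
    del self.info.options.with_docs
```

With this in place, consumers setting either value of the option still resolve to the same pre-built package.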
For some irregular projects, they may need to be aligned when being used as a requirement, using the very same version, options, and settings and maybe not mixing shared with static linkage. Those projects usually break between patch versions and are very sensitive, so we can not use different versions through Conan graph dependencies, otherwise, it may result in unexpected behavior or even runtime errors.
A well-known example is GLib, which requires the very same configuration to prevent multiple instances when using static linkage. As a solution, we could consume GLib in full package ID mode, like:
```python
def package_id(self):
    self.info.requires["glib"].full_package_mode()
```
This is a perfect solution on the consumer side, but there is a side effect: CCI will not re-generate all the packages involved for every change in the dependency graph that glib is part of, which means users will start to see MISSING_PACKAGES errors during their pull requests. As a trade-off, it would be necessary to update all the recipes involved by opening new PRs to generate new packages, but that takes many days and is still a process not supported internally by CCI.
For more context, please visit issues #11684 and #11022.
In summary, we do not recommend `full_package_mode` or any other custom package ID mode for requirements on CCI; it will break other PRs sooner or later. Instead, prefer using `shared=True` by default when needed.
Also, when facing a similar situation, do not hesitate to open an issue explaining your case and ask for support from the community.