Amethyst Linux
Static and dynamic linking
To understand linking, one must first understand how executables and libraries work. On most modern operating systems, you will find both executable binaries and libraries, both of which contain code used to run programs. Libraries contain code that is meant to be reused by executables and by other libraries. Executables are the programs that users actually run, and they (almost) always depend on libraries in order to work. There are two ways to connect an executable to the libraries it depends on: static linking and dynamic linking.
Static linking means that the necessary parts of the library are copied into the final executable file, just as if you had included the library's code directly in the program. The resulting executable is completely self-contained and will run the same way on every system, no matter what libraries are installed. (That is, as long as the kernel and the programs it interacts with behave as expected...) This was the standard way of building programs for a long time, and some operating systems still use it when dynamic linking would be impossible or less convenient.
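As a rough sketch of the difference, here is a tiny C program along with the compiler invocations one might use to produce a statically and a dynamically linked build. The exact flags and output vary by compiler and libc, so treat the commands in the comments as illustrative rather than canonical:

```c
/* hello.c - a tiny program that depends on the C library for printf.
 *
 * Dynamically linked build (the usual default):
 *   cc hello.c -o hello-dynamic
 *   ldd ./hello-dynamic        # lists libc.so.* and the dynamic loader
 *
 * Statically linked build (copies the needed libc code into the binary):
 *   cc -static hello.c -o hello-static
 *   ldd ./hello-static         # reports "not a dynamic executable"
 *
 * Note: some libcs (notably glibc) only partially support static linking,
 * so the -static build may print warnings and will be much larger.
 */
#include <stdio.h>

int main(void) {
    printf("Hello from %s linking!\n", "either kind of");
    return 0;
}
```

Tools like ldd, file, and nm -D can show which kind of binary you are looking at and which symbols it still expects to be resolved at run time.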
Dynamic linking means that only the library's symbols (the names of its functions and data) are recorded in the executable instead of the code itself. When the executable is run, those symbols are used to find the proper code in the library. This approach has pros and cons depending on the context the library is used in.
- Pro: Size. Including only the symbols means that the resulting executable file is smaller than it would be if it contained the library code itself.
- Con: Fragility. If the library file is moved, renamed, deleted, or corrupted, the executable won't be able to find the library anymore, and will fail to run. Depending on how widely used the library is, this could break your whole operating system. (In reality, this isn't usually a big deal, especially if you let your package manager handle installing and removing libraries.)
- Pro: Updates. If the library is updated but its interface stays the same, the new version can be installed as a drop-in replacement for the old one, letting the same executables use the newer version without having to recompile them all. (This isn't a big deal when you have just a few packages, but it matters a lot when hundreds or even thousands of packages all use the same library, such as libc.)
- Con: Updates. On the other hand, if the library update introduces a new interface with different symbols or different function arguments, the executable may fail to run, or may even crash catastrophically when it calls a changed function. (In particular, you can end up in situations where one package needs a newer version of the library while another still needs an older one, and you can't install both versions without conflicts. That's what some like to call 'dependency hell'.)
- Pro: Plugins. The way dynamic linking works can be exploited intentionally to create plugins: a program can load a library at run time and use its symbols to pull in different functions with different behavior depending on what the plugin is meant to do (see the sketch after this list).
- Con: Malicious plugins. On the other hand, the fact that functions can be arbitrarily replaced by swapping in a different library is also an attack surface that malicious libraries and plugins can exploit. (The xz backdoor is a relatively recent example of this.)
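To make the plugin idea concrete, here is a minimal sketch of a host program loading a shared object at run time and looking up a function in it by symbol name. The file name plugin.so and the symbol plugin_run are made up for this example; real plugin systems define their own conventions:

```c
/* host.c - loads a plugin shared object at run time and calls into it.
 * Build (illustrative):   cc host.c -o host -ldl
 * The plugin itself would be built as a shared object, for example:
 *                         cc -shared -fPIC plugin.c -o plugin.so
 */
#include <stdio.h>
#include <dlfcn.h>

int main(void) {
    /* Open the plugin; RTLD_NOW resolves all of its symbols immediately. */
    void *handle = dlopen("./plugin.so", RTLD_NOW);
    if (!handle) {
        fprintf(stderr, "failed to load plugin: %s\n", dlerror());
        return 1;
    }

    /* Look up the hypothetical entry point by its symbol name. */
    void (*plugin_run)(void) = (void (*)(void))dlsym(handle, "plugin_run");
    if (!plugin_run) {
        fprintf(stderr, "plugin has no plugin_run symbol: %s\n", dlerror());
        dlclose(handle);
        return 1;
    }

    plugin_run();   /* what this actually does depends on the loaded plugin */
    dlclose(handle);
    return 0;
}
```

Swapping in a different plugin.so changes what plugin_run does without recompiling the host, which is both the flexibility described above and the attack surface abused in cases like the xz backdoor.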
Mainstream Linux distributions generally provide only dynamically linked executables and libraries, with the rare exception of a few statically linked executables meant for recovering the operating system when the dynamically linked ones stop working. There are a few niche distros that make statically linked packages the default, but they usually forgo dynamic linking entirely. My personal opinion is that both can be useful depending on the circumstances, so I think it makes sense to offer both as an option.