In Linux, is it better to use the official package system of your distro or to build from source? Does building from source show in the official package manager or does the user have to track things like that? Why have both?
Last edited by Aparavoid; 11-13-2009 at 09:23 PM.
Originally Posted by Aparavoid:
In Linux, is it better to use the official package system of your distro or to build from source?

Building from source can be useful in a few situations:
- You want to make changes to the code
- The application has configuration settings that can only be set at compile time (a common occurrence).
- The provided package is poorly built or bloated (packages pulling in unnecessary dependencies are a fairly common occurrence)
- You need to stay up to date with the application development cycle and can't wait for someone to build a package (a common situation if you plan to use nightly-builds)
- The distro repositories are slow to pick up new application versions, or only provide older tried-and-tested versions.
- You have a customized version of GCC (or use another compiler) and want applications to be built with it.
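As a sketch of that last point: configure scripts and Makefiles conventionally honor the CC/CXX environment variables to select a compiler. Here is a tiny runnable demonstration with a throwaway Makefile; "mycc" is just a stand-in name, and the /opt path in the comment is hypothetical.

```shell
dir="$(mktemp -d)"; cd "$dir"
# A one-target Makefile that reports which compiler it was told to use:
printf 'all:\n\t@echo building with $(CC)\n' > Makefile
make CC=mycc        # prints: building with mycc
# With a real custom toolchain it would look something like:
# CC=/opt/gcc-custom/bin/gcc CXX=/opt/gcc-custom/bin/g++ ./configure && make
```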
Packages can be useful in other situations:
- None of the above matters
- You don't want to install the library dependencies that building from source requires (like installing the whole KDE library source shebang on a predominantly Gnome machine)
- You are tight on disk space
- Packages are provided by the application authors (usually the best kind of packages, since they are not built by some John Doe)
Personally I prefer to build from source, since it gives me control and allows me to make changes to the code. However, I do not build KDE or 3D applications from source. The first because I'm on Gnome and don't feel like having KDE source libraries on my system taking up space for the only 2 or 3 KDE applications I use. The second because I don't understand a thing about 3D programming, so there's not much point in building those from source.
Originally Posted by Aparavoid:
Does building from source show in the official package manager or does the user have to track things like that?

It doesn't. You have to keep track of the development cycle of the applications you build from source yourself.
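You can see this for yourself by asking the package manager which package owns a file: a file it installed is in its database, while a source-built binary under /usr/local is invisible to it. Use dpkg on Debian-family systems and rpm on Red Hat-family ones; /usr/local/bin/myapp below is a hypothetical source-built binary.

```shell
# A file installed by the package manager is in its database:
dpkg -S "$(command -v ls)" 2>/dev/null || rpm -qf "$(command -v ls)"
# A source-built binary is not:
dpkg -S /usr/local/bin/myapp 2>/dev/null || echo "not tracked by the package manager"
```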
Originally Posted by Aparavoid:
Why have both?

Think you got the gist by now.
I just had this debate somewhere else; some of the linux clergy frown on source building because it leads to confusion*. That will be especially true on a system with multiple users.
However, on your box at home source building is fine, and it is often my preference, because compile options are totally opaque with packages, and they are sometimes significant. The packages are usually designed to fit into some greater holistic vision; if you have not been following that vision, this amounts to squat. Conversely, they are sometimes an attempt to satisfy everybody, meaning your software can contain way more "features" than you may have wanted. Unnecessary complications lead to unnecessary problems.
No, package management systems DO NOT account for source builds (hence the "confusion"). Sometimes, if you have source-built libraries that are dependencies for a package you then want to install, this means you will have to use a "force" or "ignore dependencies" option to get the package in. On distros like Fedora and Debian, the install tools have branched over the years into the newer, preferred, "higher level" tools (yum, apt) vs. the old low-level tools (rpm, dpkg). The newer tools are better at automatically resolving dependencies, searching web repositories, etc., which means if you want to force a package in WITHOUT installing its dependencies because you have built them yourself, you may be better off using the lower-level tools.
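As a sketch of what that looks like with the low-level tools (the package and library names here are hypothetical, and these lines are not meant to be run as-is):

```shell
# Debian family: install a .deb while telling dpkg that a dependency
# is satisfied by your own source build:
#   sudo dpkg --ignore-depends=libfoo -i myapp_1.0_amd64.deb
# Red Hat family: the rpm equivalent:
#   sudo rpm -ivh --nodeps myapp-1.0.x86_64.rpm
```

Note that the higher-level tools may later complain about or try to "fix" the unmet dependency, which is part of the confusion referred to above.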
A couple of tips:
- always use the default /usr/local directory to install into. This makes it easy to tell which binaries/libraries were built from source and which were installed from a package. The distro always puts stuff into /usr. The /usr/local tree is identical to the /usr tree (bin/, sbin/, lib/, include/). All source packages I have ever seen install into /usr/local by default, so your /usr/local directory will be all source-built stuff.
- save the tarball, eg, into /usr/local/src. You can erase the build directory, since most source packages can be uninstalled later (you do a configure, make, then "make uninstall"), but you will need the tarball to recreate the build directory if you erased it. In other words, you do not need the "original" build directory as it existed after the build in order to uninstall; you just need a build directory as it is created when you decompress the tarball. Saving tarballs is also important because version numbers change frequently and the old one may become unavailable.
- Also check your options first with "./configure --help | less"
All three of those tips are really important and I strongly recommend you follow them!
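Here is a minimal, runnable sketch of that whole workflow. It uses scratch directories instead of /usr/local and /usr/local/src (so it needs no root), and the project "myapp" is fabricated on the spot with a hand-written Makefile standing in for a real configure script; real projects ship their own build system, and with one you would run "./configure --help | less" and "./configure --prefix=/usr/local" before make.

```shell
set -e
prefix="$(mktemp -d)"                       # stand-in for /usr/local
src="$(mktemp -d)"                          # stand-in for /usr/local/src
work="$(mktemp -d)"; cd "$work"

# Fabricate a tiny "upstream" project with install/uninstall targets:
mkdir myapp-1.0
printf '#!/bin/sh\necho hello from myapp\n' > myapp-1.0/myapp
printf 'PREFIX ?= /usr/local\ninstall:\n\tmkdir -p $(PREFIX)/bin\n\tinstall -m 755 myapp $(PREFIX)/bin/myapp\nuninstall:\n\trm -f $(PREFIX)/bin/myapp\n' > myapp-1.0/Makefile
tar czf "$src/myapp-1.0.tar.gz" myapp-1.0   # the tip: keep this tarball
rm -rf myapp-1.0                            # the build dir is expendable

# The workflow: extract, build/install, later re-extract to uninstall:
tar xzf "$src/myapp-1.0.tar.gz"
make -C myapp-1.0 install PREFIX="$prefix"
"$prefix/bin/myapp"                         # prints: hello from myapp
make -C myapp-1.0 uninstall PREFIX="$prefix"
test ! -e "$prefix/bin/myapp" && echo "uninstalled cleanly"
```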
Finally, the reason for both is that having the choice is better than having none, and that source distribution is a necessity. Anyone can make a binary package of their product for a specific distro, but packages that are not made and distributed by the distro are not official packages and may have problems because of that. Note there are a lot of distros, and not all software developers want to make a package for each of them. With popular stuff, the distros do that. With less popular stuff, a source package is available that, ideally, can be built on any hardware using any linux distro.** AFAIK, Linux covers more hardware than any other OS, so there are a lot of variables here -- notice the distros have different packages for different architectures, etc. So vim 7.2 probably has 100+ different packaged versions you can find somewhere online (and 7.1 had a hundred others), but there is only ONE source tarball for vim 7.2 (sometimes tarballs are split by architecture too, but usually not). Unlike Apple or MS, the linux OS comes from a large group of independent distributors without any central authority. Like Apple and MS, much of the software is by independent developers, but they have a much more bewildering array of possibilities to deal with, whereas if you make one binary for Win 7, one for Vista, and one for XP you've covered your bases.
* some of them even present it as being too difficult, which is a scare tactic IMO
** eg, I have a project distributed by debian, but I did not make the package. Debian took my source tarball and made a bunch of packages, one for each of their target platforms. So my software has been like, ported to everything
Last edited by MK27; 11-14-2009 at 10:27 AM.