Distributing software for multiple platforms

This is a discussion on Distributing software for multiple platforms within the C Programming forums, part of the General Programming Boards category.

  1. #1
    Registered User
    Join Date
    Jul 2009
    Posts
    4

    Distributing software for multiple platforms

    Hi all,

    I've written a nice piece of open source C/C++ software, which runs great on my Mac. It has some dependencies (OpenGL, LibTiff, Zlib, libXML, and libXML++) and it also requires Python. Although I used to build with AutoTools, I'm currently moving to CMake.

    The question is how to distribute my code so that (1) it's as easy as possible for my potential users, and (2) the distribution process is relatively painless for me. I want my code to run on as many varieties of Mac, Windows, and Linux platforms as possible. To make installation easy for users, I'd like to distribute pre-compiled binaries. However, is there an easy way for me to compile for a wide variety of platforms? And how do I handle dependencies? Do I statically link as many of them as possible, using the appropriate platform version of each one, and just hope that users have the other dependencies and that my software finds them?
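    (For what it's worth, since I'm moving to CMake anyway: the dependencies I listed can mostly be found with CMake's stock find modules. This is just a sketch of what my top-level CMakeLists.txt might look like; `myapp` and `main.cpp` are placeholder names, and the imported-target names require a reasonably recent CMake.)

```cmake
cmake_minimum_required(VERSION 3.12)
project(myapp C CXX)

add_executable(myapp main.cpp)

# These Find modules ship with CMake itself.
find_package(OpenGL REQUIRED)
find_package(TIFF REQUIRED)
find_package(ZLIB REQUIRED)
find_package(LibXml2 REQUIRED)
find_package(Python3 REQUIRED COMPONENTS Interpreter)

# libxml++ has no stock Find module; pkg-config is the usual route.
find_package(PkgConfig REQUIRED)
pkg_check_modules(XMLPP REQUIRED IMPORTED_TARGET libxml++-2.6)

target_link_libraries(myapp PRIVATE
    OpenGL::GL TIFF::TIFF ZLIB::ZLIB LibXml2::LibXml2 PkgConfig::XMLPP)
```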

    How do commercial software developers handle these problems?

    Thanks,
    -Steve

  2. #2
    Registered User ledow's Avatar
    Join Date
    Dec 2011
    Posts
    435
    With a lot of hard work.

    Pre-compiled binaries are your only really sensible solution (unless you WANT your users to do all that same hard work). Source drops are pretty much "you're on your own" territory and can put a lot of people off, and invariably someone will provide a pre-compiled binary that everyone else just uses for convenience anyway. The rest? You just have to knuckle down and do it.

    Compiling statically versus dynamically is an argument in itself. Most Windows programs, you'll notice, are statically linked or come with a bunch of DLLs of known versions (little reliance on system libraries except for things like the MSVC runtime - and if you've ever run a game through Steam, you'll see that almost every game updates every DLL it intends to use on first install - DirectX, the MSVC runtime, etc. - and that some still throw dozens of DLLs of known version into their program folders to make sure they get what they expect). You just can't rely on a Windows system to have those DLLs installed without making a full installer (e.g. using NSIS or similar) and distributing them yourself (check the licences!). It also matters what licence you use for the program. If you include SDL 1.2, for example, you need to either ship a separate SDL DLL or license your whole program under the terms of the GPL/LGPL. SDL 1.3 supposedly fixes this problem with a new license, but it's a small demonstration that just because you want to ship statically doesn't necessarily mean you can.

    Cross-compiling can be a bit of a nightmare too, especially when you're going to have to test each program on each platform anyway. You'd probably be better off with some virtualisation or testing machines, but it is *possible* to cross-compile to any platform you like. I wouldn't recommend doing it blindly, though, because it's easy to put out binaries that just plain don't work. I use Windows for development and cross-compile to other platforms, but let's just say that it's not as simple as you might think, even if your code is perfectly cross-platform (which is rare). It can take you longer to set up a proper cross-compiling environment and get your code running on 3 platforms than it would to just install those three operating systems in a virtual (or real) machine and compile directly on each platform - at least for the first time you do it.

    I'm in the process of writing a cross-platform game at the moment, bearing in mind that my last project was a joint x86 Windows / Linux / ARM-based handheld console program. The code loads entirely dynamically, with the exception of whatever MinGW handles on Windows (which I think only requires the system MSVC runtime). For the Windows version, I ship a number of DLLs for various purposes - licensing, encapsulation, ease of upgrading versions, etc. For Linux, I ship only a handful of those because I don't want to stomp over system libraries or ship insecure old versions. On ARM, I actually statically compile most of it, except for the things I know are provided by the platform and would cause me licensing problems. I have a 2000-line C file that checks that all the dependencies are in place and the right version, and loads them directly from the shared libraries on any platform as the first line of main(). I use NSIS scripts to create installers for Windows / Linux. I tried the static library approach on those platforms and you end up with a huge block of code that you have to change for every upgrade of a library and which can be a licensing nightmare (and you can't really NOT upgrade when you get serious security problems in one of your DLL dependencies). It might solve theoretical problems of DLL-hell to statically compile, but to be honest I've yet to witness any DLL conflict on any system I've tested. And the code can be HUGE. 50MB executables or larger, because you've just lumped everything in.

    Also, literally every time I switch to my secondary platforms and test, I have to rewrite sections of code due to unforeseen differences. I have to add in DLLs that I've added or changed. I have to update Makefiles (and writing a single cross-platform Makefile is nigh-on impossible). I have to add new prototypes for new functions I call, etc. I have to recreate the installer. I haven't even considered any x86 / x86-64 differences yet, because I have enough fun trying to get it to work on Windows XP, Windows 7 32-bit and Linux x86. And, in the end, compiling on the different platforms gives me much more information than trying to cross-compile (even when that's possible). It's amazing how many libraries you miss, or how many libraries change name (OpenSSL, for one, often has different library names on Windows and Linux), or how many things Windows compilers fuss about that Linux ones don't and vice-versa. Every ten or so SVN commits, I compile on a different platform with a different version of GCC and end up having to add another commit for the differences noticed (even silly things like filename case sensitivity, different warnings, improperly-isolated Windows headers, etc.).

    I actually use a Linux VPS for various things. I have one for my game, hosted in a datacenter, that I intend to use to let people download it, view the website, etc. At the moment it runs the private SVN repository for the code, backups of my development environment and also functions as a compiler for Linux. I literally just commit my code from my Windows setup to that server, then log into it, and compile that same code on that platform. I even have a hacked SDL-VNC backend so that that server can show me graphical output despite being headless (yes, I can quite literally play my game here from a server that's in a datacenter somewhere else, OnLive-style!). Having that server compile picks up LOTS of things that I would have normally missed: 1) What happens when SDL can't set a video mode AT ALL or various modes aren't available, 2) Any x86-64 peculiarities, 3) warnings and errors from a more-modern version of GCC than MinGW ships with, 4) Linux-based libraries on a clean machine on a different distro to that which I normally use, 5) Valgrind (sadly missing on Windows!). Yes, I valgrind a running binary of my code over the Internet! I could do all that with a local machine but given that it's no more difficult to transfer the code, log into and compile it, it's like having a virtual machine on my laptop (and means I can code anywhere in the world even without my main development machine). I can even remote-gdb the code if there's a big problem that only pops up on that particular platform.

    As soon as you get into multiple architectures, you really need to build yourself a nice compilation system. Either a cross-compile, or a set of Makefiles that work on virtualised or native platforms. They can be a lot of hard work to get working consistently and to take all the quirks of a system into account (e.g. even VisualC vs MinGW compilation can mean big differences in how you build them), but you have to do it properly if you want it to "just work" in future. Sure, the first time I did things, I just had a batch file and two scripts that manually compiled everything with the right compiler for the platform, but as you build the program it gets more and more complex and you have a lot more differences to take account of (depending on your code, you could find yourself with a lot of #ifdefs scattered around just to do silly things like create a directory).

    My advice: Get yourself virtual machines somehow, and forget about cross-compiling unless you have a large team of knowledgeable people (i.e. people who could use gdb if necessary) willing to test on all platforms. Whether that means buying physical machines and installing those OSes, or just virtualising / emulating them on your development machine, or renting a remote server of the right OS that you can install a compiler on, that's what I'd do. I don't have Mac builds (I've never owned a Mac), but if I was asked to provide them, it would be either to push out that work to someone else who *does* have a Mac, or to find one that I can rent somehow. I wouldn't risk just cross-compiling without having machines to test on, and if you have them to test on, you might as well compile on them too and avoid the hassle of cross-compiling.

    Some projects (OpenTTD springs to mind, from my experience) do have large collections of automated systems that will compile the code for a number of platforms, produce binaries, test them, etc. automatically. Generally, though, that involves having all of those machine types just sitting around, having people knowledgeable on all those platforms to check and commit fixes for those problems, and having enough people to run that sort of compile-farm setup for you. Absent that, I'd virtualise rather than touch a cross-compiler.

    - Compiler warnings are like "Bridge Out Ahead" warnings. DON'T just ignore them.
    - A compiler error is something SO stupid that the compiler genuinely can't carry on with its job. A compiler warning is the compiler saying "Well, that's bloody stupid but if you WANT to ignore me..." and carrying on.
    - The best debugging tool in the world is a bunch of printf()'s for everything important around the bits you think might be wrong.

  3. #3
    Registered User
    Join Date
    Mar 2011
    Posts
    546
    ledow is right. Do it with VMs. With the right VM setup you can automate the entire build process for all platforms from a single startup.

  4. #4
    Registered User
    Join Date
    Jul 2009
    Posts
    4
    Thank you! I was hoping for some easy solution, but had a sense that that wasn't realistic. I'll start pestering my friends, to see if I can borrow their Windows, Ubuntu, Red Hat, Debian, and other systems. By the way, I've had good luck with cross-compiling for Windows from Mac using MinGW. Despite that, I still test on Windows and/or the Parallels VM.

    -Steve

  5. #5
    Registered User ledow's Avatar
    Join Date
    Dec 2011
    Posts
    435
    Sorry to bring up an old thread, but I have this note in a file I keep concerning licensing for MinGW:

    "Profiled code, which is any code (even your own) that you compile and link with the -pg option for runtime profiling, also falls under the GNU General Public License."

    So, don't distribute binaries that have been compiled with -pg unless you want that code to end up under the GPL.



