Welcome to Software Development on Codidact!
Post History
Answer
#5: Post edited
I see 3 common ways to distribute such software:

1. Statically linked binary
2. Dynamically linked binary
3. Source
1 means you compile everything into a standalone binary file. This bloats the file size, but you don't have to trust the user to have the dependencies (the libraries) installed correctly. This would be the easiest to create a package for, because it's "self-contained". It's also popular on Windows, because Windows is bad at dealing with dependencies and libraries.
2 means you compile it such that the program will expect to find the library pre-installed on the user's computer and ready to go. This seems like a stretch, but it's actually common. Modern package managers (Linux, Mac Homebrew; Windows has Chocolatey but most don't use it) are pretty good at dealing with dependency chains, and their repos are usually pretty good about having most libraries already packaged. Of course, your package would have to correctly indicate that it requires some other library, so that it gets automatically installed when people try to install your program.
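As an illustration of "correctly indicating" a dependency: in a Debian package, this is done in the package's `control` file, including a minimum version if you need one. The names `myprog` and `libfoo1` here are hypothetical:

```text
Package: myprog
Version: 1.0.0
Architecture: amd64
Depends: libfoo1 (>= 2.0)
Description: hypothetical example program
```

When a user runs `apt install myprog`, the package manager reads the `Depends:` field and installs `libfoo1` automatically if it's missing.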
2 will address your concern with the library update, whereas 1 requires you to release a new version just for the library update. However, library updates are not always drop-in, so you might have to be a bit more careful with specifying what version of the library you require. And yes, if the library makes a breaking change, you'll have to release a new version of your program too. Keep in mind that not all OSes have the same versions of software. For example, Debian stable might have a very old version of your library, while Arch Linux will have the very latest; you might have to deal with that, depending on how volatile your library is.
For 3, your build pipeline should be able to obtain a suitable version of the library from an appropriate source. Usually people have a Makefile that provides a `build` command, which will also ensure build dependencies are set up (not just libraries, but having the compiler installed).
3 is the easiest for developers. Some package repos make it even more convenient; for example, Arch Linux has the AUR. Creating an AUR package is fairly straightforward, and you can easily declare other packages (whether from the binary Arch repos or other AUR packages) as dependencies. I think it's pretty easy in Homebrew as well. It also has the most potential for errors, because the build process will need to install a lot of build dependencies on the user's computer, and there are more things that can go wrong.
I'm not very familiar with distributing C programs, but getting the header with wget seems like an odd approach. Usually people will git-clone the upstream lib to a temp dir during the build, and checkout a specific commit or tag (to pin the version).
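The clone-and-pin pattern looks something like this. So the example runs offline, the "upstream" here is a throwaway local repo with one tag; in a real build you'd clone the library's actual URL instead:

```shell
# Stand-in for the upstream library repo (normally a remote URL).
mkdir -p upstream
git -C upstream init -q
git -C upstream -c user.email=demo@example.com -c user.name=demo \
    commit -q --allow-empty -m "v1.0.0 release"
git -C upstream tag v1.0.0

# The actual pattern: clone into the build tree, then pin to a tag.
git clone -q upstream build-deps/libfoo
git -C build-deps/libfoo checkout -q v1.0.0   # pin to a specific version
git -C build-deps/libfoo describe --tags      # -> v1.0.0
```

Pinning to a tag (or a commit hash) makes the build reproducible: everyone who runs it gets the same library version, regardless of what upstream has released since.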
I would recommend that you start with step 0: Write a proper Makefile that follows the usual conventions like `build` and `configure` (you can look at other software for examples, or the manual of GNU Make, or distro packaging docs). Make sure it actually builds on a clean system (you can try a Docker container). Then distribute it as source if your target OS makes it easy, like Arch or Homebrew. If you really want to distribute binaries, the easiest is probably to statically link everything and tell people to download the binary from GitHub Releases (or the equivalent on your hosting platform). If you want to create proper binary packages with dynamic linking, you will have to follow each OS's packaging instructions and maintain a separate package for each one (having a good Makefile will help here).
As an example, here's a page about how to do source distributions for Arch (AUR): https://wiki.archlinux.org/title/Creating_packages
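For a sense of scale, an AUR source package is driven by a single PKGBUILD file. A minimal sketch (every name and URL below is hypothetical; `depends` covers runtime libraries, `makedepends` build-only tools):

```sh
pkgname=myprog
pkgver=1.0.0
pkgrel=1
pkgdesc="Hypothetical example program"
arch=('x86_64')
depends=('libfoo')           # runtime dependency, installed automatically
makedepends=('git' 'gcc')    # needed only to build
source=("git+https://example.com/myprog.git#tag=v$pkgver")
sha256sums=('SKIP')

build() {
  cd "$pkgname"
  make build
}

package() {
  cd "$pkgname"
  install -Dm755 myprog "$pkgdir/usr/bin/myprog"
}
```

Note how this leans on the conventional Makefile: `build()` just calls `make build`, and the `#tag=` fragment in `source` is the same version-pinning idea applied at the packaging layer.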