How to export/package a group of files from Bazel

This feels too obvious to be unanswered, but if the answer is out there, I haven't found it. For context, I'm incorporating someone else's existing code into a Bazel build, so I'm really not looking for "just don't do it that way"-type answers.
The code produces many dozens of related files: libraries, compiled binaries (from C and C++, if that matters), Python and shell scripts, etc. Those files expect to find each other in specific locations (e.g. shell scripts reference binaries by relative or absolute path), and I need to package up and install the whole lot.
Is there a way to do that in Bazel? To pick out a bunch of Bazel-generated files (and, in this case, a bunch of input files that we pass through unmodified) and put them in a tarball, or in a standard package format (e.g. .deb), or even just place them in the local file system in known locations?
The closest ideas I've seen involve basically doing it by hand (e.g. writing a shell script to go into Bazel's output directory and copy out the files of interest) but that seems easy to get wrong. There has to be a way to use the intelligence of the build system to bundle up a bunch of targets and data files, right?

Naturally, I find what's probably the answer shortly after posting the question: https://docs.bazel.build/versions/master/be/pkg.html. If anyone has further insight, though, I'm definitely happy to hear it!
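For anyone landing here later, a minimal sketch of what that looks like with today's rules_pkg (the successor to the older built-in rules that page documents; every target, path, and package name below is hypothetical):

# BUILD -- sketch only; assumes rules_pkg is set up in the workspace
load("@rules_pkg//pkg:tar.bzl", "pkg_tar")
load("@rules_pkg//pkg:deb.bzl", "pkg_deb")

pkg_tar(
    name = "bin",
    srcs = [":my_binary", "run.sh"],  # built outputs and plain input files alike
    package_dir = "/usr/local/bin",   # where the files should land on install
)

pkg_tar(
    name = "data",
    srcs = glob(["data/**"]),         # pass-through data files
    package_dir = "/usr/local/share/myapp",
)

pkg_tar(
    name = "dist",
    extension = "tar.gz",
    deps = [":bin", ":data"],         # merge the two layers into one tarball
)

pkg_deb(
    name = "dist-deb",
    data = ":dist",
    package = "myapp",
    version = "0.1",
    maintainer = "you@example.com",
    description = "Example package built by Bazel",
)

Building :dist or :dist-deb then leaves the archive under bazel-bin, with every packaged file tracked as an ordinary build dependency, so nothing has to be fished out of the output tree by hand.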

Related

Combine a set of shell scripts with internal dependencies into one?

I'm developing a set of shell scripts and for ease of development, the functions are often split across various files.
So the final scripts, which I expect the end user to run, require all the relevant "library" scripts to be installed in the right locations.
I am trying to find a way that allows me to develop the scripts with the same logical split in files, but then I can merge them all into a single binary script.
In the naive case, it would recursively go through all the sourced files and include them in the same file (similar to the pre-processing step in C compilers). The more involved version would also identify which functions are unused and trim them out.
Does anything like this exist? If not, I might consider writing it, but I'd be happy to hear about potential pitfalls that I should account for.
I have seen this before, in Arch Linux's devtools repo. They use m4 to process .in files.
It's just templating, though. And you might not need anything more.
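For what it's worth, the "naive case" described in the question is only a few lines of shell. A sketch, assuming source lines are literal and unindented (e.g. `source lib/foo.sh` or `. lib/foo.sh`, with no variables in the paths):

#!/usr/bin/env bash
# inline.sh -- naive recursive source-inliner (sketch only).
# Limitations: misses indented or conditional source lines and paths
# built from variables; does no dead-function trimming.
inline() {
    while IFS= read -r line; do
        case $line in
            "source "* | ". "*)
                inline "${line#* }"    # recurse into the sourced file
                ;;
            *)
                printf '%s\n' "$line"
                ;;
        esac
    done < "$1"
}
inline "$1"

Run it as `./inline.sh main.sh > bundled.sh`. The unused-function trimming mentioned in the question is the genuinely hard part; shell is dynamic enough that purely textual trimming is unsafe.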

How to code a script to create a makefile

I'm new here and I need some help. I searched the web but didn't find anything useful. What I'm looking for is a shell program that, when you execute it, creates a Makefile with the binary name given as an argument, like:
./automakefile.sh hello .
Will build you a Makefile with a binary name called hello.
I hope you guys will help me, I'm counting on you <3
There is, unfortunately, no such magic command. If there were, we wouldn't need Makefiles to begin with, because the magic would most likely have been incorporated into the compiler.
There are several reasons why there isn't a command like that.
Given a random binary file, you can't generally say what programming language it was written in.
You also can't tell what source files were used to compile the binary, or where in the file hierarchy they are located (not just where they were when the binary was last compiled, possibly on another system).
You don't know the dependencies between the source code files. Makefiles are primarily useful for keeping track of these (and compiler flags etc.), so that changing one single source file in a big project does not trigger a recompilation of everything.
You don't know what compiler to use, or what flags to pass to it. This is another thing a Makefile contains.
There are build tools available that make creating Makefiles easier, and that make them portable between systems on different architectures (the Makefiles, that is; not necessarily the programs, which is down to the programmer). One such set of tools is GNU autotools; another is CMake. I'm sure there are others as well, but those are the ones I use.
Now you're facing another, similar problem: you still need to learn the syntax of, and write, your Makefile.am and configure.ac files (for the GNU tools), or your CMakeLists.txt files (for CMake).
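That said, for the trivial single-program case the question asks about, a small script can at least emit a starter Makefile. A sketch, assuming GNU make, a C project, and that every .c file in the current directory belongs to the one binary:

#!/bin/sh
# automakefile.sh -- emit a starter Makefile for one C binary (sketch).
# Usage: ./automakefile.sh hello
[ -n "$1" ] || { echo "usage: $0 <binary-name>" >&2; exit 1; }
cat > Makefile <<EOF
CC     = cc
CFLAGS = -Wall -O2
OBJS   = \$(patsubst %.c,%.o,\$(wildcard *.c))

$1: \$(OBJS)
	\$(CC) \$(CFLAGS) -o \$@ \$(OBJS)

clean:
	rm -f $1 \$(OBJS)
EOF
echo "Makefile written for binary '$1'"

(The recipe lines in the generated Makefile must start with a tab.) Note how it punts on everything the list above warns about: header dependencies, multiple programs, non-C sources, and per-file flags.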

Effective way of distributing go executable

I have a Go app which relies heavily on static resources like images and jars. I want to install that Go executable on different platforms like Linux, Mac, and Windows.
I first thought of bundling the resources using https://github.com/jteeuwen/go-bindata, but since the files (~100 of them) run to ~20 MB or so, it takes a really long time to build the executable. I thought a single executable would be an easy way for people to download and run it, but it seems that is not an effective approach.
I then thought of writing an installation package for each platform, like a .rpm or .deb package. These packages would contain all the resources and put them into platform-specific predefined locations that the Go executable can reference. The only catch is that I have to handle that in the Go code: if it's Windows, load the files from, say, C:\go-installs, and if it's Linux, load them from, say, /usr/local/share/go-installs. I want the Go code to be as platform-agnostic as it can be.
Or is there some other strategy for this?
Thanks
Possibly does not qualify as a real answer, but still…
As to your point №2, one way to handle this is to exploit Go's support for conditional compilation: you might create a set of files like res_linux.go, res_windows.go, etc., and put the same set of variables in each, pointing to different locations, like
var InstallsPath = `C:\go-installs`
in res_windows.go and
var InstallsPath = `/usr/share/myapp`
in res_linux.go and so on. Then in the rest of the program just reference the res.InstallsPath variable and use the path/filepath package to construct full pathnames of actual resources.
Of course, another way to go is to do a runtime switch on runtime.GOOS variable—possibly in an init() function in one of the source files.
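Concretely, the runtime-switch variant might look like this sketch (the res package name follows the example above; the exact paths are hypothetical):

// res.go -- pick a resource root per platform at startup (sketch only).
package res

import (
	"path/filepath"
	"runtime"
)

// InstallsPath is set once, at package init time.
var InstallsPath string

func init() {
	switch runtime.GOOS {
	case "windows":
		InstallsPath = `C:\go-installs`
	case "darwin":
		InstallsPath = "/usr/local/share/myapp"
	default: // linux, the BSDs, ...
		InstallsPath = "/usr/share/myapp"
	}
}

// Resource returns the full pathname of a named resource file.
func Resource(name string) string {
	return filepath.Join(InstallsPath, name)
}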
Pack everything in a zip archive and read your resource files from it using archive/zip. This way you'll have to distribute just two files—almost "xcopy deployment".
Note that while on Windows you could just have your executable extract the directory from the pathname of itself (os.Args[0]) and assume the resource file is located in the same directory, on POSIX platforms (GNU/Linux and *BSD etc) the resource file should still be located under /usr/share/myapp or a similar place dictated by FHS (or particular distro's rules), so some logic to locate that file will still be required.
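A sketch of that zip-next-to-the-executable layout (file names hypothetical; os.Executable, available since Go 1.8, is a more robust take on the os.Args[0] trick):

package main

import (
	"archive/zip"
	"fmt"
	"log"
	"os"
	"path/filepath"
)

func main() {
	// Look for resources.zip alongside the executable; a real program
	// would fall back to /usr/share/myapp etc. on POSIX systems.
	exe, err := os.Executable()
	if err != nil {
		log.Fatal(err)
	}
	r, err := zip.OpenReader(filepath.Join(filepath.Dir(exe), "resources.zip"))
	if err != nil {
		log.Fatal(err)
	}
	defer r.Close()

	for _, f := range r.File {
		fmt.Println(f.Name) // each entry can be read via f.Open()
	}
}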
All in all, if this is supposed to be a piece of FOSS, I'd go with the first variant to let the downstream packagers tweak the pathnames. If this is a proprietary (or just niche) software the second idea appears to be rather OK as you'll play the role of downstream packagers yourself.

How can I completely compile a bash project to be distributed?

I am trying to compile a bash project into a distributable binary. I tried shc, and it worked, except all my source statements were broken. I have numerous source statements to keep the code base cleaner, but they are broken when compiled with shc. How can I compile down my bash project so that instead of having a bunch of .sh files, the end user can just have one single file?
Shc is an obfuscator, not a compiler. At the end of the day, it still invokes /bin/sh or whatever and feeds it your original script; it hasn't the slightest idea what your script actually does. If the script needs an additional file to source, you have to supply it at the appropriate location.
You may want to investigate things like shar (self-extracting shell archives). Build an archive, then compile it with shc if you want.
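For reference, the shar route is nearly a one-liner with GNU sharutils (file names hypothetical):

# Pack the entry point and its libraries into one self-extracting archive
shar main.sh lib/*.sh > myproject.shar
# The recipient needs nothing but a shell to unpack it:
sh myproject.shar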
It sounds like all you're missing is a facility to expand all your source statements. That should be fairly easy to write if your codebase is fairly consistent in its use of those statements: just write a script to expand them inline and away you go.
Alternatively, just put all your scripts into a single zip file or tarball and tell the user to extract the contents of that one file. If even that is too much, I'm sure you can imagine a way to encode the zipped contents of all the non-main files into a giant comment at the bottom of the main file, and have it extract what it needs before proceeding.
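That "giant comment" trick is the classic self-extracting script. A sketch, assuming a gzipped tarball of the library scripts is appended after an __ARCHIVE__ marker (the marker name, lib/lib.sh, and the main function are all hypothetical):

#!/usr/bin/env bash
# bundle.sh -- self-extracting sketch; build it with:
#   cat stub.sh > bundle.sh; tar czf - lib/ >> bundle.sh
tmpdir=$(mktemp -d) || exit 1
trap 'rm -rf "$tmpdir"' EXIT

# Find the first line after the marker and unpack everything from there
marker=$(awk '/^__ARCHIVE__$/ { print NR + 1; exit }' "$0")
tail -n +"$marker" "$0" | tar xzf - -C "$tmpdir"

. "$tmpdir/lib/lib.sh"   # source the unpacked libraries
main "$@"                # hand off to the real entry point
exit 0                   # never fall through into the binary payload
__ARCHIVE__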
Or, you know, use the appropriate installer for your system. Build an RPM for RHEL or a Debian package or a Windows MSI or whatever....

How To Export exe files in Visual Studio With All Used Files

So I have been working on a few projects using audio and images from files in Visual Studio C++. As of now they are just test projects, but I am moving towards making 2D games for fun using SFML and a few different audio libraries. The problem is this: I want to give out my games to others so they can play and test them, and I may try to develop some sort of multiplayer for some, which increases my desire to give them out. However, I do not know how I can give them the games with all the files included. I used to just grab the exe files out of the debug or release folder, but these projects have files they rely on.
So here is my question: is it possible to export an exe file that contains all the other files (wav's, jpg's, etc.)? If this question sounds overwhelmingly stupid then tell me, because I have very little idea of what an exe is and whether it can hold those files (I am used to Java, where you can simply export something into a runnable jar, which is an archive with all of the resources prepackaged; I don't know if an exe shares these traits). If this is not possible, or there are better alternatives, what are they? I have seen things and know how to load sounds from arrays of data; would that be a better solution? Or are there other options? On top of that, in the debug and release folders there are several DLL files which I need to run the project. Is there a way to compress these into the exe, or will they have to be in the same folder as the exe no matter what?
The real question here is: what is the best way to export an exe file of my project so that I can use all of my sound and image resources, as well as the DLLs, in an easy-to-distribute copy? Thank you in advance for any advice.
It's not possible to export an exe that contains your exe and multiple other files. You can use an installer (such as InnoSetup, which is free), or bundle the extra files into a resource and load them from resource at runtime. (The first has the benefit of being able to ask the user where to install, create shortcuts and folders, Start Menu items, etc.).
There are two easy ways to make a file that you can give to people to test and/or play your game.
The first option is using an installer, as mentioned in Ken White's answer. It's a good method for "final" releases, but it adds an extra step if you just want to send a copy of your game to someone to test it.
The second method is to put all your files into a single .zip file (or .rar, or .tar.bz2). Basically, this is a lot like Java's .jar file: all the DLLs, image files, and sound files go into a single file. Recent versions of Windows can create zip files out of the box, so the best approach is to zip up the Debug or Release version with all the files, unzip it to an empty folder somewhere, and test the game. That way you can make sure you got all the files you need. You can then easily send your game to someone; they simply unzip it to a folder and play, with no messing about with installers.
The bonus third option is sticking the files into a resource and loading them at runtime, or similar things (it's possible to get really fancy and combine all the files into a single EXE, but it's not exactly easy, and not really advisable).
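For the resource route on Windows, a minimal Win32 sketch (the resource ID 101 and the file names are hypothetical): an .rc file compiled into the exe containing a line like `101 RCDATA "jump.wav"`, and then at runtime:

#include <windows.h>

// Fetch the bytes of an embedded RCDATA resource (sketch only).
const void* LoadEmbedded(int id, DWORD* size) {
    HRSRC res = FindResource(nullptr, MAKEINTRESOURCE(id), RT_RCDATA);
    if (res == nullptr) return nullptr;
    HGLOBAL handle = LoadResource(nullptr, res);
    if (handle == nullptr) return nullptr;
    *size = SizeofResource(nullptr, res);
    return LockResource(handle);  // valid for the lifetime of the module
}

SFML can consume such a buffer directly, e.g. via sf::SoundBuffer::loadFromMemory(data, size), so the wav's and jpg's can live inside the exe; the DLLs, however, still have to ship alongside it unless you link the libraries statically.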
