How can I automate a (usually) interactive build script with GitHub Actions? - continuous-integration

I'm trying to add CI to a project that uses a set of build scripts written in bash. The scripts prompt for input a few times for configuration information (setting flags, setting parameters, etc.). Does GitHub Actions have its own commands for dealing with this, or is there a way to set up an expect script (or something similar)?

There is currently no feature that allows prompting for manual input during workflow runs. See this response on the community forums where a similar question was asked.
Here are some options you can explore:
Redesign the scripts to read from configuration files and check them into git to trigger the build (see the sketch after this list).
Use the workflow_dispatch event to create a workflow that you can manually trigger from the Actions UI and supply input parameters. See this documentation for more detail.
Use slash-command-dispatch to trigger the build using a slash command with arguments for the build's input parameters.
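For the first option, here is a minimal sketch of what a reworked script could look like, assuming a hypothetical build.conf with KEY=VALUE pairs and hypothetical flag names (none of these come from the original scripts):

#!/usr/bin/env bash
# build.sh - read settings from a config file (or the environment) instead of prompting.
set -euo pipefail

CONFIG_FILE="${1:-build.conf}"           # e.g. TARGET=release, ENABLE_TESTS=yes, one per line
if [ -f "$CONFIG_FILE" ]; then
  . "$CONFIG_FILE"                       # source the KEY=VALUE pairs checked into git
fi

# Fall back to prompting only when stdin is a terminal, so local interactive use still works.
if [ -z "${TARGET:-}" ] && [ -t 0 ]; then
  read -r -p "Build target (debug/release): " TARGET
fi

echo "Building target=${TARGET:-debug}, tests=${ENABLE_TESTS:-no}"

In CI, the workflow just checks in (or generates) build.conf and runs the script, so no prompt is ever reached.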

Related

When to use Groovy versus a shell script in Jenkins?

I was watching this video about Jenkins:
https://www.youtube.com/watch?v=6BIry0cepz4
He mentions that shell scripts have many advantages over Groovy for doing custom work in a Jenkins build process.
Apparently the sandbox that Jenkins uses to run Groovy has some sharp limits?
Where can I find more information about this? When do I give up on Groovy and switch to a shell script?
As the comment from Szymon says, this is a broad question to answer, and there is no one-stop shop that lists all the pros and cons of each. The answer rather builds up from your use cases and the experience you gain with them.
Apparently the sandbox that Jenkins uses to run Groovy has some sharp limits?
This is because Jenkins enforces certain security measures so that pipeline code cannot call methods that could do something malicious or unhealthy inside your infrastructure. If you really need to use certain restricted classes or methods, you (or your Jenkins admin) need to whitelist them by approving the signatures. Check out the link below:
https://jenkins.io/doc/book/managing/script-approval/
Now, shell scripts do come in very handy, but they are not always hunky-dory for everything either.
When do I give up on Groovy and switch to a shell script?
To me, it depends on what I am trying to achieve. Select whichever one makes the task easier.
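To make the "shell for custom work" advice concrete, here is a minimal sketch (the script name, paths, and commands are hypothetical, not from the video or the question): file and process work like this would trip the Groovy sandbox if written with, for example, new File(...) or "...".execute(), both of which need signature approval, whereas the same work in a shell step runs without any whitelisting.

#!/usr/bin/env bash
# scripts/package.sh - custom packaging work kept out of sandboxed Groovy.
set -euo pipefail

VERSION="$(git describe --tags --always)"   # derive a version label from the checkout
echo "$VERSION" > build/version.txt
zip -r "build/app-$VERSION.zip" dist/

The pipeline then only needs a single sh './scripts/package.sh' step, and nothing in it requires script approval.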

TeamCity Conditional SMB Upload path

Using TeamCity version 2017.2.3 (build 51047).
I have a SMB Upload build step and would like to upload the builds from the default branch to a different location than all other builds.
I have seen the following variable that will tell me if it is a default-branch build, %teamcity.build.branch.is_default%, however I'm not sure how, or even if, it is possible to specify a conditional Target URL for the SMB Upload step.
Either with some form of IF block, or an inline ternary statement.
None of this is done using PowerShell; it is all through the UI, and I would prefer to keep it that way if possible. Our old TeamCity install was essentially just a glorified PowerShell script runner and grew into an unmaintainable monolith; besides, PowerShell is a rather terrible language.
Essentially, what I would like is for builds on any branch to go to
//DataStore/builds/my-api-%build.number%.zip
Whilst builds on the default branch go to
//DataStore/builds/default/my-api-%build.number%.zip
Any help would be appreciated, thanks.
In general, this is not possible. The SMB Upload runner doesn't let you specify a condition anywhere in it.
If conditional steps were possible, you could create two steps, Upload from default and Upload from non-default, each with a different Target URL. It turns out that conditional build steps are the most voted-for feature in TeamCity (see this ticket), yet JetBrains are quite opposed to the idea. You may want to vote for the ticket, or at least monitor it.
There is one thing that you can do, other than Powershell. The Target URL field expands variables. (You can tell this by typing a percent sign in the text field: TeamCity immediately starts suggesting variable names. Compare this with the Step name text field above: that has no variable expansion.) Thus, you could enter a Target URL in this form:
//DataStore/builds/%teamcity.build.branch.is_default%/my-api-%build.number%.zip
That way, you'll end up with files being uploaded as
//DataStore/builds/true/my-api-1234.zip
//DataStore/builds/false/my-api-1235.zip
Now that's kind-of ugly. You can improve it in two ways:
1) create symlinks or junctions on your file server (on the directory/filesystem level), so that the above are accessible to the clients as
//DataStore/builds/default/my-api-1234.zip
//DataStore/builds/my-api-1235.zip
2) even better, you can set up a variable that will either contain the value "/default" or "". Then you can change your Target URL to //DataStore/builds%myCleverVariable%/my-api-%build.number%.zip. To do that, you'll need an extra step before this one, such as a PowerShell runner, that tests the value of %teamcity.build.branch.is_default% and sets %myCleverVariable% accordingly, using TeamCity service messages.
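The paragraph above suggests a PowerShell runner for that extra step; the same idea is sketched below as a generic script step (only myCleverVariable and the branch variable come from the answer, the rest is an assumption). Note that myCleverVariable should also be declared as an empty configuration parameter so the Target URL can be resolved.

# Script build step placed before the SMB Upload step.
if [ "%teamcity.build.branch.is_default%" = "true" ]; then
  suffix="/default"
else
  suffix=""
fi
# TeamCity service message: sets the parameter for the steps that follow.
echo "##teamcity[setParameter name='myCleverVariable' value='$suffix']"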
Note: the conditional build step feature has since been implemented in TeamCity 2020.1.

Post build event depending on configuration name in new ASP.NET 5 project

I'm writing a unified project for 3 smart TVs. I also have 3 configurations created in Visual Studio. Now I want to execute some CLI scripts depending on the selected configuration.
The problem is that in the new ASP.NET 5 project type I don't have an editor for post-build events.
I know I have to do this in project.json. What I found is:
"scripts": {
"postbuild": ""
}
But using this one I can't create different CLI scripts for different configurations.
I also found:
"configurations": {
},
And I guess this is probably what I want, but... how do I use it? IntelliSense has no power here, and I also had no luck searching the web...
[edit]
Maybe I should try with .xproj?
You'll need to build a master script which uses the available context and environment variables to switch and run the other scripts of your choice (a sketch follows the variable list below).
In addition to the list of variables here for compile, you also get these for publish-related scripts, and then these are available everywhere, as are the environment variables returned by Environment.GetEnvironmentVariable, which can be seen here.
The IntelliSense in the VS2015 Update 3 RTM shows some of these, but it's misleading, since you get others depending on the script block you're using:
So, your full list of context variables that you can use to control flow in your scripts is:
Every script block:
%project:Directory%
%project:Name%
%project:Version%
Compile specific:
%compile:TargetFramework%
%compile:FullTargetFramework%
%compile:Configuration%
%compile:OutputFile%
%compile:OutputDir%
%compile:ResponseFile%
%compile:RuntimeOutputDir% (only available if there is runtime output)
%compile:RuntimeIdentifier% (only available if there is runtime output)
%compile:CompilerExitCode% (only available in the postcompile script block)
Publish specific:
%publish:ProjectPath%
%publish:Configuration%
%publish:OutputPath%
%publish:TargetFramework%
%publish:FullTargetFramework%
%publish:Runtime%
I investigated this a bit but did not really get to any good result.
There are some project variables that are exposed in scripts. Unfortunately, those are very limited:
%project:Name% gives you the project name
%project:Directory% gives you the project directory
%project:Version% gives you the project version
So there is no way to access the build configuration or the environment here.
The configurations option in the project.json is also limited to build configurations and only allows declaring compilation options there, so that also doesn’t work.
Unfortunately, there also doesn't seem to be another way to solve this, at least not right now. I would consider sending a pull request to DNX myself to add some additional project variables, but at the moment it doesn't really make sense to invest time into DNX: after all, it's being replaced by the dotnet CLI. We'll see if that one comes with functionality to access the environment, and if not, I might end up submitting a pull request to add it. But until we get there, I'm afraid there is no solution for this.

Can I automatically generate a change script using a .scmp file?

We're using database projects here at work and for our deployment to the production server, our current process is to manually run a compare using a saved .scmp file that compares the database project to our production database (using a read-only login), then generate a SQL script that we give to our I.T. support guy to run on production. We also do a build to generate our post-deployment script, and we give that one to our guy to run as well.
I'm trying to automate as much of this process as possible (to reduce the chances of mistakes and make it more efficient). I'd like to know if there's a way to automatically generate the sql change script using the predefined options in our .scmp file.
Additionally, is there an easy way to automate the appending of the post-deployment script to the end of the schema change script, so he just has one sql file to run?
Perhaps there's a nice way to do the whole thing with powershell or something.
OK, what you should do is use sqlpackage.exe to create your script from the dacpac that is produced by building the SSDT project.
Create a batch script to call it or make it a part of your CI process.
To filter the output, there are some new options, like excluding certain object types, in the latest (March 2015) release of SSDT, or you can use a deployment filter like:
http://agilesqlclub.codeplex.com if you need more flexibility.
Using this, you can filter the deployment just like the compare, and the pre/post-deployment scripts are prepended/appended, so you kill two birds with one stone!
Ed
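A minimal sketch of that sqlpackage.exe call (shown as a shell script; a .bat or PowerShell script with the same arguments works the same way). All paths, the server name, and the database name are placeholders, and the property shown is just an example of the kind of option you can pass:

#!/usr/bin/env bash
# Generate the change script from the dacpac produced by the SSDT build.
# The pre/post-deployment scripts in the project are included in the output.
SQLPACKAGE="C:/Program Files/Microsoft SQL Server/120/DAC/bin/SqlPackage.exe"

"$SQLPACKAGE" \
  /Action:Script \
  /SourceFile:"bin/Release/OurDatabase.dacpac" \
  /TargetServerName:"PRODSERVER" \
  /TargetDatabaseName:"OurDatabase" \
  /OutputPath:"deploy/change-script.sql" \
  /p:DropObjectsNotInSource=False

The resulting change-script.sql is the single file to hand over; running this from a batch script or a CI step covers the automation part.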

Jenkins Timeout because of long script execution

I have some issues regarding Jenkins and running a PowerShell script within it. Long story short: the script takes 8x longer to execute than when running it manually on the server (slave), where it takes just a few minutes.
I'm wondering why?
In the script are functions which invoke commands like & msbuild.exe or & svn commit. I found out that the script hangs on the lines where the aforementioned commands are executed. The result is that Jenkins times out because the script takes that long. I could raise the timeout threshold in the Jenkins job configuration, but I don't think that is the solution to the problem.
There is no error output or any information about why it takes that long, and I have no further ideas about the reason. Maybe one of you could tell me how Jenkins invokes those commands internally.
This is what Jenkins does (Windows batch plugin):
powershell -File %WORKSPACE%\ScriptHead\DeployOrRelease.ps1
I created my own PowerShell CI service before I found that Jenkins has its own plugin for this. But in my implementation, and in my current job configs, we follow a simple segregation principle: more (smaller) steps is better. I found that my CI service works better when it is separated into different steps (and in case of an error, root cause analysis is a lot easier). The single responsibility principle is also helpful here. So in Jenkins we have pre-, post-, build, and email steps as separate scripts. About
msbuild.exe
As far as I remember, in my case there were issues related to operations on file system paths. So when the script was divided into separate functions, we had better performance (with additional checks of parameters).
Use "divide and conquer" technique. You have two choices: modify your script so that will display what is doing and how much it takes for every step. Second option is to make smaller scripts to perform actions like:
get the source code,
compile/build the application,
run the test,
create a package,
send the package,
archive the logs,
send a notification.
The most problematic is usually the first step: getting the source code from Git, SVN, Mercurial, or whatever you have as a version control system. Make sure this step is not embedded in your script.
During the job run, Jenkins captures the output and uses AJAX to display the result in your browser. In the script, make sure you flush standard output after every step or every few steps. Some languages buffer standard output, so you only see the results at the end.
You can also create log files, which are helpful for archiving and verifying the activity status of older runs. From my experience, using Jenkins with more than 10 steps requires a specialized application that can run multiple steps, like Robot Framework.
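The script in this question is PowerShell, but the "display what it is doing and how long every step takes" advice is language-neutral; a wrapper along these lines (all step commands and paths are placeholders) makes it obvious in the Jenkins console where the time goes:

#!/usr/bin/env bash
# Run each build phase as a separate, timed step so the console shows where time is spent.
set -euo pipefail

run_step() {
  local name="$1"; shift
  echo "=== $name started  $(date -u +%H:%M:%S) ==="
  "$@"
  echo "=== $name finished $(date -u +%H:%M:%S) ==="
}

run_step "build"    msbuild.exe Solution.sln /m /v:m
run_step "test"     ./run-tests.sh
run_step "package"  zip -r output/package.zip bin/

The checkout step is deliberately not in the wrapper, per the advice above to keep source control out of the script and let Jenkins handle it.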
