How to run jscodeshift transform on all files? - jscodeshift

I am creating a transform that will replace all instances of:
templateUrl: 'some/url/to/some.html'
with
template: require('some/url/to/some.html')
I am doing this because I am changing the way that our AngularJS code brings in their templates. I am going to make them all use the webpack string loader.
I have gotten the transform to work, but I can't see in the documentation how to run it against all of the .js files in my project. To run my transform, I currently type:
jscodeshift ./folder/myComponent.js -t ./tools/codemodes/template.js -d -p
This command will run my transform against the myComponent.js file, but I want it to be run against all of the .js files in my project. What do I need to change with my current command to make it select all .js files, and run the transform against all of them?

If you want to run a transform against all files in a directory, you just pass the directory path instead of a file path. For the example above, do the following:
jscodeshift ./folder -t ./tools/codemod/templates.js -d -p
That will run the transform against all of your nested js files.

The answer provided by frosty will run on all files in the directory, not just JS files. If you want to run the transform only on the nested JS files, use the ** glob syntax:
jscodeshift ./path/to/folder/**/*.js -t ./tools/codemod/templates.js -d -p
Note that certain shells don't recognize ** as a recursive glob by default, and you may have to enable it first (in bash, via the globstar option):
$ shopt -s globstar
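If your shell doesn't support globstar at all, a portable alternative is to let find enumerate the nested .js files and hand them to jscodeshift. A sketch, assuming the transform path from the question; the jscodeshift line is commented out because it requires jscodeshift to be installed:

```shell
# Stand-in project tree to show what find matches.
mkdir -p folder/nested
touch folder/top.js folder/nested/deep.js folder/notes.txt

# Lists only the nested .js files, not notes.txt.
find ./folder -name '*.js'

# Then feed them to the transform (assumes jscodeshift is installed):
# find ./folder -name '*.js' -exec jscodeshift -t ./tools/codemod/templates.js -d -p {} +
```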

Related

zip created using -jr flags unzipped differently on macOS when double clicking vs running unzip

I am zipping the .xctest file from Plugins folder inside the .app target generated by building my app. I have a build phase script that runs last in my test target to copy this file over. I use the following script to do the zipping:
XCTEST_FILE=${TARGET_BUILD_DIR}/${TARGET_NAME}.xctest
XCTEST_ZIP=${TARGET_BUILD_DIR}/../../${TARGET_NAME}.xctest.zip
zip -jr ${XCTEST_ZIP} ${XCTEST_FILE}
This gives me a TestTarget.xctest.zip file, but it unzips differently depending on which of these 2 methods I use:
unzip TestTarget.xctest.zip
-TestTarget
-CodeResources
-Info.plist
Double clicking TestTarget.xctest.zip in finder
-TestTarget.xctest
--TestTarget
--CodeResources
--Info.plist
Why is unzip going to the innermost node and extracting all the files? I want the unzip command to give me the .xctest directory. I tried renaming the zip file to TestTarget.zip and it still behaves similarly.
I was initially zipping using zip -r ${XCTEST_ZIP} ${XCTEST_FILE}, but the problem with this was that it retained the entire folder structure from the root (/) when I double-clicked to unzip the file. A post recommended using the -j flag instead of -r, but -j alone led to no zip file being generated. Another comment recommended -jr, which created a zip that produced the output I expected when double-clicking it. But I guess the unzip command does things differently.
Similar Question: MacOs zip file - different result when double click and running unzip command
The cause of the error there was very different, though: the problem occurred when the file was created, and it was a known path-length issue on Windows rather than a macOS issue.
Based on How to zip folder without full path, I had to update my script to first cd into TARGET_BUILD_DIR before generating the zip. I also had to remove the -j flag so that the local folder structure was retained when running unzip.
cd ${TARGET_BUILD_DIR}
XCTEST_FILE=./${TARGET_NAME}.xctest
XCTEST_ZIP=../../${TARGET_NAME}.xctest.zip
zip -r ${XCTEST_ZIP} ${XCTEST_FILE}

append a parameter to a command in the file and run the appended command

I have the following command in a file called $stat_val_result_command.
I want to append the -Xms1g parameter at the end of the file so that it looks like this:
<my command in the file> -Xms1g
However, I want to run this command after appending to it. I am running this in a workflow system called Nextflow. I tried many things, including the following, but it does not work. Check the script section, which runs in Bash by default:
process statisticalValidation {
    input:
    file stat_val_result_command from validation_results_command.flatten()

    output:
    file "*_${params.ticket}_statistical_validation.txt" into validation_results

    script:
    """
    echo " -Xms1g" >> $stat_val_result_command && `cat $stat_val_result_command`
    """
}
Best to avoid appending to or manipulating input files localized in the workdir, as these can be, and by default are, symbolic links to the original files.
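The danger is easy to demonstrate outside Nextflow (a minimal sketch with placeholder file names): appending through a symlink writes to the original file it points at.

```shell
# An "original" command file, and a symlink standing in for the
# staged copy Nextflow places in the work directory.
printf 'echo hello\n' > original_command.sh
ln -s original_command.sh staged_command.sh

# Appending to the symlink silently modifies the original file too.
echo ' -Xms1g' >> staged_command.sh
cat original_command.sh
```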
In your case, consider instead exporting the JAVA_TOOL_OPTIONS environment variable. This might or might not work for you, but might give you some ideas if you have control over how the scripts are being generated:
export JAVA_TOOL_OPTIONS="-Xms1g"
bash "${stat_val_result_command}"
Also, it's generally better to avoid localizing and running scripts like this. It might be unavoidable, but usually there are better options. For example, third-party scripts like your Bash script could be handled more simply:
Grant the execute permission to these files and copy them into a folder named bin/ in the root directory of your project repository. Nextflow will automatically add this folder to the PATH environment variable, and the scripts will automatically be accessible in your pipeline without the need to specify an absolute path to invoke them.
This of course assumes you can control and parameterize the process that creates your Bash scripts.
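A sketch of that bin/ layout (the project and script names here are hypothetical): an executable script placed in bin/ at the project root becomes callable by bare name from any process.

```shell
# Hypothetical project with a helper script in bin/.
mkdir -p my-pipeline/bin
printf '#!/bin/sh\necho validated\n' > my-pipeline/bin/validate.sh
chmod +x my-pipeline/bin/validate.sh

# Nextflow would put my-pipeline/bin on PATH; here we call it directly.
my-pipeline/bin/validate.sh
```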

openBinaryFile: does not exist when executing pandoc in Gitlab CI bash script

I am getting this error:
pandoc: sh: openBinaryFile: does not exist (No such file or directory)
when trying to build some assets with Pandoc in a Gitlab CI bash script.
I have a repo, Finnito/Science, that is serving a Gitlab Pages site using Hugo. I am trying to set up a Gitlab CI pipeline to build my HTML slides and PDFs docs from my Markdown source when I commit to the repo so that I don't have to build them locally.
I have been trying out different Docker images of pandoc but decided pandoc/latex is my best bet because it's "official" and built on Alpine which is nice and lightweight. But I can't seem to make heads or tails of this error.
I have tried various different incantations for pandoc but they don't seem to work.
My Gitlab CI job looks like this:
assets:
image: pandoc/latex
script:
- chmod +x ci-build.sh
- sh ci-build.sh
and my ci-build.sh script looks like this:
#!/bin/sh
modulesToBuild=(
    "/builds/Finnito/science/content/10sci/5-fire-and-fuels"
    "/builds/Finnito/science/content/10scie/6-geology"
    "/builds/Finnito/science/content/11sci/4-mechanics"
    "/builds/Finnito/science/content/11sci/5-genetics"
    "/builds/Finnito/science/content/12phy/2-mechanics"
    "/builds/Finnito/science/content/12phy/3-electricity"
)
for i in "${modulesToBuild[@]}"; do
    # Navigate to the directory.
    cd $i
    # Build the HTML slides and
    # PDFs for all markdown docs.
    for filename in markdown/*.md; do
        file=${filename##*/}
        name=${file%%.*}
        pandoc/latex pandoc -s --mathjax -i -t revealjs "markdown/$name.md" -o "$name.html"
        pandoc/latex pandoc "markdown/$name.md" -o "$name.pdf" --pdf-engine=pdflatex
    done
done
Honestly, I'm just pretty lost with how to successfully call pandoc within the Docker container. I am very new to this and it all makes very little sense!
Any help would be most appreciated!
The image has /usr/bin/pandoc set as its entry point. This means that you don't have to specify the pandoc command when running the container; if you provide a command anyway, pandoc will try to read an input file with the name of that command, which causes the error you are seeing.
For a while, the images used a custom entrypoint script which tried to detect whether a different binary should be executed, but this was reverted as it proved unreliable and confusing.
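Given that explanation, one way to fix the job is to clear the entrypoint in the GitLab CI config (the entrypoint: [""] override is GitLab's documented pattern for images like this), so that plain shell commands run again, and then call pandoc by bare name inside ci-build.sh instead of prefixing it with the image name. A sketch based on the job from the question:

```yaml
assets:
  image:
    name: pandoc/latex
    entrypoint: [""]
  script:
    - chmod +x ci-build.sh
    - sh ci-build.sh
```

Inside ci-build.sh the invocations then become plain pandoc commands, e.g. pandoc -s --mathjax -i -t revealjs "markdown/$name.md" -o "$name.html".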

bash: ngc: command not found

I'm using @angular/compiler-cli to build my ng2 app in AoT mode. When I enter 'ngc -p tsconfig-aot.json' in my bash window, I get 'bash: ngc: command not found'. However, when I use 'node_modules/.bin/ngc -p tsconfig-aot.json' instead, it works. I googled several times but didn't get any useful information. Can anyone give me a hand? Thanks!
Seems like you need to put ngc in your path:
echo $PATH
Do you see the ngc binary in your path?
If not:
PATH=$PATH:/path/to/ngc
To make it permanent add to .bash_profile
export PATH=$PATH:/path/to/ngc
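The mechanics of that PATH fix can be sketched with a stand-in binary (the directory layout mirrors npm's node_modules/.bin convention; the script contents are placeholders): once the directory is on PATH, the bare command name resolves.

```shell
# Fake locally-installed CLI, as npm would place it.
mkdir -p node_modules/.bin
printf '#!/bin/sh\necho ngc-ok\n' > node_modules/.bin/ngc
chmod +x node_modules/.bin/ngc

# Append the directory to PATH, then the bare name works.
export PATH="$PATH:$(pwd)/node_modules/.bin"
ngc
```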
I've tried changing the slash to a backslash on Windows and it worked for me:
node_modules\.bin\ngc
If you don't want to set it globally, you can specify an absolute path in your Angular project; just make sure that you delete this part of the path when you don't use it anymore.
ngc is in node_modules/.bin, so depending on where you want to use ngc you can export the path like this:
PATH=$PATH:../../../node_modules/.bin
To run commands located in the node_modules folder of your project without installing them globally (an operation that would make the ngc command work from any system folder), you can use this command:
npx ngc <options>
Basically, npx is a shortcut that executes any command located in the node_modules/.bin folder.

minify phonegap javascript in xcode

How can I run a minification script on the javascript used in a phonegap project after it has been copied into the build by the "copy bundle resources" build phase?
I'm sure it should be a case of adding a script like:
for i in $DSTROOT/www/*.js
do
uglifyjs --overwrite $i
done
But $DSTROOT/www doesn't seem to be the folder it copies things to. What is the correct environment variable to use?
$PROJECT_DIR will give you your project's output directory. I combine all my js files in my www output directory first:
perl -pe 1 `find "$PROJECT_DIR/www" -name '*.js'` > "$PROJECT_DIR/www/all.js"
You can then minify your combined js in a single command with your chosen minifier.
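The combine step can also be done with find alone, avoiding the perl invocation (a sketch; the www contents here are placeholders standing in for $PROJECT_DIR/www):

```shell
# Stand-in for the project's www output directory.
mkdir -p www
printf 'var a = 1;\n' > www/a.js
printf 'var b = 2;\n' > www/b.js

# Concatenate every .js file into a single bundle
# (written outside www so it doesn't match its own glob).
find www -name '*.js' -exec cat {} + > all.js
cat all.js
```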
