I'm trying to update a specific directory on AppFog for my Node.js application, but I don't know how it works.
I tried to execute the following command: "af update appname --path public/admin".
But it doesn't work: "No such file or directory".
And when I update the whole application, all my uploaded photos are deleted. By "uploaded photos" I mean the photos uploaded through an HTML upload form in my application, into the "public/images/photos" directory!
How can I update only the "public/admin" directory?
Anthony
Create a .afignore file in the root directory. There you can specify the files or directories you want the update command to ignore.
Example:
# ignore run.py
run.py
# ignore dot files
.*
# ignore assets directory
assets/
# ignore static directory
static/
Once you are done specifying the files, run af update appname.
You can find the documentation here: https://docs.appfog.com/getting-started/af-cli/afignore
Is there a way to change the cypress.config.ts file's location instead of having it in the root of the project along with package.json and tsconfig.json?
I changed the file's (cypress.config.ts) location, but it doesn't work; I got an error message:
Could not find a Cypress configuration file in this folder:
/Users/Mycomputer/cypressDemoPomExample
I figured out how to fix it.
You can simply run the command:
npx cypress open --config-file theNewCypressConfigPath
I have written a Ruby script to look up documents with a given date and upload them to Google Drive using the google-drive-ruby gem. I have a folder inside the gdrive root path where I want to place the files; I access it using collection_by_title and then upload the file using the .add method.
The problem is that each file is being uploaded twice: once to the folder I want and once to the root path of my GDrive. Any thoughts?
This is the method where the file gets uploaded:
def upload_document(file, folder_code)
  folder = @session.collection_by_title("#{folder_code}")
  path = "#{@basedir}/#{folder_code}/#{file}"
  folder.add(@session.upload_from_file(path, file, convert: false))
end
EDIT: Methods and variables translated to english.
Each time the method upload_document is triggered, one copy of the file gets uploaded to the folder and another copy gets uploaded to the root path of gdrive.
Example: Method upload_document gets called with the file (454327.pdf) and the code of the folder where it has to be uploaded in gdrive ("1"). I build the folder object using collection_by_title, I build the path where the file is located on my local network, and finally the file gets uploaded using upload_from_file. At this point, two copies of the file have been uploaded: one to the root path of gdrive (which I don't want) and one to the right folder in gdrive.
I received an answer from the gem creator explaining what is happening and my script is finally working as I expected.
https://github.com/gimite/google-drive-ruby/issues/260
The thing is that the file is first uploaded to the root by default, and .add then just moves the file into the selected collection, so the file needs to be removed from root after the move operation is completed:
@session.root_collection.remove(file)
I am new to RPM creation. I need a way to install/uninstall/upgrade a Jenkins plugin using an RPM. I am able to install the plugin using the RPM, but on uninstallation how can I delete a new file/directory that was not part of the package?
Suppose my package only deploys an xyz.jpi file on the server, which on server restart creates xyz.jpi and an xyz folder. On uninstallation I want to remove both created folders.
You can use the %ghost directive in the %files section, which means "this file/folder does not exist yet, but when it appears it will be mine."
%files
%ghost %dir /path/to/unexisting/xyz
If there are files present in that directory, I am not sure rpm will remove them. In that case it might be necessary to add another line (to be tested!):
%ghost /path/to/unexisting/xyz/*
More information is in the documentation.
How about checking in the %postun section whether this is the last installed package that owns the folder?
E.g.:
In our case all products (a, b, c) co-own /opt/xyz; the last of a, b, c to be uninstalled removes the /opt/xyz folder (since it is not owned by rpm).
We check this with rpm -qa | egrep 'b|c' ....
If nothing is present, we do rm -rf /opt/xyz.
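A sketch of that check as a %postun scriptlet (untested; the co-owning package names b and c and the /opt/xyz path are this example's assumptions):

```spec
%postun
# $1 is 0 on a full erase (non-zero means an upgrade is in progress);
# clean up only on erase, and only if no co-owning package remains.
if [ "$1" -eq 0 ] && ! rpm -q b >/dev/null 2>&1 && ! rpm -q c >/dev/null 2>&1; then
    rm -rf /opt/xyz
fi
```

Checking $1 avoids deleting the shared folder during an upgrade, when %postun of the old version also runs.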
I've set up a custom configuration file for Pylint (named, conveniently, config). There has to be a way that I don't have to include --rcfile=config on every run. How can I set the config file permanently?
When you do not specify the --rcfile option, Pylint searches for a configuration file in the following order and uses the first one it finds:

1. pylintrc in the current working directory. If the current working directory is in a Python module, Pylint searches up the hierarchy of Python modules until it finds a pylintrc file. This allows you to specify coding standards on a module-by-module basis. (A directory is judged to be a Python module if it contains an __init__.py file.)
2. The file named by the environment variable PYLINTRC
3. .pylintrc in your home directory, unless you have no home directory or your home directory is /root
4. .pylintrc in the current working directory
5. /etc/pylintrc
Thus depending on the method you choose, Pylint can use a different configuration file based on the location of the code, the user or the machine.
Note that the configuration file only applies to Python files that are in modules. Thus, Pylint still uses its default rules when analyzing Python files in a directory with no __init__.py file.
For example, I have a bin/ directory containing command line applications. Ordinarily, this directory needs no __init__.py file because it is never imported. I had to add a bin/__init__.py file to get Pylint to analyze these Python files using my pylintrc file.
Set the path to that file in the PYLINTRC environment variable, or rename the file to $HOME/.pylintrc or /etc/pylintrc (the latter is probably only supported on *nix).
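For the environment-variable route, a minimal shell sketch (the project path is hypothetical):

```shell
# Point every Pylint run in this shell at the custom config file
# (named "config", as in the question). Add this line to ~/.bashrc
# or ~/.profile to make it permanent.
export PYLINTRC="$HOME/myproject/config"
echo "$PYLINTRC"
```

After this, a plain `pylint mymodule.py` picks up the file without --rcfile.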
It can also be done using .pre-commit-config.yaml. The snippet below needs to be added to .pre-commit-config.yaml:
repos:
  - repo: local
    hooks:
      - id: pylint
        name: pylint
        entry: pylint
        language: system
        types: [python]
        args: [
          "-rn", # Only display messages
          "-sn", # Don't display the score
          "--rcfile=.pylintrc", # Link to your config file
          "--load-plugins=pylint.extensions.docparams", # Load an extension
        ]
I'm just installing MediaWiki (loving it). I'm looking at this for adding images. I can see the logic of
[[File:MediaWiki:Image sample|50px]]
but where do I set the filepath for "File" (nothing obvious in LocalSettings.php) ... or is there some other logic at work?
I'd appreciate any help
Thanks
File location is determined by $wgLocalFileRepo which by default depends on $wgUploadDirectory and $wgHashedUploadDirectory. The upload directory defaults to [MediaWiki base dir]/images (Adrian must be using an older version). If hashing is enabled, /x/xy will be appended to the path, where xy are the first two letters of the md5 hash of the filename.
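As a sanity check of the hashing rule, this shell sketch reproduces the /x/xy subpath for a hypothetical file name (assuming $wgHashedUploadDirectory is enabled and the md5-of-filename scheme described above):

```shell
# Hypothetical file name; MediaWiki normalizes spaces to underscores.
name="Image_sample.jpg"
# The first two hex chars of md5(filename) select the subdirectories.
hash=$(printf '%s' "$name" | md5sum | cut -c1-2)
echo "images/${hash:0:1}/${hash}/${name}"
```

So a file whose hash starts with "ab" lands under images/a/ab/.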
The defaults from DefaultSettings.php are:
$wgUploadPath = "$wgScriptPath/uploads";
$wgUploadDirectory = "$IP/uploads";
If you want to change this, you should copy and paste this into LocalSettings.php
And make sure that $wgEnableUploads = true; is in LocalSettings.php too.
Your "Image sample" is the name of the image, not the name of a file. Via the config file you can only set the root folder for image uploads.
Just for future reference in case someone else runs into this issue:
I installed MediaWiki on my Mac OS Sierra and when I attempted to upload an image I got the following message:
Failed:
Could not open lock file for "mwstore://local-backend/local-public/d/d9/babypicture.png".
I changed the permissions on the mediawiki_root/images folder so it is owned by the _www user and group:
chown -R _www:_www wiki/images
I was able to upload the image afterward.