I am working with NativeScript 8 and using the @triniwiz/nativescript-downloader plugin (installed via ns plugin add @triniwiz/nativescript-downloader) to download a zip file from the server.
I get the download response
"status": "completed",
"path": "file:///data/user/0/com.myapp.example/cache/1bbf6484-9c77-4357-9759-1c1a55011a21"
but when the plugin tries to unzip the downloaded file, it gives me this:
File does not exist, invalid archive path: file:///data/user/0/com.myapp.example/cache/1bbf6484-9c77-4357-9759-1c1a55011a21
I am using @nativescript/zip to unzip the downloaded file.
unZipFile(path, unzipPath) {
    // Destination inside the app's documents folder:
    let destination = fs.path.join(this.document.path, "/assets/", unzipPath);
    return Zip.unzip({
        archive: path,
        directory: destination,
        onProgress: this.onUnZipProgress
    }).then((res) => {
        console.log(res);
        return destination;
    }).catch((err) => {
        return 'failed-----------------: ' + err;
    });
}
I'm not sure whether the problem is in my code or the plugin. Can someone please help?
Check the download directory you're using. You likely should be using only the temp or documents known folders. See the [NativeScript File System][1] docs for details.
I've seen a similar problem where it looks like the file downloaded successfully, but the download in fact failed due to security restrictions. This is especially true on iOS.
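To rule that out, a quick sanity check is to confirm the file actually exists on disk and to normalize the file:// URI before handing the path to the zip plugin. A minimal sketch using @nativescript/core — the stripFileScheme helper is purely illustrative (not part of either plugin), and downloadResponse stands in for the result you logged above:

import { File, knownFolders, path } from '@nativescript/core';

// Some plugins return file:// URIs while others expect plain filesystem paths.
function stripFileScheme(p) {
    return p.startsWith('file://') ? p.replace('file://', '') : p;
}

const downloaded = stripFileScheme(downloadResponse.path);
console.log('exists on disk?', File.exists(downloaded));

// Unzip into a known folder the app can always write to, e.g. documents:
const destination = path.join(knownFolders.documents().path, 'assets');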
[1]: https://v7.docs.nativescript.org/ns-framework-modules/file-system
I have a Laravel project, and as you know, when you deploy your app, everything in your public directory should be copied over to your htdocs or public_html directory to hide your application's code.
I am using webpack to build my React code and everything else, and it works as expected: whenever I change my JavaScript, webpack sees the change and rebuilds.
However, I want to add one additional command after the build: copy everything from the public directory into the correct directory in htdocs/public_html.
So far I have read up on this question: Run command after webpack build.
I can get the echo to work, but cp isn't working, and I'm not sure why. How do I know which shell commands I can use?
I tried cp and even Copy-Item, which is PowerShell, but neither works.
This is my plugin so far. To be safe, I figured I should change into the project directory before copying anything over, but again, nothing works.
mix.webpackConfig(webpack => {
    return {
        plugins: [
            new WebpackShellPlugin({
                onBuildStart: ['echo "Starting Build ..."'],
                onBuildEnd: [
                    "cd 'E:\\xammp\\apps\\FactorioCalculator'",
                    "cp '.\\public\\*' '..\\..\\htdocs\\FactorioCalculator\\' -f -r"
                ]
            })
        ]
    };
});
You could always use the copyDirectory mix method. Just put something like the following at the bottom of your webpack.mix.js file:
mix.copyDirectory('public', '../../htdocs/FactorioCalculator/')
You might have to change your path to ..\\..\\htdocs\\FactorioCalculator\\ as per the path in your question (I only have my Mac with me, so I'm unable to test on my other machine).
To answer your original question: if you want to execute a command each time webpack finishes building, you can use mix.then(), which takes a closure.
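Putting the two together, a minimal sketch for the bottom of your webpack.mix.js (paths taken from your question; note that shell plugins like the one you're using typically run each command in a separate shell, so the cd in one entry doesn't carry over to the next — which is likely why your cp never ran where you expected):

mix.copyDirectory('public', '../../htdocs/FactorioCalculator');

// Runs once webpack has finished building:
mix.then(() => {
    console.log('Build finished and public/ copied to htdocs.');
});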
It's great that ParcelJS handles Sass out of the box, but I'm running into a problem where it keeps throwing an exception when it encounters a url() within my .scss file. I guess Parcel is trying to locate the resource and rewrite the URL. I do not want Parcel to do this. Is there any way to disable it? I just want it to compile the Sass and leave any URLs in there alone.
This question was posted when Parcel v1 was the latest version. For folks arriving here in the future, you can accomplish this in Parcel v2 with the parcel-resolver-ignore plugin. Here's how:
Install it (e.g. yarn add -D parcel-resolver-ignore)
Create or modify your .parcelrc file to add it to the parcel pipeline:
{
    "extends": "@parcel/config-default",
    "resolvers": ["parcel-resolver-ignore", "..."]
}
Add a "parcelIgnore" entry to package.json that contains regex patterns that define which resources to ignore, e.g.:
{
    // An array of regex patterns
    "parcelIgnore": [
        "images\/*.*"
    ]
}
Note that your regexes should match the URLs referenced in the .scss files, not the .scss files themselves. For example, with the pattern above, a reference like url("/images/logo.png") in your Sass is left alone instead of being resolved and rewritten.
I am trying to upload a zip file containing my app to PhoneGap Build using the API with Node.js, but it doesn't work; it does work if I upload the file manually from the website.
After a successful authentication with this piece of code:
pgBuild.auth({ token: phonegapBuildToken }, authenticationResponse);
in my callback I do the following:
function authenticationResponse(e, api) {
    unlockAndroidKeyMethod(api);
    unlockiOSKeyMethod(api);

    var options = {
        form: {
            data: {
                platforms: ['android', 'ios']
            },
            file: './www/xxx.zip'
        }
    };

    api.post(phonegapEndpoint + '/build', options, function(ee, data) {
        console.log('## BUILD IN PROGRESS...');
        console.log(ee);
        console.log(data);
        //waitingForPendingBuild(api);
    });
}
Inside the options I am pointing to the file I want to upload:

file: './www/xxx.zip'

The problem is that whatever I put there doesn't get picked up; PhoneGap Build always builds the file that was uploaded through the website.
Can I get some help, please? :)
Thanks
PS: I get no error
I have managed to solve this problem. It turned out to be a problem with how I created the zip file: the PhoneGap Build API doesn't like zip files created with gulp-zip. Using archiverjs (https://archiverjs.com/docs/) solves the issue :)
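For reference, a minimal archiverjs sketch along the lines of what worked for me (file names taken from my question; adjust paths as needed):

const fs = require('fs');
const archiver = require('archiver');

// Write the archive outside ./www so the zip doesn't end up inside itself.
const output = fs.createWriteStream('./xxx.zip');
const archive = archiver('zip');

output.on('close', () => console.log('Wrote ' + archive.pointer() + ' bytes'));
archive.on('error', (err) => { throw err; });

archive.pipe(output);
// Put the *contents* of ./www at the root of the archive:
archive.directory('www/', false);
archive.finalize();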
Thanks
I'm unable to upload my Firefox extension using the form provided by Mozilla. I constantly get the error Your add-on failed validation with 2 errors.
The errors are "No install.rdf or manifest.json found" and "Add-on missing manifest", which is very misleading because my application does have a manifest.json.
The manifest.json looks like this:
{
    "manifest_version": 2,
    "version": 1.0,
    "name": "my-extension-name",
    "description": "Lorem ipsum dolor sit amet",
    "background": {
        "scripts": ["js/background.js"]
    },
    "main": "popup.js",
    "browser_action": {
        "default_icon": "img/icon_grey.png",
        "default_popup": "popup.html",
        "default_title": "loremipsum"
    },
    "engines": {
        "firefox": ">=38.0a1"
    },
    "permissions": [
        "activeTab",
        "tabs",
        "background",
        "http://*/*",
        "https://*/*",
        "notifications",
        "alarms",
        "storage",
        "webRequest",
        "webRequestBlocking",
        "clipboardRead"
    ]
}
What is missing for this to work?
I was running into the same problem, but none of these instructions solved it.
What I always did was pack the whole folder, so the manifest.json was not on the first level when unpacked.
SOLUTION FOR ME
Select all the files instead of the folder, then pack them as one .zip file, and it should work. At least it did for me.
Here is a link to the MDN Documentation.
The very simple answer to this is that the validator is unable to find the manifest in your zip file. This happens because when you zip a folder using the default compressor in Windows, it puts everything into a subfolder of the zip file you created...
Before compressing:

folderYouWantCompressed
    - FileInFolder.html
    - manifest.json

After compressing, it will look like this:

nameOfZip.zip
    - folderYouWantCompressed
        - FileInFolder.html
        - manifest.json

But what you want is:

nameOfZip.zip
    - FileInFolder.html
    - manifest.json
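If you prefer the command line, one way to get that flat layout (a sketch using the standard zip tool, with the names from the example above):

cd folderYouWantCompressed
zip -r ../nameOfZip.zip *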
The reason Oliver Sauter's answer works is that when you select all the files within "folderYouWantCompressed", they are compressed without the subfolder, so you don't run into this problem and the validator has no trouble finding the manifest file.
From what I can tell, the "correct answer" is to sign the add-on itself, which packages the manifest properly, so it does work; it just seems like a third-party way of doing it (I did not look into it too deeply).
Note: my issue was originally solved by looking at Oliver Sauter's post; I just wanted to make this clear for future people finding this post.
When you open your add-on package zip file, the manifest.json should be visible at the top level in order to upload it to AMO.
In your case, it looks like when you open your package zip, there is a folder, and the manifest.json is located inside that folder.
As I have found a solution to my problem and would like to share it for future reference, I am answering my own question:
The issue at hand was that I did not use the web-ext command-line tool to create the .zip / .xpi package. I was able to solve the problem by installing web-ext and using web-ext build to build the extension. The result of this operation is a .xpi file containing the project, which I was then able to upload to the AMO service. Note that the manifest.json in the newly created package is identical to the manifest.json I originally provided; however, in addition to the manifest.json, a META-INF directory was created which contains a mozilla.mf, mozilla.rsa and mozilla.sf file.
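For reference, the build step itself is just the following (assuming web-ext is installed globally from npm):

npm install --global web-ext
web-ext build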
This, however, did not entirely solve my problem. After uploading the extension to AMO, it could not be installed and was reported as damaged. Apparently (I read this somewhere on the interwebz and forgot the source), Mozilla opens the uploaded .zip / .xpi package to test it, and since my package was not signed, Mozilla could not ensure its integrity and marked it as insecure (i.e. damaged).
In order to solve the second problem I had to sign the extension. This can be done using the following command:
web-ext sign --api-secret YOUR_API_SECRET --api-key YOUR_API_KEY
After this, I was able to upload and install the extension.
Got the same problem; the problem was that the file name is case-sensitive:

Manifest.json -> error, no manifest found
manifest.json -> successful
My solution (on macOS): zip the directory using the zip terminal command, zip -r example.zip example, instead of right-clicking the files and clicking "Compress" in macOS.