NativeScript Downloader plugin: Can not open zip file after downloading the file - nativescript

I am working on NativeScript 8 and using the @triniwiz/nativescript-downloader plugin (installed via ns plugin add) to download a zip file from the server.
I get the download response
"status": "completed",
"path": "file:///data/user/0/com.myapp.example/cache/1bbf6484-9c77-4357-9759-1c1a55011a21"
but when the zip plugin tries to unzip the downloaded file, it gives me this:
File does not exist, invalid archive path: file:///data/user/0/com.myapp.example/cache/1bbf6484-9c77-4357-9759-1c1a55011a21
I am using @nativescript/zip for unzipping the downloaded file.
unZipFile(path, unzipPath) {
    let destination = fs.path.join(this.document.path, "/assets/", unzipPath);
    return Zip.unzip({
        archive: path,
        directory: destination,
        onProgress: this.onUnZipProgress
    }).then((res) => {
        console.log(res);
        return destination;
    }).catch((err) => {
        return 'failed-----------------:' + err;
    });
}
I'm not sure if there is something wrong with the code or the plugin. Can someone please help?

Check the download directory you're using. You likely should be using only the temp or documents known folders. See the [NativeScript File System][1] docs for details.
I've seen a problem similar to this where it looks like the file downloaded successfully but in fact failed due to security restrictions. This is especially true on iOS.
[1]: https://v7.docs.nativescript.org/ns-framework-modules/file-system
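For example, a rough sketch of that check using the core file-system APIs (assuming @nativescript/core, Zip imported as in the question, and downloadedPath standing in for the path the downloader returned):

import { knownFolders, path as nsPath, File } from '@nativescript/core';

// Strip the file:// scheme and confirm the archive really exists in a known
// folder (temp/documents) before handing it to Zip.unzip.
const localPath = downloadedPath.replace('file://', '');
if (!File.exists(localPath)) {
    console.log('Downloaded archive not found at:', localPath);
} else {
    const destination = nsPath.join(knownFolders.documents().path, 'assets', 'unzipped');
    Zip.unzip({ archive: localPath, directory: destination })
        .then(() => console.log('Unzipped to', destination))
        .catch((err) => console.log('Unzip failed:', err));
}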

Related

How to upload a PDF file with cypress.io

I would like to upload a PDF file using Cypress.
The idea is to go and find the PDF file in my directory.
I haven't found a way yet.
This is the element used to upload
2022 Update
Since Cypress v9.3.0 you can use
.selectFile('fileName.pdf')
You can find more details about it in the Cypress API docs.
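For instance, a minimal sketch, assuming the PDF sits in the project's fixtures folder and the page has a file input (the selector is illustrative):

// Attach a fixture PDF to a file input with the built-in command (Cypress >= 9.3.0)
cy.get('input[type="file"]').selectFile('cypress/fixtures/myfile.pdf');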
Cypress doesn't support file upload out of the box.
cypress-file-upload is an npm package which provides a custom Cypress command to upload files easily.
But, it requires the file to be present in the fixtures folder.
Sample code for uploading a PDF file:
const fileName = 'myfile.pdf';
cy.fixture(fileName).then(fileContent => {
    cy.get('#filesToUpload').upload({ fileContent, fileName, mimeType: 'application/pdf' }, { subjectType: 'input' });
});
You can also find more workarounds here: https://github.com/cypress-io/cypress/issues/170
As mentioned by @bushra, you need to install the 'cypress-file-upload' package and then you can do this :)
Cypress.Commands.add("UploadFile", function () {
    cy.fixture("someFile", "binary")
        .then(Cypress.Blob.binaryStringToBlob)
        .then((fileContent) => {
            cy.get('someElement').attachFile({
                fileContent,
                filePath: "someFile.pdf",
                fileName: "someFile.pdf",
            });
        });
});
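The custom command can then be invoked from a spec, for example:

// Calls the command registered above; assumes the fixture file exists
cy.UploadFile();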

Laravel-mix versioning when uploaded in S3 thinks in previous hash

Using the webpack-s3-plugin npm package, I'm saving my laravel-mix compiled & versioned files to S3 (for CDN purposes).
Bear in mind, this was working until yesterday.
let webpackPlugins = [];

if (mix.inProduction() && process.env.UPLOAD_S3) {
    webpackPlugins = [
        new s3Plugin({
            include: /.*\.(css|js)$/,
            s3Options: {
                accessKeyId: process.env.AWS_KEY,
                secretAccessKey: process.env.AWS_SECRET,
                region: process.env.AWS_REGION,
            },
            s3UploadOptions: {
                Bucket: process.env.ASSETS_S3_BUCKET,
                CacheControl: 'public, max-age=31536000'
            },
            basePath: 'assets/' + process.env.APP_ENV,
            directory: 'public'
        })
    ]
}

mix.scripts([ // I also tried '.combine'
    'resources/js/vendor/vendor/jquery.slimscroll.js',
    'resources/js/vendor/custom/theme-app.js',
], 'public/js/scripts.js')
    // Other bundling stuff
    .js([...]).version();

mix.webpackConfig({
    plugins: webpackPlugins
});
Now S3's ETag doesn't match the mix-manifest.json hash, and when I visit the page it fetches exactly one version behind the latest upload. However, when I check the 'updated date' on S3, it's correct; nevertheless, the content is exactly one version behind.
What I suspect is that it's uploading to S3 before the bundling is completely done, but I'm not sure. What am I missing here?
I used this guide if you want to know the laravel side in detail.
After digging around the S3 plugin source, I am fairly confident this is caused by the webpack hook used to trigger the S3 upload. I don't know enough about webpack plugins to give a full description, but I have made an educated guess at the cause, and my proposed fix seems to have sorted the issue.
The author of the plugin has accepted my pull request and the fix is currently awaiting release.
If you need a fix in the meantime, you can do it like so (note, this is very dirty and should be treated as temporary):
1. Browse to your node_modules folder.
2. Locate the folder named webpack-s3-plugin.
3. Copy the file dist/s3_plugin.js and paste it somewhere in your project.
4. Open the file and locate the line t.hooks.afterEmit.tapPromise.
5. Replace it with t.hooks.done.tapPromise.
6. In your webpack.mix.js file, change require('webpack-s3-plugin') to point to your copied JavaScript file.
Just to reiterate, this is a temporary fix until the latest version of the plugin is released.
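For context, a rough sketch of what that one-line change amounts to inside a webpack plugin's apply() method (class and method names here are illustrative, not the plugin's actual source):

class UploadToS3Plugin {
    apply(compiler) {
        // afterEmit fires right after assets are emitted, which may be before other
        // plugins (such as the one writing mix-manifest.json) have finished:
        // compiler.hooks.afterEmit.tapPromise('UploadToS3Plugin', () => this.uploadAssets());

        // done fires once the whole compilation has finished, so every versioned
        // asset and the final manifest should be in place before the upload starts:
        compiler.hooks.done.tapPromise('UploadToS3Plugin', () => this.uploadAssets());
    }
}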

Cannot install search-guard - "ERROR: `elasticsearch` directory is missing in the plugin zip"

As the title states, I have a problem while trying to install the search-guard plugin for my ELK stack:
[XXX#XXXX bin]$ ./elasticsearch-plugin install -b file:///home/xxxx/search-guard-6-6.2.1-21.0.zip
-> Downloading file:///home/xxxx/search-guard-6-6.2.1-21.0.zip
[=================================================] 100%  
ERROR: `elasticsearch` directory is missing in the plugin zip
I tried to do it from a custom directory and then, following this answer, from my home directory, but it did not help. When I unzip the archive, I can see that there is a directory called "elasticsearch" in there.
Does anyone have any suggestions how to proceed with that?
The error comes from InstallPluginCommand.class within lib\plugin-cli-x.x.x.jar and is exactly what it says. Here's a clipped portion of the code as it reads through the entries in the zip file:
ZipInputStream zipInput = new ZipInputStream(Files.newInputStream(zip));
try {
    ZipEntry entry;
    while ((entry = zipInput.getNextEntry()) != null) {
        if (entry.getName().startsWith("elasticsearch/")) {
            hasEsDir = true;
            ...
        }
    }
    if (!hasEsDir) {
        throw new UserException(2, "`elasticsearch` directory is missing in the plugin zip");
    }
I realize that doesn't help you much, but as a last-ditch effort, if you can't get to the root cause of the issue, one thing I did to get over the hurdle was to just copy the files from the zip into the ES plugins directory (/usr/share/elasticsearch/plugins in our case). They go within /plugins, but under a subdirectory whose name is the name Elasticsearch knows the plugin by.
The only two gotchas are:
1. You need to know the directory name to create under /plugins.
2. You need to know the replacement values for the plugin-descriptor.properties file.
If you can get that far, you can start ES and it should load everything fine.
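For reference, given the startsWith("elasticsearch/") check above, the entries in the zip must sit under a top-level elasticsearch/ directory (not nested under another folder), roughly like this (inner file names are illustrative):

search-guard-6-6.2.1-21.0.zip
  elasticsearch/
    plugin-descriptor.properties
    search-guard-6-6.2.1-21.0.jar
    ... (remaining jars and config files)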

PhoneGap Build API for Node.js - Unable to load a custom build

I am trying to upload a zip file containing my app to PhoneGap Build using the API with Node.js, but it doesn't work; it does work if I upload the file manually from the website.
After a successful authentication with this piece of code:
pgBuild.auth({ token: phonegapBuildToken }, authenticationResponse);
in my callback I do the following:
function authenticationResponse(e, api) {
    unlockAndroidKeyMethod(api);
    unlockiOSKeyMethod(api);

    var options = {
        form: {
            data: {
                platforms: ['android', 'ios']
            },
            file: './www/xxx.zip'
        }
    };

    api.post(phonegapEndpoint + '/build', options, function(ee, data) {
        console.log('## BUILD IN PROGRESS...');
        console.log(ee);
        console.log(data);
        //waitingForPendingBuild(api);
    });
}
Inside the options I am pointing to the file I want to upload:
file: './www/xxx.zip'
The problem is that whatever I put there doesn't get picked up; PhoneGap Build always builds the file that was previously uploaded through the website.
Can I get some help, please? :)
Thanks
PS: I get no error
I have managed to solve this problem - it was apparently a problem with how I created the zip file... The PhoneGap Build API doesn't like zip files created with gulp-zip; using archiverjs (https://archiverjs.com/docs/) solves the issue :)
Thanks
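For reference, a minimal sketch of producing the zip with archiver instead of gulp-zip (paths and file names are illustrative, not the poster's exact setup):

const fs = require('fs');
const archiver = require('archiver');

const output = fs.createWriteStream('./xxx.zip');
const archive = archiver('zip');

output.on('close', () => console.log('Archive written, ' + archive.pointer() + ' bytes'));
archive.on('error', (err) => { throw err; });

archive.pipe(output);
// Put the app files at the root of the archive rather than inside a subfolder
archive.directory('www/', false);
archive.finalize();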

Missing manifest.json when uploading Firefox Add-on to AMO

I'm unable to upload my Firefox extension using the form provided by Mozilla. I'm constantly getting the error "Your add-on failed validation with 2 errors."
"No install.rdf or manifest.json found" and "Add-on missing manifest" - which is very misleading because my application has a manifest.json.
The manifest.json looks like this:
{
    "manifest_version": 2,
    "version": 1.0,
    "name": "my-extension-name",
    "description": "Lorem ipsum dolor sit amet",
    "background": {
        "scripts": ["js/background.js"]
    },
    "main": "popup.js",
    "browser_action": {
        "default_icon": "img/icon_grey.png",
        "default_popup": "popup.html",
        "default_title": "loremipsum"
    },
    "engines": {
        "firefox": ">=38.0a1"
    },
    "permissions": [
        "activeTab",
        "tabs",
        "background",
        "http://*/*",
        "https://*/*",
        "notifications",
        "alarms",
        "storage",
        "webRequest",
        "webRequestBlocking",
        "clipboardRead"
    ]
}
What is missing for this to work?
I was running into the same problem, but none of these instructions solved it.
What I always did was pack the whole folder, so the manifest.json was not on the first level when unpacked.
SOLUTION FOR ME
Select all the files, instead of the folder, and then pack them as one .zip file; it should work. At least it did for me.
Here is a link to the MDN Documentation.
The very simple answer to this is that it's unable to find the manifest in your zip file. This happens because when you take a folder and zip it using the default compressor in Windows, it throws the folder itself into a subfolder of the zip file you created...
Before compressing:
folderYouWantCompressed
  - FileInFolder.html
  - Manifest.json
After compressing it will look like this:
nameOfZip.zip
  - folderYouWantCompressed
    - FileInFolder.html
    - Manifest.json
But what you want is:
nameOfZip.zip
  - FileInFolder.html
  - Manifest.json
The reason Oliver Sauter's answer works is that when you select all the files within "folderYouWantCompressed", it compresses them without the subfolder, meaning you don't run into this problem and it has no trouble finding the manifest file.
From what I can tell, the "correct answer" seems to be signing the add-on itself, which is able to get the manifest file properly, so it does work but just seems like a third-party way of doing it (I did not look into it too deeply).
Note: I originally had my issue solved by looking at Oliver Sauter's post; I just wanted to make it clear for future people looking at this post.
When you open your add-on package zip file, the manifest.json file should be directly visible in order to upload it on AMO.
In your case, it looks like when you open your package zip, there is a folder, and the manifest.json is located inside that folder.
As I have found a solution to my problem and would like to share it for future reference, I am answering my own question:
The issue at hand was that I did not use the web-ext command line tool to create the .zip / .xpi package. I was able to solve the problem by installing web-ext and using web-ext build to build the extension. The result of this operation is a .xpi file that contains the project which I was then able to upload to the AMO service. Note that the manifest.json in the newly created package is identical to the manifest.json I originally provided. However, in addition to the manifest.json a directory META-INF was created which contains a mozilla.mf, mozilla.rsa and mozilla.sf file.
This, however, did not entirely solve my problem. After uploading the extension to AMO, it could not be installed and was reported as damaged. Apparently (something I read somewhere on the interwebz and forgot the source), Mozilla opens the uploaded .zip / .xpi package to test it, and since my package was not signed, Mozilla could not ensure its integrity and marked it as insecure (i.e. damaged).
In order to solve the second problem I had to sign the extension. This can be done using the following command:
web-ext sign --api-secret YOUR_API_SECRET --api-key YOUR_API_KEY
After this, I was able to upload and install the extension.
I got the same problem; the problem was that the file name is case sensitive:
Manifest.json -> error, no manifest found
manifest.json -> successful
My solution (on macOS):
Zip the directory using the terminal command zip -r example.zip example instead of right-clicking the files and clicking "Compress" in macOS.
