Since I am new to automation, I am facing some issues while writing a test script that downloads audio files into my Cypress project folder. I take song names from songs.json in my fixtures folder and want to download every song listed in that file, one by one (file sizes may vary). I am using the cypress-downloadfile plugin. The script downloads the first file, but after the first iteration (song) it fails with the error "cy.task('downloadFile') timed out after waiting 60000ms". If anybody knows a different approach to achieve this logic, that would also be helpful.
/// <reference types="Cypress" />
/// <reference types="cypress-downloadfile"/>
describe('Youtube video', function () {
  it('Download youtube video', function () {
    cy.visit('https://www.youtube.com/');
    cy.fixture('songs').each((songNames) => {
      cy.log(songNames.Song_Name);
      cy.get('#search-input > #search').type(songNames.Song_Name, { force: true });
      cy.get('#search-icon-legacy').click();
      cy.get('#dismissible .ytd-video-renderer').first().click();
      cy.url().then(($link) => {
        cy.origin('https://getn.topsandtees.space', { args: { $link } }, ({ $link }) => {
          cy.visit('/dw4YQQnEr5');
          cy.get('.form-group input[name="q"]').type($link);
          cy.get('.form-group .btn').click();
        });
        cy.get('#dl_format').select('MP3 320 kbps').should('have.value', 'mp3');
        cy.get('.search-item__download').click();
        cy.get('.search-item__download').invoke('attr', 'data-href').then((videoLink) => {
          cy.downloadFile(videoLink, 'mp3downloads', songNames.Song_Name + ".mp3");
        });
      });
    });
  });
});
For more context, I have attached an image of the error and an image of the song download location.
I am working on NativeScript 8 and using the @triniwiz/nativescript-downloader plugin (installed via ns plugin add @triniwiz/nativescript-downloader) to download a zip file from the server.
I get the download response
"status": "completed",
"path": "file:///data/user/0/com.myapp.example/cache/1bbf6484-9c77-4357-9759-1c1a55011a21"
but when the plugin tries to unzip the same downloaded file it gives me this
File does not exist, invalid archive path: file:///data/user/0/com.myapp.example/cache/1bbf6484-9c77-4357-9759-1c1a55011a21
I am using @nativescript/zip for unzipping the downloaded file.
unZipFile(path, unzipPath) {
  let destination = fs.path.join(this.document.path, "/assets/", unzipPath);
  return Zip.unzip({
    archive: path,
    directory: destination,
    onProgress: this.onUnZipProgress
  }).then((res) => {
    console.log(res);
    return destination;
  }).catch((err) => {
    return 'failed-----------------:' + err;
  });
}
I am not sure whether there is something wrong with the code or with the plugin. Can someone please help?
Check the download directory you're using. You likely should be using only the temp or documents known folders. See the [NativeScript File System][1] docs for details.
I've seen a problem similar to this where it looks like the file downloaded successfully but in fact failed due to security restrictions. This is especially true on iOS.
[1]: https://v7.docs.nativescript.org/ns-framework-modules/file-system
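For example, a rough sketch of what that could look like with @nativescript/core (the archive and folder names below are placeholders, not taken from the question):

// Sketch only: resolve the download target and the unzip destination under known folders
import { knownFolders, path as nsPath } from '@nativescript/core';

const downloadDir = knownFolders.temp().path; // app-writable temp folder
const archivePath = nsPath.join(downloadDir, 'my-archive.zip'); // placeholder file name
const destination = nsPath.join(knownFolders.documents().path, 'assets', 'unzipped'); // placeholder unzip target
// Hand archivePath to the downloader, then pass it as `archive` and destination as `directory` to Zip.unzip()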
I would like to upload a PDF file using Cypress.
The idea is to go and find the PDF file in my directory.
I haven't found a way to do it yet.
This is the element used to upload:
2022 Update
Since Cypress v9.3.0 you can use
.selectFile('fileName.pdf')
You can find more details about it in the Cypress API docs.
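For example (the selector and fixture path below are placeholders, not from the question):

// Requires Cypress 9.3.0 or newer; adjust the selector and path to your app
cy.get('input[type="file"]').selectFile('cypress/fixtures/myfile.pdf');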
Cypress doesn't support file upload out of the box.
cypress-file-upload is an npm package which provides a custom Cypress command to upload files easily.
But it requires the file to be present in the fixtures folder.
Sample code for uploading a PDF file:
const fileName = 'myfile.pdf';
cy.fixture(fileName).then(fileContent => {
  cy.get('#filesToUpload').upload({ fileContent, fileName, mimeType: 'application/pdf' }, { subjectType: 'input' });
});
You can also find more workarounds here: https://github.com/cypress-io/cypress/issues/170
As mentioned by @bushra, you need to install the 'cypress-file-upload' package and then you can do this :)
Cypress.Commands.add("UploadFile", function () {
  cy.fixture("someFile", "binary")
    .then(Cypress.Blob.binaryStringToBlob)
    .then((fileContent) => {
      cy.get('someElement').attachFile({
        fileContent,
        filePath: "someFile.pdf",
        fileName: "someFile.pdf",
      });
    });
});
Using the webpack-s3-plugin npm package, I'm saving my laravel-mix compiled & versioned files to S3 (for CDN purposes).
Bear in mind, this was working until yesterday.
let webpackPlugins = [];

if (mix.inProduction() && process.env.UPLOAD_S3) {
  webpackPlugins = [
    new s3Plugin({
      include: /.*\.(css|js)$/,
      s3Options: {
        accessKeyId: process.env.AWS_KEY,
        secretAccessKey: process.env.AWS_SECRET,
        region: process.env.AWS_REGION,
      },
      s3UploadOptions: {
        Bucket: process.env.ASSETS_S3_BUCKET,
        CacheControl: 'public, max-age=31536000'
      },
      basePath: 'assets/' + process.env.APP_ENV,
      directory: 'public'
    })
  ]
}
mix.scripts([ // I also tried '.combine'
  'resources/js/vendor/vendor/jquery.slimscroll.js',
  'resources/js/vendor/custom/theme-app.js',
], 'public/js/scripts.js')
// Other bundling stuff
.js([...]).version()

mix.webpackConfig({
  plugins: webpackPlugins
});
Now, the ETag on S3 doesn't match the hash in mix-manifest.json. And when I visit the page, it serves exactly one version behind, not the latest upload. However, when I check the 'updated date' on S3, it is correct; nevertheless, the content is exactly one version behind.
What I suspect is that it uploads to S3 before the bundling is completely done, but I am not sure. What am I missing here?
I used this guide if you want to know the laravel side in detail.
After digging around the S3 plugin source, I am fairly confident this is caused by the hook used to trigger the S3 upload. I don't know enough about webpack plugins to give a full description of this, but I have made an educated guess at what is causing the issue, and my proposed fix seems to have sorted it.
The author of the plugin has accepted my pull request and the fix is currently awaiting release.
If you need a fix in the meantime, then you can do it like so (note, this is very dirty and should be treated as temporary):
1. Browse to your node_modules folder
2. Locate the folder named webpack-s3-plugin
3. Copy the file dist/s3_plugin.js
4. Paste it somewhere in your project
5. Open the file and locate the line t.hooks.afterEmit.tapPromise
6. Replace it with t.hooks.done.tapPromise
7. In your webpack.mix.js file, change the require('webpack-s3-plugin') to point to your copied file
Just to reiterate, this is a temporary fix until the latest version of the plugin is released.
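To illustrate why the hook matters, here is a stripped-down sketch of a webpack plugin (the class and plugin names are made up for illustration, not the actual webpack-s3-plugin source): afterEmit runs as part of the emit phase, while done only fires once the whole compilation has finished, so an upload tapped on done sees the final assets.

// Illustration only
class UploadWhenDonePlugin {
  apply(compiler) {
    // The released plugin tapped the emit-phase hook:
    // compiler.hooks.afterEmit.tapPromise('UploadWhenDonePlugin', async (compilation) => { /* upload */ });

    // The fix taps the hook that fires after the compilation is completely finished:
    compiler.hooks.done.tapPromise('UploadWhenDonePlugin', async (stats) => {
      // upload the emitted files here
    });
  }
}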
I am trying to upload a zip file containing my app to PhoneGap Build using the API with Node.js, but it doesn't work; it does work if I upload the file manually from the website.
After a successful authentication with this piece of code:
pgBuild.auth({ token: phonegapBuildToken }, authenticationResponse);
in my callback I do the following:
function authenticationResponse(e, api) {
  unlockAndroidKeyMethod(api);
  unlockiOSKeyMethod(api);

  var options = {
    form: {
      data: {
        platforms: ['android', 'ios']
      },
      file: './www/xxx.zip'
    }
  };

  api.post(phonegapEndpoint + '/build', options, function(ee, data) {
    console.log('## BUILD IN PROGRESS...');
    console.log(ee);
    console.log(data);
    //waitingForPendingBuild(api);
  });
}
Inside the options I am pointing to the file I want to upload:
file: './www/xxx.zip'
The problem is that whatever I put there doesn't get picked up; what PhoneGap Build builds is always the file that was loaded through the website.
Can I get some help, please? :)
Thanks
PS: I get no error
I have managed to solve this problem - it was apparently a problem with how I created the zip file. The PhoneGap Build API doesn't like zip files made with gulp-zip; using archiverjs (https://archiverjs.com/docs/) solves the issue :)
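For reference, roughly what the archiver-based zipping looks like (the paths below are placeholders, not my exact setup):

// Sketch only: create the zip with archiver before posting it to the PhoneGap Build API
const fs = require('fs');
const archiver = require('archiver');

const output = fs.createWriteStream('./build/xxx.zip'); // assumes ./build exists; point the API's file option here
const archive = archiver('zip', { zlib: { level: 9 } });

output.on('close', () => console.log('Archive written, ' + archive.pointer() + ' bytes'));
archive.on('error', (err) => { throw err; });

archive.pipe(output);
archive.directory('www/', false); // put the contents of www/ at the root of the archive
archive.finalize();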
Thanks
Problem - copied images end up with size 0 x 0 px.
In my gulpfile.js I copy images from the app directory to the dist directory, which works well. Here's a simplified version of the task:
gulp.task('html-deploy', function() {
  return gulp.src([
    'app/img/**/*',
    'app/**/*.html'
  ], {
    base: 'app'
  })
  .pipe(gulp.dest('dist'))
});
My file structure looks like this:
/app
  index.html
  /img
    image1.png
    image2.png
Everything copies over well and nice, however the images don't display in the browser, even though filepaths are correct.
The weird thing is that when I locate the images directly in Finder (macOS) I can't view them there either, although the file sizes and read/write values are correct.
This is probably because the copied images end up being 0x0 px in size. Why is this happening?
I was facing the same issue with an Ionic 2 application.
I added this gulp task:
gulp.task('images', function() {
  return gulp.src('app/images/**/*')
    .pipe(gulp.dest('www/build/images'));
});
called it from where the other build tasks are called (the gulp build task),
and changed my image references to point to the build folder:
<img src="build/images/mylogo.png" height="50px" width="50px">
Hope it helps someone!
This is kind of an answer to my own question - at least it fixed the problem.
Instead of directly copying the images I copy them via gulp-imagemin, and voilà!
var gulp = require('gulp');
var imagemin = require('gulp-imagemin');
var changed = require('gulp-changed'); // needed for the changed() call below
var del = require('del');

// Minify and copy new images to dist
gulp.task('imagemin', ['clean'], function() {
  return gulp.src('app/img/**/*')
    .pipe(changed('dist/img'))
    .pipe(imagemin())
    .pipe(gulp.dest('dist/img'))
})