MATLAB code for multiple tasks done automatically

I want to create MATLAB code that builds a database of images and, when an image is given as input, searches the database and matches it against the stored images. After that it should send a command to a microcontroller. I want to know how I can make the whole thing run automatically, with the loop running on its own.
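No answer is attached to this question in the source, so what follows is only a rough sketch of the loop being described: build a feature database from stored images, match each incoming image against it, and send a command over a serial link when a match is found. It is written in Python with OpenCV and pyserial rather than MATLAB, and the folder name, input file name, serial port and command string are all assumptions.
import glob
import time
import cv2      # OpenCV: image loading and feature matching
import serial   # pyserial: talking to the microcontroller

DATABASE_DIR = "database/*.png"   # assumed location of the stored images
SERIAL_PORT = "/dev/ttyUSB0"      # assumed serial port of the microcontroller

orb = cv2.ORB_create()
bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

# Build the "database": one ORB descriptor set per stored image.
database = {}
for path in glob.glob(DATABASE_DIR):
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    _, desc = orb.detectAndCompute(img, None)
    if desc is not None:
        database[path] = desc

def best_match(query_path):
    # Return the stored image whose descriptors match the query best, or None.
    query = cv2.imread(query_path, cv2.IMREAD_GRAYSCALE)
    if query is None:
        return None
    _, q_desc = orb.detectAndCompute(query, None)
    if q_desc is None:
        return None
    scores = {name: len(bf.match(q_desc, desc)) for name, desc in database.items()}
    return max(scores, key=scores.get) if scores else None

with serial.Serial(SERIAL_PORT, 9600, timeout=1) as link:
    while True:                           # the loop that runs automatically
        match = best_match("input.png")   # assumed name of the incoming image
        if match is not None:
            link.write(b"MATCH\n")        # placeholder command to the microcontroller
        time.sleep(1)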

Related

Efficient way to loop through a large number of files, convert them to webp and save the timestamp

I have a folder with about 750'000 images. Some images will change over time and new images will also be added every now and then. The folder structure is about 4-5 levels deep, with a maximum of 70'000 images in a single folder.
I now want to write a script that can do the following:
Loop through all the files
Check if the file is new (has not yet been converted) or changed since the last conversion
Convert the file from jpg or png to webp if above rules apply
My current solution is a Python script that writes the conversion times into an SQLite database. It works, but is really slow. I also thought about doing it in PowerShell due to better performance (I assume), but had no efficient way of storing the conversion times.
What language would you recommend? Is there another way to convert jpg to webp without having to externally call the cwebp command from within my script?
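Since the asker already has a Python script in play, one option for avoiding the external cwebp call is Pillow, which can write WebP directly. The sketch below is only an illustration of that idea: it walks the tree and converts only files that are new or have changed since the last conversion, using the .webp file's mtime instead of a database; the root folder and quality setting are assumptions.
from pathlib import Path
from PIL import Image   # Pillow writes WebP natively if libwebp support is built in

SOURCE_ROOT = Path("/data/images")   # assumed root of the 4-5 level folder tree

for src in SOURCE_ROOT.rglob("*"):
    if src.suffix.lower() not in (".jpg", ".jpeg", ".png"):
        continue
    dst = src.with_suffix(".webp")
    # "New or changed" check: convert only if no .webp exists yet
    # or the source was modified after the last conversion.
    if dst.exists() and dst.stat().st_mtime >= src.stat().st_mtime:
        continue
    with Image.open(src) as img:
        img.save(dst, "WEBP", quality=80)
Comparing mtimes this way makes the SQLite bookkeeping optional and survives interrupted runs; if the conversion times still need to be recorded elsewhere, the database write can be kept alongside the conversion.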

Parameterise Parm File Name in Informatica

I want to know how to (or whether I can) parameterize the parm file name in Informatica.
A little bit of background: I am building a standard map in Informatica, which business users can call directly after selecting the standard filters they want to apply in the map using a GUI.
The parm file name will be given by the business user, and all the filters that he/she selected will be in the parm file. The file will be dropped in the parm folder on the Informatica server.
This is the good-case scenario, when only one user is using it at one point in time.
Also, I want to find out what I should do when multiple users are working in the GUI, generating parm files and invoking the Informatica map. How do I get multiple instances of the same map running at the same time?
I hope I am making sense here....
Thanks!!!
You can achieve this by using concurrent execution of the workflow. Read about it and understand how you can implement it.
Once you know how to implement it, use a backend script/code called by the GUI to assign an instance name to each call made through the GUI. For each instance name, you can have an individual parameter file. (I believe there would be a finite set of combinations of variable values in your case.) You can use the command below to call individual instances, either through your GUI or by any other backend code:
pmcmd startworkflow -sv %integration_service% -d %domain% -u %user% -p %password%
-f %informatica_folder_name% -paramfile %paramfilepathandname% -rin %instance_name% %workflow_name%
It might sound a bit confusing, but once you understand how concurrent workflows work, you can build on it based on the above input.
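As a minimal sketch of the backend call the GUI could make, assuming Python is available on that machine: every value in capitals is a placeholder, and the connection options (-sv, -d, -u, -p) would come from your own environment.
import subprocess

def start_instance(workflow, folder, param_file, instance_name):
    # Each GUI request gets its own run instance name and its own parameter file.
    cmd = [
        "pmcmd", "startworkflow",
        "-sv", "INTEGRATION_SERVICE", "-d", "DOMAIN",
        "-u", "USER", "-p", "PASSWORD",          # placeholders
        "-f", folder,
        "-paramfile", param_file,
        "-rin", instance_name,
        workflow,
    ]
    subprocess.run(cmd, check=True)

# Example: two users invoking the same workflow concurrently,
# each with their own parameter file dropped by the GUI.
start_instance("wf_standard_map", "SHARED_FOLDER", "/infa/parms/user1.parm", "user1_run")
start_instance("wf_standard_map", "SHARED_FOLDER", "/infa/parms/user2.parm", "user2_run")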
This will only be possible if you call Informatica from an external tool, not from the Client tools. One way is described by #Utsav; the other is to use the Informatica WSH (Web Services Hub) to call a workflow: you can indicate the parameter file you want to be used with the workflow, as well as the desired instance name.
I think this guide to concurrent workflows may be what you are looking for:
https://kb.informatica.com/howto/6/Pages/17/301264.aspx

Loading Data into the application from GUI using Ruby

Problem:
Hi everyone, I am currently building an automation suite using Ruby, Selenium WebDriver and Cucumber to load data into the application through its GUI. I take input from mainframe .txt files. The scenarios are, for example: create a customer and then load multiple accounts for them as per the data provided in the inputs.
Current Approach
Execute the scenario using a rake task, passing the line number as a parameter, so that the script is executed for only one set of data.
To read the data for a particular line, I'm using the code below:
File.readlines("#{file_path}")[line_number.to_i - 1]
My purpose in using line-by-line loading is to keep the execution running even if a line fails to load.
Shortcomings
Suppose I have to load 10 accounts for a single customer. My current script will run 10 times to load each account. I want something that can load all the accounts in a single go.
What I am looking for
To overcome the above shortcoming, I want to capture the entire data for a single customer from the file (accounts etc.) and load it into the application in a single execution.
I also have to keep track of the execution time and memory allocation.
Please share your thoughts on this approach; any suggestions or improvements are welcome. (Sorry for the long post.)
The first thing I'd do is break this down into steps -- as you said in your comment, but more formally here:
Get the data to apply to all records. Put up a page with the necessary information (or support command-line specification if that is not too much?).
For each line in the file, do the following (automated):
- Get the web page for inputting its data;
- Fill in the fields;
- Submit the form.
Given this, I'd say the 'for each line' instruction should definitely be reading a line at a time from the file using File.foreach or similar.
Is there anything beyond this that needs to be taken into account?
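The suite in question is Ruby, but the grouping idea itself is language-agnostic; here is a rough sketch in Python of reading the file once and collecting every account line per customer, assuming a delimited layout whose first field is the customer id (the real mainframe extract will differ).
from collections import defaultdict

def group_by_customer(path):
    # One pass over the file; all account lines for a customer end up together.
    groups = defaultdict(list)
    with open(path) as f:
        for line in f:
            line = line.rstrip("\n")
            if not line:
                continue
            customer_id, _, rest = line.partition(",")   # assumed delimiter and layout
            groups[customer_id].append(rest)
    return groups

# Each customer can then be loaded in one execution:
# create the customer once, then fill and submit the account form per entry.
for customer, accounts in group_by_customer("mainframe_extract.txt").items():
    print(customer, "->", len(accounts), "accounts to load")
A failure on one account line can still be caught and logged inside the inner loop, so the "keep running even if a line fails" property is preserved.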

wkhtmltopdf runtime for many pdf-creations

I am using wkhtmltopdf on my Ubuntu server to generate PDFs from HTML templates.
For this, wkhtmltopdf is started from a PHP script with shell_exec.
My problem is that I want to create up to 200 PDFs at (almost) the same time, which makes the runtime of wkhtmltopdf stack up for every PDF: one file needs 0.6 seconds, 15 files need 9 seconds.
My idea was to start wkhtmltopdf in a screen session to decrease the runtime, but I can't make it work from PHP, and it might not make much sense anyway, because I want to merge all PDFs into one after creation, so I would have to check whether every session has terminated.
Do you have any ideas how I can decrease the runtime for this number of PDFs, or can you give me advice on how to realize this correctly and cleanly with screen?
My script looks like the following:
loop up to 200 times {
- get data for html-template from database
- fill template-string and write .html-file
- create pdf out of html-template via shell_exec("wkhtmltopdf....")
- delete template-file
}
merge all generated PDFs into one and send it via mail
Thank you in advance and sorry for my bad English.
Best wishes
Just create a single large HTML file and convert it in one pass instead of merging multiple PDFs afterwards.
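A rough illustration of that suggestion, sketched in Python rather than the asker's PHP and with made-up file names: concatenate the filled templates into one HTML document separated by CSS page breaks, then invoke wkhtmltopdf a single time, which avoids both the 200 process starts and the merge step.
import subprocess

def render_batch(html_fragments, output_pdf):
    # wkhtmltopdf honours CSS page breaks, so each record still starts on its own page.
    page_break = '<div style="page-break-after: always;"></div>'
    combined = "<html><body>" + page_break.join(html_fragments) + "</body></html>"
    with open("combined.html", "w") as f:
        f.write(combined)
    # One process start instead of up to 200, and no PDF merge afterwards.
    subprocess.run(["wkhtmltopdf", "combined.html", output_pdf], check=True)

# Stand-in for the database loop that fills the real templates.
fragments = ["<h1>Record %d</h1>" % i for i in range(200)]
render_batch(fragments, "all_records.pdf")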

QTP VB Script to update data excel placed in QC

I'm trying to automate a set of test cases which pass inputs from one to another. For instance, if I have 5 test cases, then the 1st test case passes input to the 2nd, the 2nd to the 3rd, and so on.
Another point to note is that I won't perform batch execution, and there will be a certain time gap between each test case.
So what I'm trying to do is write the outputs into an Excel sheet and read them back during the succeeding execution. I have tried searching and tried some code, but nothing has worked out.
So please share some ideas on how to update, at run time, an Excel sheet that is placed in QC. Thanks!
What you're essentially saying is that you have test runs separated by some indeterminate amount of time, and you need to share data between runs. The answer is you need persisted storage of your data. You could use a database, flat file, Excel spreadsheet, or anything else that will let you programmatically write data in one run then read it in the next.
Excel spreadsheets are one such solution. You said you tried it and it did not work. That likely means that the method you used to write or read the data was incorrect, and not that there was a problem with the concept. If you provide some more specifics about exactly what you tried and where it failed, hopefully the community will be able to assist you.
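As an illustration of that persisted-storage idea, here is a small sketch in Python (a QTP test would do the equivalent in VBScript), with made-up file name and column layout: each run appends its outputs to a spreadsheet and a later run reads them back.
from pathlib import Path
from openpyxl import Workbook, load_workbook

STORE = Path("run_outputs.xlsx")   # assumed local copy of the sheet kept in QC

def save_output(test_case, value):
    # Append one (test case, output) row so a later run can pick it up.
    wb = load_workbook(STORE) if STORE.exists() else Workbook()
    ws = wb.active
    ws.append([test_case, value])
    wb.save(STORE)

def load_output(test_case):
    # Return the most recent output recorded for the given test case, if any.
    if not STORE.exists():
        return None
    ws = load_workbook(STORE).active
    matches = [row[1] for row in ws.iter_rows(values_only=True) if row[0] == test_case]
    return matches[-1] if matches else None

save_output("TC01", "CUST-12345")   # end of the first run
print(load_output("TC01"))          # start of the next run, hours or days later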
I believe you have input Excel sheet(s) in QC. What you can do is download the Excel file from QC to the local machine, store the output from the 1st test case in this Excel sheet, and upload it back to QC. You can then use it as input to the next test case.
