I have a few complex commands that I split up into "sub-commands" (example follows). As my site is multi-domain, I need to be able to customize the sequence and composition of the sub-commands to perform (so far I kept this in a config file under Laravel 4). With the migration to Laravel 5, I'm refactoring this functionality using the command bus.
Example of a command sequence:
Site A:
1) AuthorizeCharge
2) DoSomethingA
3) DoSomethingB
4) Charge
Site B:
1) AuthorizeCharge
2) DoSomethingA
3) DoSomethingC
4) DoSomethingD
5) Charge
Each of these line items is a Command with its handler. This part is fairly clear to me (and works fine).
How can I dispatch this elegantly in my controller?
Here is what I've already tried.
Fixed version (works, in the controller):
$return[] = $this->dispatch( new DoSomethingACommand($form) );
$return[] = $this->dispatch( new DoSomethingBCommand($form) );
Variable version (pseudo code):
someconfig.php
return [
    'DoSomethingACommand',
    'DoSomethingBCommand'
];
App\Namespace\Http\Controller\SomeController.php
...
// loop over all scenario commands
foreach ($commands as $command) {
    // make the command object which we need to act on
    $object = \App::make($this->_namespace . '\\' . $command);
    $return[] = $this->dispatchFrom($object, $form); // Doesn't work
}
I'm a bit stuck on how to solve this. Any advice on how I could implement this?
Why don't you just have a main command that calls all the other sub-commands itself?
So have one main command, dispatched from your controller, called HandlePaymentProcessCommand. Inside that command's handler you then call the sub-commands:
1) AuthorizeCharge
2) DoSomethingA
3) DoSomethingC
4) DoSomethingD
5) Charge
This way the sub-commands are all handled in one location (the main command), and the sequence is easily configured and changed.
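For example, a minimal sketch of that handler, assuming Laravel 5's bus dispatcher and a per-site config file like the someconfig.php above (the class names, config key and $command properties are illustrative, not your actual code):

<?php

namespace App\Handlers\Commands;

use Illuminate\Contracts\Bus\Dispatcher;

class HandlePaymentProcessCommandHandler
{
    protected $bus;

    public function __construct(Dispatcher $bus)
    {
        $this->bus = $bus;
    }

    public function handle($command)
    {
        $return = [];

        // e.g. config/commands.php returns ['siteA' => ['AuthorizeChargeCommand', ...], ...]
        foreach (config('commands.' . $command->site) as $class) {
            $fqcn = 'App\\Commands\\' . $class;
            $return[] = $this->bus->dispatch(new $fqcn($command->form));
        }

        return $return;
    }
}

The controller then dispatches only the single HandlePaymentProcessCommand, and each site's sequence lives entirely in config.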
I'm currently working on some automation in WHMCS. I have several custom products available in WHMCS:
Linux product 1
Linux product 2
Windows product 1
Windows product 2
I want to execute a Bash script when an order is accepted and when a service is terminated in WHMCS. The script requires an argument (an IP address), which is a custom field on the above products. The hook should fetch the custom field data containing the IP address, check whether the product is Linux or Windows, and then call the script as follows:
If the product is a Linux one, the script call would be like "autoaccept.sh linux [IP]"
If the product is a Windows one, the script call would be like "autoaccept.sh windows [IP]"
Similarly, when a package is terminated in WHMCS, the script has to be called again with "autoterminate.sh [IP]"
I guess the WHMCS AcceptOrder and AfterModuleTerminate hooks can be used, but I'm not sure how to fetch the custom field data and compare the products within the hook's PHP code. Can anyone shed some light on this or help me code it correctly?
Any responses would be much appreciated!
I've already created the Bash scripts, and they work perfectly. I'm new to WHMCS hooks and PHP, so I'm stuck here.
Use a product module (see Product Module) and implement the method _CreateAccount($params); you can find the custom fields in the $params variable under 'customfields'.
Here is an example that executes a Python script:
<?php
function my_proc_execute($cmd)
{
    $output = '';
    try {
        $descriptorspec = array(
            0 => array("pipe", "r"), // STDIN
            1 => array("pipe", "w"), // STDOUT
            2 => array("pipe", "w"), // STDERR
        );
        $cwd = getcwd();
        $env = null;
        $proc = proc_open($cmd, $descriptorspec, $pipes, $cwd, $env);
        $buffer = array();
        if (is_resource($proc)) {
            fclose($pipes[0]);
            stream_set_blocking($pipes[1], 1);
            stream_set_blocking($pipes[2], 1);
            stream_set_timeout($pipes[1], 500);
            stream_set_timeout($pipes[2], 500);
            // Drain STDOUT, then STDERR
            while (!feof($pipes[1])) {
                $line = fread($pipes[1], 1024);
                if (!strlen($line)) continue;
                $buffer[] = $line;
            }
            while (!feof($pipes[2])) {
                $line = fread($pipes[2], 1024);
                if (!strlen($line)) continue;
                $buffer[] = $line;
            }
            $output = implode($buffer);
            fclose($pipes[1]);
            fclose($pipes[2]);
            $return_value = proc_close($proc);
        } else {
            $output = "no resource; cannot open proc...";
        }
    } catch (Exception $e) {
        $output = $e->getMessage();
    }
    return $output;
}

function mymodule_CreateAccount($params)
{
    // The product's custom field value, as configured in WHMCS
    $myData = $params['customfields']['the_name_of_field'];

    $to_send_arr = array();
    $to_send_arr['is_new'] = '0';
    $to_send_arr['id'] = 'xyz';
    $to_send_arr['url'] = 'my';
    $to_send_arr['display_name'] = $myData;

    // Interpreter/script paths for Windows and Linux; use the one for your host
    $exewin = "c:/Python27/python.exe C:/xampp_my/htdocs/my/modules/servers/xxx/pyscripts/create_data.py";
    $exelinux = "/var/www/html/modules/servers/xxx/pyscripts/create_data.py";

    // Base64-encode the JSON payload so it survives the shell as one argument
    $command = $exewin . ' ' . escapeshellcmd(base64_encode(json_encode($to_send_arr)));
    $output = my_proc_execute($command);
    $arr = json_decode($output, true);
}
?>
Python
import sys
import json
import time
import base64

if len(sys.argv) != 2:
    print 'Error in the passed parameters for Python script.', '<br>'
    sys.exit()

json_data = json.loads(base64.b64decode(sys.argv[1]))
id = json_data['id']
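Adapting this to the question's setup might look like the sketch below; the custom field name, the product-ID-to-platform map and the script path are assumptions, since they depend on how your WHMCS products are configured:

<?php
function mymodule_CreateAccount($params)
{
    // Assumed custom field name; use the exact name configured on your products
    $ip = $params['customfields']['IP Address'];

    // Assumed mapping from WHMCS product IDs to platforms; fill in your real IDs
    $platforms = array(1 => 'linux', 2 => 'linux', 3 => 'windows', 4 => 'windows');
    $platform = isset($platforms[$params['pid']]) ? $platforms[$params['pid']] : 'linux';

    // e.g. "autoaccept.sh linux 10.0.0.1"
    $cmd = '/path/to/autoaccept.sh ' . escapeshellarg($platform) . ' ' . escapeshellarg($ip);
    $output = my_proc_execute($cmd);

    return 'success';
}
?>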
I need to process several image files from a directory (an S3 directory). The process reads the id and type stored in each filename (e.g. 001_4856_0-P-0-A_.jpg); the files are stored at the moment the process is invoked (I'm using cron and the scheduler, and it works great). The objective of the process is to store the info in a database.
I have the process working, and it works great, but my problem is the number of files in the directory, because every second a lot more files are added. The time spent per file is about 0.19 sec, but the amount of files is huge, about 15,000 added per minute, so I think multiple simultaneous instances (about 10-40) of the same original process could do the job.
I need some advice or ideas:
First, how to launch multiple processes at the same time from one original process.
Second, how to get only the not-yet-selected filenames, because the process takes the filenames with:
$recibidos = Storage::disk('s3recibidos');
$files = $recibidos->files();

if (count($files) <= 0) {
    $lognofile = ['Archivos' => 'No hay archivos para procesar'];
    $orderLog->info('ImagesLog', $lognofile);
} else {
    if (Image::count() == 0) {
        $last_record = 1;
    } else {
        $last_record = Image::latest('id')->pluck('id')->first() + 1;
    }
    $i = $last_record;
    $fotos_sin_info = 0;
    foreach ($files as $file) {
        // filename format: clientid_XXXX_type-codes_.jpg
        $datos = explode('_', $file);
        $tipos = str_replace('-', '', $datos[2]);
        Image::create([
            'client_id' => $datos[0],
            'tipo'      => $tipos,
        ]);
        $recibidos->move($file, '/procesar/' . $i . '.jpg');
        $i++;
    }
}
but I haven't figured out how to retrieve only the files that weren't already selected.
Thanks for your comments.
Using multi-threaded programming in PHP is possible and has been discussed on SO: How can one use multi threading in PHP applications.
However, this is generally not the most obvious choice for standard applications. A solution for your situation will depend on the exact use case.
Did you consider a solution using queues?
https://laravel.com/docs/5.6/queues
Or the scheduler?
https://laravel.com/docs/5.6/scheduling
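A minimal sketch of the queue approach (Laravel 5.6; the job class name is illustrative): dispatch one queued job per file and run several queue workers in parallel. Moving the file before dispatching also answers the second question, since a moved file can't be picked up by the next scheduler run.

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class ProcessImageFile implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public $file;

    public function __construct($file)
    {
        $this->file = $file;
    }

    public function handle()
    {
        // Parse "001_4856_0-P-0-A_.jpg" into client id and type, then persist
        $datos = explode('_', basename($this->file));
        \App\Image::create([
            'client_id' => $datos[0],
            'tipo'      => str_replace('-', '', $datos[2]),
        ]);
    }
}

In the scheduled command you would move each file first and then call ProcessImageFile::dispatch($newPath); starting 10-40 workers with php artisan queue:work processes the jobs concurrently.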
I want to run two MATLAB scripts in parallel for a project and communicate between them. The purpose of this is to have one script do image analysis and send the results to the other, which will use them for further calculations (time consuming, but not related to the task of finding stuff in the images). Since both tasks are time consuming and should preferably be done in real time, I believe parallelization is necessary.
To get a feel for how this should be done I created a test script to find out how to communicate between the two scripts.
The first script takes a user input using the built-in function input, then sends it to the other using labSend; the second script receives it and prints it.
function [blarg] = inputStuff(blarg)
    mpiInit(); % added because of error message, but does not work...
    for i = 1:2
        labBarrier; % added because of error message
        inp = input('Enter a number to write');
        labSend(inp);
        if (inp == 0)
            break;
        else
            i = 1;
        end
    end
end
function [ blarg ] = testWrite( blarg )
    mpiInit(); % added because of error message, but does not help
    par = 0;
    if (blarg == 0)
        par = 1;
    end
    for i = 1:10
        if (par == 1)
            labBarrier
            delta = labReceive();
            i = 1;
        else
            delta = input('Enter number to write');
        end
        if (delta == 0)
            break;
        end
        s = strcat('This lab no', num2str(labindex), '. Delta is = ')
        delta
    end
end
%% This is the file test_parfor.m
funlist = {@inputStuff, @testWrite};
matlabpool(2);
mpiInit(); % added because of error message, but does not help
parfor i = 1:2
    funlist{i}(0);
end
matlabpool close;
Then, when the code is run, the following error message appears:
Starting matlabpool using the 'local' profile ... connected to 2 labs.
Error using parallel_function (line 589)
The MPI implementation has not yet been loaded. Please
call mpiInit.
Error stack:
testWrite.m at 11
Error in test_parfor (line 8)
parfor i=1:2
Calling the method mpiInit does not help (called as shown in the code above), and nowhere in the examples MathWorks provides in the documentation, or on their website, is this error shown or explained.
Any help is appreciated!
You would typically use constructs such as labSend, labReceive and labBarrier within an spmd block, rather than a parfor block.
parfor is intended for implementing embarrassingly parallel algorithms, in other words algorithms that consist of multiple independent tasks that can be run in parallel, and do not require communication between tasks.
I'm stretching my knowledge here (perhaps someone more expert can correct me), but as I understand things, parfor does not set up an MPI ring for communication between workers, which is probably the explanation for the (rather uninformative) error message you're getting.
An spmd block enables communication between workers using labSend, labReceive and labBarrier. There are quite a few examples of using them all in the documentation.
Sam is right that the MPI functionality is not enabled during parfor, only during spmd. You need to do something more like this:
spmd
    funlist{labindex}(0);
end
(Sam is also quite right that the error message you saw is pretty unhelpful)
I've got a couple of .t files in a folder. Each test script launches its own instance of Selenium and therefore opens its own browser. These then pass their instructions to page objects in separate modules. The page objects are where most of the test assertions occur, alas.
I run them in parallel using prove -j2 testfolder. When I do this I see two browsers open, responding to the Selenium calls, but the test results and browser action indicate that the second script only goes as far as just before the first script's first call to Test::More, then it hangs until the first script has finished.
The page object model is a red herring. I've tried just putting bare pass() calls at the top of each .t file and confirmed that the test case in the second script isn't tried until the entire first script is completed.
Each testX.t file ends up looking something like this:
use strict;
use warnings;
use Test::More tests => 40;
use Selenium::Remote::Driver;
use MyPage::Object; # test execution module
my $sel = Selenium::Remote::Driver->new(
    'browser_name'       => $browser,
    'remote_server_addr' => $host,
    'port'               => "80",
);
pass("Debug test case - let's see when this passes");
my $user = new MyPage::Object( text => "test string", sel => $sel);
$user->verify_text;
.
.
Here's what Object.pm looks like:
use strict;
use warnings;
use Selenium::Remote::Driver;
use Selenium::Remote::WebElement qw(get_text);

package MyPage::Object;

sub new {
    my $class = shift;
    my $self = bless { @_ }, $class;
    return $self;
}

sub verify_text {
    my ($self, $text_to_verify) = @_;
    my $webElement = $self->{sel}->find_element("//*$xpath") or warn $!;
    my $returnedtext = get_text($webElement) or warn $!;
    Test::More::ok($returnedtext =~ /\Q$text_to_verify/, "text matches");
}
1;
Here's the output. While the first test is running I see this:
===( 4;12 4/40 0/? )===========================================
The first pair of numbers and the left number in the second pair go up as the first script's test cases are verified. After this, when the second script starts, the output changes to this:
testfolder\test2.t .. 4/35
With the left number increasing as test cases are executed.
Shouldn't running these in parallel cause the assertions in each of them to be run at the same time? Is this unusual or is this how parallel jobs are supposed to work in prove?
I'm running this from the command line in 64-bit Windows 7, ActiveState Perl v5.16.1. CPAN shows Prove is up to date (3.28).
I came up with a workaround, though it isn't really a solution to the root issue. I used a combination of the solution at the Sauce Labs blog and Freddy Vega's solution here to come up with this:
#!/usr/bin/perl
use strict;
use warnings;
use File::Find::Rule;
use Thread::Pool::Simple;

my $count = 0;
my @files = File::Find::Rule->file()
                            ->name( '*.t' )
                            ->in( 'C:\path\to\tests' );

sub worker {
    my $file = shift;
    print "Executing Test: " . $file . "\n";
    system("prove $file");
}

my $pool = Thread::Pool::Simple->new(
    min => 5,
    max => 25,
    do  => [\&worker],
);

foreach my $test (@files) {
    $pool->add($test);
    print "Added $test to pool..\n";
    $count++;
}

$pool->join();
exit(0);
I have a large CSV file containing inventory data to update (more than 35,000 rows). I created a method which extends Mage_Catalog_Model_Convert_Adapter_Product to do the inventory update. I then used an Advanced Profile to do the update, which calls that method.
It works very well when I run the profile manually. The problem is when I use an extension that runs the profile from a cronjob: the system takes too long to load and parse the CSV file. I set the cronjob to run every day at 6:15 am, but the first row of the file isn't processed until 1:20 pm the same day; it takes 7 hours just to load the file.
That somehow makes the process stop partway through, with less than 1/3 of the records processed. I've been frustrated trying to figure out why and to solve the problem, but no luck.
Any ideas would be appreciated.
Varien_File_Csv is the class that parses your CSV file, and it takes too much memory.
Here is a function to log the amount of memory used and the peak memory usage:
public function log($msg, $level = null)
{
    if (is_null($level)) $level = Zend_Log::INFO;

    $units = array('b', 'Kb', 'Mb', 'Gb', 'Tb', 'Pb');

    $m = memory_get_usage();
    $mem = @round($m / pow(1024, ($i = floor(log($m, 1024)))), 2);

    $mp = memory_get_peak_usage();
    $memp = @round($mp / pow(1024, ($ip = floor(log($mp, 1024)))), 2);

    $msg = sprintf('(mem %4.2f %s, %4.2f %s) ', $mem, $units[$i], $memp, $units[$ip]) . $msg;
    Mage::log($msg, $level, 'my_log.log', 1);
}
$MyClass->log('With every message I log the memory is closer to the sky');
You could split your CSV (keeping the same filename) and call the job multiple times. You'll need to make sure a previous call doesn't run at the same time as a newer one.
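A minimal sketch of such a guard for the cron entry point, assuming a simple flock-based lock file (the lock path is illustrative):

<?php
// Skip this run entirely if a previous one is still busy
$lock = fopen('/tmp/inventory_import.lock', 'c');
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit;
}

// ... load Mage and run the profile on the current CSV chunk here ...

flock($lock, LOCK_UN);
fclose($lock);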
Thanks