CI-Neo4jPHP, getting labels from a node - codeigniter

The script below is written in a CodeIgniter (CI) and Neo4jPHP environment:
$client = new Everyman\Neo4j\Client("localhost", 7474);
$client->getTransport()->setAuth("admin", "password");
$all_labels = $client->getLabels();
foreach ($all_labels as $key => $a_label)
{
    echo $key.": ".$a_label->getName()."\n";
}
I expected a list of labels, but I didn't get anything back.
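If getLabels() returns nothing, one way to cross-check (my suggestion, not part of the original post) is to ask the server for the labels through a plain Cypher query on the same client, which works on the Neo4j 2.x servers Neo4jPHP is typically used with:

$queryString = "MATCH (n) RETURN DISTINCT labels(n) AS labels";
$query = new Everyman\Neo4j\Cypher\Query($client, $queryString);
$resultSet = $query->getResultSet();
foreach ($resultSet as $row) {
    print_r($row['labels']); // each row holds one distinct label combination
}

If the Cypher query returns labels but getLabels() stays empty, the problem is more likely the client/server version combination than the connection or credentials.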

Related

Laravel 8 HTTP Put With Attach

I'm creating a CRUD API with Laravel 8 as the server, and it works perfectly when tested with Talend/Postman (running on 127.0.0.1:8000).
Then I'm creating a CRUD app with Laravel 8 as the client. Everything works fine except updating data with an attached file.
I've tried it without an attached file and it works:
$response = Http::put('http://127.0.0.1:8000/api/memo/'.$id_memo, $input);
but it doesn't work when attaching a file:
$input['id_user'] = $request->id_user;
$input['date_memo'] = $request->date_memo;
$input['time_memo'] = $request->time_memo;
if ($request->hasFile('lampiran_memo')) {
    $lampiran_memo = $request->file('lampiran_memo');
    $nama_lampiran = $lampiran_memo->getClientOriginalName();
    $lampiran_memo->move("memo", $nama_lampiran);
    $thefile = fopen("memo/".$nama_lampiran, 'r');
    $response = Http::attach('lampiran_memo', $thefile)->put('http://127.0.0.1:8000/api/memo/14', $input);
}
After being stuck, I finally used attach with POST (not PUT), and it works fine:
$input['id_user'] = $request->id_user;
$input['date_memo'] = $request->date_memo;
$input['time_memo'] = $request->time_memo;
if ($request->hasFile('lampiran_memo')) {
    $lampiran_memo = $request->file('lampiran_memo');
    $nama_lampiran = $lampiran_memo->getClientOriginalName();
    $lampiran_memo->move("memo", $nama_lampiran);
    $thefile = fopen("memo/".$nama_lampiran, 'r');
    $response = Http::attach('lampiran_memo', $thefile)->post('http://127.0.0.1:8000/api/updatememo', $input);
}
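This is a known PHP limitation rather than a Laravel bug: multipart/form-data bodies are only populated reliably for POST requests, which is why the attach-and-PUT combination fails while POST works. If you would rather keep the original PUT route, a possible workaround (my suggestion, not from the original post) is Laravel's method spoofing: send a POST request that carries a _method field, and Laravel will route it as PUT. A minimal sketch using the route and field names from the question:

$thefile = fopen("memo/".$nama_lampiran, 'r');
$response = Http::attach('lampiran_memo', $thefile)
    ->post('http://127.0.0.1:8000/api/memo/'.$id_memo, array_merge($input, [
        '_method' => 'PUT', // Laravel treats this POST as a PUT for routing
    ]));

This keeps the Route::put(...) definition on the API side unchanged.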

Files not uploading due to unknown error using Laravel

Laravel Version: 5.5
I have a problem when uploading files. I have multiple files in an array, and I upload them twice in the same function. The first block of code checks the PDF file version and dimensions, and it works perfectly. But in the second block, where I upload the files again in order to merge them, I get this error: "The file "A4.pdf" was not uploaded due to an unknown error". When I remove the first block of code, the second block starts working. I don't know where my mistake is; I have searched a lot but haven't found an answer.
This block of code checks the PDF file version and dimensions:
$paper_size = array();
$del_files = array();
foreach ($files as $file) {
    $filename = time().date('m-d-y').$file->getClientOriginalName();
    $file->move(public_path().'/uploads/check_pdf_files/', $filename);
    $version = $this->pdfVersion(public_path().'/uploads/check_pdf_files/'.$filename);
    if($version > 1.5)
    {
        File::delete('public/uploads/check_pdf_files/'.$filename);
        return Response::json(" Your PDF file version is greater than 1.4 which is not compatible with our system, Please make it lower version.", 400);
    }
    $get_paper_size = $this->get_pdf_dimensions('public/uploads/check_pdf_files/'.$filename);
    $paper_size[] = $get_paper_size;
    $del_files[] = $filename;
}
if(round($paper_size[0]['width']) != round($paper_size[1]['width']))
{
    foreach ($del_files as $del)
    {
        File::delete('public/uploads/check_pdf_files/'.$del);
    }
    return Response::json(" Your Files dimensions is not matching please try with same dimensions.", 400);
}
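For reference, the pdfVersion() helper is not shown in the question. A helper like that is commonly implemented by reading the version number out of the PDF header; the following is a hypothetical sketch, not the asker's actual code:

// Hypothetical pdfVersion() helper: a PDF starts with a header line
// such as "%PDF-1.4", so reading the first few bytes is enough.
private function pdfVersion($filePath)
{
    $header = file_get_contents($filePath, false, null, 0, 16);
    if (preg_match('/%PDF-(\d+\.\d+)/', $header, $matches)) {
        return (float) $matches[1];
    }
    return 0.0; // not a recognizable PDF header
}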
This block of code is used for merging the files:
$new_pdf_file = array();
foreach ($request->file as $merge_file)
{
    $newFile_name = time().$merge_file->getClientOriginalName();
    $merge_file->move('public/uploads/', $newFile_name);
    $new_pdf_file[] = $newFile_name;
}
dd($new_pdf_file); // debug output
$pdf = new \LynX39\LaraPdfMerger\PdfManage;
foreach($new_pdf_file as $new)
{
    $pdf->addPDF('public/uploads/dummy_uploads/'.$new, 'all');
}
$temp_name = time().$request->merge_name;
$pdf->merge('file', base_path().'/public/uploads/'.Auth::user()->email.'/'.$temp_name.'.pdf', 'P');
foreach($new_pdf_file as $delete_new)
{
    File::delete('public/uploads/dummy_uploads/'.$delete_new);
}
$user = DB::table('user_pdf_files')->insert([
    'user_files' => $request->merge_name.'.pdf',
    'filename' => $temp_name.'.pdf',
    'type' => $request->type[0],
    'user_id' => Auth::user()->id,
]);
Session::flash('success', 'Files Merged Successfully');
return Response::json('success', 200);
You need the fully qualified path to where you want the files saved. Any time you move or copy a file, make sure you use public_path() with the relative path as a parameter; this function returns the fully qualified path to the public folder. For example:
$merge_file->move(public_path('uploads'), $newFile_name);
This should be why the first code block works and the second one doesn't. Not a very descriptive error, though!
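Applied to the merge block from the question, that fix would look something like this (a sketch; note the question moves files into 'public/uploads/' but reads them back from 'public/uploads/dummy_uploads/', so the two paths probably need reconciling as well):

foreach ($request->file as $merge_file)
{
    $newFile_name = time().$merge_file->getClientOriginalName();
    // fully qualified destination instead of the relative 'public/uploads/'
    $merge_file->move(public_path('uploads'), $newFile_name);
    $new_pdf_file[] = $newFile_name;
}
foreach ($new_pdf_file as $new)
{
    // read the files back from the same fully qualified location
    $pdf->addPDF(public_path('uploads/'.$new), 'all');
}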

Laravel Collective SSH results

I am performing SSH in Laravel, whereby I connect to another server and download a file. I am using Laravel Collective: https://laravelcollective.com/docs/5.4/ssh
So, the suggested way to do this is something like this:
$result = \SSH::into('scripts')->get('/srv/somelocation/'.$fileName, $path);
if ($result) {
    return $path;
} else {
    return 401;
}
Now, that successfully downloads the file and moves it to my local server. However, I always get 401 back, because $result seems to be null.
I can't find much on getting the result back from the SSH call. I have also tried:
$result = \SSH::into('scripts')->get('/srv/somelocation/'.$fileName, $path, function($line) {
    dd($line.PHP_EOL);
});
But that never gets into the inner function.
Is there any way I can get the result back from the SSH? I just want to handle it properly if there is an error.
Thanks
Rather than relying on $result to give you true/false/error, you can check whether the file was downloaded successfully in another way:
// download the file
$result = \SSH::into('scripts')->get('/srv/somelocation/'.$fileName, $path);

// see if the downloaded file exists
if (file_exists($path)) {
    return $path;
} else {
    return 401;
}
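One caveat worth adding to this approach (my note, not from the original answer): file_exists($path) will also succeed if a copy from an earlier run is still on disk. Removing any stale local copy before the download makes the check reflect the current attempt only:

// remove any stale local copy so file_exists() reflects this download attempt
if (file_exists($path)) {
    unlink($path);
}
\SSH::into('scripts')->get('/srv/somelocation/'.$fileName, $path);
if (file_exists($path)) {
    return $path;
} else {
    return 401;
}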
You need to pass the file name as well, like this, in the get and put methods:
$fileName = "example.txt";
$get = \SSH::into('scripts')->get('/remote/somelocation/'.$fileName, base_path($fileName));
In the put method (the original post wrote "set", but the Laravel Collective SSH facade calls this put):
$put = \SSH::into('scripts')->put(base_path($fileName), '/remote/location/'.$fileName);
To list files:
$command = \SSH::into('scripts')->run(['ls -lsa'], function($output) {
    dd($output);
});

Migrating Parse files from Gridstore to S3

We migrated the data (without files) to mLab and Heroku, so the old files are still on Parse.
Since then, any new file added goes into GridStore, which is the default file storage for mLab.
I have now migrated the old Parse files from Parse to an S3 bucket using SashiDo.
The files are migrated and are accessible using the S3Adapter in Heroku.
But the files on GridStore are not accessible now. How can I migrate them to the same S3 bucket and change the references in mLab?
Maybe you're interested in the solution I tried. It's not a simple operation, but I successfully migrated 3 databases with my Parse Server configuration.
It's based on a PHP script (using the Parse PHP SDK) that runs through every object, fetches the file from Parse.com, and saves it back (through whatever file adapter you have configured) on your own server.
The script looks like this:
<?php
ini_set('display_errors', 1);
ini_set('display_startup_errors', 1);
error_reporting(E_ALL);
date_default_timezone_set('America/New_York');

$fileField = $argv[1];
$class = $argv[2];

require_once 'vendor/autoload.php';

use Parse\ParseObject;
use Parse\ParseQuery;
use Parse\ParseACL;
use Parse\ParsePush;
use Parse\ParseUser;
use Parse\ParseInstallation;
use Parse\ParseException;
use Parse\ParseAnalytics;
use Parse\ParseFile;
use Parse\ParseCloud;
use Parse\ParseClient;

$app_id = "******";
$rest_key = "******";
$master_key = "******";

ParseClient::initialize($app_id, $rest_key, $master_key);
ParseClient::setServerURL('http://localhost:1338/', 'parse');

$query = new ParseQuery($class);
$query->ascending("createdAt"); // it's just my preference
$query->exists($fileField);
$query->limit(1);
$count = $query->count();

for ($i = 0; $i < $count; $i = $i + 1) {
    try {
        $query->skip($i);
        // get entry
        $entryWithFile = $query->first();
        // get file
        $parseFile = $entryWithFile->get($fileField);
        // filename
        $fileName = $parseFile->getName();
        // if the file is hosted in Parse, do the job; otherwise continue with the next one
        if (strpos($fileName, "tfss-") === false) {
            echo "\nThis is already an internal file, skipping...";
            continue;
        }
        $newFileName = str_replace("tfss-", "", $fileName);
        $binaryFile = file_get_contents($parseFile->getURL());
        $newFile = ParseFile::createFromData($binaryFile, $newFileName);
        $entryWithFile->set($fileField, $newFile);
        $entryWithFile->save(true);
        echo "\nFile saved\n";
    }
    catch (Exception $e) {
        // the connection with Mongo or the server could be down for a few seconds; retry ;)
        sleep(10);
        continue;
    }
}

echo "\n";
echo "END!";
?>
Set your Parse server URL correctly.
Imagine you want to migrate the files of class _User stored in the field imageProfile: make sure you pass $fileField = "imageProfile"; $class = "_User".
Run that code for every file field in every class.
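Assuming the script is saved as migrator.php (the name the answer uses below for the parallel variant), a run for that example would look like:

php migrator.php "imageProfile" "_User"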
To run it in parallel I used a simple trick: each process strides over the for loop, skipping the entries handled by the others. For example:
<?php
ini_set('display_errors', 1);
ini_set('display_startup_errors', 1);
error_reporting(E_ALL);
date_default_timezone_set('America/New_York');

$index = $argv[1];
$of = $argv[2];
$fileField = $argv[3];
$class = $argv[4];

require_once 'vendor/autoload.php';

use Parse\ParseObject;
use Parse\ParseQuery;
use Parse\ParseACL;
use Parse\ParsePush;
use Parse\ParseUser;
use Parse\ParseInstallation;
use Parse\ParseException;
use Parse\ParseAnalytics;
use Parse\ParseFile;
use Parse\ParseCloud;
use Parse\ParseClient;

$app_id = "********";
$rest_key = "********";
$master_key = "********";

ParseClient::initialize($app_id, $rest_key, $master_key);
ParseClient::setServerURL('http://localhost:1338/', 'parse');

$query = new ParseQuery($class);
$query->ascending("createdAt");
$query->exists($fileField);
$query->limit(1);
$count = $query->count();

for ($i = $index; $i < $count; $i = $i + $of) {
    try {
        $query->skip($i);
        // get entry
        $entryWithFile = $query->first();
        // get file
        $parseFile = $entryWithFile->get($fileField);
        // filename
        $fileName = $parseFile->getName();
        // if the file is hosted in Parse, do the job; otherwise continue with the next one
        if (strpos($fileName, "tfss-") === false) {
            echo "\nThis is already an internal file, skipping...";
            continue;
        }
        $newFileName = str_replace("tfss-", "", $fileName);
        $binaryFile = file_get_contents($parseFile->getURL());
        $newFile = ParseFile::createFromData($binaryFile, $newFileName);
        $entryWithFile->set($fileField, $newFile);
        $entryWithFile->save(true);
        echo "\nFile saved\n";
    }
    catch (Exception $e) {
        // the connection with Mongo or the server could be down for a few seconds; retry ;)
        sleep(10);
        continue;
    }
}

echo "\n";
echo "END!";
?>
So if you configure $fileField and $class as before, you can open 3 threads and run:
php migrator.php 0 3 "imageProfile" "_User"
php migrator.php 1 3 "imageProfile" "_User"
php migrator.php 2 3 "imageProfile" "_User"
so you will have loops running over:
objects 0, 3, 6, ...
objects 1, 4, 7, ...
objects 2, 5, 8, ...
Good luck, and be quick! It's going to shut down in a few days.

TFS build duration report by agent

I'm trying to build a report to show the relative efficiency of my various build agents, and I'm having trouble getting the info I need out of the tool.
What I'd like to have is a simple grid with the following columns:
Build Number
Build Definition
Build Agent
Build Status
Build Start Time
Build Duration
Which would let me do something like chart the duration of successful builds of a given build definition on agent1 against the same build definition on agent2 through agentN.
How would I go about this?
My initial intention was to point you to the TFS OLAP cube and describe how you could retrieve what you were after. Then I realized that the cube does not provide the information about which agent built which build. So I thought it would be simple to write a small TFS console app that prints the info you're after:
using System;
using Microsoft.TeamFoundation.Build.Client;
using Microsoft.TeamFoundation.Client;

namespace BuildDetails
{
    class Program
    {
        static void Main()
        {
            TfsTeamProjectCollection teamProjectCollection = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri("http://TFS:8080/tfs/CoLLeCtIoNNaMe"));
            var buildService = (IBuildServer)teamProjectCollection.GetService(typeof(IBuildServer));
            IBuildDefinition buildDefinition = buildService.GetBuildDefinition("TeamProjectName", "BuildDefinitionName");
            IBuildDetail[] buildDetails = buildService.QueryBuilds(buildDefinition);
            foreach (var buildDetail in buildDetails)
            {
                Console.Write(buildDetail.BuildNumber + "\t");
                Console.Write(buildDefinition.Name + "\t");
                Console.Write(buildDetail.BuildAgent.Name + "\t");
                Console.Write(buildDetail.Status + "\t");
                Console.Write(buildDetail.StartTime + "\t");
                Console.WriteLine((buildDetail.FinishTime - buildDetail.StartTime).Minutes);
            }
        }
    }
}
This won't compile, since IBuildDetail does not expose the build agent directly (the Console.Write(buildDetail.BuildAgent.Name + "\t"); line fails).
Eventually I dove into the IBuildInformationNode[] and got the build agent as follows:
IBuildInformation buildInformation = buildDetail.Information;
IBuildInformationNode[] buildInformationNodes = buildInformation.Nodes;
string agentName;
try
{
    agentName = buildInformationNodes[0].Children.Nodes[3].Fields["ReservedAgentName"];
}
catch
{
    agentName = "Couldn't determine BuildAgent";
}
Console.Write(agentName + "\t");
The try-catch is necessary so you can deal with builds that failed or stopped before agent selection. If you use this latter part as a substitute for the failing Console.Write(buildDetail.BuildAgent.Name + "\t"); line, you should end up with a console app whose output can be piped into a *.CSV file and then imported into Excel.
The following code should help in getting the build agent name for a given build detail:
private string GetBuildAgentName(IBuildDetail build)
{
    var buildInformationNodes = build.Information.GetNodesByType("AgentScopeActivityTracking", true);
    if (buildInformationNodes != null)
    {
        var node = buildInformationNodes.Find(s => s.Fields.ContainsKey(InformationFields.ReservedAgentName));
        return node != null ? node.Fields[InformationFields.ReservedAgentName] : string.Empty;
    }
    return string.Empty;
}
Make sure that you have refreshed the build information in the build detail object. You can do so by calling the following code on your build detail object before getting the build agents:
string[] refreshAllDetails = {"*"};
build.Refresh(refreshAllDetails, QueryOptions.Agents);
Hope it helps :)
The build agent information isn't always in the same place.
For one build I was looking at, I found it in buildInformationNodes[1].Children.Nodes[2].Fields["ReservedAgentName"]. The following seems to work for me (so far):
private static string GetAgentName(IBuildDetail buildDetail)
{
    string agentName = "Unknown";
    bool fAgentFound = false;
    try
    {
        foreach (IBuildInformationNode node in buildDetail.Information.Nodes)
        {
            foreach (IBuildInformationNode childNode in node.Children.Nodes)
            {
                if (childNode.Fields.ContainsKey("ReservedAgentName"))
                {
                    agentName = childNode.Fields["ReservedAgentName"];
                    fAgentFound = true; // record the find so the outer loop stops too
                    break;
                }
            }
            if (fAgentFound) break;
        }
    }
    catch (Exception ex)
    {
        // change to your own routine as needed
        DumpException(ex);
    }
    return agentName;
}
