I am performing SSH in Laravel whereby I connect to another server and download a file. I am using Laravel Collective https://laravelcollective.com/docs/5.4/ssh
So, the suggested way to do this is something like this
$result = \SSH::into('scripts')->get('/srv/somelocation/'.$fileName, $path);
if($result) {
return $path;
} else {
return 401;
}
Now that successfully downloads the file and moves it to my local server. However, I am always returned 401 because $result seems to be null.
I can't find much on getting the result back from the SSH call. I have also tried:
$result = \SSH::into('scripts')->get('/srv/somelocation/'.$fileName, $path, function($line){
dd( $line.PHP_EOL);
});
But that never gets into the inner function.
Is there any way I can get the result back from the SSH? I just want to handle it properly if there is an error.
Thanks
Rather than rely on $result to give you true / false / error, you can check if the file was downloaded successfully in another way:
// download the file
$result = \SSH::into('scripts')->get('/srv/somelocation/'.$fileName, $path);
// see if downloaded file exists
if ( file_exists($path) ) {
return $path;
} else {
return 401;
}
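As a small extension (my own assumption, not something the library documents), you can also check that the downloaded file is non-empty, in case a failed transfer leaves an empty file behind:
// download the file
$result = \SSH::into('scripts')->get('/srv/somelocation/'.$fileName, $path);
// treat the download as successful only if the file exists and has content
if ( file_exists($path) && filesize($path) > 0 ) {
return $path;
} else {
return 401;
}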
You need to pass the file name as well, like this, in the get and put methods:
$fileName = "example.txt";
$get = \SSH::into('scripts')->get('/remote/somelocation/'.$fileName, base_path($fileName));
For uploading, use the put method:
$put = \SSH::into('scripts')->put(base_path($fileName), '/remote/location/'.$fileName);
To list files:
$command = \SSH::into('scripts')->run(['ls -lsa'], function($output) {
dd($output);
});
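If you want the listing in a variable instead of dumping it, the callback can append each line to a captured string (a minimal sketch; the variable names are my own):
$output = '';
\SSH::into('scripts')->run(['ls -lsa'], function($line) use (&$output) {
// each line of remote output gets appended
$output .= $line.PHP_EOL;
});
// $output now holds the full directory listing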
I am trying to store a heavy image through the Laravel queue as follows, because I don't want my user to wait until the file gets stored:
This code is in the controller:
if($request->hasfile('featuringPhoto'))
{
$prefixFilename = date("Y-m-d-H-i-s");
$coverFilename = $prefixFilename.$request->featuringPhoto->getClientOriginalName();
ProceessFiles::dispatch($request->featuringPhoto->getRealPath(),$coverFilename);
}
else
{
$coverFilename4 = NULL;
}
The code below is in the job:
protected $files, $filename;
public function __construct($files,$filename)
{
$this->files= $files;
$this->filename = $filename;
}
public function handle()
{
if($this->files){
$coverFilename = $prefixFilename.$this->filename;
$img = file_get_contents($this->files);
$img->storeAs('images/photosUploadedByUser', $coverFilename, 'public');
}
}
It gives me an error saying: Call to a member function storeAs() on string
I tried this solution from Stack Overflow but it didn't work.
Any suggestion will be appreciated. Thank you.
I think it is wrong to assume you are going to save a lot of time by executing the save operation in a queue, especially since you already have the file in the web request. As you scale, queues are often moved to worker servers, and with this approach that will not work.
In the spirit of the question, Stack Overflow, and explaining what is not working: file_get_contents() returns the contents of the file as a string, and you obviously cannot call methods on a string. To fix your problem, just store that string directly:
// $this->filename was already prefixed with the timestamp in the controller
$coverFilename = $this->filename;
$img = file_get_contents($this->files);
Storage::put('images/photosUploadedByUser/' . $coverFilename, $img);
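For context, a minimal sketch of the full job class under these assumptions (the class name ProceessFiles, the constructor and the storage path come from the question; the Storage facade import and the standard queue traits are the only additions):
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Storage;

class ProceessFiles implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    protected $files, $filename;

    public function __construct($files, $filename)
    {
        $this->files = $files;
        $this->filename = $filename;
    }

    public function handle()
    {
        if ($this->files) {
            // read the uploaded file from the path passed in and write it out
            $img = file_get_contents($this->files);
            Storage::put('images/photosUploadedByUser/' . $this->filename, $img);
        }
    }
}
Note that the temporary path from getRealPath() may already have been cleaned up by the time a worker picks the job up, which is another reason to do the store in the web request, as suggested above.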
I have been going round and round with this. I have uploads working with the following:
public function store(Tool $tool)
{
if (Input::hasFile('file')) {
$file = Input::file('file');
$name = $file->getClientOriginalName();
$path=Storage::put('public',$file); //Storage::disk('local')->put($name,$file,'public');
$file = new File;
$file->tool_id = $tool->id;
$file->file_name = $name;
$file->path_to_file = $path;
$file->name_on_disk = basename($path);
$file->user_name = \Auth::user()->name;
$file->save();
return back();
}
}
However, when I try to download with:
public function show($filename)
{
$url = Storage::disk('public')->url($filename);
///$file = Storage::disk('public')->get($filename);
return response()->download($url);
}
I get the FileNotFoundException from Laravel.
However, if I use this instead:
$file = Storage::disk('public')->get($filename);
return response()->download($file);
I get
FileNotFoundException in File.php line 37: The file "use calib;
insert into
notes(tool_id,user_id,note,created_at,updated_at)
VALUES(1,1,'windows server 2008 sucks',now(),now());" does not exist
which is the actual content of the file...
It can obviously find the file, but why won't it download?
Try this:
return response()->download(storage_path("app/public/{$filename}"));
Replace:
$file = Storage::disk('public')->get($filename);
return response()->download($file);
With:
return response()->download(storage_path('app/public/' . $filename));
response()->download() takes a path to a file, not the file contents. More information here: https://laravel.com/docs/5.4/responses#file-downloads
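For completeness, the same method also accepts an optional download name and an array of headers (a minimal sketch; the name and header values here are just examples):
return response()->download(
    storage_path('app/public/' . $filename),
    'download.txt',                                  // name shown to the browser (example)
    ['Content-Type' => 'application/octet-stream']   // example header
);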
If anyone still cannot find their file even though it clearly exists, then try:
return response()->file(storage_path('/app/' . $filename), $headers);
The problem could be a missing directory separator, or the file not being stored inside the public folder.
I am trying to upload multiple files but I only get one file in return. Below is my code:
public function uploadQuoteItemImage(){
$file=Input::file('filename');
$file_count=count($file);
dd($file_count);
$uploadcount=0;
foreach($file as $f){
$random_name=str_random(8);
$destinationPath='images/';
$extension=$file->getClientOriginalExtension();
$filename=$random_name.'_quote_itm_image.'.$extension;
$byte=File::size($file); //get size of file
$uploadSuccess=Input::file('filename')->move($destinationPath,$filename);
$uploadcount ++;
}
if ($uploadcount == $file_count){
QuoteItemImage::create(array(
'quote_item_id'=>Input::get('quote_item_id'),
'filename'=>$filename,
'filesize'=>$byte
));
return Common::getJsonResponse(true, 'image created', 200);
}
}
Even though I sent 3 files, it's returning only 1 file. Please help.
In the form-data of Postman you are giving the key attribute for the files as filename.
It should instead be filename[], since you are sending an array of data.
Once you set that, it works fine.
You can then loop over the files in the PHP code like below:
$files = Input::file('filename');
$listfilenames = [];
foreach ($files as $one) {
$filename = $one->getClientOriginalName();
$listfilenames[] = $filename;
}
print_r($listfilenames);
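On a related note (my own sketch, not part of the original answer), the upload loop in the question also reads from $file instead of the loop variable $f; once the filename[] key is set, a per-file version of that loop could look like this:
$files = Input::file('filename');
$uploadcount = 0;
foreach ($files as $f) {
    $random_name = str_random(8);
    $destinationPath = 'images/';
    $extension = $f->getClientOriginalExtension();
    $filename = $random_name . '_quote_itm_image.' . $extension;
    $byte = $f->getSize();                    // size of this particular file
    $f->move($destinationPath, $filename);
    $uploadcount++;
}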
I want to sync my localhost (Windows) with my remote server in real time and automatically, so when I modify, create or delete a file, this tool should update the remote server automatically. The application must keep both servers synchronized in real time. I really need your help. I tried FTPbox, but it doesn't always update, and I need something better. I'm working on Windows, but if something exists for Linux, that's fine too.
Thanks
WinSCP has a synchronization feature that does what you want.
For Linux users, you can have a look here.
Try Dropbox or Google Drive if you don't need to synchronize too much information.
I'm assuming that you want to synchronize the databases and files. This was my way out; I hope it will be of help to someone.
The first code is local, and the other one is remote.
<?php
//make sure you are connected to your local database
//function to check internet connection.
function is_connected() {
if($connected = fsockopen("www.example.com", 80)){
// website, port (try 80 or 443)
$is_conn = true; //action when connected
fclose($connected);
} else {
$is_conn = false; //action in connection failure
}
return $is_conn;
}
//if connected to internet, do the following...
if(is_connected()== true){
echo "connected";
ini_set('max_execution_time', 3000);//increase this in case of slow internet
$table_name = TableName::find_all();
//whatever way you find an array
//of all your entries on this particular
//table that you want to sync with the remote table.
$file = 'to_upload_local.php'; //a local file to put table contents into
$current = serialize($table_name);//serialize the table contents (google).
file_put_contents($file, $current);//put the serialized contents to the file.
$remote_file = 'public_html/to_upload_remote.php';//this is the file that is on the remote server that you want to overwrite with the local file to upload.
$ftp_server = "ftp.yourwebsite.org";// your ftp address
// set up basic connection
$conn_id = ftp_connect($ftp_server);
// login with username and password
$login_result = ftp_login($conn_id, "yourFTPUsername", "yourFTPPassword");
// turn passive mode on
ftp_pasv($conn_id, true);
// upload a file
if (ftp_put($conn_id, $remote_file, $file, FTP_ASCII)){
echo "Upload Successful";
} else {
echo "Upload Failed";
}
// close the connection
ftp_close($conn_id);
//this script called below is to update your remote database. It's in the next example
$call_script = file_get_contents('http://path_to_your_script');
} else {
//if not connected to internet,....
echo "offline";
}
?>
The online script that does the work (the one called in the last line of the previous code) should look something like this:
<?php
//make sure you're connected to the remote database
//this function compares entries from the two
//databases (local and remote) by id and returns the
//difference. It's used with the array_udiff function.
function compare_objects($obj_a, $obj_b) {
return $obj_a->id - $obj_b->id;
}
//this function compares the contents of entries from
//the two databases (local and remote) and returns the
//difference. It's used with the array_udiff function.
function comparison($obj_a, $obj_b){
if ($obj_a==$obj_b){
return 0;
}else{
return -1;
}
}
$file = '../to_upload_remote.php';//the uploaded file
$current = file_get_contents($file);//load the file
$array = unserialize($current);//unserialize to get the object array
$remote_table_name = remote_table_name::find_all();//get what you have in
//remote database
//if a new value is added, create a new entry to database with new vals
if($try_new = array_udiff($array, $remote_table_name, 'compare_objects')){
foreach($try_new as $entry){
$remote_table_name = new remote_table_name();
$remote_table_name->value = $entry->value;
//depending on the number of your columns,
//add values to remote table that were not there before.
//you can use any other suitable method to do this.
if($remote_table_name->save()){
echo "the remote_table_name was saved successfully";
}
}
} else {
echo "same number of rows";
}
//if some values are changed, update them with new vals
if($try_change = array_udiff($array, $remote_table_name, 'comparison')){
foreach($try_change as $entry){
$remote_table_name = remote_table_name::find_by_id($entry->id);
$remote_table_name->value = $entry->value;
//depending on the number of your columns,
//update values to remote table.
//you can use any other suitable method to do this.
if($remote_table_name->save()){
echo "the remote_table_name was saved successfully";
}
}
} else {
echo "All values match";
}
?>
So, any time the first script is executed, it reads the local table, takes all the values and puts them into the local file, uploads the local file to replace the one in the remote folder, then calls the remote script, which unserializes the uploaded table, compares it with the online table and does whatever is necessary.
EDIT
I tried debugging this with Xdebug and NetBeans. It's weird that the exports will work during the debug session if I put in some breakpoints. However, with no breakpoints (a more realistic environment), the exports don't work.
I've tried adding sleeps into some parts of the code.
I think that maybe PHP is ending before the Redis commit is completed. Maybe the Redis connections are being made asynchronously, but I checked Predis and the default is a synchronous connection.
I am working on a reporting tool.
Here is the basic issue.
We store a report into the session object but on later requests when we try to get to the report in the session object it's gone.
Here is a more detailed version.
I store a 'report' object into the session like so
$_SESSION['report_name_unixtimestamp'] = gzcompress( serialize( $reportObject ) );
The user sees the report in some table form and then if they want they can export it. The report could change so the idea behind storing it in the session like this is that when the user exports it to PDF, Excel, etc, they'll be getting a report identical to the one they are viewing.
The user clicks on an export button and on the PHP side it will go into the session, fetch the report via the key provided as a get parameter (uncompresses and unserializes it), create the export and send it to the user for download.
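The export side essentially does the reverse (my own sketch of the flow described above, not the actual code; the GET parameter name is hypothetical):
// the session key arrives as a GET parameter, e.g. ?report=report_name_unixtimestamp
$reportObject = unserialize( gzuncompress( $_SESSION[ $_GET['report'] ] ) );
// ...build the PDF/Excel export from $reportObject and send it for download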
This has worked well up until the point that we tried to introduce the Redis caching server as a tool for better session management.
What happens now is the following:
The first time we run the report it will get stored into the cache and the export will work successfully.
We will run the report again, with the same user account in the same session. This changes the unixtimestamp and so there should be two entries in the $_SESSION ( $_SESSION['report_name_oldertimestamp'] and $_SESSION['report_name_newertimestamp'] ). When we click on the export button again we get an error saying that the file doesn't exist ( because it hasn't been sent by the server ).
If we check the redis server for the newer version of the report it isn't there, but the old timestamp is still there.
Now, this worked with the file-based session management but not with Redis. We've tried the Redis module for PHP as well as the pure PHP client Predis.
Does anyone have any ideas?
Here are a few more details :
Redis has NOT run out of memory. We've checked this many times.
We already know that to unserialize the report object in the session the report class has to be included already. ( remember, the first export works fine but anything after that fails )
If we check the PHP session object during the request that the report is running on, it WILL contain the newer report, but it never makes it to Redis.
Below is the save handler that is being used with Predis.
redis_session_init is the function I call right before session_start() so that the handler gets registered. I'm not sure how the redis_session_write function works though, so maybe someone can help me with that.
<?php
namespace RedisSession
{
$redisTargetPrefix = "PHPREDIS_SESSION:";
$unpackItems = array( );
$redisServer = "tcp://cache.emcweb.com";
function redis_session_init( $unpack = null, $server = null, $prefix = null )
{
global $unpackItems, $redisServer, $redisTargetPrefix;
if( $unpack !== null )
{
$unpackItems = $unpack;
}
if( $server !== null )
{
$redisServer = $server;
}
if( $prefix !== null )
{
$redisTargetPrefix = $prefix;
}
session_set_save_handler( 'RedisSession\redis_session_open', 'RedisSession\redis_session_close', 'RedisSession\redis_session_read', 'RedisSession\redis_session_write', 'RedisSession\redis_session_destroy', 'RedisSession\redis_session_gc' );
}
function redis_session_read( $id )
{
global $redisServer, $redisTargetPrefix;
$redisConnection = new \Predis\Client( $redisServer );
return base64_decode( $redisConnection->get( $redisTargetPrefix . $id ) );
}
function redis_session_write( $id, $data )
{
global $unpackItems, $redisServer, $redisTargetPrefix;
$redisConnection = new \Predis\Client( $redisServer );
$ttl = ini_get( "session.gc_maxlifetime" );
$redisConnection->pipeline( function ($r) use (&$id, &$data, &$redisTargetPrefix, &$ttl, &$unpackItems)
{
$r->setex( $redisTargetPrefix . $id, $ttl, base64_encode( $data ) );
foreach( $unpackItems as $item )
{
$keyname = $redisTargetPrefix . $id . ":" . $item;
if( isset( $_SESSION[ $item ] ) )
{
$r->setex( $keyname, $ttl, $_SESSION[ $item ] );
}
else
{
$r->del( $keyname );
}
}
} );
}
function redis_session_destroy( $id )
{
global $redisServer, $redisTargetPrefix;
$redisConnection = new \Predis\Client( $redisServer );
$redisConnection->del( $redisTargetPrefix . $id );
$unpacked = $redisConnection->keys( $redisTargetPrefix . $id . ":*" );
foreach( $unpacked as $unp )
{
$redisConnection->del( $unp );
}
}
// These functions are all noops for various reasons... opening has no practical meaning in
// terms of non-shared Redis connections, the same for closing. Garbage collection is handled by
// Redis anyway.
function redis_session_open( $path, $name )
{
}
function redis_session_close()
{
}
function redis_session_gc( $age )
{
}
}
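For reference, registering and using the handler (as described above) looks roughly like this; the include path is just an example, and the server address is the default from the code:
require 'redis_session.php';   // the file containing the RedisSession namespace above

// register the handler, then start the session as usual
\RedisSession\redis_session_init( array(), 'tcp://cache.emcweb.com' );
session_start();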
The issue was solved and it was much dumber than I thought.
The save handler doesn't implement locking in any way. On the report pages there are multiple requests being made to the server via AJAX and the like. One of the AJAX requests starts before the report gets saved to session space, so it reads the session, then writes the session at the end.
Since the report executes faster every time, the report would get saved to the session in Redis, but it would then be overwritten by the other request, which still had an older version of the session.
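One way to guard against this (a sketch only, not what we actually shipped; the key name, wait interval and timeout are arbitrary) is to take a simple advisory lock in Redis around the session read/write cycle:
function redis_session_lock( $redisConnection, $id, $timeout = 30 )
{
    $lockKey = "PHPREDIS_SESSION_LOCK:" . $id;
    $attempts = 0;
    // spin until SETNX wins the lock, or give up after roughly $timeout seconds
    while( !$redisConnection->setnx( $lockKey, 1 ) )
    {
        usleep( 100000 ); // wait 100 ms between attempts
        if( ++$attempts > $timeout * 10 )
        {
            return false; // could not acquire the lock
        }
    }
    $redisConnection->expire( $lockKey, $timeout ); // never hold the lock forever
    return true;
}

function redis_session_unlock( $redisConnection, $id )
{
    $redisConnection->del( "PHPREDIS_SESSION_LOCK:" . $id );
}
Acquiring the lock at the start of redis_session_read and releasing it at the end of redis_session_write would make the second request wait until the first one has written its session data, instead of clobbering it.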
I had help from one of my co-workers. Ugh! This was a headache I'm glad to be over.