Sync my localhost with my server by FTP

I want to sync my localhost (Windows) with my remote server automatically and in real time, so that when I modify, create, or delete a file, the tool updates the remote server by itself. The application must keep both servers synchronized in real time. Please, I really need your help. I tried FTPbox, but it doesn't always update; I need something better. I'm working on Windows, but a Linux tool would work too.
Thanks

WinSCP has a synchronization feature that does what you want.
For Linux users, you can have a look here.
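If you want the sync to happen automatically rather than on demand, WinSCP's scripting interface also has a keepuptodate command that watches a local folder and uploads changes as they happen. A minimal sketch (host, credentials, and paths are placeholders):
winscp.com /script=sync.txt
where sync.txt contains:
open ftp://yourUser:yourPassword@ftp.example.com/
keepuptodate C:\local\www /public_html
exit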

Try Dropbox or Google Drive if you don't need to synchronize too much information.

I'm assuming that you want to synchronize the databases and files. This was my way out; I hope it will be of help to someone.
The first script runs locally, and the other one runs on the remote server.
//make sure you are connected to your local database
<?php
//function to check internet connection.
function is_connected() {
    // host, port (try 80 or 443)
    $connected = @fsockopen("www.example.com", 80);
    if ($connected) {
        fclose($connected);
        return true; //connected
    }
    return false; //connection failure
}

//if connected to internet, do the following...
if (is_connected()) {
    echo "connected";
    ini_set('max_execution_time', 3000); //increase this in case of slow internet
    $table_name = TableName::find_all();
    //whatever way you find an array of all the entries on the
    //particular table that you want to sync with the remote table.
    $file = 'to_upload_local.php'; //a local file to put the table contents into
    $current = serialize($table_name); //serialize the table contents
    file_put_contents($file, $current); //write the serialized contents to the file

    //the file on the remote server that you want to overwrite with the uploaded file
    $remote_file = 'public_html/to_upload_remote.php';
    $ftp_server = "ftp.yourwebsite.org"; //your FTP address

    // set up basic connection
    $conn_id = ftp_connect($ftp_server);
    // login with username and password
    $login_result = ftp_login($conn_id, "yourFTPUsername", "yourFTPPassword");
    // turn passive mode on
    ftp_pasv($conn_id, true);
    // upload the file; use FTP_BINARY so the serialized data is not altered in transit
    if (ftp_put($conn_id, $remote_file, $file, FTP_BINARY)) {
        echo "Upload Successful";
    } else {
        echo "Upload Failed";
    }
    // close the connection
    ftp_close($conn_id);

    //the script called below updates the remote database; it is in the next example
    $call_script = file_get_contents('http://path_to_your_script');
} else {
    //if not connected to internet...
    echo "offline";
}
?>
The online script that does the work (the one called in the last line of the previous code) should look something like this:
//make sure you're connected to the remote database
<?php
//compares entries from the two databases (local and remote) by id,
//to find rows that exist locally but not remotely.
//It's used with the array_udiff function.
function compare_objects($obj_a, $obj_b) {
    return $obj_a->id - $obj_b->id;
}

//compares the contents of entries from the two databases (local and
//remote) to find rows whose values changed. array_udiff expects a real
//comparator (negative, zero, or positive), so compare the serialized
//forms instead of returning only 0 or -1.
function comparison($obj_a, $obj_b) {
    return strcmp(serialize($obj_a), serialize($obj_b));
}

$file = '../to_upload_remote.php'; //the uploaded file
$current = file_get_contents($file); //load the file
$array = unserialize($current); //unserialize to get the object array
$remote_table_name = remote_table_name::find_all(); //what is currently in the remote database

//if a new value was added locally, create a new database entry with the new values
if ($try_new = array_udiff($array, $remote_table_name, 'compare_objects')) {
    foreach ($try_new as $entry) {
        //use a fresh variable here so the $remote_table_name array
        //used by the second array_udiff below is not overwritten
        $new_entry = new remote_table_name();
        $new_entry->value = $entry->value;
        //depending on the number of your columns,
        //add the values that were not in the remote table before.
        //you can use any other suitable method to do this.
        if ($new_entry->save()) {
            echo "the remote_table_name was saved successfully";
        }
    }
} else {
    echo "same number of rows";
}

//if some values were changed locally, update the remote rows with the new values
if ($try_change = array_udiff($array, $remote_table_name, 'comparison')) {
    foreach ($try_change as $entry) {
        $changed_entry = remote_table_name::find_by_id($entry->id);
        $changed_entry->value = $entry->value;
        //depending on the number of your columns,
        //update the values in the remote table.
        //you can use any other suitable method to do this.
        if ($changed_entry->save()) {
            echo "the remote_table_name was saved successfully";
        }
    }
} else {
    echo "All values match";
}
?>
So, every time the first script is executed, it reads the local table, serializes all its values into the local file, uploads that file to replace the one in the remote folder, and calls the remote script, which unserializes the uploaded table, compares it with the online table, and makes the necessary changes.
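Note that these scripts only sync at the moment they run, so to approximate real-time syncing you would have to trigger the local script repeatedly. A minimal sketch, assuming the script lives at a hypothetical path and PHP is available on the command line, is a cron entry on Linux:
* * * * * php /path/to/local_sync_script.php
On Windows, Task Scheduler can run the same command on a schedule.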

Related

Laravel update runs well on my local machine but on the live server returns error "Creating default object from empty value"

I have this code which runs very well on my local machine, but on the live server it returns the error "Creating default object from empty value". I have checked whether the value is null, but it is not.
What the piece of code does: if the value is not found in the database, it creates a new record, else it updates the existing one. The create works very well, but the update does not work on the live server even though it works on my local machine.
The problem is in the else part of the program, where $id_num >= 1, on this line: $ex->subject=$exams[$i]['value'];
public function saveExam(Request $request){
    if($request->id==""){
        return response()->json(['error'=>'Please select student']);
    }
    $exams=array_slice($request->exam,0,count($request->exam)-2);
    $class_rec=array_slice($request->exam,count($request->exam)-2,1);
    // return response()->json(['success'=>$class[0]['value']]);
    $data;
    $id=$request->id;
    $sess=settings_session::find(1);
    $session=$sess->session;
    $term=$sess->term;
    $id_num=0;
    $table;
    if($request->level=="primary"){
        $table="App\\exam_report";
        $id_num=exam_report::where('student_id',$id)->where('session',$session)->where('term',$term)->count();
    }else if($request->level=="nursery"){
        $table="App\\nursery_exam_report";
        $id_num=nursery_exam_report::where('student_id',$id)->where('session',$session)->where('term',$term)->count();
    }
    //else if($request->level=="pre-nursery"){
    //    $table="App\\pnursery_exam_report";
    //}
    else if($request->secondary){
        $table="App\\secondary_exam_report";
    }
    if($id_num <= 0){
        for ($i=0; $i < count($exams) ; $i++) {
            $ex=new $table;
            $ex->subject=$exams[$i]['value'];
            $i++;
            $ex->first_test=$exams[$i]['value'];
            $i++;
            $ex->second_test=$exams[$i]['value'];
            $i++;
            $ex->exam=$exams[$i]['value'];
            $i++;
            $ex->total=$exams[$i]['value'];
            $i++;
            $ex->grade=$exams[$i]['value'];
            $ex->student_id=$id;
            $ex->class=$class_rec[0]['value'];
            $ex->term=$term;
            $ex->session=$session;
            $ex->save();
        }
    }else if($id_num >=1){
        $exa=$table::where('student_id',$id)->first();
        $ids=$exa->id;
        $id_it=$ids;
        for ($i=0; $i < count($exams) ; $i++) {
            $ex=$table::find($id_it);
            $ex->subject=$exams[$i]['value'];
            $i++;
            $ex->first_test=$exams[$i]['value'];
            $i++;
            $ex->second_test=$exams[$i]['value'];
            $i++;
            $ex->exam=$exams[$i]['value'];
            $i++;
            $ex->total=$exams[$i]['value'];
            $i++;
            $ex->grade=$exams[$i]['value'];
            $ex->student_id=$id;
            $ex->class=$exams[$i]['value'];
            $ex->term=$term;
            $ex->session=$session;
            $ids=$exa->id;
            $ex->save();
            $id_it++;
        }
    }
    return response()->json(['success'=>'Success']);
}
It is expected to update the table, but the error occurs on $ex->subject=$exams[$i]['value'] in the branch where $id_num >= 1.
Check all the data, collations, and names on the remote MySQL server; also check the capitalization (on Windows, case doesn't matter for table names, but on Linux it does).
I found the answer. The problem is that in the database the ids are not arranged serially, and each time I do this
$exa=$table::where('student_id',$id)->first();
it picks the highest id, and when I try to add the next id, which does not exist, I get the error. On my local server the database ids are arranged well; I don't know why the behavior differs. The solution to my problem is changing that line to:
$exa=$table::where('student_id',$id)->get();
$ids=$exa->min('id');
$id_it=$ids;
Instead of pulling the first record, I get all the records and find the minimum id.
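A side note on why the servers behaved differently: without an explicit ORDER BY, the database is free to return rows in any order, so first() is not deterministic. An alternative sketch that avoids loading every row (standard Eloquent, but untested against this schema):
$exa = $table::where('student_id', $id)->orderBy('id')->first();
$id_it = $exa->id;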

Laravel Collective SSH results

I am performing SSH in Laravel whereby I connect to another server and download a file. I am using Laravel Collective https://laravelcollective.com/docs/5.4/ssh
So, the suggested way to do this is something like this:
$result = \SSH::into('scripts')->get('/srv/somelocation/'.$fileName, $path);
if ($result) {
    return $path;
} else {
    return 401;
}
Now that successfully downloads the file and moves it to my local server. However, I am always returned 401 because $result seems to be null.
I can't find much on getting the result back from the SSH call. I have also tried
$result = \SSH::into('scripts')->get('/srv/somelocation/'.$fileName, $path, function($line){
    dd($line.PHP_EOL);
});
But that never gets into the inner function.
Is there any way I can get the result back from the SSH call? I just want to handle it properly if there is an error.
Thanks
Rather than rely on $result to give you true / false / error, you can check if the file was downloaded successfully in another way:
// download the file
$result = \SSH::into('scripts')->get('/srv/somelocation/'.$fileName, $path);
// see if the downloaded file exists
if (file_exists($path)) {
    return $path;
} else {
    return 401;
}
You need to pass the file name as well, like this, in the get and put methods:
$fileName = "example.txt";
$get = \SSH::into('scripts')->get('/remote/somelocation/'.$fileName, base_path($fileName));
And for the put method:
$put = \SSH::into('scripts')->put(base_path($fileName), '/remote/location/'.$fileName);
To list files:
$command = SSH::into('scripts')->run(['ls -lsa'], function($output) {
    dd($output);
});

Breeze Fetch Strategy always goes remote

I'm having a problem where Breeze always goes to the server even though I've specified FetchStrategy.FromLocalCache. I created a test script below. The initial query goes remote as expected. The second query also goes remote (FetchStrategy.FromLocalCache). The third query (executeQueryLocally) goes to the local cache. From developer tools I can see there are 2 network requests (not including metadata). What am I doing wrong?
getCategories = function (observable) {
    var query = breeze.EntityQuery
        .from("Categories")
        .orderBy('Order');
    manager.executeQuery(query) //goes remote
        .then(fetchSucceeded)
        .fail(queryFailed);
    function fetchSucceeded(data) {
        // observable(data.results);
        getCategoriesLocal(observable);
    }
},
getCategoriesLocal = function (observable) {
    var query = breeze.EntityQuery
        .from("Categories")
        .orderBy('Order');
    query.using(breeze.FetchStrategy.FromLocalCache);
    manager.executeQuery(query) //also goes remote
        .then(fetchSucceeded)
        .fail(queryFailed);
    function fetchSucceeded(data) {
        d = manager.executeQueryLocally(query); //goes local
        observable(d);
        return;
    }
},
Instead of
query.using(breeze.FetchStrategy.FromLocalCache);
you need to reassign it, i.e.
query = query.using(breeze.FetchStrategy.FromLocalCache);
In Breeze, all EntityQueries are immutable, which means that any time you apply a change to an EntityQuery you get a new one. This is by design, so that no query can be changed out from under you by a later modification.
Alternatively you can simply use
manager.executeQuery(query.using(breeze.FetchStrategy.FromLocalCache));

Saving Data Locally and Remotely (Syncing)

When data is entered, it ultimately needs to be saved remotely on a server. I also want the app to work when there is no data connection, so I need to save everything locally on the phone too. The app can then sync with the server when it gets a connection.
This brings up a little issue. I'm used to saving everything on the server and then getting the records back with server-generated ids. If there is no connection, the app will save locally to the phone but not to the server. When syncing with the server later, I don't see a way for the phone to know which local record an incoming record is associated with. There isn't enough unique data to figure this out.
What is the best way to handle this?
One way I've been thinking is to change the id of the records to a GUID and let the phone set the id. This way, all records will have an id locally, and when saving to the server, it should still be a unique id.
I'd like to know what other people have been doing, and what works and what doesn't from experience.
This is how we did it in our first Windows Phone 7 app, which my friend and I finished a few days ago.
It might not be the best solution, but until further refactoring it works just fine.
It's an application for a web app similar to mint.com, called slamarica.
For a feature like saving a transaction, we first check whether we have an internet connection.
// Check if application is in online or in offline mode
if (NetworkDetector.IsOnline)
{
    // Save through REST API
    _transactionBl.AddTransaction(_currentTransaction);
}
else
{
    // Save to phone database
    SaveTransactionToPhone(_currentTransaction);
}
If the transaction is successfully saved via REST, the API responds with the transaction object and then we save it to the local database. If the REST save fails, we save the unsent data to the local database instead.
private void OnTransactionSaveCompleted(bool isSuccessful, string message, Transaction savedTransaction)
{
    MessageBox.Show(message);
    if (isSuccessful)
    {
        // save new transaction to local database
        DatabaseBl.Save(savedTransaction);
        // save to observable collection Transactions in MainViewModel
        App.ViewModel.Transactions.Add(App.ViewModel.TransactionToTransactionViewModel(savedTransaction));
        App.ViewModel.SortTransactionList();
        // Go back to Transaction List
        NavigationService.GoBack();
    }
    else
    {
        // if REST failed, save the unsent transaction to the phone database
        SaveTransactionToPhone(_currentTransaction);
        // save to observable collection Transactions in MainViewModel
        App.ViewModel.Transactions.Add(App.ViewModel.TransactionToTransactionViewModel(_currentTransaction));
        App.ViewModel.SortTransactionList();
    }
}
Every Transaction object has an IsInSync property. It is set to false by default and stays that way until we get confirmation from the REST API that it was saved successfully on the server.
The user can refresh transactions by clicking a Refresh button to fetch new data from the server. We do the syncing in the background like this:
private void RefreshTransactions(object sender, RoutedEventArgs e)
{
    if (NetworkDetector.IsOnline)
    {
        var notSyncTransactions = DatabaseBl.GetData<Transaction>().Where(x => x.IsInSync == false).ToList();
        if (notSyncTransactions.Count > 0)
        {
            // we must sync all transactions
            _isAllInSync = true;
            _transactionSyncCount = notSyncTransactions.Count;
            _transactionBl.AddTransactionCompleted += OnSyncTransactionCompleted;
            if (_progress == null)
            {
                _progress = new ProgressIndicator();
            }
            foreach (var notSyncTransaction in notSyncTransactions)
            {
                _transactionBl.AddTransaction(notSyncTransaction);
            }
            _progress.Show();
        }
        else
        {
            // just refresh transactions
            DoTransactionRefresh();
        }
    }
    else
    {
        MessageBox.Show(ApplicationStrings.NETWORK_OFFLINE);
    }
}

private void DoTransactionRefresh()
{
    if (_progress == null)
    {
        _progress = new ProgressIndicator();
    }
    // after all data is sent do full reload
    App.ViewModel.LoadMore = true;
    App.ViewModel.ShowButton = false;
    ApplicationBl<Transaction>.GetDataLoadingCompleted += OnTransactionsRefreshCompleted;
    ApplicationBl<Transaction>.GetData(0, 10);
    _progress.Show();
}
In OnTransactionsRefreshCompleted we delete all transaction data in the local database and store the latest 10 transactions. We don't need all the data, and this way the user has synced data. They can always load more by tapping "load more" at the end of the transaction list, similar to Twitter apps.
private void OnTransactionsRefreshCompleted(object entities)
{
    if (entities is IList<Transaction>)
    {
        // save transactions
        var transactions = (IList<Transaction>)entities;
        DatabaseBl.TruncateTable<Transaction>();
        DatabaseBl.Save(transactions);
        ((MainViewModel)DataContext).Transactions.Clear();
        // reset offset
        _offset = 1;
        // update list with new transactions
        App.ViewModel.LoadDataForTransactions(transactions);
        App.ViewModel.LoadMore = false;
        App.ViewModel.ShowButton = true;
    }
    if (entities == null)
    {
        App.ViewModel.ShowButton = false;
        App.ViewModel.LoadMore = false;
    }
    // hide progress
    _progress.Hide();
    // remove event handler
    ApplicationBl<Transaction>.GetDataLoadingCompleted -= OnTransactionsRefreshCompleted;
}
Caveat: I haven't tried this with Windows Phone development, but using GUID identities is something I usually do when faced with similar situations, e.g. creating records when I only have a one-way connection to the database, such as via a message bus or queue.
It works fine, albeit with a minor penalty in record size, and it can also make indexes less performant. I suggest you just give it a shot.
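To illustrate the GUID idea in code: the client assigns the id itself, so the server never has to send back a generated key. A minimal PHP sketch of a version 4 GUID generator (the comments about the sync flow are assumptions, not taken from the answers above):
<?php
function new_guid() {
    $data = random_bytes(16); // PHP 7+
    $data[6] = chr((ord($data[6]) & 0x0f) | 0x40); // set version 4
    $data[8] = chr((ord($data[8]) & 0x3f) | 0x80); // set RFC 4122 variant
    return vsprintf('%s%s-%s-%s-%s-%s%s%s', str_split(bin2hex($data), 4));
}

$id = new_guid(); // save the record locally under this id now,
                  // and upload it later under the same id
?>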

Anti-flood: session or DB storing IPs

Right now I'm using an anti-flood function on all my websites:
function flood($name, $time)
{
    $name = 'tmptmptmp'.$name;
    if (!isset($_SESSION[$name])) {
        $_SESSION[$name] = time();
        return true;
    }
    // allow again only once more than $time seconds have passed
    // since the last recorded action
    if (time() - $_SESSION[$name] > $time) {
        $_SESSION[$name] = time();
        return true;
    }
    return false;
}
I use it this way:
if (flood('post', 60)) {
    // do something
} else {
    echo "You're posting too fast";
}
Is this way safe? Or do I need to replace or complement it with a DB table storing IPs and checking whether they made a request earlier?
It depends. How likely are your users to clear their cookies to get past your anti-flood protection? I'd say that if they have to log in again, 99% of users won't even bother.
But sure, if you really want a better method, store the IPs in the DB. But even that can be defeated by getting a new IP.
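If you do go the database route, a minimal sketch of an IP-based check could look like this ($pdo is an existing PDO connection; the flood table and the flood_db helper are hypothetical):
<?php
function flood_db(PDO $pdo, $name, $seconds) {
    // assumed table: flood(ip VARCHAR(45), name VARCHAR(32),
    //                      last_action INT, PRIMARY KEY (ip, name))
    $ip = $_SERVER['REMOTE_ADDR'];
    $stmt = $pdo->prepare('SELECT last_action FROM flood WHERE ip = ? AND name = ?');
    $stmt->execute([$ip, $name]);
    $last = $stmt->fetchColumn();
    if ($last !== false && time() - (int)$last <= $seconds) {
        return false; // still inside the cooldown window
    }
    // record this action (MySQL upsert syntax)
    $stmt = $pdo->prepare('INSERT INTO flood (ip, name, last_action) VALUES (?, ?, ?)
        ON DUPLICATE KEY UPDATE last_action = VALUES(last_action)');
    $stmt->execute([$ip, $name, time()]);
    return true;
}
?>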
