Undo deletion of files caused by rsync - macOS

I have a CentOS 6 VM running on my Mac. I was developing remotely over FTP directly on the VM, and I had the vagrant rsync-auto command running.
Not realizing that rsync-auto was running, I tried to copy the project I had created onto my Mac, into the same directory it was rsyncing. Apparently I lost all the files I created, which amounts to a whole day's work.
Is it possible to get the files back?

This is a data recovery issue, not a programming issue, but I would venture a guess that the chance of recovery is close to 0%. Not only is the data gone/overwritten, but it was inside a VM, so the recovery would have to target the VM disk at an unknown point in the past, which has since been overwritten by a newer machine state.
Perhaps this will be a good lesson in creating backups/cron jobs, or using version control like Git?
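For what it's worth, even a dumb hourly snapshot would have limited the damage to an hour's work. A minimal sketch as a crontab entry (the paths are made up, and note that % must be escaped inside crontabs):
# Keep hourly tarballs of the project folder; create ~/backups first.
0 * * * * tar -czf "$HOME/backups/project-$(date +\%Y\%m\%d\%H).tar.gz" -C "$HOME/project" .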

Related

Migrate FreeNAS Data to Windows (over SMB)

My FreeNAS server is slowly dying, and before that happens I need to migrate all data in the NAS to a Windows server.
The FreeNAS has ZFS snapshots, and I need to restore data from a few days ago to the Windows server.
I have done some research and I can't think of the best way to do this (I am not Linux/ZFS savvy).
So the thing I need to do is:
Restore a ZFS snapshot from a few days ago to a Windows server.
I mounted a Windows share on the FreeNAS using mount_smbfs //username:password@server.name/share_name share_name/
I can copy and create files on that share just fine. So I was wondering if it is possible to restore an entire dataset from a snapshot to the Windows share.
Any help or tips are much appreciated.
Note: I could easily copy all data on a FreeNAS volume to the Windows share. What makes it complicated for me is restoring data from a snapshot, without overwriting the current data on the volume, and moving that data to the Windows share.
You have two sensible possibilities:
Access the ZFS dataset (shared over SMB) from your Windows Server, then right-click it in Explorer and choose "Previous Versions". You will get (after a short time, depending on the number of snapshots) a list of all snapshots with their dates. You can then either explore them and copy some files over, or choose to copy everything to another location (e.g. your new share).
Mount the Windows share on FreeNAS like you did, then go to <pool>/<filesystem>/.zfs/snapshot/ (path completion on the shell might be turned off for the .zfs directory, so type it in manually). There you'll find all your snapshots (like you would have on Windows' Previous Versions) and you can copy some or all files over to the new directory.
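As a sketch of the second option (all names here are made up; substitute your own pool, dataset, snapshot name and mount point):
# On FreeNAS, pools live under /mnt; "tank/data" and the snapshot name are hypothetical.
mkdir share_name/restored
cp -Rp /mnt/tank/data/.zfs/snapshot/manual-20160101/. share_name/restored/
The trailing /. on the source makes cp copy the snapshot's contents, including dotfiles, rather than the directory itself.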
I would suggest the first way, because you have the GUI and cannot do any harm to the FreeNAS system this way.
On the other hand, have you thought about the possibility of rescuing the system? You did not specify why it's dying, but things like hard drives or mainboards can be swapped quite easily without having to set everything up anew. Maybe this would help you more than moving the data off to another, unconfigured system?

File sharing between OS X & Windows 10 on an htdocs directory?

I have a practical question about my own work at home. I want to use a quad-monitor setup for my coding and other work. I can do this with my MacBook Pro attached to an external triple monitor, but it is not practical because of all the cable management, and the MacBook Pro barely keeps up with the performance of running it. So what I wanted to do is have my PC run the triple monitor and use my MacBook as a fourth screen: code on my PC and share/update the files in the htdocs directory on OS X, like how FTP works.
I found this link: http://www.itworld.com/article/2844141/how-to-share-mac-os-x-yosemite-files-with-windows-10.html
But I'm not sure if I will face sudden obstacles in doing this with my htdocs directory or other directories where my work is stored and updated from time to time (for example, Symfony projects).
I hope I mentioned everything. Thanks in advance!
Well, you can use one of the free cloud-based file-sharing services, like Microsoft OneDrive, Google Drive or Dropbox.
But files will not be updated immediately; you need to wait a few seconds (in the best-case scenario). So it might get frustrating quickly.
Also, from my experience, OneDrive on Mac is not the best choice when it comes to a Symfony project - it stops working after a while, probably because of the many cache files, so I need to restart it and it's not usable at all.
Another solution might be using a version control system (e.g. Git) - but you would be able to see the code changes only after a commit and push (and you'd have to do that manually, of course).
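If you go the Git route, the manual round trip would look something like this (the remote name and branch are assumptions):
# On the Windows PC, after editing:
git add -A && git commit -m "update htdocs" && git push origin master
# On the Mac, inside the htdocs working copy:
git pull origin master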

What happens after OS X Internet Recovery without erasing disk

I want to reinstall OS X Yosemite on my MacBook Pro (2015) without erasing the disk. What will I need to reinstall after the process is done? What files are not removed from my hard drive? What files ARE removed from the disk?
Thanks!
Run a backup to save all of your data via Time Machine.
Reinstall OS X.
Restore your data from Time Machine if it's not available after the reinstall.
Run Disk Utility and have it search for and fix any permissions issues, which will almost certainly occur when doing a reinstall with applications pulled from a previous install.
DON'T attempt a reinstall without making a backup via Time Machine to an external drive.
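From the command line, the backup and permissions steps look roughly like this (repairPermissions exists on Yosemite and earlier; it was removed in El Capitan):
sudo tmutil startbackup --block   # kick off a Time Machine backup and wait for it to finish
sudo diskutil repairPermissions / # verify and repair permissions after the reinstall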
From @bmike:
"Almost nothing is actually removed. What happens is the installer downloads a fresh and complete set of system files - basically the things that came with the Mac out of the factory minus some iLife apps like Garageband for instance.
The installer doesn't delete any user settings, user files and doesn't even delete apps you installed.
The process is designed to make the system work whether or not you have large or small changes to the system."

Using WebStorm (JetBrains) with SSHFS mounted development server (Mavericks, OSXFUSE)? Constantly dismounts drive

UPDATE: I saw that someone was trying to use PyCharm with SSHFS and JetBrains said: "no". Perhaps this just won't work?
I'm trying to work with WebStorm on an SSHFS-mounted disk at a client's office I'm working at; I've never used SSHFS before. I am using OS X 10.9.2, installed SSHFS through Homebrew, and installed OSXFUSE.
The SSHFS mount dismounts periodically in any case, but since I started trying to use WebStorm with it, it dismounts every time I start WebStorm and WebStorm begins scanning the files on the SSHFS disk. WebStorm gives the message "external file changes sync may be slow: Project files cannot be watched (are they under network mount?)" and if I try to open files it freezes. The SSHFS disk, meanwhile, has been dismounted. If I remount via the terminal, WebStorm isn't happy and either freezes or just sits there.
I set up the WebStorm project using "New project from existing files". Is there a way to set it up using SSHFS as a server? Beyond the login and password for the SSHFS disk I don't have any other server-specific info, but perhaps I could get it.
Thanks for any help.
This is how I operate, and maybe it can help you. If there's a config setting I seem to have glossed over, just ask and I'll fix this up. But all in all, this is wonderfully successful:
My build environment is tucked away on a Linux distro, but my development environment is co-located on a Mac Desktop (when I'm at work) and a Mac Air (when I'm at home). My projects are enormous, and contractually I can't move the code to any machine where it might be accessible if my laptop is stolen. So I pretty much have to use ssh (and sshfs) to get anything done.
When I am at home, and I sit down to work, I manually initiate the VPN -- since there are so many variations, I'll assume you know how to do this part.
I open a terminal and invoke:
caffeinate &
because I hate getting disconnected whenever the computer goes into screen saver. This may be why you get disconnected? I leave this terminal open whenever I'm developing. I also use tmux so that my terminal session can be shared between computers.
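In case tmux is unfamiliar, the sharing pattern is just (the session name is arbitrary):
tmux new -s work     # start a named session on the server
tmux attach -t work  # reattach later, from whichever machine you ssh in from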
Anyway, I set up a mount point between the server and the client. I have a script that I run when the mount point goes down (customize it for your own work):
# Force-unmount any stale mounts left over from a dropped connection.
umount -f /Volumes/$MOUNTDIR/
umount -f /Users/$HOMEUSER/$MOUNTDIR
# Recreate the local mount point and remount the remote directory.
mkdir /Users/$HOMEUSER/$MOUNTDIR
sshfs $HOMEUSER@$SERVERADDR:/usr/$HOMEUSER/$MOUNTDIR /Users/$HOMEUSER/$MOUNTDIR
I then launch WebStorm, PyCharm, ADS or IntelliJ (I'm a JetBrains fan).
At this point you can open the directory within $MOUNTDIR and start working. If you find that you need to run builds, here's a tip: do not build locally. Instead, use SSH to issue the build commands (or run scripts) on the server. The overhead of syncing after the build has run is most likely far less than fetching and writing every intermediate step of the build.
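For example, reusing the variables from the mount script above (make here is a placeholder for whatever your actual build command is):
# Run the build where the files actually live, instead of through the mount:
ssh $HOMEUSER@$SERVERADDR "cd /usr/$HOMEUSER/$MOUNTDIR && make"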
I only find I get disconnected if I lose the VPN. I used to get disconnected whenever the computer would sleep. Caffeinate fixed that.
For reasonably sized projects, this is probably all you need. So what follows is an optimization; only do it if you are having headaches:
To speed up load times, what I do is create a local project that is not part of the mount. A .idea directory gets created at the base of the first directory you open as a project, and the files inside it are written to a lot; depending on your network speed, that can cause grief. It does mean some settings have to be maintained everywhere you go, but in my case that's a small price to pay for big performance gains.
So because I do this, I have to manually add directories to my project (under Preferences/Directories). But if you work with huge APIs, you might be doing this anyway. I am careful to mark directories I don't need to reference as 'excluded', to make life easier on the indexer. I work in a shared directory structure with thousands of other employees, and I make sure the streams don't cross.
Now I have many thousands of files, and it is true that sync can be slow. But sync is only triggered when you leave the app and come back in, and honestly, it's not that terrible as long as you have a reasonable internet connection.
I hope this helps. Once I started using this as my workflow, I never went back.

Performance of XAMPP running on Dropbox

I'm running XAMPP on my company notebook, but I would also like to access it at home with my desktop PC, which runs on a fast SSD.
I considered moving my XAMPP directory to Dropbox.
Can someone tell me about the performance implications? Will the Dropbox background sync processes affect my XAMPP's Apache/MySQL performance when there's lots of ongoing coding work and changes in my scripts, or will it perform just as fast as it would when located in c:\xampp? Thanks
It shouldn't. When syncing, you are basically downloading the files to your PC, so you're running them from your local box. Since you are using an SSD, I doubt you'll come close to using all of your SSD's throughput, and you won't be touching the internet when coding locally.
It will cause a bit of CPU/network usage even though you're running XAMPP on a local box (the sync needs to check whether files were changed/added/removed, plus transfer the changes). Consider pausing the sync while working on XAMPP, and starting it again when you're done.
