As part of a corporate workstation compliance effort, our IT department had to create a new user account for me on my Mac and left the old account in place until I migrate all my artifacts and configs. So my old home directory was /Users/firstlast and the new one is /Users/flast. Of course, the burden of reconfiguring my environment fell on me.
Some of the apps that I use, such as ssh, Maven, Dropbox and DBeaver, have their config conveniently and cleanly stored in the home directory (.ssh, .m2, .dropbox and .dbeaver, respectively), so migrating those was a cakewalk: just copy the directory from the old home to the new using sudo, then chown it. However, that is not the case with IntelliJ IDEA.
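For example, migrating the Maven config looked something like this (a minimal sketch using the user names above; substitute your own and repeat per dot-directory):
# copy the old dot-directory into the new home, then hand it over to the new user
sudo cp -R /Users/firstlast/.m2 /Users/flast/.m2
sudo chown -R flast /Users/flast/.m2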
My IDEA was heavily customized, to the extent that I would like to avoid doing it all over again if at all possible. I understand that each project has its own config in the project directory, but what I am after is getting the same list of projects to open when I launch IDEA under the new user as I did under the old user. All my projects were in ~/DEV (so ~/DEV/project1, ~/DEV/project2, etc.), and I can just copy the DEV directory from the old home to the new.
Where is this list of projects stored? I imagine it is somewhere in /private/etc or /private/var but permissioned to the old user so the new user does not see it.
This JetBrains documentation lists where the important directories are:
http://devnet.jetbrains.com/docs/DOC-181
On Mac OS X IDEA uses the following directories:
Config: ~/Library/Preferences/IntelliJIdeaXX
System: ~/Library/Caches/IntelliJIdeaXX
Plugins: ~/Library/Application Support/IntelliJIdeaXX
Logs: ~/Library/Logs/IntelliJIdeaXX (starting from IntelliJ IDEA 9.0, older versions keep logs under System location)
While each project has its own config contained within the project directory, what you really want is to see the same list of projects to choose from when you first launch IDEA as you had before, and just copying the project directories will not do that.
Here are three easy steps to do it:
If your projects were located in your old home directory, copy them into the new home directory and chown them to the new user (see the sketch after the commands below). If they were located outside your old home directory, all you need to do is chown them to the new user.
sudo cp -r /Users/${OLD_USER}/Library/Preferences/IdeaIC13 /Users/${NEW_USER}/Library/Preferences
sudo chown -R ${NEW_USER} /Users/${NEW_USER}/Library/Preferences/IdeaIC13
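For the first step, assuming your projects live in ~/DEV as in the question above, the copy follows the same pattern (a sketch; adjust the directory name to your own layout):
# copy the project tree across and give it to the new user
sudo cp -R /Users/${OLD_USER}/DEV /Users/${NEW_USER}/DEV
sudo chown -R ${NEW_USER} /Users/${NEW_USER}/DEV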
Ta-da. Open your IDE and it looks identical to before.
Related
I installed and am running Ontotext GraphDB v10.1.0 (free desktop edition, Windows). Everything works fine: I can create repositories, run SPARQL, etc.
The server and UI are both loading/running/reporting repositories in the C:\Users\<Username>\AppData\Roaming\Graph\data\repositories folder.
However, when running the ImportRdf.cmd utility, it is "attaching to"/creating the repository in the C:\Users\<Username>\AppData\Local\Graph\data\repositories folder instead!?
I tried adding the correct path to C:\Users\<user>\AppData\Local\GraphDB Desktop\app\GraphDB Desktop.cfg, but it makes no difference.
Anyone experienced this/got any fixes?
The data (repository) directory can be set through the system or config property graphdb.home.data. The default value is the data subdirectory relative to the GraphDB home directory. For example, one way to configure it: go into the bin folder of the GraphDB distribution and start GraphDB with the following command:
./graphdb -Dgraphdb.home="full path to where you want your repo directory"
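If you only want to relocate the data directory, the same -D mechanism should work with the graphdb.home.data property mentioned above (a sketch; the quoted path is a placeholder):
# point just the data/repositories location, leaving the rest of the GraphDB home untouched
./graphdb -Dgraphdb.home.data="full path to where you want your repo directory"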
After updating macOS to Mojave (10.14.4), my Mac was restarted, and upon opening Jenkins (at localhost:8080) it appeared that I had lost all my jobs and the entire system configuration.
There was only one user (admin) defined in my installation, and my usual password was deemed invalid when I tried to log back in. So I tried entering another password I normally use, and it was accepted. I then found that all my jobs and configs had disappeared. It looked as if I had just started Jenkins for the first time.
Looking through StackOverflow, there were suggestions to check the JENKINS_HOME variable to find out where the jobs are saved on disk, but when I typed export $JENKINS_HOME I just got an empty response. So it looks like I never configured it during setup.
I then dug through the hard drive and found folders matching the names of the jobs I created under ~/.jenkins/workspace. However, all of those folders were empty. I was expecting to see the usual files, e.g. build.xml, config.xml, etc.
I then did a global search for build.xml and config.xml in Finder and it turned up nothing.
Any idea where my jobs went and what could have caused all the contents of the folders of the jobs to be empty?
You can find your Jenkins home directory under "Manage Jenkins" -> "Configure System" -> "Home directory". Find out what the Jenkins home was before you restarted the Mac. It looks like your home directory was either deleted or Jenkins is now pointing to a new folder. Set it back to the earlier folder.
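If the UI no longer shows the old location, searching the disk for job configs can help narrow it down (a sketch; the listed paths are just common defaults for Mac installs and may not match yours):
# check the usual per-user and installer locations, then fall back to a broad search
ls -d ~/.jenkins /Users/Shared/Jenkins/Home 2>/dev/null
sudo find / -name config.xml -path "*jobs*" 2>/dev/null | head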
If it can help, I'm having a similar problem.
The curious part is that after the service restart, the new '.jenkins' directory is inside '/var/root/'.
And now, the password that Jenkins asks me for is not the one from '/Users/username/.jenkins/secrets/initialAdminPassword' but the one from the newest directory with the same path pattern.
Simon
I have renamed my home folder (let's say from userA to userB) and Docker stopped working.
The error is:
Cannot create/resize "/Users/userA/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/Docker.qcow2": exit status 1
Notice that the path it shows is to the old folder name, userA. I have uninstalled and reinstalled Docker CE from the Docker Store (https://store.docker.com/editions/community/docker-ce-desktop-mac) but I still get the same error.
Why is it still using the old folder even after I uninstalled?
You need a clean setup.
WARNING: the below procedure will delete all of your containers and images❗️
Try deleting the following folders (some of them may require sudo privileges); a command sketch follows the list:
~/.docker
~/Library/Containers/com.docker.docker
~/Library/Group\ Containers/group.com.docker
~/Library/Caches/com.docker.docker
~/Library/PrivilegedHelperTools/com.docker.vmnetd
~/Library/Preferences/com.docker.docker.plist
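Something like this should clear them all out (a sketch; double-check each path before running, since this is destructive and removes all local containers and images):
# per-user Docker state; add sudo to any line that fails with a permissions error
rm -rf ~/.docker
rm -rf ~/Library/Containers/com.docker.docker
rm -rf ~/Library/Group\ Containers/group.com.docker
rm -rf ~/Library/Caches/com.docker.docker
sudo rm -rf ~/Library/PrivilegedHelperTools/com.docker.vmnetd
rm -f ~/Library/Preferences/com.docker.docker.plist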
I know this is old and would prefer not to rez it, but this answer came up when searching Google and I found a better solution that doesn't require you to delete anything, which I felt would be useful for anyone who comes across this question.
Assuming the old name is UserA, and the new is UserB
Create a new UserA directory in /Users/, then create all the directories in the "missing" file path, i.e. Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/ (a command sketch follows these steps).
Move Docker.qcow2 from UserB/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/ to the directory created in step 1.
Start Docker.
Open Docker preferences -> go to the Disk menu.
Select "Move Disk Image" and pick the new location to move that Docker file into, i.e. you'll want to move it to UserB/Library/Containers/com.docker.docker/Data/com.docker.driver.amd64-linux/
I have a site that uses envoyer for deployment.
On my site users can save images/avatars/etc. These images are saved to the public path of laravel. /public/uploads/
The problem with this is that when I deploy an update, the files in /public/uploads are not carried over into the new release folder.
Do I have to write a deployment script that copies everything from /public/uploads in the current release INTO the new release every time I push a new deployment?
Don't store them in public/uploads; store them in storage/uploads. Symlink storage/uploads to /public/uploads, and move storage/uploads out of the project completely into its own directory. Then symlink the project's /storage/uploads to that external /storage/uploads, something like this:
ln -s /home/forge/storage/example.com/uploads /home/forge/example.com/storage/uploads
ln -s /home/forge/example.com/storage/uploads /home/forge/example.com/public/uploads
Make sure that the ownership of the new directory located at /home/forge/storage/example.com/uploads matches the ownership of /home/forge/example.com.
This will allow the uploads to persist on the device. Ensure that the symlinks are created as part of the deployment process, and you'll no longer lose your uploads each time you deploy.
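For example, a post-deployment hook could recreate the links against the newly created release (a sketch; RELEASE_DIR is a hypothetical placeholder for whatever path your deployment tool exposes for the new release):
# re-link the shared uploads directory into the fresh release
RELEASE_DIR=/home/forge/example.com   # hypothetical placeholder; substitute the real release path
ln -sfn /home/forge/storage/example.com/uploads "$RELEASE_DIR/storage/uploads"
ln -sfn "$RELEASE_DIR/storage/uploads" "$RELEASE_DIR/public/uploads"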
I've installed Collabnet Subversion Edge, and would like to make sure I have it backed up properly. I would like NOT to use the CloudBackup service offered.
I went to the administration interface for CollabNet (localhost:3343) and then to Repositories > Backup Schedule. There, one can choose between 3 different 'Type of Job' options:
Cloud Services Backup
Full Dump Backup
Hotcopy Backup
None of them lets you choose where to copy the backup. I've tried looking up how this works, but the documentation seems to be lacking a lot.
What is the best way to backup such a repository? Shall I just keep a copy of the entire collabnet folder (c:\csvn)?
The Subversion Edge admin UI lets you specify the folder for backups. It defaults to a folder inside the normal data folder, but you can specify a different value. So, for example, if you have a D:\ drive that you want the backups to go on you can just specify that folder in the settings and the backups will go to that folder.
It does need to be a physically accessible hard drive though.
See the Backup Directory configuration item in this screenshot:
https://ctf.open.collab.net/sf/projects/svnedge/screenshots/screens/config/config.png
You can use Windows Server Backup to back up Subversion repositories. It allows you to schedule backups to a network share, a dedicated backup volume, or writable media. For example, the wbadmin command-line tool allows you to safely back up your repositories. This simple command performs a one-time copy backup of C:\foo\bar to the X: volume:
wbadmin start backup -backupTarget:x: -include:c:\foo\bar -vsscopy
(To install Windows Server Backup, run ocsetup WindowsServerBackup in elevated command-prompt).
You can setup backup in different ways:
wbadmin command-line tool,
PowerShell cmdlets, good for automation and customization of backup actions,
Windows Server Backup wizard (an MMC snap-in, essentially a control panel).
It's not required to stop the server's service when you run the backup, because the FSFS repository backend is always in a consistent state.
Here are general tips about recovering a Subversion repository from a backup:
Recover the repository backup to an empty directory to make sure that the restored repository files won't mix with the files of the broken one. After the repository is recovered, you can delete the broken repository and replace it with the recovered one.
Stop-start cycle your Subversion server after recovering a repository from a backup.
If your clients get errors after the repository recovery, run svnadmin recover against it, as sketched below. The command finishes instantly and makes the repository accessible again.
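A minimal sketch of that last tip (the path is a placeholder for wherever you restored the repository):
# release stale locks and rebuild bookkeeping data, then sanity-check the result
svnadmin recover /path/to/recovered/repository
svnadmin verify /path/to/recovered/repository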
If you have access to the repository directories, then you should be able to use hotcopy directly and specify where the backups go.
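For instance, with svnadmin's built-in hotcopy (a sketch; both paths are placeholders, and the destination should not already contain a repository):
# make a consistent copy of a live repository without stopping the server
svnadmin hotcopy C:/csvn/data/repositories/myrepo D:/backups/myrepo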
It's enough to take a periodic backup of just the csvn/data directory, where all your repositories and configuration files are stored.
Visit this link for backup (and upgrade) options. The contents of the link are included below. Hope it helps.
Manual Upgrade/Reinstallation Steps
Subversion Edge includes an integrated mechanism for installing updates. This is the preferred way to do an upgrade as it handles whatever steps are needed to perform the upgrade and can be done remotely from your web browser. However, there are scenarios where you might want or need to do an upgrade manually, for example your Subversion Edge server might not be able to access the Internet to pull down the updates or maybe one or more critical installation files have become corrupted and you need to reinstall using the same version. Here are the steps for performing a manual upgrade or reinstallation:
Windows
If your existing Subversion Edge installation was installed using the installer from Subversion Edge 2.0.0 or later, then all you need to do to upgrade is download the latest installer and run it. This will uninstall the current version and install the new version (which is how the Windows Installer (.msi) process works for upgrades).
If you are not sure what version you installed with, you can always safely use this approach:
Stop the existing services and uninstall the current version from the Windows Control Panel. This will leave behind your C:\csvn folder and any files in it that have been modified since the original install.
Delete everything in the C:\csvn folder EXCEPT the data folder. So you should be left with just the C:\csvn\data folder.
Install the new version. The installer will pick up the existing data folder, and when the services start it will basically just be an upgrade to the new version.
WARNING: Take note of this reported bug and back up the svn_access_file first:
artf7081 - Using Windows installer for updates can overwrite the svn_access_file
Linux/Solaris
To upgrade a Linux/Solaris installation, this is the safest way to do it:
Stop the servers
$ bin/csvn stop
$ bin/csvn-httpd stop
Rename the csvn folder
$ mv csvn csvn-old
Untar the new release as a non-root user
Move the data folder back into the new release
$ mv csvn-old/data csvn
Important! Copy "dist" configuration files to data folder
$ cp -f csvn/dist/*.dist csvn/data/conf
Start the servers
$ bin/csvn start
$ bin/csvn-httpd start