How do I load files/directories into MobaXterm to start a Tractor (for genetics) operation?

Tractor is a method that adds power to GWAS for admixed individuals. There are online tutorials, but I am struggling to even start running Tractor in MobaXterm. Every time I try to run some of the code I get back "can't open file - whatever file - no such file or directory". I do not know how to set up the files needed to run the operation, and I have tried downloading files and opening them in the MobaXterm app. Do I need to create a whole separate directory? I am pretty new to using Moba and don't really know where to start with Tractor.
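That "no such file or directory" error usually means the shell's current directory simply doesn't contain the file you named. A minimal sketch of the usual workflow in MobaXterm's terminal, where the script and file names in the last step are placeholders, not real Tractor file names:

```shell
# 1. Create a working directory for the analysis and move into it:
mkdir -p ~/tractor_run
cd ~/tractor_run

# 2. Put the tutorial's input files *in this directory* (drag them into
#    MobaXterm's left-hand file panel, or fetch them with wget/scp),
#    then confirm they are really there:
ls -l

# 3. Run each step with a path that ls just confirmed exists, e.g.:
# python <tractor_script>.py --input ./<your_file>.vcf
```

If `ls -l` doesn't show the files you downloaded, they are somewhere else (often your Windows Downloads folder rather than the remote/home directory the terminal is sitting in), and that is what the error is telling you.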

Related

MacOS X: Update software from dmg installer

I have a small piece of software for macOS with a simple dmg installer (open, drag and drop to the Applications folder, you know). My problem is that the software writes a small ini file inside the .app package, and if I update the software this file is lost, because the old package is removed before the new one is written.
My question is whether any of you know an elegant solution for this. The user should be able to save the file in any place, e.g. the desktop, and the ini file should be moved into the new package. I don't want to save this file outside the .app package, because that would leave private data on the computer if the user just removes the package.
Thanks in advance!
Saving data inside the application bundle is bad practice;
please implement another solution in a future release.
To solve the current problem, I can think of two solutions:
Add two files to the .dmg file
The new application.
A backup tool: a simple AppleScript to back up the file would do the job.
Make sure to notify the user to run the backup before replacing the application.
The user might, however, forget to run the backup and lose data.
Create an installer
Another option would be to write an installer using PackageMaker.
PackageMaker provides options to run scripts before updating the application.
Add a pre-installation script that backs up the data.
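A minimal sketch of such a pre-installation step, assuming the ini lives under Contents/Resources; the bundle name and paths are examples, not your real ones:

```shell
#!/bin/sh
# Hypothetical preinstall helper: stash the ini from the old bundle
# before the installer replaces it.

backup_ini() {
    ini="$1"     # e.g. /Applications/MyApp.app/Contents/Resources/settings.ini
    stash="$2"   # e.g. /tmp/myapp-settings.ini
    if [ -f "$ini" ]; then
        cp "$ini" "$stash"
    fi
}

# A matching postinstall script would copy the stash back into the newly
# installed bundle and then delete it:
#   cp /tmp/myapp-settings.ini /Applications/MyApp.app/Contents/Resources/
#   rm /tmp/myapp-settings.ini
```

The pre/post split matters: the backup must run before PackageMaker removes the old .app, and the restore after the new one is in place.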

How to randomly change GDM background image on Linuxmint [Ubuntu]

I was reading the thread "ubuntu/linux bash: traverse directory and subdirectories to work with files" and I thought maybe it could be twisted a little bit.
Can this be set to:
be given a base folder
scan folder + subfolder
collect all files it finds (only images)
pick one randomly
write a symbolic link into the /usr/share/backgrounds directory (writing the image itself over the existing one may work as well)
What I intend is to execute the script on system shutdown, or at a set interval, so it will change the GDM background image.
This is based on a way to do it manually with this line:
sudo ln -s /usr/share/applications/gnome-appearance-properties.desktop /usr/share/gdm/autostart/LoginWindow/
which prompts for the appearance dialog on startup, which writes the link.
Ideally, it would have a GUI to do it at will, and an option to "change it automagically upon restart" which would do the process I described above and add itself to the system start, reboot or shutdown sequence.
Since there's no working utility for this at the moment, it might come in handy for some people =)
thanks for your help.
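The steps above can be sketched as a small shell script; the image extensions and the target path are assumptions, so adjust the link target to wherever your GDM theme actually reads the background from:

```shell
#!/bin/sh
# Pick a random image under a base folder and (re)point a symlink at it.

pick_random_image() {
    base="$1"   # folder to scan, including subfolders
    link="$2"   # symlink to create, e.g. /usr/share/backgrounds/gdm-random.jpg

    # Collect image files recursively and pick one at random.
    img=$(find "$base" -type f \
            \( -iname '*.jpg' -o -iname '*.jpeg' -o -iname '*.png' \) \
          | shuf -n 1)
    [ -n "$img" ] || return 1   # no images found

    # -sfn replaces any existing link in place.
    ln -sfn "$img" "$link"
}

# Example (system paths need sudo):
# pick_random_image "$HOME/Pictures" /usr/share/backgrounds/gdm-random.jpg
```

Hooking the call into a cron entry or a shutdown script then gives the "change on restart / at set interval" behaviour described above.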
Use Wallpapoz. It can change wallpapers randomly across workspaces and over time.

How do I schedule for all the files of one folder to be moved to another server?

I did a quick search and couldn't see anything relevant, which I found strange, as this seems like it'd be a common question. Maybe I'm just going about it the wrong way, or being thick? Who knows.
Anyway, I am trying to set up a scheduled task that moves all the files in a folder on Server A to a folder on Server B. If this were a simple matter of copying them it would be fine, as I've already got that working using Core FTP and a batch file, but I'd like them to be removed from Server A after the copy has taken place.
I was looking at the Windows FTP commands, but although I managed to log onto Server A successfully from Server B, whenever I tried to run a command it just took a very long time and then disconnected.
Any help with this would be appreciated. It needs to be schedulable, but it doesn't matter whether it is a .bat, .vbs or anything else that I haven't thought of.
Thanks,
Harry
You could use www.Dropbox.com.
Why? For stability: any home-brew FTP script that moves files is prone to an undetected error in transmission, resulting in deleted files.
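If you do want a scheduled script despite that risk, one common sketch is a .bat on Server A that feeds a command file to ftp.exe. The host, credentials and paths below are placeholders, and note that ftp.exe does not set a useful error level, so verify transfers before enabling the delete step (exactly the pitfall described above):

```batch
@echo off
rem Runs on Server A: push everything in C:\outgoing to Server B, then
rem (optionally) remove the local copies. All names are placeholders.

rem Build the ftp command file on the fly.
> ftpcmds.txt (
  echo open serverB.example.com
  echo user myuser mypassword
  echo binary
  echo prompt
  echo lcd C:\outgoing
  echo cd /incoming
  echo mput *.*
  echo quit
)

ftp -n -s:ftpcmds.txt

rem ftp.exe gives no reliable ERRORLEVEL, so confirm the upload
rem succeeded before enabling the deletion step:
rem del /q C:\outgoing\*.*
```

If both servers are on the same Windows network, `robocopy \\ServerA\share \\ServerB\share /MOV` copies and then deletes the source files in one step, which is considerably safer than scripting FTP.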

How come the unix locate command still shows files/folders that aren't there any more?

I recently moved my whole local web development area over to using MacPorts stuff, rather than using MAMP on my Mac. I've been getting into Python/Django and didn't really need MAMP any more.
Thing is, I have uninstalled MAMP from the Applications folder, together with the preferences file, so how come when I run 'locate MAMP' in the Terminal it still shows all my /Applications/MAMP/ stuff as if it were all still there? And when I 'cd' into /Applications/MAMP/, it doesn't exist?
Is it something to do with locate being a kind of index-based search system, so that these old file paths are cached? Please explain why, and how to fix it so they don't show up anymore.
You've got the right idea: locate uses a database called 'locatedb'. It's normally updated by system cron jobs (not sure which on OS X); you can force an update with the updatedb command. See http://linux-sxs.org/utilities/updatedb.html among others.
Also, if you don't find files which you expect to, note this important caveat from the BUGS section of OSX' locate(1) man-page:
The locate database is typically built by user ''nobody'' and the
locate.updatedb(8) utility skips directories which are not readable
for user ''nobody'', group ''nobody'', or world. For example, if your
HOME directory is not world-readable, none of your files are in the database.
The other answers are correct about needing to update the locate database. I've got this alias to update my locate DB:
alias update_locate='sudo /usr/libexec/locate.updatedb'
I actually don't use locate all that much anymore now that I've found mdfind. It uses the spotlight file index which OSX is much better at keeping up to date compared to the locatedb. It also has quite a bit more power in what it can search from the command line.
Indeed, the locate command searches through an index; that's why it's pretty fast. The index is generated by the updatedb command, which is usually run as a nightly or weekly job.
So to update it manually, just run updatedb.
According to the man page, its database is updated once a week:
NAME
locate.updatedb -- update locate database
SYNOPSIS
/usr/libexec/locate.updatedb
DESCRIPTION
The locate.updatedb utility updates the database used by locate(1). It is typically run once a week by
the /etc/periodic/weekly/310.locate script.
Take a look at the locate man page
http://unixhelp.ed.ac.uk/CGI/man-cgi?locate+1
You'll see that locate searches a database, not your actual filesystem.
You can update that database by using the updatedb command.
Also, since it's a database, unless you update it regularly, locate won't find files that are in your filesystem but aren't in the database.

VSS Analyze - Access to file [filename] is denied

Our VSS database appears to be horribly out of shape. I've been trying to archive and run "analyze" and keep getting "Access to file [filename] is denied. The file may be read-only, may be in use, or you may not have permission to write to the file. Correct this problem and run analyze again." No one is logged into SourceSafe (including myself) and I'm running the analyze utility from the VS command prompt as follows:
analyze -v -f -bbackuppath databasepath
I get similar errors if I try and create project archives from the ssadmin tool.
The database is on a network share, and we're running VSS 2005 v8.0.50727.42. I'd love to be able to do this, as it would be a first step in a move away from VSS.
Thanks in advance.
More Info
Every time I run analyze, the file that triggers the access-denied message changes. It's almost as if running analyze unlocks that file, so that the next run gets through to the next one.
I had this issue with our VSS database as well the last time we tried to analyze and repair it.
We did a few things to get it working.
We turned off the network share; apparently we still had users accessing the share that we couldn't see. This helped most of the time.
Otherwise, we copied the repository locally and then ran analyze on it from there.
Neither solution is ideal, but we were in a critical situation and it was the only way we got it to work.
