Recovering a lost/dropped session in R? - session

Is it possible to recover an R session if R had to close due to an error? Is there a temp directory where sessions/data sit before they are saved to a file?
What I'd really need is the history of input commands.
(running R version 2.12.1 on WinXP)
Based on the answer to my question below, I still have the following question:
Where is R saving .Rhistory to?
I am not able to find this file, despite starting a new R session and still having access to commands from a previous session. The R reference page on savehistory did not resolve this for me. It seems that R is storing this data in some temp file/folder that I don't have access to. When I run tempdir(), I get the location of the temp directory, but not a single file is stored there, whether during an active session or after it closes.

Look at the working directory where you were working when R crashed and look for a file named ".Rhistory". I don't know if every command will be there, but I'm sure that at least the last 50 are.
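If you would rather not hunt for the default location, you can ask R where it will write the history and save it to an explicit file yourself. A minimal sketch (the path is just an example; savehistory/loadhistory work in interactive sessions, and their behaviour depends on the front end, e.g. Rgui vs. Rterm):
Sys.getenv("R_HISTFILE")                          # where R will write the history; empty means the default ".Rhistory"
savehistory(file = "C:/Temp/mysession.Rhistory")  # save the current session's commands explicitly
loadhistory(file = "C:/Temp/mysession.Rhistory")  # load them back into a later session
history(max.show = 50)                            # or just inspect the last commands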
HTH

Related

Git error: unable to create temporary file & error building trees, with no further info?

I'm trying to commit new changes to my repo, but it fails with the following errors:
error: unable to create temporary file: Invalid argument
error: unable to create temporary file: Invalid argument
error: Error building trees
When others faced similar issues, the error pointed to a specific file/object which was causing the problem; here, however, there is no additional info to go on.
(Windows 10, tried through Terminal and cmd)
I found the issue - the project folder was being synced with OneDrive, which corrupted ("The tag present in the reparse point buffer is invalid") one of the files in the objects folder inside .git. Figuring out where the issue lies can be done visually (for me it was the only folder showing blue OneDrive sync arrows instead of the green tick, and the folder couldn't be opened). Another way is to clone the problematic branch into a new folder, make a small change (I created a test.txt with "test" written inside), push it, then go back to the problematic project folder and try to pull the new changes. That pull failed and pointed to the corrupted object.
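As a rough sketch of that second approach on Windows (the repository URL and branch name are placeholders for your own):
git clone --branch my-branch https://example.com/my-repo.git repo-clean
cd repo-clean
echo test > test.txt
git add test.txt
git commit -m "test"
git push
cd ..\my-repo
git pull
The final pull, run back in the problematic working copy, is the step that fails and names the corrupted object.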
To solve this I looked into how to deal with files corrupted by OneDrive, and the most common suggestion was to run chkdsk c: /r /f in Command Prompt (as admin); the check then runs when the computer is rebooted. Beware, it can take several hours to complete.
General Windows Solution (tested on Windows 10 and 11):
I was getting the error because it appears that Windows had put the folder into read-only mode.
In the folder's Properties, deselect Read-only, then click Apply and OK.
Git really ought to print the path here, yes. The error seems to come from object-file.c:write_loose_object(), which is going to be creating a temporary file within .git/objects in the normal case.
The EINVAL error case should never occur, as the object path itself is necessarily valid (otherwise how did it come into existence?) and the generated temporary file name uses entirely "safe" characters. So this error message, with its missing filename.buf, should never come out in the first place. In your case the error does come out, and the missing %s for filename.buf means we can't see what path it is that your OS is objecting to.
You could build a debug or private version of Git (that adds the missing path) to find out, or use whatever system-call tracing facilities you have to observe the failing system call some other way. Find out why the OS is rejecting the attempt to create the loose object temporary file, and correct whatever the underlying problem may be. Meanwhile you might want to report this to the Git mailing list.

How to restore lost data in RStudio

I have a serious problem with RStudio after a crash. The R file I was working on has been deleted. I'd like to ask anyone who has had a similar problem: is there any possibility of restoring that file? Basically, RStudio saved the file as 0 bytes, although I had been working on it for several days. Unfortunately I didn't make any backup.
My question is: is there any possibility that RStudio or Windows 7 keeps R files in a cache?
If it is a serious bug (it happened for the first time), where should I report it?
EDIT
The .Rhistory file has been overwritten as well (also a 0-byte file). The problem was probably caused by having 0 bytes of free space on the disk.
Please give any suggestions.
If you have been running your R file, the commands should have been recorded in the .Rhistory file.
Please have a look at the link below. Basically, the file stores all the commands that you've been running.
https://support.rstudio.com/hc/en-us/articles/200526217-Command-History
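If a history file did survive somewhere (for example in another project's working directory), a small sketch of pulling it back and writing it out as a plain text file you can edit back into a script (the file names are examples):
loadhistory("C:/some/other/project/.Rhistory")
savehistory("recovered_commands.R")
savehistory just writes the commands as plain text, so the resulting file can be opened and trimmed down by hand.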

Why does Isolated storage not delete the last two temporary files when exiting on a Windows Phone 7?

I have written an application that uses isolated storage to store data, which I want to clear out periodically when it gets old. I have written a function, called from the Closing event, that checks the isolated storage for old data and deletes it.
This routine will delete every file that it is supposed to, except the last two files in the directory. When I debug the code I can see it execute the DeleteFile method on those files. I even went as far as checking, right after the call to DeleteFile, whether the file still exists. According to the debugger it does not.
Yet when the application starts up again, the old data for those last files is still in isolated storage. Thinking that it may be a race condition, I put a Thread.Sleep(1000) after the delete routines.
The phone does not honor this delay and exits immediately after executing the delete code. I could not find a flush command related to DeleteFile, as I don't have a reference to a stream at that point.
Has anyone else found this or something similar? Is there a magic flush method I am missing or is this a defect in the phone IsolatedStorage implementation?
I agree with Matt and Matthieu.
Though I'd also like to ask: have you tried truncating the file?
IsolatedStorageFileStream isfStream = new IsolatedStorageFileStream(strXMLFile, FileMode.Truncate, isf);
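A slightly fuller sketch of that idea (strXMLFile is assumed to be your file name from the surrounding code; wrapping both the store and the stream in using blocks ensures they are closed before the app exits):
// needs System.IO and System.IO.IsolatedStorage
using (IsolatedStorageFile isf = IsolatedStorageFile.GetUserStoreForApplication())
using (var isfStream = new IsolatedStorageFileStream(strXMLFile, FileMode.Truncate, isf))
{
    // FileMode.Truncate discards the existing contents instead of deleting the file;
    // disposing the stream and the store before exit commits the change
}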

Where to store an application log file on Windows

Where would be the best "standard" place to put an application's debug log file in a Windows user environment?
In this particular case, it is an application that is run once and could go wrong. It will be run by system-administrator types who may need to inspect the log after the application has run. Every time the application is run, a new log file is created.
Options that have been floated so far include:
The program directory
The user's desktop
The user's local Application Data directory.
I have my favourite, but I wondered what the SO consensus was.
Note: this is similar to this question, but we're dealing with an application that's only likely to be run once by one user.
The Application Data directory would seem to be the perfect place, but it's an area that is nearly invisible. You need to give your users an easy way to get to it.
Have your installation script create a Log folder in the Application Data area for your program, and include a link to the folder in your Start menu.
In the organization I work for, we use the (%TEMP% or %TMP%)\CompanyOrProductName\Logs directory.
Using %APPDATA% may be problematic with roaming profiles if the logs are numerous or huge: it slows down the login process...
1. The program directory <- not good. Ideally you will only have RX permissions on this folder.
2. The user's desktop <- technically it can be done, but I don't like the idea. Polluting the desktop... I, as a user, don't like it.
3. The user's local Application Data directory. <- better
My preference is a subdirectory under the program directory (with a clear name like "DebugLog" or something similar). Permissions on that subdirectory should allow creating and writing files ("Change" will be fine).
The "standard" place for the log would be the AppData directory. However, really its up to you where you want to store them. As they are administrator (power users) then there should be no problems storing the logs in the same directory as the application being run. Even in the MyDocuments of the user would be a good shout.
If you EXPECT something to go wrong, put it in the user's local Application Data directory.
If you don't, and just want to log anyway, I might think about using the temp directory. The reasoning for this is simple: if the application is only run once, you would otherwise leave trash in the Application Data directory that nobody will ever need again. In the temp directory you at least have the CHANCE that it will be cleaned up later.
BTW: IMHO the best would be to not create the log AS A FILE at all (log to memory) until something goes wrong. Then you can still offer a dialog where the user selects where to save the log.
Windows Temp Folder
Assuming you want to keep log files around for a significant amount of time and they are intended to be read, I would put the log file in a sub-folder of the user's local application data folder, accessible from Windows Explorer by typing %localappdata%.
If they are temporary log files, only to be used in the event of system diagnostics, then you should put them in the temporary folder, accessible from Windows Explorer by typing %temp%.
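For illustration, a minimal C# sketch of creating a per-run log file under the user's local application data folder (the company and product folder names are just examples):
using System;
using System.IO;

class LogFileLocation
{
    static void Main()
    {
        // %LOCALAPPDATA%, e.g. C:\Users\<user>\AppData\Local
        string baseDir = Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData);

        // one sub-folder per company/product, one log file per run
        string logDir = Path.Combine(baseDir, "MyCompany", "MyTool", "Logs");
        Directory.CreateDirectory(logDir);   // does nothing if it already exists

        string logFile = Path.Combine(logDir, "run-" + DateTime.Now.ToString("yyyyMMdd-HHmmss") + ".log");
        File.WriteAllText(logFile, "Application started." + Environment.NewLine);

        Console.WriteLine("Logging to " + logFile);
    }
}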

How come the unix locate command still shows files/folders that aren't there any more?

I recently moved my whole local web development area over to using MacPorts stuff, rather than using MAMP on my Mac. I've been getting into Python/Django and didn't really need MAMP any more.
Thing is, I have uninstalled MAMP from the Applications folder, with the preferences file too, but how come when I run the 'locate MAMP' command in the Terminal it still shows all my /Applications/MAMP/ stuff as if it's all still there? And when I 'cd' into /Applications/MAMP/ it doesn't exist?
Is it something to do with locate being a kind of index-searching system, so these old file paths are cached? Please explain why this happens, and how to sort it out so they don't show up any more.
You've got the right idea: locate uses a database called 'locatedb'. It's normally updated by system cron jobs (not sure which on OS X); you can force an update with the updatedb command. See http://linux-sxs.org/utilities/updatedb.html among others.
Also, if you don't find files which you expect to, note this important caveat from the BUGS section of OS X's locate(1) man page:
The locate database is typically built by user ''nobody'' and the
locate.updatedb(8) utility skips directories which are not readable
for user ''nobody'', group ''nobody'', or world. For example, if your
HOME directory is not world-readable, none of your files are in the database.
The other answers are correct about needing to update the locate database. I've got this alias to update my locate DB:
alias update_locate='sudo /usr/libexec/locate.updatedb'
I actually don't use locate all that much anymore now that I've found mdfind. It uses the spotlight file index which OSX is much better at keeping up to date compared to the locatedb. It also has quite a bit more power in what it can search from the command line.
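For example, to search by name with Spotlight instead of locate (assuming Spotlight indexes the location in question):
mdfind -name MAMP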
Indeed, the locate command searches through an index; that's why it's pretty fast. The index is generated by the updatedb command, which is usually run as a nightly or weekly job.
So to update it manually, just run updatedb.
According to the man page, its database is updated once a week:
NAME
     locate.updatedb -- update locate database
SYNOPSIS
     /usr/libexec/locate.updatedb
DESCRIPTION
     The locate.updatedb utility updates the database used by locate(1). It is typically run once a week by the /etc/periodic/weekly/310.locate script.
Take a look at the locate man page
http://unixhelp.ed.ac.uk/CGI/man-cgi?locate+1
You'll see that locate searches a database, not your actual filesystem.
You can update that database by using the updatedb command.
Also, since it's a database, unless you update it regularly, locate won't find files that are in your filesystem but aren't in the database.
