Ruby (Errno::EACCES) on File.delete - ruby

I am trying to delete some XML files after I have finished using them and one of them is giving me this error:
'delete': Permission denied - monthly-builds.xml (Errno::EACCES)
Ruby is claiming that the file is write protected but I set the permissions before I try to delete it.
This is what I am trying to do:
#collect the xml files from the current directory
filenames = Dir.glob("*.xml")
#do stuff to the XML files
finalXML = process_xml_files( filenames )
#clean up directory
filenames.each do |filename|
  File.chmod(777, filename) # Full permissions
  File.delete(filename)
end
Any ideas?

This:
File.chmod(777, filename)
doesn't do what you think it does. From the fine manual:
Changes permission bits on the named file(s) to the bit pattern represented by mode_int.
Emphasis mine. File modes are generally specified in octal as that nicely separates the bits into the three Unix permission groups (owner, group, other):
File.chmod(0777, filename)
So you're not actually setting the file to full access; you're setting the permission bits to 01411, which comes out like this:
-r----x--t
rather than the
-rwxrwxrwx
that you're expecting. Notice that your (decimal) 777 permission bitmap has removed write permission.
Also, deleting a file requires write access to the directory that the file is in (on Unixish systems at least) so check the permissions on the directory.
And finally, you might want to check the return value from File.chmod:
[...] Returns the number of files processed.
Just because you call it doesn't mean that it succeeded.
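Putting those pieces together, a minimal sketch of the cleanup loop (octal mode, a check on chmod's return value, and a check on the containing directory) might look like this:
filenames = Dir.glob("*.xml")
filenames.each do |filename|
  # 0777 (octal) is full access; decimal 777 is octal 01411, which drops write permission
  changed = File.chmod(0777, filename)
  warn "chmod did not apply to #{filename}" if changed.zero?
  # deleting a file also needs write access to the directory that contains it
  unless File.writable?(File.dirname(File.expand_path(filename)))
    warn "directory for #{filename} is not writable; delete will likely fail"
  end
  File.delete(filename)
end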

You may not have access to run chmod. You must own the file to change its permissions.
The file may also be locked by nature of being open in another application. If you're viewing the file in, say, a text editor, you might not be able to delete it.

In my case it was because the path I had been trying to delete (a Kindle .sdr record) was a directory, not a file. I needed to use this instead:
FileUtils.rm_rf(dirname)
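If you aren't sure whether a given path is a file or a directory, a small hedged check along these lines (path is a placeholder) handles both cases:
require 'fileutils'

if File.directory?(path)
  FileUtils.rm_rf(path)  # removes the directory and everything inside it
else
  File.delete(path)      # plain file
end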

Related

Copying files and getting Errno::EACCES: Permission denied # io_fread

I am trying to copy some files and calculate their hashes, but for some files I get Errno::EACCES: Permission denied # io_fread
source = <file>
dir = <target dir>
sha_t = Digest::SHA256.file(source).hexdigest # <- Error
FileUtils.cp(source, dir)
So apparently I have no right to read the file. I thought I could just check whether I can read it with:
File.readable?(source)
but this returns true.
How else do I check whether I can read a file? I don't want to use begin ... rescue
UPDATE:
I am using Windows and Ruby 2.1.3p242
I do not want to chmod, but just skip the file if it cannot be read.
First, it would be really helpful if you described the OS and filesystem you were using. For now, I'll assume you're using *NIX with an ext* filesystem.
A good way of checking files is using the built-in Pathname class, which works on both directories and files.
require 'pathname'
p = Pathname.new('/mypath/myfile.rb')
p.writable?
p.readable?
I suspect that in your case the problem can be solved by setting the right permission on the destination before copying to it:
Pathname.new(dir).chmod 0755
FileUtils.cp(source, dir)
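Since the update says this is Windows, where File.readable? (and therefore Pathname#readable?) may not reflect ACLs or sharing locks, a rescue is hard to avoid entirely. Here is a sketch that checks first and only falls back to rescuing Errno::EACCES (copy_with_hash is just an illustrative name, not part of the original code):
require 'digest'
require 'fileutils'
require 'pathname'

def copy_with_hash(source, dir)
  p = Pathname.new(source)
  return nil unless p.file? && p.readable?  # cheap pre-check, may still pass on Windows

  sha = Digest::SHA256.file(source).hexdigest
  FileUtils.cp(source, dir)
  sha
rescue Errno::EACCES
  warn "skipping unreadable file: #{source}"  # the pre-check lied; skip this file
  nil
end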

sql loader without .dat extension

Oracle's sqlldr defaults to a .dat extension, which I want to override. I don't want to rename the file. When I googled this, I found a few answers suggesting a trailing dot, like data='fileName.', but that is not working. Please share your ideas.
The error message is that fileName.dat is not found.
SQL*Loader has a default extension for each of its input files:
data = .dat
log = .log
control = .ctl
bad = .bad
PARFILE = .par
But you have to pass the file name without quotes and without the dot:
sqlldr user/password@db control=control data=data
SQL*Loader will add the extensions itself: control.ctl and data.dat.
Nevertheless, I do not understand why you do not want to specify the extension.
You can't, at least in Unix/Linux environments. In Windows you can use the trailing period trick, specifying either INFILE 'filename.' in the control file or DATA=filename. on the command line. Windows file name handling allows that; you can for instance do DIR filename. at a command prompt and it will list the file with no extension (as will DIR filename). But you can't do that with *nix, from a shell prompt or anywhere else.
You said you don't want to copy or rename the file. Temporarily renaming it might be the simplest solution, but as you may have a reason not to do that even briefly, you could instead create a hard or soft link that does have an extension and use that link as the target. You could wrap that in a shell script that takes the file name as an argument:
# set variable from correct positional parameter; if you pass in the control
# file name or other options, this might not be $1 so adjust as needed
# if the temporary file won't be in the same directory, this needs to be the full path
filename=$1
# optionally check file exists, is readable, etc. but overkill for demo
# can also check temporary file does not already exist - stop or remove
# create soft link somewhere it won't impact any other processes
ln -s ${filename} /tmp/${filename##*/}.dat
# run SQL*Loader with soft link as target
sqlldr user/password@db control=file.ctl data=/tmp/${filename##*/}.dat
# clean up
rm -f /tmp/${filename##*/}.dat
You can then call that as:
./scriptfile.sh /path/to/filename
If you can create the link in the same directory then you only need to pass the file, but if it's somewhere else - which may be necessary depending on why renaming isn't an option, and desirable either way - then you need to pass the full path of the data file so the link works. (If the temporary file will be in the same filesystem you could use a hard link, and you wouldn't have to pass the full path then either, but it's still cleaner to do so.)
As you haven't shown your current command line options you may have to adjust that to take into account anything else you currently specify there rather than in the control file, particularly which positional argument is actually the data file path.
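If the caller is a Ruby script rather than a shell script, the same link-and-load trick can be done there; a rough sketch, assuming sqlldr is on the PATH and that the credentials, control file name, and /tmp location are placeholders to adjust:
require 'fileutils'

def load_without_extension(datafile)
  link = File.join('/tmp', "#{File.basename(datafile)}.dat")
  File.symlink(File.expand_path(datafile), link)  # soft link that does have a .dat extension
  system('sqlldr', 'user/password@db', 'control=file.ctl', "data=#{link}")
ensure
  FileUtils.rm_f(link)  # clean up the link even if sqlldr fails
end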
I have the same issue. I get a monthly download of reference data used in a medical application, and the 485 downloaded files don't have file extensions (#2gb). Unless I can load without file extensions, I have to copy the files with a .dat extension and load from there.

How to find text file in same directory

I am trying to read a list of baby names from the year 1880 in CSV format. My program, when run in the terminal on OS X, returns an error indicating that yob1880.txt doesn't exist.
No such file or directory # rb_sysopen - /names/yob1880.txt (Errno::ENOENT)
from names.rb:2:in `<main>'
The location of both the script and the text file is /Users/*****/names.
lines = []
File.expand_path('../yob1880.txt', __FILE__)
IO.foreach('../yob1880.txt') do |line|
  lines << line
  if lines.size >= 1000
    lines = FasterCSV.parse(lines.join) rescue next
    store lines
    lines = []
  end
end
store lines
If you're running the script from the /Users/*****/names directory, and the files also exist there, you should simply remove the "../" from your pathnames to prevent looking in /Users/***** for the files.
Use this approach to referencing your files, instead:
File.expand_path('yob1880.txt', __FILE__)
IO.foreach('yob1880.txt') do |line|
Note that the File.expand_path is doing nothing at the moment, as the return value is not captured or used for any purpose; it simply consumes resources when it executes. Depending on your actual intent, it could realistically be removed.
Going deeper on this topic, it may be better for the script to be explicit about the directory in which it locates files. Consider these approaches:
Change to the directory in which the script exists, prior to opening files
Dir.chdir(File.dirname(File.expand_path(__FILE__)))
IO.foreach('yob1880.txt') do |line|
This explicitly requires that the script and the data be stored relative to one another; in this case, they would be stored in the same directory.
Provide a specific path to the files
# do not use Dir.chdir or File.expand_path
IO.foreach('/Users/****/yob1880.txt') do |line|
This can work if the script is used in a small, contained environment, such as your own machine, but will be brittle if the data is moved to another directory or to another machine. Generally, this approach is not useful, except for short-lived scripts for personal use.
Never put a script using this approach into production use.
Work only with files in the current directory
# do not use Dir.chdir or File.expand_path
IO.foreach('yob1880.txt') do |line|
This will work if you run the script from the directory in which the data exists, but will fail if run from another directory. This approach typically works better when the script detects the contents of the directory, rather than requiring certain files to already exist there.
Many Linux/Unix utilities, such as cat and grep, use this approach if the command-line options do not override such behavior.
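For instance, a small sketch that detects whichever yob*.txt files happen to be in the current directory instead of hard-coding one name (the per-line processing is left as a placeholder):
Dir.glob('yob*.txt').sort.each do |name|
  IO.foreach(name) do |line|
    # process each line of each year's file here
  end
end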
Accept a command-line option to find data files
require 'optparse'

base_directory = "."

OptionParser.new do |opts|
  opts.banner = "Usage: example.rb [options]"
  opts.on('-d', '--dir NAME', 'Directory name') { |v| base_directory = File.expand_path(v) }
end.parse!

IO.foreach(File.join(base_directory, 'yob1880.txt')) do |line|
  # do lines
end
This will give your script a -d or --dir option in which to specify the directory in which to find files.
Use a configuration file to find data files
This code would allow you to use a YAML configuration file to define where the files are located:
require 'yaml'

config_filename = File.expand_path("~/yob/config.yml")
config = YAML.load_file(config_filename)
base_directory = config["base"]

IO.foreach(File.join(base_directory, 'yob1880.txt')) do |line|
  # do lines
end
This doesn't include any error handling related to finding and loading the config file, but it gets the point across. For additional information on using a YAML config file with error handling, see my answer on Asking user for information, and never having to ask again.
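As a sketch of what that error handling could look like, falling back to the script's own directory when the config file or the "base" key is missing (the fallback choice is an assumption, not part of the original answer):
require 'yaml'

config_filename = File.expand_path("~/yob/config.yml")
base_directory =
  begin
    YAML.load_file(config_filename).fetch("base")
  rescue Errno::ENOENT, KeyError
    # no config file, or no "base" key: fall back to the script's directory
    File.dirname(File.expand_path(__FILE__))
  end

IO.foreach(File.join(base_directory, 'yob1880.txt')) do |line|
  # do lines
end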
Final thoughts
You have the tools to establish ways to locate your data files. You can even mix-and-match solutions for a more sophisticated solution. For instance, you could default to the current directory (or the script directory) when no config file exists, and allow the command-line option to manually override the directory, when necessary.
Here's a technique I always use when I want to normalize the current working directory for my scripts. This is a good idea because in most cases you code your script and place the supporting files in the same folder, or in a sub-folder of the main script.
This resets the current working directory to the folder where the script itself is located. After that it's much easier to figure out the paths to everything:
# Reset working directory to same folder as current script file
Dir.chdir(File.dirname(File.expand_path(__FILE__)))
After that you can open your data file with just:
IO.foreach('yob1880.txt')

How to recursively change file permissions only in a ruby script

I have been using FileUtils.chmod_R to recursively change file and directory permissions under a given path, but now I want to change only the file permissions and leave the directories as they are. Looking at the documentation for this function I can't see how to do this, and I would prefer not to do it with a bash script. Can someone tell me whether this is possible with FileUtils.chmod_R, or would I have to write additional code to iterate over every file that exists under a given path (recursively) and then FileUtils.chmod it to the desired permission? I am a Ruby newbie, so please point me somewhere if I am asking anything obvious.
You could do something like the following, which changes the permissions of only the regular files matched by Dir.glob (directories are filtered out):
FileUtils.chmod 0400, Dir.glob('/path/to/dir/**/*').select { |path| File.file?(path) }
As mentioned in this thread,
Dir.glob("**/*/") # will return a list of all directories
Dir.glob("**/*") # will return a list of all entries (files and directories), hence the File.file? filter
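An equivalent sketch using the standard library's Find module, which walks the tree and lets you skip anything that isn't a regular file:
require 'find'
require 'fileutils'

Find.find('/path/to/dir') do |path|
  FileUtils.chmod 0400, path if File.file?(path)  # directories are left untouched
end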

Deleted files not being removed from HDD in Ruby

I'm experiencing a weird situation with deleting files in Ruby: the code seems to report correctly, but the physical file isn't removed from my hard drive. I can do rm path/to/file on the command line - that works. I can even open up the Rails console and File.safe_unlink the file and that also works; it's just within my Rails app that it fails to delete the actual file:
def destroy
  Rails.logger.debug local_path #=> /Users/ryan/.../public/system/.../file.jpg
  Rails.logger.debug File.exist?(local_path) #=> true
  File.safe_unlink(local_path)
  Rails.logger.debug File.exist?(local_path) #=> false
  # yet the physical file actually STILL exists!
end
The physical file is within a Git repo (the repo is stored within /public/system/); are there any gotchas with that? I've tried using the ruby-git gem to delete the file using the rm command it provides, but that doesn't delete the file either.
I've opened up all the permissions on the files during testing and still nothing works. I've also confirmed this with File.writable?(local_path) which returned true.
Any thoughts why it could be preventing the file from being removed?
Have you checked the permissions on the directory? Deletion is a directory write operation, not a file write operation. (rm will also check file permissions and ask if you really want to do it, as a courtesy, if the file is write-protected; but if the directory isn't writable, it flat out refuses.)
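A quick way to check that from the Rails console (a sketch; local_path is the same path as in the question):
dir = File.dirname(local_path)
File.writable?(dir)                          # must be true for the unlink to stick
format('%o', File.stat(dir).mode & 0o7777)   # e.g. "755": the directory's permission bits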
