I created a simple package using a table. Because the package has many procedures, I split the code into several text files; each file passes Maple's syntax checking on its own.
I want to create a master file and "include" all of the code files into the master file before compiling.
restart;
HINHHOC9 := table():
$include <"D:/CaoHoc/LuanVan/Code/workspace/LuanVan/01_DocDuLieu.maple"> ;
url := currentdir();
save HINHHOC9, cat(url, "/HINHHOC9.m");
libname := libname, url;
with(HINHHOC9);
But compiling the master file gives an error:
> restart;
> HINHHOC9 := table():
>
on line 5 of "D:/CaoHoc/LuanVan/Code/workspace/LuanVan/00_MasterFile.maple",
syntax error, cannot open $include file:
"D:\CaoHoc\LuanVan\Code\workspace\LuanVan\01_DocDuLieu.maple":
$include <"D:/CaoHoc/LuanVan/Code/workspace/LuanVan/01_DocDuLieu.maple"> ;
^
> quit
memory used=0.6MB, alloc=6.3MB, time=0.14
Please help me find the right way to include and compile the code.
Thanks
You have both quotation marks and angle brackets around the file name. You need one, but not both, of those.
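For example, keeping just the quotation marks, the directive would read as follows (a minimal sketch of one possible fix; angle brackets without the quotes should also work):
$include "D:/CaoHoc/LuanVan/Code/workspace/LuanVan/01_DocDuLieu.maple"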
We can specify both flag and perm in os.OpenFile.
They have very similar options, O_APPEND and ModeAppend. What's the difference between them?
f, _ := os.OpenFile("access.log", os.O_APPEND|os.O_CREATE, os.ModeAppend|0644)
The flag argument specifies the flags used in the system call that opens the file, while perm sets the file mode on the file. The file mode includes the permissions and the type of file, e.g. symlink, directory, etc.
os.O_APPEND tells the underlying OS that all write calls on that file handle should append to the file, so you don't need to set the offset to write at the correct position in the file.
ModeAppend sets the file mode to append-only. This means that the file can only be modified by appending to it, not by rewriting its contents. The specifics depend on the OS and file system you are using. I believe Plan 9 implements it by ignoring the offset on any write call to the file and always appending, while on Linux it means the file can only be opened for writing in append mode. I think that on most Linux distros you need to be root to set the append file mode.
In 99.99% of cases you just want to use perm to set the file permissions (rwx). In your case, if you want to open a file and append to it, you should use:
// os.O_WRONLY tells the computer you are only going to write to the file, not read
// os.O_CREATE tells the computer to create the file if it doesn't exist
// os.O_APPEND tells the computer to append to the end of the file instead of overwriting or truncating it
f, err := os.OpenFile("access.log", os.O_WRONLY|os.O_CREATE|os.O_APPEND, 0644)
You might have ignored the error returned by os.OpenFile only to keep the example short, but you should get used to always checking for errors. You have no idea how many users run into trouble when starting with Go because they ignore errors. Sometimes it's something stupid and easy to fix, like a typo, but if you ignore the error you won't know what the issue is.
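For example, a minimal error-checking pattern would look like this (assuming the standard os and log packages are imported; log.Fatal is just one way to react):
f, err := os.OpenFile("access.log", os.O_WRONLY|os.O_CREATE|os.O_APPEND, 0644)
if err != nil {
    log.Fatal(err) // e.g. permission denied, invalid path, ...
}
defer f.Close()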
You can read more about the append file mode here.
I am recovering Stata following a Windows upgrade. I have a list of my packages generated from ado dir in the following format:
[1] package mdesc from http://fmwww.bc.edu/RePEc/bocode/m
'MDESC': module to tabulate prevalence of missing values
[2] package univar from http://fmwww.bc.edu/RePEc/bocode/u
'UNIVAR': module to generate univariate summary with box-and-whiskers plot
[3] package tabmiss from http://www.ats.ucla.edu/stat/stata/ado/analysis
tabmiss. Shows tabulation of number of missing and non-missing values
I have many packages and would like to reinstall them without having to designate each directory/URL via net cd. While using net cd along with net install, or ssc install along with package names, in a loop is trivial (as below), it would seem that an automated method for this task might be available.
net cd http://www.ats.ucla.edu/stat/stata/ado/analysis
local ucla tabmiss csgof powerlog ldfbeta
foreach x of local ucla {
    net install `x'
}
To my knowledge, there is no built-in or automated method of tracking and managing your installed packages outside of what is available through ado or net.
I would also tend to agree with @Nick Cox that this task seems strange and I can't imagine how a new Stata install or reinstall could know what was installed previously, but I find the question interesting for other reasons.
The main reason is users who have Stata installed on multiple machines and need the same packages on each of them. I faced a similar issue when I purchased a new computer and installed Stata, but wanted all of the packages I use to be available as well. Outside of moving the ado directory or selected contents, I'm not aware of any quick solution.
Here it would be possible to use the output of ado dir on one machine to determine what you need to install on a second machine with a new Stata install.
The method you propose using a foreach loop could save you time from having to type in or copy/paste a lot of packages and URLs. At the same time however, this is only beneficial if you have many packages from only a few repositories because you will need to net cd to the URL each time as you show in your example.
An alternative solution is a programmatic one. As you know, ado dir lists each installed package, the URL, and a short description of the package. Using this, a log file, and Stata's built-in I/O functionality, a short program can automate the process and dynamically build a do-file that contains the commands to install the already-installed packages.
The code below generates a do file containing commands (in this case, net describe package, from(url)) for each package I have installed on my computer.
clear *
tempfile log1
log using "`log1'", text name(mylog)
ado dir
log close mylog
tempname logfile
file open `logfile' using "`log1'", read
file read `logfile' line
file open dfh using "path/to/your/dofile.do", write replace
local pckage "package"
while r(eof) == 0 {
    if `: list pckage in line' {
        local packageName : word 3 of `line'
        local dirName : word 5 of `line'
        di "`packageName' `dirName'"
        file write dfh "net describe `packageName', from(`dirName')"
        file write dfh _newline
    }
    file read `logfile' line
}
file close `logfile'
file close dfh
In the above code, I create a temp file to write a .txt log file to and store the contents of ado dir in that file.
Then, I open the log file using file open and read it line by line in the while loop.
Above the loop, I'm creating a do file at /path/to/your/dofile.do to hold the output of the loop - the dynamically created commands relating to the installed packages on my machine.
The loop iterates as long as r(eof) == 0, where r(eof) is an end-of-file marker. I use an if statement to pick out the lines of the log file which contain the word package, as I'm only interested in the lines with the package name and URL in them.
Inside of the if block, I parse the local macro line to pull the package name and the URL/directory name.
This is important: this section of code assumes that the 3rd and 5th words in the macro will always be the package name and URL respectively. Confirm this from the output of ado dir before executing.
You will also need to change the command that is being written to the file handle dfh inside of the loop to what you want (net install, etc) when you are ready to execute.
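For example, to generate install commands instead of describe commands, the line written inside the loop would become (a one-line sketch using the same macros):
file write dfh "net install `packageName', from(`dirName')"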
For more help on using file, locals, and tempfiles execute any of the following in Stata:
help file
help extended_fcn
help macrolists
There may be nicer ways to parse the contents of ado dir, but this has worked for me. And of course I'd always advise that you take the time to understand what the code is doing so that you can make any necessary tweaks to fit your particular situation.
I am trying to loop over the files I find in a relative path to build a list of "relative path/source file name=source file name" entries.
SHARED_LIB_PACK=""
for LIB in $(find ../level1/leve2/ -name "*.so*")
do
$SHARED_LIB_PACK=$SHARED_LIB_PACK" "$LIB"="${LIB##*/}
done
but when I run it, it complains:
line 6: = ../level1/level2/file.so.1.0=file.so.1.0: No such file or directory
Any help would be welcome.
Firstly, variable assignment is done via:
FOO="bar"
and not
$FOO="bar"
The latter will not work.
Secondly, your quotes seem to be in strange places:
SHARED_LIB_PACK=$SHARED_LIB_PACK" "$LIB"="${LIB##*/}
should probably be
LIB="${LIB##*/}"
SHARED_LIB_PACK="$SHARED_LIB_PACK $LIB"
or
SHARED_LIB_PACK="$SHARED_LIB_PACK ${LIB##*/}"
I am struggling to read an *.xls file into R.
I did the following:
I set my working directory to the location of the *.xls file and then:
> library(gdata) # load the gdata package
> mydata = read.xls("comprice.xls", sheet=1, verbose=FALSE)
Error in findPerl(verbose = verbose) : perl executable not found. Use perl= argument to specify the correct path. Error in file.exists(tfn) : invalid 'file' argument
However, my path is correct and the file is there! What's wrong?
UPDATE
I have installed it already; however, now I get: Error: could not find function "read.xls"...
This error message means that Perl is not installed on your computer or is not on your path.
If Perl is installed, you can pass the perl= argument inside the read.xls() function:
read.xls(xlsfile, perl="C:/perl/bin/perl.exe")
As an alternative, you could try the xlsx package:
read.xlsx("comprice.xls", 1) reads your file and makes the data.frame column classes nearly useful, but is very slow for large data sets.
read.xlsx2("comprice.xls", 1) is faster, but you'll have to define column classes manually. If you run the command twice, you will not need to count columns so much:
data <- read.xlsx2("comprice.xls", 1)
data <- read.xlsx2("comprice.xls", 1, colClasses= rep("numeric", ncol(data)))
Perl is either not installed or cannot be found. You can either install it, or specify the path where it is installed using
perl='path of perl installation'
in the call.
Build Rules are documented in the Xcode Build System Guide.
They are well adapted to the common case where one input file is transformed into a fixed number (usually one) of output files.
The output files must be described in the "Output Files" area of the build rule definition, one line per output file. Typically the output files have the same name as the input file but a different extension.
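For instance, a rule that turns foo.lang into foo.c would typically list a single output line like the following (using Xcode's build-rule variables; ".lang" is just a placeholder extension):
$(DERIVED_FILE_DIR)/$(INPUT_FILE_BASE).c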
In my case, one single input file is transformed into a variable number of files with the same extensions. The number and the names of the output files depend on the content of the input file and are not known in advance.
The output files will have to be further processed later on (they are in this case C files to be compiled).
How can I set up a build rule for such a case?
Any suggestions welcome.
(I asked the same question on the Apple developer forum, but I figured it'd be a good idea to ask here too).
I dealt with this by, instead of generating multiple C files, just concatenating them all together into one file (e.g. "AUTOGENERATED.c"), and specifying that as the output file.
So long as your output files don't contain anything that will conflict (static functions with the same name, conflicting #defines etc.) this works well.
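A custom build rule script for this approach might look something like the following sketch, where mygenerator is a hypothetical tool that writes several .c files into a given directory, and the "Output Files" area lists only ${DERIVED_FILE_DIR}/AUTOGENERATED.c:
# run the generator, then concatenate everything it produced into the single declared output file
/usr/local/bin/mygenerator "${INPUT_FILE_PATH}" -o "${DERIVED_FILE_DIR}/gen"
cat "${DERIVED_FILE_DIR}/gen"/*.c > "${DERIVED_FILE_DIR}/AUTOGENERATED.c"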
See this article on Cocoa With Love:
http://cocoawithlove.com/2010/02/custom-build-rules-generated-tables-and.html
This has an example of generating custom C code and using that as input to the normal build process. He's using ${} variable syntax in the output files.
The best way I found to add any number of files to my Xcode project (and do some processing on them) is to write a little PHP script. The script can simply copy files into the bundle. The tricky part is the integration with Xcode; it took me some time to find a clean way. (You can use whatever scripting language you like with this method.)
First, use "Add Run Script" instead of "Add Copy File"
Shell parameter:
/bin/sh
Command parameter:
${SRCROOT}/your_script.php -s ${SRCROOT} -o ${CONFIGURATION_BUILD_DIR}/${UNLOCALIZED_RESOURCES_FOLDER_PATH}
exit $?
(screenshot in Xcode)
${SRCROOT} is your project directory.
${CONFIGURATION_BUILD_DIR}/${UNLOCALIZED_RESOURCES_FOLDER_PATH} is the bundle directory. Exactly what you need :)
This way, your script's return code can stop the Xcode build (use die(0) for success and die(1) for failure), and the script's output will be visible in Xcode's build log.
Your script will look like this (don't forget to chmod +x it):
#!/usr/bin/php
<?php
error_reporting(E_ALL);
$options = getopt("s:o:");
$src_dir = $options["s"]."/";
$output_dir = $options["o"]."/";
// process_files (...)
die(0);
?>
BONUS: here is my 'add_file' function.
Note the special treatment for PNGs (it uses Apple's PNG compression).
Note the filemtime/touch usage to avoid copying files every time.
define("COPY_PNG", "/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/copypng -compress");
function add_file_to_bundle($output_dir, $filepath) {
// split path
$path_info = pathinfo($filepath);
$output_filepath = $output_dir.$path_info['basename'];
// get file's dates of input and output
$input_date = filemtime($filepath);
$output_date = #filemtime($output_filepath);
if ($input_date === FALSE) { echo "can't get input file's modification date"; die(1); }
// skip unchanged files
if ($output_date === $input_date) {
//message("skip ".$path_info['basename']);
return 0;
}
// special copy for png with apple's png compression tool
if (strcasecmp($path_info['extension'], "png") == 0) {
//message($path_info['basename']." is a png");
passthru(COPY_PNG." ".escapeshellarg($filepath)." ".escapeshellarg($output_filepath), $return_var);
if ($return_var != 0) die($return_var);
}
// classic copy
else {
//message("copy ".$path_info['basename']);
passthru("cp ".escapeshellarg($filepath)." ".escapeshellarg($output_filepath), $return_var);
if ($return_var != 0) die($return_var);
}
// important: set output file date with input file date
touch($output_filepath, $input_date, $input_date);
return 1;
}
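For completeness, the process_files step in the skeleton above could be a simple loop over a directory, calling the bundle function for each file. A hypothetical sketch (the "data" subdirectory is just an example layout, not part of the original script):
// hypothetical: copy every file from <project>/data into the bundle
function process_files($src_dir, $output_dir) {
    foreach (glob($src_dir."data/*") as $filepath) {
        if (is_file($filepath)) {
            add_file_to_bundle($output_dir, $filepath);
        }
    }
}
process_files($src_dir, $output_dir);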