phpdoc: use multiple directories when parsing - phpdoc

I want to parse two or more folders with phpdoc. Parsing one folder works fine.
In the documentation, I can see this:
-d|--directory[="..."]
Provide a comma-separated list of source folders to parse.
But I can't get it to work using these variations:
phpdoc.bat -d[application\controllers\mnm,application\libraries] -t phpdoc
phpdoc.bat -d [application\controllers\mnm,application\libraries] -t phpdoc
phpdoc.bat -directory[application\controllers\mnm,application\libraries] -t phpdoc
phpdoc.bat -directory["application\controllers\mnm","application\libraries"] -t phpdoc
phpdoc.bat -directory[="application\controllers\mnm","application\libraries"] -t phpdoc
phpdoc.bat -directory [="application\controllers\mnm","application\libraries"] -t phpdoc
phpdoc.bat -d [="application\controllers\mnm","application\libraries"] -t phpdoc
The error message I receive is
[Exception]
No parsable files were found, did you specify any using the -f or -d parameter?
What is the correct syntax?

This is a solution!
phpdoc.bat -d "application\controllers\mnm","application\libraries" -t phpdoc

Related

is it possible for Wget to flatten the result directories?

Is there any way to make wget output everything in a single flat folder? Right now I'm running
wget --quiet -E -H -k -K -p -e robots=off #{url}
but I'm getting everything nested the same way as it is on the site. Is there any option to flatten the resulting folder structure (and also the source links in the index.html file)?
After reading the documentation and some examples I found that I was missing the -nd flag, which makes wget fetch just the files without recreating the directories.
Correct call: wget --quiet -E -H -k -nd -K -p -e robots=off #{url}
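For example, combining -nd with -P (where ./flat is just an illustrative target directory) drops every fetched file directly into that one folder:
wget --quiet -E -H -k -nd -K -p -P ./flat -e robots=off #{url}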

iterate through specific files using webHDFS in a bash script

I want to download specific files in an HDFS directory, with their names starting with "total_conn_data_". Since I've got many files I want to write a bash script.
Here's what I do:
myPatternFile="total_conn_data_*.csv"
for filename in `curl -i -X GET "https://knox.blabla/webhdfs/v1/path/to/the/directory/?OP=LISTSTATUS" -u username`; do
curl -i -X GET "https://knox.blabla/webhdfs/v1/path/to/the/directory/$filename?OP=OPEN" -u username -L -o "./data/$filename" -k;
done
But it does not work, since curl -i -X GET "https://knox.blabla/webhdfs/v1/path/to/the/directory/?OP=LISTSTATUS" -u username sends back JSON text and not file names.
How should I do this? Thanks.
curl provides that output in JSON format only. You will have to use other tools like jq and sed to parse that output and get the list of files.
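For example, a sketch of that approach using jq (assuming jq is installed and reusing the Knox endpoint and username from the question): LISTSTATUS returns a FileStatuses.FileStatus array whose pathSuffix fields are the file names.
#!/bin/bash
# List the directory once (-s keeps curl's headers out of the JSON),
# pull the file names out with jq, keep only the total_conn_data_*.csv ones,
# then OPEN each matching file.
base="https://knox.blabla/webhdfs/v1/path/to/the/directory"
curl -s -X GET "$base/?OP=LISTSTATUS" -u username -k \
| jq -r '.FileStatuses.FileStatus[].pathSuffix' \
| grep '^total_conn_data_.*\.csv$' \
| while read -r filename; do
    curl -s -X GET "$base/$filename?OP=OPEN" -u username -L -o "./data/$filename" -k
done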

How to iterate all files inside folder and run curl command to install AEM packages

I am trying to create a shell script which iterates over all the zip files and installs them through the AEM package manager using a curl command.
The below single curl command is working; it properly installs the package in the respective AEM instance.
curl -u admin:admin -F file=@"content-ope.zip" -F name="content-ope.zip" -F force=true -F install=true http://localhost:4502/crx/packmgr/service.jsp
But we have to install many zip files, so we are planning to keep all of them in one folder, iterate over all the zip files and install each one using the curl command. I tried with while and for loops but was unable to read all the .zip files in the shell script.
Does anyone have any idea on this?
I wrote that exact thing, see here:
https://gist.github.com/ahmed-musallam/07fbf430168d4ac57bd8c89d8be9bca5
#!/bin/bash
# this script will install ALL zip packages in current directory the AEM instance at 4502
for f in *.zip
do
    echo "installing: $f"
    curl -u admin:admin -F file=@"$f" -F name="$f" -F force=true -F install=true http://localhost:4502/crx/packmgr/service.jsp
    echo "done."
done
Instead of using curl you can just copy the files over to the install folder of the AEM instance. These will get installed automatically. https://helpx.adobe.com/in/experience-manager/6-3/sites/administering/using/package-manager.html#FileSystemBasedUploadandInstallation
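For example (the /opt/aem path is only an assumption for a local instance; create the install folder first if it does not exist):
cp ./*.zip /opt/aem/crx-quickstart/install/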
find . -maxdepth 1 -name "*.zip" -exec curl -u admin:admin -F file=@"{}" -F name="{}" -F force=true -F install=true http://localhost:4502/crx/packmgr/service.jsp ";"
Note: this will substitute ./foo.zip instead of foo.zip. If you need to strip the ./, you should probably write a shell script wrapping your curl command that accepts the zip file name as an argument and strips ./ from it before passing it to curl.
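A minimal sketch of such a wrapper (hypothetical name install-package.sh, reusing the admin:admin credentials and localhost:4502 instance from above); the ${1#./} expansion strips a leading ./ from the argument:
#!/bin/bash
# install-package.sh -- install one zip package into the local AEM instance.
f="${1#./}"
curl -u admin:admin -F file=@"$1" -F name="$f" -F force=true -F install=true http://localhost:4502/crx/packmgr/service.jsp
It can then be called from find: find . -maxdepth 1 -name "*.zip" -exec ./install-package.sh "{}" ";"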

Curl wildcard delete

I'm trying to use curl to delete files before I upload a new set, and I'm having trouble trying to wildcard the files.
The below code works to delete one specific file
curl -v -u usr:"pass" ftp://11.11.11.11/outgoing/ -Q "DELE /outgoing/configuration-1.zip"
But when I try to wildcard the file with the below
curl -v -u usr:"pass" ftp://11.11.11.11/outgoing/ -Q "DELE /outgoing/configuration-*.zip"
I get the error below
errorconfiguration-*: No such file or directory
QUOT command failed with 550
Can I use wildcards in a curl delete?
Thanks
Curl does not support wildcards in any commands on an FTP server. In order to perform the required delete, you'll have to first list the files in the directory on the server, filter down to the files you want, and then issue delete commands for those.
Assuming your files are in the path ftp://11.11.11.11/outgoing, you could do something like:
curl -u usr:"pass" -l ftp://11.11.11.11/outgoing \
| grep '^configuration[-][[:digit:]]\+[.]zip$' \
| xargs -I{} -- curl -v -u usr:"pass" ftp://11.11.11.11/outgoing -Q "DELE {}"
That command (untested, since I don't have access to your server) does the following:
Outputs a directory listing for the outgoing directory on the server.
Filters that directory listing for file names that start with configuration-, then have one or more digits, and then end with .zip. You may need to adjust this regex for different patterns.
Supplies the matching names to xargs, which, using {} as the replacement string for each matched name, runs the curl command to DELE each file on the server.
You could use one curl command to delete all of the files by concatenating the matched names into a single invocation, but that would be less legible for use as an example.
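A sketch of that single-invocation variant (untested, same server and credentials as above), collecting one -Q "DELE ..." option per matched file and then running curl once:
files=$(curl -s -u usr:"pass" -l ftp://11.11.11.11/outgoing/ \
| grep '^configuration[-][[:digit:]]\+[.]zip$')
args=()
for f in $files; do
  args+=(-Q "DELE /outgoing/$f")   # one quote command per matching file
done
curl -v -u usr:"pass" ftp://11.11.11.11/outgoing/ "${args[@]}"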

How to download multiple URLs using wget using a single command?

I am using following command to download a single webpage with all its images and js using wget in Windows 7:
wget -E -H -k -K -p -e robots=off -P /Downloads/ http://www.vodafone.de/privat/tarife/red-smartphone-tarife.html
It downloads the HTML as required, but when I tried to pass a text file containing a list of 3 URLs to download, it didn't give any output. Below is the command I am using:
wget -E -H -k -K -p -e robots=off -P /Downloads/ -i ./list.txt -B 'http://'
I tried this also:
wget -E -H -k -K -p -e robots=off -P /Downloads/ -i ./list.txt
This text file had the URLs with http:// prepended.
list.txt contains a list of 3 URLs which I need to download using a single command. Please help me resolve this issue.
From man wget:
2 Invoking
By default, Wget is very simple to invoke. The basic syntax is:
wget [option]... [URL]...
So, just use multiple URLs:
wget URL1 URL2
Or using the links from comments:
$ cat list.txt
http://www.vodafone.de/privat/tarife/red-smartphone-tarife.html
http://www.verizonwireless.com/smartphones-2.shtml
http://www.att.com/shop/wireless/devices/smartphones.html
and your command line:
wget -E -H -k -K -p -e robots=off -P /Downloads/ -i ./list.txt
works as expected.
First create a text file with the URLs that you need to download.
e.g. download.txt
download.txt will look as below:
http://www.google.com
http://www.yahoo.com
Then use the command wget -i download.txt to download the files. You can add many URLs to the text file.
If you have a list of URLs separated on multiple lines like this:
http://example.com/a
http://example.com/b
http://example.com/c
but you don't want to create a file and point wget to it, you can do this:
wget -i - <<< 'http://example.com/a
http://example.com/b
http://example.com/c'
Pedantic version:
for x in {'url1','url2'}; do wget $x; done
The advantage of this is that you can treat it as a single wget command for multiple URLs.
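A variant of the same loop that keeps the page-asset options from the question (the two example URLs are placeholders):
for x in 'http://example.com/a' 'http://example.com/b'; do wget -E -H -k -K -p -e robots=off -P /Downloads/ "$x"; done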
