I want to download the dataset from this website (http://pubchemqc.riken.jp/); it's hosted on Google Drive and the file size is 2 TB. The website recommends using rclone to download the file, but doesn't say how to use it.
As the picture shows.
This was asked a while back already, but I'd like to keep it recorded here as it seems hard to find.
Unfortunately I don't know of a way to automate this process, but it's still easy.
What you can do is create a new remote with the link's root_folder_id.
The root_folder_id is present in the URL, e.g.
in the example link
https://drive.google.com/drive/folders/a1b2c3deFgHi4JKlm56nOpqrStuv7w8xy9
the root_folder_id is the string after /folders/, so in this case it would be "a1b2c3deFgHi4JKlm56nOpqrStuv7w8xy9". If the Google Drive shared link you got is a folder, all you have to do is copy that ID and use it in rclone's new remote setup. Now, if the shared link points directly to a file, like the one in the OP, there's no folder to download it from, so we have to create our own! The whole process looks like this:
Get the folder_id from the Google Drive link. If it is a link to a direct file, then first we have to create a new folder anywhere inside our own Google Drive; its name doesn't matter, as we will point directly to it using the ID. After creating this new folder, open it and note the URL; it should look something like
https://drive.google.com/drive/u/1/folders/Fdrcv3nQvxQqXUGEEyvacwUxdYXpV33Ct
just copy everything after /folders/ and save it for later.
Now go back to the link of the direct file and add a shortcut for that file inside your recently created folder: say the folder was named dl-with-rclone, click the "Add shortcut to Drive" icon and navigate to the "dl-with-rclone" folder to add the shortcut there.
Heading to rclone, do:
rclone config (to open the settings)
On the prompt e/n/d/r/c/s/q> hit n (the option for creating a new remote).
On name> give it a name like sharedWithMe, or anything else for personal reference later.
On the Storage> prompt, which lists the possible remotes to connect to, type drive or the number corresponding to it (currently 15).
If you set up your own client_id and client_secret, you enter them next.
On the scope> prompt, option 2 ("Read-only") is enough.
Now this is the important one:
On the root_folder_id> prompt, input the ID of the folder of the shared link, or of the folder you created and pointed the file shortcut to, e.g. Fdrcv3nQvxQqXUGEEyvacwUxdYXpV33Ct.
Now you can pretty much hit Enter for everything; once you reach Use auto config? you will be redirected to a browser to log in. Make sure the logged-in account is the same one in which you created the folder for the shortcut. If the shared link is already for a folder, your logged-in account doesn't matter.
After finishing your remote setup, you can exit rclone config.
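For reference, the finished remote in your rclone.conf should end up looking roughly like this; this is only a sketch, the remote name and folder ID are the illustrative ones from above, and the token is filled in for you by the auth step:

    [sharedWithMe]
    type = drive
    scope = drive.readonly
    root_folder_id = Fdrcv3nQvxQqXUGEEyvacwUxdYXpV33Ct
    token = {"access_token":"...","expiry":"..."}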
The command you would need is something like rclone copy sharedWithMe: destination/folder
This being rclone, you can of course also copy from one remote to another; that's up to you.
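For example (the destination path is illustrative, and --progress just prints transfer status):

    rclone copy sharedWithMe: /data/pubchemqc --progress
    # or straight into another configured remote, e.g. one named nas:
    rclone copy sharedWithMe: nas:pubchemqc --progress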
In summary: I did test this method on your link, creating a folder in my Drive, pointing the file shortcut to it and using the root_folder_id to set up the rclone remote, and it did begin to download:
[rclone test download screenshot]
This is a general question about Power Automate and OneDrive that I've seen no solution to.
I'm trying to create a flow "Copy files from a folder in OneDrive (Business) to an FTP server". The trigger is "When a file is created" in a OneDrive directory. When I attempt to navigate to the folder, I just see ROOT, then "No Items".
I also can't figure out how to obtain the unique identifier of the folder.
I was able to get the ID of my folders with this short Instant flow.
It'll output a JSON response with the ID, Name, Path, etc. for all your folders in the root of OneDrive. There is also a "List files in folder" block that you can use to get the subfolders of folders.
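For reference, the JSON that comes back looks roughly like this; this is a sketch from memory, so the exact field names and ID format may differ in your tenant:

    {
      "value": [
        {
          "Id": "b!x1y2z3...",
          "Name": "Invoices",
          "Path": "/Invoices",
          "IsFolder": true
        }
      ]
    }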
I was able to copy the ID string from the JSON response and paste it into the "When a file is created" OneDrive trigger and have it go off successfully.
The issue was that I was attempting to do this using OneDrive (Business) when I should have used SharePoint, based on my organization's license.
I am going to prepare a (Windows) server-side service which allows users to download their requested files as a zipped folder. In Microsoft Windows (7), if you select all the files and folders and send them to a zip folder, a seemingly random name is assigned to the generated zip file, which is the name of one of the files or folders in that collection.
Is there any reason that Windows doesn't use a new name, say new-zip-file? And how can I predict what the name will be?
The name of the zip is in fact not random;
it depends on the file you right-click on.
You can select multiple files, but you right-click on only one file in order to zip the group of files, and that's the file name Windows chooses.
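If you are building this on the server side anyway, you can sidestep the prediction problem entirely by creating the archive programmatically and naming it yourself. Here is a minimal sketch using PowerShell's built-in Compress-Archive cmdlet; the paths are illustrative:

    # zip everything the user requested into an explicitly named archive,
    # so the name never depends on which file happens to be right-clicked
    $source = 'C:\requested-files\*'                 # illustrative source folder
    $destination = 'C:\downloads\new-zip-file.zip'   # the name is fully under your control
    Compress-Archive -Path $source -DestinationPath $destination -Force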
I'm using pgAdmin 1.14.3.
When I try to execute an import command:
COPY grad(country_code, postal_code, place_name, admin_name1, admin_code1, admin_name2, admin_code2, admin_name3, admin_code3, latitude, longitude, accuracy)
FROM 'C:\\Users\\denis\\Desktop\\BP2Project\\USA\\US.txt';
I get:
ERROR: could not open file "C:\Users\denis\Desktop\BP2Project\USA\US.txt" for reading: Permission denied
SQL state: 42501
I did look up other similar questions and none of them solved my issue.
I logged in as user "postgres" who is the superuser. I don't see why I'm missing permissions. I'm on Windows 7.
The permissions article mentioned in the answer by Houari and Flimzy is good reference material, but a direct answer (the quick fix I used) is:
Right click the folder containing the data file(s) that permission was denied to and then click Properties.
In the Folder's Properties window, select the Security tab.
Click the Edit button.
In the "Permissions for the folder" window that opened, click the Add... button.
Type Everyone into the "Enter the object names to select" text box.
Click OK and the window will close.
Verify that the default Read & Execute permissions were set to Allow via the checkbox in the previous window.
As JLB notes, Write permission is needed if dumping from PostgreSQL, as opposed to copying into it.
Click OK and the window will close.
Click the Apply button in the Folder Properties window.
Now you can run the SQL COPY statement that needs to access those files.
Once done, return to the Folder's Properties window.
Click the Edit button.
Select the Everyone entry in the "Group or user names:" field.
Click the Remove button.
Click OK on the remaining open windows.
The permissions have now been returned to what they were.
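If you prefer the command line, the same temporary grant and removal can be done from an elevated prompt with icacls; this is only a sketch of the GUI steps above, using the folder from the question:

    rem grant Everyone read & execute on the folder (and the files inside it)
    icacls "C:\Users\denis\Desktop\BP2Project\USA" /grant "Everyone:(OI)(CI)RX"

    rem ... run the COPY statement ...

    rem remove the grant again afterwards
    icacls "C:\Users\denis\Desktop\BP2Project\USA" /remove "Everyone"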
The postgres user must have read access to the file from which you are about to copy.
Look at this article to see how to modify files' security access on Windows.
OK, this is how I got the COPY command working to export a table to CSV, step by step.
Please note that I am using pgAdmin III.
Create the target folder you want to export a table to, e.g. C:\myExports
Set a read/write permission on this folder following the steps below :
Right click the folder containing the data file(s) that permission was denied to and then click Properties.
In the Folder's Properties window, select the Security tab.
Click the Edit button.
In the "Permissions for the folder" window that opened, click the Add... button.
Type Everyone into the "Enter the object names to select" text area box.
Click OK and the window will close.
Verify that the default Read & Execute permissions were set to Allow via the checkbox in the previous window.
Click OK and the window will close.
Click the Apply button in the Folder Properties window.
This is the tricky part: inside the myExports folder, create a blank CSV file with your desired name, e.g. employee.csv.
Then run the Copy command like this :
copy employee to 'C:\myExports\employee.csv' delimiter ',' csv;
employee is the table name in this example.
Hope this helps.
If you don't want to give permissions to Everyone, you can add permissions to the account that started the service. In the Control Panel - Administrative Tools - Services, copy the account name in the 'Log On' tab. (On my system the account is called 'Network Service'.) Then share the folder with the CSV-file with this user as shown in the answer above.
To solve this problem, you must give permission on the CSV file, because a file named in a COPY command is read directly by the server, not by the client application. So to make the file accessible to the server, we must give full read-write permission so that the PostgreSQL user can read and write that file.
Reference: article showing the step-by-step procedure.
I just ran into this error, and even after adding postgres to the permissions on the file's folder and the file itself, it still didn't work. So I put the file in a public folder. On Windows this was the path "C:\Users\Public\Documents\census.csv". It worked!
Responses to this problem on different threads go something like this
1. "Tell me exactly what command you used"
2. "Make sure you have right permissions"
3. "Just use /copy"
I just tried giving permissions to Everyone on the cvs file I am trying to copy from, and it is still giving me the permission denied error. I think this functionality is broken and has been broken for multiple consecutive releases over multiple consecutive versions of Windows.
Here's what worked for me, and I've just spent some long hours on this.
I have a central db residing on an HP box running Ubuntu 14.04 with postgresql-9.5, pgAdmin3 and postgis-2.2; shares are made through a tweaked Samba share. My clients are using a mixture of Windows 10, 7 and 8.1, and I have one Ubuntu 14.04 desktop.
I'm working with large tables, updating records and normalising data, and have built the routines around SQL COPY statements from CSV files, which were made from the core COPY public.table_1 TO '/srv/samba/share/[filename].csv' (the share folder I'd set up in Samba, https://www.youtube.com/watch?v=ndAYZ0DJ-U4).
I can then update the database once the tables have been amended with COPY table_1 FROM '/srv/samba/share/test.csv' USING DELIMITERS ',' WITH NULL AS '' CSV HEADER; from any of my clients.
The key, as far as I have been able to determine, is that the clients doing the updating must be superusers, and everything must tie up in terms of users, as there are four servers working together here: PostgreSQL, Samba, UNIX and WINS.
All of my users are registered on each of the servers with the same username and password; homogeneity is the main factor.
I had tried for a long time moving things about and trying various naming conventions, but in the end it was http://www.postgresql.org/message-id/CFF47E56EA077241B1FFF390344B5FC10ACB1C0C#webmail.begavalley.nsw.gov.au that sorted me out; it was like a big switch clicking in. chmod 777 on your shares and group management was an important learning curve, but the hours I've spent on this will reap rewards down the line... Loving Ubuntu, loving life and loving the spirit of open source, but that might just be sleep deprivation kicking in... IT WORKS.
I am trying to execute SQL commands directly from a file in psql 14, and I ran into the same error.
The reason is that the "postgres" user is different from the 'admin' or main user of the operating system, so that main user denies "postgres" access to files in its file system.
There is, however, a way around it.
Windows lets any user access the files in 'C:\Users\Public', and Linux distros allow the same for files in the '/tmp' folder.
So, whatever files you are trying to access from the postgres terminal, keep them in
'C:\Users\Public' on Windows
'/tmp' on Ubuntu
Read the original source of this answer.
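Using the question's table as an example, the COPY then looks something like this; the file locations are illustrative:

    -- Windows: put the file somewhere under C:\Users\Public first
    COPY grad FROM 'C:\Users\Public\US.txt';

    -- Ubuntu/Linux: put the file in /tmp first
    COPY grad FROM '/tmp/US.txt';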
Use the \copy command from psql instead, with a command like this:
sudo psql -U postgres -d <your-db> -c "\copy <your-query-or-table> TO '<path-to-save-file>' WITH (FORMAT CSV)"
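For example, exporting the question's table with an illustrative database name and output path:

    sudo psql -U postgres -d mydb -c "\copy grad TO '/tmp/grad.csv' WITH (FORMAT CSV, HEADER)"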
I need to rename image files sequentially as they are added to a folder, i.e. image-0001.jpg and image-0002.jpg are in a folder; I add test.jpg and it is renamed image-0003.jpg. I have tried Automator's rename function, but it starts over at image-0001.jpg each time a new file is added instead of continuing the sequence.
Any help is greatly appreciated.
You could make this easy on yourself by using a handy feature built into the OS called "Folder Actions". Folder Actions uses one or more special handlers, formally known as folder action event handlers, that run when they are triggered by a change in the target folder. I know this is confusing, but I'll do my best.
What you are trying to accomplish requires an adding folder items to event handler. It requires one direct parameter, which can be named anything you wish, e.g. target_folder. The handler requires an additional parameter as well, after the receiving keyword, which should also be a variable name, e.g. these_items. I have composed a script for you that should do the trick. I have added comments that show you what I'm doing when I do it. Here it is:
on adding folder items to target_folder after receiving these_items
	tell application "Finder"
		-- count the items that were already in the folder, excluding the ones just added
		set existing_count to (count of (every item of target_folder)) - (count of these_items)
		repeat with i from 1 to the count of these_items -- iterate through all the items you dropped in the folder
			set this_image to item i of these_items -- the current image
			-- rename based on the number of images already in the folder, keeping the zero-padded name and the .jpg extension
			set padded_number to text -4 thru -1 of ("0000" & (existing_count + i))
			set the name of this_image to "image-" & padded_number & ".jpg"
		end repeat
	end tell
end adding folder items to
YAY! The script is done! But are WE done? Not quite. We still need to attach the script to a folder (the script won't run if you try to execute it in script editor).
To do this, first save the script as a Script File in the Folder Action Scripts folder in the Scripts folder in either the local Library folder or the current user's Library folder. Create the folder yourself if it doesn't already exist. Next, launch the Folder Actions Setup application by double-clicking it in the AppleScript folder in the Applications folder. In the window that comes up, click the + button under the table on the left (click the "Enable Folder Actions" checkbox if it isn't already checked) to open a standard file browser sheet, navigate to your desired folder, and click "Open". The Choose a Script to Attach sheet automatically opens, listing all the scripts in all of the Folder Action Script folders. Choose the newly-created script, click "Attach", and BAM you are done!
To see the script in action, drag an image onto the folder. The image is instantly renamed, regardless of whether the folder window is open. If you have any questions, or if the script doesn't work, just ask me. :)
Well, without digging through some code and handing you an answer, I'll tell you what you want to do:
create a while loop that checks for the existence of image-000 & i, where i is a variable of course; if that file exists, increment i, and when the file doesn't exist, rename your file to that name. A rough sketch of that idea follows below.
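A minimal sketch of that idea in AppleScript; the handler name, the zero-padding and the .jpg extension are my own assumptions:

    -- returns the first unused "image-NNNN.jpg" name in the given folder (illustrative helper)
    on next_free_name(target_folder)
    	tell application "Finder"
    		set i to 1
    		-- keep counting up until a name is free
    		repeat while (exists file ("image-" & text -4 thru -1 of ("0000" & i) & ".jpg") of target_folder)
    			set i to i + 1
    		end repeat
    	end tell
    	return "image-" & text -4 thru -1 of ("0000" & i) & ".jpg"
    end next_free_name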