Download file in Laravel from AWS S3 (non-public bucket)

I am able to save all my files in the bucket but am having difficulties with downloading them.
My code is:
$url = Storage::disk('s3')->temporaryUrl(
    $request->file, now()->addMinutes(10)
);

return Storage::disk('s3')->download($url);
The full file path is stored in $request->file.
Example path: https://bucket_name.privacy_region_info/folder_inside_bucket/cTymyY2gzakfczO3j3H2TtbJX4eeRW4Uj073CZUW
I am getting the following: https://prnt.sc/1ip4g77
Did I misunderstand the purpose of generating a temporaryUrl? How can I download files from a non-public S3 bucket?
BTW I am using Laravel 8 and league/flysystem-aws-s3-v3 1.0.29.

The error message you have shown suggests your user does not have the correct permissions, or that the file does not exist.
If you are sure the file exists, I would suspect a permissions issue.
In AWS IAM, make sure the user has a policy attached that grants the correct permissions.
In this case, from the comments, I can see the user only has "Write" permissions. You will need explicit "Read" permissions too.
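As a sketch of both halves of the fix (assuming $request->file holds the object's key inside the bucket, not the full URL from the example path): the IAM policy needs read access such as s3:GetObject, and on the Laravel side download() expects a disk path rather than a URL, so you would either redirect to the signed URL or stream the object directly:

// Option 1: redirect the client to the short-lived signed URL
$url = Storage::disk('s3')->temporaryUrl(
    $request->file, now()->addMinutes(10)
);
return redirect()->away($url);

// Option 2: stream the object through the application
// (pass the object's path on the disk, not a URL)
return Storage::disk('s3')->download($request->file);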

Related

Issues with loading Maxmind Data into Clickhouse Database using a local file

I'm trying to insert MaxMind data into a ClickHouse dictionary, defining its source as a local file on the machine I run my client from.
So to define my dictionary I use the query:
CREATE DICTIONARY usage_analytics.city_locations (
    geoname_id UInt64 DEFAULT 0,
    ...
    ...
    ...
    ...
)
PRIMARY KEY geoname_id
SOURCE(File(path '/home/ubuntu/maxmind_csv/GeoLite2-City-Locations-en.csv' format 'CSVWithNames'))
SETTINGS(format_csv_allow_single_quotes = 0)
LAYOUT(HASHED())
LIFETIME(300);
Yet I keep getting the following error:
Failed to load dictionary 'usage_analytics.city_locations': std::exception. Code: 1001, type: std::__1::__fs::filesystem::filesystem_error, e.what() = filesystem error: in canonical: No such file or directory [\home/ubuntu/maxmind_csv/GeoLite2-City-Locations-en.csv] [/],
According to the documentation, I have to use its absolute path, which I obtained with readlink, and still it cannot detect my file. I am running a ClickHouse client from a remote machine and have the files on that remote machine. Am I supposed to have my files elsewhere?
It looks like this file is not accessible. To fix it, you need to set the right ownership on the file:
chown clickhouse:clickhouse /home/ubuntu/maxmind_csv/GeoLite2-City-Locations-en.csv
# chown -R clickhouse:clickhouse /home/ubuntu/maxmind_csv
A dictionary defined in an .XML config file is allowed to read files from any folder; a dictionary created via SQL DDL is not.
https://clickhouse.tech/docs/en/sql-reference/dictionaries/external-dictionaries/external-dicts-dict-sources/#dicts-external_dicts_dict_sources-local_file
When dictionary with source FILE is created via DDL command (CREATE DICTIONARY ...), the source file needs to be located in user_files directory, to prevent DB users accessing arbitrary file on ClickHouse node.
/etc/clickhouse-server/config.xml
<!-- Directory with user provided files that are accessible by 'file' table function. -->
<user_files_path>/var/lib/clickhouse/user_files/</user_files_path>
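Putting this together, a minimal sketch assuming the default user_files path shown above: move the CSV into that directory, give it the right ownership, and point the dictionary source at the new location.

cp /home/ubuntu/maxmind_csv/GeoLite2-City-Locations-en.csv /var/lib/clickhouse/user_files/
chown clickhouse:clickhouse /var/lib/clickhouse/user_files/GeoLite2-City-Locations-en.csv

Then recreate the dictionary with:
SOURCE(File(path '/var/lib/clickhouse/user_files/GeoLite2-City-Locations-en.csv' format 'CSVWithNames'))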

Can't create bucket using aws-sdk ruby gem. Aws::S3::Errors::SignatureDoesNotMatch

I have a new computer and I'm trying to set up my AWS CLI environment so that I can run a management console I've created.
This is the code I'm running:
def create_bucket(bucket_args)
  AWS_S3 = Aws::S3::Client.new(signature_version: 'v4')
  AWS_S3.create_bucket(bucket_args)
end
Which raises this error:
Aws::S3::Errors::SignatureDoesNotMatch - The request signature we calculated does not match the signature you provided. Check your key and signing method.:
This was working properly on my other computer, which I no longer have access to. I remember debugging this same error on the other computer, and I thought I had resolved it by adding signature_version = s3v4 to my ~/.aws/config file. But this fix is not working on my new computer, and I'm not sure why.
To give some more context: I am using aws-sdk (2.5.5) and these aws cli specs: aws-cli/1.11.2 Python/2.7.12 Linux/4.4.0-38-generic botocore/1.4.60
In this case the issue was that my AWS credentials (in ~/.aws/credentials), specifically my secret token, were invalid.
The original had a slash in it:
xx/xxxxxxxxxxxxxxxxxxxxxxxxxx
which I didn't notice at first, so when I double-clicked the token to select it, the selection didn't include the first three characters. I then pasted this truncated value into the terminal when running aws configure.
To fix this, I found the correct, original secret access token and set the correct value in ~/.aws/credentials.
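For reference, a sketch of the two files involved (all key values here are placeholders); the secret key must be copied whole, including anything before a slash:

# ~/.aws/credentials
[default]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = xx/xxxxxxxxxxxxxxxxxxxxxxxxxx

# ~/.aws/config
[default]
s3 =
    signature_version = s3v4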

mkdir error. permission denied. google api

I am attempting to use a service account with Google's API to add a calendar event using PHP. I have this working perfectly on a site already. When I moved it to another site on the same server, I suddenly began to receive the following error messages:
~PHP Warning: mkdir(): Permission denied in Google/Cache/File.php
~Uncaught exception 'Google_Cache_Exception' with message 'Could not create storage directory: in Google/Cache/File.php
The two environments are identical as far as I can tell:
~Same server
~Same permissions on all files/folders
~Same credentials
~Both URLS authorized in Google's console
I checked with my server to see if something in the upvoted answer here could be the issue, but was assured that everything was set up correctly.
I've done a lot of searching and reading, but can't imagine what might be causing these errors when everything works perfectly from the other site.
Any help would be much appreciated!
In case anyone comes upon this: I solved it by following the advice given here:
A client error occurred: Could not create storage directory: /tmp/Google_Client/00
Specifically, I manually added nested folders (google and cache) inside my tmp directory and then set the path for Google using this code (from the link above):
$config = new Google_Config();
$config->setClassConfig('Google_Cache_File', array('directory' => '../../tmp/google/cache'));
// Here I set a relative folder to avoid permission problems on a folder like /tmp, which is not permitted on my shared host
$client = new Google_Client($config);
Add this PHP code to your script:
$client = new Google_Client();
$client->setCache(new Google_Cache_File('/path/to/shared/cache'));
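Note that whatever directory you pass must already exist and be writable by the web server user, or the same mkdir warning will reappear. A shell sketch (the path is the placeholder from the snippet above, and www-data is an assumption; your host's web server user may differ):

mkdir -p /path/to/shared/cache
chown -R www-data:www-data /path/to/shared/cache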
Also, in case anyone else needs to do this: I'm working with CodeIgniter and recently moved from Windows 7 to OS X El Capitan. My Google cache folder had read and write permissions for the admin user, but the error persisted: mkdir permission denied, in google/cache/file.php.
I looked at my code where I load the Google API and added:
$config = new Google_Config();
$config->setClassConfig('Google_Cache_File', array('directory' => 'application/third_party/Google/src/Google/Cache/'));
$client = new Google_Client($config);
With those lines you set your cache folder.
Hope this helps in the future as it helped me.

File Write - Unauthorized Access Exception

Trying to save a file locally from an app running in the iOS 8 Simulator and I'm continually getting access denied exceptions.
In previous apps I've used the following code to get a valid file path:
Environment.GetFolderPath(Environment.SpecialFolder.Personal)
or
Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments)
But I've read that with iOS 8 this has now got to be written as:
NSFileManager.DefaultManager.GetUrls(NSSearchPathDirectory.DocumentDirectory,
    NSSearchPathDomain.User)[0]
So I'm using the following code to generate a file path for a .txt file and receiving an access denied exception when trying to save with it:
public void SaveMyFile(string content)
{
    NSUrl[] urls;
    string filePath;

    urls = NSFileManager.DefaultManager.GetUrls(NSSearchPathDirectory.DocumentDirectory,
        NSSearchPathDomain.User);
    filePath = Path.Combine(urls[0].Path, "MyApp", "myFile.txt");
    File.WriteAllText(filePath, content);
}
So the file path that it gives me and also denies access to is /Users/Idox/Library/Developer/CoreSimulator/Devices/92498E38-7D50-4081-8A64-83061DC00A86/data/Containers/Data/Application/C35B3E98-C9E3-4ABA-AA7F-CD8419FA0EA5/Documents/MyApp/myFile.txt.
I'm wondering if there's some setting that needs to be toggled to give the app write access to this directory or if the directory itself is invalid.
I've also done a call to Directory.Exists(string path) to check if the directory is there, which it is.
You're missing the Path property on urls[0]:
filePath = Path.Combine(urls[0].Path, "MyApp", "myFile.txt");
This was fixed in Xamarin.iOS 8.4, so if you're using a recent version of Xamarin you can use Environment.GetFolderPath without problems (which is useful if you want to share code across platforms).
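If the exception persists after the .Path fix, one more thing worth checking is that the intermediate MyApp folder really exists, since File.WriteAllText will not create it. A minimal sketch (folder and file names taken from the question):

public void SaveMyFile(string content)
{
    var docs = NSFileManager.DefaultManager.GetUrls(
        NSSearchPathDirectory.DocumentDirectory, NSSearchPathDomain.User)[0].Path;
    var dir = Path.Combine(docs, "MyApp");
    Directory.CreateDirectory(dir); // no-op if the directory already exists
    File.WriteAllText(Path.Combine(dir, "myFile.txt"), content);
}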

How does MediaWiki calculate the file path to an image?

I'm just installing MediaWiki (loving it). I'm looking at this for adding images. I can see the logic of
[[File:MediaWiki:Image sample|50px]]
but where do I set the file path for "File"? (Nothing obvious in LocalSettings.php.) Or is there some other logic at work?
I'd appreciate any help
Thanks
File location is determined by $wgLocalFileRepo which by default depends on $wgUploadDirectory and $wgHashedUploadDirectory. The upload directory defaults to [MediaWiki base dir]/images (Adrian must be using an older version). If hashing is enabled, /x/xy will be appended to the path, where xy are the first two letters of the md5 hash of the filename.
The defaults from DefaultSettings.php are:
$wgUploadPath = "$wgScriptPath/uploads";
$wgUploadDirectory = "$IP/uploads";
If you want to change this, you should copy and paste this into LocalSettings.php
And make sure that $wgEnableUploads = true; is in LocalSettings.php too.
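To illustrate the hashing described above, a small PHP sketch (not MediaWiki's internal code) of where a file named Image_sample.jpg would land when $wgHashedUploadDirectory is enabled:

// titles are stored with spaces replaced by underscores
$name = 'Image_sample.jpg';
$hash = md5($name);
// first letter of the hash, then its first two letters
$path = "images/{$hash[0]}/" . substr($hash, 0, 2) . "/$name";
echo $path; // e.g. images/a/ab/Image_sample.jpg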
Your "Image sample" is the name of image, not the name of a file. By config file you can just set the root folder for image uploads.
Just for future reference in case someone else runs into this issue:
I installed MediaWiki on macOS Sierra, and when I attempted to upload an image I got the following message:
Failed:
Could not open lock file for "mwstore://local-backend/local-public/d/d9/babypicture.png".
I changed the ownership of the mediawiki_root/images folder to the _www user and group:
chown -R _www:_www wiki/images
I was able to upload the image afterward.
