I'm thinking I installed this wrong, but I keep getting the same error. I installed league/flysystem-aws-s3-v3, and the S3 credentials are set up in the .env file.
I added these to the require section of composer.json:
"aws/aws-sdk-php"
"aws/aws-sdk-php-laravel": "~3.0"
--------------------------------------------
Code:
$s3 = \Storage::disk('s3');
$s3->put($location, file_get_contents($image), 'public');
Getting the following error:
exception 'InvalidArgumentException' with message 'Missing required client configuration options:
' in /var/www/laravel/vendor/aws/aws-sdk-php/src/ClientResolver.php:328
Did some research online and couldn't find a solution.
Try updating your dependencies:
composer update
Try this: go into config/filesystems.php and replace every env() call with the literal value it should resolve to.
For example, mine showed 'cloud' => env('FILESYSTEM_CLOUD', 's3'),
so I changed it to 'cloud' => 's3',
Do this for ALL values in this file, INCLUDING the S3 key, secret, region, and bucket. After that it should get you up and running.
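As a rough sketch, the s3 disk entry would then look something like this (the key, secret, region, and bucket below are placeholders, not real values):
's3' => [
    'driver' => 's3',
    'key'    => 'AKIAEXAMPLEKEY',   // placeholder
    'secret' => 'example-secret',   // placeholder
    'region' => 'us-east-1',        // placeholder
    'bucket' => 'my-bucket',        // placeholder
],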
I believe the issue is that the cached configuration holds these values as null rather than the values actually set in .env.
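If cached configuration is indeed the culprit, clearing Laravel's config cache is another way to get the real .env values picked up again:
php artisan config:clear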
Related
We have recently switched from v1 to v3 of the flysystem sftp package. However, the v3 version is throwing the following error when trying to access files in an existing directory like so:
$disk = Storage::disk('sftp');
$directories = $disk->directories('documents'); // this will only be used for testing to dump and check if the directory exists
$files = $disk->files('documents/whitepapers');
dump($directories) will successfully return:
array:1 [
0 => "documents/whitepapers"
]
However $files cannot be used, since an error is being thrown:
League\Flysystem\UnableToListContents
Unable to list contents for 'documents/whitepapers', shallow listing
Reason: Undefined array key "type"
So the Undefined array key "type" of the StorageAttributes is causing the issue. However, I have no idea how to fix this. It worked fine in v1 of the package with the same server, directories and files.
Is that an issue in the package or am I doing something wrong here?
The code is running on PHP 8.1.8 and Laravel 9.30.1 with league/flysystem-sftp-v3 3.5.2
Ran into a similar issue today and found it was solved in a new release.
Pull Request: https://github.com/thephpleague/flysystem/issues/1563
Commit: https://github.com/thephpleague/flysystem/commit/57d6217b5c5783eb7ae12270ca364f805eb0e919
Just upgrade league/flysystem-sftp-v3 to version 3.6 or later.
composer require league/flysystem-sftp-v3:^3.6.0
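After upgrading, the calls from the question should work again; a quick check (a sketch, assuming the same sftp disk configuration):
$disk = Storage::disk('sftp');
dump($disk->files('documents/whitepapers')); // should now list the files instead of throwing UnableToListContents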
I am able to save all my files in the bucket, but I am having difficulties with downloading them.
My code is:
$url = Storage::disk('s3')->temporaryUrl(
    $request->file, now()->addMinutes(10)
);
return Storage::disk('s3')->download($url);
The full file path is stored in $request->file.
Example path: https://bucket_name.privacy_region_info/folder_inside_bucket/cTymyY2gzakfczO3j3H2TtbJX4eeRW4Uj073CZUW
I am getting the following: https://prnt.sc/1ip4g77
Did I misunderstand the purpose of generating a temporaryUrl? How can I download files from a non-public S3 bucket?
BTW I am using Laravel 8 and league/flysystem-aws-s3-v3 1.0.29.
The error message you have shown suggests your user does not have the correct permissions, or that the file does not exist.
If you are sure the file exists, I would suspect a permissions issue.
In AWS IAM, make sure the user has a policy attached to it that grants the correct permissions.
In this case, from the comments I can see the user only has "Write" permissions. You will need explicit "Read" permissions too.
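As a rough sketch, an IAM policy along these lines grants read access to the objects (the bucket name is a placeholder, and depending on your setup you may also need s3:ListBucket on the bucket itself):
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            "Resource": "arn:aws:s3:::your-bucket-name/*"
        }
    ]
}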
I am attempting to use a service account with Google's API to add a calendar event using PHP. I have this working perfectly on one site already. When I moved it to another site on the same server, I suddenly began to receive the following error messages:
~PHP Warning: mkdir(): Permission denied in Google/Cache/File.php
~Uncaught exception 'Google_Cache_Exception' with message 'Could not create storage directory: in Google/Cache/File.php
The two environments are identical as far as I can tell:
~Same server
~Same permissions on all files/folders
~Same credentials
~Both URLS authorized in Google's console
I checked with my server host to see if something in the upvoted answer here could be the issue, but was assured that everything was set up correctly.
I've done a lot of searching and reading, but can't imagine what might be causing these errors when everything works perfectly from the other site.
Any help would be much appreciated!
In case anyone comes upon this: I solved it by following the advice given here:
A client error occurred: Could not create storage directory: /tmp/Google_Client/00
Specifically, I manually added nested folders (google and cache) inside my tmp directory and then pointed Google's client at that path using this code (from the link above):
$config = new Google_Config();
$config->setClassConfig('Google_Cache_File', array('directory' => '../../tmp/google/cache'));
// A relative folder is used here to avoid permission problems with a folder like /tmp, which is not allowed on my shared host
$client = new Google_Client($config);
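For reference, the nested folders were created by hand; something like the following, run from the directory that contains tmp, should reproduce that (paths assumed to match the config above, and adjust permissions to whatever your host requires):
mkdir -p tmp/google/cache
chmod -R 775 tmp/google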
Add this PHP code to your script:
$client = new Google_Client();
$client->setCache(new Google_Cache_File('/path/to/shared/cache'));
In case anyone else needs to do this: I'm working with CodeIgniter and recently moved from Windows 7 to OS X El Capitan on a Mac. My Google cache folder had admin read and write permissions, but the error persisted: mkdir permission denied in google/cache/file.php.
I looked at the code where I load the Google API and added:
$config = new Google_Config();
$config->setClassConfig('Google_Cache_File', array('directory' => 'application/third_party/Google/src/Google/Cache/'));
$client = new Google_Client($config);
Those lines set your cache folder.
Hope this helps in the future as it helped me.
I am new to the DevOps world, and my company uses the Fog library to deploy EC2 instances for our dev environment. One of my company's products needs a CDN, and I am trying to figure out how I can automate the CDN using the same Fog library.
I found info at fog.io, and here is the code I put in makeCDN.rb (with a .sh wrapper to deploy it).
#!/usr/bin/ruby
require 'fog'
# create a connection to the service
cdn = Fog::CDN.new({
  :provider => 'AWS',
  :aws_access_key_id => 'fake_key_id',
  :aws_secret_access_key => '2345fake_access_key6789'
})
cdn.post_distribution({
  'CustomOrigin' => {
    'DNSName' => 'hostname.domain.org', # example name
    'HTTPPort' => '80',
    'OriginProtocolPolicy' => 'match-viewer',
    'DefaultRootObject' => '/',
    'Enabled' => 'true',
  }
})
So, I am unsure what I am doing wrong, but the error I am getting is:
/home/eztheog/.rvm/gems/ruby-1.9.3-p547#fogDev/gems/excon-0.38.0/lib/excon/middlewares/expects.rb:10:in
`response_call': Expected(201) <=> Actual(400 Bad Request) (Excon::Errors::BadRequest)
response => #<Excon::Response:0x00000001d73b78 #data={:body=>"<?xml version=\"1.0\"?>\n<ErrorResponse
xmlns=\"http://cloudfront.amazonaws.com/doc/2010-11-01/\"><Error>
<Type>Sender</Type><Code>MalformedXML</Code><Message>1 validation error
detected: Value null at 'distributionConfig.enabled' failed to satisfy
constraint: Member must not be null</Message></Error>
<RequestId>c2b33cda-abee-11e4-8115-b584e1255c70</RequestId>
</ErrorResponse>", :headers=>{"x-amzn-RequestId"=>"c2b33cda-abee-11e4-8115-b584e1255c70",
"Content-Type"=>"text/xml", "Content-Length"=>"371", "Date"=>"Tue, 03
Feb 2015 21:51:07 GMT"}, :status=>400, :remote_ip=>"205.251.242.229",
:local_port=>39733, :local_address=>"10.100.6.203"}, #body="<?xml
version=\"1.0\"?>\n<ErrorResponse
xmlns=\"http://cloudfront.amazonaws.com/doc/2010-11-01/\"><Error>
<Type>Sender</Type><Code>MalformedXML</Code><Message>1 validation error
detected: Value null at 'distributionConfig.enabled' failed to satisfy
constraint: Member must not be null</Message></Error>
<RequestId>c2b33cda-abee-11e4-8115-b584e1255c70</RequestId>
</ErrorResponse>", #headers={"x-amzn-RequestId"=>"c2b33cda-abee-11e4-
8115-b584e1255c70", "Content-Type"=>"text/xml", "Content-Length"=>"371",
"Date"=>"Tue, 03 Feb 2015 21:51:07 GMT"}, #status=400,
#remote_ip="205.251.242.229", #local_port=39733,
#local_address="10.100.6.203">
I have found information here, but I am unsure how to apply it in the Ruby file.
There seem to be few blog posts I can find that explain how to do this.
Can anyone point me in the right direction?
I found this gist that explains the context.
So, the RubyDoc.info link (in the question) said that the Enabled Boolean was optional. But in AWS it is not (or so it seems, based on the error I got).
To resolve this, 'Enabled' => true has to be OUTSIDE of the 'CustomOrigin' hash, at the top level of the post_distribution options.
Hope this helps anyone who is looking for this in the future!
Also, this might be a bug, because the Fog documentation says this option is OPTIONAL, but AWS seems to treat it as mandatory.
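If that reading is right, the call from the question would look roughly like this, with 'Enabled' (and 'DefaultRootObject', which is also a top-level distribution setting) moved out of 'CustomOrigin'; the other values are unchanged from the question:
cdn.post_distribution({
  'CustomOrigin' => {
    'DNSName' => 'hostname.domain.org', # example name
    'HTTPPort' => '80',
    'OriginProtocolPolicy' => 'match-viewer'
  },
  'DefaultRootObject' => '/',
  'Enabled' => true  # top level, as the MalformedXML error demands
})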
I'm having a problem getting the HTMLPurifier bundle to work. I installed it like so:
php artisan bundle:install Sanitizer
Then I edited application/bundles.php:
'sanitize' => array('auto' => true),
When I use it:
$clean_output = Sanitize::purify($bad_input);
I get an Unhandled Exception: Class 'Sanitize' not found
I also noticed that when I try: php artisan bundle:upgrade Sanitize,
I get an error saying bundle not installed.
What am I missing?
Any help greatly appreciated.
This bundle has been set up very badly, especially with the different names and casings. Generally a bundle uses a standard name across the board. What's happening here is that when you install the bundle, it's actually installed to bundles/laravel-htmlpurifier, while what you've defined in your bundles.php file expects it to be at bundles/sanitize.
You have two options.
Option 1
Rename the laravel-htmlpurifier directory to sanitize.
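For example, from the project root (assuming the default bundles directory):
mv bundles/laravel-htmlpurifier bundles/sanitize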
Option 2
Set the location key in the bundles.php file.
'sanitize' => array('auto' => true, 'location' => 'laravel-htmlpurifier')
This should also resolve the upgrading problems you were having. You shouldn't really have to do this as it's normally the responsibility of the bundle author to ensure everything is named correctly and consistently.