I followed https://www.npmjs.com/package/strapi-upload-google-storage to set up the Strapi File Upload plugin with the Google Cloud Storage provider. When I upload a cover image to a post and save, it fails with "TypeError: Converting circular structure to JSON".
```
[2019-02-08T17:54:02.097Z] debug GET 53771798877e88bccc275e15ba634a83.svg (1 ms)
TypeError: Converting circular structure to JSON
    at JSON.stringify ()
    at EventEmitter.stringify (/Users/gwowen/workspace-node/adtalem-cms/node_modules/strapi/node_modules/fast-safe-stringify/index.js:5:15)
    at EventEmitter.asJson (/Users/gwowen/workspace-node/adtalem-cms/node_modules/strapi/node_modules/pino/pino.js:161:22)
    at EventEmitter.pinoWrite (/Users/gwowen/workspace-node/adtalem-cms/node_modules/strapi/node_modules/pino/pino.js:215:16)
    at EventEmitter.LOG (/Users/gwowen/workspace-node/adtalem-cms/node_modules/strapi/node_modules/pino/lib/tools.js:93:10)
    at update (/Users/gwowen/workspace-node/adtalem-cms/plugins/content-manager/controllers/ContentManager.js:83:18)
    at process._tickCallback (internal/process/next_tick.js:68:7)
```
Is this a known bug, or is something not configured right?
Thank you!
Unfortunately there is no working link to the source code for that package, so it's hard to tell what the issue might be. Judging from the readme of the npm package, you need to supply a piece of JSON. Combining that information with your call stack, you might have pasted the wrong piece of JSON.
You need to check the JSON you pasted into the Strapi config for circular references and double-check that you haven't missed anything in your configuration.
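As a first sanity check, you can confirm that the pasted value is valid JSON at all. This is a minimal sketch with a made-up config value (the real provider config will have more fields); note that JSON *text* can never itself be circular, so if parsing succeeds, the pasted config is unlikely to be the source of the circular structure:

```ruby
require 'json'

# Hypothetical stand-in for the JSON you paste into the provider config.
pasted = '{"type": "service_account", "project_id": "my-project"}'

begin
  config = JSON.parse(pasted)
  puts "Valid JSON with keys: #{config.keys.join(', ')}"
rescue JSON::ParserError => e
  puts "Invalid JSON: #{e.message}"
end
```

If the config parses cleanly, the circular structure is more likely coming from an object Strapi tries to log, not from the pasted JSON itself.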
I am trying to download the content of a previous revision of a file using the Google Drive API for Ruby (Google::Apis::DriveV3::DriveService#get_revision):
```ruby
Tempfile.create('file_revision-', encoding: 'ASCII-8BIT') do |temp_file|
  drive_api_service.get_revision(file_id, revision_id, download_dest: temp_file)
  # work with the content
end
```
I keep getting Google::Apis::ClientError with status code 403 and reason phrase "Forbidden". The scope https://www.googleapis.com/auth/drive is enabled, and the error occurs only when I try to download the content of the revision. If I run the code without the download_dest parameter, the request succeeds without any problems. I am also able to fetch any metadata or revision, export files, upload files, etc.; the problem only occurs when I try to fetch the content, via either get_file or get_revision.
Has anyone encountered this issue?
Is there an additional permission that specifically needs to be added to allow content downloading?
I have also looked at other questions on Stack Overflow and checked every Google Drive API configuration option I could find; everything seems to be enabled.
Right now, the Google Drive API does not support content downloading for native Google Drive files (Google Docs, Sheets, etc.), so you can't download their content (current or from an old revision); you can only export it to a different MIME type.
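A small sketch of the distinction described above (the helper name is my own, not part of the gem): native Google files all have MIME types under `application/vnd.google-apps` and carry no downloadable binary content, so they must go through `DriveService#export_file` with a target MIME type, while everything else can be fetched with `#get_file` or `#get_revision`:

```ruby
# Native Google files must be exported; binary files can be downloaded.
GOOGLE_NATIVE_PREFIX = 'application/vnd.google-apps'.freeze

# Hypothetical helper: pick the Drive API call based on the file's MIME type.
def drive_download_method(mime_type)
  mime_type.start_with?(GOOGLE_NATIVE_PREFIX) ? :export_file : :get_file
end

puts drive_download_method('application/vnd.google-apps.document') # export_file
puts drive_download_method('application/pdf')                      # get_file
```

For a Google Doc, that means something like `drive_api_service.export_file(file_id, 'application/pdf', download_dest: temp_file)` rather than `get_file` — and there is no equivalent export for an old revision of a native file.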
I'm trying to create an app in Intune using the Microsoft Graph REST API. I'm able to create the app and the content version, upload a file to Azure Storage, and call the commit action. After that, I wait for the uploadState 'commitFileSuccess', but it returns 'commitFileFailed'.
I saw a similar question, but that one assumes the file encryption is wrong:
commitFileFailed during mobileAppContentFile Commit
However, I have no clue where the error lies. Is there anybody with experience on this particular subject?
If you need more info, please let me know.
I found out what I was doing wrong. I used the PowerShell sample from GitHub as a base to write my own Ruby version, but I overlooked one thing.
https://github.com/microsoftgraph/powershell-intune-samples/tree/master/LOB_Application
I was uploading my APK file to the azureStorageUri in a single request, but I needed to upload it in chunks. After doing that, the uploadState changed to commitFileSuccess.
I will share my Ruby script once I've cleaned it up!
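The chunking step can be sketched like this (a minimal illustration — the chunk size and block-ID format here are my own choices, not taken from the Microsoft sample): split the file into fixed-size blocks, each of which would be PUT to the azureStorageUri with a base64-encoded block ID before the block list is committed.

```ruby
require 'stringio'

CHUNK_SIZE = 6 * 1024 * 1024 # 6 MiB per block (an assumption, not a spec value)

# Yield [block_id, chunk] pairs for a readable IO, chunk by chunk.
def each_chunk(io, chunk_size = CHUNK_SIZE)
  return enum_for(:each_chunk, io, chunk_size) unless block_given?
  index = 0
  while (chunk = io.read(chunk_size))
    block_id = [format('block-%08d', index)].pack('m0') # strict base64, no newline
    yield block_id, chunk
    index += 1
  end
end

# A 10-byte payload with a 4-byte chunk size yields 3 chunks.
chunks = each_chunk(StringIO.new('a' * 10), 4).to_a
puts chunks.size # 3
```

In the real upload, each pair becomes one HTTP PUT against the storage URI, followed by a final request committing the ordered list of block IDs.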
I am trying to set up the YouTube API, but I get a 403.
I have tried the setup several times, without success.
https://www.googleapis.com/youtube/v3/videos?part=snippet,contentDetails&id=-DIJvggBrg8&key=xyz
Maybe someone is able to help me, or even log in to the console to do the setup?
The 403 error by itself is not of immediate use. Its attached error code (as I already pointed out above) sheds light on what actually happened.
The API responds to any query you make with text structured in JSON format. That response text contains the needed error code.
I suggest you proceed with the following steps:
delete the old API key (right now it is still accessible!);
create a new API key and
run your API query using the new key, then post the API response text here.
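Once you have the response text, the nested "reason" field is the error code to look at. This is a hypothetical example of the JSON body that typically accompanies a 403 from the YouTube Data API (the exact message will differ; common reasons include keyInvalid, accessNotConfigured, and quotaExceeded):

```ruby
require 'json'

# Made-up sample of a 403 error body; only the structure matters here.
body = <<~JSON
  {"error": {"code": 403,
             "message": "Access Not Configured.",
             "errors": [{"domain": "usageLimits", "reason": "accessNotConfigured"}]}}
JSON

# The machine-readable error code lives under error.errors[0].reason.
reason = JSON.parse(body).dig('error', 'errors', 0, 'reason')
puts reason # accessNotConfigured
```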
Note that I checked the video -DIJvggBrg8 to which your query refers with my own API key and got no error, just JSON text describing the video.
I uploaded my .docx file as described in the Oracle documentation, but when I GET the same file back from the server it is not in the same format. (The original post included Postman screenshots of the upload request and of the subsequent GET request for the same file.)
Please let me know what I missed. Thanks in advance.
You used POST in your Postman call instead of PUT.
POST in this case is used as the command for creating or updating the metadata, not the object.
Try it with PUT instead.
See the Oracle Cloud Storage documentation for this as well.
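A sketch of the corrected request (the endpoint, namespace, and bucket names below are hypothetical placeholders): the object's raw bytes go in the body of a PUT to the object URI, whereas POST against the same URI is for metadata.

```ruby
require 'net/http'
require 'uri'

# Hypothetical object URI; substitute your own region, namespace, and bucket.
uri = URI('https://objectstorage.us-phoenix-1.oraclecloud.com/v1/my-namespace/my-bucket/report.docx')

request = Net::HTTP::Put.new(uri) # PUT, not POST, for the object content
request['Content-Type'] =
  'application/vnd.openxmlformats-officedocument.wordprocessingml.document'
request.body = 'raw file bytes here' # in practice: File.binread('report.docx')

puts request.method # PUT
```

Sending the file as a multipart form field (Postman's default "form-data" body) would also change the stored bytes; the body should be the raw file content.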
Do you know the best way to upload a very large number of files to an Azure Blob container?
I am currently uploading multiple files to Azure Blob Storage. The number of files may be huge, like 30,000 or more (each file sized between 10 KB and 1 MB). First I build a list of file locations, then I use Parallel.ForEach to upload the files. Code snippet:
```csharp
List<string> locations = ...;
Parallel.ForEach(locations, location =>
{
    ...
    UploadFromStream(...);
    ...
});
```
The code produces inconsistent results.
Sometimes it runs well, and I can see all the files uploaded to the Azure blob container.
Sometimes I get exceptions like this:
Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature., Inner Exception: The remote server returned an error: (403) Forbidden
Sometimes I get a timeout exception.
I have worked on this issue for several days; unfortunately, I haven't found a perfect solution yet. So I want to know: what do you do when handling a similar scenario, uploading a very large number of files to Azure Blob Storage?
In the end, I never found out exactly what was wrong with my code.
However, I did find a solution for the issue. I dropped Parallel.ForEach and just use a plain foreach. I then use the BeginUploadFromStream method instead of UploadFromStream, which uploads the files asynchronously.
So far it runs perfectly: more stable, and without any exceptions.
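An alternative to dropping parallelism entirely is to keep the uploads concurrent but *bounded*, so that tens of thousands of simultaneous requests can't trigger throttling or timeouts. This is a language-agnostic sketch (written in Ruby for illustration; `uploader` is a stand-in for the real blob-upload call, and the worker count is an arbitrary choice):

```ruby
# Upload every location using a fixed pool of worker threads fed from a queue.
def upload_all(locations, workers: 4, &uploader)
  queue = Queue.new
  locations.each { |loc| queue << loc }
  workers.times { queue << :done } # one stop marker per worker

  results = Queue.new
  threads = Array.new(workers) do
    Thread.new do
      while (loc = queue.pop) != :done
        results << uploader.call(loc) # the real blob upload would go here
      end
    end
  end
  threads.each(&:join)

  Array.new(results.size) { results.pop }
end

# Example with a trivial stand-in for the upload:
done = upload_all((1..10).to_a, workers: 3) { |n| n * 2 }
puts done.sort.inspect # [2, 4, 6, 8, 10, 12, 14, 16, 18, 20]
```

Bounding the concurrency (rather than serializing everything) usually keeps throughput high while staying under the service's request-rate limits.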