Unexpected Behavior in Laravel 7 with Storage::put() and fopen()

I am working on a solo project where admin users will be able to manage virtual machines via a web application running on Laravel 7. I am currently writing the creation controller, which requires uploading large files (.ova images). I am trying to use streams (I think) in a store method.
Specifically, a view has a form with a standard file input that sends the request to the controller. The form gets validated, and then I try to move the file into storage like so:
$data = $this->validator($request);
Storage::put('files', fopen($data['file'], 'r+'));
Here $data['file'] is the uploaded file. I can see the file being uploaded locally (while monitoring HDD usage), but after it is sent, Laravel returns the following error:
fopen(/var/www/html/devel/ocl/storage/app/files): failed to open stream: Is a directory
The error seems pretty obvious, so I was curious and decided to return the fopen() call before storing, to see if I could trigger the same error, by changing the method to:
$data = $this->validator($request);
return var_dump(fopen($data['file'], 'r+'));
However, no error occurs and I get resource(8) of type (stream), as expected (I think). The tmp file disappears and no errors are thrown. I am a bit baffled as to why this is occurring, since I would assume the fopen() call resolves before the storage call. I feel like I'm missing something fundamental about PHP/Laravel and can't think of how to explore the issue further.
Why does the first call seem to treat a path as a directory, while the bare call to fopen() works as intended? I am hoping the answer can help remedy the issue.
Thanks!
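What is likely happening (a reading of the error, not confirmed in the thread): Storage::put() treats its first argument as the destination file path, so the local driver tries to fopen() storage/app/files for writing, and that path is a directory. The bare fopen($data['file'], 'r+') works because an UploadedFile casts to the path of its temporary file. A sketch of a likely fix follows; the method names are standard UploadedFile/Storage APIs, but the destination naming is an assumption:

$data = $this->validator($request);

// open the uploaded tmp file for reading
$stream = fopen($data['file']->getRealPath(), 'r');
// give put() a full destination path, not a directory
Storage::put('files/'.$data['file']->getClientOriginalName(), $stream);
if (is_resource($stream)) {
    fclose($stream); // the underlying driver may leave the handle open
}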

Related

File not downloading laravel 5.6

I've read other questions and implemented answers but haven't had any success.
The flow is: I upload a file, it is stored in the database and locally, and the local files work properly. I implemented the download attribute and was able to get actual files, but that process was scrapped because the downloaded files, while they had the correct names and MIME types, produced an error like "could not open file".
I am primarily using the download method in my file controller. I tried several ideas for getting the actual path to the file, as that seems to be the issue, but even using the storage_path method I get this error.
Thanks!
I didn't understand the pathing for file storage well enough, and was therefore getting incorrect paths to the stored file.
For example, the storage_path method returns the path to the storage directory; I thought it was going to give me the exact path to the file. Once I figured this out, I was able to navigate and fix my issues.
FileController
line 52: storeAs was storing to a redundant path; I changed it to '/upload'.
FileController#download:
return response()->download(storage_path("app/upload/{$file}"));
Following the paths where the files were actually being saved locally, I was able to fix this issue.
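A sketch of the matching store/download pair this answer describes (hypothetical controller code; the field and variable names are mine, not from the original post):

// store: saves the upload under storage/app/upload on the default local disk
$path = $request->file('file')->storeAs('upload', $request->file('file')->getClientOriginalName());

// download: storage_path() only returns the storage directory root,
// so the relative part ("app/upload/...") must be appended by hand
return response()->download(storage_path("app/upload/{$file}"));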

Firefox Extension: Unable to parse JSON data for extension storage

I have written a Firefox extension using the WebExtension APIs. It has passed the preliminary review, but the reviewer said he cannot proceed with the full review because, when he installs it, he gets the following error:
"Unable to parse JSON data for extension storage"
After inspecting for quite some time, I figured out that Firefox creates a file called "storage.js" in the profile folder for each extension, where it writes and reads all of the local storage data for that particular extension. If the extension tries to write to this file before it is created, the error "Unable to write JSON data to extension storage" is thrown; if the extension tries to read from this file before it is created, the error "Unable to parse JSON data for extension storage" is thrown.
Now, my concern is: how do I know for sure that the file has been created and that it can be written to or read from?
PS: This happens only when the extension has just been installed. In subsequent sessions the error won't appear, as the file is no longer missing.
This seems to be a bug in the current Firefox implementation, and your assessment is spot on:
The underlying ExtStorage module will always call read before get, set, etc., and even before write and clear.
read will unconditionally try to access the underlying, extension-specific storage file, which may not exist yet for freshly installed add-ons using the storage API for the first time.
This therefore results in one "Unable to parse JSON data for extension storage" message being logged, no matter what you do with the storage API.
Therefore triggering the message cannot be avoided.
I suggest you do the following:
Contact the editors team, requesting that they re-evaluate your add-on based on the following:
The message in question is really only a warning (when it appears after the first access of the storage API by your add-on).
Even if the message indicated an actual error (corrupt storage), it would still not be your error: the storage API implementation by Mozilla would need to be more resilient, and there would be nothing you could do about it anyway.
The message being issued on first regular use of the storage API, regardless of which WebExtensions add-on uses that API and in what way, is a Mozilla bug, not something you caused or can fix yourself or even work around.
Therefore, denying a full review just because a Mozilla bug erroneously logs a spurious message once, without any other severe effects, is... questionable.
File a bug about this so Mozilla developers can address the issue. You'll want to CC at least Bill McCloskey (:billm), since he wrote that code ;)
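For completeness, a minimal sketch (an addition, not from the original answer) of a first-run initializer: a single set() on install forces the storage file to be created, so later sessions won't hit the missing-file path, even though, per the answer above, the very first storage access still logs the message once.

// background script, WebExtension APIs
browser.runtime.onInstalled.addListener(function () {
  // any first write creates the extension's storage file on disk;
  // the key name "initialized" is arbitrary, chosen for illustration
  browser.storage.local.set({ initialized: true }).then(
    function () { console.log("extension storage initialized"); },
    function (err) { console.error("storage init failed:", err); }
  );
});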

RmGetList consistently returning 0x5 ERROR_ACCESS_DENIED

I am consistently getting a return value of 5 from RmGetList. Any possible reasons?
I am following this tutorial: http://blogs.msdn.com/b/oldnewthing/archive/2012/02/17/10268840.aspx
My RmStartSession and RmRegisterResources calls both return 0, which means success. One note though: after RmStartSession, dwSession is always 0, and my szSessionKey stays a null-terminated string of the form 0ea790d06656a54f84645b5755f7b6d6. Is this a problem?
My code is js-ctypes, so I'm reluctant to share it for a WinAPI question, but I'll share it anyway: https://github.com/Noitidart/_scratchpad/blob/master/_WinAPI-RstrtMgr.js#L293
Edit: I learned that a dwSession of 0 is valid. However, I still can't figure out why I'm getting access denied on RmGetList. Does anyone have any ideas?
I've learned that the Restart Manager doesn't support folders, and that error 5 is returned when you try to pass a folder:
https://blog.yaakov.online/failed-experiment-what-processes-have-a-lock-on-this-folder/
Update: here's some sample code of how to use the restart manager API:
https://github.com/Microsoft/msbuild/blob/master/src/Tasks/LockCheck.cs
RmGetList will also return error 5 if something at a higher level, such as a file system filter driver, blocks access to the file.
If a filter driver denies access to a given file for every process and function, whether kernel or user code tries to access it, the same thing happens to the Restart Manager: it tries to access the file, the driver returns error 5, and the Restart Manager doesn't know what to do with it, so it rethrows the error to the calling function and you get access denied.
If you run this over all files on a Windows volume, many files will throw error 5, including third-party antivirus files, Microsoft Defender files, and so on.
Simply use a try/catch and ignore them, because even if you knew which PID was locking them, you couldn't do anything about it other than watch.
It also happens if you don't even have read access to the locked file; in that case, try fixing the security permissions and trying again.
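For reference, a minimal C sketch of the sequence under discussion (an illustration, not the poster's js-ctypes code): RmStartSession, then RmRegisterResources, then RmGetList, treating ERROR_ACCESS_DENIED as a skippable condition per the answer above. The file path is a placeholder.

#include <windows.h>
#include <RestartManager.h>
#include <stdio.h>
#pragma comment(lib, "Rstrtmgr.lib")

int main(void)
{
    DWORD session = 0; /* a dwSession of 0 is valid */
    WCHAR sessionKey[CCH_RM_SESSION_KEY + 1] = { 0 };
    if (RmStartSession(&session, 0, sessionKey) != ERROR_SUCCESS)
        return 1;

    /* Register a single file; the Restart Manager does not accept folders. */
    LPCWSTR files[] = { L"C:\\path\\to\\some\\file.txt" }; /* placeholder */
    if (RmRegisterResources(session, 1, files, 0, NULL, 0, NULL) == ERROR_SUCCESS)
    {
        UINT needed = 0;
        UINT count = 16; /* size of the info array; ERROR_MORE_DATA means grow it */
        RM_PROCESS_INFO info[16];
        DWORD reasons = RmRebootReasonNone;
        DWORD rc = RmGetList(session, &needed, &count, info, &reasons);
        if (rc == ERROR_SUCCESS)
            wprintf(L"%u process(es) hold the file\n", count);
        else if (rc == ERROR_ACCESS_DENIED)
            wprintf(L"access denied (folder, filter driver, or no read access)\n");
    }

    RmEndSession(session);
    return 0;
}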

Classic ASP Error Handling: Calling a function after an error?

I do maintenance on a classic ASP website that basically has no error handling at all yet, so users see any error message that comes up. Instead, I would like my code to catch any potential error and fire off an email before redirecting the user to a friendlier error page.
This website is rather large, and every page begins with the same include file. So ideally, I would like to set up an error handler at the start of that include file. I haven't had any luck finding a way to do this without going through every page individually and handling errors at the end of each script. Is there a way to code something like this from the include file?
' Include file contents:
Function MyHandler()
    ' Code for triggering email goes here
    Response.Redirect "ErrorPage.asp"
End Function

On Error Call MyHandler()
Thanks in advance!
I suggest using Custom Error Pages in IIS (the web server), if you have access to them. You can redirect different types of errors to different scripts, or point them all at a single script that contains the logic for all error codes.
You can catch common errors there and perhaps redirect the user to an alternative page/site, or return a specific error message. I would also suggest using the custom error page to log the error and some information from the session (e.g. form submit data, query strings, referrer URLs, cookies, etc.) in a database, and/or to send a notification email to a service account, so you can identify specific issues as they occur and have something to go on to actually fix the cause of many of the errors.
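A minimal sketch of such a custom error page, assuming IIS is configured to transfer 500.100 script errors to it (Server.GetLastError only returns details in that setup); the addresses, page names, and omitted CDO/SMTP configuration are assumptions, not from the original answer:

<%@ Language="VBScript" %>
<%
' ErrorPage.asp - registered in IIS as the custom error page for 500.100
Dim objErr
Set objErr = Server.GetLastError()

' Send a notification email (CDO/SMTP configuration omitted here)
Dim objMail
Set objMail = Server.CreateObject("CDO.Message")
objMail.From = "errors@example.com"
objMail.To = "admin@example.com"
objMail.Subject = "Classic ASP error " & objErr.Number
objMail.TextBody = "Error " & objErr.Number & ": " & objErr.Description & _
                   " in " & objErr.File & " at line " & objErr.Line
objMail.Send
Set objMail = Nothing

' Then show the user a friendly page
Response.Redirect "FriendlyError.asp"
%>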

upload huge numbers of files to blob

Do you know the best way to upload a very large number of files to an Azure Blob container?
I am currently uploading multiple files to Azure Blob Storage. The number of files may be huge, like 30,000 or more (each file sized between 10 KB and 1 MB). First, I build a list of file locations, then I use Parallel.ForEach to upload the files. A code snippet:
List<string> locations = ...;
Parallel.ForEach(locations, location =>
{
    ...
    UploadFromStream(...);
    ...
});
The code produces inconsistent results.
Sometimes it runs well and I can see all the files uploaded to the Azure blob container.
Sometimes I get exceptions like this:
Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature., Inner Exception: The remote server returned an error: (403) Forbidden
Sometimes, I got a timeout exception.
I have worked on this issue for several days; unfortunately, I haven't found a perfect solution yet. So I want to know how you handle a similar scenario: what do you do when uploading this many files to Azure Blob Storage?
In the end, I never found out what was wrong with my code.
However, I did find a solution for the issue. I gave up on Parallel.ForEach and just use a plain foreach, and I use the BeginUploadFromStream method instead of UploadFromStream, so the files actually upload asynchronously.
So far it runs perfectly and is much more stable, with no exceptions at all.
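A related sketch of the same idea with modern Task-based calls, assuming the classic WindowsAzure.Storage client library (the one that provides UploadFromStream/BeginUploadFromStream): throttle concurrent uploads with a SemaphoreSlim instead of Parallel.ForEach, which is designed for CPU-bound work. The concurrency cap of 16 is an arbitrary starting point to tune against timeouts.

using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Blob;

class Uploader
{
    static async Task UploadAllAsync(CloudBlobContainer container, List<string> locations)
    {
        var throttle = new SemaphoreSlim(16); // cap concurrent requests
        var tasks = locations.Select(async location =>
        {
            await throttle.WaitAsync();
            try
            {
                // blob name derived from the local file name (an assumption)
                CloudBlockBlob blob = container.GetBlockBlobReference(Path.GetFileName(location));
                using (var stream = File.OpenRead(location))
                {
                    await blob.UploadFromStreamAsync(stream);
                }
            }
            finally
            {
                throttle.Release();
            }
        });
        await Task.WhenAll(tasks);
    }
}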
