Files not sent to server: Illegal invocation at File.remoteFunction - ajax

I have a function that preprocesses a complex object before sending it to the server, to save some space. It creates a copy of the object, and I got this error on submission (AJAX).
It was working before I decided to create a "clean" copy of the object.
Why is this error thrown?

Found out that you can't copy a File object (the spec doesn't allow it), so in my function I had to make every file point to the original File object in order to submit every file:
var prepareData = function (originalObject) {
    // Deep-copy the object; JSON round-tripping stands in here for whatever
    // clone is used, and either way the File objects do not survive the copy.
    var data = JSON.parse(JSON.stringify(originalObject));
    data.id_bs = data.bs.id;
    delete data.bs;
    data.id_Cc = data.cc.id;
    delete data.cc;
    // Added this to make it work: point back to the original File objects
    for (var kDoc = 0; kDoc < originalObject.docs.length; kDoc++) {
        data.docs[kDoc] = originalObject.docs[kDoc];
    }
    return data;
};
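For context, a minimal sketch of how the prepared object might then be submitted; the endpoint URL and field names are placeholders, and jQuery's $.ajax is assumed since the submission is AJAX:

var data = prepareData(originalObject);
var formData = new FormData();

// Plain values can be serialized; the File objects must be appended as-is
formData.append('payload', JSON.stringify({ id_bs: data.id_bs, id_Cc: data.id_Cc }));
for (var i = 0; i < data.docs.length; i++) {
    formData.append('docs[]', data.docs[i]); // the original File objects
}

$.ajax({
    url: '/save',        // placeholder endpoint
    method: 'POST',
    data: formData,
    processData: false,  // let the browser build the multipart body
    contentType: false
});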

Related

Transactions for file operations in Laravel

In Laravel I can do database transactions by passing a closure to the DB::transaction function.
Does Laravel have any support for a similar feature for the File or Storage facade, where the file operations are run in a transaction, with rollback in case a file operation fails?
I'm imagining something like
$files = ... // Something that returns a collection
File::transaction(function () use ($files) {
    $files->each(function () {
        File::move(....);
    });
});
There is no built-in way of doing it, so you'd have to make an implementation yourself.
A simple way of achieving it would be:
$fileName = ""; // or $fileNames ( array ) if multiple file uploads
$files = "" // to be used if you're going to update or delete files. Again if multiple file modifications then use array
try{
/* Just a note, but your new file could overwrite any existing files,
so before uploading, check if another file exists with same filename
And if it does, load that file and keep it in the $files variable
*/
// Upload File
$fileName = // name of uploaded file
$files = // Any existing file you're going to modify. Load the entire file data, not just the name
// Modify/Delete a file
}( \Exception $e ){
// Now delete the file using fileName or $fileNames if the variable is not empty
// If you modified/deleted any file, then undo those modifications using the data in $files ( if it's not empty )
}
In this method, existing files are loaded into memory, but if there are multiple large files it might be better to move them to a temporary location instead and move them back if an exception is thrown. Just don't forget to delete these temporary files once the file transaction succeeds.
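A minimal sketch of that temporary-location approach, assuming Laravel's File facade; the fileTransaction helper and its names are illustrative, not a Laravel API:

use Illuminate\Support\Facades\File;

function fileTransaction(array $paths, callable $callback)
{
    // Back up every file the callback might touch into a scratch directory
    $backupDir = storage_path('app/file-txn-' . uniqid());
    File::makeDirectory($backupDir, 0755, true);

    $backups = [];
    foreach ($paths as $path) {
        if (File::exists($path)) {
            $backup = $backupDir . '/' . md5($path);
            File::copy($path, $backup);
            $backups[$path] = $backup;
        }
    }

    try {
        $callback();
    } catch (\Exception $e) {
        // Roll back: restore every backed-up file, then rethrow
        foreach ($backups as $original => $backup) {
            File::copy($backup, $original);
        }
        throw $e;
    } finally {
        File::deleteDirectory($backupDir);
    }
}

Note that this only restores files that existed beforehand; files created inside the callback would still need to be tracked and deleted on failure, as in the try/catch above.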

TB Plugin errors after updating TB from 38.7.2 to 45.1.0

Several years ago I made a private Thunderbird plugin for automatically processing PayPal emails about subscriptions. The user has to put the PayPal emails in a certain folder, "PaypalMsgs", and the plugin reads them one by one, determines whether each is a payment, a cancellation, etc., and then updates the "Other" field of the person in the address book.
The plugin got broken with the recent update of Thunderbird to 45.1.0 because it cannot find the folder PaypalMsgs any more.
This is the code for finding the folder:
// determine the local root folder
var localRootFolder = Components
    .classes["@mozilla.org/messenger/account-manager;1"]
    .getService(Components.interfaces.nsIMsgAccountManager)
    .localFoldersServer
    .rootFolder;

// start with the root folder to find the folder with the given name
this.ppPaypalFldr = this.findFldrDeep(localRootFolder, "PaypalMsgs");

// recursive function to find a folder fldr with the name fldrName
findFldrDeep: function(fldr, fldrName) {
    if (fldr.name == fldrName) {
        return fldr;
    } else {
        if (fldr.hasSubFolders) {
            var fldrEnum = fldr.subFolders;
            while (fldrEnum.hasMoreElements()) {
                var sfldr = fldrEnum.getNext();
                var result = this.findFldrDeep(sfldr, fldrName);
                if (result) {
                    return result;
                }
            }
        } else {
            return null;
        }
    }
},
When executed nothing happens and TB's error console shows:
Error: TypeError: this.ppPaypalFldr undefined
at the first location where this.ppPaypalFldr is used
It might be an easy thing, like the definition of the nsIMsgAccountManager service having changed, or the folder type suddenly having different functions, but I have a really hard time finding reliable documentation or even the source for TB 45.
Thank you for any hints and support!
After more search, debugging and thinking (sic!) I found the problem:
At the line
var sfldr = fldrEnum.getNext();
The interface is missing, and it looks like something has changed in TB 45 so that the interface is no longer retrieved automatically (the software had worked without this interface for about 4 or 5 years).
So the correct line is:
var sfldr = fldrEnum.getNext().QueryInterface(Components.interfaces.nsIMsgFolder);
I also checked all of the plugin and added all interfaces - now it works like a charm.
Writing the problem here alone has helped me a lot to find the solution ;-)
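For reference, the enumerator loop with the interface applied looks roughly like this; only the getNext() line changes from the code above:

if (fldr.hasSubFolders) {
    var fldrEnum = fldr.subFolders;
    while (fldrEnum.hasMoreElements()) {
        // In TB 45 the enumerator entry must be queried explicitly
        // for the folder interface before it can be used as an nsIMsgFolder
        var sfldr = fldrEnum.getNext()
                            .QueryInterface(Components.interfaces.nsIMsgFolder);
        var result = this.findFldrDeep(sfldr, fldrName);
        if (result) {
            return result;
        }
    }
}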

Need assistance with unfamiliar syntax, error - e is undefined - Google Apps Script (GAS)

I'm using a script exactly like the one in the tutorial here: https://developers.google.com/apps-script/reference/ui/file-upload
However, despite using that syntax, I keep getting "e is undefined" at the statement:
var fileBlob = e.parameter.dsrFile;
I think that means my function doPost(e) is probably wrong somehow. Here is my entire script below.
// Create Menu to Locate .CSV
function doGet(e) {
  var app = UiApp.createApplication().setTitle("Upload CSV");
  var formContent = app.createVerticalPanel();
  formContent.add(app.createFileUpload().setName("dsrFile"));
  formContent.add(app.createSubmitButton("Start Upload"));
  var form = app.createFormPanel();
  form.add(formContent);
  app.add(form);
  return app;
}

// Upload .CSV file
function doPost(e) {
  // data returned is a blob for the FileUpload widget
  var fileBlob = e.parameter.dsrFile;
  var doc = DocsList.createFile(fileBlob);
}
e is undefined because nothing is being passed to doPost. You have to pass the needed object to doPost. Check where you call the function and what parameters, if any, you pass to it; even if you pass a parameter, it may hold an undefined value. Make sure that you are passing the correct objects to your functions.
Your script should work perfectly. e is defined by Google Apps Script; you don't need to pass anything in particular. It contains the fields of your form, in this case the file you uploaded.
I suspect you may be falling foul of the dev URL vs. published URL syndrome, where you are executing an old script rather than the code you are currently working on.
Be sure your script URL ends with '/dev' and not '/exec':
https://script.google.com/a/macros/appsscripttesting.com/s/AKfyck...EY7qzA7m6hFCnyKqg/dev
Let me know if you are still getting the error after running it from the /dev URL.
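One quick way to confirm what doPost actually receives is to log the event object; a minimal debugging sketch, with only the Logger call added to the question's code:

// Upload .CSV file
function doPost(e) {
  // Log the submitted form fields to see whether e arrives populated
  Logger.log(JSON.stringify(e.parameter));
  var fileBlob = e.parameter.dsrFile;
  var doc = DocsList.createFile(fileBlob);
}

The log output (View > Logs in the script editor) should show the dsrFile entry when the form is submitted through the published /dev URL.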

Accessing temporary file from upload in django view

Just as the title says, I want to know how to access the data from the temporary file stored by Django, when a file is uploaded, inside a view.
I want to read the uploaded data values so I can make a progress bar. My methodology is to perform a jQuery getJSON request:
function update_progress_info() {
    $progress.show();
    $.getJSON(progress_url, function(data, status) {
        if (data) {
            var progress = parseInt(data.uploaded) / parseInt(data.length);
            var width = $progress.find('.progress-container').width();
            var progress_width = width * progress;
            $progress.find('.progress-bar').width(progress_width);
            $progress.find('.progress-info').text('uploading ' + parseInt(progress * 100) + '%');
        }
        window.setTimeout(update_progress_info, freq);
    });
};
where progress_url is the view I have that handles the uploaded file data:
# views.py (I don't know what to do here):
def upload_progress(request):
    for line in UploadedFile.temporary_file_path:
        response = (line)
    return response
Django handles uploaded files with upload handlers, defined in settings.py under FILE_UPLOAD_HANDLERS, which defaults to this tuple:
FILE_UPLOAD_HANDLERS = (
    "django.core.files.uploadhandler.MemoryFileUploadHandler",
    "django.core.files.uploadhandler.TemporaryFileUploadHandler",
)
The behavior with file uploads is that if the file is smaller than 2.5 MB it is kept in memory and therefore never written to disk as a temporary file.
If the file is larger, it is written in chunks to the FILE_UPLOAD_TEMP_DIR from settings.py. That's the file you'll have to query to know how many bytes have been uploaded.
You can access the uploaded/uploading files through the request variable in your views, like this: file = request.FILES['file']. There, the file variable will be an UploadedFile, which has a temporary_file_path method returning the path of the file on disk (note: only files larger than 2.5 MB have this method), so from there you can get the size of the file being uploaded.
Another way to do this is to create your own upload handler, such as a ProgressBarUploadHandler, and add it to your file upload handlers; a rough sketch follows below. This is the way the docs recommend, and there are snippets and tutorials around for doing it.
If you need any more info, the docs are really thorough.
I hope you find this helpful. Good luck.
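A sketch of what such a handler could look like, storing progress in the cache so the polling view above can read it; the class name, cache keys, and the X-Progress-ID parameter are illustrative, not a built-in Django API:

from django.core.cache import cache
from django.core.files.uploadhandler import FileUploadHandler
from django.http import JsonResponse


class ProgressBarUploadHandler(FileUploadHandler):
    """Tracks how many bytes of an upload have been received."""

    def handle_raw_input(self, input_data, META, content_length, boundary, encoding=None):
        # The client sends a unique id so the polling view can find the entry
        self.progress_id = self.request.GET.get('X-Progress-ID')
        if self.progress_id:
            self.cache_key = 'upload_progress_%s' % self.progress_id
            cache.set(self.cache_key, {'length': content_length, 'uploaded': 0})
        return None  # let the default handlers consume the stream

    def receive_data_chunk(self, raw_data, start):
        if self.progress_id:
            data = cache.get(self.cache_key)
            data['uploaded'] += len(raw_data)
            cache.set(self.cache_key, data)
        return raw_data  # pass the chunk on unchanged

    def file_complete(self, file_size):
        return None  # this handler never takes ownership of the file

    def upload_complete(self):
        if self.progress_id:
            cache.delete(self.cache_key)


def upload_progress(request):
    # The view polled by the getJSON call; returns {'uploaded': ..., 'length': ...}
    progress_id = request.GET.get('X-Progress-ID')
    data = cache.get('upload_progress_%s' % progress_id) or {}
    return JsonResponse(data)

The handler would be listed before the defaults in FILE_UPLOAD_HANDLERS, and both the upload URL and progress_url would carry the same X-Progress-ID value.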

Does anyone know how to stop T4 (tt) templates from regenerating every single file? Is there a way to flag certain files?

So the tt templates will regenerate every file whenever you save. Now, great, it generates files. However, I am making partial classes to extend other classes, and I only need the files that don't already exist to be generated. The ones that exist, I'd like to preserve. So far I have found not one solid solution googling the globe...
In my code below, the check for existing files doesn't matter, because the template starts by deleting all files first. Then it regenerates them.
Is there a method like "onsave" that I can override?
// BEGIN CODE TO GENERATE EXTENSIONS
<#
foreach (EntityType entity in ItemCollection.GetItems<EntityType>().OrderBy(e => e.Name))
{
string fileName = entity.Name + ".Extension.cs";
string filePath = this.Host.TemplateFile.Substring(0, this.Host.TemplateFile.LastIndexOf(@"\"));
filePath = filePath + @"\Extensions\" + fileName;
if((File.Exists(filePath) && PreserveExistingExtensions == false) || !File.Exists(filePath))
{
fileManager.StartNewFile(fileName);
BeginNamespace(namespaceName, code);
bool entityHasNullableFKs = entity.NavigationProperties.Any(np => np.GetDependentProperties().Any(p=>ef.IsNullable(p)));
#>
<#=Accessibility.ForType(entity)#>
<#=code.SpaceAfter(code.AbstractOption(entity))#>partial class
<#=code.Escape(entity)#><#=code.StringBefore(" : ", code.Escape(entity.BaseType))#>
{
}
<#
EndNamespace(namespaceName);
}
}
fileManager.Process();
#>
I do something similar (partial classes) where I have one file that is always generated, while the custom one is only generated if it doesn't exist. This second one is created as a starting class for customizations. I'll output two files like so:
MyClass.generated.cs
MyClass.cs
MyClass.cs will never be recreated, unless it doesn't exist. MyClass.generated.cs will always be recreated.
I use the T4toolbox to do this; Oleg Sych has actually made this quite easy.
You can check out some sample T4 Templates I built here. Specifically have a look at this one, it's a good example for generated partial classes where one needs to be created every time, and one is only created if it doesn't exist.
The main thing to look at is this line in the code:
var requestBaseMessageCustom = new MessageTemplate(rootNamespace, serviceName + "Request");
requestBaseMessageCustom.Output.File = "Messages/" + serviceName + "Request.cs";
requestBaseMessageCustom.Output.PreserveExistingFile = true;
requestBaseMessageCustom.Render();
Notice the property called PreserveExistingFile; that's the key.
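For illustration, the template class behind such a call could look roughly like this; MessageTemplate here is a hypothetical T4toolbox Template subclass, sketched after Oleg Sych's Template pattern rather than copied from the linked samples:

<#+
// Hypothetical template for the hand-editable half of a partial class
class MessageTemplate : Template
{
    private string rootNamespace;
    private string className;

    public MessageTemplate(string rootNamespace, string className)
    {
        this.rootNamespace = rootNamespace;
        this.className = className;
    }

    public override string TransformText()
    {
#>
namespace <#= rootNamespace #>
{
    // Generated once; PreserveExistingFile keeps later runs from overwriting it
    public partial class <#= className #>
    {
    }
}
<#+
        return this.GenerationEnvironment.ToString();
    }
}
#>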
