Retaining file names when using Fine Uploader

I'm trying to implement Fine Uploader for the first time and have a question. I've got the uploader pushing files to my S3 bucket, but each file is being renamed to some sort of generated string.
Here's an example: 4f65aefe-c55b-42b0-afd4-b749c755e7e8.zip
I'd like to keep the original file name. Is that possible?
Here is the script on the page with my current set of params:
var uploader = new qq.s3.FineUploader({
  element: document.getElementById("fineUploader"),
  request: {
    endpoint: "mybucket.amazonaws.com",
    accessKey: "ABCDEFGHIJKLMNOP"
  },
  signature: {
    endpoint: "/wp-content/themes/zone/vendor/fineuploader/php-s3-server/endpoint.php"
  },
  iframeSupport: {
    localBlankPagePath: "/wp-content/themes/zone/success.html"
  },
  cors: {
    expected: true
  },
  chunking: {
    enabled: true
  },
  resume: {
    enabled: true
  }
});
Am I missing something? Thanks in advance.

Yes, this is expected. By default, Fine Uploader S3 uses a UUID to name your object when sending it to an S3 bucket. In almost all cases, this is the safest behavior: if you change this value, you run the risk of overwriting existing files in the event of a name collision. The original file name is attached to the object as an "x-amz-meta-qqfilename" header.
If you must save the object in S3 under a different name, you can set the objectProperties.key option appropriately. A value of "filename" will save the object using the original file name. You can also set the value to a function that determines the name on demand, even using values from some other location, provided your key function returns a Promise. Read more about this option at http://docs.fineuploader.com/api/options-s3.html#objectProperties.key.
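For example, a minimal sketch of both variants against the configuration from the question (the "uploads/" prefix in the commented variant is just an illustration, not anything Fine Uploader requires):
var uploader = new qq.s3.FineUploader({
  element: document.getElementById("fineUploader"),
  // ...same request/signature/cors/chunking/resume options as above...
  objectProperties: {
    // Save objects under their original file name instead of a UUID.
    // Beware: a name collision will overwrite the existing object.
    key: "filename"

    // Or compute the key per file; a string (or a promise for one) may be returned:
    // key: function (fileId) {
    //   return "uploads/" + uploader.getName(fileId);
    // }
  }
});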

Related

How to create this event filter in AWS SQS

This is the actual structure of my current JSON data; note that the events are wrapped in an array ([]):
{"events":[{"type":"message","message":{"type":"text"}}]}
So basically I just want text-message-type data from SQS, but I don't know what the filter should be.
If you are using YAML in the Serverless Framework, I'd try this:
filterPatterns:
  - body: {events: {type: ["message"], message: {type: ["text"]}}}
In case it helps, I had a similar scenario: in my case, I wanted a function to be triggered only when the field "type" inside the body of the SQS message has the value "create" or "delete".
In my case, the following code worked:
filterPatterns:
  - body: {type: [create, delete]}

Using map on returned graphql query is making known members undefined

I'm using Gatsby.js to build a blog and I can't use the onCreatePage API to pass data from my GraphQL query into page templates.
My query grabs data from Kentico Cloud and looks like this:
{
  allKenticoCloudTypeBlogPost {
    edges {
      node {
        contentItems {
          elements {
            url_slug {
              value
            }
          }
        }
      }
    }
  }
}
This is a valid query and it returns data that looks like this.
The problem comes in my gatsby-node.js file where I want to utilize this query to build out pages using my predefined template.
Specifically in the createPage method which looks like this.
result.data.allKenticoCloudTypeBlogPost.edges.map(({ node }) => {
  createPage({
    path: `${node.contentItems.elements.url_slug.value}`,
    component: path.resolve(`./src/templates/blog-post.js`),
    context: {
      slug: node.contentItems.elements.url_slug.value,
    }
  })
});
The error that displays is the following.
TypeError: Cannot read property 'url_slug' of undefined
gatsby-node.js:31 result.data.allKenticoCloudTypeBlogPost.edges.map
C:/Users/xxxx/Desktop/Marketing Repos/xxxx/gatsby-node.js:31:57
I decided to investigate with a console.table on node.contentItems, since the elements part appears to be where it gets tripped up.
The result of console.table(node.contentItems) just before the createPage method is this.
It appears that node.contentItems has a member called url_slug rather than the elements member that I expected.
I thought I could then solve my problem by updating my createPage method call like so.
result.data.allKenticoCloudTypeBlogPost.edges.map(({ node }) => {
  console.table(node.contentItems);
  createPage({
    path: `${node.contentItems.url_slug.value}`,
    component: path.resolve(`./src/templates/blog-post.js`),
    context: {
      slug: node.contentItems.url_slug.value,
    }
  })
});
But then I get an error saying
TypeError: Cannot read property 'value' of undefined.
I truly don't understand how I can do a table log and see the url_slug member, but then have it reported as undefined when I try to access it. All the while I know my query is correct, because I can run it in GraphiQL and get back exactly the data I expect.
Any help would be appreciated. Thank you.
In your query result, node.contentItems is an array, even though you're trying to access it as if it's an object:
path: `${node.contentItems.elements.url_slug.value}`,
^^^^^^^^
console.log(contentItems) // [ { elements: {...} }, { elements: {...} }, ... ]
I think your confusion stems from the way console.table displays data; it's confusing if you don't already know the shape of your data. Your screenshot shows that the object has 4 entries with indexes 0 -> 3 (so it is likely an array), each of which has a single property called elements (listed in the table header), which in turn is an object whose only property is url_slug.
I'm not familiar with KenticoCloud, but maybe your posts are nested in contentItems, in which case you should loop over it:
result.data.allKenticoCloudTypeBlogPost.edges.map(({ node }) => {
  node.contentItems.forEach(({ elements }) => {
    createPage({
      path: elements.url_slug.value,
      component: path.resolve(`./src/templates/blog-post.js`), // same template as in the question
      context: { slug: elements.url_slug.value },
    })
  })
});
Is there a reason you are wrapping node with curly brackets in your map argument?
You might have already tried this, but my first intuition would be to do this instead:
result.data.allKenticoCloudTypeBlogPost.edges.map(node => {
  console.log(node.contentItems)
  createPage({
    path: `${node.contentItems.elements.url_slug.value}`,
    component: path.resolve(`./src/templates/blog-post.js`),
    context: {
      slug: node.contentItems.elements.url_slug.value,
    }
  })
});

GraphQL: how to have it return a flexible, dynamic array, depending on what the marketeer filled in? [duplicate]

We are in the situation that the response of our GraphQL query has to return some dynamic properties of an object. In our case we are not able to predefine all possible properties, so it has to be dynamic.
As we see it, there are two options to solve this:
const MyType = new GraphQLObjectType({
  name: 'SomeType',
  fields: {
    name: {
      type: GraphQLString,
    },
    elements: {
      /*
        THIS is our special field which needs to return a dynamic object
      */
    },
    // ...
  },
});
As you can see in the example code, elements is the property that has to return a dynamic object. A response, when resolved, could look like this:
{
  name: 'some name',
  elements: {
    an_unknown_key: {
      some_nested_field: {
        some_other: true,
      },
    },
    another_unknown_prop: 'foo',
  },
}
1) Return an "any" object
We could just return any object, so GraphQL would not need to know which fields the object has. But when we tell GraphQL that the field is of type GraphQLObjectType, it requires fields to be defined. Because of this, it does not seem possible to tell GraphQL that something is just an object.
For this we changed it like this:
elements: {
  type: new GraphQLObjectType({ name: 'elements' }),
},
2) Define dynamic field properties, since fields can be a function
When we define fields as a function, we can build our object dynamically. But the fields function would need some information (in our case, the information that would be passed to elements), and we would need to access it to build the fields object.
Example:
const MyType = new GraphQLObjectType({
  name: 'SomeType',
  fields: {
    name: {
      type: GraphQLString,
    },
    elements: {
      type: new GraphQLObjectType({
        name: 'elements',
        fields: (argsFromElements) => {
          // here we can now access keys from "args"
          const fields = {};
          argsFromElements.keys.forEach((key) => {
            // some logic here ..
            fields[someGeneratedProperty] = someGeneratedGraphQLType;
          });
          return fields;
        },
      }),
      args: {
        keys: {
          type: new GraphQLList(GraphQLString),
        },
      },
    },
    // ...
  },
});
This could work, but the question is whether there is a way to pass the args and/or the resolved object to the fields function.
Question
So our question is: which approach is recommended in GraphQL, and is solution 1 or 2 even possible? Maybe there is another solution?
Edit
Solution 1 would work when using a ScalarType. Example:
type: new GraphQLScalarType({
  name: 'elements',
  serialize(value) {
    return value;
  },
}),
I am not sure if this is a recommended way to solve our situation.
Neither option is really viable:
GraphQL is strongly typed. GraphQL.js doesn't support an "any" field, and all object types defined in your schema must have fields defined. If you look at the docs, fields is required; if you try to leave it out, you'll hit an error.
Args are used to resolve queries on a per-request basis. There's no way to pass them back to your schema. Your schema is supposed to be static.
As you suggest, it's possible to accomplish what you're trying to do by rolling your own custom scalar. I think a simpler solution would be to just use JSON: you can import a custom scalar for it like this one. Then just have your elements field resolve to a JSON object or array containing the dynamic fields. You could also manipulate the JSON object inside the resolver based on arguments if necessary (for example, if you wanted to limit the fields returned to a subset defined in the args).
Word of warning: the issue with using JSON, or any custom scalar that includes nested data, is that you limit the client's flexibility in requesting what it actually needs. It also results in less helpful errors on the client side; I'd much rather be told that the field I requested doesn't exist or returned null when I make the request than find out later down the line that the JSON blob I got didn't include a field I expected it to.
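For example, a minimal sketch using the graphql-type-json package (an assumption: the answer's link is not preserved here, and the package's export shape varies by version; also assumed is that the raw dynamic data is available on the parent object as source.elements):
const { GraphQLObjectType, GraphQLString } = require('graphql');
// graphql-type-json provides a ready-made JSON scalar (named export in recent versions).
const { GraphQLJSON } = require('graphql-type-json');

const MyType = new GraphQLObjectType({
  name: 'SomeType',
  fields: {
    name: { type: GraphQLString },
    elements: {
      // Any JSON-serializable value passes through the scalar unchanged.
      type: GraphQLJSON,
      resolve: (source, args) => {
        // Optionally narrow the dynamic keys here based on args.
        return source.elements;
      },
    },
  },
});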
One more possible solution is to declare any such dynamic object as a string, pass a stringified version of the object as the value from your resolver functions, and eventually parse that string back into a JSON object on the client side.
I'm not sure whether it's a recommended approach or not, but I tried it and it worked smoothly, so I'm sharing it here.
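A minimal sketch of that approach, again assuming the dynamic data lives on source.elements:
const { GraphQLObjectType, GraphQLString } = require('graphql');

const MyType = new GraphQLObjectType({
  name: 'SomeType',
  fields: {
    name: { type: GraphQLString },
    elements: {
      // Declared as a plain string; the resolver serializes the object.
      type: GraphQLString,
      resolve: (source) => JSON.stringify(source.elements),
    },
  },
});

// Client side: parse the string back into an object.
// const elements = JSON.parse(result.data.someType.elements);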

Type error with internationalization feature of sails.js based on i18n

I'm trying to use the internationalization feature of sails based on i18n.
In my controller it works well. However, I would like to set this up in my model definition.
Please see the code below:
module.exports = {
  attributes: {
    name: {
      type: 'string',
      required: true,
      displayName: sails.__("test")
    },
    ....
Unfortunately it does not work. I have the error below:
displayName: sails.__("test")
^
TypeError: Object [a Sails app] has no method '__'
Would you have an idea?
Any help will be very much appreciated.
Thanks,
displayName: sails.__("test")
You are trying to invoke the internationalization function statically; that is, you're seeing the error because that function runs the moment your .js file is require()d by Node.js, before Sails has finished loading.
There are two ways you can go about solving this problem.
1. Translate the value on each query
If you'd like to store the original value of displayName and instead internationalize it each time you query the model, you can override toJSON().
Instead of writing custom code for every controller action that uses a particular model (including the "out of the box" blueprints), you can manipulate outgoing records by simply overriding the default toJSON function in your model.
For example:
attributes: {
  name: {
    type: 'string',
    required: true,
  },
  getDisplayName: function () {
    return sails.__(this.name);
  },
  toJSON: function () {
    var obj = this.toObject();
    obj.displayName = sails.__(this.name);
    return obj;
  },
  ...
}
2. Translate the value before create
You can use the Waterline lifecycle callbacks to translate the value into a particular language before the model is saved to the database.
Sails exposes a handful of lifecycle callbacks on models that are called automatically before or after certain actions. For example, we sometimes use lifecycle callbacks for automatically encrypting a password before creating or updating an Account model.
attributes: {
  name: {
    type: 'string',
    required: true,
  },
  displayName: {
    type: 'string'
  },
  ...
},
beforeCreate: function (model, next) {
  model.displayName = sails.__(model.name);
  next();
}
The internationalized value of displayName will now be set on your model before it is inserted into the database.
Let me know how this works out for you.
Your solution is interesting. However, I would like to have a display name for each property.
module.exports = {
  attributes: {
    name: {
      type: 'string',
      required: true,
      displayName: "Your great name"
    },
    adress: {
      type: 'string',
      required: true,
      displayName: "Where do you live?"
    },
    ....
So is there a simple, clean way to apply sails.__() to each attribute's display name?
Thanks,
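In case it helps, a minimal sketch in the spirit of the toJSON() approach above; the displayNames map is a hypothetical helper object, not a Waterline feature:
// Hypothetical sketch: keep untranslated display names out of the
// attribute definitions and translate them per request in toJSON().
var displayNames = {
  name: 'Your great name',
  adress: 'Where do you live?'
};

module.exports = {
  attributes: {
    name: { type: 'string', required: true },
    adress: { type: 'string', required: true },
    toJSON: function () {
      var obj = this.toObject();
      obj.displayNames = {};
      Object.keys(displayNames).forEach(function (attr) {
        // Translate each display name for the current locale on every query.
        obj.displayNames[attr] = sails.__(displayNames[attr]);
      });
      return obj;
    }
  }
};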

Bloodhound does not cache data from remote fetches in local storage

I am trying to load autocompletion information for people's names into typeahead, and then not have to query the server again if I already have a result.
For example, if I search for a person's name and the data for that person (among others) gets retrieved by a remote query, then when I delete the name and search for the surname instead, I want the previously cached names with that surname to show up. What actually happens is that the results are retrieved from the server again and then suggested.
Caching only works while typing a single word ("Mic" -> "Mich" -> "Micha" -> "Michael").
TL;DR: I want to cache results from Bloodhound in localStorage, not only from prefetch (which cannot be applied to my situation) but from remote as well, and use that cache before querying remote again.
What I currently have is:
function dispkey(suggestion_object) {
  console.log(suggestion_object);
  return suggestion_object["lastname"] + ", " + suggestion_object["firstname"];
}

var engine = new Bloodhound({
  name: 'authors',
  local: [],
  remote: 'http://xxxxxx.xxx/xxxx/xxxxxxxxxx?query=%%QUERY',
  datumTokenizer: function (d) {
    return Bloodhound.tokenizers.whitespace(d.val);
  },
  queryTokenizer: function (s) {
    return s.split(/[ ,]+/);
  },
});

engine.initialize();

$('.typeahead').typeahead({
  highlight: true,
  hint: true,
  minLength: 3,
},
{
  displayKey: dispkey,
  templates: {
    suggestion: Handlebars.compile([
      '<p id="author_autocomplete_email_field">{{email}}</p>',
      '<p id="author_autocomplete_name_field">{{lastname}} {{firstname}}</p>',
    ].join(''))
  },
  source: engine.ttAdapter(),
});
I haven't found anything similar, and I am afraid there is no trivial solution to this.
P.S.: I also noticed that datumTokenizer never gets called:
datumTokenizer: function (d) {
  console.log("Lalalalala");
  return Bloodhound.tokenizers.whitespace(d.val);
},
When I used this, "Lalalalala" was never output to the Chrome debug console.
As jharding mentioned, it's not possible to have remote suggestions pulled from localStorage at this point.
However, I recently worked on a small project where I needed to store previous form inputs for future use in typeahead.js. To do this, I saved an array of form input values to localStorage.
var inputs = ['val1', 'val2', 'val3', 'val4'];
localStorage.setItem('values', JSON.stringify(inputs));
I then retrieved the array for use in the typeahead field.
var data = JSON.parse(localStorage.getItem('values'));

$('input').typeahead({
  minLength: 3,
  highlight: true,
},
{
  name: 'data',
  displayKey: 'value',
  source: this.substringMatcher(data)
});
You can view my full source here.
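For reference, substringMatcher is not defined in the snippet above (it lives in the linked source); the typeahead.js documentation includes a matcher along these lines, shown here as a standalone function returning {value: str} objects so it works with displayKey: 'value':
var substringMatcher = function (strs) {
  // Returns a typeahead.js "source" function that feeds strings
  // containing the query substring to the suggestion callback.
  return function findMatches(q, cb) {
    var matches = [];
    var substrRegex = new RegExp(q, 'i'); // case-insensitive match on the query
    $.each(strs, function (i, str) {
      if (substrRegex.test(str)) {
        matches.push({ value: str });
      }
    });
    cb(matches);
  };
};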
