My server generates a UUID for each uploaded file, so I need to set that UUID on the fileState after I receive the answer from the upload server (so that the delete function works). I added and implemented
setUuid: function(id, uuid)
in UploadHandler, FineUploaderBasic, and UploadHandlerXhr to solve this issue, but that involves editing the fine-uploader sources. Is there another way around this? I have a feeling this could break something internally.
I would suggest not passing the UUID back to fine uploader. It would be simpler to associate your UUID with fine uploader's UUID server side. You could maintain a map of associations in the session if you don't want to persist them.
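For example, a hedged sketch of that server-side association in C# (assuming ASP.NET session state; the session key and variable names are made up):
// Map Fine Uploader's UUID (sent with each upload/delete request) to the UUID your server generated.
var map = Session["uuidMap"] as Dictionary<string, Guid> ?? new Dictionary<string, Guid>();
map[fineUploaderUuid] = serverGeneratedUuid;
Session["uuidMap"] = map;

// Later, when Fine Uploader issues a delete request, translate back:
Guid serverUuid = map[fineUploaderUuid];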
I encountered this issue and found documentation stating that you can set the UUID from the server side by returning it in the response from the upload method; the response sets a newUuid property, which Fine Uploader then uses for the file.
Here is my return method using C#
return new FineUploaderResult(true, new { newUuid = attachmentId });
I'm trying to handle backwards compatibility with my GraphQL API.
We have on-premise servers that get periodically updated based on when they connect to the internet. We have a mobile app that talks to the on-premise server.
Problem
We run into an issue when the mobile app is up to date but the on-premise server isn't. When a change in the schema occurs, it causes issues.
Example
Product version 1
type Product {
name: String
}
Product version 2
type Product {
name: String
account: String
}
New version of mobile app asks for:
product(id: "12345") {
name
account
}
Because account is not valid in version 1, I get the error:
"Cannot query field \"account\" on type \"Product\"."
Does anyone know how I can avoid this issue so I don't receive this particular error? I'm totally fine with account coming back as null, or some other plan of attack for updating schemas. But having it completely blow up with no response is not good.
Your question did not specify what you're actually using on the backend. But it should be possible to customize the validation rules a GraphQL service uses in any implementation based on the JavaScript reference implementation. Here's how you do it in GraphQL.js:
const { execute, parse, specifiedRules, validate } = require('graphql')

// Drop the rule that rejects queries selecting fields the schema does not define
const validationRules = specifiedRules.filter(rule => rule.name !== 'FieldsOnCorrectType')

const document = parse(someQuery)
const errors = validate(schema, document, validationRules)

// Unknown fields are simply skipped by the executor (assumes an async context for await)
const data = await execute({ schema, document })
By omitting the FieldsOnCorrectType rule, you won't get any errors and unrecognized fields will simply be left off the response.
But you really shouldn't do that.
Modifying the validation rules will result in spec-breaking changes to your server, which can cause issues with client libraries and other tools you use.
This issue really boils down to your deployment process. You should not push new versions of the client that depend on a newer version of the server API until that version is deployed to the server. Period. That would hold true regardless of whether you're using GraphQL, REST, SOAP, etc.
My app creates mails with attachments, and uses an intent with Intent.ACTION_SEND to launch a mail app.
It works with all the mail apps I tested with, except for the new Gmail 5.0 (it works with Gmail 4.9), where the mail opens without attachment, showing the error: "Permission denied for the attachment".
There are no useful messages from Gmail on logcat. I only tested Gmail 5.0 on Android KitKat, but on multiple devices.
I create the file for the attachment like this:
String fileName = "file-name_something_like_this";
FileOutputStream output = context.openFileOutput(
        fileName, Context.MODE_WORLD_READABLE);
// Write data to output...
output.close();
File fileToSend = new File(context.getFilesDir(), fileName);
I'm aware of the security concerns with MODE_WORLD_READABLE.
I send the intent like this:
public static void compose(
        Context context,
        String address,
        String subject,
        String body,
        File attachment) {
    Intent emailIntent = new Intent(Intent.ACTION_SEND);
    emailIntent.setType("message/rfc822");
    emailIntent.putExtra(
            Intent.EXTRA_EMAIL, new String[] { address });
    emailIntent.putExtra(Intent.EXTRA_SUBJECT, subject);
    emailIntent.putExtra(Intent.EXTRA_TEXT, body);
    emailIntent.putExtra(
            Intent.EXTRA_STREAM,
            Uri.fromFile(attachment));
    Intent chooser = Intent.createChooser(
            emailIntent,
            context.getString(R.string.send_mail_chooser));
    context.startActivity(chooser);
}
Is there anything I'm doing wrong when creating the file or sending the intent? Is there a better way to start a mail app with an attachment? Alternatively, has someone encountered this problem and found a workaround for it?
Thanks!
I was able to pass a screenshot .jpeg file from my app to GMail 5.0 through an Intent. The key was in this answer.
Everything I have is nearly identical to natasky's code, except that I use the following as the file's directory:
context.getExternalCacheDir();
Which "represents the external storage directory where you should save cache files" (documentation)
GMail 5.0 added some security checks to attachments it receives from an Intent. These are unrelated to unix permissions, so the fact that the file is readable doesn't matter.
When the attachment Uri is a file://, it'll only accept files from external storage, the private directory of gmail itself, or world-readable files from the private data directory of the calling app.
The problem with this security check is that it relies on Gmail being able to find the caller app, which is only reliable when the caller has asked for a result. In your code above, you do not ask for a result, and therefore Gmail does not know who the caller is and rejects your file.
Since it worked for you in 4.9 but not in 5.0, you know it's not a unix permission problem, so the reason must be the new checks.
TL;DR answer:
replace startActivity with startActivityForResult.
Or better yet, use a content provider.
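A minimal sketch of the first option, assuming the Context passed to compose() is actually an Activity so Gmail can identify the caller (the request code is arbitrary):
private static final int REQUEST_SEND_MAIL = 1001; // any value; Gmail just needs a caller to report back to

// Instead of context.startActivity(chooser):
((Activity) context).startActivityForResult(chooser, REQUEST_SEND_MAIL);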
Use getExternalCacheDir() with File.createTempFile() to create a temporary file in the external cache directory:
File tempFile = File.createTempFile("fileName", ".txt", context.getExternalCacheDir());
Then copy your original file's content to tempFile:
// Copy the original file (Data.ERR_BAK_FILE) into the temp file, character by character
FileWriter fw = new FileWriter(tempFile);
FileReader fr = new FileReader(Data.ERR_BAK_FILE);
int c = fr.read();
while (c != -1) {
    fw.write(c);
    c = fr.read();
}
fr.close();
fw.flush();
fw.close();
Now put the temp file on the intent:
emailIntent.putExtra(Intent.EXTRA_STREAM, Uri.fromFile(tempFile));
You should implement a FileProvider, which can create content:// Uris for your app's internal files. Other apps are granted permission to read these Uris. Then, instead of calling Uri.fromFile(attachment), you use FileProvider's static helper (the authority must match your manifest entry):
FileProvider.getUriForFile(context, "com.example.fileprovider", attachment);
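A hedged sketch of wiring that into the compose() method above (the authority string is hypothetical and must match the android:authorities value of the <provider> entry in your manifest, with android:grantUriPermissions="true" set):
// Assumes a <provider> entry for the support library's FileProvider is declared in the manifest.
Uri contentUri = FileProvider.getUriForFile(context, "com.example.fileprovider", attachment);
emailIntent.putExtra(Intent.EXTRA_STREAM, contentUri);
// Let the receiving mail app read the content:// Uri.
emailIntent.addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION);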
Google has an answer for this issue:
Store the data in your own ContentProvider, making sure that other apps have the correct permission to access your provider. The preferred mechanism for providing access is to use per-URI permissions which are temporary and only grant access to the receiving application. An easy way to create a ContentProvider like this is to use the FileProvider helper class.
Use the system MediaStore. The MediaStore is primarily aimed at video, audio and image MIME types, however beginning with Android 3.0 (API level 11) it can also store non-media types (see MediaStore.Files for more info). Files can be inserted into the MediaStore using scanFile() after which a content:// style Uri suitable for sharing is passed to the provided onScanCompleted() callback. Note that once added to the system MediaStore the content is accessible to any app on the device.
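A hedged sketch of the MediaStore route (it assumes the attachment has already been copied somewhere on external storage; the variable names are made up):
MediaScannerConnection.scanFile(
        context,
        new String[] { fileOnExternalStorage.getAbsolutePath() },
        null, // let the scanner infer the MIME type
        new MediaScannerConnection.OnScanCompletedListener() {
            @Override
            public void onScanCompleted(String path, Uri contentUri) {
                // contentUri is a content:// Uri that can be put in EXTRA_STREAM.
                // The callback is asynchronous, so build and start the chooser from here.
            }
        });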
You can also try granting read permission on the intent:
emailIntent.addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION);
And finally, you can copy/store your files in external storage; permissions are not needed there.
I tested it and found that it was definitely a private-storage access problem.
When you attach a file in Gmail (5.0 and above), do not use a file from private storage such as /data/data/package/. Try to use /storage/sdcard instead.
You will then be able to attach your file successfully.
Not sure why GMail 5.0 doesn't like certain file paths (which I've confirmed it does have read access to), but an apparently better solution is to implement your own ContentProvider class to serve the file. It's actually somewhat simple, and I found a decent example here: http://stephendnicholas.com/archives/974
Be sure to add the <provider> tag to your app manifest, and include android:grantUriPermissions="true" within it. You'll also want to implement getType() and return the appropriate MIME type for the file URI, otherwise some apps won't work with this. There's an example of that in the comment section of the link.
I was having this problem and finally found an easy way to send an email with an attachment. Here is the code:
public void SendEmail(){
    try {
        // Save the image to the app's cache directory with a time-based file name
        String randomNameOfPic = String.valueOf(System.currentTimeMillis());
        File file = new File(ActivityRecharge.this.getCacheDir(), "slip" + randomNameOfPic + ".jpg");
        FileOutputStream fOut = new FileOutputStream(file);
        myPic.compress(Bitmap.CompressFormat.JPEG, 100, fOut);
        fOut.flush();
        fOut.close();
        // Make the file readable by the receiving app
        file.setReadable(true, false);

        // Send the email
        Intent intent = new Intent(Intent.ACTION_SEND);
        intent.setType("text/plain");
        intent.putExtra(Intent.EXTRA_EMAIL, new String[]{"zohabali5@gmail.com"});
        intent.putExtra(Intent.EXTRA_SUBJECT, "Recharge Account");
        intent.putExtra(Intent.EXTRA_TEXT, "body text");
        //Uri uri = Uri.parse("file://" + fileAbsolutePath);
        intent.putExtra(Intent.EXTRA_STREAM, Uri.fromFile(file));
        intent.addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION);
        startActivityForResult(Intent.createChooser(intent, "Send email..."), 12);
    } catch (Exception e) {
        Toast.makeText(ActivityRecharge.this, "Unable to open Email intent", Toast.LENGTH_LONG).show();
    }
}
In this code "myPic" is bitmap which was returned by camera intent
Step 1: Add the authority to your attachment URI
Uri uri = FileProvider.getUriForFile(context, "com.yourpackage", file);
This must be the same as the provider authority in your manifest:
android:authorities="com.yourpackage"
Step 2: Add a flag to allow reading
myIntent.setFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION);
I have a simple Xamarin Forms app. I've now got a simple POCO object (e.g. a User instance, or a list of the most recent tweets or orders or whatever).
How can I store this object locally on the device? Let's imagine I serialize it as JSON.
Also, how secure is this data? Is it part of Keychains, etc.? Auto backed up?
cheers!
You have a couple of options.
SQLite. This option is cross-platform and works well if you have a lot of data. You get the added bonus of transaction support and async support as well (see the sketch at the end of this answer). EDIT: In the past I suggested using SQLite.Net-PCL. Due to issues involving Android 7.0 support (and an apparent sunsetting of support), I now recommend the project it was originally forked from: sqlite-net
Local storage. There's a great nuget that supports cross-platform storage. For more information see PCLStorage
There's also Application.Current.Properties implemented in Xamarin.Forms that allow simple Key-Value pairs of data.
I think you'll have to investigate and find out which route serves your needs best.
As far as security, that depends on where you put your data on each device. Android stores app data in a secure app folder by default (not all that secure if you're rooted). iOS has several different folders for data storage based on different needs. Read more here: iOS Data Storage
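To illustrate the SQLite option above, here is a minimal sketch using sqlite-net (the Tweet class and dbPath are hypothetical; the API is SQLiteConnection from the sqlite-net / sqlite-net-pcl package):
using System.Linq;
using SQLite;

public class Tweet
{
    [PrimaryKey, AutoIncrement]
    public int Id { get; set; }
    public string Text { get; set; }
}

// dbPath would normally point into the app's local data folder, e.g.
// Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData), "app.db3")
var db = new SQLiteConnection(dbPath);
db.CreateTable<Tweet>();                                  // no-op if the table already exists
db.Insert(new Tweet { Text = "Hello from sqlite-net" });  // single-row INSERT
var recent = db.Table<Tweet>().ToList();                  // simple SELECT of all rows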
Another option is the Xamarin Forms settings plugin.
E.g. if you need to store a user instance, just serialize it to JSON when storing and deserialize it when reading.
It uses the native settings management:
Android: SharedPreferences
iOS: NSUserDefaults
Windows Phone: IsolatedStorageSettings
Windows RT / UWP: ApplicationDataContainer
public User CurrentUser
{
    get
    {
        User user = null;
        var serializedUser = CrossSettings.Current.GetValueOrDefault<string>(UserKey);
        if (serializedUser != null)
        {
            user = JsonConvert.DeserializeObject<User>(serializedUser);
        }
        return user;
    }
    set
    {
        CrossSettings.Current.AddOrUpdateValue(UserKey, JsonConvert.SerializeObject(value));
    }
}
EDIT:
There is a new solution for this. Just use Xamarin.Essentials.
Preferences.Set(UserKey, JsonConvert.SerializeObject(value));
var user = JsonConvert.DeserializeObject<User>(Preferences.Get(UserKey, "default_value"));
Please use Xamarin.Essentials
The Preferences class helps to store application preferences in a key/value store.
To save a value:
Preferences.Set("my_key", "my_value");
To get a value:
var myValue = Preferences.Get("my_key", "default_value");
If you want to store a simple value, such as a string, follow this example code.
Setting the value of totalSeats.Text to the "SeatNumbers" key on page 1:
Application.Current.Properties["SeatNumbers"] = totalSeats.Text;
await Application.Current.SavePropertiesAsync();
Then you can simply get the value from any other page (page 2):
var value = Application.Current.Properties["SeatNumbers"].ToString();
Additionally, you can set that value to another Label or Entry etc.
SeatNumbersEntry.Text = value;
If it's key-value (single value) data storage, follow the code below:
Application.Current.Properties["AppNumber"] = "123";
await Application.Current.SavePropertiesAsync();
Getting the same value
var value = Application.Current.Properties["AppNumber"];
I am doing an upload via CORS to Amazon S3 with the Kendo Upload control. I'm having an issue with the fact that I need to grab a signature from my server and then add it to the 'data' for the event object of the 'upload' handler I created. The problem is, of course, that in the handler I fire off an async request to get the signature, and the upload handler continues on its merry way without the signature data I need. The published API has no upload() command or anything similar that I could call when my async request returns.
I saw an ASP-Kendo-S3 example somewhere, but it's not exactly clear from that code how the signature is being obtained, and of course I'm not using ASP.
Kendo Upload has an onUpload event. In Kendo's asp.net example there really isn't anything specific to that framework that wouldn't port to anything else.
They populate the page initially with the profile (base64 encoded JSON).
To get a signature for that base64-encoded JSON profile, they use this method (C#):
private static string Sign(string text, string key)
{
    var signer = new HMACSHA1(Encoding.UTF8.GetBytes(key));
    return Convert.ToBase64String(signer.ComputeHash(Encoding.UTF8.GetBytes(text)));
}
It looks pretty self-explanatory, to the point where you could port it to another language.
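If you need to fetch the signature on demand instead of embedding it at page-render time, here is a hedged sketch of exposing that helper as a small endpoint (ASP.NET MVC-style controller; the route, key source, and names are hypothetical):
using System;
using System.Configuration;
using System.Security.Cryptography;
using System.Text;
using System.Web.Mvc;

public class S3SignatureController : Controller
{
    // Hypothetical: the AWS secret key comes from server-side configuration, never from the client.
    private static readonly string SecretKey = ConfigurationManager.AppSettings["AwsSecretKey"];

    [HttpPost]
    public ActionResult Sign(string policy)
    {
        return Content(Sign(policy, SecretKey), "text/plain");
    }

    private static string Sign(string text, string key)
    {
        var signer = new HMACSHA1(Encoding.UTF8.GetBytes(key));
        return Convert.ToBase64String(signer.ComputeHash(Encoding.UTF8.GetBytes(text)));
    }
}
The client can then request a signature for its base64-encoded policy before starting the upload and attach it to the upload request.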
I've set up a db4o server and client. My client calls are not working.
What's puzzling to me is that the db4o examples show how the client can close the server, but not how to get and save data. See: http://community.versant.com/Documentation/Reference/db4o-7.12/java/reference/Content/client-server/networked/simple_db4o_server.htm
If I start the db4o server, and run netstat, I can see that the port is open. And on the client, I can do things like this:
Debug.WriteLine(db.Ext().IsClosed().ToString());
And that returns False, which is good.
But when I try to get or save data, it doesn't work. When saving data, it appears to work, but I don't see the data in the DB. When trying to retrieve the object, I get this error:
Db4objects.Db4o.Ext.Db4oException: Exception of type 'Db4objects.Db4o.Ext.Db4oException' was thrown. ---> System.ArgumentException: Field '_delegateType' defined on type 'Db4objects.Db4o.Internal.Query.DelegateEnvelope' is not a field on the target object which is of type 'Db4objects.Db4o.Reflect.Generic.GenericObject'.
Here are the client calls to save, then get:
Server server = new Server() { Name = "AppServerFoo" };
IObjectContainer db = GetDatabase();
db.Store(server);
db.Close();
Here's the only line in the GetDatabase() method:
return Db4oClientServer.OpenClient(Db4oClientServer.NewClientConfiguration(), "DellXps", 8484, "user", "password");
And here's the call to get from the DB:
IObjectContainer db = GetDatabase();
Server testServer = db.Query<Server>(server => server.Name == "AppServerFoo").FirstOrDefault();
db.Close();
What am I missing?
Well, a server without a reference to the persisted classes is a 'risky' thing. A lot of functionality doesn't work, and the errors that occur are cryptic. I highly recommend always referencing the assembly with the persisted classes on the server.
Another tip: use LINQ instead of native queries. It works better and has fewer problems.
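For example, a hedged sketch of the query above rewritten with db4o's LINQ provider (assuming a reference to the Db4objects.Db4o.Linq assembly):
using System.Linq;
using Db4objects.Db4o.Linq;

IObjectContainer db = GetDatabase();
Server testServer = (from Server s in db
                     where s.Name == "AppServerFoo"
                     select s).FirstOrDefault();
db.Close();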
A ha! I got it. It took some googling, but the problem was that the server didn't have a reference to the entities. As soon as my server project referenced my client project, it worked. So, it looks like I just need to move my entities to a common project that both the client and server can reference.
Thanks to this link:
http://www.gamlor.info/wordpress/2009/11/db4o-client-server-and-concurrency/
This link looks like a gateway to having the server work without referencing the entities:
http://community.versant.com/Documentation/Reference/db4o-7.12/net2/reference/html/reference/client-server/server_without_persistent_classes_deployed.html