In my application I want to access contact names and mobile data usage on an Android device. I get the contacts list using the code below, but I have no idea how to access the mobile data usage from the device settings. I have referred to many sites, but none of them gave me a clear answer. How do I access the mobile data usage when mobile data is turned on?
var context = MainActivity.GetAppContext();
var uri = ContactsContract.Contacts.ContentUri;
string[] projection =
{
    ContactsContract.Contacts.InterfaceConsts.Id,
    ContactsContract.Contacts.InterfaceConsts.DisplayName,
    ContactsContract.Contacts.InterfaceConsts.HasPhoneNumber
};
var cursor = context.ContentResolver.Query(uri, projection, null, null, null);
ObservableCollection<UserContact> UserContactList = new ObservableCollection<UserContact>();
// Iterate the cursor rows and fill UserContactList from the projected columns
if (cursor != null && cursor.MoveToFirst())
{
    do
    {
        var displayName = cursor.GetString(cursor.GetColumnIndex(ContactsContract.Contacts.InterfaceConsts.DisplayName));
        // map displayName (and any other columns you need) onto a UserContact and add it to UserContactList
    } while (cursor.MoveToNext());
    cursor.Close();
}
You can query only basic info using the Android.Net.TrafficStats class:
var receivedTotal = Android.Net.TrafficStats.MobileRxBytes;
var transmittedTotal = Android.Net.TrafficStats.MobileTxBytes;
//app-specific
var uid = Android.OS.Process.MyUid();
var receivedByApp = Android.Net.TrafficStats.GetUidRxBytes(uid);
var transmittedByApp = Android.Net.TrafficStats.GetUidTxBytes(uid);
Please note that the app-specific methods return the overall number of bytes counted since the network interface or the app started, and they may be reset to zero when the connection drops or the app closes. Also, they count not only mobile data but Wi-Fi traffic as well.
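Because the counters are cumulative, a common pattern is to take two snapshots and use the difference to measure usage over an interval. A minimal sketch (when you take the second snapshot is up to you):
// First snapshot (bytes counted since boot)
long rxBefore = Android.Net.TrafficStats.MobileRxBytes;
long txBefore = Android.Net.TrafficStats.MobileTxBytes;

// ... later, after the period you want to measure ...

// The difference is the mobile data used during the interval
long mobileBytesUsed = (Android.Net.TrafficStats.MobileRxBytes - rxBefore)
                     + (Android.Net.TrafficStats.MobileTxBytes - txBefore);
Console.WriteLine($"Mobile data used in interval: {mobileBytesUsed} bytes");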
There is no built-in API in stock Android to query the mobile data usage of other apps. Some manufacturers might offer their own implementations for this, but it is not available in the "clean" Android OS.
How can I use an internal database, for example SQLite, for an offline app in NativeScript without using any plugin?
I have searched everywhere for how to install or use SQLite or another internal database with NativeScript, but I haven't found an answer.
Just as you would with any other code that needs to access the native APIs.
For example, a JavaScript example for Android:
var query = "select sqlite_version() AS sqlite_version";
var db = android.database.sqlite.SQLiteDatabase.openOrCreateDatabase(":memory:", null);
var cursor = db.rawQuery(query, null);
var sqliteVersion = "";
if (cursor.moveToNext()) {
    sqliteVersion = cursor.getString(0);
    console.log(sqliteVersion);
}
The API reference for SQLite on Android is here. That said, you can now follow a basic Android database tutorial and implement it step by step in your NativeScript application using JavaScript or TypeScript.
Still, a plugin provides all of that wrapped in ready-to-go functionality, so unless it is missing something you need, it will be easier to use nativescript-sqlite and avoid writing native code for Android and then again for iOS.
I have a simple Xamarin Forms app. I've now got a simple POCO object (e.g. a User instance, or a list of the most recent tweets or orders or whatever).
How can I store this object locally on the device? Let's imagine I serialize it as JSON.
Also, how secure is this data? Is it part of the Keychain, etc.? Is it backed up automatically?
cheers!
You have a couple options.
SQLite. This option is cross-platform and works well if you have a lot of data. You get the added bonus of transaction support and async support as well. EDIT: In the past I suggested using SQLite.Net-PCL. Due to issues involving Android 7.0 support (and an apparent sunsetting of support) I now recommend using the project it was originally forked from: sqlite-net (a minimal sketch appears at the end of this answer).
Local storage. There's a great NuGet package that supports cross-platform storage. For more information see PCLStorage.
There's also Application.Current.Properties, implemented in Xamarin.Forms, which lets you store simple key-value pairs of data.
I think you'll have to investigate and find out which route serves your needs best.
As far as security, that depends on where you put your data on each device. Android stores app data in a secure app folder by default (not all that secure if you're rooted). iOS has several different folders for data storage based on different needs. Read more here: iOS Data Storage
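For the SQLite route, here is a minimal sketch using sqlite-net; the UserRecord table, the database file name, and the idea of storing the serialized JSON are illustrative assumptions (user is your deserialized POCO):
public class UserRecord
{
    [PrimaryKey, AutoIncrement]
    public int Id { get; set; }
    public string Json { get; set; } // the serialized POCO
}

var dbPath = Path.Combine(
    Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData), "app.db3");
var db = new SQLiteConnection(dbPath);
db.CreateTable<UserRecord>();

// store
db.Insert(new UserRecord { Json = JsonConvert.SerializeObject(user) });

// read back
var stored = db.Table<UserRecord>().FirstOrDefault();
var restored = stored == null ? null : JsonConvert.DeserializeObject<User>(stored.Json);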
Another option is the Xamarin Forms settings plugin.
E.g. If you need to store a user instance, just serialize it to json when storing and deserialize it when reading.
Uses the native settings management
Android: SharedPreferences
iOS: NSUserDefaults
Windows Phone: IsolatedStorageSettings
Windows RT / UWP: ApplicationDataContainer
public User CurrentUser
{
    get
    {
        User user = null;
        var serializedUser = CrossSettings.Current.GetValueOrDefault<string>(UserKey);
        if (serializedUser != null)
        {
            user = JsonConvert.DeserializeObject<User>(serializedUser);
        }
        return user;
    }
    set
    {
        CrossSettings.Current.AddOrUpdateValue(UserKey, JsonConvert.SerializeObject(value));
    }
}
EDIT:
There is a new solution for this. Just use Xamarin.Essentials.
Preferences.Set(UserKey, JsonConvert.SerializeObject(value));
var user = JsonConvert.DeserializeObject<User>(Preferences.Get(UserKey, "default_value"));
Please use Xamarin.Essentials
The Preferences class helps to store application preferences in a key/value store.
To save a value:
Preferences.Set("my_key", "my_value");
To get a value:
var myValue = Preferences.Get("my_key", "default_value");
If you want to store a simple value, such as a string, follow this example code.
Setting the value of totalSeats.Text into the "SeatNumbers" key on page 1:
Application.Current.Properties["SeatNumbers"] = totalSeats.Text;
await Application.Current.SavePropertiesAsync();
Then you can simply get the value from any other page (page 2):
var value = Application.Current.Properties["SeatNumbers"].ToString();
Additionally, you can set that value to another Label or Entry etc.
SeatNumbersEntry.Text = value;
If it's key-value (single value) data storage, follow the code below.
Application.Current.Properties["AppNumber"] = "123";
await Application.Current.SavePropertiesAsync();
Getting the same value
var value = Application.Current.Properties["AppNumber"];
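Note that indexing a key that has never been saved throws a KeyNotFoundException, so it is safer to check first. A small sketch using the same example "AppNumber" key:
if (Application.Current.Properties.TryGetValue("AppNumber", out var stored))
{
    var value = stored.ToString();
    // use the value
}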
I'm using H2 database console as a servlet in my own web application that provides a front end of many databases.
How can I skip (or pre-fill) the login step of the H2 database console by passing some parameters from my own code?
(I have many databases, so I won't use "saved settings" first.)
imaginary: http://myapp/h2console/login.do?user=scott&password=tiger&url=jdbc:thin:......
Because of the somewhat special session handling of the console, this is not possible using just a fixed URL. (The session handling allows opening multiple connections within multiple tabs from one browser, which is not possible when using cookies.)
However, what you can do is create a URL in the same way as Server.startWebServer(Connection conn) does:
// the server is already running in your case,
// so most likely you don't need the following lines:
WebServer webServer = new WebServer();
Server web = new Server(webServer, new String[] { "-webPort", "0" });
web.start();
Server server = new Server();
server.web = web;
webServer.setShutdownHandler(server);
// this will create a new session and return the URL for it:
String url = webServer.addSession(conn);
I created an Azure Storage account. I have a 400-megabyte .zip file that I want to put into blob storage for later use.
How can I do that without writing code? Is there some interface for that?
Free tools:
Visual Studio 2010 -- install Azure tools and you can find the blobs in the Server Explorer
Cloud Berry Lab's CloudBerry Explorer for Azure Blob Storage
ClumpsyLeaf CloudXplorer
Azure Storage Explorer from CodePlex (try version 4 beta)
There was an old program called Azure Blob Explorer or something that no longer works with the new Azure SDK.
Out of these, I personally like CloudBerry Explorer the best.
The easiest way is to use Azure Storage PowerShell. It provides many commands to manage your storage containers/blobs/tables/queues.
For your case, you could use Set-AzureStorageBlobContent, which uploads a local file into Azure Storage as a block blob or page blob.
Set-AzureStorageBlobContent -Container containerName -File .\filename -Blob blobname
For details, please refer to http://msdn.microsoft.com/en-us/library/dn408487.aspx.
If you're looking for a tool to do so, may I suggest that you take a look at our tool Cloud Storage Studio (http://www.cerebrata.com/Products/CloudStorageStudio). It's a commercial tool for managing Windows Azure Storage and Hosted Service. You can also find a comprehensive list of Windows Azure Storage Management tools here: http://blogs.msdn.com/b/windowsazurestorage/archive/2010/04/17/windows-azure-storage-explorers.aspx
Hope this helps.
The StorageClient has this built into it. You hardly need to write anything:
var account = new CloudStorageAccount(creds, false);
var client = account.CreateCloudBlobClient();
var blob = client.GetBlobReference("/somecontainer/hugefile.zip");
//1 MB seems to be a pretty good all-purpose block size
client.WriteBlockSizeInBytes = 1024 * 1024;
//this sets # of parallel uploads for blocks
client.ParallelOperationThreadCount = 4; //normally set to one per CPU core
//this will break blobs up automatically after this size
client.SingleBlobUploadThresholdInBytes = 4096;
blob.UploadFile("somehugefile.zip");
I use Cyberduck to manage my blob storage.
It is free and very easy to use. It works with other cloud storage solutions as well.
I recently found this one as well: CloudXplorer
Hope it helps.
There is a new open-source tool provided by Microsoft:
Project Deco - a cross-platform Microsoft Azure Storage Account Explorer.
Please, check those links:
Download binaries: http://storageexplorer.com/
Source Code: https://github.com/Azure/deco
You can use Cloud Combine for reliable and quick file upload to Azure blob storage.
A simple batch file using Microsoft's AzCopy utility will do the trick. You can drag and drop your files onto the following batch file to upload them into your blob storage container:
upload.bat
@ECHO OFF
SET BLOB_URL=https://<<<account name>>>.blob.core.windows.net/<<<container name>>>
SET BLOB_KEY=<<<your access key>>>
:AGAIN
IF "%~1" == "" GOTO DONE
AzCopy /Source:"%~d1%~p1" /Dest:%BLOB_URL% /DestKey:%BLOB_KEY% /Pattern:"%~n1%~x1" /destType:blob
SHIFT
GOTO AGAIN
:DONE
PAUSE
Note that the above technique only uploads one or more files individually (since the Pattern flag is specified) instead of uploading an entire directory.
You can upload files to Azure Storage Account Blob using Command Prompt.
Install Microsoft Azure Storage tools.
Then upload it to your storage account's blob container with the AzCopy command:
AzCopy /Source:"filepath" /Dest:bloburl /DestKey:accesskey /destType:blob
Hope it Helps.. :)
You can upload large files directly to Azure Blob Storage using the HTTP PUT verb; the biggest file I have tried with the code below is 4.6 GB. You can do this in C# like this:
// write up to ChunkSize of data to the web request
void WriteToStreamCallback(IAsyncResult asynchronousResult)
{
var webRequest = (HttpWebRequest)asynchronousResult.AsyncState;
var requestStream = webRequest.EndGetRequestStream(asynchronousResult);
var buffer = new Byte[4096];
int bytesRead;
var tempTotal = 0;
File.FileStream.Position = DataSent;
while ((bytesRead = File.FileStream.Read(buffer, 0, buffer.Length)) != 0
&& tempTotal + bytesRead < CHUNK_SIZE
&& !File.IsDeleted
&& File.State != Constants.FileStates.Error)
{
requestStream.Write(buffer, 0, bytesRead);
requestStream.Flush();
DataSent += bytesRead;
tempTotal += bytesRead;
File.UiDispatcher.BeginInvoke(OnProgressChanged);
}
requestStream.Close();
if (!AbortRequested) webRequest.BeginGetResponse(ReadHttpResponseCallback, webRequest);
}
void StartUpload()
{
var uriBuilder = new UriBuilder(UploadUrl);
if (UseBlocks)
{
// encode the block name and add it to the query string
CurrentBlockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(Guid.NewGuid().ToString()));
uriBuilder.Query = uriBuilder.Query.TrimStart('?') + string.Format("&comp=block&blockid={0}", CurrentBlockId);
}
// with or without using blocks, we'll make a PUT request with the data
var webRequest = (HttpWebRequest)WebRequestCreator.ClientHttp.Create(uriBuilder.Uri);
webRequest.Method = "PUT";
webRequest.BeginGetRequestStream(WriteToStreamCallback, webRequest);
}
The UploadUrl is generated by Azure itself and contains a Shared Access Signature. This SAS URL says where the blob is to be uploaded and for how long the security access (write access in your case) is granted. You can generate a SAS URL like this:
readonly CloudBlobClient BlobClient;
readonly CloudBlobContainer BlobContainer;
public UploadService()
{
// Setup the connection to Windows Azure Storage
var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
BlobClient = storageAccount.CreateCloudBlobClient();
// Get and create the container
BlobContainer = BlobClient.GetContainerReference("publicfiles");
}
string JsonSerializeData(string url)
{
var serializer = new DataContractJsonSerializer(url.GetType());
var memoryStream = new MemoryStream();
serializer.WriteObject(memoryStream, url);
return Encoding.Default.GetString(memoryStream.ToArray());
}
public string GetUploadUrl()
{
var sasWithIdentifier = BlobContainer.GetSharedAccessSignature(new SharedAccessPolicy
{
Permissions = SharedAccessPermissions.Write,
SharedAccessExpiryTime =
DateTime.UtcNow.AddMinutes(60)
});
return JsonSerializeData(BlobContainer.Uri.AbsoluteUri + "/" + Guid.NewGuid() + sasWithIdentifier);
}
I also have a thread on the subject where you can find more information: How to upload huge files to the Azure blob from a web page.
The new Azure Portal has an 'Editor' menu option (in preview) in the container view. It allows you to upload a file directly to the container from the Portal UI.
I've used all the tools mentioned in this post, and all work moderately well with block blobs. My favorite, however, is BlobTransferUtility.
By default BlobTransferUtility only does block blobs. However, by changing just two lines of code you can upload page blobs as well. If you, like me, need to upload a virtual machine image, it needs to be a page blob.
(for the difference please see this MSDN article.)
To upload page blobs just change lines 53 and 62 of BlobTransferHelper.cs from
new Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob
to
new Microsoft.WindowsAzure.Storage.Blob.CloudPageBlob
The only other thing to know about this app is to uncheck HELP when you first run the program to see the actual UI.
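As background on that block-vs-page distinction, here is a rough sketch using the Microsoft.WindowsAzure.Storage SDK the answer already references; the connection string, container, blob names, and the example size are placeholder assumptions:
string connectionString = "<your storage connection string>";
var account = CloudStorageAccount.Parse(connectionString);
var container = account.CreateCloudBlobClient().GetContainerReference("vhds");

// Block blob: general-purpose files, stored as a committed list of blocks
CloudBlockBlob blockBlob = container.GetBlockBlobReference("backup.zip");

// Page blob: random-access 512-byte pages, required for VM images (VHDs);
// it must be created with a fixed size that is a multiple of 512 bytes
CloudPageBlob pageBlob = container.GetPageBlobReference("disk.vhd");
long vhdSizeInBytes = 1024L * 1024 * 1024; // example: 1 GB, already a multiple of 512
pageBlob.Create(vhdSizeInBytes);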
Check out this post Uploading to Azure Storage where it is explained how to easily upload any file via PowerShell to Azure Blob Storage.
You can use the AzCopy tool to upload the required files to Azure Storage. The default is a block blob, and you can change the /Pattern flag according to your requirements.
Syntax
AzCopy /Source:<source folder> /Dest:<container URL> /DestKey:<access key> /S
Try the Blob Service API
http://msdn.microsoft.com/en-us/library/dd135733.aspx
However, 400 MB is a large file and I am not sure a single API call will deal with something of this size; you may need to split it into blocks and reconstruct it using custom code (see the sketch below).
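If you do go the API route, the usual pattern is to upload the file in blocks and then commit the block list. A rough sketch using the older Microsoft.WindowsAzure.Storage client library (the connection string, names, and the 4 MB block size are assumptions for illustration):
string connectionString = "<your storage connection string>";
var account = CloudStorageAccount.Parse(connectionString);
var blob = account.CreateCloudBlobClient()
                  .GetContainerReference("somecontainer")
                  .GetBlockBlobReference("hugefile.zip");

var blockIds = new List<string>();
var buffer = new byte[4 * 1024 * 1024]; // 4 MB per block
using (var file = File.OpenRead("hugefile.zip"))
{
    int read, blockNumber = 0;
    while ((read = file.Read(buffer, 0, buffer.Length)) > 0)
    {
        // Block IDs must be base64 strings of equal length
        var blockId = Convert.ToBase64String(BitConverter.GetBytes(blockNumber++));
        blob.PutBlock(blockId, new MemoryStream(buffer, 0, read), null);
        blockIds.Add(blockId);
    }
}

// Nothing becomes visible until the block list is committed
blob.PutBlockList(blockIds);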