How to create a shared drive with the Google Drive API (Node.js)

How do you create a shared drive with the Google Drive API? The code in the documentation doesn't work at the line
driveService.drives.create({...
https://developers.google.com/drive/api/v3/manage-shareddrives

The code snippet in the documentation assumes that driveService has already been created.
To create driveService, follow the flow described in the Drive API quickstart.
The only difference is the variable name: the quickstart assigns the new Drive.Builder to a variable called service, while the sample for creating shared drives calls it driveService.
All you need to do is patch the two pieces of code together, using the same variable name in both places.
Sample:
// follow the quickstart
...
Drive service = new Drive.Builder(HTTP_TRANSPORT, JSON_FACTORY, getCredentials(HTTP_TRANSPORT))
        .setApplicationName(APPLICATION_NAME)
        .build();
// now create the shared drive
// (the metadata class is also named Drive, so qualify it to avoid a clash
// with the service class imported by the quickstart)
com.google.api.services.drive.model.Drive driveMetadata =
        new com.google.api.services.drive.model.Drive();
driveMetadata.setName("Project Resources");
String requestId = UUID.randomUUID().toString();
com.google.api.services.drive.model.Drive drive =
        service.drives().create(requestId, driveMetadata).execute();
System.out.println("Drive ID: " + drive.getId());
...
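For comparison, the same two-step pattern (drive metadata plus a client-generated requestId) looks roughly like this in the .NET client. This is a sketch only, assuming service is an authenticated Google.Apis.Drive.v3 DriveService:
// Sketch: create a shared drive with the Google.Apis.Drive.v3 client.
// The requestId plays the same idempotency role as in the Java sample.
var driveMetadata = new Google.Apis.Drive.v3.Data.Drive
{
    Name = "Project Resources"
};
string requestId = Guid.NewGuid().ToString();
var drive = service.Drives.Create(driveMetadata, requestId).Execute();
Console.WriteLine("Drive ID: " + drive.Id);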

Related

google-drive-api method "drive.files.create" stopped working (V3 - 1.58.0.2859) and returns nothing

We have a .NET app that uses the Google Drive API to upload files to a Google Drive. It just stopped working a few days ago (Nov 29th), and we don't remember making any changes during that time.
During the investigation, we confirmed that the service account calling the Google API is valid, since the same service account is also used for calling other Google APIs and works fine. We can also confirm it's not a permission issue: we even set the drive's permissions to allow "anyone" with the link to edit, but the issue is still there.
Unfortunately, we cannot find any useful log, and the return message of the API call is NULL. No error code or error message is returned. The only related info we saw is on the "Error by API method" chart, which shows "drive.files.create" failing 100% of the time.
One interesting thing: if we disable the Google Drive API and then enable it again, it works once, then stops working again.
private string SaveFileToGoogleDrive(IFormFile file, string claimNumber)
{
    try
    {
        var driveService = GetDriveServiceInstance();
        var fileMetadata = new Google.Apis.Drive.v3.Data.File();
        var mimeType = file.ContentType;
        fileMetadata.Name = CreateFileName(file.FileName, claimNumber);
        fileMetadata.MimeType = mimeType;
        fileMetadata.Parents = new List<string> { _googleSettings.GoogleDriveFolderId };
        FilesResource.CreateMediaUpload request;
        using (var stream = new MemoryStream())
        {
            file.CopyTo(stream);
            request = driveService.Files.Create(fileMetadata, stream, mimeType);
            request.Fields = "id";
            request.Upload();
        }
        var googleFile = request.ResponseBody; // The response body is always NULL after the issue happened. :(
        return googleFile.Id;
    }
    catch (Exception ex)
    {
        _logger.Error($"Google Drive exception {ex.Message} STACK TRACE: {(ex.StackTrace ?? "")} INNER EXCEPTION: {(ex.InnerException != null ? ex.InnerException.Message + " STACK TRACE:" + ex.InnerException.StackTrace ?? "" : "")}");
        return string.Empty;
    }
}
We found more details in the progress property of the response object and saw the error message "The user's Drive storage quota has been exceeded.", but it does not make sense to us, since we are using the Enterprise edition of Google Workspace, which is supposed to have no limit. The service account and the key look good; GCP didn't complain at all. And that's the first thing we checked during troubleshooting.
Do you have any idea what to do to solve the issue, or what to look for when investigating it?
As described in the question, the progress property of the response object showed "The user's Drive storage quota has been exceeded.", even though we are using the Enterprise edition of Google Workspace, which is supposed to have no limit. Anyway, the fix: after creating a new service account and switching to that new service account's key, the system went back to work.
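A likely explanation, though not confirmed in the thread: a service account gets its own Drive storage quota, separate from the Workspace pool, and files it owns count against that quota. That would explain why a brand-new service account works until it fills up again. A more durable approach is to upload into a shared drive, so the drive rather than the service account owns the files. A minimal sketch, assuming sharedDriveFolderId is a folder on a shared drive the service account can access and fileBytes holds your content:
// Sketch: upload into a shared-drive folder so the file does not count
// against the service account's own storage quota.
// Assumes driveService is an authenticated DriveService.
var fileMetadata = new Google.Apis.Drive.v3.Data.File
{
    Name = "claim-file.pdf",                          // hypothetical name
    Parents = new List<string> { sharedDriveFolderId }
};
using (var stream = new MemoryStream(fileBytes))
{
    var request = driveService.Files.Create(fileMetadata, stream, "application/pdf");
    request.Fields = "id";
    request.SupportsAllDrives = true;                 // required for shared-drive targets
    var progress = request.Upload();
    if (progress.Status != Google.Apis.Upload.UploadStatus.Completed)
        throw progress.Exception;                     // surface the real error
}
Note that request.Upload() returns an upload progress object whose Exception property carries the real failure; checking it avoids the silent null ResponseBody described above.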

Xamarin Android share PDF. Permission denied for the attachment [duplicate]

My app creates mails with attachments, and uses an intent with Intent.ACTION_SEND to launch a mail app.
It works with all the mail apps I tested with, except for the new Gmail 5.0 (it works with Gmail 4.9), where the mail opens without attachment, showing the error: "Permission denied for the attachment".
There are no useful messages from Gmail on logcat. I only tested Gmail 5.0 on Android KitKat, but on multiple devices.
I create the file for the attachment like this:
String fileName = "file-name_something_like_this";
FileOutputStream output = context.openFileOutput(
        fileName, Context.MODE_WORLD_READABLE);
// Write data to output...
output.close();
File fileToSend = new File(context.getFilesDir(), fileName);
I'm aware of the security concerns with MODE_WORLD_READABLE.
I send the intent like this:
public static void compose(
        Context context,
        String address,
        String subject,
        String body,
        File attachment) {
    Intent emailIntent = new Intent(Intent.ACTION_SEND);
    emailIntent.setType("message/rfc822");
    emailIntent.putExtra(Intent.EXTRA_EMAIL, new String[] { address });
    emailIntent.putExtra(Intent.EXTRA_SUBJECT, subject);
    emailIntent.putExtra(Intent.EXTRA_TEXT, body);
    emailIntent.putExtra(Intent.EXTRA_STREAM, Uri.fromFile(attachment));
    Intent chooser = Intent.createChooser(
            emailIntent,
            context.getString(R.string.send_mail_chooser));
    context.startActivity(chooser);
}
Is there anything I do wrong when creating the file or sending the intent? Is there a better way to start a mail app with attachment? Alternatively - has someone encountered this problem and found a workaround for it?
Thanks!
I was able to pass a screenshot .jpeg file from my app to Gmail 5.0 through an Intent. The key was in this answer.
Everything I have from @natasky's code is nearly identical, but instead I use the file's directory as
context.getExternalCacheDir();
which "represents the external storage directory where you should save cache files" (documentation).
GMail 5.0 added some security checks to attachments it receives from an Intent. These are unrelated to unix permissions, so the fact that the file is readable doesn't matter.
When the attachment Uri is a file://, it'll only accept files from external storage, the private directory of gmail itself, or world-readable files from the private data directory of the calling app.
The problem with this security check is that it relies on Gmail being able to identify the calling app, which is only reliable when the caller has asked for a result. In your code above, you do not ask for a result, so Gmail does not know who the caller is and rejects your file.
Since it worked for you in 4.9 but not in 5.0, you know it's not a unix permission problem, so the reason must be the new checks.
TL;DR answer:
replace startActivity with startActivityForResult.
Or better yet, use a content provider.
Use getExternalCacheDir() with File.createTempFile.
Use the following to create a temporary file in the external cache directory:
File tempFile = File.createTempFile("fileName", ".txt", context.getExternalCacheDir());
Then copy your original file's content to tempFile,
FileWriter fw = new FileWriter(tempFile);
FileReader fr = new FileReader(Data.ERR_BAK_FILE);
int c = fr.read();
while (c != -1) {
    fw.write(c);
    c = fr.read();
}
fr.close();
fw.flush();
fw.close();
Now attach the file to the intent:
emailIntent.putExtra(Intent.EXTRA_STREAM, Uri.fromFile(tempFile));
You should implement a FileProvider, which can create Uris for your app's internal files; other apps are granted temporary permission to read these Uris. Then, instead of calling Uri.fromFile(attachment), you use the provider's static helper (AUTHORITY being the authority string declared in your manifest):
FileProvider.getUriForFile(context, AUTHORITY, attachment);
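Since the question itself is Xamarin.Android, a rough C# equivalent of the FileProvider route might look like the sketch below (untested). It assumes a <provider> element in the manifest with android:authorities="com.yourpackage.fileprovider" and android:grantUriPermissions="true"; the authority string and file name are placeholders:
// Rough Xamarin C# sketch (untested); see the manifest assumptions above.
using Android.Content;
using Android.Support.V4.Content; // FileProvider (AndroidX.Core.Content in newer projects)

var attachment = new Java.IO.File(context.FilesDir, fileName);
var contentUri = FileProvider.GetUriForFile(
    context, "com.yourpackage.fileprovider", attachment);

var emailIntent = new Intent(Intent.ActionSend);
emailIntent.SetType("message/rfc822");
emailIntent.PutExtra(Intent.ExtraStream, contentUri);
// Let the receiving app (Gmail) read the content:// Uri.
emailIntent.AddFlags(ActivityFlags.GrantReadUriPermission);
context.StartActivity(Intent.CreateChooser(emailIntent, "Send mail"));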
Google have an answer for that issue:
Store the data in your own ContentProvider, making sure that other apps have the correct permission to access your provider. The preferred mechanism for providing access is to use per-URI permissions which are temporary and only grant access to the receiving application. An easy way to create a ContentProvider like this is to use the FileProvider helper class.
Use the system MediaStore. The MediaStore is primarily aimed at video, audio and image MIME types, however beginning with Android 3.0 (API level 11) it can also store non-media types (see MediaStore.Files for more info). Files can be inserted into the MediaStore using scanFile() after which a content:// style Uri suitable for sharing is passed to the provided onScanCompleted() callback. Note that once added to the system MediaStore the content is accessible to any app on the device.
Also, you can try setting read permission on the attachment Uri:
emailIntent.addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION);
And finally, you can copy/store your files in external storage; no extra permissions are needed there.
I tested it and found that it was definitely a private storage access problem.
When you attach a file to Gmail (5.0 or later), do not use a file from private storage such as /data/data/package/. Try to use /storage/sdcard instead.
You can then successfully attach your file.
Not sure why Gmail 5.0 doesn't like certain file paths (which I've confirmed it does have read access to), but an apparently better solution is to implement your own ContentProvider class to serve the file. It's actually somewhat simple, and I found a decent example here: http://stephendnicholas.com/archives/974
Be sure to add the <provider> tag to your app manifest, and include android:grantUriPermissions="true" within it. You'll also want to implement getType() and return the appropriate MIME type for the file URI, otherwise some apps won't work with this. There's an example of that in the comment section of the link.
I was having this problem and finally found an easy way to send email with an attachment. Here is the code:
public void SendEmail() {
    try {
        // saving image
        String randomNameOfPic = Calendar.DAY_OF_YEAR + DateFormat.getTimeInstance().toString();
        File file = new File(ActivityRecharge.this.getCacheDir(), "slip" + randomNameOfPic + ".jpg");
        FileOutputStream fOut = new FileOutputStream(file);
        myPic.compress(Bitmap.CompressFormat.JPEG, 100, fOut);
        fOut.flush();
        fOut.close();
        file.setReadable(true, false);
        // sending email
        Intent intent = new Intent(Intent.ACTION_SEND);
        intent.setType("text/plain");
        intent.putExtra(Intent.EXTRA_EMAIL, new String[]{"zohabali5@gmail.com"});
        intent.putExtra(Intent.EXTRA_SUBJECT, "Recharge Account");
        intent.putExtra(Intent.EXTRA_TEXT, "body text");
        //Uri uri = Uri.parse("file://" + fileAbsolutePath);
        intent.putExtra(Intent.EXTRA_STREAM, Uri.fromFile(file));
        intent.addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION);
        startActivityForResult(Intent.createChooser(intent, "Send email..."), 12);
    } catch (Exception e) {
        Toast.makeText(ActivityRecharge.this, "Unable to open Email intent", Toast.LENGTH_LONG).show();
    }
}
In this code, "myPic" is the bitmap that was returned by the camera intent.
Step 1: Add the authority to your attachment URI:
Uri uri = FileProvider.getUriForFile(context, "com.yourpackage", file);
The authority must match the provider name in your manifest file:
android:authorities="com.yourpackage"
Step 2: Add a flag to allow the receiver to read the file:
myIntent.setFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION);

Android Xamarin.Auth 1.2.2 How to Update Uploaded Cloud Storage File LastModifiedDate?

With Xamarin.Forms, I am using the OneDrive SDK on UWP to access OneDrive, and it is working well. I am uploading/downloading small data files, and I use the following code to change a file's LastModifiedDate:
Item itemUpdate1 = new Item();
itemUpdate1.FileSystemInfo = new Microsoft.OneDrive.Sdk.FileSystemInfo {LastModifiedDateTime = lastModifiedDateTime };
await oneDriveClient1.Drive.Items[item1.Id].Request().UpdateAsync(itemUpdate1);
On Android, I use Xamarin.Auth to access OneDrive and I cannot figure out how to update a file's LastModifiedDate. I am using the following code to sign in and upload the file:
var auth = new OAuth2Authenticator(clientId: clientId, scope: storageScopes1, authorizeUrl: new System.Uri("https://login.live.com/oauth20_authorize.srf"),
redirectUrl: new System.Uri("https://login.live.com/oauth20_desktop.srf"));
System.Uri dataFileUri = new System.Uri("https://api.onedrive.com/v1.0/drive/special/approot:/" + dataFileName1 + ":/content");
var requestUpload = new OAuth2Request("PUT", dataFileUri, null, account);
I would like to know if OAuth2Request can be used to update the file's LastModifiedDate, or if there is another way to do it.
Thanks for your help.
The short answer is no. Xamarin.Auth only handles the authentication between your app and the OneDrive REST API.
If you want to modify any of the properties of a file in OneDrive, you will need to either use a OneDrive SDK for Android, as you do in the UWP project, or make these modifications using the REST API directly, as you did to upload the file.
UPDATE 1
System.Uri dataFileUri = new System.Uri("https://api.onedrive.com/v1.0/drive/special/approot:/" + dataFileName1 + ":/content");
var requestUpload = new OAuth2Request("PUT", dataFileUri, null, account);
As you did in the code above, if you can find the OneDrive REST API endpoint for modifying the file's properties (like LastModifiedDate), you should be able to do it with OAuth2Request.
Go to the OneDrive Dev Portal and try to get that information from the documentation.
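For what it's worth, the OneDrive REST API documents updating item properties with an HTTP PATCH carrying a JSON body that contains fileSystemInfo.lastModifiedDateTime. Below is a minimal, untested sketch that reuses the access token Xamarin.Auth obtained; the endpoint path mirrors the upload code above, and the property names follow the public docs:
// Untested sketch: PATCH the item's fileSystemInfo over the OneDrive REST API.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

async Task UpdateLastModifiedAsync(string accessToken, string dataFileName1, DateTime lastModifiedUtc)
{
    // Same approot addressing scheme as the upload code, minus ":/content".
    var url = "https://api.onedrive.com/v1.0/drive/special/approot:/" + dataFileName1;
    var json = "{\"fileSystemInfo\":{\"lastModifiedDateTime\":\""
               + lastModifiedUtc.ToString("o") + "\"}}";
    using (var http = new HttpClient())
    {
        http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", accessToken);
        var request = new HttpRequestMessage(new HttpMethod("PATCH"), url)
        {
            Content = new StringContent(json, Encoding.UTF8, "application/json")
        };
        var response = await http.SendAsync(request);
        response.EnsureSuccessStatusCode();
    }
}
The access token can usually be read from the Xamarin.Auth account object (for example, account.Properties["access_token"]).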

Google Drive SDK 1.8.1 RedirectURL

Is there any way to provide a RedirectURL when using GoogleWebAuthorizationBroker?
Here is the sample code in C#:
Task<UserCredential> credential = GoogleWebAuthorizationBroker.AuthorizeAsync(secrets, scopes, GoogleDataStore.User, cancellationToken, dataStore);
Or do we have to use a different approach?
I have an "installed application" that runs on a user's desktop, not a website. By default, when I create an "installed application" project in the API console, the redirect URI seems to be set to local host by default.
What ends up happening is that after the authentication sequence the user gets redirected to localhost and receives a browser error. I would like to prevent this from happening by providing my own redirect URI: urn:ietf:wg:oauth:2.0:oob:auto
This seems to be possible using Python version of the Google Client API, but I find it difficult to find any reference to this with .NET.
Take a look at the implementation of PromptCodeReceiver; as you can see, it contains the redirect URI.
You can implement your own ICodeReceiver with your preferred redirect URI and call it from a web broker similar to GoogleWebAuthorizationBroker.
I think it would be great to understand why you can't just use PromptCodeReceiver or LocalServerCodeReceiver.
And be aware that we just released a new library last week, so you should update to 1.9.0.
UPDATE (more details, Nov 25th 2014):
You can create your own ICodeReceiver. You will have to do the following:
* The code was never tested... sorry.
public class MyNewCodeReceiver : ICodeReceiver
{
    public string RedirectUri
    {
        get { return YOUR_REDIRECT_URI; }
    }

    public Task<AuthorizationCodeResponseUrl> ReceiveCodeAsync(
        AuthorizationCodeRequestUrl url,
        CancellationToken taskCancellationToken)
    {
        // YOUR CODE HERE FOR RECEIVING CODE FROM THE URL.
        // TAKE A LOOK AT PromptCodeReceiver AND LocalServerCodeReceiver
        // FOR EXAMPLES.
    }
}
(See PromptCodeReceiver and LocalServerCodeReceiver for reference implementations.)
Then you will have to do the following
(instead of using the GoogleWebAuthorizationBroker.AuthorizeAsync method):
var initializer = new GoogleAuthorizationCodeFlow.Initializer
{
    ClientSecrets = secrets,
    Scopes = scopes,
    DataStore = new FileDataStore("Google.Apis.Auth")
};
await new AuthorizationCodeInstalledApp(
        new GoogleAuthorizationCodeFlow(initializer),
        new MyNewCodeReceiver())
    .AuthorizeAsync(user, taskCancellationToken);
In addition:
I'd be happy to understand further why you need to set a different redirect URI, so we can improve the library accordingly.
When I create an installed application, the current PromptCodeReceiver and LocalServerCodeReceiver work for me, so I'm not sure what the problem is with your code.
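To make this concrete, here is a minimal console-based receiver in the spirit of PromptCodeReceiver, using the out-of-band redirect URI from the question. This is an untested sketch that only illustrates the shape of the interface:
// Untested sketch of a console-based ICodeReceiver that uses the
// out-of-band redirect URI mentioned in the question.
using System;
using System.Diagnostics;
using System.Threading;
using System.Threading.Tasks;
using Google.Apis.Auth.OAuth2;
using Google.Apis.Auth.OAuth2.Requests;
using Google.Apis.Auth.OAuth2.Responses;

public class OobCodeReceiver : ICodeReceiver
{
    public string RedirectUri
    {
        get { return "urn:ietf:wg:oauth:2.0:oob"; }
    }

    public Task<AuthorizationCodeResponseUrl> ReceiveCodeAsync(
        AuthorizationCodeRequestUrl url, CancellationToken taskCancellationToken)
    {
        // Open the authorization page in the default browser...
        Process.Start(url.Build().ToString());
        // ...then let the user paste back the code that Google displays.
        Console.Write("Enter the authorization code: ");
        var code = Console.ReadLine();
        return Task.FromResult(new AuthorizationCodeResponseUrl { Code = code });
    }
}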

How do I upload some file into Azure blob storage without writing my own program?

I created an Azure Storage account. I have a 400 MB .zip file that I want to put into blob storage for later use.
How can I do that without writing code? Is there some interface for that?
Free tools:
Visual Studio 2010 -- install Azure tools and you can find the blobs in the Server Explorer
CloudBerry Lab's CloudBerry Explorer for Azure Blob Storage
ClumsyLeaf CloudXplorer
Azure Storage Explorer from CodePlex (try version 4 beta)
There was an old program called Azure Blob Explorer or something that no longer works with the new Azure SDK.
Out of these, I personally like CloudBerry Explorer the best.
The easiest way is to use Azure Storage PowerShell. It provides many commands to manage your storage containers/blobs/tables/queues.
For your case, you could use Set-AzureStorageBlobContent, which uploads a local file into Azure Storage as a block blob or page blob.
Set-AzureStorageBlobContent -Container containerName -File .\filename -Blob blobname
For details, please refer to http://msdn.microsoft.com/en-us/library/dn408487.aspx.
If you're looking for a tool to do so, may I suggest that you take a look at our tool Cloud Storage Studio (http://www.cerebrata.com/Products/CloudStorageStudio). It's a commercial tool for managing Windows Azure Storage and Hosted Service. You can also find a comprehensive list of Windows Azure Storage Management tools here: http://blogs.msdn.com/b/windowsazurestorage/archive/2010/04/17/windows-azure-storage-explorers.aspx
Hope this helps.
The StorageClient has this built into it. No need to write really anything:
var account = new CloudStorageAccount(creds, false);
var client = account.CreateCloudBlobClient();
var blob = client.GetBlobReference("/somecontainer/hugefile.zip");
// 1 MB seems to be a pretty good all-purpose block size
client.WriteBlockSizeInBytes = 1024 * 1024;
// this sets the # of parallel uploads for blocks (normally one per CPU core)
client.ParallelOperationThreadCount = 4;
// blobs larger than this threshold are broken up into blocks automatically
client.SingleBlobUploadThresholdInBytes = 4096;
blob.UploadFile("somehugefile.zip");
I use Cyberduck to manage my blob storage.
It is free and very easy to use. It works with other cloud storage solutions as well.
I recently found this one as well: CloudXplorer
Hope it helps.
There is a new open-source tool provided by Microsoft:
Project Deco - a cross-platform Microsoft Azure Storage account explorer.
Please check these links:
Download binaries: http://storageexplorer.com/
Source code: https://github.com/Azure/deco
You can use Cloud Combine for reliable and quick file upload to Azure blob storage.
A simple batch file using Microsoft's AzCopy utility will do the trick. You can drag-and-drop your files on the following batch file to upload into your blob storage container:
upload.bat
@ECHO OFF
SET BLOB_URL=https://<<<account name>>>.blob.core.windows.net/<<<container name>>>
SET BLOB_KEY=<<<your access key>>>
:AGAIN
IF "%~1" == "" GOTO DONE
AzCopy /Source:"%~d1%~p1" /Dest:%BLOB_URL% /DestKey:%BLOB_KEY% /Pattern:"%~n1%~x1" /destType:blob
SHIFT
GOTO AGAIN
:DONE
PAUSE
Note that the above technique only uploads one or more files individually (since the Pattern flag is specified) instead of uploading an entire directory.
You can upload files to an Azure Storage account blob using the Command Prompt.
Install the Microsoft Azure Storage tools.
Then upload the file to your account's blob container with the AzCopy command:
AzCopy /Source:"filepath" /Dest:bloburl /DestKey:accesskey /destType:blob
Hope it helps. :)
You can upload large files directly to Azure Blob Storage using the HTTP PUT verb; the biggest file I have tried with the code below was 4.6 GB. You can do this in C# like this:
// Note: "File" below is the sample's own upload model (it carries the
// stream, progress flags and UI dispatcher), not System.IO.File.

// write up to CHUNK_SIZE of data to the web request
void WriteToStreamCallback(IAsyncResult asynchronousResult)
{
    var webRequest = (HttpWebRequest)asynchronousResult.AsyncState;
    var requestStream = webRequest.EndGetRequestStream(asynchronousResult);
    var buffer = new Byte[4096];
    int bytesRead;
    var tempTotal = 0;
    File.FileStream.Position = DataSent;
    while ((bytesRead = File.FileStream.Read(buffer, 0, buffer.Length)) != 0
           && tempTotal + bytesRead < CHUNK_SIZE
           && !File.IsDeleted
           && File.State != Constants.FileStates.Error)
    {
        requestStream.Write(buffer, 0, bytesRead);
        requestStream.Flush();
        DataSent += bytesRead;
        tempTotal += bytesRead;
        File.UiDispatcher.BeginInvoke(OnProgressChanged);
    }
    requestStream.Close();
    if (!AbortRequested) webRequest.BeginGetResponse(ReadHttpResponseCallback, webRequest);
}

void StartUpload()
{
    var uriBuilder = new UriBuilder(UploadUrl);
    if (UseBlocks)
    {
        // encode the block name and add it to the query string
        CurrentBlockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(Guid.NewGuid().ToString()));
        uriBuilder.Query = uriBuilder.Query.TrimStart('?') + string.Format("&comp=block&blockid={0}", CurrentBlockId);
    }
    // with or without using blocks, we'll make a PUT request with the data
    var webRequest = (HttpWebRequest)WebRequestCreator.ClientHttp.Create(uriBuilder.Uri);
    webRequest.Method = "PUT";
    webRequest.BeginGetRequestStream(WriteToStreamCallback, webRequest);
}
The UploadUrl is generated by Azure itself and contains a Shared Access Signature. This SAS URL says where the blob is to be uploaded and for how long security access (write access, in your case) is granted. You can generate a SAS URL like this:
readonly CloudBlobClient BlobClient;
readonly CloudBlobContainer BlobContainer;

public UploadService()
{
    // Setup the connection to Windows Azure Storage
    var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
    BlobClient = storageAccount.CreateCloudBlobClient();
    // Get and create the container
    BlobContainer = BlobClient.GetContainerReference("publicfiles");
}

string JsonSerializeData(string url)
{
    var serializer = new DataContractJsonSerializer(url.GetType());
    var memoryStream = new MemoryStream();
    serializer.WriteObject(memoryStream, url);
    return Encoding.Default.GetString(memoryStream.ToArray());
}

public string GetUploadUrl()
{
    var sasWithIdentifier = BlobContainer.GetSharedAccessSignature(new SharedAccessPolicy
    {
        Permissions = SharedAccessPermissions.Write,
        SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(60)
    });
    return JsonSerializeData(BlobContainer.Uri.AbsoluteUri + "/" + Guid.NewGuid() + sasWithIdentifier);
}
I also have a thread on the subject with more information here: How to upload huge files to the Azure blob from a web page
The new Azure Portal has an 'Editor' menu option (in preview) in the container view. It allows you to upload a file directly to the container from the Portal UI.
I've used all the tools mentioned in this post, and all work moderately well with block blobs. My favorite, however, is BlobTransferUtility.
By default, BlobTransferUtility only does block blobs, but changing just two lines of code lets you upload page blobs as well. If you, like me, need to upload a virtual machine image, it needs to be a page blob.
(For the difference, please see this MSDN article.)
To upload page blobs just change lines 53 and 62 of BlobTransferHelper.cs from
new Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob
to
new Microsoft.WindowsAzure.Storage.Blob.CloudPageBlob
The only other thing to know about this app is to uncheck HELP when you first run the program, so that you see the actual UI.
Check out this post Uploading to Azure Storage where it is explained how to easily upload any file via PowerShell to Azure Blob Storage.
You can use the AzCopy tool to upload the required files to Azure. The default storage type is block blob; you can change the pattern according to your requirements.
Syntax:
AzCopy /Source:<source> /Dest:<destination> /S
Try the Blob Service API:
http://msdn.microsoft.com/en-us/library/dd135733.aspx
However, 400 MB is a large file and I am not sure a single API call will deal with something of this size; you may need to split it into blocks and reconstruct it using custom code.
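As a sketch of what that custom code could look like with the classic Microsoft.WindowsAzure.Storage client (the names and the 4 MB block size are illustrative, not from the original answer):
// Illustrative sketch: split a large file into 4 MB blocks, upload each
// block, then commit the block list. The container parameter is assumed
// to be an existing CloudBlobContainer.
using System;
using System.Collections.Generic;
using System.IO;
using Microsoft.WindowsAzure.Storage.Blob;

void UploadInBlocks(CloudBlobContainer container, string path)
{
    var blob = container.GetBlockBlobReference(Path.GetFileName(path));
    const int BlockSize = 4 * 1024 * 1024;
    var blockIds = new List<string>();
    using (var stream = File.OpenRead(path))
    {
        var buffer = new byte[BlockSize];
        int read, index = 0;
        while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
        {
            // Block IDs must be base64 strings of equal length within a blob.
            string blockId = Convert.ToBase64String(BitConverter.GetBytes(index++));
            blockIds.Add(blockId);
            blob.PutBlock(blockId, new MemoryStream(buffer, 0, read), null);
        }
    }
    blob.PutBlockList(blockIds); // commits the uploaded blocks as one blob
}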
