YouTube Data API (search for a channel) - youtube-data-api

Using the YouTube Data API, I can find a channel with the "search" API:
https://www.googleapis.com/youtube/v3/search?part=snippet&q=Alexander+condrashov&type=channel&key=YOUR_API_KEY
This gives me one object in my items array.
But this is a "search" API, not a "channel" API.
I think that the correct way to find a channel is to use "Channels: list" API.
Trying to find a channel by username gives me zero objects in my items array:
https://www.googleapis.com/youtube/v3/channels?part=snippet%2CcontentDetails%2Cstatistics&forUsername=Alexander+kondrashov&key=YOUR_API_KEY
Has anyone succeeded in using "Channels: list" with "forUsername" to find a channel?

I know this is old and you may have already figured something out, but unfortunately YouTube's Channels API isn't great at finding channels reliably by anything other than the channel's ID. As a workaround, I combine the two approaches to get a reliable result (though it takes two queries and costs more quota, which is lame sauce).
For example (in C# using the Google SDKs):
public async Task<Channel> GetChannelByUsername(string username)
{
    try
    {
        var channel = default(Channel);
        var searchRequest = this.youtubeService.Search.List("snippet");
        searchRequest.Q = username;
        searchRequest.Type = "channel";
        var searchResults = await searchRequest.ExecuteAsync();
        if (searchResults != null && searchResults.Items.Count > 0)
        {
            // Compare case-insensitively so the lookup isn't broken by capitalization
            var searchResult = searchResults.Items.FirstOrDefault(i => i.Snippet.ChannelTitle.ToLower() == username.ToLower());
            if (searchResult != null && searchResult.Snippet != null)
            {
                var channelSearchResult = searchResult.Snippet;
                var channelListRequest = this.youtubeService.Channels.List("snippet");
                channelListRequest.Id = channelSearchResult.ChannelId;
                var channelListResult = await channelListRequest.ExecuteAsync();
                if (channelListResult != null)
                {
                    channel = channelListResult.Items.FirstOrDefault();
                }
            }
        }
        return channel;
    }
    catch (Exception ex)
    {
        // Handle Exception
        return null;
    }
}
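For reference, a minimal usage sketch (the channel name here is a placeholder, and the surrounding class is assumed to hold an authenticated youtubeService instance, as in the method above):
// Hypothetical caller: "SomeChannelName" is just an example value.
var channel = await GetChannelByUsername("SomeChannelName");
if (channel != null)
{
    Console.WriteLine($"{channel.Snippet.Title} ({channel.Id})");
}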

Related

Google API Client for .NET: How to implement Exponential Backoff

I've created a method to add members in a batch request to a Google group using .NET Core and Google's .NET client library. The code looks like this:
private void InitializeGSuiteDirectoryService()
{
    _directoryServiceCredential = GoogleCredential
        .FromJson(GlobalSettings.Instance.GSuiteSettings.Credentials)
        .CreateScoped(_scopes)
        .CreateWithUser(GlobalSettings.Instance.GSuiteSettings.User);

    _directoryService = new DirectoryService(new BaseClientService.Initializer()
    {
        HttpClientInitializer = _directoryServiceCredential,
        ApplicationName = _applicationName
    });
}

public OperationResult<int> AddGroupMembers(Group group, IEnumerable<Member> members)
{
    var result = new OperationResult<int>();
    var memberList = members.ToList();
    var batchRequestCount = 0;
    if (memberList.Any())
    {
        var request = new BatchRequest(_directoryService);
        foreach (var member in memberList)
        {
            batchRequestCount++;
            request.Queue<Members>(_directoryService.Members.Insert(member, group.Id), (content, error, i, message) =>
            {
                if (message.IsSuccessStatusCode)
                {
                    // Log OK
                }
                else
                {
                    // Implement exponential backoff only on the request that failed.
                }
            });
            if (batchRequestCount == 30 || member.Equals(memberList.Last()))
            {
                request.ExecuteAsync().Wait();
                request = new BatchRequest(_directoryService); // Clear queue
            }
        }
    }
    return result;
}
The logic works fine if the number of members is small; however, when the member count is, say, 100 (the maximum number of users in my Google test instance), I get an error from Google that reads "quotaExceeded". According to Google's documentation, the limit for a batch request on their Admin SDK is 1000, and I've set my logic to execute when we reach a limit of 30.
The QUESTION is: how do I implement error handling to retry whenever I get this error? Their documentation suggests implementing exponential backoff when a response contains a 'retry-able error' (I don't see this when I inspect my response).
So here's what I ended up doing to implement exponential backoff on my call to add members to a G Suite group. Since I'm using .NET Core, I was able to use Polly, which is a resilience and transient-fault-handling library that offers this functionality out of the box. There may be some need for refactoring, but here's what the code looks like for now:
public OperationResult<int> AddGroupMembers(Group group, IEnumerable<Member> members)
{
    var result = new OperationResult<int>();
    var memberList = members.ToList();
    var batchRequestCount = 0;
    var batchCompletedCount = 0;
    if (memberList.Any())
    {
        var request = new BatchRequest(_directoryService);
        foreach (var member in memberList)
        {
            retryRequest = false; // This field is declared at the class level to guarantee the value is available to the original thread running the process.
            batchRequestCount++;
            request.Queue<Members>(_directoryService.Members.Insert(member, group.Id), (content, error, i, message) =>
            {
                // If the error code is 'quotaExceeded', retry the request (you can add as many error codes as you'd like to retry here)
                if (error != null && error.Code == 403)
                {
                    retryRequest = true;
                }
            });
            // Execute the batch request to add members in batches of 30 members max
            if (batchRequestCount == 30 || member.Equals(memberList.Last()))
            {
                // Below is what the retry code using Polly looks like
                var response = Policy
                    .HandleResult<HttpResponseMessage>(message => message.StatusCode == HttpStatusCode.Conflict)
                    .WaitAndRetry(new[]
                    {
                        TimeSpan.FromSeconds(1),
                        TimeSpan.FromSeconds(2),
                        TimeSpan.FromSeconds(4)
                    }, (results, timeSpan, retryCount, context) =>
                    {
                        // Log a warning saying a retry was required.
                    })
                    .Execute(() =>
                    {
                        var httpResponseMsg = new HttpResponseMessage();
                        // Execute the batch request synchronously
                        request.ExecuteAsync().Wait();
                        if (retryRequest)
                        {
                            httpResponseMsg.StatusCode = HttpStatusCode.Conflict;
                            retryRequest = false;
                        }
                        else
                        {
                            httpResponseMsg.StatusCode = HttpStatusCode.OK;
                        }
                        return httpResponseMsg;
                    });
                if (response.IsSuccessStatusCode)
                {
                    // Log info
                }
                else
                {
                    // Log warn
                }
                batchRequestCount = 0;
                request = new BatchRequest(_directoryService); // Clear queue
                batchCompletedCount++;
            }
        }
    }
    return result;
}
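A small follow-up on the design: the 1/2/4-second array above is already a doubling schedule, but if you want more attempts you can compute the delays instead of listing them. A minimal sketch reusing the same HandleResult condition (the retry count of 5 is just an example, not something from the original code):
// Sketch only: same Conflict-based trigger as above, with the delay computed
// as 2^attempt seconds instead of a hard-coded array of TimeSpans.
var retryPolicy = Policy
    .HandleResult<HttpResponseMessage>(message => message.StatusCode == HttpStatusCode.Conflict)
    .WaitAndRetry(5, attempt => TimeSpan.FromSeconds(Math.Pow(2, attempt)),
        (outcome, delay, attempt, context) =>
        {
            // Log a warning that the batch will be retried after 'delay'.
        });
You would then call retryPolicy.Execute(...) exactly as in the code above.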

Populate SQLite database from generic "list of items"

I use a SQLite database to populate a listview with a generic list of TodoItems:
ListView.ItemsSource = await App.Database.GetItemsAsync("SELECT * FROM [TodoItem]");
I can also read data of the same kind from a remote (database-backed) source using the HttpClient class, via
public async Task<List<TodoItem>> ReadDataAsync()
{
    ....
    Items = new List<TodoItem>();
    ....
    var content = await response.Content.ReadAsStringAsync();
    Items = JsonConvert.DeserializeObject<List<TodoItem>>(content);
    ....
    return Items;
}
whereafter I can use this as a data source as well:
ListView.ItemsSource = await App.ReadDataAsync();
Both ways work well.
What I want now is to combine both routines into code that checks whether we have an online connection: if yes, drop the database contents and repopulate them with the result of ReadDataAsync from above; if no, leave the database alone.
I found no way to directly assign a List to my database, overwriting the whole contents. I'm thinking of something like:
App.Database.Items = await App.ReadDataAsync();
but SQLite doesn't seem to expose the data store directly. How can I accomplish this?
On Android, for example, you could try the following. There's no need to drop the database.
// Check for internet connection
var connectivityManager = (ConnectivityManager)this.GetSystemService("connectivity");
var activeConnection = connectivityManager.ActiveNetworkInfo;
if ((activeConnection != null) && activeConnection.IsConnected)
{
    // Make call to the API
    var list = await App.ReadDataAsync();

    // Update or insert local records
    foreach (var item in list)
    {
        var success = UpdateOrInsert(item);
        // Do something after - like retry in case of failure
    }
}
The UpdateOrInsert method can look like this:
private bool UpdateOrInsert(TodoItem item)
{
    var rowsAffected = App.Database.Update(item);
    if (rowsAffected == 0)
    {
        // The item doesn't exist in the database, therefore insert it
        rowsAffected = App.Database.Insert(item);
    }
    var success = rowsAffected > 0;
    return success;
}
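If your App.Database wrapper exposes the underlying SQLite-net connection (an assumption; the stock Xamarin TodoItem sample keeps it private), an alternative is to let SQLite do the upsert and wrap the whole sync in a transaction. A minimal sketch, assuming TodoItem has a [PrimaryKey] column:
// Sketch only: "connection" is assumed to be the SQLiteConnection behind App.Database.
private void SyncItems(SQLiteConnection connection, List<TodoItem> items)
{
    connection.RunInTransaction(() =>
    {
        foreach (var item in items)
        {
            // Inserts the row if its key is new, otherwise replaces the existing row.
            connection.InsertOrReplace(item);
        }
    });
}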

Running async function in parallel using LINQ's AsParallel()

I have a Document DB repository class that has one get method like below:
private static DocumentClient client;

public async Task<TEntity> Get(string id, string partitionKey = null)
{
    try
    {
        RequestOptions requestOptions = null;
        if (partitionKey != null)
        {
            requestOptions = new RequestOptions { PartitionKey = new PartitionKey(partitionKey) };
        }
        var result = await client.ReadDocumentAsync(
            UriFactory.CreateDocumentUri(DatabaseId, CollectionId, id),
            requestOptions);
        return (TEntity)(dynamic)result.Resource;
    }
    catch (DocumentClientException e)
    {
        // Have logic for different exceptions actually
        throw;
    }
}
I have two collections - Collection1 and Collection2. Collection1 is non-partitioned whereas Collection2 is partitioned.
On the client side, I create two repository objects, one for each collection.
private static DocumentDBRepository<Collection1Item> collection1Repository = new DocumentDBRepository<Collection1Item>("Collection1");
private static DocumentDBRepository<Collection2Item> collection2Repository = new DocumentDBRepository<Collection2Item>("Collection2");
List<Collection1Item> collection1Items = await collection1Repository.GetItemsFromCollection1(); // Selects first forty documents based on time
List<UIItem> uiItems = new List<UIItem>();
foreach (var item in collection1Items)
{
    var collection2Item = await storageRepository.Get(item.Collection2Reference, item.TargetId); // TargetId is my partition key for Collection2
    uiItems.Add(new UIItem
    {
        ItemId = item.ItemId,
        Collection1Reference = item.Id,
        TargetId = item.TargetId,
        Collection2Reference = item.Collection2Reference,
        Value = collection2Item.Value
    });
}
This works fine. But since it is happening sequentially with foreach, I wanted to do those Get calls in parallel. When I do it in parallel as below:
ConcurrentBag<UIItem> uiItems = new ConcurrentBag<UIItem>();
collection1Items.AsParallel().ForAll(async item =>
{
    var collection2Item = await storageRepository.Get(item.Collection2Reference, item.TargetId); // TargetId is my partition key for Collection2
    uiItems.Add(new UIItem
    {
        ItemId = item.ItemId,
        Collection1Reference = item.Id,
        TargetId = item.TargetId,
        Collection2Reference = item.Collection2Reference,
        Value = collection2Item.Value
    });
});
It doesn't work and uiItems is always empty.
You don't need PLINQ or Parallel.For to run async operations concurrently. If they are truly asynchronous, they already run concurrently.
You could collect the task returned from each operation and simply await Task.WhenAll() on all of them. If you modify your lambda to create and return a UIItem, the result of await Task.WhenAll() will be a collection of UIItems. There is no need to modify global state from inside the concurrent operations.
For example:
var itemTasks = collection1Items.Select(async item =>
{
    var collection2Item = await storageRepository.Get(item.Collection2Reference, item.TargetId);
    return new UIItem
    {
        ItemId = item.ItemId,
        Collection1Reference = item.Id,
        TargetId = item.TargetId,
        Collection2Reference = item.Collection2Reference,
        Value = collection2Item.Value
    };
});
var results = await Task.WhenAll(itemTasks);
A word of caution, though: this will fire all Get operations concurrently. That may not be what you want, especially when calling a service with rate limiting.
Try simply starting tasks and waiting for all of them at the end. That would result in parallel execution.
var tasks = collection1Items.Select(async item =>
{
    //var collection2Item = await storageRepository.Get...
    return new UIItem
    {
        //...
    };
});
var uiItems = await Task.WhenAll(tasks);
PLINQ is useful when working with in-memory constructs, where it uses as many threads as possible; but if you combine it with the async-await technique (which is about releasing threads while accessing external resources), you can end up with strange results.
I would like to share a solution for an issue I saw in some comments.
If you're worried about rate limits and want to cap the concurrency yourself, you can do something like this using SemaphoreSlim:
var nbCores = Environment.ProcessorCount;
var semaphore = new SemaphoreSlim(nbCores, nbCores);
var processTasks = items.Select(async x =>
{
    await semaphore.WaitAsync();
    try
    {
        await ProcessAsync();
    }
    finally
    {
        semaphore.Release();
    }
});
await Task.WhenAll(processTasks);
In this example, ProcessAsync is called concurrently, but limited to {processor count} concurrent operations.
Hope that helps someone.
NB: You can of course set the nbCores variable to whatever value satisfies your own constraints.
NB 2: This example fits some use cases, not all of them. For a big load of tasks I would highly suggest looking into TPL (Task Parallel Library) programming.

BroadcastBlock missing items

I have a list of project numbers that I need to process. A project could have about 8000 items, and I need to get the data for each item in the project and then push this data to a list of servers. Can anybody please tell me the following:
1) I have 1000 items in iR but only 998 were written to the servers. Did I lose items by using BroadcastBlock?
2) Am I doing the await on all the ActionBlocks correctly?
3) How do I make the database call async?
Here is the database code
public MemcachedDTO GetIR(MemcachedDTO dtoItem)
{
    string[] Tables = new string[] { "iowa", "la" };
    using (SqlConnection connection = new SqlConnection(ConfigurationManager.ConnectionStrings["test"].ConnectionString))
    {
        using (SqlCommand command = new SqlCommand("test", connection))
        {
            DataSet Result = new DataSet();
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.Add("@ProjectId", SqlDbType.VarChar);
            command.Parameters["@ProjectId"].Value = dtoItem.ProjectId;
            connection.Open();
            Result.EnforceConstraints = false;
            Result.Load(command.ExecuteReader(CommandBehavior.CloseConnection), LoadOption.OverwriteChanges, Tables);
            dtoItem.test = Result;
        }
    }
    return dtoItem;
}
Update:
I have updated the code to the below. It just hangs when I run it and only writes about a quarter of the data to the servers. Can you please let me know what I am doing wrong?
public static ITargetBlock<T> CreateGuaranteedBroadcastBlock<T>(IEnumerable<ITargetBlock<T>> targets, DataflowBlockOptions options)
{
    var targetsList = targets.ToList();
    var block = new ActionBlock<T>(
        async item =>
        {
            foreach (var target in targetsList)
            {
                await target.SendAsync(item);
            }
        }, new ExecutionDataflowBlockOptions
        {
            CancellationToken = options.CancellationToken
        });

    block.Completion.ContinueWith(task =>
    {
        foreach (var target in targetsList)
        {
            if (task.Exception != null)
                target.Fault(task.Exception);
            else
                target.Complete();
        }
    });

    return block;
}
[HttpGet]
public async Task<HttpResponseMessage> ReloadItem(string projectQuery)
{
    try
    {
        var linkCompletion = new ExecutionDataflowBlockOptions
        {
            MaxDegreeOfParallelism = 2
        };
        var cts = new CancellationTokenSource();
        var dbOptions = new DataflowBlockOptions { CancellationToken = cts.Token };
        IList<string> projectIds = projectQuery.Split(',').ToList();
        IEnumerable<string> serverList = ConfigurationManager.AppSettings["ServerList"].Split(',').Cast<string>();

        var iR = new TransformBlock<MemcachedDTO, MemcachedDTO>(
            dto => dto.GetIR(dto), new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 3 });

        List<ActionBlock<MemcachedDTO>> actionList = new List<ActionBlock<MemcachedDTO>>();
        List<MemcachedDTO> dtoList = new List<MemcachedDTO>();
        foreach (string pid in projectIds)
        {
            IList<MemcachedDTO> dtoTemp = new List<MemcachedDTO>();
            dtoTemp = MemcachedDTO.GetItemIdsByProject(pid);
            dtoList.AddRange(dtoTemp);
        }

        foreach (string s in serverList)
        {
            var action = new ActionBlock<MemcachedDTO>(
                async dto => await PostEachServerAsync(dto, s, "setitemcache"));
            actionList.Add(action);
        }

        var bBlock = CreateGuaranteedBroadcastBlock(actionList, dbOptions);
        foreach (MemcachedDTO d in dtoList)
        {
            await iR.SendAsync(d);
        }
        iR.Complete();
        iR.LinkTo(bBlock);
        await Task.WhenAll(actionList.Select(action => action.Completion).ToList());

        return Request.CreateResponse(HttpStatusCode.OK, new { message = projectIds.ToString() + " reload success" });
    }
    catch (Exception ex)
    {
        return Request.CreateResponse(HttpStatusCode.InternalServerError, new { message = ex.Message.ToString() });
    }
}
1) I have 1000 items in iR but only 998 were written to the servers. Did I lose items by using BroadcastBlock?
Yes: in the code below you set BoundedCapacity to one; if at any time your BroadcastBlock cannot pass along an item, it will drop it. Additionally, a BroadcastBlock will only propagate completion to one TargetBlock, so do not use PropagateCompletion = true here. If you want all blocks to complete you need to handle completion manually. This can be done by setting a ContinueWith on the BroadcastBlock's Completion to pass completion on to all of the connected targets.
var action = new ActionBlock<MemcachedDTO>(dto => PostEachServerAsync(dto, s, "set"), new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 3, BoundedCapacity = 1 });
broadcast.LinkTo(action, linkCompletion);
actionList.Add(action);
Option: Instead of the BroadcastBlock use a properly bounded BufferBlock. When your downstream blocks are bound to one item they cannot receive additional items until they finish processing what they have. That will allow the BufferBlock to offer its items to another, possibly idle, ActionBlock.
When you add items into a throttled flow, i.e. a flow with a BoundedCapacity less than Unbounded, you need to use the SendAsync method, or at least handle the return value of Post. I'd recommend simply using SendAsync:
foreach (MemcachedDTO d in dtoList)
{
    await iR.SendAsync(d);
}
That will force your method signature to become:
public async Task<HttpResponseMessage> ReloadItem(string projectQuery)
2) Am I doing the await on all actionBlocks correctly?
The previous change will let you lose the blocking Wait call in favor of await Task.WhenAll. Change:
iR.Complete();
actionList.ForEach(x => x.Completion.Wait());
To:
iR.Complete();
await bufferBlock.Completion.ContinueWith(tsk => actionList.ForEach(x => x.Complete()));
await Task.WhenAll(actionList.Select(action => action.Completion).ToList());
3) How do I make the database call async?
I'm going to leave this open because it should be a separate question unrelated to TPL Dataflow, but in short: use an async API to access your database, and async will naturally grow through your code base. This should get you started.
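For reference, a minimal sketch of what an async version of GetIR could look like with plain ADO.NET (OpenAsync/ExecuteReaderAsync); the stored procedure name, parameter, and table names are simply copied from the question's code:
public async Task<MemcachedDTO> GetIRAsync(MemcachedDTO dtoItem)
{
    string[] tables = new string[] { "iowa", "la" };
    using (SqlConnection connection = new SqlConnection(ConfigurationManager.ConnectionStrings["test"].ConnectionString))
    using (SqlCommand command = new SqlCommand("test", connection))
    {
        command.CommandType = CommandType.StoredProcedure;
        command.Parameters.Add("@ProjectId", SqlDbType.VarChar).Value = dtoItem.ProjectId;

        // Open the connection and execute the reader without blocking a thread.
        await connection.OpenAsync();
        var result = new DataSet { EnforceConstraints = false };
        using (var reader = await command.ExecuteReaderAsync(CommandBehavior.CloseConnection))
        {
            // DataSet.Load itself is synchronous; the async gain is in the I/O above.
            result.Load(reader, LoadOption.OverwriteChanges, tables);
        }
        dtoItem.test = result;
    }
    return dtoItem;
}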
BufferBlock vs BroadcastBlock
After re-reading your previous question and the answer from @VMAtm, it seems you want each item sent to all five servers; in that case you will need a BroadcastBlock. You would use a BufferBlock to distribute the messages relatively evenly across a flexible pool of servers, where each message only needs to be handled by one of them. Nonetheless, you will still need to take control of propagating completion and faults to all the connected ActionBlocks by awaiting the completion of the BroadcastBlock.
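For contrast, a minimal sketch of that BufferBlock/load-balancing shape (an illustration only, reusing PostEachServerAsync and serverList from the question): each ActionBlock is bounded to one item, so a busy block postpones new offers and the buffer hands the item to an idle one, while completion is handled manually as in CreateGuaranteedBroadcastBlock above.
// Sketch only: each item goes to exactly ONE server (load balancing), unlike a
// BroadcastBlock, which tries to hand every item to ALL linked targets.
var buffer = new BufferBlock<MemcachedDTO>(new DataflowBlockOptions { BoundedCapacity = 10 });
var actions = new List<ActionBlock<MemcachedDTO>>();
foreach (string s in serverList)
{
    var action = new ActionBlock<MemcachedDTO>(
        dto => PostEachServerAsync(dto, s, "setitemcache"),
        new ExecutionDataflowBlockOptions { BoundedCapacity = 1 });
    buffer.LinkTo(action);
    actions.Add(action);
}

// After sending items with buffer.SendAsync(...) and calling buffer.Complete():
await buffer.Completion.ContinueWith(_ => actions.ForEach(a => a.Complete()));
await Task.WhenAll(actions.Select(a => a.Completion));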
To Prevent BroadcastBlock Dropped Messages
In general you have two options: set your ActionBlocks to be unbounded, which is their default:
new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 3, BoundedCapacity = DataflowBlockOptions.Unbounded });
Or broadcast the messages yourself from a block of your own construction. Here is an example implementation from @i3arnon, and another from @svick.

Add channels to Couchbase replications with Xamarin Forms

How can I add channels to my pull and push replications with a Xamarin PCL (C#)? And to the documents?
I tried to write something and set the channels, but it's still not working.
The documents created after that instruction didn't get any channels.
Here is my code:
{
    if (Manager.SharedInstance.GetDatabase(DatabaseName) == null)
        return;

    IAuthenticator auth =
        string.IsNullOrWhiteSpace(Username)
            ? null
            : AuthenticatorFactory.CreateBasicAuthenticator(Username, Password);

    pull = Manager.SharedInstance.GetDatabase(DatabaseName).CreatePullReplication(RemoteSyncUrl);
    if (auth != null)
    {
        pull.Authenticator = auth;
    }

    push = Manager.SharedInstance.GetDatabase(DatabaseName).CreatePushReplication(RemoteSyncUrl);
    if (auth != null)
    {
        push.Authenticator = auth;
    }

    pull.Continuous = true;
    push.Continuous = true;
    pull.Changed += ReplicationProgress;
    push.Changed += ReplicationProgress;

    // Set channels
    List<string> channels = new List<string>();
    channels.Add("channel");

    // I tried to do this, but it isn't working:
    // the documents created afterwards didn't get the channels
    pull.Channels = channels;
    push.Channels = channels;

    pull.Start();
    push.Start();
}
Thanks in advance.
Riccardo
