Currently I am creating work items as Bugs from C# code using the TFS SDK. I have around 5k bugs which I query from a SQL Server DB, paging by 1000, and then create as work items in TFS. But creating 1000 work items and saving them to TFS takes more than 15 minutes.
Following is how I create the work items:
foreach (var item in listbugs)
{
    var workItem = new WorkItem(workItemTypes["bug"]);
    workItem.Title = "newTitle";
    workItem.Fields["Repro Steps"].Value = "Reproducible Step";
    workItem.Save();
}
So if you know how to create work items in bulk, please share the knowledge.
Thank you, your help is greatly appreciated.
Documentation (http://msdn.microsoft.com/en-us/library/bb130338%28v=vs.90%29.aspx) states:
Every time you save a work item to Team Foundation Server, you generate a round-trip operation between the work item object model and the server. To minimize the round-trips when saving several work items, use the BatchSave method.
Documentation for BatchSave: http://msdn.microsoft.com/en-us/library/bb140385%28v=vs.90%29.aspx
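A rough sketch of what that could look like for the loop above (assuming store is the WorkItemStore that workItemTypes came from; the exact field and type names depend on your process template):

// using System.Collections.Generic;
// using Microsoft.TeamFoundation.WorkItemTracking.Client;
var workItems = new List<WorkItem>();
foreach (var item in listbugs)
{
    var workItem = new WorkItem(workItemTypes["bug"]);
    workItem.Title = "newTitle";
    workItem.Fields["Repro Steps"].Value = "Reproducible Step";
    workItems.Add(workItem);   // build in memory, do not call Save() here
}

// One round-trip for the whole batch instead of one per work item.
// BatchSave returns the items that failed, so check the result.
BatchSaveError[] errors = store.BatchSave(workItems.ToArray());
foreach (var error in errors)
{
    Console.WriteLine("Could not save '" + error.WorkItem.Title + "': " + error.Exception.Message);
}

You may still want to save in chunks of a few hundred items rather than all 5k at once, to keep each request a reasonable size.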
Related
I have the following scenario that I was wondering if it's possible/feasible to implement. I apologize if this is considered an overly "broad" question, but I think SO would be the best place to ask this.
Let us suppose I have a website and I want to display a graph to an end-user. For the purposes of this example, let's say we want to show them "Sales per category" in the past hour. The data would be displayed in a graph, and the SQL to run the query might be something like this:
SELECT category, SUM(revenue) AS revenue
FROM sales
WHERE timestamp > NOW() - INTERVAL 1 HOUR
GROUP BY category
As far as I'm aware, there are two general ways to update the data for the end-user:
Do some sort of polling (or a similar technique) at a certain interval to re-fetch the data from the query. However, this can become quite expensive depending on the complexity/duration of the query and how many people are connected simultaneously.
The second method would be to store all the data in-memory and push the update directly to that memory store (which could be either client-side or server-side), and we could send a ws request to the end user whenever there's a data update. An example of this would be using something like https://github.com/jpmorganchase/perspective.
My question then is whether it's possible at all to do real-time data updating (the case I describe in Example 2) when the data is too large to store in memory. I think the answer is a "no", but perhaps I'm missing some ways to do this. For example, let's say I have 1TB of data stored in BigQuery and I am streaming updates to it with new product purchases -- is there a way to push updates to the end-client without having to re-run the query every time I want to get an update? Are there any other technologies that might be used/useful for this scenario?
Again, I don't think it's possible, but I wanted to see how close to a real-time display of a queried data set an end-client can get.
If your data is unique per client, big and real-time changing, there is no salvation in using any database or cache as an exchange. You have to send the data update directly.
If you can't directly push data to the client from the process doing the database update, you can probably pass the data from the process doing the update to the process doing the pushes through a message broker (I'll use RabbitMQ as an example).
The optimal configuration for this setup is a topic model, where a topic is a client ID or key, with one listener per connected client for that topic - alternatively, one listener for all clients, registering/unregistering topics dynamically.
Have the websocket handler listen to the topic of its client. Set up the process updating the database to also stream updates to the topic ID of the client. The broker will discard all updates not going to a connected client, making the load more manageable on the listener end.
Without any storage or polling, this solution is low latency. And even with a thousand simultaneous clients, I doubt the broker would ever exhaust memory.
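A minimal sketch of that topic setup with the RabbitMQ .NET client (6.x-style API; the sales_updates exchange name and the clientId routing key are illustrative assumptions, not something from the question):

using System;
using System.Text;
using System.Text.Json;
using RabbitMQ.Client;
using RabbitMQ.Client.Events;

var factory = new ConnectionFactory { HostName = "localhost" };
using var connection = factory.CreateConnection();
using var channel = connection.CreateModel();

// One topic exchange; each connected client has its own routing key.
channel.ExchangeDeclare("sales_updates", ExchangeType.Topic);

// Publisher side: the process that writes the sale to the database also publishes it.
var clientId = "client.1234"; // illustrative
var update = new { Category = "books", Revenue = 42.50m, Timestamp = DateTime.UtcNow };
var body = Encoding.UTF8.GetBytes(JsonSerializer.Serialize(update));
channel.BasicPublish(exchange: "sales_updates", routingKey: clientId,
                     basicProperties: null, body: body);

// Subscriber side: the websocket handler binds a private queue to its client's topic.
var queueName = channel.QueueDeclare().QueueName;
channel.QueueBind(queue: queueName, exchange: "sales_updates", routingKey: clientId);
var consumer = new EventingBasicConsumer(channel);
consumer.Received += (_, ea) =>
{
    var json = Encoding.UTF8.GetString(ea.Body.ToArray());
    // forward json to the connected websocket client here
};
channel.BasicConsume(queue: queueName, autoAck: true, consumer: consumer);

Updates for clients that are not connected (no binding for their routing key) are simply dropped by the broker, which is what keeps the load manageable.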
Since you are interested in this option I decided to extend the comment to an answer. I will use the SQL Server and C# component SqlTableDependency as an example. You can check it out and see if it fits your needs.
You would create a temp table where you would put any changes from the sales table, e.g. sales_data_watch (you could also keep the pre-calculated aggregations there, as in your example).
You would create an hourly job which would monitor the changes in the sales table and perform inserts/updates on sales_data_watch.
You would have the C# SqlTableDependency connected to sales_data_watch (note: taken from the example and adjusted to fit your table):
// Namespaces from the SqlTableDependency NuGet package; the exact
// namespaces can vary slightly between package versions.
using System;
using TableDependency.SqlClient;
using TableDependency.EventArgs;
using TableDependency.Mappers;

public class SaleData
{
    public int Revenue { get; set; }
}

public class Program
{
    private static string _con = "data source=.; initial catalog=MyDB; integrated security=True";

    public static void Main()
    {
        // The mapper object is used to map model properties
        // that do not have a corresponding table column name.
        // If every property of your model has the same name as its
        // table column, you can skip the mapper.
        var mapper = new ModelToTableMapper<SaleData>();
        mapper.AddMapping(s => s.Revenue, "Aggregated revenue");

        // Here - as second parameter - we pass the table name:
        // this is necessary only if the model name is different from the table name
        // (in our case SaleData vs sales_data_watch).
        // If needed, you can also specify the schema name.
        using (var dep = new SqlTableDependency<SaleData>(_con, "sales_data_watch", mapper: mapper))
        {
            dep.OnChanged += Changed;
            dep.Start();

            Console.WriteLine("Press a key to exit");
            Console.ReadKey();

            dep.Stop();
        }
    }

    public static void Changed(object sender, RecordChangedEventArgs<SaleData> e)
    {
        var changedEntity = e.Entity;
        Console.WriteLine("DML operation: " + e.ChangeType);
        Console.WriteLine("Revenue: " + changedEntity.Revenue);
    }
}
After all the notifications have been distributed you could truncate the sales_data_watch table (if you don't want it to grow too big, which would eventually slow the whole process down).
This is using only SQL Server and a C# component. There are other, probably better, options - for example Detect record table change with MVC, SignalR, jQuery and SqlTableDependency - to do it differently. That will depend on your preferences.
Edit: here is a link to a complete example for Building real time charts with Angular 5, Google Charts, SignalR Core, .NET Core 2, Entity Framework Core 2 and SqlTableDependency (this link is the first page of three). At the top of the page you can see a real-time Google gauges chart. All credits go to anthonygiretti. You can download the example project on GitHub.
Technologies used
Database
SQL Server - the LocalDB instance that ships with Visual Studio 2017 is enough to make it work
Front End technologies
Angular 5
Google Charts
Visual Studio Code
SignalR Client
BackEnd technologies
.NET Core 2
SignalR Core
EntityFramework Core
EntityFramework Core for Sql Server
SqlTableDependency
The first step is to install the components needed - Service Broker, the SQL table, Angular CLI, the Angular 5 project and the SignalR client (with VS 2017 and the .NET Core 2 SDK installed) - the link is the same as part 1.
Next comes the backend setup - part 2.
To make it work this project contains the following (a rough sketch of how these pieces fit together follows the list):
A DbContext (GaugesContext.cs) for EntityFramework Core
A Hub (GaugeHub.cs) for SignalR that broadcasts data
A Model that contains strongly typed data to send (Gauge.cs)
A Repository exposed with Entity Framework and its Interface (GaugeRepository.cs and IGaugeRepository.cs)
A Subscription to Gauge sql table with SqlTableDependency and its Interface (GaugeDatabaseSubscription.cs and IDatabaseSubscription)
Two extension methods that extend IServiceCollection (AddDbContextFactory.cs) and IApplicationBuilder (UseSqlTableDependency.cs)
And Startup.cs and Program.cs
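Here is a rough sketch (not anthonygiretti's actual code) of the path from a table change to the browser; GaugeHub and Gauge follow the file names above, everything else is an assumption:

using Microsoft.AspNetCore.SignalR;
using TableDependency.SqlClient; // namespace may vary by package version

// Assumed shape of the Gauge model.
public class Gauge
{
    public int Id { get; set; }
    public decimal Value { get; set; }
}

// The hub browsers connect to; the actual push happens server-side via IHubContext.
public class GaugeHub : Hub
{
}

// Simplified subscription: every change to the monitored table is broadcast to all clients.
public class GaugeDatabaseSubscription
{
    private readonly IHubContext<GaugeHub> _hubContext;
    private SqlTableDependency<Gauge> _tableDependency;

    public GaugeDatabaseSubscription(IHubContext<GaugeHub> hubContext)
    {
        _hubContext = hubContext;
    }

    public void Configure(string connectionString)
    {
        _tableDependency = new SqlTableDependency<Gauge>(connectionString);
        _tableDependency.OnChanged += async (sender, e) =>
        {
            // "UpdateGauge" is an assumed client-side handler name.
            await _hubContext.Clients.All.SendAsync("UpdateGauge", e.Entity);
        };
        _tableDependency.Start();
    }
}

The Angular client then subscribes to "UpdateGauge" through the SignalR client and feeds the value into the Google gauge chart.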
The last part is to set up the frontend - part 3.
We have:
A folder that contains the gauge chart component (gaugeschart.component.html and gaugeschart.component.ts)
A folder that contains a gauge chart service and a Google Charts base service (google-gauges-chart.service.ts and google-charts.base.service.ts)
A folder that contains environments files
A folder that contains a strongly typed model for the gauge chart (gauge.ts)
Finally at the root of src folder the defaults files components and module (app component files and app module file)
In the next step you should test it to see whether the data is projected into the graphs correctly when you change the data.
I think the question might be rooted in an issue with the client's graph and its design requirements.
A "sales in the last hour" graph is both lacking information and hard to update.
Updates need to deduct sales as the "latest hour" progresses (1:05pm turns to 1:06pm) as well as add new sales.
In addition, the graph might look exciting, but it provides very little information that marketing can use to improve sales (i.e., at which hours more ads should be added).
I would consider a 24 hour graph, or a 12 hour graph divided by actual hours.
This could simplify updates and would probably provide more useful metrics.
This way, updates to the graph are always additive, so no in-memory data-store is required (and the information is more actionable).
For example, every new sale could be published to a "new_sale" channel. The published sale data could include its exact time.
This would allow subscribed clients to add new sales to the correct hour in the graph without ever invoking an additional database call and without requiring an in-memory data-store.
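As a rough illustration of that additive flow, using SignalR only because it already appears above (the hub, method and field names here are hypothetical):

using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.SignalR;

public class SalesHub : Hub { }

public class SaleNotifier
{
    private readonly IHubContext<SalesHub> _hubContext;
    public SaleNotifier(IHubContext<SalesHub> hubContext) => _hubContext = hubContext;

    // Publish each completed sale, with its exact time, to the "new_sale" channel.
    // Subscribed clients add the revenue to the bucket for that hour in the chart;
    // no extra database query and no server-side in-memory store is needed.
    public Task PublishAsync(string category, decimal revenue, DateTime occurredAtUtc) =>
        _hubContext.Clients.All.SendAsync("new_sale", new { category, revenue, occurredAtUtc });
}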
I'm using the CRM 2015 Portal SDK (Microsoft.CrmSdk.Extensions 7.1.0 from NuGet), and trying to create an invoice based on an existing sales order. I can create the invoice, but the total is always zero until somebody views the invoice using the Dynamics CRM UI - and I can't find any way to force a recalculation via the SDK.
The SDK documentation (such as it is!) is a whopping great CHM file included in the SDK download, which is generated from XMLDoc comments - very little in the way of descriptions or examples. There's also the content at https://msdn.microsoft.com/en-us/library/gg309408.aspx, which includes a couple of walkthroughs but doesn't go into very much detail.
Here's the code I'm running - this is inside an Action method in an ASP.NET MVC 5 project:
public ActionResult CreateInvoice(Guid id) {
    var xrm = new SpotlightXrmServiceContext("ConnectionStrings.Crm2015.Spotlight");
    var order = xrm.SalesOrderSet.FirstOrDefault(o => o.Id == id);
    var columns = new ColumnSet(true);
    var invoiceEntity = xrm.ConvertSalesOrderToInvoice(order.Id, columns);
    return View(invoiceEntity);
}
After running this code, the Invoice exists in the CRM database, and I can see it listed in the CRM UI by navigating to Sales > Invoices - but the invoice total is zero. If I click the invoice in the CRM UI to view it, something behind the scenes forces a recalculation, because when the invoice loads, the total is £154 (which is the total amount of the original sales order)
The web app in question is a post-payment redirect from our payment provider, so what I need to do from my web application is:
Locate the existing sales order
Create an invoice associated with that order
Mark the invoice as Paid
Mark the sales order as fulfilled
I appear to have fallen at the first hurdle. Any ideas?
I'm currently running in a multi-DB SQL Server environment and using LINQ to SQL to perform queries.
I'm using the approach documented here to achieve cross DB joins:
http://www.enderminh.com/blog/archive/2009/04/25/2654.aspx
so basically:
2 data contexts - Users and Payments
Users.dbo.UserDetails {PK: UserId }
Payments.dbo.CurrentPaymentMethod { PK: UserId }
I drag the tables onto the DBML, and in the properties window, change the Source from dbo.UserDetails to Users.dbo.UserDetails to fully qualify the DB name.
I can then issue a single data context cross DB join by doing something like:
var results = (from user in dataContext.Table<UserDetail>()
               join paymentmethod in dataContext.Table<CurrentPaymentMethod>() on user.UserId equals paymentmethod.UserId
               ... rest of query here ...);
Now this is tickety boo and works as I want it to. The only problem I'm currently having is when schema updates etc. happen (which is relatively frequent as we're in a significant dev phase).
(and finally, the question!)
What I want to achieve (I've tagged the question T4 as a guess, since I know that the DBML files are T4-driven) is an automated way for the Source to pick up the DB name automatically whenever I drag a table onto a data context (so it would have Users.dbo.UserDetails instead of just dbo.UserDetails).
Thanks for any pointers :)
Terry
Have a look at the T4 Toolbox and the LinqToSql code generator it provides (Courtesy of Oleg Sych) - You can customize the templates to generate references however you'd like, but I think the problem you're going to run into is that the database name isn't stored in the dbml file.
What you could probably do is add a filter to the generator, perhaps using a dictionary or similar, such that in your .tt file, you maintain a list of tables and the databases they belong to. That way, if your maintenance task is to delete the class from the designer and drop it on again, it will get the right database name.
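A rough sketch of that idea (hypothetical helper names, not part of the T4 Toolbox API) - the dictionary is maintained by hand in the .tt file and applied to each table's Source before generation:

// using System.Collections.Generic;
// Which database each table lives in (maintained by hand in the .tt file).
var tableDatabases = new Dictionary<string, string>
{
    { "dbo.UserDetails",          "Users"    },
    { "dbo.CurrentPaymentMethod", "Payments" }
};

// Hypothetical hook: prefix a table's Source with its database name.
string QualifySource(string source)
{
    string db;
    return tableDatabases.TryGetValue(source, out db) ? db + "." + source : source;
}

// QualifySource("dbo.UserDetails") -> "Users.dbo.UserDetails"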
I am new to LINQ to SQL... I need to add a new column to an existing table, and I updated the dbml to include this new field. Then in my C# code, I query the database and access this new field. Everything is fine with the new database; however, if I load an older database without this new field, my program crashes when it accesses the database (obviously because of this new field). How do I make either the database or my C# code support backward compatibility?
Here's my code snippet.
I added a field email to the Customer table and also added it to DataContext.dbml; below is the C# code:
DataContext ctx = new DataContext();
var cusList = ctx.Customer;
foreach (var c in cusList)
{
    .
    .
    .
    // access the new field
    if (c.email != null)
        displayEmail(c.email);
    .
    .
    .
}
When I run it through the debugger, it crashes at the very first foreach loop if I am using an older version of the database without the new email field.
Thanks.
Make sure you upgrade the old database. That's what updates are made for.
I don't think there's a better option. But I might be wrong.
It should be a code-land fix. Make your code check for the existence of the column, and use different queries in each case.
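A minimal sketch of that approach with LINQ to SQL (the INFORMATION_SCHEMA check and the fallback column list are assumptions layered on top of the question's code):

// using System.Linq;
// Ask the database whether the new column exists before deciding which query to run.
bool hasEmail = ctx.ExecuteQuery<int>(
    @"SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS
      WHERE TABLE_NAME = 'Customer' AND COLUMN_NAME = 'email'").Single() > 0;

if (hasEmail)
{
    // New schema: the generated Customer class (which includes email) maps cleanly.
    foreach (var c in ctx.Customer)
        if (c.email != null)
            displayEmail(c.email);
}
else
{
    // Old schema: the generated SELECT would list the missing email column and fail,
    // so query only the columns that actually exist ("CustomerId, Name" are placeholders).
    foreach (var c in ctx.ExecuteQuery<Customer>("SELECT CustomerId, Name FROM Customer"))
    {
        // work with the fields that exist in the old schema
    }
}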
I agree with Arnis L.: upgrade your old database. LINQ to SQL is going to look for that column called email in your table, and will complain if it can't find it. I could suggest a workaround that would entail using a stored procedure, but you'd need to update your old database to use this stored proc, so it's not a very helpful suggestion :-)
How to update your old database? This is the old-school way:
ALTER TABLE Customers
ADD Email VARCHAR(130) NULL
You could execute this manually against the older database through Query Analyzer, for example. See here for full docs on ALTER TABLE: http://msdn.microsoft.com/en-us/library/aa275462%28SQL.80%29.aspx
If you are working on a team with very strict procedures for deployment from development to production systems, you would already be writing your "change scripts" to the database in this same fashion. However, if you are developing through Enterprise Manager, it might seem counter-productive to have to do the same work, a second time, just to keep old database schemas in sync with the latest schema.
For a friendlier, more "gooey" approach to this latter style of development, I myself can't recommend enough the usage of something like the very excellent Red Gate SQL Compare tools to help you keep multiple SQL Server databases in sync. (There are other 3rd party utilities out there that supposedly can do roughly the same thing, and that might even be a little cheaper, but I haven't looked much further into them.)
Best of luck! - Mike
I am working in Visual Studio 2005. I have a DataSet with several DataTables in it already. I had to modify the database to add a new foreign key that I forgot about. How do I get Visual Studio to recognize the new relationship?
.Net does not load FK relationships into your DataSet automatically - however, you can add them yourself with a DataRelation*.
*this may not be true if you are using LINQ - if you are, I am unsure.
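A minimal sketch of adding the relation in code (the table and column names are placeholders for whatever your new foreign key connects; MyTypedDataSet stands in for your generated DataSet class):

// using System.Data;
DataSet ds = new MyTypedDataSet();
ds.Relations.Add(new DataRelation(
    "FK_Orders_Customers",
    ds.Tables["Customers"].Columns["CustomerId"],   // parent (primary key) column
    ds.Tables["Orders"].Columns["CustomerId"]));    // child (foreign key) column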
Right-click on the DataSet page, and select Add->Relation?
If you've defined it in your database, you can always re-drag the affected table back into the DataSet and then re-enter your queries.