I have a scenario where my data looks like this.
Books
---------------------------------
title | returnedDate
Great Gatsby | 2015-05-04
Great Gatsby | 2015-03-22
Great Gatsby | 2015-01-11
Life of PI | 2015-04-04
Life of PI | 2015-04-02
Clean Code | 2015-06-05
I would like to return the first and last book in each group (grouped by title) in a single LINQ statement. I know I can get the first or last item with a LINQ query like this.
var books = dbContext.Books
    .GroupBy(b => b.title)
    .Select(g => g.OrderByDescending(b => b.returnedDate).FirstOrDefault());
How can I get the last item if one exists as well?
My final result would look like:
Books
---------------------------------
title | returnedDate
Great Gatsby | 2015-05-04
Great Gatsby | 2015-01-11
Life of PI | 2015-04-04
Life of PI | 2015-04-02
Clean Code | 2015-06-05
var books = dbContext.Books
    .GroupBy(b => b.title)
    .Select(g => new {
        Title = g.Key,
        First = g.OrderByDescending(x => x.returnedDate).First().returnedDate,
        Last = g.OrderBy(x => x.returnedDate).First().returnedDate
    });
Results:
title | First | Last
Great Gatsby | 2015-05-04 | 2015-01-11
Life of PI | 2015-04-04 | 2015-04-02
Clean Code | 2015-06-05 | 2015-06-05
If you really want it like you asked, then it becomes a bit more difficult:
var books = dbContext.Books
    .GroupBy(b => b.title)
    .Select(g => new {
        title = g.Key,
        returnedDate = g.OrderByDescending(x => x.returnedDate).First().returnedDate
    }).Concat(
        dbContext.Books
            .GroupBy(b => b.title)
            .Where(g => g.Count() > 1)
            .Select(g => new {
                title = g.Key,
                returnedDate = g.OrderBy(x => x.returnedDate).First().returnedDate
            })
    ).OrderBy(c => c.title).ThenByDescending(c => c.returnedDate);
Yuck. There's probably a better way, but that's the first that came to mind.
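If the groups can be brought into memory, a cleaner variant might look like the sketch below. This is my own assumption rather than part of the original answer: AsEnumerable switches to LINQ to Objects so each group can be ordered once, and the name firstAndLast is just illustrative.
// Hedged sketch: order each group once, emit the newest and oldest rows,
// and emit only one row for groups that contain a single book.
var firstAndLast = dbContext.Books
    .GroupBy(b => b.title)
    .AsEnumerable()
    .SelectMany(g =>
    {
        var ordered = g.OrderByDescending(b => b.returnedDate).ToList();
        return ordered.Count > 1
            ? new[] { ordered.First(), ordered.Last() }   // newest and oldest return
            : new[] { ordered.First() };                  // single book: emit once
    })
    .OrderBy(b => b.title)
    .ThenByDescending(b => b.returnedDate)
    .ToList();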
It's possible by getting the first and last return dates and then returning the books whose return dates are equal to these:
from b in dbContext.Books
group b by b.title into bg
let first = bg.OrderByDescending(b => b.returnedDate).FirstOrDefault().returnedDate
let last = bg.OrderBy(b => b.returnedDate).FirstOrDefault().returnedDate
from b in bg
where b.returnedDate == first || b.returnedDate == last
orderby b.title, b.returnedDate
select b
With a bit of fiddling I came up with this. Not sure how efficient this will be when dealing with a large table of data.
[Test]
public void FirstLastLinq()
{
var books = new List<Book>
{
new Book { Title = "Great Gatsby", Returned=new DateTime(2015,04,03) },
new Book { Title = "Great Gatsby", Returned=new DateTime(2015,04,02) },
new Book { Title = "Great Gatsby", Returned=new DateTime(2015,04,01) },
new Book { Title = "Life of PI", Returned=new DateTime(2015,03,05) },
new Book { Title = "Life of PI", Returned=new DateTime(2015,03,04) },
new Book { Title = "Clean Code", Returned=new DateTime(2015,02,02) },
};
var newBooks = books.GroupBy(b => b.Title).SelectMany(g => g.OrderByDescending(b => b.Returned)
.Where(b1 => b1.Returned == g.Min(b2 => b2.Returned) ||
(b1.Returned == g.Max(b3 => b3.Returned) && g.Min(b4 => b4.Returned) != g.Max(b5 => b5.Returned))));
Assert.IsNotNull(newBooks);
}
private class Book
{
public string Title { get; set; }
public DateTime Returned { get; set; }
}
I have a table:
Name | account info| AccountNumber
-----| ------------| ------
abc | IT | 3000
bdc | Desk | 2000
sed | Kitchen | 3000
afh | work | 4000
hsfs | home | 2000
I want to achieve something like this:
Name | account info| DisguiseInfo
-----| ------------| ------
abc | IT | Acc1
bdc | Desk | Acc2
sed | Kitchen | Acc1
afh | work | Acc3
hsfs | home | Acc2
I tried doing this:
int count = 1;
var disguise = listResults.GroupBy(x => x.ID).Select(y => y.First()).Distinct();
foreach (var i in disguise)
{
i.DisguiseName = "Acc " + count;
count++;
}
Which gives a results like this (very close to what I want):
Name | account info| DisguiseInfo
-----| ------------| ------
abc | IT | Acc1
bdc | Desk | Acc2
sed | Kitchen |
afh | work | Acc3
hsfs | home |
The problem with that is that it doesn't give the same string value ('Acc1') to duplicate values in the list; only the first value in each group gets replaced and the rest of the table comes back blank. So how do I replace all values that have matching IDs?
EDIT:
The data is being populated using SqlCommand in a class called SQLQuery. In this class there's a method called Account which executes like this:
SqlDataReader reader = command.ExecuteReader();
List<ViewModel> returnList = new List<ViewModel>();
if (reader.HasRows)
{
while (reader.Read())
{
ViewModel vm = new ViewModel();
vm.Name = reader.GetString(2);
vm.AccountInfo= reader.GetString(3);
vm.AccountNumber = reader.GetInt32(4);
returnList.Add(vm);
}
}
So this method returns the first table above with no issues.
In my controller action is where I want to copy the SQLQuery return list into another list to filter, so I'm doing this (in the action method):
public async Task<IActionResult> DisguiseAction(string accNum)
{
List<ViewModel> executeSQL = new List<ViewModel>();
SQLQuery getQuery = new SQLQuery();
executeSQL = getQuery.Account(accNum); //at this point the sql
//gets executed with the correct value. Now I need to disguise the
//value. which I did
int count = 1;
var disguise = listResults.GroupBy(x => x.ID).Select(y => y.First()).Distinct();
foreach (var i in disguise)
{
i.DisguiseName = "Acc " + count;
count++;
}
}
Your problem is the .Distinct() call, which only takes the first element out of each group. Because you need to remember all the values you have already seen, it is easier to use a dictionary to hold the already-mapped values. One possibility could be:
var accounts = new List<Account>
{
new Account { Name = "abc", Department = "IT", AccountInfo = 3000 },
new Account { Name = "bdc", Department = "Desk", AccountInfo = 2000 },
new Account { Name = "sed", Department = "Kitchen", AccountInfo = 3000 },
new Account { Name = "afh", Department = "work", AccountInfo = 4000 },
new Account { Name = "hsfs", Department = "home", AccountInfo = 2000 },
};
var mappings = new Dictionary<int, string>();
var summary = accounts
.Select(acc => new AccountSummary
{
Name = acc.Name,
Department = acc.Department,
DisguiseInfo = GetOrAddMapping(acc.AccountInfo, mappings)
})
.ToList();
foreach (var item in summary)
{
Console.WriteLine(JsonSerializer.Serialize(item));
}
And the helper method would in this case be:
private static string GetOrAddMapping(int accountInfo, Dictionary<int, string> mappings)
{
if (!mappings.TryGetValue(accountInfo, out var value))
{
value = $"Acc{mappings.Count + 1}";
mappings.Add(accountInfo, value);
}
return value;
}
using System;
using System.Collections.Generic;
using System.Linq;
public class Ent
{
public Ent(string a, string b, string c)
{
name = a;
location = b;
id = c;
}
public string name;
public string location;
public string id;
public override string ToString()
{
return $"{name} | {location} | {id}";
}
}
public class Program
{
public static void Main()
{
var input = new List<Ent>
{
new Ent("abc", "IT", "3000"),
new Ent("bcd", "Desk", "2000"),
new Ent("sed", "Kitchen", "3000"),
new Ent("afh", "work", "4000"),
new Ent("hsf", "home", "2000"),
};
var output = input
.GroupBy(x => x.id) // x is of type Ent
.SelectMany(y => // y is of type IGrouping<string, Ent>
y.Select(z => // z is of type Ent
new Ent(z.name, z.location, "Acc" + y.Key.Substring(0, 1))));
foreach(var line in output)
Console.WriteLine(line);
}
}
Gives an output that looks like:
abc | IT | Acc3
sed | Kitchen | Acc3
bcd | Desk | Acc2
hsf | home | Acc2
afh | work | Acc4
This code works by using GroupBy on the id and then unrolling the groups with SelectMany; at that point we have the Key for each group, so when unrolling we re-create each line but replace the id with a transformed value of the Key.
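If the disguised value should instead be a sequential counter (Acc1, Acc2, ...) rather than a digit taken from the key, one option - just a sketch, not the code above - is to number the groups first with the indexed Select overload and then unroll them:
// Hedged sketch: i is the zero-based index of each group, so Acc1/Acc2/Acc3
// are assigned in the order the groups are produced.
var sequential = input
    .GroupBy(x => x.id)
    .Select((grp, i) => new { grp, label = "Acc" + (i + 1) })
    .SelectMany(g => g.grp.Select(z => new Ent(z.name, z.location, g.label)));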
After grouping by AccountInfo, you could take advantage of the .SelectMany() overload that provides an indexer for the source element (i.e. an indexer for the AccountInfo values).
In the following example, I am assuming that you have two separate classes for the original (identifiable) accounts and the disguised accounts, e.g.:
public class BasicAccount
{
public string Name { get; set; }
public string AccountType { get; set; }
}
public class Account : BasicAccount
{
public int AccountInfo { get; set; }
}
public class DisguisedAccount : BasicAccount
{
public string DisguisedInfo { get; set; }
}
If your original accounts are collected in a variable List<Account> accounts as such:
List<Account> accounts = new()
{
new() { Name = "abc", AccountType = "IT", AccountInfo = 3000 },
new() { Name = "bdc", AccountType = "Desk", AccountInfo = 2000 },
new() { Name = "sed", AccountType = "Kitchen", AccountInfo = 3000 },
new() { Name = "afh", AccountType = "work", AccountInfo = 4000 },
new() { Name = "hsfs",AccountType = "home", AccountInfo = 2000 }
};
, your disguised accounts could be produced as follows:
IEnumerable<DisguisedAccount> disguisedAccounts = accounts
.GroupBy(a => a.AccountInfo)
.SelectMany((accountsByInfo, counter) => accountsByInfo
    .Select(account => new DisguisedAccount
    {
        Name = account.Name,
        AccountType = account.AccountType,
        DisguisedInfo = $"Acc{counter + 1}"
    }));
Note: Using this approach, you lose the ordering given by the original accounts collection. The resulting collection is:
Name | AccountType | DisguisedInfo
-----|-------------|--------------
abc  | IT          | Acc1
sed  | Kitchen     | Acc1
bdc  | Desk        | Acc2
hsfs | home        | Acc2
afh  | work        | Acc3
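If the original row order matters, one way around the reordering (a hedged sketch using the same Account and DisguisedAccount classes as above; disguisedInOriginalOrder is just an illustrative name) is to carry the source index through the grouping and re-sort at the end:
IEnumerable<DisguisedAccount> disguisedInOriginalOrder = accounts
    .Select((account, index) => new { account, index })      // remember the source position
    .GroupBy(x => x.account.AccountInfo)
    .SelectMany((group, counter) => group.Select(x => new
    {
        x.index,
        Disguised = new DisguisedAccount
        {
            Name = x.account.Name,
            AccountType = x.account.AccountType,
            DisguisedInfo = $"Acc{counter + 1}"
        }
    }))
    .OrderBy(x => x.index)                                    // restore the original ordering
    .Select(x => x.Disguised);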
I am unclear: does the snowball analyzer have to be used when making the index?
var Client = new ElasticClient(Settings);
Client.CreateIndex("pictures", i => i
.Settings(st => st
.Analysis(a => a
.Analyzers(ad => ad
.Snowball("snowball", s => s.Language(SnowballLanguage.English))
)
)
)
);
or when doing the search?
var queryResults = Client.Search<PictureIndex>(s => s
.From(0).Size(10)
.Query(q=>q
.QueryString(qs=>qs
.Analyzer("snowball")
.Query("my test string")
)
)
);
This code doesn't return the expected results.
For example, if I have:
tomato, and
tomatoes
in my index, I'm expecting to find 2 results if I search for tomato, but it's not the case.
I'm trying to make an English only, case insensitive, stemmed search and add fuzziness to accommodate misspellings.
(As a bonus, I'd like to be able to submit a list of synonyms)
Edit:
This is the test code I have, but I think I misunderstand how to enter synonyms. The code returns no matches.
public static class Program
{
public class IndexData
{
public int Id { get; set; }
public string Text { get; set; }
}
public static void Main()
{
var Settings = new ConnectionSettings(new Uri("http://elasticsearch:9200")).DefaultIndex("testindex");
var A = new List<IndexData>
{
new IndexData { Id = 11, Text = "I like red, green and blue. But also cookies and candies" },
new IndexData { Id = 12, Text = "There is a red cookie on the shelf" },
new IndexData { Id = 13, Text = "Blue candies are my favorite" }
};
var Client = new ElasticClient(Settings);
var D = Client.DeleteIndexAsync("testindex").Result;
var U = Client.CreateIndex("testindex", i => i
.Settings(s => s
.Analysis(a => a
.CharFilters(cf => cf
.Mapping("my_char_filter", m => m
.Mappings("Blue => blue", "Red => red", "Green => green")
)
)
.TokenFilters(tf => tf
.Synonym("my_synonym", sf => sf
.Synonyms("red, blue")
.Synonyms("green, blue")
)
)
.Analyzers(an => an
.Custom("my_analyzer", ca => ca
.Tokenizer("standard")
.CharFilters("my_char_filter")
.Filters("lowercase", "stop", "my_synonym")
)
)
)
)
);
var R = Client.IndexDocument(A[0]);
R = Client.IndexDocument(A[1]);
R = Client.IndexDocument(A[2]);
var Articles = Client.Search<IndexData>(s => s
.From(0)
.Size(1000)
.Analyzer("my_analyzer")
.Query(q => q.Fuzzy(fz => fz.Field("text").Value("blue").MaxExpansions(2)))
);
var Documents = Articles.Documents;
}
}
What I am trying to achieve is a text search where:
I can have some minor misspellings
Handle plurals: tomato = tomatoes
I can define synonyms (for example here I'm expecting search for 'blue' to return also the 'red' and the 'green')
Get a sorted list of matches with best hits first. By best, I mean hits that cover more of the search terms.
I have to admit that, despite going over the docs, I am extremely confused by the terminology and the flow of the whole system. Another issue is that half of the samples on the web just don't compile because it looks like the API has changed at some point.
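For what it's worth, the usual pattern is to attach the custom analyzer to a field's mapping when the index is created, so the field is analyzed the same way at index time and (by default) at query time. The sketch below only illustrates that idea against this test code; the exact fluent calls used here (Mappings, Map, Properties, Text, Match, Fuzziness) are an assumption about the NEST version in use and may differ between client releases.
// Hedged sketch (NEST 6.x-style API assumed): map the Text field to the custom
// analyzer at index-creation time, then search with a match query plus fuzziness,
// which will use the field's analyzer by default.
var created = Client.CreateIndex("testindex", i => i
    .Settings(s => s /* analysis settings as above */)
    .Mappings(m => m
        .Map<IndexData>(mm => mm
            .Properties(p => p
                .Text(t => t
                    .Name(n => n.Text)
                    .Analyzer("my_analyzer"))))));

var hits = Client.Search<IndexData>(s => s
    .Query(q => q
        .Match(mq => mq
            .Field(f => f.Text)
            .Query("tomato")
            .Fuzziness(Fuzziness.Auto))));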
We're using RavenDB (2261) as the back end for a queue-based video upload system, and we've been asked to provide a 'live' SLA report on various metrics to do with the upload system.
The document format looks like this:
{
"ClipGuid": "01234567-1234-abcd-efef-123412341234",
"CustomerId": "ABC123",
"Title": "Shakespeare in Love",
"DurationInSeconds": 82,
"StateChanges": [
{
"OldState": "DoesNotExist",
"NewState": "ReceivedFromUpload",
"ChangedAt": "2013-03-15T15:38:38.7050002Z"
},
{
"OldState": "ReceivedFromUpload",
"NewState": "Validating",
"ChangedAt": "2013-03-15T15:38:38.8453975Z"
},
{
"OldState": "Validating",
"NewState": "AwaitingSubmission",
"ChangedAt": "2013-03-15T15:38:39.9529762Z"
},
{
"OldState": "AwaitingSubmission",
"NewState": "Submitted",
"ChangedAt": "2013-03-15T15:38:43.4785084Z"
},
{
"OldState": "Submitted",
"NewState": "Playable",
"ChangedAt": "2013-03-15T15:41:39.5523223Z"
}
]
}
Within each ClipInfo record, there's a collection of StateChanges that are added each time the clip is passed from one part of the processing chain to another. What we need to do is reduce these StateChanges to two specific timespans - we need to know how long a clip took to change from DoesNotExist to AwaitingSubmission, and how long it took from DoesNotExist to Playable. We then need to group these durations by date/time, so we can draw a simple SLA report.
The necessary predicates can be expressed as LINQ statements, but when I try specifying this sort of complex logic within a Raven query I just seem to get back empty results (or lots of DateTime.MinValue results).
I realise document databases like Raven aren't ideal for reporting - and we're happy to explore replication into SQL or some other sort of caching mechanism - but at the moment I just can't see any way of extracting the data other than doing multiple queries to retrieve the entire contents of the store and then performing the calculations in .NET.
Any recommendations?
Thanks,
Dylan
I have made some assumptions which you may need to adjust for:
You operate strictly in the UTC time zone - your "day" is midnight to midnight UTC.
Your week is Sunday through Saturday
The date you want to group by is the first status date reported (the one marked with "DoesNotExist" as its old state.)
You will need a separate map/reduce index per date bracket that you are grouping on - Daily, Weekly, Monthly.
They are almost identical, except for how the starting date is defined. If you want to get creative, you might be able to come up with a way to make these into a generic index definition - but they will always end up being three separate indexes in RavenDB.
// This is the resulting class that all of these indexes will return
public class ClipStats
{
public int CountClips { get; set; }
public int NumPassedWithinTwentyPct { get; set; }
public int NumPlayableWithinOneHour { get; set; }
public DateTime Starting { get; set; }
}
public class ClipStats_ByDay : AbstractIndexCreationTask<ClipInfo, ClipStats>
{
public ClipStats_ByDay()
{
Map = clips => from clip in clips
let state1 = clip.StateChanges.FirstOrDefault(x => x.OldState == "DoesNotExist")
let state2 = clip.StateChanges.FirstOrDefault(x => x.NewState == "AwaitingSubmission")
let state3 = clip.StateChanges.FirstOrDefault(x => x.NewState == "Playable")
let time1 = state2.ChangedAt - state1.ChangedAt
let time2 = state3.ChangedAt - state1.ChangedAt
select new
{
CountClips = 1,
NumPassedWithinTwentyPct = time1.TotalSeconds < clip.DurationInSeconds * 0.2 ? 1 : 0,
NumPlayableWithinOneHour = time2.TotalHours < 1 ? 1 : 0,
Starting = state1.ChangedAt.Date
};
Reduce = results => from result in results
group result by result.Starting
into g
select new
{
CountClips = g.Sum(x => x.CountClips),
NumPassedWithinTwentyPct = g.Sum(x => x.NumPassedWithinTwentyPct),
NumPlayableWithinOneHour = g.Sum(x => x.NumPlayableWithinOneHour),
Starting = g.Key
};
}
}
public class ClipStats_ByWeek : AbstractIndexCreationTask<ClipInfo, ClipStats>
{
public ClipStats_ByWeek()
{
Map = clips => from clip in clips
let state1 = clip.StateChanges.FirstOrDefault(x => x.OldState == "DoesNotExist")
let state2 = clip.StateChanges.FirstOrDefault(x => x.NewState == "AwaitingSubmission")
let state3 = clip.StateChanges.FirstOrDefault(x => x.NewState == "Playable")
let time1 = state2.ChangedAt - state1.ChangedAt
let time2 = state3.ChangedAt - state1.ChangedAt
select new
{
CountClips = 1,
NumPassedWithinTwentyPct = time1.TotalSeconds < clip.DurationInSeconds * 0.2 ? 1 : 0,
NumPlayableWithinOneHour = time2.TotalHours < 1 ? 1 : 0,
Starting = state1.ChangedAt.Date.AddDays(0 - (int) state1.ChangedAt.Date.DayOfWeek)
};
Reduce = results => from result in results
group result by result.Starting
into g
select new
{
CountClips = g.Sum(x => x.CountClips),
NumPassedWithinTwentyPct = g.Sum(x => x.NumPassedWithinTwentyPct),
NumPlayableWithinOneHour = g.Sum(x => x.NumPlayableWithinOneHour),
Starting = g.Key
};
}
}
public class ClipStats_ByMonth : AbstractIndexCreationTask<ClipInfo, ClipStats>
{
public ClipStats_ByMonth()
{
Map = clips => from clip in clips
let state1 = clip.StateChanges.FirstOrDefault(x => x.OldState == "DoesNotExist")
let state2 = clip.StateChanges.FirstOrDefault(x => x.NewState == "AwaitingSubmission")
let state3 = clip.StateChanges.FirstOrDefault(x => x.NewState == "Playable")
let time1 = state2.ChangedAt - state1.ChangedAt
let time2 = state3.ChangedAt - state1.ChangedAt
select new
{
CountClips = 1,
NumPassedWithinTwentyPct = time1.TotalSeconds < clip.DurationInSeconds * 0.2 ? 1 : 0,
NumPlayableWithinOneHour = time2.TotalHours < 1 ? 1 : 0,
Starting = state1.ChangedAt.Date.AddDays(1 - state1.ChangedAt.Date.Day)
};
Reduce = results => from result in results
group result by result.Starting
into g
select new
{
CountClips = g.Sum(x => x.CountClips),
NumPassedWithinTwentyPct = g.Sum(x => x.NumPassedWithinTwentyPct),
NumPlayableWithinOneHour = g.Sum(x => x.NumPlayableWithinOneHour),
Starting = g.Key
};
}
}
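These index classes need to be registered with the document store once (typically at application startup); a minimal sketch, assuming a conventional IDocumentStore instance named store:
// Scans the assembly for AbstractIndexCreationTask implementations and creates them on the server.
IndexCreation.CreateIndexes(typeof(ClipStats_ByDay).Assembly, store);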
Then when you want to query...
var now = DateTime.UtcNow;
var today = now.Date;
var dailyStats = session.Query<ClipStats, ClipStats_ByDay>()
.FirstOrDefault(x => x.Starting == today);
var startOfWeek = today.AddDays(0 - (int) today.DayOfWeek);
var weeklyStats = session.Query<ClipStats, ClipStats_ByWeek>()
.FirstOrDefault(x => x.Starting == startOfWeek);
var startOfMonth = today.AddDays(1 - today.Day);
var monthlyStats = session.Query<ClipStats, ClipStats_ByMonth>()
.FirstOrDefault(x => x.Starting == startOfMonth);
In the results, you will have totals. So if you want percent averages for your SLA, simply divide the statistic by the count, which is also returned.
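For example, the daily numbers could be turned into percentages like this (a small sketch of that last step, assuming dailyStats came back non-null from the query above):
if (dailyStats != null && dailyStats.CountClips > 0)
{
    // Divide each statistic by the clip count to get the SLA percentages.
    double pctWithinTwentyPct = 100.0 * dailyStats.NumPassedWithinTwentyPct / dailyStats.CountClips;
    double pctPlayableInHour = 100.0 * dailyStats.NumPlayableWithinOneHour / dailyStats.CountClips;
    Console.WriteLine("{0:yyyy-MM-dd}: {1:F1}% within 20% of duration, {2:F1}% playable within an hour",
        dailyStats.Starting, pctWithinTwentyPct, pctPlayableInHour);
}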
I have a collection of items that contain an Enum (TypeCode) and a User object, and I need to flatten it out to show in a grid. It's hard to explain, so let me show a quick example.
Collection has items like so:
TypeCode | User
---------------
1 | Don Smith
1 | Mike Jones
1 | James Ray
2 | Tom Rizzo
2 | Alex Homes
3 | Andy Bates
I need the output to be:
1 | 2 | 3
Don Smith | Tom Rizzo | Andy Bates
Mike Jones | Alex Homes |
James Ray | |
I've tried doing this using foreach, but I can't do it that way because I'd be inserting new items to the collection in the foreach, causing an error.
Can this be done in Linq in a cleaner fashion?
I'm not saying it is a great way to pivot - but it is a pivot...
// sample data
var data = new[] {
new { Foo = 1, Bar = "Don Smith"},
new { Foo = 1, Bar = "Mike Jones"},
new { Foo = 1, Bar = "James Ray"},
new { Foo = 2, Bar = "Tom Rizzo"},
new { Foo = 2, Bar = "Alex Homes"},
new { Foo = 3, Bar = "Andy Bates"},
};
// group into columns, and select the rows per column
var grps = from d in data
group d by d.Foo
into grp
select new {
Foo = grp.Key,
Bars = grp.Select(d2 => d2.Bar).ToArray()
};
// find the total number of (data) rows
int rows = grps.Max(grp => grp.Bars.Length);
// output columns
foreach (var grp in grps) {
Console.Write(grp.Foo + "\t");
}
Console.WriteLine();
// output data
for (int i = 0; i < rows; i++) {
foreach (var grp in grps) {
Console.Write((i < grp.Bars.Length ? grp.Bars[i] : null) + "\t");
}
Console.WriteLine();
}
Marc's answer gives a sparse matrix that can't be pumped into a grid directly.
I tried to expand the code from the link provided by Vasu as below:
public static Dictionary<TKey1, Dictionary<TKey2, TValue>> Pivot3<TSource, TKey1, TKey2, TValue>(
this IEnumerable<TSource> source
, Func<TSource, TKey1> key1Selector
, Func<TSource, TKey2> key2Selector
, Func<IEnumerable<TSource>, TValue> aggregate)
{
return source.GroupBy(key1Selector).Select(
x => new
{
X = x.Key,
Y = source.GroupBy(key2Selector).Select(
z => new
{
Z = z.Key,
V = aggregate(from item in source
where key1Selector(item).Equals(x.Key)
&& key2Selector(item).Equals(z.Key)
select item
)
}
).ToDictionary(e => e.Z, o => o.V)
}
).ToDictionary(e => e.X, o => o.Y);
}
internal class Employee
{
public string Name { get; set; }
public string Department { get; set; }
public string Function { get; set; }
public decimal Salary { get; set; }
}
public void TestLinqExtenions()
{
var l = new List<Employee>() {
new Employee() { Name = "Fons", Department = "R&D", Function = "Trainer", Salary = 2000 },
new Employee() { Name = "Jim", Department = "R&D", Function = "Trainer", Salary = 3000 },
new Employee() { Name = "Ellen", Department = "Dev", Function = "Developer", Salary = 4000 },
new Employee() { Name = "Mike", Department = "Dev", Function = "Consultant", Salary = 5000 },
new Employee() { Name = "Jack", Department = "R&D", Function = "Developer", Salary = 6000 },
new Employee() { Name = "Demy", Department = "Dev", Function = "Consultant", Salary = 2000 }};
var result5 = l.Pivot3(emp => emp.Department, emp2 => emp2.Function, lst => lst.Sum(emp => emp.Salary));
var result6 = l.Pivot3(emp => emp.Function, emp2 => emp2.Department, lst => lst.Count());
}
I can't say anything about the performance, though.
You can use Linq's .ToLookup to group in the manner you are looking for.
var lookup = data.ToLookup(d => d.TypeCode, d => d.User);
Then it's a matter of putting it into a form that your consumer can make sense of. For instance:
//Warning: untested code
var enumerators = lookup.Select(g => g.GetEnumerator()).ToList();
int columns = enumerators.Count;
while(columns > 0)
{
for(int i = 0; i < enumerators.Count; ++i)
{
var enumerator = enumerators[i];
if(enumerator == null) continue;
if(!enumerator.MoveNext())
{
--columns;
enumerators[i] = null;
}
}
yield return enumerators.Select(e => (e != null) ? e.Current : null);
}
Put that in an IEnumerable<> method and it will (probably) return a collection (rows) of collections (columns) of User, where a null is put in a column that has no data.
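For reference, that wrapping might look roughly like the sketch below (still untested in spirit, assuming an integer TypeCode and a User class as in the other answers; PivotRows is just an illustrative name). It also skips the final pass once every enumerator is exhausted, so no trailing all-null row is produced.
static IEnumerable<IEnumerable<User>> PivotRows(ILookup<int, User> lookup)
{
    // One enumerator per column (per TypeCode group).
    var enumerators = lookup.Select(g => g.GetEnumerator()).ToList();
    int columns = enumerators.Count;
    while (columns > 0)
    {
        for (int i = 0; i < enumerators.Count; ++i)
        {
            var enumerator = enumerators[i];
            if (enumerator == null) continue;
            if (!enumerator.MoveNext())
            {
                --columns;            // this column has no more users
                enumerators[i] = null;
            }
        }
        if (columns > 0)
            yield return enumerators.Select(e => e != null ? e.Current : null).ToList();
    }
}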
I guess this is similar to Marc's answer, but I'll post it since I spent some time working on it. The results are separated by " | " as in your example. It also uses the IGrouping<int, string> type returned from the LINQ query when using a group by instead of constructing a new anonymous type. This is tested, working code.
var Items = new[] {
new { TypeCode = 1, UserName = "Don Smith"},
new { TypeCode = 1, UserName = "Mike Jones"},
new { TypeCode = 1, UserName = "James Ray"},
new { TypeCode = 2, UserName = "Tom Rizzo"},
new { TypeCode = 2, UserName = "Alex Homes"},
new { TypeCode = 3, UserName = "Andy Bates"}
};
var Columns = from i in Items
group i.UserName by i.TypeCode;
Dictionary<int, List<string>> Rows = new Dictionary<int, List<string>>();
int RowCount = Columns.Max(g => g.Count());
for (int i = 0; i <= RowCount; i++) // Row 0 is the header row.
{
Rows.Add(i, new List<string>());
}
int RowIndex;
foreach (IGrouping<int, string> c in Columns)
{
Rows[0].Add(c.Key.ToString());
RowIndex = 1;
foreach (string user in c)
{
Rows[RowIndex].Add(user);
RowIndex++;
}
for (int r = RowIndex; r <= RowCount; r++)
{
Rows[r].Add(string.Empty);
}
}
foreach (List<string> row in Rows.Values)
{
Console.WriteLine(row.Aggregate((current, next) => current + " | " + next));
}
Console.ReadLine();
I also tested it with this input:
var Items = new[] {
new { TypeCode = 1, UserName = "Don Smith"},
new { TypeCode = 3, UserName = "Mike Jones"},
new { TypeCode = 3, UserName = "James Ray"},
new { TypeCode = 2, UserName = "Tom Rizzo"},
new { TypeCode = 2, UserName = "Alex Homes"},
new { TypeCode = 3, UserName = "Andy Bates"}
};
Which produced the following results showing that the first column doesn't need to contain the longest list. You could use OrderBy to get the columns ordered by TypeCode if needed.
1 | 3 | 2
Don Smith | Mike Jones | Tom Rizzo
| James Ray | Alex Homes
| Andy Bates |
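Regarding that OrderBy suggestion, a small sketch of keeping the columns in TypeCode order would be:
var Columns = from i in Items
              group i.UserName by i.TypeCode into g
              orderby g.Key
              select g;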
@Sanjaya.Tio I was intrigued by your answer and created this adaptation, which minimizes keySelector execution. (untested)
public static Dictionary<TKey1, Dictionary<TKey2, TValue>> Pivot3<TSource, TKey1, TKey2, TValue>(
this IEnumerable<TSource> source
, Func<TSource, TKey1> key1Selector
, Func<TSource, TKey2> key2Selector
, Func<IEnumerable<TSource>, TValue> aggregate)
{
var lookup = source.ToLookup(x => new {Key1 = key1Selector(x), Key2 = key2Selector(x)});
List<TKey1> key1s = lookup.Select(g => g.Key.Key1).Distinct().ToList();
List<TKey2> key2s = lookup.Select(g => g.Key.Key2).Distinct().ToList();
var resultQuery =
from key1 in key1s
from key2 in key2s
let lookupKey = new {Key1 = key1, Key2 = key2}
let g = lookup[lookupKey]
let resultValue = g.Any() ? aggregate(g) : default(TValue)
select new {Key1 = key1, Key2 = key2, ResultValue = resultValue};
Dictionary<TKey1, Dictionary<TKey2, TValue>> result = new Dictionary<TKey1, Dictionary<TKey2, TValue>>();
foreach(var resultItem in resultQuery)
{
TKey1 key1 = resultItem.Key1;
TKey2 key2 = resultItem.Key2;
TValue resultValue = resultItem.ResultValue;
if (!result.ContainsKey(key1))
{
result[key1] = new Dictionary<TKey2, TValue>();
}
var subDictionary = result[key1];
subDictionary[key2] = resultValue;
}
return result;
}
I'm looking for the LINQ equivalent of Sybase's LIST() or MySQL's group_concat().
It'll convert:
User Hobby
--------------
Bob Football
Bob Golf
Bob Tennis
Sue Sleeping
Sue Drinking
To:
User Hobby
--------------
Bob Football, Golf, Tennis
Sue Sleeping, Drinking
That's the GroupBy operator. Are you using LINQ to Objects?
Here's an example:
using System;
using System.Collections.Generic;
using System.Linq;
public class Test
{
static void Main()
{
var users = new[]
{
new { User="Bob", Hobby="Football" },
new { User="Bob", Hobby="Golf" },
new { User="Bob", Hobby="Tennis" },
new { User="Sue", Hobby="Sleeping" },
new { User="Sue", Hobby="Drinking" },
};
var groupedUsers = users.GroupBy(user => user.User);
foreach (var group in groupedUsers)
{
Console.WriteLine("{0}: ", group.Key);
foreach (var entry in group)
{
Console.WriteLine(" {0}", entry.Hobby);
}
}
}
}
That does the grouping - can you manage the rest yourself?
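If not, the inner loop can be collapsed with string.Join to produce the comma-separated output (a small sketch of that remaining step):
foreach (var group in groupedUsers)
{
    // string.Join concatenates each user's hobbies into one comma-separated string.
    Console.WriteLine("{0}: {1}", group.Key,
        string.Join(", ", group.Select(entry => entry.Hobby)));
}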
See if this solution helps you:
List<User> users = new List<User>()
{
new User {Name = "Bob", Hobby = "Football" },
new User {Name = "Bob", Hobby = "Golf"},
new User {Name = "Bob", Hobby = "Tennis"},
new User {Name = "Sue", Hobby = "Sleeping"},
new User {Name = "Sue", Hobby = "Drinking"}
};
var groupedUsers = from u in users
group u by u.Name into g
select new
{
Name = g.First<User>().Name,
Hobby = g.Select(u => u.Hobby)
};
foreach (var user in groupedUsers)
{
Console.WriteLine("Name: {0}", user.Name);
foreach (var hobby in user.Hobby)
{
Console.WriteLine("Hobby: {0}", hobby);
}
}
Regarding the _concat aspect of your question, using:
static class EnumerableExtensions
{
public static String AsJoined( this IEnumerable<String> enumerable )
{
return AsJoined( enumerable, "," );
}
public static String AsJoined( this IEnumerable<String> enumerable, String separator )
{
return String.Join( separator, enumerable.ToArray() );
}
}
The outputting foreach in bruno conde and Jon Skeet's answers can become:
Console.WriteLine( "User:\tHobbies");
foreach ( var group in groupedUsers )
Console.WriteLine( "{0}:\t{1}", group.Key, group.Select( g => g.Hobby ).AsJoined( ", " ) );
... and you'll get the precise output format you asked for (yes, I know the others have already solved your problem, but it's hard to resist!).
Or else we can do the following:
var users = new[]
{
new { User="Bob", Hobby="Football" },
new { User="Bob", Hobby="Golf" },
new { User="Bob", Hobby="Tennis" },
new { User="Sue", Hobby="Sleeping" },
new { User="Sue", Hobby="Drinking" },
};
var userList = users.ToList();
var ug = (from user in users
group user by user.User into groupedUserList
select new { user = groupedUserList.Key, hobby = groupedUserList.Select(g =>g.Hobby)});
var ug2 = (from groupeduser in ug
select new{ groupeduser.user, hobby =string.Join(",", groupeduser.hobby)});
To do it in one LINQ statement: there's no way I'd recommend this code, but it shows that it could be done.
var groupedUsers = from user in users
group user by user.User into userGroup
select new
{
User = userGroup.Key,
userHobies =
userGroup.Aggregate((a, b) =>
new { User = a.User, Hobby = (a.Hobby + ", " + b.Hobby) }).Hobby
}
;
foreach (var x in groupedUsers)
{
Debug.WriteLine(String.Format("{0} {1}", x.User, x.userHobies));
}
None of the answers above was quite enough for my case, because this is a database query and all of them do the work in memory. The difference is that some operations that work in memory cannot be translated into a store expression.
var list = db.Users.GroupBy(s => s.User)
    .Select(g => new { user = g.Key, hobbies = g.Select(s => s.Hobby) }); // this part can be translated by the provider
var result = list.ToList(); // this is important: it pulls the data into memory
var result2 = result.Select(x => new { x.user, hobbies = string.Join(",", x.hobbies) }); // then do whatever you like in memory