I have not found a way to make this happen or a site that has something close. I may just not know how to ask about this operation, so I will try here.
What I have is a CSV listing each employee's hours worked each day (regular and OT), all in separate rows. The columns have headers; Employee Name, Date, Regular Hours, and OT Hours are each in their own column.
The challenge is that for employee Bob, there may be 20 rows for one day. Bob applies a slice of time to different projects all day. Then of course there are multiple days and multiple employees.
What I am trying to end up with is a report that shows all of the regular and OT hours (separately) for each employee on a daily basis.
I started with this, which parses the CSV and gives me just the columns of interest. However, this outputs just a long list of every matching row, and that is not going to get me where I need to go.
$employees = Import-Csv "C:\Test1.csv"
ForEach ($employee in $employees) {
    $employeeName = $employee.'Resource name'
    $WorkDate = $employee.'Charge Date'
    $RegHours = $employee.RegHours
    $OT = $employee.OTHours
    Write-Host $employeeName $WorkDate $RegHours $OT
}
I then attempted to format the output as a first step toward consolidating it, because it seems to me that I will need two consolidations: one to group each user's data together, separate from the other users, and one to group all rows for the same date together under each user. Then a SUM can be performed on each day, for each user. At least that is how I am viewing it at this point.
So this is the formatting attempt. But this just gives headers to each row; it does not consolidate. The headers are nice, but the user data needs to be consolidated under the header for it to be useful.
$employees = Import-Csv "C:\test2.csv"
ForEach ($employee in $employees)
{
    $employeeName = $employee.'Resource name'
    $WorkDate = $employee.'Charge Date'
    $RegHours = $employee.RegHours
    $OT = $employee.OTHours
    $obj = $null
    $obj = New-Object System.Object
    $obj | Add-Member -MemberType NoteProperty -Name "Employee Name" -Value $employeeName
    $obj | Add-Member -MemberType NoteProperty -Name "Work Date" -Value $WorkDate
    $obj | Add-Member -MemberType NoteProperty -Name "Regular Hours" -Value $RegHours
    $obj | Format-Table
}
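Not part of the attempts above, but a minimal sketch of the consolidation step itself, assuming the same headers (Resource name, Charge Date, RegHours, OTHours) and the C:\Test1.csv path from the first snippet: Group-Object builds one group per employee-and-date combination, and Measure-Object performs the SUM inside each group.

# Group every row by employee + date, then sum the hours in each group.
# CSV values come in as strings, so cast them to [double] before summing.
Import-Csv "C:\Test1.csv" |
    Group-Object 'Resource name', 'Charge Date' |
    ForEach-Object {
        [pscustomobject]@{
            'Employee Name' = $_.Group[0].'Resource name'
            'Work Date'     = $_.Group[0].'Charge Date'
            'Regular Hours' = ($_.Group | ForEach-Object { [double]$_.RegHours } | Measure-Object -Sum).Sum
            'OT Hours'      = ($_.Group | ForEach-Object { [double]$_.OTHours } | Measure-Object -Sum).Sum
        }
    } |
    Sort-Object 'Employee Name', 'Work Date' |
    Format-Table

Sort-Object is only there to keep each employee's days together in the output; Export-Csv could replace Format-Table if the report needs to land in a file.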
I have created an alert policy in GCP Monitoring which will notify me when a certain kind of log message stops appearing (a dead man's switch). I have created a logs-based metric with a label, "client", which I use to group the metric and get a timeseries per client. I have been using "absence of data" as the trigger for the alert. This has all been working well, until...
After a recent change, the logs now also come from different resources, so there is a need to combine the metric across those resources. I can achieve this using MQL:
{ fetch gce_instance::logging.googleapis.com/user/ping
| group_by [metric.client], sum(val())
| every 30m
; fetch global::logging.googleapis.com/user/ping
| group_by [metric.client], sum(val())
| every 30m }
| union
Notice that I need to align the two series with the same bucket size (30m) to be able to join them, which makes sense. I noticed (by downloading a CSV of the query) that the value for a timeseries is "undefined" in those buckets where the metric data was absent.
To create an alert using this query, I tried something like this:
{ fetch gce_instance::logging.googleapis.com/user/ping
| group_by [metric.client], sum(val())
| every 30m
; fetch global::logging.googleapis.com/user/ping
| group_by [metric.client], sum(val())
| every 30m }
| union
| absent_for 1h
If I look at the CSV output for this query, it doesn't reflect the absence of metric data for a timeseries, presumably because a value of "undefined" doesn't qualify as absent data.
Is there a way to detect absence of data for a "unioned" (and therefore aligned) metric across multiple resources?
Update 1
I have tried this, which seems to get me some of the way there. I'd really appreciate comments on this approach.
{
fetch gce_instance::logging.googleapis.com/user/ping
| group_by [metric.client], sum(val())
;
fetch global::logging.googleapis.com/user/ping
| group_by [metric.client], sum(val())
}
| union
| absent_for 1h
I have settled on a solution as follows:
{
fetch gce_instance::logging.googleapis.com/user/ping
| group_by [metric.client]
;
fetch global::logging.googleapis.com/user/ping
| group_by [metric.client]
}
| union
| absent_for 1h
| every 30m
Note:
group_by [metric.client] conforms the tables from the different resources, which allows the union to work
absent_for does align input timeseries using the default period or one specified by a following every
I found it really hard to debug these MQL queries, in particular to confirm that absent_for was going to trigger an alert. I realised that I could use value [active] to show a plot of the active column (which absent_for produces), and that gave me confidence that my alert was actually going to work.
{
fetch gce_instance::logging.googleapis.com/user/ping
| group_by [metric.client]
;
fetch global::logging.googleapis.com/user/ping
| group_by [metric.client]
}
| union
| absent_for 1h
| value [active]
I'm quite new to PowerShell. I am trying to filter the result of the cmdlet Get-MailboxStatistics, because I want statistics on a subset of items in the mailbox; specifically, I'd like to know the number of items received before a certain date and, in addition, their aggregated size.
I tried this:
$stats = Get-MailboxStatistics -identity $username | where {$_.EndDate -eq $age}
But this attempt doesn't work and gives me an empty result.
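As far as I can tell, Get-MailboxStatistics only reports whole-mailbox totals (ItemCount, TotalItemSize and so on) and its output has no EndDate property, which is why the Where-Object filter above matches nothing. A hedged sketch of one alternative, assuming an Exchange environment where the Search-Mailbox cmdlet is still available; the cut-off date below is a placeholder and the KQL date syntax may need adjusting for your locale:

# Estimate-only search: reports how many items match the query and their total
# size, without copying or deleting anything. The date is a placeholder.
$estimate = Search-Mailbox -Identity $username -SearchQuery 'received<01/01/2019' -EstimateResultOnly

$estimate.ResultItemsCount   # number of items received before the date
$estimate.ResultItemsSize    # aggregated size of those items

-EstimateResultOnly makes this a dry run: nothing is moved or deleted, you just get the match count and total size back.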
I am a newbie in UIPath.
I have a DataTable with these headers:
1.) Date
2.) Error
I want to extract a Distinct Date for every error, and use this code:
dtQuery = ExtractDataTable.DefaultView.ToTable(True,{"Date","Error"})
With this, I get my desired result. My problem is: how can I append a new column, "Count", holding the count of each distinct Date/Error combination? For example:
DATE | ERROR | COUNT
2/27/2019 | Admin Query String |
2/27/2019 | 404 Shield |
2/26/2019 | 404 Shield |
2/25/2019 | 404 Shield |
2/25/2019 | Admin Query String |
I tried to use ADD DATA COLUMN ACTIVITY with these properties:
Column Name = "COUNT"
Data Table = dtQuery
DefaultValue = ExtractDataTable.DefaultView.ToTable(True,{"Date","Error"}).Rows.Count
But by using this, it gives me this:
DATE | ERROR | COUNT
2/27/2019 | Admin Query String | 5
2/27/2019 | 404 Shield | 5
2/26/2019 | 404 Shield | 5
2/25/2019 | 404 Shield | 5
2/25/2019 | Admin Query String | 5
Thanks in advance! Happy coding!
After hours of research, here is what I learned.
I can iterate on each item of the datatable by using FOR EACH ROW Activity.
So for every row item of my dtQuery, I add ASSIGN Activity that looks like this:
row(2) = [item i want to add]
But that doesn't answer my question. I want to know the count of each unique item with two criteria: the same DATE and ERROR.
Maybe I can code directly against the Excel file?
So I researched Excel formulas along the lines of "Select Distinct Col1....etc."
I found this video tutorial, hope it might help: Countif
But it's only for a single criterion, so I found this: Countifs
So to wrap it up,
(screenshot: the For Each Row activity in the workflow)
1.) I loop inside dtQuery using a For Each Row Activity.
2.) Inside the loop, I add an Assign Activity with this code:
row(2) = "=COUNTIFS('LookUp Sheet'!B:B,'Result Sheet'!A" & indexerRow + 2 & ",'LookUp Sheet'!D:D,'Result Sheet'!B" & indexerRow + 2 & ")"
Hope this helps others who stumble upon the same problem. Happy Automating! ^_^
I know it might be impossible to do what I need, but I'm searching for a way to ensure that a user inputs a date in the right format.
I have written a piece of code that can validate dates whose DD (day) part is between 13 and 31, but I'm struggling with dates that have DD between 1 and 12.
For example:
I can validate the date 23/06/2016 and fix it to be 06/23/2016,
but I can't ensure that the 12th of June is entered correctly.
If someone inputs 12/06/2016, it's by definition a valid date, but not the one I need (06/12/2016), and I can't really make sure that the date is the one I intended it to be.
If anyone can please point me to a solution, it would be highly appreciated.
If it makes any difference, this is my code:
$Date = $args[0]
$Time = $args[1]
try
{
    # Try the date exactly as entered first
    $tmpDate = $Date + " " + $Time
    [DateTime]$UserDate = $tmpDate
}
catch
{
    # Fall back to swapping day and month (DD/MM/YYYY -> MM/DD/YYYY)
    $tmp = $Date
    $tmp -match "(?<d>.*)/(?<m>.*)/(?<y>.*)" > $null
    $_Date = $matches['m'] + "/" + $matches['d'] + "/" + $matches['y']
    $tmpDate = $_Date + " " + $Time
    try
    {
        [DateTime]$UserDate = $tmpDate
    }
    catch
    {
        Write-Host "Error: The given date was not recognized as a valid date, please try again" -ForegroundColor Red
        Exit 808040
    }
}
The way most programs handle this is to ask the user to enter the date in a specific format. Displaying something like YYYY/MM/DD near the input box or prompt helps the user understand what the program will accept. Then you can validate each part of whatever the user entered and warn them if any part of the date is wrong.
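A minimal sketch of that approach in PowerShell, assuming the script prompts interactively (the original takes $args instead) and that MM/dd/yyyy is the format being enforced; swap in whatever format string you actually display to the user:

# State the expected format up front, then parse against exactly that format.
$raw = Read-Host "Enter the date (MM/dd/yyyy)"

$parsed = [datetime]::MinValue
$ok = [datetime]::TryParseExact($raw, 'MM/dd/yyyy', [System.Globalization.CultureInfo]::InvariantCulture, [System.Globalization.DateTimeStyles]::None, [ref]$parsed)

if (-not $ok) {
    Write-Host "Error: '$raw' is not a valid MM/dd/yyyy date. Please try again." -ForegroundColor Red
    Exit 1
}
# $parsed now holds the date in the intended format.

Because the format is stated up front and parsed exactly, an input of 12/06/2016 can only mean December 6th here; the day/month ambiguity never arises.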
I have a treeview widget in my Tcl/Tk application that will often show duplicate records. I tried writing "lsort -unique" and "lrmdups" into my code to automatically delete the treeview duplicates, but with no luck. If possible, does anyone know how to do this?
If you are asking about the ttk::treeview widget that is provided with Tk 8.5 and above, then one way to ensure unique entries is to be careful about the -id parameter. It will automatically prevent duplicate items with the same id:
% pack [ttk::treeview .tv -columns {One Two}] -fill both -expand 1
% .tv insert {} end -id id1 -text First -values {1st first}
id1
% .tv insert {} end -id id1 -text Second -values {2nd second}
Item id1 already exists
If you know the id of an item to delete, you can just do this:
.tv delete $id