How to get the commit history of users in GitLab in Rails/Ruby

I am trying to create an app that fetches a user's GitLab/GitHub commit history so I can show it in a side panel, where it can be toggled on or off depending on my criteria. Is there a way to fetch the currently logged-in user's GitLab/GitHub commit history? I tried the gem
https://github.com/libgit2/rugged
but couldn't find a way to implement what I need. If anyone knows how to implement this, it would be very helpful. Thanks.
Update
I have now managed to get the user by using:
Gitlab.endpoint = 'https://gitlab.com/api/v4'
Gitlab.private_token = 'token'

g = Gitlab.client(
  endpoint: 'https://gitlab.com/api/v4',
  private_token: 'token',
  httparty: {
    headers: { 'Cookie' => 'gitlab_canary=true' }
  }
)
By calling g.user I am able to get the user, but I still need the commits that the user has made on GitLab.

Use the GitLab API endpoint GET /projects/:id/repository/commits to fetch all the commits on a repository (see the GitLab API docs). Check the attached code for more details.
Basically, the command git log --author="user_name" can give you the git commit history for a specific user; as the author value you can use an email address, or even just a first or last name.
Once you have authenticated with GitLab you can run the following command from Ruby:
cmd = 'git log --author="user_name"'
system(cmd)
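If the author name ever comes from user input, interpolating it into a shell string is risky. A minimal sketch (the helper name is my own) that passes the arguments as an array instead, avoiding shell injection:

```ruby
require 'open3'

# Hypothetical helper: list one-line commit summaries for a given author.
# Passing arguments as an array (not an interpolated string) means no shell
# is involved, so the author name cannot inject extra commands.
def commits_for_author(author, repo_dir = Dir.pwd)
  stdout, status = Open3.capture2('git', '-C', repo_dir, 'log',
                                  '--author', author, '--oneline')
  status.success? ? stdout.lines.map(&:chomp) : []
rescue Errno::ENOENT
  [] # git itself is not installed
end
```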
Sample Ruby code to connect to GitLab using a private token, which is not ideal but serves as an example:
require 'json'
require 'curb'
require 'net/http'
require 'yaml'

begin
  def parse_coverage_report(report_text)
    coverage_perc = report_text.match(/All files\s+\|\s+(\d+\.?\d+).*\n/)
    coverage_perc ||= report_text.match(/^TOTAL\s+\d+\s+\d+\s+(\d+)%$/)
    # return the coverage value if we found one in the job trace
    coverage_perc[1].to_i if coverage_perc
  end

  gen_config = YAML.load_file("config/general.yml")
  gitlab_config = YAML.load_file("config/gitlab.yml")

  SCHEDULER.every gen_config[:job_trigger_interval], first_in: 0 do |job|
    table = {
      title: "Projects",
      hrows: [{ cols: [
        { value: "Project name" },
        { value: "Open Merge Requests" },
        { value: "Code coverage" }
      ] }],
      rows: []
    }
    instances = gitlab_config['instances']
    instances.each do |instance|
      instance_config = gitlab_config['instances'][instance.first]
      gitlab_url = instance_config['url']
      gitlab_token = ENV[instance_config['api_key']]
      red_threshold = instance_config['red_threshold']
      orange_threshold = instance_config['orange_threshold']
      cov_red_threshold = instance_config['cov_red_threshold']
      cov_orange_threshold = instance_config['cov_orange_threshold']
      projects = instance_config['projects']

      projects.each do |name, project|
        merge_reqs = JSON.parse(Curl.get("#{gitlab_url}/api/v4/projects/#{project['id']}/merge_requests?state=opened&private_token=#{gitlab_token}&per_page=200").body_str)
        git_project = JSON.parse(Curl.get("#{gitlab_url}/api/v4/projects/#{project['id']}?private_token=#{gitlab_token}").body_str)
        opened_mrs = merge_reqs.select { |merge_req| %w[opened].include? merge_req['state'] }
        repo_name = git_project['name']
        repo_url = git_project['web_url']

        status = case
                 when opened_mrs.size >= red_threshold then 'danger'
                 when opened_mrs.size >= orange_threshold then 'warning'
                 else 'ok'
                 end
        mrs_count = opened_mrs.size.to_s
        send_event("#{name}_mr", { current: mrs_count, status: status })

        color = case
                when opened_mrs.size >= red_threshold then 'red'
                when opened_mrs.size >= orange_threshold then 'orange'
                else 'green'
                end
        font_color = color == 'orange' ? 'black' : 'white'
        cov_color = color

        code_coverage = "---"
        code_coverage_tag = "---"
        cov_job_url = ''
        jobs = JSON.parse(Curl.get("#{gitlab_url}/api/v4/projects/#{project['id']}/jobs?scope=success&private_token=#{gitlab_token}&per_page=30").body_str)
        code_cov_job = jobs.find { |gitlab_job| !gitlab_job['coverage'].nil? }
        unless code_cov_job
          # no job has the 'coverage' feature set up in GitLab, so try to
          # parse the coverage from the job traces manually
          jobs.each do |gitlab_job|
            trace_report = Curl.get("#{gitlab_url}/api/v4/projects/#{project['id']}/jobs/#{gitlab_job['id']}/trace?private_token=#{gitlab_token}").body_str
            code_cov_percentage = parse_coverage_report(trace_report)
            if code_cov_percentage
              code_cov_job = gitlab_job
              code_cov_job['coverage'] = code_cov_percentage
              break
            end
          end
        end

        if code_cov_job
          # found code coverage data => process it
          code_coverage = code_cov_job['coverage'].to_i
          cov_job_url = code_cov_job['web_url'].to_s
          # update the code coverage SprintProgress widget in the same job
          widget_title = "code_coverage_progress_#{project['id']}"
          send_event(widget_title, {
            title: "Code Coverage - #{git_project['name']}",
            sprintboard_url: cov_job_url,
            min: 0,
            max: 100,
            value: code_coverage,
            moreinfo: ''
          })
          cov_color = case
                      when code_coverage <= cov_red_threshold then 'red'
                      when code_coverage <= cov_orange_threshold then 'orange'
                      else 'green'
                      end
          code_coverage = "#{code_coverage}%"
          code_coverage_tag = "<a href='#{cov_job_url}' target='_blank'>#{code_coverage}</a>"
        end

        repo_name_a_tag = "<a href='#{repo_url}' target='_blank'>#{repo_name}</a>"
        open_mrs_size = "<a href='#{repo_url}/merge_requests' target='_blank'>#{opened_mrs.size}</a>"
        table[:rows].push({
          cols: [
            { value: repo_name_a_tag, style: "color: #{font_color}; background-color: #{color}" },
            { value: open_mrs_size, style: "color: #{font_color}; background-color: #{color}" },
            { value: code_coverage_tag, style: "color: #{cov_color == 'orange' ? 'black' : 'white'}; background-color: #{cov_color}" }
          ]
        })
      end
    end
    send_event('open_merge_requests_table', table)
  end
rescue Errno::ENOENT
  puts "No config file found for gitlab - not starting the Gitlab job"
end
In the above Ruby example, have a look at the following snippet:
merge_reqs = JSON.parse(Curl.get("#{gitlab_url}/api/v4/projects/#{project['id']}/merge_requests?state=opened&private_token=#{gitlab_token}&per_page=200").body_str)
git_project = JSON.parse(Curl.get("#{gitlab_url}/api/v4/projects/#{project['id']}?private_token=#{gitlab_token}").body_str)
opened_mrs = merge_reqs.select { |merge_reqs| %w[opened].include? merge_reqs['state'] }
repo_name = git_project['name']
repo_url = git_project['web_url']
Here I connect to our GitLab instance using a private_token, and then, for a specific project id (which you can get from the GitLab UI), check for open merge requests. I also fetch the git_project, from which I read the name and web_url (which was my use case).
For your use case you will have to get the project id (from the GitLab UI) and then use an appropriate endpoint to get the commits; see the GitLab API docs.
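For the commits themselves, a hedged sketch using the documented GET /projects/:id/repository/commits endpoint (base URL, token, and project id are placeholders; I am using plain Net::HTTP here rather than curb or the gitlab gem):

```ruby
require 'json'
require 'net/http'
require 'uri'

# Build the documented commits URL: GET /projects/:id/repository/commits
def commits_uri(base_url, project_id, per_page: 20)
  URI("#{base_url}/api/v4/projects/#{project_id}/repository/commits?per_page=#{per_page}")
end

# Hypothetical helper: fetch and parse the commits for one project,
# authenticating with the PRIVATE-TOKEN header.
def fetch_commits(base_url, project_id, token, per_page: 20)
  uri = commits_uri(base_url, project_id, per_page: per_page)
  req = Net::HTTP::Get.new(uri)
  req['PRIVATE-TOKEN'] = token
  res = Net::HTTP.start(uri.host, uri.port, use_ssl: uri.scheme == 'https') do |http|
    http.request(req)
  end
  res.is_a?(Net::HTTPSuccess) ? JSON.parse(res.body) : []
end
```

Each element of the returned array is a commit hash with keys such as 'id', 'title', and 'author_name', per the GitLab commits API.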

Related

how to add extra value to query in ruby

Hello, I have a query that fills a list box, and I want to add a static value to it so the user can select all clients:
tipo = $evm.root['dialog_param_type']
option = $evm.root['dialog_param_action']
dialog_field = $evm.object

if tipo == "ALL"
  puts option
  dialog_field['read_only'] = "true"
  $evm.object['values'] = { 1 => "-- Please select an option from above --" }
else
  puts option
  dialog_field['read_only'] = "false"
  customer_list = {}
  get_query($evm.object.decrypt('query')).each do |item|
    customer_list[item['customer_id'].to_s] = item['customer_name'].to_s
  end
  dialog_field['values'] = customer_list
end
It fills the list with the clients from my DB; I wanted to add an extra option to select them all, as an additional value, something like:
values = { ALL => * }
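Assuming the goal is just a static first entry in the dropdown (the 'ALL' key and its label are placeholders), one way is to build a hash with that entry first and merge the per-customer values into it; Ruby hashes preserve insertion order, so the static entry stays at the top:

```ruby
# Stand-in for the rows that get_query returns from the DB.
customer_list = { '1' => 'Acme', '2' => 'Globex' }

# Put the static "all clients" entry first, then the real customers.
dropdown = { 'ALL' => '-- All clients --' }.merge(customer_list)
# dropdown.keys => ["ALL", "1", "2"]
```

The automation code that consumes the selection would then treat the 'ALL' key as "no filter".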

Google Analytics API ruby client - multiple metrics

I'm using the Google API Ruby client and I want to implement some more complex Analytics queries, such as those suggested in this document:
https://developers.google.com/analytics/devguides/reporting/core/v3/common-queries
This document suggests that metrics can be supplied as a comma-delimited string of multiple metrics, but the API client only accepts an expression.
How can I query on multiple metrics in a single query? The Ruby client appears to accept only an expression, which generally consists of a single metric such as sessions or pageviews, like this:
metric = Google::Apis::AnalyticsreportingV4::Metric.new(expression: 'ga:sessions')
If I remove "expression" and supply a list of metrics, I just get an error:
Invalid value 'ga:sessions;ga:pageviews' for metric parameter.
Here is my solution, together with a generic method for reporting Google Analytics data:
This answer should be read in conjunction with https://developers.google.com/drive/v3/web/quickstart/ruby
analytics = Google::Apis::AnalyticsreportingV4::AnalyticsReportingService.new
analytics.client_options.application_name = APPLICATION_NAME
analytics.authorization = authorize

def get_analytics_data(analytics,
                       view_id,
                       start_date: (Date.today + 1 - Date.today.wday) - 6,
                       end_date: (Date.today + 1 - Date.today.wday),
                       metrics: ['ga:sessions'],
                       dimensions: [],
                       order_bys: [],
                       segments: nil, # array of hashes
                       filter: nil,
                       page_size: nil)
  get_reports_request_object = Google::Apis::AnalyticsreportingV4::GetReportsRequest.new
  report_request_object = Google::Apis::AnalyticsreportingV4::ReportRequest.new
  report_request_object.view_id = view_id

  analytics_date_range_object = Google::Apis::AnalyticsreportingV4::DateRange.new
  analytics_date_range_object.start_date = start_date
  analytics_date_range_object.end_date = end_date
  report_request_object.date_ranges = [analytics_date_range_object]

  report_request_object.metrics = []
  metrics.each do |metric|
    analytics_metric_object = Google::Apis::AnalyticsreportingV4::Metric.new
    analytics_metric_object.expression = metric
    report_request_object.metrics.push(analytics_metric_object)
  end

  unless dimensions.empty?
    report_request_object.dimensions = []
    dimensions.each do |dimension|
      analytics_dimension_object = Google::Apis::AnalyticsreportingV4::Dimension.new
      analytics_dimension_object.name = dimension
      report_request_object.dimensions.push(analytics_dimension_object)
    end
  end

  unless segments.nil?
    report_request_object.segments = []
    analytics_segment_object = Google::Apis::AnalyticsreportingV4::Segment.new
    analytics_dynamic_segment_object = Google::Apis::AnalyticsreportingV4::DynamicSegment.new
    analytics_segment_definition_object = Google::Apis::AnalyticsreportingV4::SegmentDefinition.new
    analytics_segment_filter_object = Google::Apis::AnalyticsreportingV4::SegmentFilter.new
    analytics_simple_segment_object = Google::Apis::AnalyticsreportingV4::SimpleSegment.new
    analytics_or_filters_for_segment_object = Google::Apis::AnalyticsreportingV4::OrFiltersForSegment.new
    analytics_segment_filter_clause_object = Google::Apis::AnalyticsreportingV4::SegmentFilterClause.new
    analytics_segment_metric_filter_object = Google::Apis::AnalyticsreportingV4::SegmentMetricFilter.new
    analytics_dimension_object = Google::Apis::AnalyticsreportingV4::Dimension.new
    analytics_dimension_object.name = 'ga:segment'
    report_request_object.dimensions ||= [] # segment queries still need the ga:segment dimension
    report_request_object.dimensions.push(analytics_dimension_object)
    analytics_or_filters_for_segment_object.segment_filter_clauses = []
    analytics_simple_segment_object.or_filters_for_segment = []
    analytics_segment_definition_object.segment_filters = []
    segments.each do |segment|
      analytics_segment_metric_filter_object.metric_name = segment[:metric_name]
      analytics_segment_metric_filter_object.comparison_value = segment[:comparison_value]
      analytics_segment_metric_filter_object.operator = segment[:operator]
      analytics_segment_filter_clause_object.metric_filter = analytics_segment_metric_filter_object
      analytics_or_filters_for_segment_object.segment_filter_clauses.push(analytics_segment_filter_clause_object)
      analytics_simple_segment_object.or_filters_for_segment.push(analytics_or_filters_for_segment_object)
      analytics_segment_filter_object.simple_segment = analytics_simple_segment_object
      analytics_segment_definition_object.segment_filters.push(analytics_segment_filter_object)
      analytics_dynamic_segment_object.name = segment[:name]
      analytics_dynamic_segment_object.session_segment = analytics_segment_definition_object
      analytics_segment_object.dynamic_segment = analytics_dynamic_segment_object
      report_request_object.segments.push(analytics_segment_object)
    end
  end

  unless order_bys.empty?
    report_request_object.order_bys = []
    order_bys.each do |orderby|
      analytics_orderby_object = Google::Apis::AnalyticsreportingV4::OrderBy.new
      analytics_orderby_object.field_name = orderby
      analytics_orderby_object.sort_order = 'DESCENDING'
      report_request_object.order_bys.push(analytics_orderby_object)
    end
  end

  report_request_object.filters_expression = filter unless filter.nil?
  report_request_object.page_size = page_size unless page_size.nil?

  get_reports_request_object.report_requests = [report_request_object]
  analytics.batch_get_reports(get_reports_request_object)
end
If using dimensions, you can report data like this:
response = get_analytics_data(analytics, VIEW_ID, metrics: ['ga:pageviews'],
                              dimensions: ['ga:pagePath'],
                              order_bys: ['ga:pageviews'], page_size: 25)
response.reports.first.data.rows.each do |row|
  puts row.dimensions
  puts row.metrics.first.values.first.to_i
  puts
end

How to save and display Dashing historical values?

Currently, to set up a graph widget, the job must pass all the values to be displayed:
data = [
  { "x" => 1980, "y" => 1323 },
  { "x" => 1981, "y" => 53234 },
  { "x" => 1982, "y" => 2344 }
]
I would like to read just the current (latest) value from my server, but previous values should also be displayed.
It looks like I could create a job which reads the current value from the server, while the remaining values are read from Redis (or an SQLite database, but I would prefer Redis). The current value would then be saved to the database.
I have never worked with Ruby or Dashing before, so the first question is: is this possible? If I use Redis, how should I store the data, given that it is a key-value database? I could keep keys like widget-id-1, widget-id-2, widget-id-3 ... widget-id-N, but then I would also have to store N itself (like widget-id=N). Is there a better way?
I came to the following solution:
require 'redis' # https://github.com/redis/redis-rb
require 'uri'

redis_uri = URI.parse(ENV["REDISTOGO_URL"])
redis = Redis.new(host: redis_uri.host, port: redis_uri.port, password: redis_uri.password)

if redis.exists('values_x') && redis.exists('values_y')
  values_x = redis.lrange('values_x', 0, 9) # get the latest 10 records
  values_y = redis.lrange('values_y', 0, 9) # get the latest 10 records
else
  values_x = []
  values_y = []
end

SCHEDULER.every '10s', first_in: 0 do |job|
  rand_data = (Date.today - rand(10000)).strftime("%d-%b") # replace this line with the code to get your data
  rand_value = rand(50) # replace this line with the code to get your data
  values_x << rand_data
  values_y << rand_value

  redis.multi do # execute as a single transaction
    redis.lpush('values_x', rand_data)
    redis.lpush('values_y', rand_value)
    # feel free to add more dataset values here, if required
  end

  data = [
    {
      label: 'dataset-label',
      fillColor: 'rgba(220,220,220,0.5)',
      strokeColor: 'rgba(220,220,220,0.8)',
      highlightFill: 'rgba(220,220,220,0.75)',
      highlightStroke: 'rgba(220,220,220,1)',
      data: values_y.last(10) # display the last 10 values only
    }
  ]
  options = { scaleFontColor: '#fff' }
  send_event('barchart', { labels: values_x.last(10), datasets: data, options: options })
end
Not sure if everything is implemented correctly here, but it works.
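One refinement worth considering (a sketch, not tested against a live Redis): cap each list with LTRIM inside the same MULTI, so the keys don't grow without bound as the job keeps pushing points. `redis` here is any object that responds to the redis-rb list commands:

```ruby
# Push a new data point and keep only the newest `max` entries of the list.
# Uses the block-argument form of MULTI from recent redis-rb versions.
def push_capped(redis, key, value, max = 10)
  redis.multi do |tx|
    tx.lpush(key, value)      # newest entry goes to the head of the list
    tx.ltrim(key, 0, max - 1) # drop everything past the first `max` entries
  end
end
```

The job above would call push_capped(redis, 'values_x', rand_data) and push_capped(redis, 'values_y', rand_value) instead of the bare LPUSHes.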

Ruby Dashing: Issues with Jobs (on a CentOS Server)

I'm seeing a weird issue where my Dashing dashboard's list widget shows erroneous data. Here are screenshots of my live Dashing widget:
Erroneous Widget
Expected Output
What follows is the code for the widget:
Code in .erb
<li data-row="1" data-col="1" data-sizex="2" data-sizey="6">
  <div data-id="facebook_insights" data-view="List" data-unordered="true" data-title="Facebook Insights: Weekly Post Views" data-moreinfo="Updated every 10 seconds"></div>
</li>
Code in job .rb
require 'mysql2'

social_count = Hash.new({ value: 0 })
time = Time.new
date_time1 = Time.new(time.year, time.month, time.day - 1)
...

SCHEDULER.every '10s' do
  begin
    db = Mysql.new(<HOST>, <USER>, <PASS>, <DBNAME>)
    mysql1 = "SELECT <VAR> FROM <TABLE> WHERE <VAR> = '#{date_time1}' ORDER BY <VAR> DESC LIMIT 1"
    ...
    result1 = db.query(mysql1)
    ...
  rescue
  ensure
    db.close
  end
  result1.each do |row|
    strrow1 = row[0]
    $value1 = strrow1.to_i
  end
  ...
  social_count[0] = { label: "1:", value: $value1 }
  ...
  send_event('facebook_insights', { items: social_count.values })
end
What is really baffling is that this code works for a similar widget that uses different data in the SQL query. Can anyone help me understand why?
I checked and re-checked the data, and in my other, working code I had my $value variables defined as $valueX, with X being the number. I thought to myself, "Maybe the variable names are getting confused because they share the same name," so I changed my code to:
Working Code
result1.each do |row|
  strrow1 = row[0]
  $variable1 = strrow1.to_i
end
...
social_count[0] = { label: "1:", value: $variable1 }
Et voilà! Eureka! It worked. I'm not sure why the names still got confused, but from now on my names will be unique!
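For what it's worth, the globals are a likely culprit: Dashing runs every job in the same process, so two jobs assigning the same $value1 can overwrite each other between reads. A local variable per job avoids the collision entirely; a small sketch with a placeholder result set:

```ruby
# Reduce a result set to the integer in the first column of its last row,
# using a local variable instead of a process-wide global like $value1.
def last_row_first_column(result)
  value = 0
  result.each { |row| value = row[0].to_i }
  value
end
```

Inside the job this would be `value1 = last_row_first_column(result1)` followed by `social_count[0] = { label: "1:", value: value1 }`, with no state shared across jobs.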

ajax and ruby script only returning 1 record instead of many

I am doing an Ajax call, using Ruby and Sinatra. The query should return multiple rows, but it only returns one.
The ajax script is:
$(document).ready(function() {
  $(".showmembers").click(function(e) {
    e.preventDefault();
    alert('script');
    var short_id = $('#shortmembers').val();
    console.log(short_id);
    $.getJSON(
      "/show",
      { 'id' : short_id },
      function(res, status) {
        console.log(res);
        $('#result').html('');
        $('#result').append('<input type=checkbox value=' + res["email"] + '>');
        $('#result').append( res["first"] );
        $('#result').append( res["last"] );
        $('#result').append( res["email"] );
      });
  });
});
and the Ruby script is:
get '/show' do
  id = params['id']
  DB["select shortname, first, last, email from shortlists sh JOIN shortmembers sm ON sm.short_id = sh.list_id JOIN candidates ca ON ca.id = sm.candidate_id where sh.list_id = ?", id].each do |row|
    @shortname = row[:shortname]
    @first = row[:first]
    @last = row[:last]
    @email = row[:email]
    puts @shortname
    puts @first
    puts @last
    puts @email
    halt 200, { shortname: @shortname, first: @first, last: @last, email: @email }.to_json
  end
end
If I run the query directly against Postgres in the terminal, I get 9 rows back, but on my website, as above, it returns only the first row.
What's the problem? There is no error in the console, just one record.
You have halt 200 inside your loop. This causes Sinatra to terminate request processing on the first iteration and return that single result back up the stack.
To return the full set of results, you will need to do something like the following:
get '/show' do
  id = params['id']
  results = DB["select shortname, first, last, email from shortlists sh
                JOIN shortmembers sm ON sm.short_id = sh.list_id
                JOIN candidates ca ON ca.id = sm.candidate_id
                where sh.list_id = ?", id].map do |row|
    {
      :short_name => row[:shortname],
      :first => row[:first],
      :last => row[:last],
      :email => row[:email]
    }
  end
  halt 200, results.to_json
end
This will return the selected fields from each row as an array of hashes.
In fact, as I look at the above, the solution might even be as simple as:
get '/show' do
  id = params['id']
  results = DB["select shortname, first, last, email from shortlists sh
                JOIN shortmembers sm ON sm.short_id = sh.list_id
                JOIN candidates ca ON ca.id = sm.candidate_id
                where sh.list_id = ?", id]
  halt 200, results.to_json
end
since you don't seem to be selecting anything but the columns you want in the first place.
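One caveat with that simpler version, assuming DB is a Sequel connection: DB[...] returns a lazy dataset, and calling .to_json on it may serialize the dataset object itself unless Sequel's JSON serialization plugin is loaded. Materializing the rows first with .all gives a plain array of symbol-keyed hashes, which serializes predictably; a stand-in sketch for what .all returns:

```ruby
require 'json'

# Stand-in for DB[...].all — Sequel materializes an array of hashes keyed by
# column name, which to_json turns into a JSON array of objects.
results = [
  { shortname: 'list1', first: 'Ada', last: 'Lovelace', email: 'ada@example.com' }
]
json = results.to_json
```

So `halt 200, results.all.to_json` would be the safer form of the shortcut if the bare dataset produces something odd.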
