What language/character set is this? - utf-8

I've been given this data that's actually been translated from Hebrew characters, and it's come out like this. I'm guessing it's something to do with character sets, but could someone tell me what this actually is, please? Or translate it into something interesting.
Thanks.
% D7% 92% D7% A8% D7% A1% D7% 94% 20% D7% A1% D7% 99% D7% A0% D7% 99% D7% AA% 20% D7% 9B% D7% AA% D7 % 95% D7% A6% D7% 90% D7% 94% 20% D7% 99% D7% A4% D7% A0% D7% 99% D7% AA% 20 [bwbwlnw% 20 []],% 20 &?% 20quot?% 20% D7% 9B% D7% 93% D7% 99% 20% D7% 9C% D7% 94% D7% A9% D7% AA% D7% 9E% D7% A9,% 20 [[% D7% 9B% D7% 9E% D7% 95% 20% D7% 96% D7% 94% 20krksss% 20% D7% 96% D7% 94]],% 20% D7% A2% D7% 9D% 20% D7% AA % D7% 99% D7% 90% D7% 95% D7% A8% 20% D7% 94% D7% 90% D7% 97% D7% 95% D7% A8% D7% 99% 20% D7% 91% D7 % A9% D7% 91% D7% 95% D7% A2% 20% D7% 91% D7% 90% D7% 96% D7% 95% D7% A8% 20% D7% 94% D7% AA% D7% A9 % D7% 9C% D7% 95% D7% 9D% 20% D7% 97% D7% A8% D7% 93% D7% 94% 20% D7% 9B% D7% 90% D7% A9% D7% A8% 20% D7% 94% D7% 9C% D7% A7% D7% 95% D7% 97% 20% D7% A9% D7% 9C% 20% D7% 94% D7% 99% D7% 95% D7% 9D% 20abbwrtthynyty% 20% D7% 94% D7% 92% D7% A0% D7% 90% D7% 9C% D7% 95% D7% 92% D7% 99% D7% 94% 20% D7% 9E% D7% A7% D7% 91% D7% 9C% 20% D7% A8% D7% 98% D7% 95% D7% 91% 20% D7% 90% D7% 99% D7% A4% D7% A9% D7% 94% D7% 95,% 20 []]% 20% D7% 94% D7% 9E% D7% 9C% D7% 90% D7% 9A% 20% D7% A9% D7% 9C% 20% D7% 94% D7% 9E% D7% 91% D7% 95% D7% 90% 20% D7% A0% D7% 98% D7% 95% 20% D7% A9% D7% 9C% 20% D7% A9% D7% 9E% D7% 97% D7% 94% 20% D7% 94% D7% 94% D7% 92% D7% 94% 20 [[[[]% 20fsbwlwkhnt% 20])% 20 []% 20aaldaaxl% 20% D7% 9B% D7% AA % D7% 95% D7% A6% D7% 90% D7% 94% 20awkwmblktwftt,% 20% D7% 94% D7% A2% D7% A0% D7% 99% D7% 99% D7% 9F% 20% D7% 94% D7% 9B% D7% 99% 20% D7% 9E% D7% AA% D7% 90% D7% 99% D7% 9D% 20% D7% 9E% D7% A9% D7% 95% D7% 95% D7% 94% 20% D7% A2% D7% 9D% 20% D7% 90% D7% AA% D7% 94% 20% D7% 9E% D7% 90% D7% 9E% D7% A5% D7% 20% A7% D7% 98% D7% 9F% 20% D7% 9E% D7% 90% D7% 95% D7% 93,% 20 [[[&?% 20 #% 2039 ]]],% 20% D7% 92 % D7% 95% D7% A8% D7% 9E% D7% AA% 20% D7% 9C% D7% AA% D7% A4% D7% A7% D7% 95% D7% 93% 20 []% 20% D7 % 9C% D7% A9% D7% 95% D7% A7% 20wsjjlt% 20% D7% 90% D7% 95% D7% 91% D7% 99% D7% 99% D7% A7% D7% 98,% 20% D7% 95% D7% 96% D7% 94% 20% D7% 9C% D7% 90% 20% D7% 94% D7% A9% D7% 9C% D7% 99% D7% 9E% D7% 94% 20% D7% 94% D7% 99% D7% 90% 20alcangue% 20% D7% 99% D7% 93% D7% 95% D7% A2,% 20% D7% 9C% D7% A4% D7% 99% 20% D7 % 94% D7% A1% D7% 93% D7% A8,% 20% D7% 96% D7% 94% 20% D7% 99% D7% 95% D7% A8% D7% 93.% 20% D7% 94 % D7% 9E% D7% 93% D7% 95% D7% 91% D7% A8% 20% D7% 9C% D7% 9E% D7% 9C% D7% 90% 20% D7% 94% D7% A9% D7 % 99% D7% A8% 20% D7% 94% D7% 9E% D7% A6% D7% 91% 20% D7% A6% D7% 95% D7% A7% 20% D7% 94% D7% 94% D7 % A4% D7% A8% D7% 93% D7% 94% 2010% 20% D7% 9B% D7% 9C% 20% D7% 94% D7% A0% D7% 95% D7% A9% D7% 90% 20 % D7% 92% D7% 9C% D7% 99% D7% 9C,% 20% D7% 91% 20% D7% A4% D7% 92% D7% 9D% 20% D7% A7% D7% 9C% 20% D7% 95% D7% 94% D7% 95% D7% 90% 20% D7% A0% D7% 9E% D7% A6% D7% 90% 20% D7% AA% D7% 97% D7% AA% 20% D7% 94% D7% A8% D7% 90% D7% A9% 20% D7% 91% D7% A0% D7% 99% D7% A1% D7% 99% D7% 95% D7% 9F% 20% D7% 90% D7% A9% D7% A8% 20% D7% A9% D7% 9C% D7% 95% 20% D7% 94% D7% 95% D7% 90% 20% D7% 92% D7% 9C% D7% 99% D7% 93% D7% 94% 20% D7% 9C% D7% 90% 20% D7% A0% D7% 99% D7% AA% D7% 9F% 20% D7% 9C% D7% A0% D7 % 95% 20% D7% 9C% D7% A7% D7% 99% D7% 99% D7% 9D% 20% D7% A9% D7% 9C% D7% A0% D7% 95% 20% D7% 9C% D7 % A0% D7% 95% 20% D7% 9B% D7% 99% 20% D7% 91% D7% 90% D7% 96% D7% 95% D7% A8% 20 [[twwGshG]]% 20% D7% 94% D7% 90% D7% 97% D7% 95% D7% A8% D7% 99% 20% D7% 95% D7% 90% D7% AA% D7% 20% 96% D7% 94% 20% D7% 9C% D7% 97% D7% 9C% D7% A5.% 20 &?% 20quot?

It's definitely Hebrew, I can make out some of it, but the encoding is garbled. Removing all the extra spaces and some other characters, I got:
%D7%92%D7%A8%D7%A1%D7%94%20%D7%A1%D7%99%D7%A0%D7%99%D7%AA%20%D7%9B%D7%AA%D7%95%D7%A6%D7%90%D7%94%20%D7%99%D7%A4%D7%A0%D7%99%D7%AA%20%20%20%D7%9B%D7%93%D7%99%20%D7%9C%D7%94%D7%A9%D7%AA%D7%9E%D7%A9%20%D7%9B%D7%9E%D7%95%20%D7%96%D7%94%20%20%D7%96%D7%94%20%D7%A2%D7%9D%20%D7%AA%D7%99%D7%90%D7%95%D7%A8%20%D7%94%D7%90%D7%97%D7%95%D7%A8%D7%99%20%D7%91%D7%A9%D7%91%D7%95%D7%A2%20%D7%91%D7%90%D7%96%D7%95%D7%A8%20%D7%94%D7%AA%D7%A9%D7%9C%D7%95%D7%9D%20%D7%97%D7%A8%D7%93%D7%94%20%D7%9B%D7%90%D7%A9%D7%A8%20%D7%94%D7%9C%D7%A7%D7%95%D7%97%20%D7%A9%D7%9C%20%D7%94%D7%99%D7%95%D7%9D%20%20%D7%94%D7%92%D7%A0%D7%90%D7%9C%D7%95%D7%92%D7%99%D7%94%20%D7%9E%D7%A7%D7%91%D7%9C%20%D7%A8%D7%98%D7%95%D7%91%20%D7%90%D7%99%D7%A4%D7%A9%D7%94%D7%95%20%20%D7%94%D7%9E%D7%9C%D7%90%D7%9A%20%D7%A9%D7%9C%20%D7%94%D7%9E%D7%91%D7%95%D7%90%20%D7%A0%D7%98%D7%95%20%D7%A9%D7%9C%20%D7%A9%D7%9E%D7%97%D7%94%20%D7%94%D7%94%D7%92%D7%94%20%20%20%20%20%D7%9B%D7%AA%D7%95%D7%A6%D7%90%D7%94%20%20%D7%94%D7%A2%D7%A0%D7%99%D7%99%D7%9F%20%D7%94%D7%9B%D7%99%20%D7%9E%D7%AA%D7%90%D7%99%D7%9D%20%D7%9E%D7%A9%D7%95%D7%95%D7%94%20%D7%A2%D7%9D%20%D7%90%D7%AA%D7%94%20%D7%9E%D7%90%D7%9E%D7%A5%D7%20%A7%D7%98%D7%9F%20%D7%9E%D7%90%D7%95%D7%93%20%20%20%D7%92%D7%95%D7%A8%D7%9E%D7%AA%20%D7%9C%D7%AA%D7%A4%D7%A7%D7%95%D7%93%20%20%D7%9C%D7%A9%D7%95%D7%A7%20%20%D7%90%D7%95%D7%91%D7%99%D7%99%D7%A7%D7%98%20%D7%95%D7%96%D7%94%20%D7%9C%D7%90%20%D7%94%D7%A9%D7%9C%D7%99%D7%9E%D7%94%20%D7%94%D7%99%D7%90%20%20%D7%99%D7%93%D7%95%D7%A2%20%D7%9C%D7%A4%D7%99%20%D7%94%D7%A1%D7%93%D7%A8%20%D7%96%D7%94%20%D7%99%D7%95%D7%A8%D7%93%20%D7%94%D7%9E%D7%93%D7%95%D7%91%D7%A8%20%D7%9C%D7%9E%D7%9C%D7%90%20%D7%94%D7%A9%D7%99%D7%A8%20%D7%94%D7%9E%D7%A6%D7%91%20%D7%A6%D7%95%D7%A7%20%D7%94%D7%94%D7%A4%D7%A8%D7%93%D7%94%2010%20%D7%9B%D7%9C%20%D7%94%D7%A0%D7%95%D7%A9%D7%90%20%D7%92%D7%9C%D7%99%D7%9C%20%D7%91%20%D7%A4%D7%92%D7%9D%20%D7%A7%D7%9C%20%D7%95%D7%94%D7%95%D7%90%20%D7%A0%D7%9E%D7%A6%D7%90%20%D7%AA%D7%97%D7%AA%20%D7%94%D7%A8%D7%90%D7%A9%20%D7%91%D7%A0%D7%99%D7%A1%D7%99%D7%95%D7%9F%20%D7%90%D7%A9%D7%A8%20%D7%A9%D7%9C%D7%95%20%D7%94%D7%95%D7%90%20%D7%92%D7%9C%D7%99%D7%93%D7%94%20%D7%9C%D7%90%20%D7%A0%D7%99%D7%AA%D7%9F%20%D7%9C%D7%A0%D7%95%20%D7%9C%D7%A7%D7%99%D7%99%D7%9D%20%D7%A9%D7%9C%D7%A0%D7%95%20%D7%9C%D7%A0%D7%95%20%D7%9B%D7%99%20%D7%91%D7%90%D7%96%D7%95%D7%A8%20%20%D7%94%D7%90%D7%97%D7%95%D7%A8%D7%99%20%D7%95%D7%90%D7%AA%D7%20%96%D7%94%20%D7%9C%D7%97%D7%9C%D7%A5%20%20
The start is:
גרסה סינית כתוצאה יפנית כדי להשתמש כמו זה זה עם תיאור האחורי בשבוע באזור התשלום חרדה כאשר הלקוח של היום הגנאלוגיה מקבל רטוב איפשהו המלאך של המבוא נטו של שמחה ההגה כתוצאה
Which means (my translation):
Chinese version as a Japanese result so as to use like this This with backward description in the area of the payment worried that the customer of the day the genealogy gets wet somewhere the angel of the introduction of happiness of pronunciation

I believe this table should help.
Knowing that it is Hebrew and that the 05 is added everywhere, it should be these characters...

Google translate says As Chinese Remove D7 Japanese [bwbwlnw []], &? quot? To use, [[like this krksss it]], with a description of the payment in the week back in fear when the customer of today abbwrtthynyty somewhere genealogy gets wet, []] the angel of the net introduction of joy wheel [[[[] fsbwlwkhnt]) [] aaldaaxl As awkwmblktwftt, it compares with the best fit you very Satan effort, [[[&? # 39]]], causes the function [] market wsjjlt object, which is not completed is alcangue known, in order, it goes down. The song in question to fill the separation cliff situation all the subject cylinder 10, in slight defect and is under his head in an attempt which is not possible to have ice cream to us that in our [[twwGshG]] back and the extract. &? quot?

Try the PHP function urldecode (urldecode — Decodes URL-encoded string):
$data = urldecode($your_string);
http://php.net/manual/en/function.urldecode.php
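For reference, the same decoding can be reproduced outside PHP. This is just a minimal sketch with Python's standard library (urllib.parse.unquote), assuming the cleaned-up string from above with the extra spaces already removed:
from urllib.parse import unquote

# Percent-escapes are interpreted as UTF-8 bytes; each %D7%xx pair is one Hebrew letter.
encoded = "%D7%92%D7%A8%D7%A1%D7%94%20%D7%A1%D7%99%D7%A0%D7%99%D7%AA"  # first two words only
print(unquote(encoded))  # -> "גרסה סינית" ("Chinese version")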

Related

ansible print the stdout_lines in csv file in the exact format that playbook prints on the console

I have the following Ansible playbook, which prints some metrics of one remote server. I want to write the output to a CSV file with the exact msg format shown in the output. How do I write this to a CSV file?
Ansible playbook:
tasks:
  - name: Get ip address of the remote node
    ansible.builtin.shell: hostname -i | awk '{print $2}'
    register: ipaddr
  - name: Check uptime
    shell: uptime | cut -d',' -f1
    register: uptime_op
  - debug:
      msg: "{{ uptime_op.stdout_lines }}"
  - name: Get lsbkl value
    shell: lsblk
    register: lsblk_output
  - debug:
      msg: "{{ lsblk_output.stdout_lines }}"
  - name: Get Disc space value
    shell: df -h
    register: df_output
  - debug:
      msg: "{{ df_output.stdout_lines }}"
output:
PLAY [test_host] *************************************************************************************************************
TASK [Gathering Facts] ******************************************************************************************************
Tuesday 20 December 2022 10:07:07 -0800 (0:00:00.017) 0:00:00.017 ******
ok: [hostname.domain.com]
TASK [Get ip address of the remote node] ************************************************************************************
Tuesday 20 December 2022 10:07:14 -0800 (0:00:07.399) 0:00:07.417 ******
changed: [hostname.domain.com]
TASK [Check uptime] *********************************************************************************************************
Tuesday 20 December 2022 10:07:18 -0800 (0:00:03.860) 0:00:11.278 ******
changed: [hostname.domain.com]
TASK [debug] ****************************************************************************************************************
Tuesday 20 December 2022 10:07:22 -0800 (0:00:03.781) 0:00:15.059 ******
ok: [hostname.domain.com] => {
"msg": [
" 23:37pm up 359 days 5:53"
]
}
TASK [Get lsbkl value] ******************************************************************************************************
Tuesday 20 December 2022 10:07:22 -0800 (0:00:00.086) 0:00:15.145 ******
changed: [hostname.domain.com]
TASK [debug] ****************************************************************************************************************
Tuesday 20 December 2022 10:07:26 -0800 (0:00:03.815) 0:00:18.960 ******
ok: [hostname.domain.com] => {
"msg": [
"NAME MAJ:MIN RM SIZE RO TYPE MOUNTPOINT",
"sda 8:0 0 1.1T 0 disk ",
"├─sda1 8:1 0 15G 0 part /",
"├─sda2 8:2 0 518M 0 part /boot/efi",
"├─sda3 8:3 0 1K 0 part ",
"├─sda5 8:5 0 2G 0 part /ctools",
"├─sda6 8:6 0 10G 0 part /var",
"├─sda7 8:7 0 48G 0 part [SWAP]",
"├─sda8 8:8 0 250M 0 part /dsm",
"├─sda9 8:9 0 501M 0 part /var/cfengine",
"├─sda10 8:10 0 10G 0 part /tmp",
"└─sda11 8:11 0 1T 0 part /infrastructure",
"sdb 8:16 0 1.8T 0 disk ",
"├─sdb1 8:17 0 484.3G 0 part /p4depot",
"├─sdb2 8:18 0 931.3G 0 part /p4meta",
"└─sdb3 8:19 0 372.9G 0 part /p4log"
]
}
TASK [Get Disc space value] *************************************************************************************************
Tuesday 20 December 2022 10:07:26 -0800 (0:00:00.088) 0:00:19.049 ******
changed: [hostname.domain.com]
TASK [debug] ****************************************************************************************************************
Tuesday 20 December 2022 10:07:30 -0800 (0:00:03.787) 0:00:22.836 ******
ok: [hostname.domain.com] => {
"msg": [
"Filesystem Size Used Avail Use% Mounted on",
"devtmpfs 189G 8.0K 189G 1% /dev",
"tmpfs 189G 0 189G 0% /dev/shm",
"tmpfs 189G 4.0G 185G 3% /run",
"tmpfs 189G 0 189G 0% /sys/fs/cgroup",
"/dev/sda1 15G 11G 4.8G 69% /",
"/dev/sda2 518M 0 518M 0% /boot/efi",
"/dev/sda10 10G 83M 10G 1% /tmp",
"/dev/sda11 1.1T 34M 1.1T 1% /infrastructure",
"/dev/sda8 247M 62M 185M 25% /dsm",
"/dev/sda6 10G 1.5G 8.6G 15% /var",
"/dev/sda9 498M 119M 379M 24% /var/cfengine",
"/dev/sdb2 931G 30G 902G 4% /p4meta",
"/dev/sdb3 373G 61M 373G 1% /p4log",
"/dev/sdb1 485G 112G 373G 23% /p4depot",
"/dev/sda5 2.1G 3.6M 1.8G 1% /ctools",
"tmpfs 1.0G 0 1.0G 0% /dsm/tmp/dsmbg.tmpfs",
"10.223.232.121:/new_itools 951G 497G 454G 53% /nfs/site/itools",
"incfs03n03b-04:/common_usr_local 11G 1.2G 8.9G 12% /nfs/iind/local",
"incfs04n08b-1:/prod 513M 1.5M 512M 1% /nfs/iind/proj/prod",
"incfs06n11b-1:/home0 351G 149G 202G 43% /nfs/iind/disks/home23",
"incfs02n10a-1:/iind_disks_home24 501G 59G 442G 12% /nfs/iind/disks/home24",
"incfs06n04a-05:/iind_gen_adm 301G 176G 125G 59% /nfs/site/gen/adm",
"incfs03n06b-1:/ba_ctg_home01 301G 263G 38G 88% /nfs/iind/disks/home110",
"inc08n07b-1:/home_tree 11G 79M 10G 1% /nfs/iind/home",
"incfs06n10a-1:/iind_gen_adm_netmeter_m 81G 28G 53G 35% /nfs/iind/disks/iind_gen_adm_netmeter",
"tmpfs 38G 0 38G 0% /run/user/37124",
"incfs07n05b-1:/common 201G 158G 43G 79% /nfs/site/disks/iind_gen_adm_common",
"tmpfs 38G 0 38G 0% /run/user/12142325"
]
}
PLAY RECAP ******************************************************************************************************************
hostname.domain.com : ok=8 changed=4 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
Attached is the expected CSV file showing how it should look.
Given the registered data df_output.stdout_lines, there must also be a df_output.stdout attribute. Use the filter community.general.jc to parse the registered data:
- set_fact:
    df: "{{ df_output.stdout|community.general.jc('df') }}"
gives
df:
  - available: 189
    filesystem: devtmpfs
    mounted_on: /dev
    size: 189G
    use_percent: 1
    used: 8
  - available: 189
    filesystem: tmpfs
    mounted_on: /dev/shm
    size: 189G
    use_percent: 0
    used: 0
  ...
Then, for each host create a CSV file on the controller. For example,
- copy:
    dest: "/tmp/ansible_df_{{ item }}.csv"
    content: |
      {{ df_output.stdout_lines.0.split()[:-1]|join(',') }}
      {% for m in hostvars[item]['df'] %}
      {{ m.filesystem }},{{ m.size }},{{ m.used }},{{ m.available }},{{ m.use_percent }},{{ m.mounted_on }}
      {% endfor %}
  loop: "{{ ansible_play_hosts }}"
  run_once: true
  delegate_to: localhost
will create
shell> cat /tmp/ansible_df_localhost.csv
Filesystem,Size,Used,Avail,Use%,Mounted
devtmpfs,189G,8,189,1,/dev
tmpfs,189G,0,189,0,/dev/shm
tmpfs,189G,4,185,3,/run
tmpfs,189G,0,189,0,/sys/fs/cgroup
/dev/sda1,15G,11,4,69,/
/dev/sda2,518M,0,518,0,/boot/efi
/dev/sda10,10G,83,10,1,/tmp
/dev/sda11,1.1T,34,1,1,/infrastructure
/dev/sda8,247M,62,185,25,/dsm
/dev/sda6,10G,1,8,15,/var
/dev/sda9,498M,119,379,24,/var/cfengine
/dev/sdb2,931G,30,902,4,/p4meta
/dev/sdb3,373G,61,373,1,/p4log
/dev/sdb1,485G,112,373,23,/p4depot
/dev/sda5,2.1G,3,1,1,/ctools
tmpfs,1.0G,0,1,0,/dsm/tmp/dsmbg.tmpfs
10.223.232.121:/new_itools,951G,497,454,53,/nfs/site/itools
incfs03n03b-04:/common_usr_local,11G,1,8,12,/nfs/iind/local
incfs04n08b-1:/prod,513M,1,512,1,/nfs/iind/proj/prod
incfs06n11b-1:/home0,351G,149,202,43,/nfs/iind/disks/home23
incfs02n10a-1:/iind_disks_home24,501G,59,442,12,/nfs/iind/disks/home24
incfs06n04a-05:/iind_gen_adm,301G,176,125,59,/nfs/site/gen/adm
incfs03n06b-1:/ba_ctg_home01,301G,263,38,88,/nfs/iind/disks/home110
inc08n07b-1:/home_tree,11G,79,10,1,/nfs/iind/home
incfs06n10a-1:/iind_gen_adm_netmeter_m,81G,28,53,35,/nfs/iind/disks/iind_gen_adm_netmeter
tmpfs,38G,0,38,0,/run/user/37124
incfs07n05b-1:/common,201G,158,43,79,/nfs/site/disks/iind_gen_adm_common
tmpfs,38G,0,38,0,/run/user/12142325
Given the data for testing
shell> cat data.json
{
"df_stdout_lines": [
"Filesystem Size Used Avail Use% Mounted on",
"devtmpfs 189G 8.0K 189G 1% /dev",
"tmpfs 189G 0 189G 0% /dev/shm",
"tmpfs 189G 4.0G 185G 3% /run",
"tmpfs 189G 0 189G 0% /sys/fs/cgroup",
"/dev/sda1 15G 11G 4.8G 69% /",
"/dev/sda2 518M 0 518M 0% /boot/efi",
"/dev/sda10 10G 83M 10G 1% /tmp",
"/dev/sda11 1.1T 34M 1.1T 1% /infrastructure",
"/dev/sda8 247M 62M 185M 25% /dsm",
"/dev/sda6 10G 1.5G 8.6G 15% /var",
"/dev/sda9 498M 119M 379M 24% /var/cfengine",
"/dev/sdb2 931G 30G 902G 4% /p4meta",
"/dev/sdb3 373G 61M 373G 1% /p4log",
"/dev/sdb1 485G 112G 373G 23% /p4depot",
"/dev/sda5 2.1G 3.6M 1.8G 1% /ctools",
"tmpfs 1.0G 0 1.0G 0% /dsm/tmp/dsmbg.tmpfs",
"10.223.232.121:/new_itools 951G 497G 454G 53% /nfs/site/itools",
"incfs03n03b-04:/common_usr_local 11G 1.2G 8.9G 12% /nfs/iind/local",
"incfs04n08b-1:/prod 513M 1.5M 512M 1% /nfs/iind/proj/prod",
"incfs06n11b-1:/home0 351G 149G 202G 43% /nfs/iind/disks/home23",
"incfs02n10a-1:/iind_disks_home24 501G 59G 442G 12% /nfs/iind/disks/home24",
"incfs06n04a-05:/iind_gen_adm 301G 176G 125G 59% /nfs/site/gen/adm",
"incfs03n06b-1:/ba_ctg_home01 301G 263G 38G 88% /nfs/iind/disks/home110",
"inc08n07b-1:/home_tree 11G 79M 10G 1% /nfs/iind/home",
"incfs06n10a-1:/iind_gen_adm_netmeter_m 81G 28G 53G 35% /nfs/iind/disks/iind_gen_adm_netmeter",
"tmpfs 38G 0 38G 0% /run/user/37124",
"incfs07n05b-1:/common 201G 158G 43G 79% /nfs/site/disks/iind_gen_adm_common",
"tmpfs 38G 0 38G 0% /run/user/12142325"
]
}
Example of a complete playbook for testing
- hosts: localhost
  vars_files:
    - data.json
  vars:
    df_output:
      stdout: |
        Filesystem Size Used Avail Use% Mounted on
        devtmpfs 189G 8.0K 189G 1% /dev
        tmpfs 189G 0 189G 0% /dev/shm
        tmpfs 189G 4.0G 185G 3% /run
        tmpfs 189G 0 189G 0% /sys/fs/cgroup
        /dev/sda1 15G 11G 4.8G 69% /
        /dev/sda2 518M 0 518M 0% /boot/efi
        /dev/sda10 10G 83M 10G 1% /tmp
        /dev/sda11 1.1T 34M 1.1T 1% /infrastructure
        /dev/sda8 247M 62M 185M 25% /dsm
        /dev/sda6 10G 1.5G 8.6G 15% /var
        /dev/sda9 498M 119M 379M 24% /var/cfengine
        /dev/sdb2 931G 30G 902G 4% /p4meta
        /dev/sdb3 373G 61M 373G 1% /p4log
        /dev/sdb1 485G 112G 373G 23% /p4depot
        /dev/sda5 2.1G 3.6M 1.8G 1% /ctools
        tmpfs 1.0G 0 1.0G 0% /dsm/tmp/dsmbg.tmpfs
        10.223.232.121:/new_itools 951G 497G 454G 53% /nfs/site/itools
        incfs03n03b-04:/common_usr_local 11G 1.2G 8.9G 12% /nfs/iind/local
        incfs04n08b-1:/prod 513M 1.5M 512M 1% /nfs/iind/proj/prod
        incfs06n11b-1:/home0 351G 149G 202G 43% /nfs/iind/disks/home23
        incfs02n10a-1:/iind_disks_home24 501G 59G 442G 12% /nfs/iind/disks/home24
        incfs06n04a-05:/iind_gen_adm 301G 176G 125G 59% /nfs/site/gen/adm
        incfs03n06b-1:/ba_ctg_home01 301G 263G 38G 88% /nfs/iind/disks/home110
        inc08n07b-1:/home_tree 11G 79M 10G 1% /nfs/iind/home
        incfs06n10a-1:/iind_gen_adm_netmeter_m 81G 28G 53G 35% /nfs/iind/disks/iind_gen_adm_netmeter
        tmpfs 38G 0 38G 0% /run/user/37124
        incfs07n05b-1:/common 201G 158G 43G 79% /nfs/site/disks/iind_gen_adm_common
        tmpfs 38G 0 38G 0% /run/user/12142325
      stdout_lines: "{{ df_stdout_lines }}"
  tasks:
    - debug:
        var: df_output.stdout_lines
    - debug:
        var: df_output.stdout
    - set_fact:
        df: "{{ df_output.stdout|community.general.jc('df') }}"
    - debug:
        var: df
    - copy:
        dest: "/tmp/ansible_df_{{ item }}.csv"
        content: |
          {{ df_output.stdout_lines.0.split()[:-1]|join(',') }}
          {% for m in hostvars[item]['df'] %}
          {{ m.filesystem }},{{ m.size }},{{ m.used }},{{ m.available }},{{ m.use_percent }},{{ m.mounted_on }}
          {% endfor %}
      loop: "{{ ansible_play_hosts }}"
      run_once: true
      delegate_to: localhost
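For completeness, the same df-to-CSV transformation can be sketched outside Ansible in a few lines of Python. This assumes df output in the format shown above (a header line followed by six-column rows); df_to_csv is a hypothetical helper for illustration, not part of the answer:
import csv
import sys

def df_to_csv(df_text, out_path):
    lines = df_text.strip().splitlines()
    header = lines[0].split()[:-1]  # drop the trailing "on" of "Mounted on"
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(header)
        for line in lines[1:]:
            fields = line.split()
            # filesystem, size, used, avail, use%, mount point
            writer.writerow(fields[:5] + [" ".join(fields[5:])])

if __name__ == "__main__":
    # e.g. df -h | python df_to_csv.py
    df_to_csv(sys.stdin.read(), "/tmp/ansible_df_localhost.csv")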

Github Actions failing to install Libv8 gem on ubuntu-18.04.5

We are using GitHub Actions for our CI setup. All of a sudden our gem installation step stopped working when it tries to install the mini_racer gem, which depends on the libv8 gem. When the action tries to install and build the native extension for this gem, it fails.
Here are the configurations:
jobs:
  spec:
    runs-on: ubuntu-latest
    services:
      postgresql:
        image: circleci/postgres:11.5-alpine-ram
        ports: ["5432:5432"]
        env:
          POSTGRES_USER:
          POSTGRES_DB:
          POSTGRES_PASSWORD:
    steps:
      - uses: actions/checkout@v1
      - name: Bundler/Gems Cache
        uses: actions/cache@v2
        with:
          path: vendor/bundle
          key: ${{ runner.os }}-bundle-v1-${{ hashFiles('Gemfile.lock') }}
          restore-keys: |
            ${{ runner.os }}-bundle-v1-
      - name: Read nvm version
        id: nvmrc
        run: echo ::set-output "name=NODEVERSION::$(cat .nvmrc)"
      - uses: ruby/setup-ruby@v1
      - uses: actions/setup-node@v1
        with:
          node-version: "${{ steps.nvmrc.outputs.NODEVERSION }}"
      - name: Install Gems
        run: |
          gem install --no-document bundler
          bundle config path vendor/bundle
          bundle config set without 'development'
          bundle install --deployment --jobs 2
Here are the logs:
Gem::Ext::BuildError: ERROR: Failed to build gem native extension.
current directory: /home/runner/work/vendor/bundle/ruby/2.6.0/gems/libv8-8.4.255.0/ext/libv8
/opt/hostedtoolcache/Ruby/2.6.6/x64/bin/ruby -I /opt/hostedtoolcache/Ruby/2.6.6/x64/lib/ruby/2.6.0 -r ./siteconf20201214-2974-1gm42lz.rb extconf.rb
creating Makefile
WARNING: Your metrics.cfg file was invalid or nonexistent. A new one will be created.
________ running 'git -c core.deltaBaseCacheLimit=2g clone --no-checkout --progress https://chromium.googlesource.com/v8/v8.git /home/runner/work/vendor/bundle/ruby/2.6.0/gems/libv8-8.4.255.0/vendor/_gclient_v8_msyeloop' in '/home/runner/work/vendor/bundle/ruby/2.6.0/gems/libv8-8.4.255.0/vendor'
Cloning into '/home/runner/work/vendor/bundle/ruby/2.6.0/gems/libv8-8.4.255.0/vendor/_gclient_v8_msyeloop'...
remote: Sending approximately 777.22 MiB ...
remote: Counting objects: 7699, done
remote: Finding sources: 100% (38/38)
remote: Total 806610 (delta 650490), reused 806597 (delta 650490)
Receiving objects: 100% (806610/806610), 777.03 MiB | 24.82 MiB/s, done.
Resolving deltas: 100% (650490/650490), done.
[0:02:45] Still working on:
[0:02:45] v8
________ running 'vpython third_party/depot_tools/update_depot_tools_toggle.py --disable' in '/home/runner/work/vendor/bundle/ruby/2.6.0/gems/libv8-8.4.255.0/vendor'
/home/runner/work/vendor/bundle/ruby/2.6.0/gems/libv8-8.4.255.0/vendor/depot_tools/.cipd_bin/.cipd/pkgs/0/fI6WggdkRyN1r91MnTeApc2_gyTtXfYpueHZVLcaF8gC/vpython: could not resolve options: failed to resolve Python script: stat /home/runner/work/vendor/bundle/ruby/2.6.0/gems/libv8-8.4.255.0/vendor/third_party/depot_tools/update_depot_tools_toggle.py: no such file or directory
Error: Command 'vpython third_party/depot_tools/update_depot_tools_toggle.py --disable' returned non-zero exit status 1 in /home/runner/work/vendor/bundle/ruby/2.6.0/gems/libv8-8.4.255.0/vendor
Running: gclient root
Running: gclient config --spec 'solutions = [
{
"name": "v8",
"url": "https://chromium.googlesource.com/v8/v8.git",
"deps_file": "DEPS",
"managed": False,
"custom_deps": {},
},
]
'
Running: gclient sync --with_branch_heads
Subprocess failed with return code 2.
/home/runner/work/vendor/bundle/ruby/2.6.0/gems/libv8-8.4.255.0/ext/libv8/builder.rb:83:in `block in setup_build_deps!': unable to fetch v8 source (RuntimeError)
from /home/runner/work/vendor/bundle/ruby/2.6.0/gems/libv8-8.4.255.0/ext/libv8/builder.rb:81:in `chdir'
from /home/runner/work/vendor/bundle/ruby/2.6.0/gems/libv8-8.4.255.0/ext/libv8/builder.rb:81:in `setup_build_deps!'
from /home/runner/work/vendor/bundle/ruby/2.6.0/gems/libv8-8.4.255.0/ext/libv8/builder.rb:40:in `build_libv8!'
from /home/runner/work/vendor/bundle/ruby/2.6.0/gems/libv8-8.4.255.0/ext/libv8/location.rb:24:in `install!'
from extconf.rb:7:in `<main>'
extconf failed, exit code 1
Gem files will remain installed in /home/runner/work/vendor/bundle/ruby/2.6.0/gems/libv8-8.4.255.0 for inspection.
Results logged to /home/runner/work/vendor/bundle/ruby/2.6.0/extensions/x86_64-linux/2.6.0/libv8-8.4.255.0/gem_make.out
An error occurred while installing libv8 (8.4.255.0), and Bundler cannot continue.
Make sure that `gem install libv8 -v '8.4.255.0' --source 'https://rubygems.org/'` succeeds before bundling.
In Gemfile:
  mini_racer was resolved to 0.3.1, which depends on
    libv8
Error: Process completed with exit code 5.
This is caused by https://github.com/rubyjs/libv8/issues/310, triggered by the upgrade from Bundler v2.1.x to v2.2.x. Sorry, I don't have a simple fix for you. Maybe downgrade to Bundler 2.1.x for now (for example, gem install --no-document bundler -v '~> 2.1' in the Install Gems step)?

Rounding integers to nearest 5

Using the example below, anything below 2.5% will round down to 0%. How do I get it to round down to 1% and not 0%?
s='1.5% 2.5% 3.5% 25% 27% 34% 68%'
awk '{for (i=1; i<=NF; i++) $i = int( ($i+2) / 5) * 5 "%"} 1' <<< "$s"
0% 0% 5% 25% 25% 35% 70%
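For illustration, here is the same offset-then-truncate rounding in a small Python sketch, with one possible adjustment. It assumes the intent is that non-zero values which would otherwise round to 0% should become 1% instead; this is a sketch of that reading, not a definitive answer:
def round_to_5(value, floor=1):
    rounded = int((value + 2) / 5) * 5  # same offset-then-truncate rounding as the awk example
    if value > 0 and rounded == 0:
        return floor                    # assumed intent: small non-zero values become 1%
    return rounded

values = [1.5, 2.5, 3.5, 25, 27, 34, 68]
print(" ".join(f"{round_to_5(v)}%" for v in values))
# -> 1% 1% 5% 25% 25% 35% 70%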

Emacs is slow and lags when open link

My Emacs is sometimes slow, especially when I open the link under the cursor.
I have run the profiler. What should I do next with it? How can I improve performance?
The results are below.
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
- command-execute 113 52%
- call-interactively 113 52%
- evil-ex 30 13%
- read-from-minibuffer 23 10%
+ command-execute 6 2%
+ elscreen-run-screen-update-hook 2 0%
redisplay_internal (C function) 1 0%
- eval-expression 28 12%
- eval 28 12%
- debug 28 12%
- recursive-edit 24 11%
- command-execute 16 7%
- call-interactively 16 7%
+ evil-ex 7 3%
+ byte-code 4 1%
+ evil-mouse-drag-region 3 1%
+ org-open-at-point 1 0%
+ mouse-set-point 1 0%
- evil-mouse-drag-region 14 6%
- evil-mouse-drag-track 14 6%
- eval 14 6%
- track-mouse 14 6%
- byte-code 14 6%
- read-event 9 4%
+ redisplay_internal (C function) 1 0%
- org-agenda 10 4%
- byte-code 10 4%
- org-agenda-get-restriction-and-command 10 4%
- byte-code 10 4%
read-char-exclusive 8 3%
- byte-code 9 4%
- read-file-name 9 4%
+ read-file-name-default 9 4%
+ minibuffer-complete 5 2%
+ org-open-at-point 4 1%
+ org-todo 4 1%
+ org-refile 3 1%
+ evil-previous-line 2 0%
+ profiler-report-write-profile 2 0%
+ profiler-report 1 0%
+ org-ctrl-c-ctrl-c 1 0%
- timer-event-handler 62 28%
- byte-code 62 28%
- apply 62 28%
- tooltip-timeout 62 28%
- run-hook-with-args-until-success 62 28%
- tooltip-help-tips 62 28%
- tooltip-show 62 28%
- byte-code 62 28%
- x-show-tip 59 27%
- face-set-after-frame-default 59 27%
- byte-code 59 27%
- face-spec-recalc 57 26%
- make-face-x-resource-internal 54 24%
- set-face-attributes-from-resources 53 24%
- set-face-attribute-from-resource 50 23%
+ face-name 4 1%
+ face-spec-set-2 2 0%
- ... 26 11%
Automatic GC 25 11%
+ vc-backend 1 0%
+ elscreen-run-screen-update-hook 5 2%
mouse-fixup-help-message 4 1%
+ redisplay_internal (C function) 4 1%
and 2 0%
+ tooltip-show-help 1 0%
Update 1
For some time now I no longer observe this problem.

Find a local minimum in a special graph

The issue at hand looks easy, but I could not find an easy solution so far.
I've got a histogram describing the value distribution of an array of floats, roughly looking like this:
As you can see, there is a local maximum near 0, which falls off to a local minimum, then rises quickly to a plateau, and in the end falls to 0. I would like to detect the local minimum.
In practice, the histogram is not as smooth:
There are lots of spikes, and the local minimum may be stretched and uneven. I'm not sure how to tackle this problem.
There is little domain knowledge. The first max may even be higher than the second max. There may be spikes in any direction, values may be as low as 0.
This is a real life sample taken from 8 distinct runs. It's scaled to 0 - 10 to make it easier to understand.
0: 22% 12% 19% 17% 6% 5% 6% 5%
1: 3% 2% 1% 1% 4% 1% 4% 1%
2: 6% 2% 13% 5% 0% 2% 0% 2%
3: 62% 62% 52% 42% 2% 5% 2% 5%
4: 4% 19% 12% 28% 10% 13% 10% 13%
5: 0% 0% 3% 29% 30% 29% 30%
6: 37% 31% 37% 30%
7: 1% 7% 1% 7%
8: 6% 1% 6% 1%
9:
10:
Values rounded down. Missing values denote no occurrence of any value.
Explanation of the first line:
0: 22% the initial max
1: 3% local min
2: 6% still min
3: 62% plateau max
4: 4% second min
5: 0% 0
6: no more values
7:
8:
9:
10:
For reference, a list of the same data, this time scaled to 0 - 100 (there were no values in the 90-100 range at all). I messed up on the formatting, but it should give a rough idea.
0: 0% 0% 0% 1% 0% 0% 0% 0%
1: 0% 1% 1% 3% 0% 0% 0% 0%
2: 1% 2% 1% 3% 0% 0% 0% 0%
3: 4% 2% 3% 3% 0% 1% 0% 1%
4: 6% 1% 3% 2% 0% 0% 0% 0%
5: 2% 0% 3% 1% 0% 0% 0% 0%
6: 1% 0% 2% 0% 0% 0% 0% 0%
7: 1% 0% 1% 0% 0% 0% 0% 0%
8: 1% 0% 1% 0% 0% 0% 0% 0%
9: 1% 0% 1% 0% 1% 0% 1% 0%
10: 1% 0% 0% 0% 1% 0% 1% 0%
11: 0% 0% 0% 0% 0% 0% 0% 0%
12: 0% 0% 0% 0% 0% 0% 0% 0%
13: 0% 0% 0% 0% 0% 0% 0% 0%
14: 0% 0% 0% 0% 0% 0% 0% 0%
15: 0% 0% 0% 0% 0% 0% 0% 0%
16: 0% 0% 0% 0% 0% 0% 0% 0%
17: 0% 0% 0% 0% 0% 0% 0% 0%
18: 0% 0% 0% 0% 0% 0% 0% 0%
19: 0% 0% 0% 0% 0% 0% 0% 0%
20: 0% 0% 0% 0% 0% 0% 0% 0%
21: 0% 0% 0% 0% 0% 0% 0% 0%
22: 0% 0% 0% 0% 0% 0% 0% 0%
23: 0% 0% 0% 0% 0% 0% 0% 0%
24: 0% 0% 1% 0% 0% 0% 0% 0%
25: 0% 0% 1% 0% 0% 0% 0% 0%
26: 0% 0% 1% 0% 0% 0% 0% 0%
27: 0% 0% 1% 0% 0% 0% 0% 0%
28: 1% 0% 2% 1% 0% 0% 0% 0%
29: 3% 0% 2% 2% 0% 0% 0% 0%
30: 7% 1% 3% 2% 0% 0% 0% 0%
31: 10% 2% 4% 3% 0% 0% 0% 0%
32: 10% 3% 4% 4% 0% 0% 0% 0%
33: 6% 6% 5% 5% 0% 0% 0% 0%
34: 5% 5% 4% 4% 0% 0% 0% 0%
35: 5% 8% 6% 3% 0% 0% 0% 0%
36: 5% 10% 6% 4% 0% 0% 0% 0%
37: 5% 9% 5% 3% 0% 0% 0% 0%
38: 3% 8% 5% 5% 0% 0% 0% 0%
39: 2% 5% 5% 5% 0% 0% 0% 0%
40: 1% 4% 4% 5% 0% 1% 0% 1%
41: 1% 3% 2% 5% 0% 1% 0% 1%
42: 0% 1% 1% 4% 0% 0% 0% 0%
43: 0% 2% 0% 4% 1% 1% 1% 1%
44: 0% 1% 0% 3% 1% 1% 1% 1%
45: 0% 1% 0% 1% 0% 1% 0% 1%
46: 0% 1% 0% 1% 1% 1% 1% 1%
47: 0% 1% 0% 0% 1% 1% 1% 1%
48: 0% 1% 0% 0% 1% 1% 1% 1%
50: 0% 0% 0% 1% 1% 1% 1% 1%
50: 0% 1% 1% 1% 1% 1%
51: 0% 0% 2% 1% 2% 1%
52: 0% 1% 2% 1% 2% 1%
53: 0% 0% 4% 2% 4% 2%
54: 0% 2% 2% 2% 2%
55: 0% 2% 2% 2% 2%
56: 0% 2% 3% 2% 3%
57: 0% 2% 4% 2% 4%
58: 4% 6% 4% 6%
59: 3% 3% 3% 3%
60: 5% 5% 5% 5%
61: 5% 7% 5% 7%
62: 3% 5% 3% 5%
63: 4% 3% 4% 3%
64: 5% 2% 5% 2%
65: 3% 2% 2% 2%
66: 5% 1% 5% 1%
67: 1% 0% 1% 0%
68: 1% 0% 1% 0%
69: 0% 1% 0% 1%
70: 0% 0% 0% 0%
71: 0% 0% 0% 0%
72: 0% 0% 0% 0%
73: 0% 1% 0% 1%
74: 0% 0% 0% 0%
75: 0% 0% 0% 0%
76: 0% 1% 0% 1%
77: 0% 0% 0% 0%
78: 0% 0% 0% 0%
79: 0% 0% 0% 0%
80: 0% 0% 0% 1%
81: 0% 0% 0% 0%
82: 0% 0% 0% 0%
83: 0% 0% 0% 0%
84: 0% 0% 0% 0%
85: 1% 1%
86: 0% 0%
87: 1% 1%
88: 1% 1%
89: 0% 0%
Your "true" histogram is low frequency. Your noise is high frequency. Low-pass filtering the data with an appropriate bandwidth filter will get rid of most of the noise.
Here's an algorithm:
1. Smooth your data set by calculating a moving average over a small window.
2. Test your smoothed data for local minima (i.e. any single datum that is smaller than its neighbours).
3. If there are more than two local minima, increase the window size, and go to step 1.
Update:
Having looked at the sample data you posted, I've realised that you need to detect minimal plateaus rather than just individual points, so step two in the algorithm should be tweaked to identify a point as part of a minimum if there are no neighbours with smaller values between the nearest higher value neighbours on either side. Then when counting minima in step 3, a minimal plateau should count as a single minimum.
I've tested this algorithm on your example datasets and it performs well, picking minima at 18, 12, 15, 13, 23, 20, 23 and 20 for your datasets respectively.
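A rough Python sketch of the moving-average approach above. It simplifies two things: plateau handling from the update is left out, and the loop simply grows the window until a single interior minimum remains:
def smooth(hist, window):
    half = window // 2
    out = []
    for i in range(len(hist)):
        win = hist[max(0, i - half):i + half + 1]  # moving-average window, clipped at the edges
        out.append(sum(win) / len(win))
    return out

def local_minima(hist):
    # Strict interior minima only; plateau detection is omitted in this sketch.
    return [i for i in range(1, len(hist) - 1)
            if hist[i] < hist[i - 1] and hist[i] < hist[i + 1]]

def find_minimum(hist, window=3):
    while window < len(hist):
        minima = local_minima(smooth(hist, window))
        if len(minima) == 1:
            return minima[0]
        window += 2  # keep the window odd and growing
    return None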
A possible heuristic: use spline approximation to smooth the histogram, making it polynomial-like, and then look for a local minimum.
Note that this is only a heuristic solution and might fail... but I think it will provide a good solution for most cases.
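A sketch of that spline idea with SciPy, assuming the histogram is given as counts over bucket indices; the smoothing factor s is a guess and would need tuning for the data above:
import numpy as np
from scipy.interpolate import UnivariateSpline
from scipy.signal import argrelmin

def spline_minima(hist, s=50.0, points=500):
    x = np.arange(len(hist))
    spline = UnivariateSpline(x, hist, s=s)  # smoothing spline fitted to the histogram
    xs = np.linspace(0, len(hist) - 1, points)
    return xs[argrelmin(spline(xs))]         # x positions of local minima of the smoothed curve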
This actually sounds rather like histogram-based image segmentation to me (although this is not an image, so it's really just histogram segmentation). Sounds weird, but bear with me.
Is what's important about the minimum the fact that it's a minimum, or that it divides the small maximum from the large maximum? If it's the fact that it divides the maxima, then segmentation is definitely what you want.
Have a look at K-means clustering. You'd have two clusters. It's not a terribly complicated procedure, but Wikipedia (and other sources) do a much better job of explaining it than I could, so I'll leave it to them.
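A sketch of the two-cluster idea with scikit-learn, assuming access to the original array of floats rather than the histogram; the boundary between the two clusters then plays the role of the dividing minimum:
import numpy as np
from sklearn.cluster import KMeans

def kmeans_threshold(values):
    data = np.asarray(values, dtype=float).reshape(-1, 1)
    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(data)
    lo, hi = sorted(c[0] for c in km.cluster_centers_)
    return (lo + hi) / 2.0  # midpoint between the two cluster centres is the split point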
