01:20 kylese left, hulk joined 02:15 hulk left, kylese joined 03:22 lichtkind__ joined 03:24 lichtkind_ left 03:44 runesicle joined
Geth docker: f9a08f038e | AntonOks++ | 6 files
Bump to 2026.03 [skip workflow] 03:48
03:57 runesicle left 04:12 Aedil joined 04:16 Aedil left 04:19 Aedil joined 07:06 sibl joined 07:07 Sgeo left 07:47 annamalai left 07:48 sibl left, annamalai joined, sibl joined 08:03 dakkar joined 08:04 sibl left, sibl joined 08:12 Aedil left
disbot2 <simon_sibl> I do a lot of silly actions with csv files; I have a 10-line raku file that parses 2 csv files and basically merges them with a simple condition. Isn't there a CLI tool that would do that? Or is the 10-line script probably more flexible? 09:50
timo @simon_sibl maybe this does what you want? miller.readthedocs.io/en/latest/ 09:58
disbot2 <simon_sibl> wow nice, thank you! Indeed, that's a very handy CLI for that 10:01
timo it looks very very powerful, with everything that implies :) 10:02
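simon_sibl's 10-line Raku script isn't shown in the log; as a rough sketch of what a two-file CSV merge with a simple condition can look like (a minimal Python sketch — the file layout, the `id` key column, and the "present in both files" condition are all assumptions, not simon_sibl's actual logic):

```python
import csv

def merge_csv(left_path, right_path, key="id"):
    """Merge two CSV files on a shared key column, keeping only rows
    that satisfy a simple condition (here: the key exists in both files).
    Column names and the condition are illustrative assumptions."""
    # index the right-hand file by its key column
    with open(right_path, newline="") as f:
        right = {row[key]: row for row in csv.DictReader(f)}
    merged = []
    with open(left_path, newline="") as f:
        for row in csv.DictReader(f):
            other = right.get(row[key])
            # the "simple condition": only keep rows matched in both files
            if other is not None:
                merged.append({**row, **other})
    return merged
```

Tools like miller fold this pattern into a single `join` verb, but a script keeps the condition arbitrarily flexible.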
10:06 Aedil joined
timo apparently I used it with this script before to analyze NFAs run during core setting compilation: mlr --icsv --ocsvlite head -n 10000000 then filter '$eos == 0x32aea9' then put -s entry=0 '$len = $offset - $orig_offset + 1; $entry = @entry; @entry += 1' then cut -f nfa,entry,orig_offset,offset,len,nfa then repeat -f len then put -s curpos=-1 'if (@curpos == -1) { $pos = fmtnum($orig_offset, "%d"); @curpos = $orig_offset + 1 } else { $pos = fmtnum(@curpos, "%d"); if (@curpos == $offset) { @curpos = -1; } else { @curpos += 1; }; }' then cut -f pos then uniq -c -f pos then sort -n -f count core_setting_nfa_stats.csv.zst | zstd -v -17 > what_position_how_often.csv.zst 10:13
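The core of that pipeline is the `repeat`/`put` pair: each row's inclusive [orig_offset, offset] span (note `$len = $offset - $orig_offset + 1`) is expanded into one output row per position, and `uniq -c` plus `sort -n` then count how often each position appears. A rough Python equivalent of just that counting step (column names taken from the command; everything else is an assumption):

```python
from collections import Counter

def position_frequencies(rows):
    """Count how often each position is covered, given rows with
    integer orig_offset and offset columns describing inclusive spans."""
    counts = Counter()
    for row in rows:
        # expand the inclusive span into individual positions,
        # mirroring mlr's repeat -f len / put @curpos loop
        for pos in range(row["orig_offset"], row["offset"] + 1):
            counts[pos] += 1
    # least-frequent first, matching `sort -n -f count`
    return sorted(counts.items(), key=lambda kv: kv[1])
```

The mlr version streams the expansion instead of holding it in memory, which matters at the ten-million-row scale the `head -n 10000000` stage suggests.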
10:17 annamalai left, annamalai joined