timo | d'oh, the file path is still in something somewhere | 00:55
[Tux] | tux.nl/Talks/CSV6/speed4-20.html / tux.nl/Talks/CSV6/speed4.html tux.nl/Talks/CSV6/speed.log | 06:09
timo | finding some comedy gold in the dump of the core setting. like `my @conversion := <0 1 2 [snip] X Y Z>` resulting in almost 40 wvals in a row, followed by a call to infix:<,> with that amount of arguments :D | 07:41
surely we could have turned that into a compile-time constant in the optimizer ...
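A minimal sketch of the folding being suggested, assuming the list elements are all literals (the name `@conversion-table` is invented for illustration): a `constant` is evaluated once at compile time and, in precompiled code such as the setting, stored as a single serialized object instead of being rebuilt from individual wvals plus an infix:<,> call.

```raku
# built once at compile time; later lookups reuse the same List object
my constant @conversion-table = <0 1 2 3 4 5 6 7 8 9 A B C D E F>;

say @conversion-table[10];    # A
say @conversion-table.elems;  # 16
```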
the QAST we generate for `proto method A(|) {*}` is slightly puzzling to me. i'm sure spesh actually makes it all turn into almost nothing, but it seems wasteful in the context of the setting where a load of onlystar proto methods live | 08:39
we grab a param_sp (slurpy all the positionals) and do nothing with it, and we grab a param_sn (slurpy all nameds), then hllize it (even though param_sn creates a slurpy_hash from the current frame's hll), then we decont it, then we typecheck it against Any | 08:40
the only thing we do with the result from that is assertparamcheck | 08:41
but the optimizer can already statically know that's not necessary, and whatever's spitting out the QAST for an onlystar proto method with just | should also know better i think?
i'm thinking probably something in Perl6::Actions lower_signature is missing something here | 08:45
it's already checking for cases like "final slurpy, don't need to do anything more" and "anonymous slurpy needs no handling" in some places nearby
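To look at the same tree locally, Rakudo can dump the AST it generates for a one-liner; a quick way to reproduce what timo describes (the exact QAST node dump varies between Rakudo versions, and the class name `C` is just a placeholder):

```
raku --target=ast -e 'class C { proto method A(|) {*} }'
# swap in --target=optimize to see the tree after the static optimizer has run
```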
08:47 finanalyst joined
Geth | rakudo/main: 2ee68bffd5 | timo++ (committed using GitHub Web editor) | 2 files
Remove $?FILE and $?LINE where they don't make much sense
The filename and line will get set by World if the exception is thrown at BEGIN time, "file" is not a named argument the X::Comp constructor takes ("filename" would be), so this is not a regression. And the filename and line from the core setting is not very helpful, since you also get a stack trace in that case. | 09:47
10:06 finanalyst left
10:07 sena_kun joined
10:12 sena_kun left
timo | m: say "making `proto method asdf(|) {*}` generate less code shrunk core c setting by $(0x1c01610 * 100 / 0x1c08578) percent; $(0x1c01610 - 0x1c08578) bytes difference" | 10:46 | |||||||||||||||||||||||||||||||||||||
camelia | making `proto method asdf(|) *` generate less code shrunk core c setting by 99.90297437 percent; -28520 bytes difference | ||||||||||||||||||||||||||||||||||||||
timo | haha | ||||||||||||||||||||||||||||||||||||||
drop in the ocean :') | 10:47 | ||||||||||||||||||||||||||||||||||||||
lizmat | also, since those protos are bypassed by dispatch, I doubt it will make it run any faster :-) | 10:49 | |||||||||||||||||||||||||||||||||||||
ab5tract | what! :O | 10:50 | |||||||||||||||||||||||||||||||||||||
timo | so we could just make them completely empty? :D | ||||||||||||||||||||||||||||||||||||||
ab5tract | lizmat: are they bypassed in both old and newdisp? | 10:51 | |||||||||||||||||||||||||||||||||||||
lizmat | they need to exist to get the candidates though
only in newdisp
afaik | 10:52
ab5tract | I think you're right
lizmat | on initial dispatch, the dispatch program records the candidate that matches the given guards
ab5tract | iiuc, a major distinction between the two is that in newdisp the proto is more like a marker that we fetch all the candidates from and then compute the matching candidate(s) | 10:54
whereas in olddisp the computation of the appropriate candidate(s) lives in the meta of the proto itself. | 10:55
lizmat | well, if the proto is foo(|) {*} | 10:56
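A small, hedged illustration of the distinction under discussion (the sub names are invented): an onlystar proto only declares the multi and its widest signature, so the dispatcher can record and call the matching candidate directly, while a proto with a real body is entered on every call and `{*}` marks where the chosen candidate runs.

```raku
# onlystar proto: only declares the multi; dispatch goes straight to candidates
proto sub greet(|) {*}
multi sub greet(Str $name) { say "hi, $name" }

# proto with a real body: the body is entered on every call,
# and {*} is where the winning candidate gets invoked
proto sub shout(|) {
    note "shout proto body runs";
    {*}
}
multi sub shout(Str $name) { say "hey, $name" }

greet("ana");   # hi, ana
shout("bob");   # prints the note, then: hey, bob
```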
releasable6 | Next release in ≈3 days and ≈7 hours. There are no known blockers. Please log your changes in the ChangeLog: github.com/rakudo/rakudo/wiki/ChangeLog-Draft | 11:00
timo | lizmat: what do you think of this as a tool to guide optimizations: on every line of the core setting, show which dispatch calls are compiled for that line | 11:07
ab5tract | m: proto foo(%) { say "ok..."; }; multi foo(@a) { dd :@a }; foo [1,2] # why isn't this a compile time error? | 11:09
camelia | Type check failed in binding to parameter '<anon>'; expected Associative but got Array ([1, 2]) in sub foo at <tmp> line 1 in block <unit> at <tmp> line 1
ab5tract | shouldn't all candidates match an explicit proto sig? or maybe I'm totally missing the point of proto
according to the docs, I'm totally missing the point | 11:13
lizmat | timo: that'd be a nice tool, but wouldn't that make more sense for any piece of Raku code ? | 11:18
timo | it would. no reason to have it limited | 11:19
ab5tract: yeah i think you're right. or maybe it's a DIHWIDT? | 11:22
ab5tract | it feels like one of those super-ultra-wide specification details that are either a) more confusing to the user than clarifying, b) more painful for the implementor than any overall gains for user space, or c) both of the above | 11:23
Geth | rakudo/debugspew_for_lower_signature: 85a07423e3 | (Timo Paulssen)++ | 2 files
Remove $?FILE and $?LINE where they don't make much sense
The filename and line will get set by World if the exception is thrown at BEGIN time, "file" is not a named argument the X::Comp constructor takes ("filename" would be), so this is not a regression. And the filename and line from the core setting is not very helpful, since you also get a stack trace in that case. | 11:24
rakudo/debugspew_for_lower_signature: 4d4604ef66 | (Timo Paulssen)++ | src/Perl6/Actions.nqp
add some debug spew code to lower_signature
activated by env var LOWER_SIG_DEBUG, not very refined.
timo | oh, it reported the other commit because the pull request was squashed in the github web ui?
ab5tract | I'm curious what would happen if we changed that logic and ran a blin run
timo | ^- not meant for inclusion, just for anyone looking at the lower_signature code in the future and doesn't want to type out the debug stuff again | 11:25 | |||||||||||||||||||||||||||||||||||||
ab5tract | timo: yeah, either that or maybe you didn't rebase the branch on HEAD before pushing again
which is probably exactly what you meant by the squashing
timo | right. if i rebase and push again there'll be even more spam :D | 11:27
so, the annotations data is kind of huge actually | 11:35
lizmat | and yet another Rakudo Weekly News hits the Net: rakudoweekly.blog/2024/09/23/2024-...with-raku/ | 11:58
notable6__: weekly
notable6__ | lizmat, 3 notes: 2024-09-17T20:40:09Z <antononcube>: x.com/antononcube/status/1836081920981209089 ; 2024-09-18T17:23:41Z <antononcube>: youtu.be/JHO2Wk1b-Og ; 2024-09-22T15:16:25Z <antononcube>: www.youtube.com/watch?v=QOsVTCQZq_s
lizmat | notable6__: weekly reset | 11:59
notable6__ | lizmat, Moved existing notes to “weekly_2024-09-23T11:59:04Z”
timo | ok so, out of 76541 annotation entries in the core.c.setting, 75109 times the file name was the same as the previous entry's filename, 22652 times the line number was the same as the previous one, and 30263 times the bytecode offset was the same as the previous one. not exactly sure under what circumstances the bytecode offset would be the same for two consecutive annotations | 12:00
the first entry with a bytecode index higher than 0xffff was at 22070 out of 75109, line number and filename string index never went above 0xffff | 12:01
m: say "annotation segment is $(76541 * 12) bytes long; $(76541 * 12 * 100 / 0x1c01610)% of the total file size" | 12:02
camelia | annotation segment is 918492 bytes long; 3.1277634% of the total file size
timo | m: say 0xfc9df0 | 12:05
camelia | 16555504
lizmat | timo: OOC, are you using raku.land/zef:lizmat/MoarVM::Bytecode for this ?
timo | hm, must have made an error somewhere in there
aha! i was wondering where that was | 12:06
no, i'm putting that straight into moarvm's code itself
lizmat | ack
timo | m: say 0xfc9df0 / 12
camelia | 1379625.333333
timo | haha the size of the annotation segment in the output was calculated from unrelated numbers | 12:09
yes this size is so much more reasonable | 12:10
m: say "$(0xbae440 * 100 / 0x1c01610) percent of core setting bytes are Frame Data Table" | 13:03
camelia | 41.70885183 percent of core setting bytes are Frame Data Table
timo | i think it's a reasonable goal to reduce the size of that segment to roughly a quarter | 13:04
lizmat | in memory, or in the file ? | 13:05
timo | in the file
oh, no, i was looking only at a part of that. quarter is probably too ambitious | 13:06
lizmat | would that mean more work when loading frame data ?
timo | i would hope it would mean less work | 14:06
lizmat | that would be good :-) | 14:07
timo | when we open a CU, we go through that entire data linearly, which makes it a good candidate for some kind of compression scheme
variable-width integers would be a good starting point already
once we integrate zstd into moarvm's codebase and make it mandatory, that would also be a possibility
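A minimal sketch of the variable-width integer idea, written in Raku rather than MoarVM's C just to show the shape of the encoding (the sub names are invented; MoarVM would pick its own format): LEB128-style encoding stores small values such as repeated filename indices and most line numbers in a single byte, using a continuation bit for larger ones.

```raku
sub encode-varint(UInt $n is copy --> Buf) {
    my $buf = Buf.new;
    repeat {
        my $byte = $n +& 0x7F;    # low 7 bits of the value
        $n +>= 7;
        $byte +|= 0x80 if $n;     # continuation bit if more bytes follow
        $buf.push($byte);
    } while $n;
    $buf
}

sub decode-varint(Buf $buf, Int $pos is rw --> UInt) {
    my UInt $value = 0;
    my $shift = 0;
    loop {
        my $byte = $buf[$pos++];
        $value +|= ($byte +& 0x7F) +< $shift;
        last unless $byte +& 0x80;
        $shift += 7;
    }
    $value
}

# most annotation fields would fit in one byte; values around 0xffff take three
my $encoded = encode-varint(22070);
say $encoded.elems;                 # 3
my $pos = 0;
say decode-varint($encoded, $pos);  # 22070
```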
lizmat | I hope there'll be a nqp::zstd op as well, to allow MoarVM::Bytecode to continue to work :-) | 14:12
[Coke] | anyone know who nigel hamilton is in re: TRF? Doesn't look like he's on the board. Just an interested party? | 14:18
lizmat | interested party afaik | 14:19
at least according to the TPRF site | 14:21
[Coke] | ah, nevermind, his ticket is complaining about something that doesn't exist. | 14:27
(or maybe on the website code) | 14:31
timo | bit fiddling in nqp isn't easy haha | 18:11
oooh the rare reproduction of the "Requested synthetic 5 when only 1 have been created." error output ... but on 64bit now! | 18:28
18:30 sena_kun joined
timo | i somehow managed to use nqp::writeint to write past the end of a buffer, is that a DIHWIDT? is it supposed to be a "careful or you'll corrupt your memory" API? | 18:35
nqp: class A is repr("VMArray") is array_type(uint8) { }; my $attempts := 1024; while $attempts-- { my $foo := A.new(); my @a = <foo bar baz quux lol omg wtf bbq>; nqp::writeint($foo, nqp::elems($foo), 0x1234567890, 13); nqp::writeint($foo, nqp::elems($foo), 0xAF, 1); nqp::join(" oh ", @a); }; say "survived"; | 18:44
camelia | ===SORRY!=== Error while compiling <tmp> Assignment ("=") not supported in NQP, use ":=" instead at line 1, near " <foo bar " at NQP::src/HLL/Grammar.nqp:234 (/home/camelia/rakudo-m-inst-1/share/nqp/lib/NQPHLL.moarvm:panic) from <unknown>:1 (/home/ca…
timo | nqp: class A is repr("VMArray") is array_type(uint8) { }; my $attempts := 1024; while $attempts-- { my $foo := A.new(); my @a := <foo bar baz quux lol omg wtf bbq>; nqp::writeint($foo, nqp::elems($foo), 0x1234567890, 13); nqp::writeint($foo, nqp::elems($foo), 0xAF, 1); nqp::join(" oh ", @a); }; say "survived";
camelia | ===SORRY!=== Error while compiling <tmp> Confused at line 1, near "say \"survi" at NQP::src/HLL/Grammar.nqp:234 (/home/camelia/rakudo-m-inst-1/share/nqp/lib/NQPHLL.moarvm:panic) from NQP::src/NQP/Grammar.nqp:148 (/home/camelia/rakudo-m-inst-1/share/n…
timo | nqp: class A is repr("VMArray") is array_type(uint8) { }; my $attempts := 1024; while $attempts-- { my $foo := A.new(); my @a := <foo bar baz quux lol omg wtf bbq>; nqp::writeint($foo, nqp::elems($foo), 0x1234567890, 13); nqp::writeint($foo, nqp::elems($foo), 0xAF, 1); nqp::join(" oh ", @a); }; say("survived");
camelia | survived
timo | not like this obviously :)
d'oh | 18:48
yeah that would definitely do it ... | 18:49
> /* No need to account for start, set_size_internal will take care of that */ | 18:50
comments written moments before disaster
Geth | nqp/reproducible_build_improvement: 4 commits pushed by (Timo Paulssen)++, (Elizabeth Mattijsen)++ | 19:25
jdv | i didn't get to doing a blin run today so i'll do it tomorrow | 19:26
Geth | nqp/reproducible_build_improvement: 75edd1c509 | (Timo Paulssen)++ | tools/build/gen-version.pl
Sort files by name before hashing into source-digest | 19:28
nqp/reproducible_build_improvement: 91666ab6b1 | (Timo Paulssen)++ | 2 files
Bring in reproducibility fixes from debian patches
timo | tore out changes that turned out not to be correct (but not critical). nine had approved these before, maybe i don't need a full approval before merging this ahead of the release? it can wait, those changes are not critical, only nice-to-have | 19:31
19:59 finanalyst joined
patrickb | I'd say go ahead. | 20:30
timo | linkable6: bf7c1d0db168ed3285926f2988938e44f7c32cb4 | 20:48
i wonder if this is still good
apparently not lol
github.com/rakudo/rakudo/commit/bf...44f7c32cb4 | 20:50
21:41 sena_kun left
22:58 finanalyst left