timotimo yeah 00:01
lizmat Files=1345, Tests=117140, 235 wallclock secs (29.81 usr 9.17 sys + 3186.32 cusr 316.05 csys = 3541.35 CPU) 09:52
Geth rakudo: b1f59a2f91 | (Vadim Belman)++ | src/Perl6/Actions.nqp
Improve the performance of signature binding

Replacing `nqp::ctxcode(nqp::ctx)` pair with `nqp::curcode` reduces the overall performance loss on a test script from 20-30% to 14-20% with 14% being the most likely outcome.
09:53
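For readers unfamiliar with the NQP ops named in the commit message, a minimal sketch of the substitution it describes (illustrative only, not the actual diff — the real change lives in src/Perl6/Actions.nqp):

```
# Before: two ops -- grab the current context, then ask it
# for the code object that is executing in it
my $code := nqp::ctxcode(nqp::ctx());

# After: one op that answers the same question directly
my $code := nqp::curcode();
```

Avoiding the intermediate context fetch on every signature bind is what recovers the few percent of performance mentioned above.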
rakudo: f2ad038701 | (Elizabeth Mattijsen)++ (committed using GitHub Web editor) | src/Perl6/Actions.nqp
Merge pull request #4059 from vrurg/rakudo_4056

Improve the performance of signature binding
lizmat Files=1345, Tests=117140, 235 wallclock secs (29.81 usr 9.17 sys + 3186.32 cusr 316.05 csys = 3541.35 CPU) 11:02
^^ before the latest merge
Geth rakudo: f2851b9078 | (Elizabeth Mattijsen)++ | src/core.c/REPL.pm6
Alternate fix for #4057

Instead of trying to look for statements, take the whole of a successful statement (whether from one line or multiple lines) as a single thing, assign it to the nameless state variable to prevent "useless use" warnings, and join it up with whatever the next line brings when it is evalled.
11:07
linkable6 RAKUDO#4057 [open]: github.com/rakudo/rakudo/issues/4057 REPL complaints about "useless" actions
Geth rakudo: aecfc9b3d0 | (Elizabeth Mattijsen)++ | src/core.c/Supply-coercers.pm6
Increase sensitivity of Supply.batch(:seconds) x 1000

Instead of using "per second" granularity for batching based on elapsed time, allow for millisecond granularity. The value still needs to be specified in seconds, so to get batching per millisecond, one would specify "seconds => 0.001".
13:22
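A hedged illustration of the finer granularity the commit describes (values and output are timing-dependent, so treat them as illustrative):

```raku
# Batch values emitted every ~0.2ms into 1ms buckets; the :seconds
# argument is still expressed in seconds, so 0.001 means one millisecond.
my $s = Supply.interval(0.0002).head(20);
react {
    whenever $s.batch(seconds => 0.001) -> @batch {
        say @batch;   # e.g. (0 1 2 3 4) -- exact grouping varies per run
    }
}
```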
timotimo m: Supply.interval(0.1).delayed(0.5).tap(&say); say "start"; sleep 0.2; say "0.2"; sleep 0.4; say "0.7"; sleep 0.3; say "1"; sleep 1; say "2"; 14:25
camelia start
0.2
0
1
0.7
2
3
4
1
5
6
7
8
9
10
11
12
13
14
2
Geth rakudo: 1801a5aa3a | (Elizabeth Mattijsen)++ | src/core.c/Supply-factories.pm6
Remove some unnecessary returns
16:19
Xliff Supply.interval(0.1).delayed(0.5).tap({ say 0.1 * $_ + 0.5 }); say "start"; sleep 0.2; say "0.2"; sleep 0.4; say "0.7"; sleep 0.3; say "1"; sleep 1; say "2"; 16:26
evalable6 start
0.2
0.5
0.6
0.7
0.7
0.8
0.9
1
1
1.1
1.2
1.3
1.4
1.5
1.6
1.7
1.8
1.9
2
Xliff Supply.interval(0.1).delayed(0.5).tap({ say 0.1 * $_ + 0.5 }); say "start"; sleep 0.2; say "0.2"; sleep 0.4; say "0.7"; sleep 0.3; say "1"; sleep 1; say "2"; sleep 0.5 16:27
evalable6 start
0.2
0.5
0.6
0.7
0.7
0.8
0.9
1
1
1.1
1.2
1.3
1.4
1.5
1.6
1.7
1.8
2
1.9
2
2.1
2.2
2.3
Xliff Supply.interval(0.1).delayed(0.5).tap({ say 0.1 * $_ + 0.5 }); say "start"; sleep 0.2; say "0.2"; sleep 0.4; say "0.7"; sleep 0.3; say "1"; sleep 1; say "2"; sleep 0.5 16:29
evalable6 start
0.2
0.5
0.6
0.7
0.7
0.8
0.9
1
1
1.1
1.2
1.3
1.4
1.5
1.6
1.7
1.8
1.9
2
2
2.1
2.2
2.3
2.4
lizmat Xliff: can't see the forest for the trees... is there an issue there? 17:05
Xliff No. Was just testing something. 17:07
It would be interesting if I could get the interval argument and the delay into the tap handler. 17:08
lizmat you know you can privmsg camelia
Xliff Yes. I know. And yet others use it like this just as often if not more so. 17:09
lizmat my $interval = 0.1; my $delayed = 0.5; Supply.interval($interval).delayed($delayed).tap: { say $interval, $delayed } # Xliff 17:10
Xliff: ah, but you don't know how many more people privmsg camelia :-) 17:11
Xliff That is irrelevant.
lizmat oki :-)
Xliff Thank you for your example. It was already considered and not what I meant. I'm assuming that there's no way to do it without the lexicals? 17:12
lizmat not that I'm aware of
Xliff OK. Thanks anyways! :) 17:13
lizmat I guess using dynamic variables
Xliff I would worry that they would get eaten by scope changes. 17:14
However that might only be when I run them through NativeCall.
Dynamics hate NativeCall.
lizmat ah, I didn't know
Xliff ¯\_(ツ)_/¯ 17:15
Geth rakudo/emit-on-empty: cb8eb68a92 | (Elizabeth Mattijsen)++ | src/core.c/Supply-coercers.pm6
Add :emit-(once-)on-empty to Supply.batch(:$seconds)

With :emit-on-empty, you *will* get an emit even if the batch is empty. So that means you will get at least 1 emit for every $seconds.
With :emit-once-on-empty, you will get an emit once if the batch is empty, and no further emits until more values have been received.
17:30
rakudo: lizmat++ created pull request #4060:
Add :emit-(once-)on-empty to Supply.batch(:$seconds)
roast: bee06324a0 | (Nicholas Clark)++ | S32-io/utf16.t
:enc(utf16) assumes host byte order if there is no BOM, so test accordingly.

The test had always used 'sample-UTF-16LE.txt' when testing :enc(utf16), but that is only appropriate on little endian systems. For big endian systems, we need to test with 'sample-UTF-16BE.txt' instead.
Fixes #693
17:34
linkable6 ROAST#693 [closed]: github.com/Raku/roast/pull/693 :enc(utf16) assumes host byte order if there is no BOM, so test accordingly.
Geth rakudo/emit-on-empty: 492651ea48 | (Elizabeth Mattijsen)++ | src/core.c/Supply-coercers.pm6
Add :emit-timed to Supply.batch(:$seconds)

When :emit-timed is specified, a timer will run, emitting whatever is in the batch every given number of seconds *if* there is something in the batch. The default is to only emit when a new value is received and it arrives in a new period.
This removes the previous proposal for :emit-on-empty and :emit-once-on-empty.
18:02
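A sketch of how the proposed flag would be used (the name and behaviour are as described in the branch commit above, not guaranteed to match any released Rakudo):

```raku
# :emit-timed drives emits from a timer rather than from the arrival
# of a value in a new period; quiet stretches produce no batches.
my $bursty = supply {
    for ^3 { emit $_; sleep 0.05 }
    sleep 1;                       # a quiet period: no emits, no batches
    for 3..5 { emit $_; sleep 0.05 }
}
react {
    whenever $bursty.batch(seconds => 0.2, :emit-timed) -> @batch {
        say @batch;                # non-empty batches, every ~0.2s
    }
}
```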
Xliff : class A { }; say A.^lookup("elems").gist 19:26
m: class A { }; say A.^lookup("elems").gist
camelia Method+{is-nodal}.new
Xliff Is .elems now a global method?
m: Any.^lookup("elems").gist.say # .elems now on Any 19:28
camelia Method+{is-nodal}.new
Xliff bisectable: bisect Any.^lookup("elems").gist.say 19:29
bisectable6 Xliff, Will bisect the whole range automagically because no endpoints were provided, hang tight
Xliff, ¦6c (49 commits): «===SORRY!=== Error while compiling /tmp/SNBfIGlai7␤Undeclared routine:␤ bisect used at line 1␤␤ «exit code = 1»»
Xliff, Nothing to bisect!
Xliff bisectable: Any.^lookup("elems").gist.say
bisectable6 Xliff, Will bisect the whole range automagically because no endpoints were provided, hang tight
Xliff, More than 4 changes to bisect, please try a narrower range like old=2017.09 new=HEAD 19:30
Xliff, Output on all releases: gist.github.com/f7f4d84f5291b1006a...3955c50561