Perl 6 language and compiler development | Logs at colabti.org/irclogger/irclogger_log/perl6-dev | For toolchain/installation stuff see #perl6-toolchain | For MoarVM see #moarvm | Set by Zoffix on 27 July 2018.
00:05 travis-ci joined
travis-ci | Rakudo build passed. Nick Logan 'Merge pull request #2777 from vrurg/fix_Configure_for_relocatable | 00:05
travis-ci.org/rakudo/rakudo/builds/509191450 github.com/rakudo/rakudo/compare/e...c2c9fea187
00:05 travis-ci left
00:48 MasterDuke joined
00:49 MasterDuke left, MasterDuke joined
01:25 leont left
MasterDuke | now that rakudo is relocatable, can the Windows installer let you select where to install to? | 02:02
ugexe | isn't part of the reason it installs to that location that the path has no spaces in it, unlike a typical Windows user path, e.g. C:\Users\My Name\ | 02:09
i don't think rakudo is installable to a path with a space in it, or at least it wasn't a few years back | 02:10
MasterDuke | hm, i don't have a windows machine to test with, but hopefully somebody can. there have been multiple people asking about being able to install to a chosen path over the past couple of years | 02:33
ugexe | now that i think about it the msi might be ok. it might have just been the Makefile which doesn't quote its paths | 02:37
03:43 squashable6 left
03:45 squashable6 joined, ChanServ sets mode: +v squashable6
05:13 discord61 joined
05:14 discord6 left, discord61 is now known as discord6
05:20 discord62 joined
05:21 discord6 left, discord62 is now known as discord6
06:16 robertle left
lizmat | Files=1253, Tests=87982, 398 wallclock secs (21.00 usr 6.57 sys + 2857.38 cusr 241.60 csys = 3126.55 CPU) | 09:01
it's been a long time since it was below 400!
Geth | rakudo: 4189e20c46 | (Kane Valentine)++ (committed using GitHub Web editor) | docs/release_guide.pod
claim the next rakudo release | 09:12
lizmat | kawaii++ | 09:16
Geth | rakudo: 0e07079bad | (Elizabeth Mattijsen)++ | src/core/Rakudo/QuantHash.pm6
Move QuantHash.kv build logic to !SET-SELF
So that it can be more easily overridden in a class | 09:32
rakudo: 9399ea1198 | (Elizabeth Mattijsen)++ | 2 files
Add basic check for keys in SETTING:: and PROCESS::
10:01 robertle joined
kawaii | AlexDaniel: do you have some time perhaps on Saturday to run through 'releasing' a mock 2019.03.2 with me? :) | 10:04
I will need some practice using the Sakefile and other tools before we do it for real
AlexDaniel | kawaii: OK, so the sakefile is cool because it does everything locally. So you'll have rakudo and nqp repos on your computer with all the commits and tags, and also the tars. All of that you can inspect without pushing it anywhere | 10:06
kawaii | So does the sakefile spit out a tar.gz I can actually `perl Configure.pl`, `make` and `make install`? | 10:08
AlexDaniel | yeah, the same ones you see on rakudo.org/files/rakudo | 10:09
kawaii: so you just go to the tools/releasable directory and run something like `VERSION=2019.04 sake all` | 10:10
kawaii: `all` is not exactly all, it excludes the last step which pushes and uploads stuff
kawaii | sure, do you need my keys again by the way? for when it's time for me to actually upload a 'real' release? | 10:12
[Tux] |
AlexDaniel | releasable6: status
releasable6 | AlexDaniel, Next release in ≈30 days and ≈8 hours. 0 blockers. 0 out of 97 commits logged
releasable6 | AlexDaniel, Details: gist.github.com/93f9e23bb2d5a59fc0...bee78831e6
(working from home, so the box is "in use")
AlexDaniel | kawaii: in around 30 days
kawaii: ok, another thing is that the sakefile is designed to ssh into a faster machine to run tests and stuff | 10:14
kawaii: the whole process takes around 1 hour if I'm not mistaken, most of it is spent on running the spectest excessively :) | 10:15
kawaii | my laptop has 8 cores and 16 GB of RAM, is that 'faster' enough to just run locally?
AlexDaniel | kawaii: for example, if a single test file flaps, it reruns the spectest…
kawaii: pretty much any machine is fine as long as you're willing to wait, my laptop at the time was just much slower than my server so that's why
kawaii | I will spin up a VM at $work with a dozen or so cores for a few hours then :) | 10:16
AlexDaniel | kawaii: github.com/rakudo/rakudo/blob/9399...e#L91-L108
kawaii: these are the only lines that do ssh stuff, and you can simply replace them with a non-ssh equivalent | 10:17
kawaii | so if I want to run locally only, how do I negate that?
ah okay understood
AlexDaniel | the thing might benefit from a configurable flag, or something… of course :)
kawaii | github.com/rakudo/rakudo/blob/9399...kefile#L26 | 10:18
looks like I can just cheat and have it SSH into localhost too
AlexDaniel | maybe, yeah | 10:19
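A hypothetical sketch of the configurable-flag idea mentioned above, in plain Perl 6 (the `run-tests` sub and the host name are made up for illustration; this is not the actual Sakefile code): run the spectest locally unless a remote host is explicitly given.

sub run-tests(Str :$host) {
    # run the spectest locally, or over ssh when a host is explicitly given
    my @cmd = $host ?? ('ssh', $host, 'make spectest') !! ('make', 'spectest');
    run |@cmd;
}
run-tests();                        # local run: any machine is fine if you can wait
# run-tests(:host<some-fast-box>);  # or delegate to a faster box, as the Sakefile does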
kawaii | so aside from releases, is there anything I should be doing on a day-to-day basis too? i.e. checking over and assigning issues in the rakudo repo, choosing which issues are blockers etc? | 10:21
AlexDaniel | kawaii: yes, though maybe not exactly day-to-day but more like week-to-week | 10:31
kawaii: so the list of issues here you'll need to go through, yes: github.com/rakudo/rakudo/issues | 10:32
kawaii: if something looks like it breaks stuff for people and you don't want to make a release with that, use the blocker label on it
we used to have this thing: fail.rakudo.party/ | 10:33
kawaii | AlexDaniel: sure okay, I will probably ping you excessively about which issues are actually important over the next few releases until I learn better :)
AlexDaniel | and there were checkboxes near every ticket to make sure that you did indeed check whether it's a blocker or not
we sorta lost that nice thing once we moved to github, and I was hoping that fail.rakudo.party/ would start listing github tickets eventually | 10:34
but nobody implemented that in the end, so here we are, just look through the issues carefully :)
kawaii | I want to make a PR to Blin at some point over the next few weeks, to allow it to scp/rsync the output files somewhere to be rendered as HTML upon completion - this will help for running it in Docker/containers hopefully. Will add some level of automation to ecosystem testing. | 10:35
AlexDaniel | kawaii: there are some rules of thumb: if the issue existed in previous releases, then it's probably not a blocker | 10:36
kawaii: that said, sometimes it's worth considering when the last release of Rakudo Star was; if the issue existed in the last 2-3 releases but didn't exist in the latest Star, maybe it's better to fix it now | 10:37
kawaii: and in my view, anything that breaks a module or is likely to break someone's code is a blocker, even if it's a fix… | 10:38
kawaii | understood :) | 10:39
AlexDaniel | some changes to the language can be versioned, i.e. you only get them if you `use v6.e` or whatever. So anything that can potentially break stuff should use that mechanism. There are limitations to that, unfortunately… you'll see… | 10:41
kawaii: do you have an account on rt.perl.org/ ? | 10:42
I think you should be able to login with auth0 using your github account…
kawaii | I do now | 10:43
AlexDaniel | ok now who is able to give privs for that thing…
Coke probably… | 10:44
kawaii: at the top it says "Logged in as …", can you mail Coke asking for privs on RT for that "…"? | 10:46
kawaii: now, this is the old issue tracker that we no longer use, you won't be touching it often
kawaii | Will do, I assume Coke works at/has some affiliation with TFP then?
AlexDaniel | I guess so! | 10:47
:)
you do need access to that just because eventually you'll get someone asking to close/move a ticket or whatever, and it'd be stupid if you didn't have privs for that
CC me just in case
kawaii: you asked about keys, it's all fine except that I probably need to figure out how to sign your gpg key | 10:50
and your ssh key needs to be added to the rakudo user on www.p6c.org so that you can upload tars
kawaii: can you try ssh [email@hidden.address] | 10:52
kawaii | AlexDaniel: got asked for a password | 10:53
AlexDaniel | kawaii: I added the one on top: github.com/kawaii.keys
kawaii | yep that's right, how old is this box? Some people have issues using my key on older systems | 10:54
ah
my mistaje
mistake*
I'm in!
AlexDaniel | ok there's another machine, but I have to run real quick… we'll be back in 30 mins | 10:56
kawaii | no problem!
11:05 leont joined
11:42 leont left
Geth | rakudo: a6a607054a | (Elizabeth Mattijsen)++ | src/core/Rakudo/Internals.pm6
Add R:I.ITERATIONSET2LISTITER method
Takes an IterationSet type / object and returns a nqp::list iterator for its keys. To be used in SetHash/BagHash/MixHash iterators. | 11:45
rakudo: 1f066d96a2 | (Elizabeth Mattijsen)++ | src/core/SetHash.pm6
Make SetHash iterators more safe
Currently, deleting a key from a nqp::iterator of an nqp::hash and then continuing iterating over the hash, may cause segfaults. Prevent that situation by letting the SetHash.kv/pairs/values methods run on an iterator on a pre-made list of keys, rather than directly on the nqp::hash iterator. ... (6 more lines)
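A minimal sketch of the pattern these commit messages describe, written in plain Perl 6 rather than the nqp-level internals the commits actually touch (the hash and key names are illustrative only): snapshot the keys up front, then delete freely while walking the snapshot.

# Illustration only: iterate over a pre-made list of the keys, so deleting
# entries cannot invalidate the iterator that is being walked.
my %h = a => 1, b => 2, c => 3;
for %h.keys.List -> $k {          # snapshot of the keys, taken before any mutation
    %h{$k}:delete if $k eq 'b';   # safe: we are not walking %h's own iterator
}
say %h.keys.sort;                 # (a c)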
AlexDaniel | kawaii: ah no, that's the only machine you need access to | 11:50
kawaii: the sakefile attempts to upload to p6c.org and rakudo.org, these used to be different at the time | 11:51
but not anymore | 11:52
12:42 squashable6 left
12:43 squashable6 joined
Geth | rakudo: 63657986f9 | (Elizabeth Mattijsen)++ | src/core/BagHash.pm6
Make BagHash iterators more safe
Currently, deleting a key from a nqp::iterator of an nqp::hash and then continuing iterating over the hash, may cause segfaults. Prevent that situation by letting the BagHash.kv/pairs/values methods run on an iterator on a pre-made list of keys, rather than directly on the nqp::hash iterator. ... (6 more lines) | 13:21
13:57 robertle left
Geth | rakudo: c5664301b8 | (Elizabeth Mattijsen)++ | src/core/BagHash.pm6
Fix adding object to BagHash after it having been removed
This was a forgotten fix from the previous commit | 13:58
rakudo: 787d5bf6e2 | (Elizabeth Mattijsen)++ | src/core/MixHash.pm6
Make MixHash iterators more safe
Currently, deleting a key from a nqp::iterator of an nqp::hash and then continuing iterating over the hash, may cause segfaults. Prevent that situation by letting the MixHash.kv/pairs/values methods run on an iterator on a pre-made list of keys, rather than directly on the nqp::hash iterator. ... (6 more lines)
13:58 robertle joined
14:05 lizmat left, lizmat joined
14:10 lizmat_ joined, lizmat left, lizmat__ joined
14:14 lizmat_ left
14:20 lizmat__ left
14:29 lizmat joined
14:39 squashable6 left
14:42 squashable6 joined
14:50 squashable6 left
14:52 lizmat left
14:56 squashable6 joined, ChanServ sets mode: +v squashable6
15:22 ExtraCrispy joined
15:35 robertle left
16:16 epony left
vrurg | what does nqp::freshcoderef do? | 16:17
16:30 epony joined
16:34 leont joined
16:51 TimToady left
16:52 TimToady joined
17:37 lucasb joined
17:46 robertle joined
17:47 lizmat joined
17:48 lizmat left, lizmat joined
timotimo | vrurg: i think it strips a closure from a code object | 17:58
m: my $foo = 999; sub test { say $foo }; use nqp; my $blubb = nqp::freshcoderef(nqp::decont(&test)); $blubb() | 17:59
camelia | freshcoderef requires a coderef in block <unit> at <tmp> line 1
timotimo | m: my $foo = 999; sub test { say $foo }; use nqp; my $blubb = nqp::freshcoderef(nqp::decont(nqp::getattr(&test, Code, '$!do')); $blubb()
camelia | 5===SORRY!5=== Error while compiling <tmp> Cannot use variable $blubb in declaration to initialize itself at <tmp>:1 ------> 3nt(nqp::getattr(&test, Code, '$!do')); $7⏏5blubb() expecting any of: argument list …
vrurg | timotimo: thanks!
timotimo | m: my $foo = 999; sub test { say $foo }; use nqp; my $blubb = nqp::freshcoderef(nqp::decont(nqp::getattr(&test, Code, '$!do'))); $blubb()
camelia | 999
timotimo | ah, well
needs a sub around test and $foo to work i guess
jnthn | In MoarVM, iirc, it clones the StaticFrame | 18:02
As well as the code object
So that you get a new "family" of closures
Which is used so that in the compiler we can install the same compile-me-on-demand thunk
But distinguish between clones of each one | 18:03
timotimo | huh, interesting | 18:04
i don't think i recall compile-me-on-demand stuff in moar/nqp/rakudo
vrurg | jnthn: It doesn't help in the case of an onlystar proto cloned for a child class. R#2772 | 18:05
The thunk used in the stub sub wrapped in freshcoderef lazy-compiles the original code from the parent class even though it is a clone. | 18:06
So, when a multi method is called on a child class at compile time, it actually calls the proto from the parent class, effectively losing all candidates from children. | 18:08
jnthn | timotimo: It's for when things are called at BEGIN time that are in the current compilation unit | 18:09
18:15 lizmat left
timotimo | oh, so "on demand" isn't like "lazily at some later point" | 18:19
18:21 lizmat joined
18:35 ggoebel joined
ggoebel | jnthn: is there a link to an index of your papers? www.jnthn.net/papers/ gives "forbidden" | 18:40
18:53 patrickb joined
timotimo | ggoebel: slides.html has many links into the papers/ folder | 19:09
not sure if it's all of them, but i'm sure it's most | 19:10
19:15 patrickb left
ggoebel | thanks, I'm wondering if one is a symbolic link to the other | 19:16
19:21 ggoebel left
19:34 lizmat left
19:36 lizmat joined
vrurg | jnthn: are you available yet? | 19:39
looks like freshcoderef doesn't work the way I expected... :( | 19:45
timotimo | maybe it's more of a lookup problem than a cloning problem? just spitballing | 19:46
vrurg | I just don't see a new closure created for a new code. | 19:47
timotimo | is the closure responsible for finding the candidates? i'm not familiar with this code tbh | 19:48
vrurg | gist.github.com/vrurg/26a12bfe4758...11dd2379a9 - this code outputs 1 then 2
timotimo: No, it is responsible for compiling a code object on the fly, before serialization is ready. | 19:49
And two different code objects get the same bytecode because the same closure is used. | 19:50
lizmat | m: say "\x0075\x0308".chars # ok | 19:52
camelia | 1
vrurg | nqp::getcodeobj then always returns the code object for the parent class because it sees the same precompiled block. I'm guessing the cause, but not the result.
lizmat | m: say "\x0075 \x0308".chars # huh? sorta expected "3"
camelia | 2
lizmat | m: .say for "\x0075 \x0308".uninames | 19:53
camelia | LATIN SMALL LETTER U SPACE COMBINING DIAERESIS
lizmat | ahhh... the diaeresis got combined with the space, duh
m: say "\x0075 \x0308"
camelia | u ̈
lizmat | right, sorry for the noise :-) | 19:54
19:57 MasterDuke left
timotimo | i wonder if for freshcoderef to do what you want, you'd still have to have a takeclosure call somewhere | 20:03
vrurg | timotimo: I'll check it. Thank you! | 20:05
timotimo | or is it capturelex? | 20:06
vrurg | Dunno. My deepest wish is to get complete documentation on the moar ops. | 20:07
timotimo | i'm sure you've seen the partial docs we have?
vrurg | Ideally, a book/docs on MoarVM, including internals.
timotimo | the ops.md or something similar in nqp/docs | 20:08
vrurg | I'm not closing the nqp/ops.markdown tab in my browser. :)
timotimo | good :)
vrurg | But it lags behind. And it's sometimes too brief. | 20:09
timotimo | aye, it's rather brief indeed | 20:10
have you ever tried the debugserver? it could a) help you inspect what's happening or b) tell me what else is missing :D
vrurg | timotimo: we've already discussed it. Debugging is probably one of a few things where I hate the CLI. | 20:11
timotimo | oh! | 20:12
OK, then the next step has to be that comma can run the debugger against a "foreign" process :)
i.e. one that it didn't start itself
vrurg | This is another thing: neither Comma nor IntelliJ works for me. Comma fails with the vim plugin, IntelliJ fails with the Comma plugin. | 20:13
timotimo | ugh! sorry to hear that!
(and now i remember that you already told us about that)
vrurg | I can help with debugging IntelliJ some time later, if you wish. | 20:14
Lunch time. I'll be gone for some time... | 20:15
timotimo | perhaps sena_kun would be a better candidate for a debug session like that; i haven't had my feet dipped into the code for a while now :(
20:20 lizmat left
20:25 lizmat joined
20:30 lizmat left
21:05 ufobat joined
Geth | rakudo: ugexe++ created pull request #2778: Retain original provides metadata structure | 21:25
vrurg | jnthn: if you're here by chance: do I understand it correctly that nqp::freshcoderef creates a new "family" of closures but the lexicals in those closures are shared? | 22:22
22:32 lizmat joined
lizmat | Something's amiss with Inline::Perl5 detection in "make spectest" | 22:39
$ zef install Inline::Perl5
All candidates are currently installed
Testing Roast version 6.d-proposals using test file list from t/spectest.data
Inline::Perl5 not installed: not running Perl 5 integration tests
I tried force-installing Inline::Perl5, but that didn't make a difference | 22:40
vrurg | lizmat: I didn't dig deep, but I suspect that spectest is using the local perl6 in the build directory, and you seemingly use the installed binary. | 22:46
$ENV{'HARNESS_PERL'} = ".${slash}perl6-" . ($js ? "js" : $moar ? "m" : $jvm ? "j" : "m");
A line from the harness5 script
lizmat | but that would imply that the Inline::Perl5 tests would *never* get run
and they did get run in the past... hmmm... | 22:47
vrurg | kinda. That amazed me too at some point.
lizmat | ok, too tired now to think straight
if nobody has filed an issue about this tomorrow, I will
vrurg | that one must be easy to fix. Though I would avoid using the installation path because it may influence spectest results. | 22:49
Perhaps Inline::Perl5 must be installed on request. | 22:50
gfldex | I filed the odd bug that regressed WWW in #2779 . | 22:56
As a side effect I now have Rakudo 100x on my hard disk :) | 22:57
That's how much I like Perl 6.
jnthn | vrurg: What do you mean by "the lexicals"? | 23:01
vrurg | gist.github.com/vrurg/26a12bfe4758...11dd2379a9 | 23:02
jnthn | vrurg: Each one would be lexically captured per clone
Ah, in that case there's no capturelex going on
Or closure clone | 23:03
So yeah, they'd see the same $a.
vrurg | Is it possible to have closures cloned? That'd be the best solution for proto method cloning.
jnthn | Hm, I thought deriving a dispatcher already did a clone? | 23:04
vrurg | It does. But it's a stub which is cloned, actually, during compile time. | 23:05
So, when the stub runs its thunk, it happens only once: for the parent class and the original proto.
jnthn | Routine.HOW.add_method(Routine, 'derive_dispatcher', nqp::getstaticcode(sub ($self) { | 23:06
my $clone := $self.clone();
nqp::bindattr($clone, Routine, '@!dispatchees',
nqp::clone(nqp::getattr($self, Routine, '@!dispatchees')));
$clone
vrurg | Just because $precomp in World::finish_code_object is shared among stubs.
jnthn | Hm, did you figure out why it then ends up with the wrong set of dispatchees? | 23:07
vrurg | I know this code. But it gets called before the actual method is called and the stub got its change to complete the work.
jnthn | s/change/chance/ ? | 23:08
vrurg | Yes, I did. It takes them from the original proto returned by getcodeobj
Not the cloned proto in the class.
So, what happens is the proto is kept in AST form until the method is called. But before that it is cloned by derive_dispatcher. | 23:09
23:09 Geth left
vrurg | Now we have two+ copies. Both as AST. Then the method gets called, the stub is executed, and it returns the code tied to the code object of the original proto. Always. | 23:10
my $stub := nqp::freshcoderef(sub (*@pos, *%named) { | 23:11
unless $precomp {
$compiler_thunk();
nqp::say("PRECOMP:" ~ $precomp.HOW.name($precomp)) if nqp::getenvhash<RAKUDO_DEBUG>;
}
$precomp(|@pos, |%named);
});
This is where it happens. Now, it doesn't matter on what other class the proto is called: $precomp is what it gets. | 23:12
This wouldn't happen if the closures of the cloned stubs were cloned too. | 23:13
jnthn | Hmm... I wonder if there's a more general issue that can be demonstrated here... | 23:15
m: sub foo($a) { -> { $a } }; BEGIN { my @a = foo(1), foo(2); say @a[1]() } | 23:16
camelia | 2
jnthn | m: sub foo($a) { -> { $a } }; BEGIN { my @a = foo(1), foo(2); say @a[0]() } | 23:17
camelia | 1
jnthn | Ah, of course, that won't, 'cus we compile foo() before ever dealing with the closure
vrurg | So far I can only observe it within a module. | 23:18
Moving the classes from a module into a script "fixes" the bug. | 23:19
jnthn | I wonder if perhaps replacing `$precomp(|@pos, |%named);` with `nqp::curcode()(|@pos, |%named)` could do it
vrurg | I'm not that good with the ops. But wouldn't it call the stub again? | 23:20
But I'd try. | 23:21
jnthn | Ah, right, I guess it needs to be nqp::getcodeobj(nqp::curcode()) | 23:22
Which I believe gets updated by calling $compiler_thunk
The aim being that we then invoke it on the correct code object instance with the correct @!dispatchees
But I've got a feeling something else might be afoot. I'm probably too exhausted to figure it out right now. | 23:23
vrurg | I wasn't sure if $!do is updated at this point. And I doubt it is.
But it can be taken care of. I'll investigate. Thank you!
jnthn | $compiler_thunk calls compile_in_context which I'm pretty sure does do that | 23:24
23:24 MasterDuke joined, MasterDuke left, MasterDuke joined
jnthn | See the chunk of code beneath the comment "# We un-stub any code objects for already-compiled inner blocks" | 23:24
vrurg | Yep, already found it. | 23:25
Deep recursion. But I think I'll manage it now. Thanks! | 23:26
I was looking for a way to avoid $precomp and you gave it to me. | 23:27
jnthn | Glad I could help a bit; thanks for working on this stuff | 23:29
timotimo | \o/
jnthn | That whole area (compile_in_context and the machinery around it) is a tad fraught; I suspect there's a better way, but I never managed to put my finger on it yet.
'night o/ | 23:41
vrurg | gnight, jnthn ! | 23:47
MasterDuke | ugh. nqp was rebootstrapped recently and now my default-int PR has conflicts. what's a good way to go about rebasing it and creating its new bootstraps? | 23:56 |