japhb .ask Zoffix I've got a test failure in Inline::Perl5 on one of my machines here (fresh Rakudo stack as of 2 PM PDT + panda modules install, on a fully patched Linux Mint 17 + perl 5.18.2 system) in t/p6_object_destructor.t: "chars requires a concrete string, but got null" ... known? 01:13
yoleaux2 japhb: I'll pass your message to Zoffix.
[Tux] This is Rakudo version 2016.09-113-g01321ca built on MoarVM version 2016.09-15-gc8b4228 06:12
csv-ip5xs 3.109
test 16.334
test-t 7.244
csv-parser 17.854
nine japhb: I've seen it a couple of times in the past week. It's a real heisenbug again. Vanishing with totally unrelated code changes. Makes debugging quite hard. 06:29
sortiz \o #perl6-dev 06:35
nine Seems like CUnions can not be used as value types :/ Reading the tests in 13-union.t, rakudo seems to always expect passing and getting pointers. 06:56
This sucks. What do we have Pointer[MyUnion] for then? And can we still change that? 06:57
arnsholt I don't think any of the NativeCall types can be used as value types yet 08:34
It's been discussed, but no one's figured out quite the correct API yet
I think a large-scale overhaul of the whole NC API might be in order, honestly
Unfortunately, that'll break a non-trivial amount of modules 08:35
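For context, a minimal sketch of the pointer-only situation being described (the union, function, and library names are made up for illustration; this is not any module's actual API):

    use NativeCall;

    # Hypothetical union: today NativeCall expects it to travel behind a
    # Pointer rather than being passed or returned by value.
    class MyUnion is repr('CUnion') {
        has int32 $.as-int;
        has num64 $.as-num;
    }

    # What currently works: get a Pointer[MyUnion] back from C.
    # Returning or accepting MyUnion itself as a value is what's missing.
    sub get_union() returns Pointer[MyUnion] is native('demo') { * }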
Zoffix 10:04
yoleaux2 01:13Z <japhb> Zoffix: I've got a test failure in Inline::Perl5 on one of my machines here (fresh Rakudo stack as of 2 PM PDT + panda modules install, on a fully patched Linux Mint 17 + perl 5.18.2 system) in t/p6_object_destructor.t: "chars requires a concrete string, but got null" ... known?
nine arnsholt: the good news is that NativeCall is not in CORE. So we could use NativeCall:api<2> for example 10:07
Though it's sad that you'd have to type those extra characters (and know that you should do that) just to get the sane API 10:09
RabidGravy arnsholt, it will break about *half* of my modules (26) 10:25
arnsholt Yeah, lots of stuff 10:26
tadzik fortunately, we have module versioning and we shouldn't be depending on version * in our modules :) 10:27
arnsholt =)
nine I really should finish my lexical_module_load branch. Without that the whole versioning is only half as useful. 10:29
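For reference, the pinning tadzik and nine are talking about looks roughly like this on the consumer side; the module name is hypothetical and a NativeCall:api<2> does not exist yet, but the :ver/:auth/:api adverbs themselves are ordinary Perl 6 syntax:

    # Hypothetical dependency, shown only to illustrate the adverbs.
    use My::Dependency:ver<1.2+>:auth<github:someone>:api<2>;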
timotimo will we have something that'll allow us to put multiple versions of the same compunit into one distribution? 10:37
nine That would be very useful as well. Shouldn't be difficult to add that to CURI. It's just a matter of deciding on a META6.json format. 10:46
timotimo right 10:47
Zoffix Good god. They made a remote arbitrary code execution via a JPEG: thehackernews.com/2016/10/openjpeg-...-hack.html
timotimo i don't think there'd be a problem for entries into the provides list to become individual objects
Zoffix Makes me wonder how many sploits we got in Perl 6 :)
timotimo Zoffix: use NativeCall, write to arbitrary memory locations, success 10:48
Zoffix That's cheating :) 10:49
timotimo well, all arrays that you can use from perl6 are auto-growing. access outside of their bounds will just cause an array to expand 10:50
Zoffix m: [][^1e100] 10:51
timotimo rakudo startup and compilation is incredibly slow, so we can't just fuzz our way towards exploitable bugs
camelia rakudo-moar 01321c: OUTPUT«(timeout)»
Zoffix m: [][^1e10]
camelia rakudo-moar 01321c: OUTPUT«(timeout)» 10:52
Zoffix m: [][1e10] 10:59
camelia ( no output )
Zoffix m: [][1e100]
camelia rakudo-moar 01321c: OUTPUT«Cannot unbox 333 bit wide bigint into native integer␤ in block <unit> at <tmp> line 1␤␤»
Zoffix m: [][1e9999] 11:00
camelia rakudo-moar 01321c: OUTPUT«Cannot coerce Inf to an Int␤ in block <unit> at <tmp> line 1␤␤Actually thrown at:␤ in block <unit> at <tmp> line 1␤␤»
lizmat www.reddit.com/r/perl6/comments/55...h=85e69823 # might need some attention? 11:14
nine I guess Inline::Perl5 would be a nice example of a NativeCall user that compiles its own library 11:17
lizmat nine: yup
brrt that is a pretty good question actually
lizmat nine: is now a good time to look at CompUnit/Repository/FileSystem, or should I wait until you have merged the lexical_module_load branch 12:00
nine The branch doesn't touch the repository implementations at all 12:03
lizmat nine: how about the following idea 12:08
I think part of the delay we have in dirs with many files,
is that we create a SHA of all files, even if we could already know it's not going to match as soon as one file has changed 12:09
timotimo hm, but most of the time we're going to start up with everything the same, are we not?
lizmat nine: is that train of thought correct ?
timotimo: yes 12:10
timotimo hm. actually. if you use -I, maybe you are doing so because you're developing that stuff, so we should expect at least one file to change every single time we start up?
lizmat yup, that also :-)
hackedNODE What's using the .DUMP method, or is it meant to be called directly? 12:11
m: say %(:42a, :43b).DUMP
camelia rakudo-moar 01321c: OUTPUT«Hash<1>(␤ :$!descriptor(Perl6::Metamodel::ContainerDescriptor<2>(...)),␤ :$!storage(BOOTHash<3>(...))␤)␤»
hackedNODE m: 42.DUMP.say 12:13
camelia rakudo-moar 01321c: OUTPUT«42␤»
timotimo it's for developers and such
hackedNODE OK
[Coke] cfp.perladvent.org/ - I think this might be a good idea for us to do over at sixadvent this year, too. 12:14
lizmat [Coke]: yup, and I can put it in next P6W :-) 12:15
dalek ast: bbb5190 | (Zoffix Znet)++ | S32-hash/delete.t:
[coverage] :delete on Hash:U returns Nil
12:25
hackedNODE m: [1,2].DUMP 12:26
camelia rakudo-moar 01321c: OUTPUT«MVMArray: Can't shift from an empty array␤ in block <unit> at <tmp> line 1␤␤»
lizmat fwiw, I've never used .DUMP and I wonder whether it's still useful 12:33
feels to me it's something from the parrot days
hackedNODE m: use Test; my class Foo {}; my $o = Foo.new; my %h{Any} = :42a, :72b, Foo => $o; is-deeply %h.antipairs, (42 => "a", 72 => "b", $o => Foo); 12:36
camelia rakudo-moar 01321c: OUTPUT«not ok 1 - ␤␤# Failed test at <tmp> line 1␤# expected: $(42 => "a", 72 => "b", (Foo.new) => Foo)␤# got: Seq.new-consumed()␤»
hackedNODE m: use Test; my class Foo {}; my $o = Foo.new; my %h{Any} = :42a, :72b; is-deeply %h.antipairs, (42 => "a", 72 => "b");
camelia rakudo-moar 01321c: OUTPUT«ok 1 - ␤»
nine timotimo: I rarely change my source files during a run of my 34 test files. 12:37
hackedNODE Any idea why using the object makes it return that stuff? The code is this: perl6.wtf/src_core_Hash.pm.coverage.html#L690
nine lizmat: the question we need to answer is: "is the repo the same as when we precompiled this file?". My solution for that is to store the repo's id with the precomp file, and then compare this against a freshly calculated id. 12:38
lizmat nine: yes, got that
nine lizmat: I cannot answer "has the repo changed?" when looking only at a part of the repo.
lizmat I just found out that apparently we're also recursing into .git and other dot directories 12:40
that's not needed as we cannot have a dot as part of a module name
nine Oh, that's certainly a worthy optimization. 12:41
lizmat anyways, now testing a patch that will bring down load time of a Foo.pm in my . dir down from 5 seconds to 3.8 12:42
nine niiice
gfldex none('.')++
lizmat before: $ 6 'use lib "."; use Foo' real0m5.068s 12:43
gfldex i was temped to have .git as a default exclude in File::Find :)
lizmat after: $ 6 'use lib "."; use Foo' real0m3.838s
bbi15 12:44
hackedNODE m: use Test; is-deeply % .antipairs, [42] 13:13
camelia rakudo-moar 01321c: OUTPUT«not ok 1 - ␤␤# Failed test at <tmp> line 1␤# expected: $[42]␤# got: Seq.new-consumed()␤»
hackedNODE m: my $s = % .antipairs; $ = $s eqv [42]; say $s.perl 13:15
camelia rakudo-moar 01321c: OUTPUT«Seq.new-consumed()␤»
hackedNODE I see. It's just an artefact of the test used, rather than an issue in the code.
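The underlying behaviour in isolation (a self-contained illustration, not the Test.pm code): a Seq can only be iterated once, so it has to be reified or cached before being both compared and later printed:

    my $s = (1, 2, 3).map(* + 1);   # a one-shot Seq
    my @cached = $s;                # reify it (or call $s.cache)
    say @cached eqv [2, 3, 4];      # True
    say @cached.perl;               # still printable; no "Seq.new-consumed()"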
This is amusing. S24-testing/1-basic.t has a comment saying "This file /exhaustively/ tests the Test module [...] because we are using this module to test Perl 6 itself, so I want to be sure that the error is not coming from within this module." 13:32
But the plot twist is that the file apparently doesn't get run as part of the test suite, because it has a ton of code that fails :) 13:33
m: use Test; is(2 + 2, 5, :todo<feature>, :desc('2 and 2 doesnt make 5'));
camelia rakudo-moar 01321c: OUTPUT«Unexpected named argument 'todo' passed␤ in sub is at /home/camelia/rakudo-m-inst-2/share/perl6/sources/C712FE6969F786C9380D643DF17E85D06868219E (Test) line 128␤ in block <unit> at <tmp> line 1␤␤»
hackedNODE heh
dalek kudo/nom: a5073f9 | (Zoffix Znet)++ | / (2 files):
Add Seq candidates to is-deeply to .cache them

This avoids the failing tests claiming that `Seq.new-consumed()` was either `expected` or `got`. The issue was due to consuming the Seq during
  `eqv` and then attempting to consume it again, when printing it with `.perl`
13:50
ast: 0ad2799 | (Zoffix Znet)++ | S24-testing/9-is_deeply.t:
is-deeply with Seqs does not claim `Seq.new-consumed` expected/got
13:51
kudo/nom: b12297e | (Zoffix Znet)++ | t/spectest.data:
Add S24-testing/9-is_deeply.t to list of test files to run
13:54
hackedNODE m: class Foo {}; my $o = Foo; my %h{Any} = :42a, :72b, Foo => Foo; dd %h.antipairs 14:03
camelia rakudo-moar 01321c: OUTPUT«(42 => "a", (Foo) => "Foo", 72 => "b").Seq␤»
hackedNODE hm 14:04
nm, the stringification was due to the pair
lizmat commute& 14:06
dalek ast: b667c92 | (Zoffix Znet)++ | S09-typed-arrays/hashes.t:
[coverage] Cover .antipairs, .invert on typed hashes
hackedNODE t/04-nativecall/06-struct.t consistently dies on my linode box 14:12
s/dies/fails/
gist.github.com/zoffixznet/601dec2...69a41eaab9 14:13
ggoebel hacktoberfest inspired me... looking at github.com/perl6/doc/issues/926 14:16
hackedNODE sweet 14:17
ggoebel produce was introduced by timtowdi with commit: 7b54492b9ab887d6bc7d7e4e5792814ef77fdd21
hackedNODE s: [], 'produce'
SourceBaby hackedNODE, Sauce is at github.com/rakudo/rakudo/blob/0132...s.pm#L1520
ggoebel it was first discussed in irc irclog.perlgeek.de/perl6/2015-10-09#i_11348271 14:18
hackedNODE s: &find-reducer-for-op
SourceBaby hackedNODE, Sauce is at github.com/rakudo/rakudo/blob/0132...st.pm#L136
ggoebel and relates to a commit by pmichaud back in 6/30/2011 df9d49b9525930bab7797b6c5c5b1f8ffe9e5f2e 14:19
where he refers to "Add reduction metaops for triangle and right associative operators."
where does the reference to "triangle" come from? I.e. help me cure my ignorance... what does it mean in this context? 14:20
regarding produce... instead of reducing a list of values using an operator... it provides the "running" list of results 14:22
hackedNODE m: say [+] ^10 14:23
camelia rakudo-moar b12297: OUTPUT«45␤»
hackedNODE m: say [\+] ^10
camelia rakudo-moar b12297: OUTPUT«(0 1 3 6 10 15 21 28 36 45)␤»
ggoebel perl6: say reduce &[+], (1,2,3,4,5); 14:24
hackedNODE I think the second one is a triangle
camelia rakudo-jvm 2a1605, rakudo-moar b12297: OUTPUT«15␤»
ggoebel m: say reduce &[+], (1,2,3,4,5);
camelia rakudo-moar b12297: OUTPUT«15␤»
ggoebel m: say produce &[+], (1,2,3,4,5);
camelia rakudo-moar b12297: OUTPUT«(1 3 6 10 15)␤»
hackedNODE m: say [+] (1,2,3,4,5)
camelia rakudo-moar b12297: OUTPUT«15␤»
hackedNODE m: say [\+] (1,2,3,4,5)
camelia rakudo-moar b12297: OUTPUT«(1 3 6 10 15)␤»
ggoebel hackedNODE: i didn't realize you could do [\+] 14:25
hackedNODE So produce seems to be a reduce that produces values for each operation
ggoebel off to update the docs... If anyone knows where the term 'triangle' originates in this context please enlighten me :-)
see the irc link above for the beginning of a bike-shedding conversation about why it was called produce 14:26
timotimo i think the triangle comes from what the operations look like, as well as the [\ 14:27
i.e. for addition you get something like
1
1 + 2
1 + 2 + 3
1 + 2 + 3 + 4
ggoebel so just a visual clue... no underlying reference to functional programming, etc.? 14:28
timotimo i don't do enough functional programming to know :\
[Coke] design.perl6.org/S03.html 14:32
reinforces the visual nature both of the operator and timotimo++'s math drawing. 14:33
ggoebel Coke: thx!
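Roughly, the triangle form keeps every intermediate result instead of only the final one; a hand-rolled equivalent (illustrative only) would be:

    my @partial;
    my $acc = 0;
    for 1..5 -> $n {
        $acc += $n;           # 1, 3, 6, 10, 15 -- the growing "triangle"
        @partial.push: $acc;
    }
    say @partial;             # [1 3 6 10 15]
    say [\+] 1..5;            # (1 3 6 10 15), same values via the metaop
    say produce &[+], 1..5;   # ditto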
dalek kudo/nom: 2a2f26c | lizmat++ | src/core/CompUnit/Repository/FileSystem.pm:
Make CURF.id about 2.5x faster

  - don't recurse into any directory that starts with '.'
   Instead of just ".", ".." and ".precomp". Technically, this should
   probably be replaced with a check for a valid identifier (at least
   when recursing). This change makes sure we skip any .git dir.
  - check extension with native hash instead of with a Junction
  - check extension *before* checking existence on OS
   Checking the OS is much more expensive, so only check for files
   that we think we need.
  - rewrite looping logic using a direct iterator and nqp ops
   Instead of [~] map grep combo
e5d9012 | lizmat++ | src/core/Rakudo/Internals.pm: Simplify DIR-RECURSE a bit
Now only returns paths that actually exist
hackedNODE \o/ 14:35
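An illustrative sketch of the extension-check part of that commit (not the actual CURF code; names are made up): do the cheap hash lookup before the expensive filesystem call, instead of matching against a Junction:

    my %wanted-ext = :pm(1), :pm6(1);

    sub wanted(IO::Path $file) {
        # cheap: plain hash lookup instead of $ext eq any(<pm pm6>)
        return False unless %wanted-ext{$file.extension};
        # expensive: only now ask the OS whether the file exists
        $file.e
    }

    say wanted('lib/Foo.pm6'.IO);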
lucasb_ nice explanation/mnemonic of "triangle" :) 14:41
I see produce as a sister of reduce. never saw this 'produce' operation in any other language. if anybody knows other occurrences of this, I would be very interested to hear 14:42
nine lizmat: we're now down to 46s from 58s for a test in a checkout of perl6/mu 14:49
lizmat ok, that's something :-) 14:50
nine I wonder how on earth we can get down to the 0.32 seconds of a find . -type f | xargs sha1sum | sha1sum 14:52
lizmat now, my next idea is to not sha1 the contents of files, but the name ~ last modified
nine Err...should of course be find . -type f | xargs cat | time sha1sum 14:53
lizmat *if* the last modified is within the same second of now() , then use now() instead of last modified
nine Or actually not
lizmat now() being millisecond resolution
so, if a file was changed in the current second (well, mod 2), then it will generate a sha that's always different 14:54
nine Soo....that's like rounding to worst case? A very interesting idea :)
lizmat so, going for that now
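A loose sketch of that idea (not the actual CURF.id code; the helper name is made up): hash path names plus modification times instead of file contents, and if a file was modified within the current second, substitute now() so the id keeps changing until the file has settled:

    use nqp;

    sub repo-id(*@files) {
        my $now = now;
        nqp::sha1(
            @files.map(-> $path {
                my $mtime = $path.IO.modified;
                # touched this very second? use now() so the id stays "dirty"
                $mtime.Int == $now.Int ?? "$path|$now" !! "$path|$mtime"
            }).join("\n")
        )
    }

    say repo-id($*PROGRAM.Str);   # e.g. hash over this script's own path + mtime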
travis-ci Rakudo build failed. Zoffix Znet 'Add Seq candidates to is-deeply to .cache them 14:56
travis-ci.org/rakudo/rakudo/builds/164949669 github.com/rakudo/rakudo/compare/0...073f9615c1
buggable [travis build above] ☠ Did not recognize some failures. Check results manually
hackedNODE t/04-nativecall/05-arrays.t though doesn't fail on my local box 14:58
nine lizmat: the profile for id()ing the mu repo looks...disturbing. 2 million (!!) blocks cloned. Half a million iterations of the loop in dir(). A quarter of a million FILETEST-D calls. 15:07
That's for a directory containing just 4630 files.
How on earth can that warrant 266229 calls to FILETEST-D??? 15:08
lizmat yeah, DIR-RECURSE is later on my list of things to look at
hmmm... but 2M seems wrong to me as well
nine 51K files opened
The id method itself (including the nqp::sha1 call) is only 7.85 % of the runtime. 15:09
That suggests that getting rid of sha1ing the contents will not actually buy us all that much. 15:10
lizmat hmmm.... 15:12
indeed... using path + modified only moves from 2.0 -> 1.9 seconds 15:14
lizmat starts to remember what she did for the newio branch 15:22
timotimo and this is why i want the profiler to also give us a list of "what frames called this routine how often" for any given routine in the routines list :)
nine timotimo: oh, yes, pretty please :) 15:25
lizmat nine: wrt to DIR-RECURSE: is there a reason you want to include dirs in there as well ? 15:44
not just the files in the dirs, but the dirs itself as well? 15:45
nine lizmat: my version was this: github.com/rakudo/rakudo/blob/7d80...tem.pm#L61 15:55
lizmat: ugexe++ implemented DIR-RECURSE. Any changes in semantics are unintentional
lizmat ok 15:56
TimToady I have no recollection of adding a .Numeric to eval 16:00
yoleaux2 3 Oct 2016 21:51Z <Zoffix> TimToady: do you recall why you added a .Numeric to eval exception? In its default incantation it seems uncallable due to the payload being a string: github.com/rakudo/rakudo/blob/0132...on.pm#L111
hackedNODE It was 5 days pre-Christmas... maybe just copy-pasta? 16:01
TimToady likely
probably from something trying to support 'errno' or so
nine I don't get why but a simple perl6 --profile -e 'Rakudo::Internals.DIR-RECURSE(".")' gives me 4 "take"s per file 16:15
dir() itself does 2 takes per file 16:18
timotimo why :o 16:19
is autothreading f-ing it up?
no, that'd be extra weird
brrt lizmat: what did you do for the newio branch 16:22
nine OMG! 16:25
Do you want to know why -Iing the mu repo takes so much longer (almost a minute) than other examples?
/home/nine/install/mu/misc/kp6_misc/kp6_ast/root_pugs/misc/kp6_misc/kp6_ast/root_pugs/misc/kp6_misc/kp6_ast/root_pugs/misc/kp6_misc/kp6_ast/root_pugs/misc/kp6_misc/kp6_ast/root_pugs/misc/kp6_misc/kp6_ast/root_pugs/misc/kp6_misc/kp6_ast/root_pugs/misc/kp6_misc/kp6_ast/root_pugs/misc/kp6_misc/kp6_ast/root_pugs/misc/kp6_misc/kp6_ast/root_pugs/misc/kp6_misc/kp6_ast/root_pugs/misc/kp6_misc/kp6_ast/root_pugs/misc/kp6_misc/kp6_ast/root_pugs/misc/kp6_misc/kp6_ast/roo 16:26
timotimo m(
nine We follow symlinks
TimToady um...
gfldex yes, I miss loop detection too :) 16:27
nine Lesson of the day: recursing directories is much harder than you'd think
hackedNODE :) 16:31
gfldex what is the reason for the @paths.append/.pop business instead of using real recursion?
timotimo call frames can be expensive 16:32
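For comparison, the recursive formulation gfldex is asking about would look roughly like this (illustrative only); the @paths worklist in the real code trades the nested call frames for a flat loop:

    sub walk(IO::Path $dir) {
        gather for $dir.dir -> $entry {
            if $entry.d {
                take $_ for walk($entry);   # one call frame per directory level
            }
            else {
                take $entry;
            }
        }
    }

    .say for walk('.'.IO);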
nine Removing the symlinks takes it down to 1.0s 16:34
hackedNODE hah
timotimo from over a minute, eh?
nine yes
timotimo nice
nine It just took a minute for the paths to get long enough so it would throw exceptions that we dutifully ignore 16:35
timotimo hahaha
nine And we ignore them because files may just vanish between the listing and the opening.
kudo/nom: 392d210 | lizmat++ | src/core/CompUnit/Repository/FileSystem.pm:
Don't need to check for existence anymore
lizmat takes it down from 2 to 1.9 for me 16:37
gfldex m: my $start = now; while my @paths.pop { last if $++ > 10000; @paths.append("foo") }; say now - $start; $start = now; my $c; { &?BLOCK if $c++ <= 10000 }; say now - $start;
camelia rakudo-moar 2a2f26: OUTPUT«0.0029239␤0.0004262␤»
gfldex m: my $start = now; while my @paths.pop { last if $++ > 10000; @paths.append("foo") }; say now - $start; $start = now; my $c; { &?BLOCK if $c++ <= 10000 }; say now - $start;
camelia rakudo-moar 2a2f26: OUTPUT«0.0030003␤0.00034977␤»
nine gfldex: I think it was really just a stylistic decision 16:38
gfldex it is a wee bit faster (for now)
nine I still think dir() could be faster than it is 16:41
timotimo probably
hackedNODE s: &dir
SourceBaby hackedNODE, Sauce is at github.com/rakudo/rakudo/blob/2a2f...ors.pm#L93
nine Oh. Now I remember why I keep procrastinating by speeding up Inline::Perl5 instead of finishing the new v6::inline implementation. I got stuck trying to find a reference leak... 16:43
lizmat dalek ? 17:09
dalek kudo/nom: 72e4f83 | lizmat++ | src/core/Rakudo/Internals.pm:
Add :file test parameter to DIR-RECURSE

So we don't need to take stuff we don't need
kudo/nom: f8da735 | lizmat++ | src/core/CompUnit/Repository/FileSystem.pm:
Simplify CURF.id now that we can specify :file
lizmat nine: takes it down to 1.8 from 1.9 for me
hankache hiya lizmat
lizmat hankache o/
nine lizmat: certainly the right direction :) 17:10
lizmat: any idea how to get a loop detection into that code?
lizmat nine: ??
nine will lose connectivity any minute now for an hour or so
lizmat ok, cu later!
nine lizmat: see backlog 17:11
lizmat: oh, you dropped out before I posted my findings
lizmat ah, checking irclog now 17:12
gfldex nine: you need the inode and a %seen-Hash and a test for symlink. No idea how that works on Windows with its junctions/reparse-point madness 17:13
also, you may cross device boundaries and then you need to keep track of the devid as well 17:14
filesystems are hard
dalek kudo/nom: 1e7f69e | (Dominique Dumont)++ | docs/running.pod:
Remove pod instructions specific to Perl6 pod

With PR #751, the file docs/running.pod became a Perl6 pod document (because of =begin pod).
This breaks man page generation on Debian build (see
  bugs.debian.org/cgi-bin/bugreport....bug=839059 ).
According to nine, this switch is not intentional.
17:15
kudo/nom: c6c0e69 | lizmat++ | docs/running.pod:
Merge pull request #895 from dod38fr/nom

d8309d0 | lizmat++ | src/core/Str.pm:
  en.wikipedia.org/wiki/Life,_the_Un...Everything
lucasb_ idk what this discussion is about, but... what about accepting a parameter :follow-symlinks (Bool) ? 17:30
this way, you can postpone the fixing of this issue later :) 17:31
hackedNODE Why postpone? We aren't in a hurry.
lucasb_ and for now, just say :!follow-symlinks
hackedNODE: ah, right :)
travis-ci Rakudo build errored. Elizabeth Mattijsen 'Don't need to check for existence anymore' 17:53
travis-ci.org/rakudo/rakudo/builds/165000650 github.com/rakudo/rakudo/compare/2...2d210fee82
buggable [travis build above] ✓ All failures are due to timeout (1), missing build log (0), or GitHub connectivity (0)
nine gfldex: we could just ignore the symlink business on Windows until a Windows user actually discovers that Windows supports them, and push DIR-RECURSE into IO::Spec so we can handle symlinks in IO::Spec::Unix 18:15
At least we should be able to get at the relevant information: nqp::const::STAT_PLATFORM_DEV and nqp::const::STAT_PLATFORM_INODE 18:22
gfldex IO::Spec::Unix would be a nice place for stat and readlink 18:24
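A rough sketch of the loop detection being circled around here, using the nqp stat constants nine mentioned (illustrative and untested, not the real DIR-RECURSE):

    use nqp;

    sub dir-recurse(Str $root) {
        my %seen;
        gather {
            my @todo = $root;
            while @todo {
                my $dir = @todo.shift;
                # identify each directory by device:inode so symlink
                # cycles are visited at most once
                my $id = nqp::stat($dir, nqp::const::STAT_PLATFORM_DEV)
                       ~ ':'
                       ~ nqp::stat($dir, nqp::const::STAT_PLATFORM_INODE);
                next if %seen{$id}++;
                for dir($dir) -> $entry {
                    $entry.d ?? @todo.push($entry.Str) !! take $entry.Str;
                }
            }
        }
    }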
AlexDaniel .
hackedNODE :
Why does everyone use dots? :)
gfldex you could use ✉ 18:25
lucasb_ 'use dots' .... this made me remind of this feature proposal for P5 back in 2013 18:26
unfortunately it wasn't accepted by the then pumpking :) 18:27
nine It seems like our whole stack is missing a way to actually get the complete stat() information. For every single test you have to call stat() again even though the stat buf contains all the information for all tests. 18:29
So we're gonna do FILETEST-E, FILETEST-D and FILETEST-L and that runs 3 stat() calls when one would completely suffice. 18:30
perlpilot notes that P5 had this "optimization" early on. (caching stat info) 18:31
(so early it was in P4 and probably P3 :-) 18:32
nine Probably because back then it took about a second to do a stat()...
awwaiid so we still have until P8 to add it to P6?
masak awwaiid: I like the way you're thinking 18:44
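A loose sketch of the caching nine and perlpilot are describing (hypothetical helper, not the actual FILETEST-* code): gather the individual test results once per path and answer later queries from a cache, rather than issuing a fresh stat() for every single test:

    use nqp;

    my %stat-cache;

    sub file-tests(Str $path) {
        %stat-cache{$path} //= do {
            my int $e = nqp::stat($path, nqp::const::STAT_EXISTS);
            %(
                :e(?$e),
                :d($e ?? ?nqp::stat($path,  nqp::const::STAT_ISDIR) !! False),
                :l($e ?? ?nqp::lstat($path, nqp::const::STAT_ISLNK) !! False),
            )
        }
    }

    say file-tests('.')<e d l>;   # later calls for '.' hit the cache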
cygx o/ 18:52
so, any further comments on github.com/rakudo/rakudo/pull/894 ? 18:53
like, bikeshedding the name of X::IO::DoesExist...
hackedNODE would go for X::IO::Exists 18:55
cygx I just went the obvious way from X::IO::DoesNotExist 18:56
18:57
mst X::IO::IsADirectory or X::IO::AlreadyExists ?
hackedNODE I'd pick X::IO::DoesSoExist over X::IO::DoesExist... because originally I thought it read DoesNotExist, then I thought "is that a typo"...
cygx X::IO::IsADirectory is already known as X::IO::Directory 18:58
it gets thrown when you pass :exclusive to open, which fails on existing files
cygx is afk for a bit 18:59
lucasb_ oh, that's the Oxford comma? :) 19:25
dogbert17 hackedNODE: gist.github.com/dogbert17/89a6b2e5...6df6f8f300 19:28
hackedNODE That broke my tweet I posted a few months ago! I thought Perl 6 was meant to be backwards compatible! :)
xkcd.com/1172/ 19:29
dogbert17++ added thanks: rt.perl.org/Ticket/Display.html?id=128655 19:30
dogbert17 thank you for figuring out the problem
hackedNODE It was blind luck
perlpilot dogbert17++ and hackedNODE++ for sure 19:31
japhb Zoffix, nine: Sorry for the late night confusion about Inline::Perl5 ownership. 19:56
nine: Gah, damn Heisenbugs. SO SICK OF THOSE.
timotimo quite.
japhb hackedNODE, lizmat: DUMP's use case is when you need to understand things like containership and whether your data structure is a tree, a DAG, or a general graph. It's for understanding the bits that .perl and dd can't tell you (but as I recall predates dd) 19:57
timotimo predates dd by years 19:59
japhb History: Someone (pmichaud?) created a very early version of DUMP that couldn't handle non-tree structures. I extended it to handle more general structures, and tightened up the display of containerness and laziness. Sometime after I got DUMP working decently well, someone else did some further tweaks, and then it bitrotted somewhat as the active contributors were more often using dd by that point (which is much more terse when you don't need the full detail) 20:00
I'm guessing the real breaking point is that it probably wasn't fully updated for post-GLR and pre-Christmas changes. 20:01
lizmat decommute& 21:09