»ö« Welcome to Perl 6! | perl6.org/ | evalbot usage: 'p6: say 3;' or rakudo:, or /msg camelia p6: ... | irclog: irc.perl6.org or colabti.org/irclogger/irclogger_logs/perl6 | UTF-8 is our friend!
Set by moritz on 22 December 2015.
lookatme morning .o/ 00:34
Herby_ \o 00:42
samcv how do i make sure that i can use one library in another. they are in the same path as each other 01:32
but it seems i cannot refer to the same directory they are in?
or i guess the solution is to call them from another script instead of running the .pm6 file itself 01:33
lookatme do u mean `use lib <PATH>` ?
samcv yeah if i run some .pm6 file from an arbitrary directory. it can't find library files in the same folder as it
but if i use the .pm6 file in some perl 6 file i can run the .pl6 file fine 01:34
myscript.p6, ./folder/lib/module1.pm6, ./folder/lib/module2.pm6. i can't execute module1.pm6 from ./ 01:35
it doesn't find the other module located in the same folder. but if i include module1 in myscript.p6, then it is fine
lookatme Does the .p6 file contain a shebang `#!...`?
include module ? 01:36
samcv use lib 'folder/lib'; use module1;
lookatme oh. 01:37
samcv so i guess the module1 is using the .p6 script's lib declaration or whatever
just seemed interesting i can't execute the .pm6 script itself and it doesn't recognize the module is in the same folder
it only looks at system/user installed perl 6 modules
lookatme How you execute module1.pm6 ? 01:39
samcv just running it with perl6
lookatme I think they can't find other modules unless you include the lib path `./folder/lib` 01:40
Just try `perl6 -I... `
ugexe require $?FILE.parent.child("foo.pm6"); ? 01:44
ugexe otherwise remember lexical module loading 01:48
`run $*EXECUTABLE, "-I" <<~<< $*REPO.repo-chain.map(*.path-spec)` this is how the problem is solved when spawning procs
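An expanded form of ugexe's one-liner above may make the idea clearer: the parent's full repository chain is handed down to a spawned perl6 so the child resolves the same modules. This is a sketch based on the line above; `child-script.p6` is a hypothetical placeholder, not a file from the discussion.

```raku
# Build one "-I<path-spec>" argument per repository in the current
# chain, then spawn a child perl6 with the same module search path.
my @includes = "-I" <<~<< $*REPO.repo-chain.map(*.path-spec);
run $*EXECUTABLE, @includes, 'child-script.p6';
```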
samcv heh i remember i did some shady IO stuff somewhere.. i finally am working back on certain code and found it
ugexe otherwise declare them in META6.json and -I. 01:49
samcv now to get this working again
hmm very interesting ugexe
ugexe -I. is different from -Ilib (so `use lib <.>` and `use lib <lib>`) in that it uses the provides of META6.json to map namespaces to paths 01:51
samcv very cool 01:54
ugexe my @parts = $?FILE.parts; repeat { } while pop(@parts) ne 'lib'; require @parts.join('/'); # something dirty like this probably works for finding a modules own lib/ to use 01:55
use lib @parts.join('/') rather 01:56
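A cleaned-up sketch of the $?FILE idea being discussed: a .pm6 file can add its own directory to the library search path at compile time, so running it directly with perl6 can resolve sibling modules without an explicit `-I`. The names `module2` and `folder/lib` come from samcv's example earlier; whether this suits a given setup is an assumption.

```raku
# $?FILE is a Str, so convert to IO::Path first.
use lib $?FILE.IO.parent.Str;   # this file's own directory
use module2;                    # a sibling .pm6 in the same folder
```

The command-line equivalent would be something like `perl6 -Ifolder/lib folder/lib/module1.pm6`.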
samcv yay my changes to markdown renderer got merged in :) 03:50
samcv i need to fix escaping of backticks so you can use backticks in codeblocks 04:33
samcv would be nice if i can get commit access to that repo :3 heh. since i've been making so many changes. maybe should ask for it 05:26
samcv wow finally this gets merged into vim syntax highlighting! after I opened it in October! github.com/vim-perl/vim-perl6/pull...-303756374 05:30
Geth doc: antquinonez++ created pull request #1333:
Remove dupe consecutive words
05:33
Geth doc: eb8daf9160 | (Antonio Quinonez)++ | 6 files
Remove dupe consecutive words
05:34
doc: 5f7a089168 | (Samantha McVey)++ (committed using GitHub Web editor) | 6 files
Merge pull request #1333 from antquinonez/dupes

Remove dupe consecutive words
ugexe ugexe.com/perl-toolchain-summit-201...and-perl6/ 05:48
finanalyst feedback on perl6-users told me to talk here. I'm interested to know why Task::Star was thrown out of the Ecosystem without much notice 08:06
I relied on it to get a minimum set of modules
I've been tracking module popularity (there's a link on the module page), and the set of recursively cited modules is fairly stable. 08:07
lizmat finanalyst: I'm afraid we got a little bit too enthusiastic at the PTS :-( 08:08
finanalyst So how about a Task::Popular that basically takes the most popular modules (that pass all tests)
lizmat sounds like a good idea :-) 08:09
finanalyst I can look at all the data and see if there is significant break between the most popular and the less popular
then compile a Task module from the results. Update say once a month 08:10
If a module starts breaking on Travis, then it gets eliminated from the list
lizmat ++finanalyst :-) 08:17
nine I'm not sure that would actually be a useful collection. E.g. BioPerl6 is one of the dists with the most stars, but it has a very clearly separated audience.
I guess more useful would be several meta dists based on certain areas, e.g. a web focused dist, a math dist, whatever. 08:18
finanalyst nine: stars has nothing to do with my proposal! Take a look at finanalyst.github.io/ModuleCitation/
You will see that the modules with the highest recursive citations are really useful 08:19
basically a citation is a module that appears in a depends string in another module's meta.json 08:20
nine So those are our upriver modules. Certainly the ones that we should test the most, yes.
finanalyst by the way, I'm revising the page a bit. Rather than 'error' it should be non-ecosystem module. 08:22
vimal2012 Why this error message 08:29
p6: say split (';' , 'a;b;c')
camelia Too few positionals passed; expected at least 2 arguments but got only 1
in block <unit> at <tmp> line 1
vimal2012 I passed 2 arguments
This example is given in docs.perl6.org/routine/split 08:32
geekosaur because you put a space between 'split' and '(' 08:33
you therefore passed it *one* argument, which is a parenthesized list
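In other words, the whitespace turns the parenthesized pair into a single List argument, so split sees only one positional. A minimal fix, either dropping the space or using the method form:

```raku
say split(';', 'a;b;c');   # (a b c) - no space before the paren
say 'a;b;c'.split(';');    # (a b c) - method form sidesteps the issue
```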
eater hm, I'm gonna do a talk about Perl6, are there any topics I should definitely cover? 09:16
tyil eater: where? 09:34
pls do domcode
also, certain topics that strike me as a perl noob would be the perl 6 regexes, the Inline:: modules, NativeCall stuff, and that p6 does both OO and functional styles very well 09:36
mienaikage Huh. I had a lexically scoped variable in a postfix for, wasn't expecting it to still be in scope outside of the loop! 09:44
nine mienaikage: lexicals are scoped to the block. 09:45
mienaikage Yeah I know, it just caught me off guard as I don't recall the same happening in 5 09:52
nine mienaikage: 6 is much more consistent in that regard. 09:53
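A minimal illustration of what mienaikage ran into: a `my` declaration in a statement-modifier loop belongs to the enclosing block, not the loop, so the variable survives past the statement. The variable name is illustrative.

```raku
# 'my' scopes $last to the surrounding block, not to the postfix for
my $last = $_ for 1..3;
say $last;   # 3 - still visible after the loop
```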
Zoffix finanalyst: it was removed due to being out of date and many people who don't use Rakudo Star thought it was a mandatory module to install, despite it not having a useful collection of modules. 10:02
nebuchadnezzar Hello, is it normal to get some .repo-id under ~/perl6/precomp/ even if the module is already precompiled? paste.debian.net/940184/ 10:04
Zoffix finanalyst: Task::Popular sounds entirely useless to me. The modules with the most citations are the least wanted ones to be stuffed into some distro, since by definition you're most likely to get them as a prereq to some other module.
nine nebuchadnezzar: yes, that cannot be avoided. 10:05
stmuk Zoffix: it's useful if only for having a collection to smoke
nine nebuchadnezzar: it's pretty much just a marker that "I know, the repo-chain changed (~/.perl6 was added since precompilation) but the precomp files are still up to date"
stmuk I thought it sounded like a great idea
nebuchadnezzar nine: thanks for the explanation 10:06
Zoffix finanalyst: looking at your list, there are 7 different JSON modules and a bunch of HTTP modules. Just because something gets referenced a lot doesn't mean all the users are dying to get it.
nebuchadnezzar nine: I found some references to this in github.com/ugexe/zef/issues/117 but can not understand everything 10:07
Zoffix stmuk: why not just have a collection to smoke instead of releasing "Task::Popular" 10:08
nine nebuchadnezzar: are there still issues?
finanalyst Zoffix: For someone new to Perl6, some help is needed 10:09
stmuk Zoffix: I think using a Task:: to define a collection is sane and I don't particularly care what it's called
finanalyst Trashing Task::Star was reasonable for gurus, but not for occasional users
nine stmuk: I think Zoffix doesn't object to the name but to the contents (same as me)
stmuk I'd say the more collections being regularly tested the better 10:10
the better the coverage
finanalyst A statistical approach is a sane method. What is statistically significant says something about the Ecosystem 10:10
nebuchadnezzar nine: with the precomp stuff, no, I just have the .repo-id under ~/.perl6 10:11
nine finanalyst: Zoffix' argument carries some weight. Task::Popular would not contain the modules most likely used for some work, but the modules used by the modules that do the user's work. And as such, they would get installed anyway as dependencies of the actually most useful modules.
finanalyst I think the prevalence of JSON::Tiny for example, instead of JSON::Fast is an artifact of the newness of the Ecosystem
nine: I understand and accept Zoffix's point
nebuchadnezzar nine: for the packaging of libraries, I tried to summarize the options and it's time to think about which one we want for Debian lists.alioth.debian.org/pipermail/...01138.html 10:12
finanalyst nine: problem with a 'useful' by someone's definition is that it is a 'curated' distribution. That means someone needs to curate it 10:13
then you get old modules
nine nebuchadnezzar: "seems to make it difficult to provide packages for multiple versions of perl6 compilers": I don't see why?
nebuchadnezzar: "optimisation: pre-compilation can not take advantages of CPU features since they are not known." but MoarVM code is not dependent on CPU features anyway? 10:14
jnthn I was just going to comment on that one
stmuk maybe Task::Popular::Empirically
jnthn Precompilation is to bytecode
nine And what does "bin-NMU" mean? 10:15
jnthn So any CPU-specific stuff will happen in the JITting of that bytecode
nebuchadnezzar nine: bin-NMU is a binary Non Maintainer Upload, i.e. we force a rebuild for binary reason (no source change)
jnthn: ok, so there is no drawback to using .moarvm files generated by an old perl6 toolchain with a new one? 10:16
jnthn nebuchadnezzar: You can't do that, though, because the precomps are linked against the earlier toolchain 10:18
nebuchadnezzar meaning: we don't need to ask for a complete rebuild of packages when we package new MoarVM/NQP/Rakudo?
jnthn My point was just that precomp doesn't do anything architecture specific
nebuchadnezzar jnthn: ok, so optimisation is not a point
jnthn Right. 10:19
nine nebuchadnezzar: if there's no precomp file for the exact running rakudo, it will precompile transparently (except for the time required) and store the generated files in ~/.perl6 (most of the time)
nebuchadnezzar: so multiple Perl 6 installations are not an issue for distributing precompiled files.
nebuchadnezzar nine: ok, I'm not sure about the Debian policy on that point, I think we may want to avoid this step at run time for users, some of them may not even have a $HOME 10:21
nine nebuchadnezzar: at some point we may add a fix for when there's just no place to write the precomp files to. Wouldn't be difficult. Just hasn't come up so far. 10:22
nebuchadnezzar: from my point of view, there's practically 0 reason to precomp on installation. It only has downsides. 10:23
nebuchadnezzar: of course, Gentoo users may disagree :) But even there it doesn't make that much sense.
kent\n the question would be if dynamic-compile needs to write to privileged paths or not 10:24
and the security implications about where it *does* write to
but yeah, if they all turn up in ~/.perl6, it's just ugly, not a technical problem 10:25
nine kent\n: what do you mean by "dynamic-compile"?
kent\n I mean, if you're avoiding pre-comp, but a real-compile-and-store happens, the latter is "on-demand", but they're otherwise equivalent conceptually 10:26
( I don't know exactly how rakudo works here, I'm just borrowing from my knowledge of python compilation ) 10:27
samcv m: my @array2 = [(1,2,3), (3,2,1)]; say @array2.all ~~ Cool
camelia True
samcv i don't want this to happen. i don't want it to descend all levels. i only want it to match against the highest level
is there a way I can do that?
I want it to ~~ Positional = True; and ~~ Cool = False
for an array of arrays
as long as all items in it are positionals 10:28
jnthn m: say List ~~ Cool
camelia True
jnthn I don't think it's descending
samcv oh.
ok then :P
nebuchadnezzar Thanks a lot for the feedback, I'll wait for dod and some other Debian folks for their take
nine nebuchadnezzar: happy to help :) 10:29
samcv will just make it more narrow!
kent\n the obvious downside of no-precomp I think will be if you have >1 user where every user who runs some perl6 app has to have compiled assets turn up in ~/.perl6 on-demand redundantly. ;) 10:31
kent\n I'd hope there's some sort of auto-magic so that if the source file gets updated in some way, the assets are auto-recompiled; if not, I'd have additional concerns. 10:31
samcv m: my @a = 10, 'string'; say @a.all ~~ any(Str, Int) 10:32
camelia False
samcv maybe can't use junctions on the right hand side with smart matching?
nine kent\n: of course there is :) That was almost all that made the implementation hard (it is cache-invalidation after all) 10:32
kent\n yeah ;) 10:33
nine is incredibly glad that he hasn't read "delete the .precomp directory" advice in a long time
kent\n One of our "fun" problems is the "oh, you have a X with a new ABI and all dependents need to recompile". Our package manager handles this now, but its a bit of effort. If we did pre-comp, we'd probably need to use that spice for that problem. 10:34
nine kent\n: the rule is simple. If you rebuild rakudo, you'll have to rebuild all Perl 6 packages. The Open Build Service rebuilds downstream always when a dependency gets rebuilt. 10:35
kent\n Yeah. So we'll probably do what we do with Perl ( because well, the libdir changes when you get a new version and @INC only sees the newer libdir and so you have to rebuild everything ) 10:36
samcv this is kind of crappy but this works:
m: sub thing (@body where { my $var = True; .map({ $var = False if $_ !~~ Str and $_ !~~ Int }); $var }) { }; say thing (1,2,3)
camelia Nil
samcv since i can't do .all ~~ any(Str, Int) :\ 10:37
kent\n nine: though I'd hope that "rebuild" is only needed when there was some relevant semantic difference, not simply "you ran the compile again but $(date) returned a different value from last time, so start over"
jnthn m: my @a = 10, 'string'; say all(@a >>~~>> any(Str, Int))
camelia all(any([False True], [True False]))
jnthn m: my @a = 10, 'string'; say so all(@a >>~~>> any(Str, Int))
camelia True
jnthn m: my @a = 10, 4.2; say so all(@a >>~~>> any(Str, Int))
camelia True
jnthn huh 10:38
m: my @a = 10, 4.2; say all(@a >>~~>> any(Str, Int))
camelia all(any([False False], [True False]))
jnthn oh...
Hmm
m: my @a = 10, 4.2; say so all(@a Z~~ any(Str, Int)) 10:39
camelia True
jnthn m: my @a = 10, 4.2; say all(@a Z~~ any(Str, Int))
camelia all(True)
jnthn oh, d'oh
m: my @a = 10, 4.2; say all(@a Z~~ any(Str, Int), *)
camelia all(True, False)
jnthn m: my @a = 10, 4.2; say so all(@a Z~~ any(Str, Int), *)
camelia False
jnthn m: my @a = 10, 'foo'; say so all(@a Z~~ any(Str, Int), *)
camelia True
jnthn samcv: Also an option ^^
nine kent\n: no, the rule is really simple. If you rebuild rakudo, you'll have to rebuild all Perl 6 modules. 10:40
samcv thanks jnthn 10:41
jnthn++
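For reference, a more explicit way to express samcv's constraint that avoids smartmatching a junction against a junction entirely: map each element to a plain Bool first, then collapse with `.all`. This is a sketch, not anything from the discussion; the sub and variable names are illustrative.

```raku
my @a = 10, 'string';
# Each element becomes a plain Bool before the junction is built
say so @a.map(-> $e { $e ~~ Str or $e ~~ Int }).all;   # True

# The same check as a signature constraint, as in samcv's sub
sub thing(@body where { so .map(-> $e { $e ~~ Str or $e ~~ Int }).all }) { }
thing (1, 2, 3);   # binds fine; a non-Str, non-Int element would fail
```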
nine m: say $*PERL.compiler.id
camelia A96DD9EC356462F5EC86542DD7F881B1B9E8CABF.1495663814.80485
nine kent\n: notice the time stamp in there ^^^
kent\n nine: that's basically going to be a "I guess we can't use precomp at all" then. 10:42
not unless there's a way to control that timestamp :p
nine kent\n: oh, you are using precomp. Either you precompile on your build server, or on every single user's machine, and there, for every single user of that machine.
kent\n yeah, its the "for every single user of that machine" that I'd be wishing could be avoided, but probably can't. 10:43
nine But why?
openSUSE packages ship with precomp files and it just works very well 10:44
kent\n because we can't ship the precompiled files.
nine why?
kent\n because that's not a thing our users want.
nine I don't understand
kent\n it strips users of choice.
nine how?
kent\n are precompiled files not akin to binary packages? 10:45
nine The only thing they lose is the waiting time and the wasted disk space.
kent\n shipping precompiled files for us requires that we have a build chain that produces the assets, archives them, and publishes them somewhere for the install process, for every package 10:46
nine It's the same as when you ship SVG icons _and_ pre-generated PNG versions in different sizes to speed up the loading time of apps.
kent\n that makes sense if you're already a binary distro where you already do that. 10:47
but we don't do that.
nine It doesn't keep a user from scaling them to a different size in any way. It only adds.
kent\n we don't ship pre-generated PNGs, unless upstream provides them as PNGs.
nine kent\n: so who's "we" anyway?
kent\n you said "Gentoo"
sorry, if I lost context :D
( I have highlights for people who mention Gentoo here ) 10:48
nine Ooh, that explains a lot.
I thought even Gentoo had some infrastructure to provide some binary packages?
kent\n we do, on pain of torture, in edge cases: basically, it requires a dev to sit down and compile a package, and then upload the resulting tar.gz, and then end users install that. 10:49
But it's not really "systemized"
we don't have a build chain optimised for doing that. 10:50
It works about the same as if upstream didn't produce source versions, and we shipped the binary files as-is from upstream.
kent\n ( and it induces significant compromises for users because of how dependencies work ) 10:50
kent\n as in, even though I *can* install libreoffice from both source, and binary, and binary versions are made available *because* of how long it takes to compile, I opted for a source build, because it introduced less headaches for dependency reasons 10:51
( because the binary version would have required me to downgrade a whole bunch of packages and solve a bunch of option conflicts due to the binary version forcing certain choices I didn't make myself ) 10:52
so even though we /can/ have binary packages in a few specific conditions, they're generally frowned upon, and a source-only alternative is generally preferred wherever possible. 10:53
And doing binary packages at *scale* is a significant maintenance headache
nine Ok, in that case, building on the target machine as part of installation does absolutely make sense. 10:54
kent\n ( we used to maintain an entire compatibility set of packages for legacy support of x86 32bit ABIs, but we abolished that entirely as soon as somebody worked out a way to make our compile toolchain do both 32bit and 64bit abis for the required packages in a single pass ) 10:54
nine There is still no _advantage_ for the user though. But as rakudo will be built on the user's machine, there's no way to centralize the build of the modules anyway. 10:55
kent\n Right, I'm just saying that given the constraints stated, we probably won't want to use precompilation: if simply recompiling rakudo without change is enough to force all existing precompiled assets to be recompiled, that's going to introduce more problems for our package manager than it's worth, so we'll just hang our heads and let users do it on-demand
somebody will surely complain about the space waste, but we'll just have to tell them its out of our control :) 10:59
( Well, there *might* be a way to do it, but it would be an independent task a sysadmin would have to explicitly run or something ) 11:00
kent\n => The precompiled assets would be "unmanaged" 11:00
nine kent\n: I still don't get why you can't rebuild the modules after rebuilding rakudo. Surely your package manager has to deal with such situations? 11:01
kent\n Because you have to understand *how* it determines they need to be rebuilt: It needs a version identifier that forces a dependency graph constraint. 11:02
and that version identifier must be defined *before* rakudo compiles
because the dependency manager must know in advance that rakudo /will/ break the system, and has to schedule all the modules for rebuild at the same time 11:03
nine What kind of sources for this identifier can you use? 11:05
kent\n for instance, when we upgrade ICU, we don't upgrade ICU and have the package manager realise there's a problem after-the-fact; the ebuild that compiles ICU has to state some identifier describing the version of ICU it /plans/ to build, and then the package manager identifies all packages that needed a *different* value of that identifier, and schedules them all for rebuild. 11:06
That identifier has to be practically hard-coded in the recipe that builds it. 11:07
nine Could it for example generate a sha of rakudo's sources and nqp's compiler id?
kent\n nope, that would require it to fetch rakudo's sources, unpack them, and then execute the computation. But the identifier has to be defined before the sources are even fetched. 11:08
github.com/gentoo/gentoo/blob/mast...ebuild#L14 <-- this is about as dynamic as it gets, and that variable is defined before the file is sourced by bash. 11:09
once that file is sourced, the variable "SLOT" is extracted, and then used for dependency calculations 11:10
( and that file is not allowed to do anything that does system IO during the source phase )
I mean, it can in some weird conditions, but you should basically assume the output of sourcing the file must be 100% invariant 11:11
kent\n topologically, you have metadata, and phase functions. Metadata is acquired, metadata is used to drive dependency calculations, and then the phase functions ( of which, one is "src_fetch" and another is "src_unpack" ) are called at defined steps in the install process 11:12
nine kent\n: oooh, we're going at this the wrong way! 11:13
kent\n: we're looking for a 100% foolproof solution when we actually don't need one. We only need something that works for the normal use case, which is: you rebuild rakudo when you upgrade to a new version. 11:14
And that's all the trigger we need. You rebuild modules when rakudo's version changes.
kent\n well, rakudo might get rebuilt in *nonupgrade* conditions. 11:14
that's the problem I'm concerned about.
bacek aloha 11:15
nine But we don't have to solve that! If it happens, so what? Rakudo will precompile the modules on load. I.e. it falls back to the behavior we have if you don't precomp on installation.
kent\n right, but as I said ... this means "precomp on install is not going to be useful for even us" :D 11:16
nine So there's no loss. In most cases you will gain something and in some cases you will not benefit from that gain.
It will be useful almost all the time!
bacek speaking of "loss". Should --optimize=3 be faster than --optimize=0?
as in "perl6 --optimize=<n> t.pl" 11:17
kent\n I really should ask one important question: how is the precompile cache structured, is it possible to ascertain the source of a precompiled asset from the asset?
kent\n because I think it could be possible to cheat 11:17
nine kent\n: it is. Though we'd have to commit to the structure of the precomp files if you use the information. 11:19
kent\n as long as the precompiled assets aren't "Owned by" the modules themselves, I can handle their recompilation, theoretically, as a post-installation hook, similar to how you might update a texmf index.
kent\n I'd just have to worry about the failure modes 11:19
nine kent\n: what are the nonupgrade conditions that can trigger a rakudo rebuild?
kent\n nine: if rakudo has dependencies that are C based, like for example, perl has gdbm bindings, if those bindings get upgraded, even though perl gets no new features, the ABI changes, and so perl needs to be recompiled to not be broken. 11:20
( but only when the ABI changes, which is infrequent, just used as an example ) 11:22
kent\n sometimes we need to trigger a reinstall for reasons that don't affect the built code, and it's not really avoidable. But if we accidentally omit a runtime dependency in the .ebuild, then we need to ship a new minor revision simply to update the metadata. Anyone who previously installed the package by accident due to having the dependency already present will also get a rebuild triggered. 11:27
( we try to do this sparingly of course )
but as I said, I think there is possibly a sensible way to handle the precomp stuff, just it will require a bit of tooling 11:28
like, if for instance there's a way to say "hey, rakudo, do an inventory of *everything* installed and blanket precompile it from scratch", that would make the effort less. 11:29
and as you said, you've spent effort in cache validation, so we might be able to cheat and solve this "out-of-package-manager"
( we can kinda trigger that action from the post-install phase of rakudo itself, which will in effect recompile everything when rakudo finishes, and then we can trigger it from the post-install of every module, which will precompile the module itself into the cache ) 11:30
Geth perl6.org: 1b1f205043 | (Zoffix Znet)++ (committed using GitHub Web editor) | source/resources/index.html
Update Laurent's book's URL and status

Per github.com/LaurentRosenfeld/thinkp...-303986422
11:45
Geth perl6.org: c813a72fd3 | (Zoffix Znet)++ (committed using GitHub Web editor) | source/resources/index.html
Improve book status styling
11:47
perl6.org: 7355344152 | (Zoffix Znet)++ (committed using GitHub Web editor) | source/resources/index.html
Update moritz's book
11:48
Geth perl6.org: 280cc583fb | (Zoffix Znet)++ (committed using GitHub Web editor) | source/style.scss
Make <small> a bit lighter
11:50
perl6.org: a284d17664 | (Zoffix Znet)++ (committed using GitHub Web editor) | source/style.css
Make <small> a bit lighter
nine kent\n: those nonupgrade conditions sound really rare, so I'm back at "rebuilding modules when rakudo's version changes is simply good enough". That said, precompiling all installed modules should be a rather small Perl 6 script thanks to CompUnit::Repository::Installation::installed 12:01
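A hedged sketch of the "rather small Perl 6 script" nine mentions. The CompUnit::Repository API has shifted between Rakudo versions, so the method names used here (repo-chain, installed, meta) may need adjusting; this only enumerates the installed distributions, since the exact way to force a precompile is version-dependent.

```raku
# Walk every installation repository in the chain and list its
# distributions; loading each provided compunit (e.g. via require)
# would then trigger (re)precompilation as a side effect.
for $*REPO.repo-chain.grep(CompUnit::Repository::Installation) -> $repo {
    for $repo.installed -> $dist {
        say "installed: {$dist.meta<name>}";
    }
}
```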
AlexDaniel LaurentRosenfeld++ 12:08
a book done right :)
AlexDaniel reads 12:12
AlexDaniel m: say hello world’ 12:15
camelia ===SORRY!=== Error while compiling <tmp>
Bogus postfix
at <tmp>:1
------> say hello world⏏’
expecting any of:
infix
infix stopper
postfix
statement end
statement modifier
AlexDaniel m: say hello world' 12:16
camelia ===SORRY!=== Error while compiling <tmp>
Two terms in a row
at <tmp>:1
------> say hello world⏏'
expecting any of:
infix
infix stopper
postfix
statement end
statement modifier…
AlexDaniel u: -2 12:24
unicodable6 AlexDaniel, U+106A MYANMAR SIGN WESTERN PWO KAREN TONE-2 [Mc] (◌ၪ)
AlexDaniel, U+1087 MYANMAR SIGN SHAN TONE-2 [Mc] (◌ႇ)
AlexDaniel O_O
unicodable6 AlexDaniel, 854 characters in total: gist.github.com/f0acd2f97ff8843859...d688839c85
AlexDaniel ah, ok…
lolo78_ AlexDaniel, thank you for your comment. 12:42
Laurent R.
Ven_ m: try { CATCH { default { say "x"; } }; PRE { False } } 12:42
camelia Precondition '{ False }' failed
in block at <tmp> line 1
in block <unit> at <tmp> line 1
Ven_ m: try { CATCH { default { say "x"; } }; { PRE { False } } }
camelia x
Ven_ obviously...
AlexDaniel oh wow, it is huge… 13:03
perlpilot
.oO( my god ... it's full of stars )
13:06
nebuchadnezzar erf, Unicode code may not be a really good idea mastodon.gougere.fr/users/sintzoff...ates/25415 ;-) 16:41
AlexDaniel nebuchadnezzar: why not? 16:42
looks exactly like something you'd show to a kid :) 16:43
or maybe not, but it's fun
nebuchadnezzar unfortunately, it does not work with perl6: paste.debian.net/940902/ 16:45
Geth perl6-examples: 38b8c30801 | (Sterling Hanenkamp)++ | categories/games/hangman.p6
Adding hangman
17:18
circ-user-G0Z2U p6: say (5 + 6); 18:23
camelia 11
AlexDaniel well, yes, it works :) 18:26
circ-user-iQZ3V p6: say 'öööööööööö' 18:28
camelia öööööööööö
brrt any idea why p6doc requires openssl 20:43
that stuff ain't right
ugexe where is it required at? 20:45
I don't *think* any of its depends use openssl 20:46
ugexe yep one of them does 20:48
Pod::To::Bigpage 20:49
ugexe for github.com/perl6/perl6-pod-to-bigp...e.pm6#L353 20:50
ugexe although it could just as well be optional 20:50
brrt hmm, i see 20:52
[Coke] if you're doing doc work locally, you can avoid using that module. 20:56
brrt i can't install it with zef at this point since i don't have the requisite openssl libs installed 20:58
anyway, i'm going to sleep
Geth doc: antquinonez++ created pull request #1335:
Rewrap lines
21:40
ugexe you can install it with --force probably 21:53
ugexe still, shouldnt be in the depends of that module 21:54
Geth doc: d5b434ba52 | (Antonio Quinonez)++ | doc/Language/performance.pod6
Rewrap lines
23:21
doc: 03373646a9 | (Trey Harris)++ (committed using GitHub Web editor) | doc/Language/performance.pod6
Merge pull request #1335 from antquinonez/performance

Rewrap lines — thanks @antquinonez++ !
ingy can someone point me at a p6 modules that lexically changes the p6 grammar to introduce some new (multiline) syntax? say like writing a method in Python or somesuch... 23:23
ugexe github.com/tony-o/perl6-slang-sql but probably bit rotted 23:24
ingy: ^ 23:25
samcv ugexe, it would be nice if there were some finer distinction than --force. like a way to make it reinstall while still letting failing tests block the install 23:25
ugexe uninstall is pretty fast 23:26
samcv i guess we could have reinstall? 23:26
ingy ugexe: thx. I'll take a look 23:27
samcv that would be really useful as i often reinstall modules while i'm working on them
ingy tony-o: ^^ know if that thing works, or could be made to?
ugexe: know of any p6 doc explaining how to do that kind of thing 23:28
ugexe yeah... although i really need --force-$phase for --force-build --force-test etc
ingy that's exactly what I was looking for btw
ugexe ingy: i do not, but the term you are looking for is `Slang` 23:29
ugexe github.com/FROGGS/p6-Slang-Tuxic/b...g/Tuxic.pm <- this is another one but just lets you put whitespace between the name of subroutine and open parens 23:31
ingy ugexe: thanks. this should get me going in the right direction. 23:36