00:45 summerisle left, summerisle joined
arkiuat . 01:18
no, I'm happy with your last comment on github.com/Raku/problem-solving/is...3856870175 , with which I entirely agree. I'm content to leave it at that. 01:19
01:27 librasteve_ left
nemokosch I have this annoying habit of writing something, then ruminating some more, and then either posting more and more comments or editing it to death 01:29
instead, I will write it here 01:30
even if we ignore side effects, precedence doesn't drive evaluation order in quite crucial situations 01:31
guard-condition && expensive-calculation, for example
one would get the (wrong) idea that the expensive operation is always going to run; fortunately not
or $input.defined && $input.some-operation() 01:32
oops, I missed a key detail: add a higher-precedence operator on the right-hand side 01:33
guard-condition && expensive-calculation + 1 - there 😂
similarly, $input.defined && $input.some-operation() + 1
if evaluation order were driven by precedence, this code would be busted
this is not about "intermediate computations" or "side effects" - precedence just isn't about evaluation order 01:34
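The guard pattern above can be sketched like this; `expensive` is a hypothetical sub for illustration. Even though `+` binds tighter than `&&`, the right-hand subexpression is still skipped when the guard is false:

```raku
sub expensive { say "ran expensive"; 41 }

my $guard = False;
# Precedence parses this as $guard && (expensive() + 1),
# yet expensive() never runs: && short-circuits on a false left side.
my $result = $guard && expensive() + 1;
say $result;   # False, and "ran expensive" was never printed
```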
arkiuat that still seems like a pretty dramatic expansion of the scope of issue 4058, but I'm not very involved with that one, and perhaps should not have commented on it other than my initial request for a title change 02:05
(and sorry [Coke], but on further reflection now I feel your change was from alarmingly wrong to too vague, which is still an improvement though) 02:06
and I'm still not going to try to fix the title of that issue, because as I said, it's not really something I'm working on. Got too many other fish to fry
and *this* particular topic belongs on #raku-doc anyway 02:07
02:08 arkiuat left
nemokosch I didn't want to spam raku-doc with something that isn't clearly addressable and might catch others' attention 02:08
I for one surely don't mind the title of the issue, could be called "fresh fried fish" for all I care 02:09
it's more that after all this time, I find that the guy who opened the issue wasn't just rude, he was probably right (as well) 02:10
precedence is not about evaluation order at all - and if it isn't, the only thing to even consider is to fix that false statement some way, whether it's easy or not 02:12
and yes, I'm volunteering for that, not making extra work for others
aruniecrisps i managed to break the raku repl haha 02:31
@lizmat I can start doing that at some point
Voldenet obviously precedence is only used by the parser to decide which expressions are embedded where; this says nothing about order of execution 02:33
`expensive-calculation() / 0` could be turned into a constant failure without any calculations 02:34
during ast optimization
nemokosch I cast a lizard and you just turned it into a dragon 😄 02:35
Voldenet quite fitting for compiler work, eh?
nemokosch perhaps 02:37
Voldenet though it depends if `expensive-calculation()` has side effects or not
nemokosch I mean this is a whole new aspect that spawns even bigger outliers, ast optimizations that is 02:38
02:44 kylese left 02:45 kylese joined
Voldenet in fact, I'm a fan of undefined execution order (apart from explicitly saying "this is expected to short-circuit"), this lets jit do magic that otherwise is impossible 02:47
nemokosch Voldenet: anyway, as we have come this far 02:48
if you have a good idea how exactly to fix the docs... 😅 02:49
> The precedence and associativity of Raku operators determine the order of evaluation of operands in expressions.
> Where two operators with a different precedence act on the same operand, the subexpression involving the higher-precedence operator is evaluated first
there is like one or two more sentences like this
concise but not true
it rather determines what operator the operands belong to 02:50
Voldenet "with the exception of short-circuiting operators"
nemokosch but the other sentence... how to not get lost in it
it's not true even then 02:51
there are too many exceptions in too many different ways - short-circuiting, side effects, other "thunky" operators, etc 02:52
I wouldn't try to hack it
"where two operators (...), the operand will become a part of the subexpression that includes the higher-precedence operator" 02:55
maybe this
Voldenet hm "The precedence and associativity of Raku operators determine how the expression tree is constructed" 02:57
nemokosch yeah, that's the truest statement 03:00
if we don't bother to define "subexpression", maybe we could outright refer to "expression tree"
Voldenet or maybe "the precedence and associativity of Raku operators determine how the expression is parenthesised"
to make it more trivial
nemokosch that's also something to consider 03:01
Voldenet that second sentence and example can then be expressed differently 03:02
nemokosch that's what I'm curious about
Voldenet for instance "1 + 2 * 3" becomes "(1 + (2 * 3))"
nemokosch because I think a lot of people would instinctively answer that parens "define evaluation order, of course"
which is equally false
Voldenet hm, in some cases operators define evaluation order, there's no avoiding it 03:06
nemokosch what precedence and associativity does is that it shows "what goes where" 03:07
where, but not when
Voldenet like <== vs ==> - you can only say that they're special 03:08
03:09 sibl joined
Voldenet so maybe "this defines evaluation order unless the operator says otherwise" 03:12
03:15 kylese left, kylese joined
nemokosch 😂 03:15
03:15 sibl left
"splunge" 03:15
"it defines evaluation order but there might be exceptions... and I'm not indecisive"
Voldenet "… and I'm not indecisive probably" 03:17
though in case of operators there are operators that are simply special in a way that makes them non-composable in some contexts 03:18
nemokosch that's true but you know, when we just mean operators as in syntax and the corresponding behavior, Raku's operators are really much like C's 03:19
they are as special, not more special
whenever I think about this, I arrive at the following point: there are languages, the most obvious example being Lisp, that do no precedence and associativity stuff 03:21
and that visibly has no effect or consequence on their evaluation model 03:22
03:22 sibl joined
so what prompts us to pretend there is any kind of connection, other than "we can pretend in 80-85% of cases"? 03:22
it's not true even then, it just usually has no visible contradictions 03:23
why do we want to save this pretension so badly?
Voldenet Hm, how about a listing of operators that are different
e.g. 03:24
m: ^10 ==> say() <== ^10 # yeah, this can't work
camelia ===SORRY!=== Error while compiling <tmp>
Only identical operators may be list associative; since '==>' and '<==' differ, they are non-associative and you need to clarify with parentheses
at <tmp>:1
------> ^10 ==> say()<HERE> <== ^10 # yeah, thi…
Voldenet because what we do know is that some operators are simply made with a use case in mind 03:25
nemokosch oh yeah, I forgot: associativity and precedence is quite literally about how to turn expression syntax "into Lisp"
Voldenet yes, hence the "how it's parenthesised"
nemokosch it's literally just this extra step we invented for visual pleasure 03:26
Voldenet it doesn't actually define how it's evaluated ;>
but still, I think it would be beneficial to group short-circuiting ops into one table of short-circuiting ops 03:29
and so on
nemokosch it could be beneficial 03:30
actually
github.com/Raku/doc/issues/4763
I think this is a pretty damn related point
Voldenet though I think thunkiness can be mimicked in userland 03:40
m: sub sc-and(*@i) { for @i { return False unless .() }; True }; sub T { say "T$^a"; True }; sub F { say "F$^a"; False }; say sc-and({ T(1) }, { F(2) }, { T(3) }) 03:41
camelia T1
F2
False
Voldenet it's a bit of a lie, but it does the same thing essentially 03:45
nemokosch this way, sure it can be 03:46
but in my dictionary, thunkiness is specifically about having the expression be the thunk 03:47
as opposed to hand-writing the thunk at the call site
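A minimal sketch of that distinction, with a hypothetical `noisy` sub: the built-in `&&` thunks its right-hand expression automatically, while a userland routine only short-circuits if the caller hand-writes the thunks:

```raku
sub noisy { say "evaluated"; True }

# && treats its right-hand expression as a thunk for us:
my $a = False && noisy();                 # "evaluated" is never printed

# A userland short-circuit needs explicit blocks at the call site:
sub sc-and(&left, &right) { left() && right() }
my $b = sc-and({ False }, { noisy() });   # still nothing printed
say "$a $b";                              # False False
```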
Voldenet aye, something like
# sub sc-and(Expression *@i) { for @i { return False unless .() }; True }; sub T { say "T$^a"; True }; sub F { say "F$^a"; False }; say sc-and(T(1), F(2), T(3))
though this would require significant rewrite… 03:48
…and multis would be insane
C# does this
nemokosch it really does? 03:49
Voldenet > Thing(Expression<Func<int, int>> a) { … }; Thing2(Func<int, int> a) { … }; Thing((x) => x + 2); Thing2((x) => x + 2); 03:50
something like this is valid
nemokosch I remember that niner suggested back in 2022 that you could play a bigger role in Rakudo development with your kind of knowledge
he wasn't wrong though
the SO answers about this are horrible 03:52
Voldenet dotnetfiddle.net/26eDj1 03:54
what is interesting is that Expression is the tree passed into the function, which can be compiled into a lambda or transformed
nemokosch I think RakuAST will allow these kinds of things 03:56
or at the very least it will make them possible in user land
btw does this allow closures? 03:58
Voldenet Yes it does!
nemokosch omg
Voldenet closures actually turn all captured dependencies into fields of a compiler-generated class 03:59
then that class is a reference to the constant class value in the expression 04:00
nemokosch 😵‍💫 04:01
Voldenet so `var x = 1; Thing(() => x + 1);` becomes `class RandomName0 { public int x; }; var constantReference = new RandomName0 { }; constantReference.x = 1; Thing(() => constantReference.x + 1)` 04:03
that compiler-generated class is a trick used in C# for async methods, iterators and lambdas 04:04
04:05 lichtkind_ joined
nemokosch how does this really help, though. constantReference still apparently lives in an external scope 04:05
Voldenet it's compiler-generated, so whenever you call that outer method, the constant is allocated, and all lambdas hold that constantReference, so that GC doesn't free it 04:06
+ since it's constant, it can be passed to expressions by value 04:07
so `var x = 1; Thing(() => x = 1)` is also valid code, because you can freely change fields of the class
04:08 lichtkind left
Voldenet in C#, value storage lives on the stack, so you can't pass a pointer to that `x` directly, since after the function invocation it'd be gone 04:09
that's why explicit storage is allocated and then passed around
nemokosch so it really is like a managed shared pointer? 04:10
Voldenet Yes, kind of 04:11
nemokosch the constant reference playing the role of the pointer 04:13
and then that can go into the lambda by value
Voldenet in C++ terms it'd be `[&] { return x + 1; };` 04:14
nemokosch that wouldn't be a closure in C++ though, right? 04:15
you leave the declaring function and you're screwed
ooor
Voldenet of course, especially if x would be some int 04:16
nemokosch you allocate manually, and then at some point you'll have to free it
but not sure [&] would mean the right thing in that case
so C# tricks that by doing the allocation and the eventual garbage collection on your behalf 04:18
Voldenet in this case & would point onto the stack
so you'd have to do something like `struct RandomName0 { int x; }; auto constantReference = new RandomName0 { }; constantReference->x = 1; Thing([=] { return constantReference->x + 1; });` 04:19
memory leak included :) 04:20
nemokosch okay, I got confused for a sec - this is C++ still
Voldenet yes
equivalent code to the C# one above
nemokosch so you take over the pointer - that you allocated to hold the actual value(s) - by value 04:21
and thereby you eliminated the closure situation 04:22
because it's just an "innocent value" that needs to be taken from lexical scope
Voldenet indeed, from expression point of view it's a field access applied on a constant value 04:23
nemokosch nevermind that the "innocent value" is in fact a pointer to mutable data
😄
I can see it now 04:24
garbage collection solves a lot of referential/scoping problems
if you keep allocating memory and giving pointers to it for free (as in beer), then you won't run into invalid references 04:26
you might run out of memory, though, and in the C++ version you eventually will, unless you do something about it 04:27
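In Raku itself the GC handles this: a closure can outlive the frame that created its variables, with no manual allocation or freeing (hypothetical `make-counter` example):

```raku
sub make-counter {
    my $n = 0;
    # $n lives on after make-counter returns, because the
    # returned block closes over it and the GC keeps it alive.
    return { ++$n };
}

my &counter = make-counter;
say counter();   # 1
say counter();   # 2
```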
Voldenet depends if the compiler is able to figure out lifetimes gcc.godbolt.org/z/nPo9rW96K :) 04:39
nemokosch what to see here? 04:41
Voldenet the whole closure allocation thing gets reduced into `lea eax, [rdi+5]` with -O1 alone 04:42
nemokosch okay but there is indeed not much happening in this snippet 04:43
the function is not returned but called right away, this is quite a bit easier 04:44
this is basically a convoluted way to write Pascal 😄
Voldenet of course it doesn't do much, otherwise compiler would never optimize it :P 04:50
nemokosch alright, now I'm sure you'd make a great addition to the doc team 🤣 04:51
"just want the right things and then the result will always be perfect"
Voldenet oh wow, I take that back, gcc.godbolt.org/z/879qhdcYa 04:53
clang is able to optimize it
nemokosch this is still not quite a closure, the call happens in the same scope as the declaration 04:55
not 100% sure but this still might work in Pascal
Voldenet :D
but back to the topic – apart from macros, passing expressions directly as variables sounds like an interesting possibility 04:56
nemokosch what is the difference from macros? 04:58
Voldenet runtime evaluation 04:59
a bit like WhateverCode but introspectable
nemokosch Lisp, Prolog..
Voldenet rings some bells :> 05:00
I remember that Red ORM is able to simply use the block instead of expression in some hacky way 05:04
nemokosch smoke and mirrors straight from the smokemachine 05:09
05:37 Aedil joined, Aedil left, Aedil joined 06:05 Pixi left 06:18 Pixi joined 06:34 wayland joined 06:45 wayland left 07:07 merp joined
vendethiel @Voldenet you can probably do autothunking with macros right now 07:42
m: use experimental :macros; macro dotimes($a, $b) { quasi { for ^{{{ $b }}} { {{{ $a }}}; } }; }; dotimes(say(1), 3); 07:49
bah! 07:50
lizmat (or anyone else interested): If you have the tuits for it, I'd love a PR review on my other JSON-Mask at some point whenever possible :) thanks 07:51
08:18 Sgeo left 08:36 atcroft left 08:37 atcroft joined 08:40 sergot joined
Voldenet -m: use experimental :macros; macro dotimes($a, $b) { quasi { for ^{{{ $b }}} { {{{ $a }}}; } }; }; dotimes(say(1), 3); 09:11
m: use experimental :macros; macro dotimes($a, $b) { quasi { for ^{{{ $b }}} { {{{ $a }}}; } }; }; dotimes(say(1), 3);
camelia ===SORRY!=== Error while compiling <tmp>
Unknown QAST node type QAST::Unquote
at <tmp>:1
10:18 sibl left 11:28 jetchisel left 11:31 jetchisel joined 14:22 derpydoo joined 15:41 Romanson joined 16:26 human_blip left 16:28 human_blip joined 16:30 Sgeo joined 16:35 arkiuat joined 16:49 johnjay left 16:58 derpydoo left 17:43 johnjay joined 17:44 freedombroccoli joined
freedombroccoli so [1,2,3] ==> sum() returns 6 but [1,2,3] ==> [+] returns 3, what gives? is there a way to use [operator] with the feed operator? 17:45
nemokosch before anything else, it would be good to confirm this is not a result of other controversial behavior under discussion 17:46
that 3 is almost certainly the length of the array, so eventually the numeric coercion of the array 17:47
freedombroccoli [1,2,3] ==> [+]() also returns 3
nemokosch let me join IRC real quick 17:48
17:48 Nemokosch joined
Nemokosch m: say [»+«] [[5, 6, 7],] 17:49
camelia 3
17:49 Romanson left
Nemokosch that's from another issue, of course 17:49
I wonder if there is a connection
m: dd [[5, 6, 7],] ==> [+] 17:50
camelia Potential difficulties:
Useless use of [+] in sink context
at <tmp>:1
------> dd [[5, 6, 7],] ==> <HERE>[+]
[[5, 6, 7],]
Use of Nil in numeric context
in block <unit> at <tmp> line 1
Nemokosch brr
m: [[5, 6, 7],] ==> [+]() ==> dd()
camelia 1
Nemokosch hm, so it seems this is always the numeric coercion of the array 17:51
freedombroccoli m: [1,2,3] ==> [+] ==> dd()
camelia ===SORRY!=== Error while compiling <tmp>
Preceding context expects a term, but found infix ==> instead.
at <tmp>:1
------> [1,2,3] ==> [+] ==><HERE> dd()
freedombroccoli m: [1,2,3] ==> [+]() ==> dd()
camelia 3
freedombroccoli so it's returning the length of the array?
Nemokosch yes, "with extra steps" 17:52
I'm not saying it should, just looking into what happens atm
freedombroccoli is [+] called a metaoperator?
Nemokosch well [] certainly is
[+] is meant to be a reduction using the infix + operator 17:53
I'll need some time to look into how ==> works 17:55
of course if others know by heart, go ahead :P
freedombroccoli m: [3,4] Z- [6,8] ==> map * ** 2 ==> sum() ==> sqrt() 17:57
camelia ( no output )
freedombroccoli m: [3,4] Z- [6,8] ==> map * ** 2 ==> sum() ==> sqrt() ==> dd()
camelia 5e0
freedombroccoli i just wanted to try to use [+] instead of sum
Nemokosch well I'm not sure you made a mistake here 17:58
==> is quite tricky 17:59
freedombroccoli I use to do some haskell so I am used to chaining functions using composition in haskell 18:00
Nemokosch my opinionated suggestion is to favor `andthen` and .&{}
m: [3,4] Z- [6,8] andthen .map(* ** 2).sum.sqrt.&dd 18:01
camelia 5e0
freedombroccoli not familar with those
Nemokosch `andthen` sets the left-hand side as the topic if it's defined, and then evaluates the right-hand side with that topic 18:02
like this:
m: 1 + 4 andthen $_ ** $_ andthen .say 18:03
camelia 3125
Nemokosch m: Nil andthen say('We are missing out on this one')
camelia ( no output )
freedombroccoli so what's andthen do? 18:04
oh you just said so
Nemokosch .&{.....} is simply "execute this anonymous function with the object on the left as the topic"
m: (1 + 4).&{$_ ** $_}.say 18:05
camelia 3125
freedombroccoli so andthen is better then feed if I can use method chaining?
Nemokosch the .& syntax works with named functions as well - feeds the first argument in
unlike ==> which feeds the last in iirc 18:06
I tend to find `andthen` and .& kind of syntax easier to work with, easier to get right
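A small sketch of that appending behavior, using only core subs: `==>` takes the result of its left side and appends it as the final argument of the call on its right:

```raku
# sort() gets (3, 1, 2) appended as its argument:
(3, 1, 2) ==> sort() ==> say();              # (1 2 3)

# map() already has the block; the list lands after it:
(3, 1, 2) ==> map({ $_ * 10 }) ==> say();    # (30 10 20)
```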
freedombroccoli I was trying to golf the distance formula a bit as I learned more of raku's advanced syntax 18:07
Nemokosch m: [3,4] Z- [6,8] andthen .map(* ** 2) andthen [+] $_ andthen .&sqrt.&dd
camelia 5e0
18:08 arkiuat left
Nemokosch anyway, I'm definitely looking at ==> with [+] 18:08
freedombroccoli alright :)
Nemokosch so 18:16
m: dd [+] ((1, 2, 3), ) 18:17
camelia 3
Nemokosch this in itself does make sense: it's the sum of a single element, and the numeric value of that element is its length, so the sum is 3
now, for some reason, (1, 2, 3) ==> [+]() ends up generating this code 18:18
it wraps the LHS into a list
this actually gives us a way to work around it 18:19
m: slip(1, 2, 3) ==> [+]() ==> dd() 18:20
camelia 6
Nemokosch (this still doesn't explain why the wrapping happens but my gut feeling is that something would break badly if it didn't) 18:21
18:23 johnjay left
freedombroccoli so it's wrapping the lhs in another list? 18:29
Nemokosch it seems so 18:30
freedombroccoli m: slip([3,4] Z- [6,8]) ==> map * ** 2 ==> [+]() 18:31
camelia Potential difficulties:
Useless use of [+]() in sink context
at <tmp>:1
------> slip([3,4] Z- [6,8]) ==> map * ** 2 ==> <HERE>[+]()
freedombroccoli m: slip([3,4] Z- [6,8]) ==> map * ** 2 ==> [+]() ==> dd()
camelia 2
freedombroccoli guess I need to slip the map somehow
whats slip do"?
Nemokosch m: [3,4] Z- [6,8] ==> map * ** 2 ==> slip() ==> [+]() ==> dd() 18:32
camelia 25
18:32 arkiuat joined
Nemokosch slips act like lists "without the surrounding parens" 18:32
freedombroccoli is this slip related to | in func(|@args)
Nemokosch yes, in the call site 18:33
when they are added into another list, they break down into individual elements
kind of the inverse of wrapping
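The wrapping/slipping contrast in a nutshell (plain core behavior): an inner list stays one itemized element unless it is slipped:

```raku
my @a = 1, (2, 3), 4;     # the inner list is one itemized element
say @a.elems;             # 3

my @b = 1, |(2, 3), 4;    # | slips it into the surrounding list
say @b.elems;             # 4
```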
[Coke] "without the surrounding parens" sounds more like a perl explanation than a raku one. (parens don't make lists here)
Nemokosch it wasn't meant to be any particular language, just grouping 18:34
freedombroccoli m: [3,4] Z- [6,8]==> map * ** 2 ==> slip() ==> [+]() ==> sqrt() 18:35
camelia ( no output )
freedombroccoli m: [3,4] Z- [6,8]==> map * ** 2 ==> slip() ==> [+]() ==> sqrt() ==> dd()
camelia 5e0
[Coke] ... I would recommend making it at least *this* particular language in this channel, though. :)
freedombroccoli seems I need to use [+] with () or it won't work right
Nemokosch by the way, it seems to me that it's [+] itself that adds this wrapping *under some circumstances* that I haven't deciphered yet 18:36
freedombroccoli is [ op ] or [+] in particular? 18:37
I tried some stuff with [*] and was getting 0
Nemokosch not just +, iirc it was with operators having one of 3 associativities
one of them was left associativity which is what + has
so [*] would be probably the same situation 18:38
freedombroccoli m: [1,2,3] ==> [*]() ==> dd() 18:43
camelia 0
freedombroccoli i wonder if the 0 has to do with the identity of *
Nemokosch the identity of * should be 1, right?
honestly, this is baffling 18:44
freedombroccoli oh
yea
arkiuat slip and the corresponding prefix operator | flatten a list into a surrounding list
Nemokosch at this point I'm just firing random operators into the same pattern 18:48
with / it's 0, with - it's -3
freedombroccoli does it have identities swapped between +- and */ ? 18:49
Nemokosch wait, I have theory 2.0
:D
so it seems this value is deliberately fed as the second argument 18:50
so the wrapped data type acts more like ((), (1, 2, 3)) than ((1, 2, 3),)
0/3, 0*3
0+3, 0-3
freedombroccoli so the 0 and 3 are from .elems? 18:51
Nemokosch technically they are from .Numeric but it will eventually delegate to .elems
say +(1, 2, 3) 18:52
evalable6 3
Nemokosch oh it's the other bot
same build though
18:53 ds7832 joined
Nemokosch to further strengthen this theory: 0 ** 0 is 1 18:53
m: say 0 ** 0 18:54
camelia 1
Nemokosch now, if I give it an empty list, I'd expect 1 as the result
m: () ==> [**]()
camelia Potential difficulties:
Useless use of [**]() in sink context
at <tmp>:1
------> () ==> <HERE>[**]()
Nemokosch m: () ==> [**]() ==> dd()
camelia 1
Nemokosch if I change the number of elements, it should be zero
m: (1,) ==> [**]() ==> dd()
camelia 0
Nemokosch I think this might help: 19:01
m: dd [+]() 19:02
camelia 0
freedombroccoli I saw something about needing .append when I was debugging ==> chains the other day
[Coke] m: dd [*]()
camelia 1
Nemokosch case in point, that `[+]()` in the feed chain is in itself a valid call 19:03
and the way ==> works is that it pushes arguments to the calls it finds 19:04
when you write `say()`, that's a zero-argument call, and pushing to the end of it causes no harm 19:08
19:08 freedombroccoli left, freedombroccoli joined
Nemokosch for some reason, `[+]()` is not a zero-argument call: it has one argument, it calls the comma operator with no operands, which will return an empty list 19:08
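Putting the pieces of this thread together in one sketch; the outputs match what camelia showed above:

```raku
# Reduce over a one-element list: + with a single operand numifies it,
# and a List numifies to its element count:
say [+] ((1, 2, 3),);               # 3

# Fed through ==>, [+]() already carries an empty-list operand, so the
# result is 0 + +(1, 2, 3), i.e. the length again:
(1, 2, 3) ==> [+]() ==> say();      # 3

# A Slip flattens into the argument list, so the real sum comes out:
slip(1, 2, 3) ==> [+]() ==> say();  # 6
```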
freedombroccoli Nemokosch 14:08:06
> when you write `say()`, that's a zero-argument call, and pushing to the end of it causes no harm
that was the last thing I saw before I got disconnected
Nemokosch > for some reason, `[+]()` is not a zero-argument call: it has one argument, it calls the comma operator with no operands, which will return an empty list 19:09
19:16 stanrifkin joined
lizmat m: dd infix:<+>() 19:22
camelia 0
19:33 ds7832 left
Nemokosch I found where this behavior is implemented in the RakuAST grammar 19:33
19:34 ds7832 joined
Nemokosch github.com/rakudo/rakudo/blob/b7a4...kumod#L498 19:35
19:37 Aedil left
Nemokosch in the old grammar it's probably this one github.com/rakudo/rakudo/blob/b7a4....nqp#L8379 19:37
well, the only thing we don't know is why they work like this
anyway, from a practical point of view, I don't think (1, 2, 3) ==> [+]() could do "the right thing" any time soon; it would almost require a surgical operation to not break a whole lot of things somewhere else 19:39
having said that, don't yall think it would be good to have this recorded somewhere, in a Github issue or something? 19:43
19:44 Nemokosch left 19:47 ds7832 left
freedombroccoli thanks nemokosch if you can still see this from discord 20:16
nemokosch no problem 🙂
I'm writing it up in a personal repo at the moment 20:17
to have it recorded somewhere
freedombroccoli could it work better with .assuming?
nemokosch do you have a specific snippet in mind? 20:18
freedombroccoli I know haskell allows and does a lot of chaining with partial function application via currying
not really
just found out about assuming the other day 20:19
like if feed used assuming instead of append
nemokosch hm, I think they are kind of different, I mean assuming creates a new callable, right? 20:20
while the feed operator makes the call right away
@antononcube maybe this is more your topic 20:21
freedombroccoli was thinking something like my $f = &[+].assuming(*) which seems like valid code but $f([1,2,3]) errors 20:22
nemokosch &[+] is short for the addition operator itself 20:23
the "desugared" version would be: &infix:<+>
freedombroccoli the reduce [ ] metapoperator doesn't get applied?
nemokosch I'm not sure offhand if you can get references to these meta-applied operators 20:24
lizmat I prefer functional style myself... freedombroccoli: any reason why you want to use the feed syntax ?
freedombroccoli was just learning it the other day 20:25
thought maybe i could golf a distance formula over arrays
though . method chaining seems much shorter
nemokosch Larry Wall was quite good at golfing, or so I heard 20:28
antononcube churching too... 20:29
lizmat m: (1, 2, 3) ==> sum() ==> say() # feeds are just weird syntax 20:30
camelia 6
lizmat m: (1, 2, 7) ==> sum() ==> say() # feeds are just weird syntax 20:31
camelia 10
lizmat m: (1, 2, 7) ==> sum(3,4,5) ==> say() # feeds are just weird syntax
camelia 15
antononcube (Maybe, the new "Haskell" guy would get the pun...)
freedombroccoli a little bit
;)
m: [1,2,3] ==> sum() ==> say()
camelia 6
freedombroccoli m: [1,2,3] ==> [+]() ==> say() #what we were working on 20:32
camelia 3
freedombroccoli not the expected result for me
nemokosch so I think eventually I could connect the lines why it happens, but not sure if it would be an easy fix without breaking half of the world
connect the dots, even 😆 20:33
freedombroccoli is there a shorter way to slip the array into [+] in a feed operator chain
lizmat so in a feed, the "()" is really a syntactic indicator, rather than an indication that it should be called (which it is everywhere else)
antononcube Channeling lizmat: Why use assuming if you can use wrap?!
lizmat m: (1, 2, 3) ==> sum() ==> say()
camelia 6
lizmat freedombroccoli: ^^ 20:34
antononcube But I understand wrap might not be "functional" enough.
freedombroccoli say I wanted to use [*] or something else there is no product subroutine
nemokosch lizmat: exactly - the call needs to be there so that it can be altered during the processing of the feed operator, much like a macro
lizmat and that's what I really hate about feed syntax 20:35
antononcube I have a love-hate relationship with ==>...
lizmat the problem with &[*]() is that it gets the (1,2,3) fed like $(1,2,3) (aka as an item) and thus it produces the number of elements 20:36
nemokosch that's not exactly what I found
lizmat m: (1, 2, 4) ==> &[+]() ==> say()
nemokosch what I found is that reductions have an invisible argument when called without arguments
camelia 3
lizmat that's the identity value 20:37
m: say infix:<*>()
camelia 1
lizmat m: say infix:<+>()
camelia 0
lizmat m: say infix:<->()
camelia 0
lizmat m: say infix:</>()
camelia No zero-argument meaning for: infix:</>
in block <unit> at <tmp> line 1
nemokosch or maybe we are just talking past one another
freedombroccoli I found a snippet like this online the other day and found it interesting as well 20:38
m: ([o] { $_ + 1 } xx 1000)(0).say
camelia 1000
antononcube Right! That was my study group question last weekend. 20:40
I wanted a produce version of it.
(With a functional notation, without using "side effects.") 20:41
nemokosch where did you find this? 20:43
antononcube Probably in the GitHub-published transcripts of the Raku study group. 20:44
nemokosch that would be quite cool wouldn't it 😆 20:58
freedombroccoli where did I find it? I don't recall I've been googling raku quite a bit lately 21:00
antononcube what do you mean by a produce version of it? 21:02
antononcube One that returns : [1, 2, ... , 1000] . 21:03
lizmat sum() is basically a sub that does [+], but there's no sub for [*] so you can't put that in a feed
ah, ok, sorry, misunderstood
21:04 elcaro joined
antononcube @freedombroccoli Here is the currying: `my &func = sub ($a, $b) { "How many $a can be put in $b?" }; &func.wrap(-> $b { callwith('golf balls', $b) }); &func('Toyota Corolla 2018')` 21:04
freedombroccoli product for [*] would be nice
antononcube Do you mean [\*] > 21:05
freedombroccoli whats that do
scan?
elcaro freedombroccoli: You do have the option of `reduce(&[*])` if you don't want to break your method chain 21:06
antononcube [\*] 1 ... 5 gives (1 2 6 24 120) .
nemokosch Haskell I think calls this "scan"
freedombroccoli yea
antononcube When I said produce I meant that.
freedombroccoli ah 21:07
_elcaro_ He means product as in the result of multiplication. Similar to sum as the result of plus
lizmat m: sub product(*@a) { [*] @a }; (1,2,4) ==> product() ==> say() # freedombroccoli
camelia 8
antononcube @freedombroccoli Haskellisms (and especially Pythonisms) are not welcome.
freedombroccoli oh :-/ 21:08
antononcube I.e. haskell-smashkell, etc.
nemokosch XDD 21:09
antononcube @freedombroccoli If you Haskell-minded check out / find "Monad laws in Raku". 21:10
nemokosch I knew you would get along great!
antononcube (Shameless plug, BTW.)
librasteve or rakujourney.wordpress.com/2024/10/...-burritos/ 21:15
nemokosch wamba was a big fan of andthen, by the way 21:16
probably the biggest fan to date
antononcube @librasteve Make the vegan version of that post. (Might be more palatable to @freedombroccoli.)
librasteve fun fact, the first raku (perl6) parser was written in Haskell and all the Ur-rakuteers were Haskell coders 21:18
pugs
21:24 arkiuat left
nemokosch it probably had some influence on the language overall 21:25
21:26 arkiuat joined
antononcube One way to Haskellize Raku's grammars is to use functional parsers. 21:26
nemokosch you are a marketer 21:28
21:31 arkiuat left