🦋 Welcome to the MAIN() IRC channel of the Raku Programming Language (raku.org). Log available at irclogs.raku.org/raku/live.html . If you're a beginner, you can also check out the #raku-beginner channel! Set by lizmat on 6 September 2022.
00:02 arkiuat joined
00:34 abraxxa left
00:35 abraxxa joined
01:05 librasteve_ left
disbot | <simon_sibl> I encountered an issue here, not sure if that's because of my code or not | 01:31
<simon_sibl> termbin.com/v3bi
<simon_sibl> the http part works well, but the netcat part (to create a paste) doesn't work well
<simon_sibl> basically netcat doesn't wait for the server to send back the file ID
<simon_sibl> I tried many different flags, but none of them seems to solve it; it seems that netcat doesn't want to read after the first_line receive part for some reason | 01:32
01:34 hulk joined
01:35 kylese left
02:07 Guest13 joined
02:15 hulk left, kylese joined
02:23 Guest13 left
03:04 Aedil joined
disbot | <simon_sibl> alright, this works: termbin.com/9bvp | 03:17
<simon_sibl> but I need to use the --no-shutdown flag on netcat
<simon_sibl> okok, so with nmap/ncat instead of gnu/netcat it works as expected! | 03:23
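For reference, the behaviour needed from netcat here (what ncat's --no-shutdown avoids breaking) is the classic half-close pattern: send the payload, shut down only the write side of the socket so the server sees EOF, then keep reading for the server's reply. A minimal Python sketch of that pattern, using a local socketpair as a stand-in for the paste server (this is not simon_sibl's actual code):

```python
import socket
import threading

def paste_server(conn):
    # Read the whole upload until the client half-closes, then send back an ID.
    data = b""
    while chunk := conn.recv(4096):
        data += chunk
    conn.sendall(b"ID-" + str(len(data)).encode())
    conn.close()

client, server = socket.socketpair()
t = threading.Thread(target=paste_server, args=(server,))
t.start()

client.sendall(b"hello paste")     # upload the "file"
client.shutdown(socket.SHUT_WR)    # half-close: the server sees EOF...
reply = b""
while chunk := client.recv(4096):  # ...but we keep reading for the ID
    reply += chunk
t.join()
client.close()
print(reply.decode())  # prints ID-11
```

A netcat variant that exits as soon as stdin hits EOF never reaches the read loop, which matches the symptom described above.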
04:24 arkiuat left
04:32 arkiuat joined
05:48 crnlskn joined
06:06 kst``` joined, perryprog_ joined, hulk joined
06:07 apa_c joined, kylese left, jmcgnh left, apac left, sftp left, releasable6 left, perryprog left, kst`` left, jdv left
06:08 jmcgnh_ joined
06:09 jdv joined
06:10 sftp joined, sftp left, sftp joined, releasable6 joined, jmcgnh_ is now known as jmcgnh
06:11 arkiuat left
06:20 lizmat left
06:23 arkiuat joined
06:25 kst``` left
06:27 arkiuat left
06:29 lizmat joined
06:34 [Coke]_ joined
06:37 [Coke] left
06:57 arkiuat joined
07:02 arkiuat left
07:22 Sgeo left
07:25 ACfromTX left
07:28 ACfromTX joined
07:29 arkiuat joined
07:36 lichtkind joined
07:38 arkiuat left
07:48 ACfromTX left
07:51 arkiuat joined
07:56 arkiuat left
07:58 lizmat left
07:59 lizmat joined
08:00 ACfromTX joined
08:01 Aedil left
08:17 dakkar joined
08:25 arkiuat joined
08:30 arkiuat left
08:33 arkiuat joined
08:38 arkiuat left
08:55 abraxxa left
08:56 abraxxa joined
09:09 arkiuat joined
09:23 arkiuat left
09:24 icovnik left
09:51 arkiuat joined
09:58 arkiuat left
disbot | <antononcube> @librasteve One way to explain LLM graph to you is the following scenario. We make three different graph nodes computing measurements for queries via: 1. Crag 2. WolframAlpha 3. LLM. And a judge node: 4. an LLM to pick a result from 1, 2, 3 | 10:10
<antononcube> The computations in nodes 1, 2, 3 happen asynchronously. | 10:11
<librasteve> .oO - that is very cool | 10:12
<antononcube> This could be an interesting experiment, actually, and a good example use case.
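The scenario described above can be sketched with plain concurrent futures. The worker functions here are placeholders (assumptions for illustration), not real Crag or WolframAlpha bindings, and the judge is a stand-in for the LLM judge node:

```python
from concurrent.futures import ThreadPoolExecutor

# Placeholder workers standing in for nodes 1-3 (Crag, WolframAlpha, LLM).
def via_crag(q):    return f"crag: {q} = 6000 kg"
def via_wolfram(q): return f"wolfram: {q} = 6000 kg"
def via_llm(q):     return f"llm: {q} is about 6 t"

def judge(candidates):
    # Stand-in for node 4: a real LLM::Graph judge would ask an LLM to
    # pick the best candidate; here we just take the first one.
    return candidates[0]

query = "elephant mass in kg"
with ThreadPoolExecutor() as pool:
    # Nodes 1-3 run concurrently, as in the asynchronous graph.
    results = list(pool.map(lambda f: f(query), (via_crag, via_wolfram, via_llm)))

print(judge(results))  # prints crag: elephant mass in kg = 6000 kg
```

The structure, not the placeholder answers, is the point: three independent computations fan out, and a fourth node selects among their results.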
10:13 arkiuat joined
disbot | <librasteve> my use case is a "handy calculator on the command line" | 10:14
<antononcube> "LLM::Graph" is infrastructural; it should be nice and easy to use "under the hood." | 10:16
10:18 arkiuat left
10:21 Aedil joined
disbot | <librasteve> =b | 10:25
10:47 arkiuat joined
10:52 arkiuat left
11:08 arkiuat joined
11:12 pierrot left, pierrot joined
11:13 arkiuat left
11:16 apa_c left
11:19 apogee_ntv left
11:20 apogee_ntv joined
11:37 arkiuat joined
disbot | <antononcube> Does “App::Crag” use LLMs, or do | 11:41
11:42 arkiuat left
disbot | <antononcube> I mean that there is also the inversion-of-control possibility using so-called LLM function calling. | 11:42
<librasteve> okaay - never heard of that - maybe a quick way to write an LLM::Graph node | 11:43
<antononcube> Please see here: rakuforprediction.wordpress.com/20...-1-openai/ | 11:45
<antononcube> The idea is that LLMs can call a function running locally on your computer. That function is described to them with a JSON schema. | 11:47
<antononcube> Until a year ago it was unreliable. Now many people say it is reliable. | 11:48
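The linked post covers OpenAI-style function calling. As a rough illustration of "described to them with some JSON schema" (the tool name run_crag and its schema are invented for this sketch, not App::Crag's actual interface), a local function is advertised to the LLM like this:

```python
import json

# Hypothetical tool description for an OpenAI-style function-calling API.
# The LLM sees this schema and decides when to request a call to the
# local function; the call itself is executed on your machine.
run_crag_tool = {
    "type": "function",
    "function": {
        "name": "run_crag",  # invented name for this sketch
        "description": "Evaluate a units-aware expression locally with App::Crag.",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {
                    "type": "string",
                    "description": "A crag expression such as ?^<elephant mass in kg>",
                },
            },
            "required": ["query"],
        },
    },
}

# The schema must serialize cleanly, since it is sent with the chat request.
print(json.dumps(run_crag_tool)[:20])
```

This is the inversion of control mentioned above: instead of your code deciding when to call the LLM, the LLM decides when to call your code.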
<librasteve> ok, if you have App::Crag installed, then you can go crag '?^<elephant mass in kg>' on the command line
<librasteve> you get back a Physics::Measure.new(value => 6000, units => 'kg').Str | 11:50
<librasteve> (i.e. it's stringified to 6000kg)
<librasteve> maybe this should go in the "ways to check the calculation output of your LLM" | 11:51
<antononcube> That is the LLM-graph way. The LLM-tool way is to describe to the LLM what the tool is (run-cmd in this case) and the LLM decides when to call it. | 11:57
<antononcube> The LLM-graph can be seen as a generalization of
Voldenet | MCP is still not strictly reliable | 11:59
it's more reliable, but sometimes ollama updates just break things there
12:00 arkiuat joined
disbot | <librasteve> please don't use App::Crag to build a satellite | 12:00
<antononcube> No, just to build an elephant.
Voldenet | and hide it in the room | 12:01
disbot | <antononcube> Well, or just measure a few…
Voldenet | there are problems with MCP servers still; for example, destructive or non-repeatable calls get called multiple times
in a non-interactive way | 12:02
disbot | <antononcube> @Voldenet Yeah, that is the challenge with updating the LLM packages; LLM providers can’t help themselves but break their APIs little by little.
<librasteve> ?^<typical room volume in m3> / ?^<elephant volume in m3> | 12:03
<librasteve> 6①
<antononcube> To some extent, that’s why I implemented LLM-graph. It can give you more control and let you use multiple agents to get results. | 12:04
Voldenet | I'm not really annoyed too much by it, because the models ecosystem is still in very early stages
disbot | <librasteve> ?^<elephant cost in US$> => 100000USD
12:04 arkiuat left
disbot | <antononcube> Well, I’m annoyed, because people at work demand and expect a lot. | 12:05
Voldenet | well, walking on water is as easy as implementing software according to spec | 12:08
very easy if they're both frozen
disbot | <antononcube> At some point some developers want to vaporize them all. | 12:12
<antononcube> This reminds me: I like pointing out that Agile was invented/shaped/verbalized at a ski resort, i.e. around “frozen water” with a clear view of the obstacles when skiing downhill. | 12:15
12:20 arkiuat joined
Voldenet | it still works when the floor is lava, you just need to sprint more ;) | 12:21
12:25 arkiuat left
12:31 crnlskn left
12:36 crnlskn joined
disbot | <antononcube> Yeah, limited scenarios with floors. If they came up with such a methodology in the Everglades, here in Florida, I would be much more impressed. | 12:41
<antononcube> @librasteve Can you "surface" run-cmd of "App::Crag", so it can be used in Raku sessions? | 12:49
12:51 apa_c joined
12:52 apa_c left, apa_c joined
12:54 arkiuat joined
13:00 apa_c left
Geth | raku-mode: ttn-ttn++ created pull request #62: Indent method calls when they happen on a new line | 13:05
13:12 apa_c joined
disbot | <librasteve> ok, if you have App::Crag installed, then you can go crag '?^<elephant mass in kg>' on the command line | 13:24
<antononcube> I get that. But I want to use "App::Crag" functionality within Raku sessions. Having those functionalities via run or shell seems too indirect. | 13:26
<antononcube> But, well, that is a way to make a prototype comparing "App::Crag" to other systems. | 13:27
13:51 apa_c left
14:12 crnlskn left
14:23 perryprog_ is now known as perryprog
disbot | <librasteve> use App::Crag; run-cmd '?^<elephant mass in kg>'; now supported (give me a couple of mins to fez upload v0.0.33) | 14:27
<librasteve> done the fez ... | 14:29
14:30 apa_c joined
15:08 [Coke]_ is now known as [Coke]
15:14 apa_c left
16:10 melezhik joined
16:32 japhb left
16:38 japhb joined
16:40 dakkar left
16:42 arkiuat left
disbot | <antononcube> @librasteve Cool, but can you give an example of using run-cmd and/or eval-me? | 16:58
<librasteve> run-cmd '?^<elephant mass in kg>'; | 17:03
<antononcube> I tried that and others...
<antononcube> I get: > Error: X::Method::NotFound «No such method 'tell' for invocant of type 'Out'» | 17:04
<librasteve> looks like you are in the REPL - that's what you start with the crag cmd with no args (just like going raku on the command line) | 17:05
<librasteve> assume I know nothing about jupyter - so you are either (i) in some raku code (eg in a file) that will be run via raku file.raku, in which case put use App::Crag; run-cmd '?^<elephant mass in kg>'; in the file, or (ii) you are at the Unix command prompt, in which case type crag '?^<elephant mass in kg>' (like raku -e 'say "hi";') | 17:20
<librasteve> all clear? | 17:21
tbrowder | hi, has any user here ever heard of the multi-OS app Ventoy, used to create live USBs? | 17:27
if so, is it known to be safe for use with private data and free from viruses? | 17:28
17:29 human-blip left
17:31 human-blip joined
disbot | <antononcube> @librasteve This is what I got so far: | 17:55
<antononcube> cdn.discordapp.com/attachments/633...67cdf&
<antononcube> I cannot run crag in Jupyter easily and/or asynchronously. | 17:56
<librasteve> aha | 17:59
<librasteve> there are 3 raku prefixes in play - ^<foo>, ?<bar> and ?^<baz> | 18:01
<librasteve> ^<foo> means ^<value unit [±error]> | 18:02
<librasteve> ?<bar> means ?<some text goes to Gemini via LLM-DWIM> | 18:03
<librasteve> ?^<baz> means ?^<some text to llm 'in units'> | 18:05
<librasteve> note that's query caret: ?^
<librasteve> btw all three are raku prefixes, so any quoting format '', "", <> will work | 18:06
<librasteve> use "" if you want to interpolate $vars, of course
<librasteve> (<> is adjusted to join the words with a space to make a string) | 18:07
<librasteve> in the case of baz there is some prompt engineering, and the in units part at the end is both passed to the LLM and regexed out to make the units part, so it has to be valid | 18:08
18:30 melezhik left
18:31 librasteve_ joined
18:33 phogg left
disbot | <antononcube> The LLM extension is not that interesting from the LLM-graph POV. It is better to know when to use "App::Crag", and with what kind of queries. (I.e. a certain Raku code.) | 18:44
<antononcube> I am not sure to what degree the "Physics::*" packages know about elephants and other animals.
18:46 phogg joined
disbot | <antononcube> Again, this is the perspective of using LLM-tools -- LLMs do the LLM-ing and deterministic computations are done locally. | 18:46
20:02 apac joined
20:33 Aedil left
disbot | <librasteve> did you get ?^ to run? | 20:48
Geth | Pod-To-HTML/main: 127a900228 | (Steve Dondley)++ (committed using GitHub Web editor) | README.md: correct module name for install instructions | 20:52
Pod-To-HTML/main: c2204db2f5 | librasteve++ (committed using GitHub Web editor) | README.md: Merge pull request #94 from sdondley/master - correct module name for install instructions
disbot | <antononcube> @librasteve Not in Jupyter -- used run. On the command line -- yes. | 20:58
<librasteve> the point is this ?^<some text to llm 'in units'> ... i.e. dual parsing of the LLM query ... one to condition the Unit::Parser, the other to shape the LLM response (right?) | 21:02
<librasteve> so as to ingest the LLM response into an analytic codebase
<antononcube> Ok. But I am more interested in the following: make a classifier for recognizing "App::Crag" deterministic-computation inputs; decide whether "App::Crag" should be used or not; if not, just use Wolfram|Alpha and an LLM (or use Wolfram|Alpha and an LLM always); have an LLM judge choose the final answer | 21:06
<antononcube> The classifier can be made with the inputs in the "App::Crag" test files.
21:13 LainIwakura joined
21:16 LainIwakura left
21:28 jgaz joined
21:34 LainIwakura joined
22:05 phogg` joined, japhb_ joined
22:10 phogg left, japhb left
22:12 LainIwakura left
22:18 Sgeo joined
22:40 LainIwakura joined
22:52 LainIwakura left
23:24 phogg` is now known as phogg
23:47 lichtkind left