This channel is intended for people just starting with the Raku Programming Language (raku.org). Logs are available at irclogs.raku.org/raku-beginner/live.html Set by lizmat on 8 June 2022.
03:01 jgaz left
03:06 jgaz joined
05:44 teatwo left
06:01 teatime joined
07:08 ab5tract joined
07:41 ab5tract left, ab5tract joined
08:01 Manifest0 joined
08:09 dakkar joined
08:21 teatwo joined
08:23 teatwo left, teatwo joined
08:24 teatime left
08:35 Manifest0 left
08:50 Manifest0 joined
rcmlz | Hello @antononcube, I'd like to try out Jupyter::Chatbook by running: | 13:20 |
    brew install python
    python3 -m venv $HOME/.virtualenvs/chatbook
    source $HOME/.virtualenvs/chatbook/bin/activate
    python3 -m pip install --upgrade pip
    python3 -m pip install --upgrade wheel
    python3 -m pip install --upgrade notebook
    brew install zmq          # get zeromq 4.3.4
    brew install rakudo-star  # get v2023.08
    zef install Jupyter::Chatbook
but I get:
    ===> Testing [FAIL]: LLM::Functions:ver<0.1.9>:auth<zef:antononcube>:api<1>
    Aborting due to test failure: LLM::Functions:ver<0.1.9>:auth<zef:antononcube>:api<1> (use --force-test to override)
I am running (uname -a):
    Darwin Macbook.local 21.6.0 Darwin Kernel Version 21.6.0: Thu Jul 6 22:18:26 PDT 2023; root:xnu-8020.240.18.702.13~1/RELEASE_X86_64 x86_64
Is it broken? Do I need to specify OpenAI keys beforehand?
Thank you. |
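(On the last question: as the rest of the log shows, the failures turn out to be packaging problems rather than missing keys. For completeness, here is a hedged sketch of how a key is typically supplied once the install works, assuming LLM::Functions reads OPENAI_API_KEY from the environment; the key and the prompt are placeholders.)

```raku
# Hedged sketch (not from the conversation): using LLM::Functions after a
# successful install. Assumes the module picks up OPENAI_API_KEY from the
# environment; the key and prompt below are placeholders.
use LLM::Functions;

%*ENV<OPENAI_API_KEY> //= 'sk-...';   # placeholder, not a real key

my &summarize = llm-function('Summarize the following text: ');
say summarize('Raku is a multi-paradigm, gradually typed language.');
```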
antononcube | @rcmlz Thanks for trying that out! Can you install “LLM::Functions” by itself? | 13:22 | |
rcmlz | zef install LLM::Functions | 13:24 |
    ===> Searching for: LLM::Functions
    ===> Searching for missing dependencies: HTTP::Tiny:ver<0.2.5+>, Text::SubParsers:ver<0.1.1+>, WWW::OpenAI:ver<0.2.8+>, WWW::PaLM:ver<0.1.8+>
    ===> Searching for missing dependencies: DateTime::Grammar:ver<0.1.3+>
    ===> Testing: HTTP::Tiny:ver<0.2.5>:auth<zef:jjatria>
    ===> Testing [OK] for HTTP::Tiny:ver<0.2.5>:auth<zef:jjatria>
    ===> Testing: DateTime::Grammar:ver<0.1.3>:auth<zef:antononcube>:api<1>
    ===> Testing [OK] for DateTime::Grammar:ver<0.1.3>:auth<zef:antononcube>:api<1>
    ===> Testing: Text::SubParsers:ver<0.1.4>:auth<zef:antononcube>:api<1>
    ===> Testing [OK] for Text::SubParsers:ver<0.1.4>:auth<zef:antononcube>:api<1>
    ===> Testing: WWW::OpenAI:ver<0.2.8>:auth<zef:antononcube>:api<1>
    ===> Testing [OK] for WWW::OpenAI:ver<0.2.8>:auth<zef:antononcube>:api<1>
    ===> Testing: WWW::PaLM:ver<0.1.8>:auth<zef:antononcube>:api<1>
    ===> Testing [OK] for WWW::PaLM:ver<0.1.8>:auth<zef:antononcube>:api<1>
    ===> Testing: LLM::Functions:ver<0.1.9>:auth<zef:antononcube>:api<1>
    ===> Testing [FAIL]: LLM::Functions:ver<0.1.9>:auth<zef:antononcube>:api<1>
    Aborting due to test failure: LLM::Functions:ver<0.1.9>:auth<zef:antononcube>:api<1> (use --force-test to override)
antononcube | Hmm… I am investigating this right now, then. (On my macOS computer “LLM::Functions“ installs without problems.) | 13:25 | |
Meanwhile, can you install it with --force-test and see whether "Jupyter::Chatbook" installs? | 13:27 |
nemokosch | also, could you maybe run it with --verbose and post a gist with the output? | 13:30 | |
rcmlz | [LLM::Functions] t/05-LLM-prompt-synthesizing.rakutest .. Dubious, test returned 1 | 13:32 |
    [LLM::Functions] No subtests run
    [LLM::Functions] All tests successful.
    [LLM::Functions]
    [LLM::Functions] Test Summary Report
    [LLM::Functions] -------------------
    [LLM::Functions] t/05-LLM-prompt-synthesizing.rakutest (Wstat: 256 Tests: 0 Failed: 0)
    [LLM::Functions]   Non-zero exit status: 1
    [LLM::Functions]   Parse errors: No plan found in TAP output
    [LLM::Functions] Files=5, Tests=41, 13 wallclock secs
    [LLM::Functions] Result: FAILED
    ===> Testing [FAIL]: LLM::Functions:ver<0.1.9>:auth<zef:antononcube>:api<1>
    Aborting due to test failure: LLM::Functions:ver<0.1.9>:auth<zef:antononcube>:api<1> (use --force-test to override)
nemokosch | now that's more helpful | 13:33 | |
antononcube | Yes, @rcmlz -- thanks! | ||
That file does have a plan statement. I will try to set up a build workflow on GitHub. | 13:34 |
(That includes Linux.) | |||
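(A note on the "No plan found in TAP output" / "Tests: 0" failure above: that combination usually means the test file exited before it could print any TAP at all, for example by dying during compilation, not that the source lacks a plan call. Below is a minimal sketch of a .rakutest file with an explicit plan, using the module name from this thread purely as an example.)

```raku
# Minimal sketch of a .rakutest file with an explicit plan.
# The plan line ("1..2") is emitted first, so if a later test dies the
# harness can still report which tests never ran; a failure before the
# plan (e.g. in a compile-time `use`) still shows up as "No plan found".
use Test;

plan 2;

use-ok 'LLM::Functions';        # checks the module can be loaded
ok 1 + 1 == 2, 'sanity check';  # trivial second test to satisfy the plan
```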
rcmlz | Thank you. | 13:35 | |
Today I was also playing around with GitHub workflows, attaching one to my little weekly-challenge-benchmark project. Is "massa" reading here? Have a look at "Run Benchmark" here: github.com/rcmlz/perlweeklychallen...6749298659 -- it seems that your solution for week 234 task 2 is quite slow. | 13:39 |
Within 3 seconds, only a problem of size 128 was solvable. | 13:40 |
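(For anyone curious how such a time-boxed scaling check can be expressed in Raku, here is a hedged sketch. The solve routine is a hypothetical stand-in, not the actual week 234 task 2 solution, and the 3-second budget simply mirrors the number quoted above.)

```raku
# Hypothetical sketch: time a placeholder task on growing input sizes
# and stop once a single run exceeds a 3-second budget.
sub solve(Int $n) { (1..$n).combinations(2).elems }   # stand-in workload

for 2, 4, 8 ... 1024 -> $size {
    my $start   = now;
    solve($size);
    my $elapsed = now - $start;
    say "size $size: { sprintf '%.2f', $elapsed }s";
    last if $elapsed > 3;   # stop at the first size that blows the budget
}
```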
antononcube | @rcmlz If it is not a secret -- what Linux distribution are you trying to install "Jupyter::Chatbook" on? |
nemokosch | wasn't it downright a Mac? | 13:41 | |
rcmlz | MacOS Monterey 12.8.6 | ||
nemokosch | > Darwin Macbook.local 21.6.0 Darwin Kernel Version 21.6.0: Thu Jul 6 22:18:26 PDT 2023; root:xnu-8020.240.18.702.13~1/RELEASE_X86_64 x86_64 | ||
antononcube | Duh... obvious... | ||
I will try installing "LLM::Functions" on my x86 macOS computer. | 13:42 | ||
nemokosch | it works on Silicon? | ||
antononcube | Yes, but that is my environment. | 13:43 |
nemokosch | Still, it's kinda surprising | ||
antononcube | Although, I did remove it with zef before reinstalling. |
nemokosch | that anything would work better on Silicon xD | ||
antononcube | Well, the current macOS version is 13.5.2, so, I do not know... 🙂 | 13:44 | |
rcmlz | [Jupyter::Chatbook] # You failed 1 test of 1 | 13:50 |
    [Jupyter::Chatbook] t/01-basic.t ......... Dubious, test returned 1
    [Jupyter::Chatbook] Failed 1/1 subtests
    [Jupyter::Chatbook] t/02-sandbox.t ....... ok
    [Jupyter::Chatbook] t/03-service.t ....... ok
    [Jupyter::Chatbook] t/04-completions.t ... skipped
    [Jupyter::Chatbook] t/05-autocomplete.t .. ok
    [Jupyter::Chatbook] t/06-magic.t ......... Dubious, test returned 1
    [Jupyter::Chatbook] No subtests run
    [Jupyter::Chatbook] t/07-comms.t ......... ok
    [Jupyter::Chatbook] t/08-paths.t ......... ok
    [Jupyter::Chatbook] t/09-history.t ....... ok
    [Jupyter::Chatbook] t/20-end-to-end.t .... Dubious, test returned 1
    [Jupyter::Chatbook] No subtests run
    [Jupyter::Chatbook] t/99-meta.t .......... ok
nemokosch | can you run the tests on their own? | ||
rcmlz | I assume a "plan" is also missing in these files, Anton? I am running zef install --verbose Jupyter::Chatbook | 13:51 |
nemokosch | > That file does have a plan statement. I will try to setup a build workflow on GitHub. | ||
rcmlz | [Jupyter::Chatbook] # Failed test 'Jupyter::Kernel module can be use-d ok' | 13:52 | |
Missing Dependency? | 13:53 | ||
[Jupyter::Chatbook] # Could not find WWW::MermaidInk in: | |||
antononcube | Yeah, I will check that -- I thought I included "WWW::MermaidInk" in the meta file. | 13:54 | |
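(A quick way to confirm that the missing WWW::MermaidInk is the culprit is to try resolving it at run time; a small sketch follows -- the module name is taken from the error above, the rest is illustrative.)

```raku
# Sketch: check whether a dependency resolves in the installed module
# chain, and print the loader's error if it does not.
try require ::('WWW::MermaidInk');
if $! {
    say "WWW::MermaidInk did not load: { $!.message }";
}
else {
    say 'WWW::MermaidInk resolves fine';
}
```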
13:59 ab5tract left
I think I found and fixed the two problems with installing "LLM::Functions". I also made the GitHub workflows. I will proclaim success when it comes. | 14:01 |
14:11 jgaz left
14:16 jgaz joined
@rcmlz Well, it looks like the installation of "LLM::Functions" works now. | 14:16 | ||
I will fix "Juputer::Chatbook" later today. You can try installing it from GitHub : github.com/antononcube/Raku-Jupyter-Chatbook . (It has many new features.) | 14:17 | ||
github.com/antononcube/Raku-LLM-Fu...6756710534 | 14:18 | ||
rcmlz | when will it be available via zef? | 14:22 | |
antononcube | I just submitted it to zef (with fez). It should take less than an hour to be indexed on raku.land. (Most likely.) | 14:23 |
14:32 ab5tract joined
The improved "LLM::Functions" -- with verified installation on GitHbub -- is available in Zef ecosystem: raku.land/zef:antononcube/LLM::Functions | 14:39 | ||
14:57 ab5tract left
rcmlz | 👍 | 15:01 | |
antononcube | Agh, one of the test files is misplaced -- but GitHub did not complain. I will submit the new version within 2 minutes. (After I verify that the GitHub workflows finish successfully.) | 15:04 |
15:45 ab5tract joined
@rcmlz Please (bravely) install "Jupyter::Chatbook:ver<0.1.3>" from the Zef ecosystem. I verified it can be installed from scratch on a macOS x86 computer. | 16:36 |
16:39 dakkar left
16:59 jgaz left
18:37 MasterDuke left
18:48 MasterDuke joined
19:08 jgaz joined
19:32 lizmat_ joined
19:33 RakuIRCLogger__ left, lizmat left
19:37 lizmat_ left, lizmat joined
19:42 deoac joined
20:21 teatwo left, teatwo joined
20:23 tea3po joined
20:26 teatwo left
22:02 samebchase left
22:03 samebchase joined
23:00 RakuIRCLogger left
23:55 Manifest0 left