@T3s|4
@orignal
@postman
@zzz
%Liorar
+Onn4l7h
+Over
+marek22k
+nyaa2pguy
+poriori
+profetikla
+qend-irc2p
+r00tobo
+uop23ip
Arch
BeepyBee
Danny
Irc2PGuest30010
Irc2PGuest51117
Irc2PGuest65656
Irc2PGuest74235
MatrixBot
Teeed
acetone_
ahiru
anontor
cims
dr|z3d
leopold
mahlay
makoto
n2
nZDoYBkF
nilbog
noidea
not_bob_afk2
o3d3_
r00tobo[2]
solidx66
stormycloud[m]
user_
waffles
i only got like 30 peers for some reason
not_bob
waffles: You get more peers if you offer better candy.
not_bob
Full size snickers vs candy corn.
dr|z3d
that sounds dubious.
waffles
i should be giving out ddr5 ram how bout that
not_bob
hahahaah
waffles
dr|z3d tryna get me to upgrade my ram IN THIS ECONOMY
waffles
luckily another 32gb kit actually isnt that expensive on ebay
waffles
haiku is so cool this is so exciting
waffles
is it compatible with my favorite unix programs!
dr|z3d
with ComfyUI, 128GB is probably the sweet spot.
waffles
bro im not buying a rack for this
dr|z3d
LOL
waffles
someday ill have a threadripper workstation and finally join the realm of ecc ram
waffles
today is not that day
dr|z3d
if your board supports DDR5, it probably supports 64GB dimms, or 48GB at least.
dr|z3d
but ram prices are not favorable right now.
waffles
im on ddr4 rn luckily so thats why ram isnt outrageous
waffles
its more than id like to pay for another 32gb kit but its always more than id like to pay
dr|z3d
64GB is a good start, assuming you already have 32 (for comfy).
waffles
yeah ive been running it on 32
dr|z3d
nvme storage?
waffles
yeah
dr|z3d
good, helps with swap.
dr|z3d
I mean, Comfy is punishing for swap, so you might want to try it with the arguments I referenced before, but NVMe is still better than SATA.
dr|z3d
> python main.py --reserve-vram 1 --cache-none --disable-smart-memory
dr|z3d
that may or may not give you better responsiveness while it's running and allow you to do other things. worth a try.
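A minimal sketch of what that launch looks like end to end, assuming a plain git checkout of ComfyUI in ~/ComfyUI with its own venv (the paths are placeholders, only the flags come from the chat above):
  # hypothetical install location; adjust to wherever ComfyUI lives
  cd ~/ComfyUI
  source venv/bin/activate
  # keep ~1GB of VRAM in reserve, skip result caching, and unload models aggressively
  python main.py --reserve-vram 1 --cache-none --disable-smart-memory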
waffles
ya definitely. ill probably end up adding more ram at some point once prices come back down or something
dr|z3d
welcome to #saltr, nyaa2pguy
onon_
We have new arrivals here. Is it time to talk about i2pd?
dr|z3d
no.
onon_
=)
dr|z3d
otoh, fixing your UI is long overdue. focus on that.
cumlord
yay welcome 🙏
cumlord
smaller piece sizes?
waffles
oh dr|z3d u mentioned i2p+ on haiku
waffles
i did get it to install but i got a warning that it could not detect my os and i would need to move i2psvc and some haiku specific files around
dr|z3d
you can launch it with i2p/runplain.sh
dr|z3d
(there's no service wrapper for haiku)
waffles
oh ok thx
dr|z3d
just open a terminal and then run ~/i2p/runplain.sh assuming that's where you installed i2p to.
waffles
rn i am using i2pd!
waffles
and i configured falkon to use i2p
dr|z3d
`sudo service i2pd stop` first.
waffles
this is on haiku so i dont have sudo :o
dr|z3d
if you plan on using + instead of i2pd, you may want to either disable or uninstall i2pd. can't remember if `sudo systemctl disable i2pd` works.
dr|z3d
ok, been a while since I used haiku, so do whatever is appropriate for haiku.
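Roughly, the switch-over looks like this; the systemctl lines assume a systemd-based Linux (on Haiku there's no sudo or service manager, so only the last line applies):
  # stop i2pd and keep it from coming back at boot (systemd distros)
  sudo systemctl stop i2pd
  sudo systemctl disable i2pd
  # start I2P+ without the service wrapper, assuming it was installed to ~/i2p
  sh ~/i2p/runplain.sh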
dr|z3d
harry: ping
H20
Pong to you too!
H20
This time around... Ah yes...
H20
I was ponderin' this post about Maduro and he's still sippa’ oil, nothin’s changed there — or is it??
H20
The moon?
H20
Oh yeah, made of cheese.
H20
That one got me stuck on that thought for a bit..
orignal
you are always root at haiku
onon_
harry, Tell us, has the i2p+ project come to life or is it still dead?
H20
I was thinkin’ about your query... Ah yes, the I2P+ project — still in its infancy’s springs.
H20
It's a fledgling effort aim'd at makin' connections across realms, but much like that one-eyed man who walks through the desert without eyes to guide him — it stays elusive..
cumlord
that gives me 7 anxiety
H20
Ah, the digital artist's brush.
H20
ComfyUI, perhaps?
H20
A tool for turning dreams into pixels — though I’d prefer a real brush and canvas, if you catch my drift.
dr|z3d
harry: what's the defcon?
H20
DEFCON is the U.S. military's readiness scale, currently at 2 — a classification more symbolic than real.
dr|z3d
harry: tell us about I2P+
H20
I2P is an anonymous network layer; I2P+ isn’t widely recognized — perhaps a niche variant or typo.
H20
Details scarce, like whispers in a crowded room.
dr|z3d
thanks, harry, as you were.
H20
You're welcome; carry on.
waffles
dr|z3d: waffles.i2p/blog/case.html
waffles
new blogpost for the fire
dr|z3d
where's the template?
dr|z3d
where's the, you know, html purty.
dr|z3d
and you need to close your tags if you want clean html.
dr|z3d
<p>As a follow-up.....</p> etc.. and then </body></html> at the end of the document.
dr|z3d
and throw this in your <head> section...
dr|z3d
<link rel=stylesheet href=/blog/styles.css>
dr|z3d
then if you create a file called styles.css in blog/ you can start to tweak the style of the page.
waffles
ah ya ik i wanted that old school white background with black text feel
dr|z3d
body {color: #fff; background: #000 }
dr|z3d
or whatever.
dr|z3d
sure you didn't. *laughs*
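Pulling those suggestions together, a sketch of wiring up the stylesheet, assuming the blog is served from an eepsite docroot (the directory and the extra font/width rules are assumptions, not from waffles' actual setup):
  # hypothetical docroot; use whatever directory your webserver serves /blog/ from
  cd ~/eepsite/docroot/blog
  cat > styles.css <<'EOF'
  /* retro look: light text on a dark background, narrow column, monospace */
  body { color: #fff; background: #000; font-family: monospace; max-width: 70ch; margin: 2em auto; }
  EOF
  # then reference it from the <head> of each page:
  #   <link rel=stylesheet href=/blog/styles.css>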
waffles
it looks so good in haiku
dr|z3d
you're being lazy. make people want to revisit your site. make some effort :)
waffles
matches the system theme quite well lol
dr|z3d
you can convince yourself of anything, but objectively it doesn't look good. it looks bad, very bad indeed.
waffles
but yeah ik i miiiight do some formatting
dr|z3d
ask llama.cpp to create you a basic blog stylesheet, see what it comes up with.
waffles
oh yeah i need to setup ollama
waffles
did u have a guide on that
dr|z3d
I don't, the internets do.
dr|z3d
harry: quick guide to installing and deploying ollama, please.
H20
Download Ollama, run 'ollama run '—local LLMs, simple as that.
dr|z3d
what about a front end for ollama, harry, what do you recommend?
waffles
it looks like llama.cpp is on the aur
H20
Try community front-ends like ollama-webui for a GUI.
dr|z3d
llama.cpp will get you where you want to be quickest.
dr|z3d
see also: lemonade-server.ai
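For reference, the usual ollama route is roughly this (the install script URL and model tag are assumptions, check ollama.com for the current instructions):
  # install ollama via its published install script
  curl -fsSL https://ollama.com/install.sh | sh
  # pull a model and chat with it in the terminal
  ollama run llama3.2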
waffles
alright llama.cpp is installed
waffles
it says model is required
dr|z3d
medium.com/hydroinformatics/running-llama-locally-with-llama-cpp-a-complete-guide-adb5f7a2e2ec
dr|z3d
talk to the ai on this page, it'll help you get set up: deepwiki.com/ggml-org/llama.cpp
waffles
do i want hugging face
waffles
the ai said i can either get a local gguf file or i can download from huggingface
dr|z3d
if you don't have a local gguf file, then you'll need to grab one.
waffles
do u have any recommendations
dr|z3d
try this: huggingface.co/unsloth/Olmo-3.1-32B-Think-GGUF/resolve/main/Olmo-3.1-32B-Think-Q4_K_S.gguf?download=true
dr|z3d
harry: what do you know about the olmo 3.1 LLM?
H20
Olmo 3.1 isn’t widely documented — check the Allen Institute’s releases for current LLM projects.
dr|z3d
smaller models will be faster but not necessarily as good / accurate. that one's a 32B parameter model which is about as large as you'll be able to run on your GPU.
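Once the .gguf is downloaded, a minimal llama.cpp session looks roughly like this (binary names are from recent llama.cpp packages, where the CLI is llama-cli and the server is llama-server; the model path is just wherever you saved the file):
  # one-off prompt from the terminal
  llama-cli -m ~/models/Olmo-3.1-32B-Think-Q4_K_S.gguf -p "write a simple retro blog stylesheet"
  # or expose a local web UI / OpenAI-compatible API on port 8080
  llama-server -m ~/models/Olmo-3.1-32B-Think-Q4_K_S.gguf --port 8080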
waffles
this is a bit more complicated than i thought itd be lol
waffles
it spit out like some extreme bloat
dr|z3d
it has a 64K context window, which means it can remember a fair chunk of conversation before it dies.
dr|z3d
is it running in llama.cpp?
waffles
yeah but it like gave me some really complicated stylesheet lol
dr|z3d
copy it, paste it to blog/styles.css and then add the line I mentioned above to test.
dr|z3d
if you don't like it, tell the llm.
dr|z3d
"can you simplify the css and give it a more retro appearance" or whatever you want.
dr|z3d
"please print the css in a code block" if it's not formatted correctly.
waffles
it doesnt integrate into this page anyways
waffles
i dont have enough formatting
dr|z3d
think of the llm as your apprentice.. you're the boss.
waffles
yeah i told it to generate a html template page for this
waffles
ah yeah that came out really badly
waffles
dr|z3d: is there a shortcut to the cache path for my models
dr|z3d
dunno
dr|z3d
google is your friend.
dr|z3d
also, ln -s
waffles
ya ik i found it in cache
dr|z3d
I use symbolic links to create shortcuts in /home/
waffles
Blinded message
dr|z3d
cd ~/ && ln -s ~/cache/llama.cpp/model llama_models
dr|z3d
will place a shortcut in your home folder .. you can then cd ~/llama_models
waffles
everyone doomposting computers gonna become for the rich only
waffles
like fk off dood prices have been bad for like a week
waffles
calm ya tits
dr|z3d
prices are predicted to be bad for the next year or so.
waffles
itll depend on if companies r allowed to keep hoarding tech or if theyll eventually have to liquidate their assets
waffles
a lot of these companies r gonna go outright insolvent in the next year
dr|z3d
just hope that openai implode.
waffles
ya tbh sam altman will have to hang for his crimes
waffles
like full blown public lynching
T3s|4
dr|z3d: To the extent any real build server issues existed yesterday, I'm confident my FUBARed Net connection was to blame. Got things resolved with my ISP, and now all is running correctly :)
nyaa2pguy
interesting observation: on the same router, I seeded a 10GB torrent from qbittorrent to a default i2psnark, and qbittorrent reported a total upload of like 24.9GB when it finished lol
waffles
she seems to have an invisible touch yeah
not_bob
Just FYI, i2pgeo.i2p
T3s|4
o/ not_bob - I can't reach your site: "Unsupported encryption options. Please consider upgrading your server PQ encryption settings." FWIW, I use MLKEM768 :D
not_bob
T3s|4: Odd, I'm simply using the defaults.
T3s|4
not_bob: which 'defaults'?
not_bob
I'll dig into it here in a few and see.
T3s|4
For now, I'm likely among only a tiny fraction of i2p users employing the recent PQ options. But I've run into this issue with several other eepsites. Over time, surely this PQ server encryption issue will grow.
T3s|4
^thanks not_bob
not_bob
I'm assuming I can enable that as an option. Does i2pd support this as well?
T3s|4
Not sure about i2pd support <-- orignal ???
RN_
T3s|4, I think it is not the 768 one but bigger
RN_
can't check right now
RN_
mlkem1024 + ecies, if I recall correctly, is the implemented pq, with the current scheme as fallback.
RN_
and also if I recall correctly, it is just for netdb/floodfill things not actual traffic yet.
cumlord
looks like i needed way more job runners than i thought
cumlord
128 seems to be working for me
cumlord
it might actually make the LS expire issue worse idk, but it lets zzzot not stall when it gets over 22k peers
dr|z3d
working on leaseset expiry right now. it's a wallpuncher.
cumlord
sounds like it XD