~dr|z3d
@RN
@RN_
@StormyCloud
@T3s|4
@T3s|4_
@eyedeekay
@orignal
@postman
@zzz
%Liorar
+FreefallHeavens
+Over
+Xeha
+bak83
+cumlord
+hk
+onon_
+poriori
+profetikla
+r00tobo
+uop23ip
+weko
An0nm0n
Arch
Danny
DeltaOreo
Irc2PGuest53061
Irc2PGuest57148
Irc2PGuest60340
Irc2PGuest99578
Meow
Nausicaa
Onn4l7h
Onn4|7h
acetone_
anon3
anu3
boonst
carried6590
enoxa
mareki2pb
not_bob_afk
plap
shiver_
simprelay
solidx66
thetia
u5657
dr|z3d
zzz: re multithreading for checking, your original post -> pr3t6cjlcth36djl3qgnfaollcvcs2kbzigzon6derjzjghttp4q.b32.i2p/posts/20456
dr|z3d
~35 new/updated avatars.
not_bob
Nice
dr|z3d
hopefully they amuse someone other than me.
not_bob
I like mine!
not_bob
A battle van!
dr|z3d
orignal was tickled.
dr|z3d
something about that size of image can really make an impact when done right, in the right context like a forum.
dr|z3d
and yeah, your van, wouldn't look out of place in gaza.
not_bob
Hahaha, yeah.
not_bob
I have no desire to visit that location.
not_bob
I don't expect americans are welcome there.
not_bob
And, none of the languages I know would be useful.
dr|z3d
I don't think airbnb are running any promotions over there right now.
not_bob
That is sad.
not_bob
But, I imagine it's cheap to buy used property.
dr|z3d
I don't think there's any left.
not_bob
Bah, I need to compile the new i2pd trunk.
not_bob
Fair point.
dr|z3d
inline images working on zzz's mirror site: pr3t6cjlcth36djl3qgnfaollcvcs2kbzigzon6derjzjghttp4q.b32.i2p/posts?page=35
not_bob
Hah, I can't look. My local i2p isn't working.
dr|z3d
zzz hasn't picked up the ball yet, I think he's pretending he didn't notice.
not_bob
Heh
dr|z3d
maybe I need to give him a naked dancing man avatar. then he'll notice. *grin*
not_bob
ooooo ;)
dr|z3d
kidding, obviously. :)
not_bob
Oh, I know.
dr|z3d
his avatar's one of my favorites.
not_bob
Though, honestly. I don't think i2p will run on a real 386.
not_bob
I like idk's icon :)
dr|z3d
cleaned it up, removed the background, enhanced it.
not_bob
*nod*
not_bob
It looks good.
not_bob
Tonight I'm going to get my stats together.
dr|z3d
you'll be able to do all that fast soon, no?
dr|z3d
your card on the way?
dr|z3d
apu..
dr|z3d
amd..
dr|z3d
accelerator...
not_bob
I've found some, have not bought one.
dr|z3d
get on with it!
not_bob
But, I'll do so.
not_bob
I know!
not_bob
Anyway, tonight is notbob jump stats for end of year.
not_bob
I wrote a script to do most of the work, I just need to find it and remember how to use it.
dr|z3d
make it look nice. put it on an svg graph or pie chart or something.
dr|z3d
joking, probably doesn't merit that treatment.
not_bob
It does not. It will be a simple list.
not_bob
Ahh, now I remember how it works!
not_bob
It's silly.
not_bob
Anyway, I'm going to get on with it.
not_bob
Shit, about 90,000 jumps in the last year!
not_bob
Wait, no.
dr|z3d
wait, no, 90000 billion!
dr|z3d
most of them originating from mars.
not_bob
564,000 jumps
not_bob
The 90,000 was the previous year.
not_bob
Well then.
not_bob
And this number may not be quite right. I only ran it on one instance and multiplied by the number of instances.
not_bob
I'll have to run this on every instance and then do more math.
dr|z3d
sure, math ain't hard. basic addition.
RN
deploy script to each, put results in file, rsync back to one and add them up
orignal
question
orignal
why do we need to separate encryption and signing keys in identity
orignal
like signing key can't exceed 128 bytes?
orignal
if we have ECIES+P521 combination why signing key can't be 132 bytes without moving these extra 4 bytes to the certificate?
orignal
also applies to future long keys
not_bob
RN: Yep. Though, I've got it compiled at this point. It looks nothing like I expected.
not_bob
direct.i2p is the top site?
not_bob
And it's dead.
not_bob
A lot of the top jumps are dead.
not_bob
I need to do some more work to only put in living sites.
not_bob
Hmm, no. Some sites that were living this year are now dead. Argh, two lists!
RN
people look even harder for a cool site that is gone than ones that are still around
not_bob
Very true.
not_bob
But, my jump service very clearly tells you that it's dead.
RN
its absence creates more of a mythos around it
not_bob
Fair
RN
for the count you are getting, does it include when someone uses a jump link from your index, or just when the router triggers the jump page due to not having a site in the addressbook?
not_bob
Almost 500 requests for the site fakeboobs.i2p!
not_bob
Only when a router doesn't have it in the addressbook.
not_bob
If someone uses a link from my index I will never know.
not_bob
I don't use a referrer link (though, I could) to track what people click on in the list.
RN
fakeboobs? was/is a site?
RN
just mentioning it probably added a dozen more hits to yer jumper
not_bob
I decided several years ago not to do that.
not_bob
Also, i was wrong on fakeboobs, read that wrong.
not_bob
But, still.
RN
LOL
RN
I like that you don't do google like tracking things such as referrer links
not_bob
Yep
not_bob
That site does not exist, and never has.
not_bob
lol, yes.
RN
the thing that makes tracking nasty in general is that on the outernet you have someone else do it for you
RN
so of course those offering the services consolidate the datas
RN
and monetize it
not_bob
Yep, exactly. All your clicks get logged and sold. I'm not cool with that.
RN
most of us here probably feel much aligned with that
RN
er.. rather... i mean... "yah, same"
not_bob
Shit, do the logs in jetty rotate out?
not_bob
It looks like they do.
not_bob
Bah
not_bob
Well, the hosts.txt stats will be an average then, argh.
zzz
yes orignal if we had a new sigtype with long keys, we could say you can't use it with ElG enctype, and put up to 352 bytes in identity
zzz
not in the spec because we haven't needed to yet
zzz
dr|z3d, yes I'm ignoring you, I am not taking any changes
dr|z3d
just north of 60 new avatars now, for the lulz as they say.
orignal
how about P521
orignal
?
orignal
if the cert doesn't have extra length for it, it means it's inside the padding
zzz
yeah we could, that's not the way it's defined now, it's only 4 extra bytes and nobody's using P521 so it doesn't matter much
zzz
basically the layout is defined for all current enctype/sigtype combinations. If we add new ones, we'll document the layout in the common structures spec, as usual.
dr|z3d
in other news, it appears that around 5-6K possible attack routers have fallen off the network.
dr|z3d
any timeframe for the full keepalive implementation to land in the MR tray, zzz?
zzz
"early january" but see my roadmap for dependencies and updates. Awaiting review of MR 166
zzz
did you take the news status and susimail speedup MRs yet, if you're looking for something to merge?
zzz
so there's lots of other MRs up, both approved and not, for you to play with now. keepalive will be up after I've really beaten on it, no time yet
dr|z3d
yup, think I got both of those MRs. the weightier ones I'm waiting to be merged, they're a bit unwieldy to do manually.
dr|z3d
or maybe I haven't got the news one yet, but susimail fo' sure.
dr|z3d
susimail loading, good stuff, just reading the headers seems sane.
dr|z3d
is there any value caching the mail index to disk and updating when new mails arrive?
zzz
the current file system storage is the disk cache. the headers are in memory
dr|z3d
yeah, sure, I'm asking if it's worth caching an index of the headers once they've been read to speed up subsequent loading.
zzz
maybe, but it's fast enough for now
dr|z3d
up to you, just an idea.
zzz
I'm testing on about 4500 emails totaling half a gig
dr|z3d
and how long is that taking from login to inbox display, roughly?
zzz
couple seconds
zzz
vs. 90 sec before
dr|z3d
tolerable :)
zzz
the real place where we'd need caching is for mail body search, but I don't know how to do that
zzz
the search I have now is sender+subject only
zzz
can't scan half a gig for every search
dr|z3d
you don't know how to search the body, or you don't know how to cache the results?
zzz
don't know how to do it quickly without a cache, and don't know how to do a cache, so I'm stuck
dr|z3d
basically you want an efficient index you can search, which is updated when new mails arrive. how we get there, not sure.
dr|z3d
let's see if there's anything out there that can make things easy..
zzz
I don't work for google so I know nothing about search caching
zzz
how would I make a search index of a half a gig of emails that isn't another half a gig?
dr|z3d
ok, here's your starter: lucene.apache.org
zzz
oy no
dr|z3d
oy no to you too :P
zzz
but presumably most of the half gig is attachments, so that's a start
dr|z3d
this is the meat of the project: lucene.apache.org/core
dr|z3d
> index size roughly 20-30% the size of text indexed
zzz
and somebody has to parse the html emails before indexing
zzz
but yeah, if e.g. 90% of the size of the email corpus is attachments, and half the rest is html nonsense, 5% is plain text, an indexer cuts that in 4, leaving the index as 1% of the corpus size
zzz
in theory
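zzz's back-of-envelope estimate above, worked through. All the percentages are the assumptions stated in the chat (90% attachments, half the remainder HTML markup, index roughly a quarter of the text indexed), not measurements:

```python
def index_size_estimate(corpus_mb, attachment_frac=0.90,
                        html_frac_of_rest=0.5, index_ratio=0.25):
    """Estimate a text-index size for a mail corpus, per the chat's
    assumed proportions."""
    non_attachment = corpus_mb * (1 - attachment_frac)   # strip attachments
    plain_text = non_attachment * (1 - html_frac_of_rest)  # strip HTML markup
    return plain_text * index_ratio                      # indexer cuts ~in 4

# 512 MB corpus -> ~25.6 MB plain text -> ~6.4 MB index, ~1.25% of the corpus
```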
zzz
but yes, lucene is the mac daddy
dr|z3d
you can parse html with something like jsoup to get at the text.
dr|z3d
not sure if lucene is overkill, maybe you just need solr.apache.org
dr|z3d
I say "just need", it's built on top of lucene so ignore that part.
zzz
it's just far outside my comfort zone atm, not saying it wouldn't be a fun project some time
dr|z3d
if you did decide to take that on, it'd probably be useful globally, not just in susimail.
zzz
where else?
dr|z3d
and not just in the router, either. conceivably it could be used on the website.
dr|z3d
where else, all over the router, for a start.
dr|z3d
think of firefox's config search, for example.
zzz
that's console, not router. where would router need text search?
dr|z3d
you know, where's the wotsit for configuring floodfill, that sort of thing.
dr|z3d
I meant the console, sorry. it's user facing, so not so much in the router per se, unless you include looking for dests, RIs, that sort of thing.
zzz
console index could be an offline solution, doesn't have to be java
zzz
ditto website
dr|z3d
true enough, just a question of leveraging the library if you decide to use it to do more.
zzz
yeah it's just two different problem spaces. might have similar solutions, might not
dr|z3d
global search in console and webapps for a dest, or partial dest, for example, that could be handy.
dr|z3d
well, it's worth thinking about, if you decide to roll with it, then the scope and usefulness will become clearer for different contexts.
zzz
call that susimail search phase 3, after GET search and XHR search, ETA 2025
dr|z3d
:)
T3s|4
zzz: they don't describe the process steps, but protonmail (somewhat recently) supports content search. The key seems to be d/ling an index. On some of my larger Inboxes, that's taken 30+ mins. Process overview: proton.me/support/search-message-content
snex
in snark: create new torrent, in "data to seed" put /some/file/here. then go into settings and try to change your default data dir. that torrent you just made gets hosed
snex
specifically it will lose the fact that you are seeding from a location and it will start trying to download to that new location from scratch
dr|z3d
hey snex
snex
hi
dr|z3d
finally, welcome to the channel :)
dr|z3d
if changing the data dir is messing with torrents outside the data dir, that's something for zzz to look at.
zzz
yeah, don't do that
snex
wish i knew this about 30 mins ago
zzz
we could prevent changing data dir if any torrents are running (if we don't, not sure) but even that wouldn't help you here
snex
i just lost like 400 torrents of my home videos, linux isos, and other completely legal files
zzz
hmph
snex
i feel like it should remember that a given torrent was created with a given data dir. it seems to survive router restarts just fine
zzz
not sure exactly why
snex
i have kind of a unique setup going on here. my data is all on a NAS NFS share. my i2p is running on a rasp-pi-like device that mounts this NFS share
zzz
the full absolute path to the data dir is stored in the per-torrent config files
zzz
you can see it in ~/.i2p/i2psnark.config.d/*/*.config files
zzz
it's not relative to the data dir
snex
well ive already wiped everything and am rebuilding so i cant say what was in those prior
snex
it does look fine for the ones i have already re-entered so far
snex
does changing the default dir update these files?
zzz
no
zzz
the actual data files got corrupted as it started to re-download each one? or just the .torrent files or the config files got changed?
zzz
if your .torrent files got renamed to .torrent.BAD you can rename them back
snex
the data files are fine, but it started trying to re-download each one into the new data dir rather than recognizing they were being seeded
snex
there were no torrent.BAD files anywhere
zzz
were torrents running when you changed the data dir?
snex
i had to shut everything down because it was trying to put 1TB worth of data onto a 64G drive
snex
yes
zzz
ok well that we should probably fix
zzz
not sure if that's the cause or not though
snex
the only other thing of note is that snark seems to think the data dir i originally tried is not writable
snex
so maybe setting to a valid dir doesnt reveal any problems, but setting to an invalid one and then back does
snex
hm, so heres a question. if i make a torrent file on my desktop, then also manually make one of these config files and place it there, will snark recognize it? because this will make things so much easier for me
snex
whats the significance of these subfolders "s2" "sJ" etc?
zzz
you
zzz
just so it's not one huge directory
zzz
I don't know if it will find the config file
zzz
probably not?
snex
i would very much prefer to make the torrent on my desktop and then just have snark recognize the data location
zzz
you could stop i2p, change the data dir in the snark config file, copy all the torrent files over to the snark dir, and restart
zzz
then if you want to change the data dir back, stop and edit the config file again
dr|z3d
no need to stop i2p, just stop i2psnark in /configwebapps
zzz
should work
snex
well my data dir issue is a whole nother thing. it insists it cannot write to a folder on the NFS share when it definitely can
zzz
just guessing though
zzz
set all your seeded files 444 just in case
snex
so this may sound weird but i want the data dir to be something different than my seed location
snex
because the seed location is organized in a way that i want, whereas downloaded data is organized in the way other people want. i need to be able to fix it before i put it into my data store
snex
if that makes sense
zzz
yeah I get it
zzz
we don't support move-completed-torrents-somewhere-else
zzz
or two data dirs
snex
well i dont necessarily want that either because it depends on the nature of the data. i want linux isos to go into "linux_isos" and unix isos to go into "unix_isos" for example
snex
"steamboat willie" should go into "public_domain_movies"
snex
it would be great if i could do all this stuff with symlinks but nobody likes those :(
zzz
maybe see if biglybt will do what you need?
snex
it might but i dont want my desktop to be running anything server-like
snex
im going to look into XD after i get all these torrents back
zzz
XD would be a step back. no DHT
snex
bleh
zzz
but worth playing with
snex
if you need a hand working on any of this stuff i can help out. java isnt my favorite but i can do it
zzz
anyway, sorry for your trouble, especially if I had anything to do with it
zzz
seems like all your old .torrent files should still be in the old data dir, right where you left them, so I don't get it
snex
they were. i had to nuke them. or thought i did and then did
zzz
hmm
snex
i just didnt know how to make them refer back to the original data locations
snex
now that i know about those config files, i should be able to play with them if this happens again
zzz
yeah because the torrent file monitor would have blown away all the config files after you changed the location
snex
hmm im gonna try something real quick...
RN
famous last words
snex
cant get into a worse spot than im already in
RN
:D
RN
well, sure you could. you could lose the data files
snex
i have backups
snex
or in the case of linux isos, i can just re-grab
RN
good good
RN
just teasing... cause... you know... 'snex'
zzz
as usual, it's the panic after the first problem that dooms you
snex
if i didnt panic when i sliced a giant chunk of my hand off with a mandolin, im not gonna panic over a few lost days of productivity by having to re-add torrents
zzz
yikes
snex
but if what im thinking of works, that time will be shortened by a lot
zzz
yeah every disaster has a lesson
snex
i just have to wait for this 12G file to add itself first
zzz
I have some ideas on how to make it more bullet-proof, adding to the list
snex
snexs_12th_birthday_2160p.mkv
snex
i bet you didnt know they had 2160p cameras back in 1994
snex
damn, didnt work
snex
i was thinking to upload the .torrent file that i already have, then stop it, change the configs, then start it back up. but that doesnt work
snex
i might have to stop the whole app for this to work
zzz
yeah in the best case, if you switched your data dir, didn't delete your torrents, realized the issue, and switched back, ...
zzz
it would still delete all the torrent config files and then would have to recheck every torrent when you switched back
snex
well i switched back first. the problem is the original data dir was already not where the data is
zzz
oh then it would have created all new empty data files
snex
yup
snex
it tried to create 1TB of empty files in 64G of space
zzz
bottom line, if there's no .torrent file it's going to delete the .config file
zzz
yup
zzz
but yeah, you could have stopped snark, fixed up all the config files, and restarted. But if it's yacking on out-of-disk then it gets harder
snex
so like heres my setup, i have my data in /mnt/data/live_data. my i2p datadir is /var/lib/i2p/i2psnark. i WANT my datadir to be /mnt/data/staging_data but snark insists that it cant write here, so i had to set it back
zzz
dunno, we just use java to tell us if a dir is writable
snex
yeah. i have no idea why it insists it cant write here because sudo -u i2psvc touch /mnt/data/staging_data/wtf works just fine
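The mismatch snex hits is a known hazard: a permission-metadata check (which is roughly what Java's `File.canWrite()` amounts to) can disagree with reality on NFS mounts, where server-side ID mapping or root squash decides the real answer. A probe that attempts an actual write is more reliable; a minimal sketch (Python for illustration, not snark's code):

```python
import os
import tempfile

def can_actually_write(directory):
    """Check writability by attempting a real write, rather than trusting
    permission metadata, which can be wrong on NFS mounts."""
    try:
        fd, path = tempfile.mkstemp(dir=directory)
        os.close(fd)
        os.unlink(path)
        return True
    except OSError:
        return False

def metadata_says_writable(directory):
    # An access(2)-style check, similar in spirit to File.canWrite();
    # this is the answer that can disagree with reality over NFS.
    return os.access(directory, os.W_OK)
```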
zzz
anyway, as long as you still have the data files, it's all recoverable, not a mandolin type situation ))
snex
eh, my hand healed eventually