@eyedeekay
&eche|on
&kytv
&zzz
+R4SAS
+RN
+RN_
+T3s|4
+acetone
+dr|z3d
+hk
+orignal
+postman
+weko
+wodencafe
An0nm0n
Arch
Danny
DeltaOreo
FreefallHeavens
Irc2PGuest21357
Irc2PGuest21881
Irc2PGuest43426
Leopold
Nausicaa
Onn4l7h
Onn4|7h
Over
Over1
Sisyphus
Sleepy
Soni
T3s|4_
aargh2
anon2
b3t4f4c3
bak83
boonst
cancername
cumlord
dr4wd3
eyedeekay_bnc
hagen_
khb
not_bob_afk
plap
poriori_
profetikla
r3med1tz-
rapidash
shiver_
solidx66
tr
u5657
uop23ip
w8rabbit
x74a6
orignal
yes, good point
orignal
for example, if a router is ygg-only
orignal
theoretically it should be "R" because it's reachable by some transport
orignal
practically it's not
orignal
so, since there's no R or U, it means I'm doing it right
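Written out as a minimal sketch (illustration only, not i2pd code; the function and its inputs are hypothetical, with "regular transport" standing in for NTCP2/SSU2 reachability), the decision being described is:

```sh
# Hypothetical illustration, not i2pd source: which reachability cap a
# router would publish. A ygg-only router ends up with neither R nor U,
# because no regular transport can actually reach it.
decide_reachability_cap() {
  local ygg_only="$1" regular_transport_reachable="$2"
  if [ "$ygg_only" = "yes" ]; then
    echo ""           # neither R (reachable) nor U (unreachable)
  elif [ "$regular_transport_reachable" = "yes" ]; then
    echo "R"
  else
    echo "U"
  fi
}
```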
eyedeekay
Oh, that's dumb. The cronjob that downloads my plugins from github and puts them onto my eepsite has been hitting the github rate-limit every time it runs, for months
StormyCloud
them darn rate-limits
eyedeekay
That's why the only plugin anyone can ever download is railroad: the scripts have moved them all to the backup location, then failed to download the new plugin, and also failed to return the backup to the correct location
eyedeekay
because wget is cheerfully saving the text of the error page as "package.tar.gz" or whatever, because that's what I told it to do
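A minimal sketch of the kind of guard that avoids that failure mode, assuming the script fetches each artifact with wget (the URL, filenames, and layout here are placeholders, not the actual cronjob):

```sh
# Sketch, not the real cronjob: download to a temp file first, and only
# replace the published artifact if wget succeeded and the result is
# really a gzip archive, so an error page never overwrites a plugin.
tmp="$(mktemp)"
url="https://github.com/OWNER/REPO/releases/download/TAG/package.tar.gz"  # placeholder
if wget -q -O "$tmp" "$url" && file "$tmp" | grep -q 'gzip compressed'; then
  mv "$tmp" package.tar.gz
else
  echo "download failed, keeping the existing package.tar.gz" >&2
  rm -f "$tmp"
fi
```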
dr|z3d
why don't you move the build process in house, eyedeekay?
dr|z3d
oh, wait, you're saying you've rate-limited your own github instance?
eyedeekay
No, what actually happens is that I build everything on my laptop, then I push it all up to github, then I either ssh in to my eepsite hosts (there are 2) and:
eyedeekay
sudo -u i2psvc bash -i
eyedeekay
cd ~/i2p-config/eepsite/docroot
eyedeekay
find . -maxdepth 1 -type d -exec bash -c 'cd "$1" && edgar' _ {} \;
eyedeekay
where `edgar` re-generates the pages locally and downloads the releases back from github to the eepsite host
eyedeekay
OR I wait for the cronjob to do the same thing
eyedeekay
But since I have like a hundred or so various artifacts to download from github, I hit the rate-limit before it finishes
eyedeekay
Basically I use github as a mirror and to store the files while I transfer them to the eepsite hosts. It's also why I go to lengths to make my eepsite and my github page look exactly the same; ideally they should be exactly the same
eyedeekay
But even the github page gets built locally, on my laptop, in ~/.i2p/eepsite/docroot actually, so I can see it
eyedeekay
All github ever does is copy the files
dr|z3d
well, you need to make some adjustments to avoid being rate limited, this much I can tell you :)
dr|z3d
well, if you're building the eepsite stuff locally, maybe you can bypass github for .i2p?
eyedeekay
and I'd have a new instance ready to multihome right next to the others
dr|z3d
you'll figure it out I'm sure. you love complexity :)
eyedeekay
Less so after having introduced so much of it into my life lol
dr|z3d
well, when the workflow you've introduced to make your life easier starts making your life harder, time to rethink.
eyedeekay
I'm trying to get it back to the point where, if I just clone everything and download the release assets, I'm done: no more recursing or running make targets, just mirroring
eyedeekay
That was the original idea
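A minimal sketch of that pure-mirroring step, assuming curl and jq are available on the host and using the public GitHub releases API (OWNER and the repo list are placeholders):

```sh
# Sketch of plain mirroring: clone or update each repo, then fetch all
# assets attached to its latest release. OWNER and repos are placeholders.
repos="railroad"   # placeholder list of plugin repos
for repo in $repos; do
  if [ -d "$repo/.git" ]; then
    git -C "$repo" pull --ff-only
  else
    git clone "https://github.com/OWNER/$repo.git"
  fi
  # List the download URLs of the latest release's assets and fetch them.
  curl -s "https://api.github.com/repos/OWNER/$repo/releases/latest" \
    | jq -r '.assets[].browser_download_url' \
    | while read -r asset_url; do
        wget -q -P "$repo/releases" "$asset_url"
      done
done
```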
eyedeekay
But I guess I need an API key for the mirroring part
eyedeekay
There, that's the rate-limit taken care of
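For reference, a hedged sketch of what the API key changes, assuming a personal access token exported as GITHUB_TOKEN (a placeholder name): authenticated requests to api.github.com get a much larger hourly quota than anonymous ones, and the remaining quota can be queried directly.

```sh
# Authenticate the API calls with a personal access token
# (GITHUB_TOKEN is a placeholder environment variable).
curl -s -H "Authorization: Bearer $GITHUB_TOKEN" \
  "https://api.github.com/repos/OWNER/REPO/releases/latest"

# Check how much of the hourly quota is left before the cronjob runs.
curl -s -H "Authorization: Bearer $GITHUB_TOKEN" \
  "https://api.github.com/rate_limit"
```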
dr|z3d
*thumbs up*