&zzz
+R4SAS
+RN_
+T3s|4
+eche|off
+nilbog
+orignal
+postman
+qend-irc2p
+sourceress
Arch
Birdy
Danny
Irc2PGuest30010
Irc2PGuest36077
Irc2PGuest49364
Irc2PGuest51117
Irc2PGuest6564
Irc2PGuest65656
Irc2PGuest67278
Irc2PGuest74235
Irc2PGuest83482
MatrixBot
Onn4l7h
Over
Sleepy
Teeed
Yotsu
__bob_
aargh3
ac9f
acetone_
ahiru
anontor
b3t4f4c3__
cims
dr4wd3_
dr|z3d
duanin2
hababam
hagen_
leopold
makoto
marek
marek22k
n2
noidea
not_bob_afk2
nyaa2pguy
o3d3_
poriori
profetikla
r00tobo
rapidash
solidx66
stormycloud[m]
test7363673
uop23ip
urist_
user_
w8rabbit
zelgomer
orignal
yes, good point
orignal
for example, if the router is ygg-only
orignal
theoretically it should be "R" because it's reachable by some transport
orignal
practically it's not
orignal
so, since there's no R or U, it means I'm doing it right
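(A rough illustration of the decision orignal is describing; a sketch only, with made-up variable names, not i2pd's actual code: a router that is only reachable over yggdrasil publishes neither the R nor the U cap.)

    # sketch only, not i2pd code; variable names are illustrative
    reachable_by_public_transport=no   # e.g. reachable via NTCP2/SSU2 from arbitrary routers
    firewalled=no
    if [ "$reachable_by_public_transport" = yes ]; then
        caps="R"    # reachable
    elif [ "$firewalled" = yes ]; then
        caps="U"    # unreachable/firewalled
    else
        caps=""     # ygg-only: theoretically reachable by some transport, practically not
    fi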
eyedeekay
Oh, that's dumb. The cronjob that downloads my plugins from github and puts them onto my eepsite has been hitting the github rate-limit every time it runs, for months
StormyCloud
them darn rate-limits
eyedeekay
That's why the only plugin anyone can ever download is railroad: the scripts have moved them all to the backup location, then failed to download the new plugin, and also failed to return the backup to the correct location
eyedeekay
because wget is cheerfully saving the text of the error page as "package.tar.gz" or whatever, since that's the filename I told it to use
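(One way to avoid that failure mode, sketched here with a placeholder URL and filenames rather than the real plugin locations: make the transfer fail on HTTP errors and verify the file is a real tarball before the old copy is touched.)

    # sketch: download to a temp file; only replace the served copy if it is a valid gzip tarball
    tmp="$(mktemp)"
    if curl -fsSL -o "$tmp" "https://github.com/OWNER/REPO/releases/download/TAG/package.tar.gz" \
        && tar -tzf "$tmp" > /dev/null 2>&1; then
        mv "$tmp" package.tar.gz   # good download: replace the served file
    else
        rm -f "$tmp"               # bad download: leave the existing file/backup alone
    fi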
dr|z3d
why don't you move the build process in-house, eyedeekay?
dr|z3d
oh, wait, you're saying you've rate-limited your own github instance?
eyedeekay
No, what actually happens is that I build everything on my laptop, then I push it all up to github, then I either ssh into my eepsite hosts (there are 2) and:
eyedeekay
sudo -u i2psvc bash -i
eyedeekay
cd ~/i2p-config/eepsite/docroot
eyedeekay
find . -maxdepth 1 -type d -exec bash -c "cd {} && edgar" \;
eyedeekay
where `edgar` re-generates the pages locally and downloads the releases back from github to the eepsite host
eyedeekay
OR I wait for the cronjob to do the same thing
eyedeekay
But since I have like a hundred or so various artifacts to download from github, I hit the rate-limit before it finishes
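(If the script is enumerating releases through GitHub's API, that is where the limit bites: unauthenticated clients get 60 API requests per hour per IP. Whether eyedeekay's script actually goes through the API is an assumption here; the remaining quota can be checked with the public rate_limit endpoint, which does not itself count against the limit.)

    # sketch: check remaining GitHub API quota before the mirror run starts (requires jq)
    curl -s https://api.github.com/rate_limit | jq '.resources.core'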
eyedeekay
Basically I use github as a mirror and to store the files while I transfer them to the eepsite hosts. It's also why I go to lengths to make my eepsite and my github page look exactly the same; ideally they should be exactly the same
eyedeekay
But even the github page gets built locally, on my laptop, in ~/.i2p/eepsite/docroot actually, so I can see it
eyedeekay
All github ever does is copy the files
dr|z3d
well, you need to make some adjustments to avoid being rate-limited, this much I can tell you :)
dr|z3d
well, if you're building the eepsite stuff locally, maybe you can bypass github for .i2p?
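(A sketch of what dr|z3d is suggesting, assuming the laptop copy in ~/.i2p/eepsite/docroot is the source of truth; the host name and remote path are placeholders: push the built site straight to the eepsite host over ssh and keep github as a mirror only.)

    # sketch: sync the locally built docroot straight to the eepsite host, bypassing github
    # "eepsite-host" and the remote path are placeholders
    rsync -avz -e ssh ~/.i2p/eepsite/docroot/ eepsite-host:/home/i2psvc/i2p-config/eepsite/docroot/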
eyedeekay
and I'd have a new instance ready to multihome right next to the others
dr|z3d
you'll figure it out I'm sure. you love complexity :)
eyedeekay
Less so after having introduced so much of it into my life lol
dr|z3d
well, when the workflow you've introduced to make your life easier starts making your life harder, time to rethink.
eyedeekay
I'm trying to get it back to the point where, if I just clone everything and download the release assets, I'm done: no more recursing or running make targets, just mirroring
eyedeekay
That was the original idea
eyedeekay
But I guess I need an API key for the mirroring part
eyedeekay
There, that's the rate-limit taken care of
dr|z3d
*thumbs up*
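(For the API-key part eyedeekay mentions: authenticated requests raise GitHub's API limit from 60 to 5,000 per hour, and a release's assets can then be enumerated through the API and fetched normally. A sketch; OWNER/REPO/TAG are placeholders and it assumes curl and jq are available.)

    # sketch: list a release's assets via the authenticated API, then download each file
    api="https://api.github.com/repos/OWNER/REPO/releases/tags/TAG"
    curl -fsSL -H "Authorization: Bearer $GITHUB_TOKEN" "$api" \
        | jq -r '.assets[].browser_download_url' \
        | while read -r url; do
            curl -fsSLO "$url"   # public assets download fine without the token
        done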