IRCaBot 2.1.0
GPLv3 © acetone, 2021-2022
#i2p-dev
/2025/08/28
@eyedeekay
&zzz
+R4SAS
+RN
+RN_
+StormyCloud
+dr|z3d
+eche|off
+hagen
+hk
+mareki2p
+orignal
+postman
+qend-irc2p
+snex
+wodencafe
Arch
BravoOreo
BubbRubb
Chrono
Danny
FreeB
FreefallHeavens
HowardPlayzOfAdmin
Irc2PGuest5036
Irc2PGuest59861
Irc2PGuest63688
Onn4l7h
Onn4|7h
Over
Rogueone
Sisyphus
Sleepy
St1nt
T3s|4_
T3s|4__
Teeed
aargh
acetone_
aisle
ardu
b3t4f4c3___
cumlord
dr4wd3
duanin2
eyedeekay_bnc
nilbog
not_bob_afk
ohThuku1
poriori_
pory
profetikla
r00tobo
rapidash
shiver_
solidx66
thetia
uop23ip
w8rabbit
weko_
wew
x74a6
eyedeekay I think PRs should be fixed now, zzz: git.idk.i2p/I2P_Developers/i2p.i2p/pulls/523. The manual fix is simple enough to apply automatically, so I'm cleaning it up every half-hour with a cron job
eyedeekay So if it happens again, 30 minutes should fix it
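[The half-hourly cleanup described above could be scheduled with a crontab entry along these lines; the script path and log location are hypothetical, since the actual fix script isn't shown in the log:]

```crontab
# Run the PR-reference cleanup every 30 minutes.
# /usr/local/bin/clean-pr-refs.sh is a placeholder name for the
# "manual fix" script mentioned above; substitute the real one.
*/30 * * * * /usr/local/bin/clean-pr-refs.sh >> /var/log/clean-pr-refs.log 2>&1
```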
zzz thanks eyedeekay, I believe you, but I've been getting 503 for the last 12 hours, ssh down too
eyedeekay Damn, it looks pretty intermittent from this side, I seem to get from 503 to working page in ~1 refresh, but 12 hours is 12 hours. I'll keep looking
eyedeekay the 503's come from gitea QOS, maybe I can tighten up some things before it reaches that point
eyedeekay I know it's not dogfooding but if it comes to a pinch clearnet ssh is also up
eyedeekay Google's once again taken it upon itself to inconvenience me, at least a little ahead of release this time
zzz immediate refresh magic doesn't work here
eyedeekay Are the 503's coming from gitea or i2ptunnel?
zzz the latter
eyedeekay I just got one to trigger from the i2ptunnel side and it's taking a long time to clear, which is why I ask; I might be able to tune that down
eyedeekay OK good that probably makes this easier
eyedeekay Oh wow that's a little odd...
dr|z3d 503 from gitea isn't a major issue, that means everything's working as expected. With the autorefresh feature we discussed, eyedeekay, it'll be a non-issue. 503 from i2ptunnel, otoh, is an issue.
eyedeekay Yeah and it's weird on my end because the tunnel is pointing at a port that is definitely listening locally, so it shouldn't be 503ing in any case
eyedeekay Oh shoot that was a dumb one on my part, be about 15 minutes
eyedeekay I periodically back up the git server to a desktop PC in my house which is kept in exactly the same software configuration as i2pgit.org. Normally it stays shut down at non-backup times.
eyedeekay Unfortunately this time I forgot to shut down the i2p router on that host with the backup, but I *did* shut down the gitea mirror
eyedeekay Effectively meaning that gitea was being multihomed against the wrong backend
dr|z3d make sure it doesn't happen again!
eyedeekay The backup server has been shut down and for good measure I unplugged the ethernet cable
eyedeekay I think that goes some way toward explaining why it was working for me and not for you, zzz; we must have been racing to different places
eyedeekay As far as i2pgit.org users go, the gitssh parts should be back to normal now, push em if you got em
zzz ok, I can see PR 523 now
zzz it's slooooow but it works
zzz made some edits but got a gitea 503 trying to add a comment
zzz worked on a refresh
eyedeekay I'll try and tighten things down so gitea 503's happen less
eyedeekay Best thing I can think of to speed it up is to stop people hammering the server so hard, so it'll probably happen through fail2ban stuff
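[The fail2ban approach mentioned above could look roughly like the jail below. This is a hedged sketch, not the actual i2pgit.org configuration: the jail name, filter regex, log path, and thresholds are all assumptions about a typical reverse-proxy setup in front of gitea.]

```ini
# /etc/fail2ban/jail.d/gitea-scrapers.local -- hypothetical sketch
# Ban clients that make an excessive number of requests in a short window.
[gitea-scrapers]
enabled  = true
port     = http,https
filter   = gitea-scrapers
logpath  = /var/log/nginx/access.log
maxretry = 120
findtime = 60
bantime  = 3600

# /etc/fail2ban/filter.d/gitea-scrapers.conf -- companion filter.
# Matches every request line; the rate limit itself comes from
# maxretry/findtime above, not from the regex.
[Definition]
failregex = ^<HOST> .* "(GET|POST|HEAD)
```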
snex put go-away in front of it
eyedeekay Not familiar with that but I'll look into it
zzz so it's overall query load, not per-query almost-OOMing?
snex it's like anubis but not woke: git.gammaspectra.live/git/go-away
eyedeekay There are two problems going on: the git diff issue and the scraper issue
eyedeekay The git diff issue sucks, but it's what's happening when you see the 503's that come from gitea, and it usually clears itself
eyedeekay the scraper issue is worse because it triggers the git diff issue when they hit specific pages
snex this should prevent scrapers totally
snex it might also annoy some legit users who don't like JS, but it also has some no-JS challenges
eyedeekay 99% of the issues come from the clearnet side, I can probably give the i2p-side users a break
snex should be easy to let i2p connections bypass
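[One way to let I2P connections bypass the challenge, as suggested above, is to split the front end into two listeners. The sketch below assumes an nginx front end with go-away on port 8081 and gitea on port 3000; all addresses and ports are hypothetical, since the real topology isn't shown in the log.]

```nginx
# Hedged sketch: clearnet traffic goes through the challenge proxy,
# the I2P-side listener (reached only by the local i2ptunnel) skips it.
server {
    # Clearnet-facing listener: everything passes through go-away first.
    listen 443 ssl;
    server_name i2pgit.org;
    location / {
        proxy_pass http://127.0.0.1:8081;  # go-away, which fronts gitea
    }
}
server {
    # Loopback listener that only the local i2ptunnel instance can reach;
    # these requests bypass the challenge and hit gitea directly.
    listen 127.0.0.1:8082;
    location / {
        proxy_pass http://127.0.0.1:3000;  # gitea
    }
}
```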