IRCaBot 2.1.0
GPLv3 © acetone, 2021-2022
#i2p-dev
/2025/05/27
@eyedeekay
&eche|on
&zzz
+R4SAS
+RN
+RN_
+StormyCloud
+T3s|4
+acetone
+cumlord
+dr|z3d
+eche|off
+orignal
+postman
+qend-irc2p
+snex
+wodencafe
Arch
Birdy
BubbRubb
Chrono
Daddy
Danny
DeltaOreo
HowardPlayzOfAdmin
Irc2PGuest31543
Irc2PGuest45728
Irc2PGuest47416
Irc2PGuest49308
Irc2PGuest50448
Irc2PGuest92214
Irc2PGuest96178
Onn4l7h
Over1
RealGuey
Sisyphus
Sleepy
Teeed
aargh2
ac9f_
b3t4f4c3__
bak83
dr4wd3
duanin2
eyedeekay_bnc
nilbog
not_bob_afk
poriori
profetikla
r00tobo_BNC
shiver_
solidx66
u5657
w8rabbit
x74a6
zzz eyedeekay, gitea down
eyedeekay It was back by the time I looked at it
zzz ok, the message was a little different than usual
zzz remote:
zzz remote: error:
zzz remote: error: Internal Server Connection Error
zzz remote: error:
zzz error:
zzz error: Failed to execute git command
zzz error:
zzz remote: . Processing 1 references
zzz send-pack: unexpected disconnect while reading sideband packet
zzz error: error in sideband demultiplexer
eyedeekay Hm, that's new. All this stuff that's been happening seems to come down to a few locking issues in the interaction between git and gitea, and instinctively this seems like it could be a related manifestation of that. I've been archiving the log every time it goes down to track it down, but most problems just go away after cleaning up the locks and restarting
eyedeekay The locking is also demonstrably the cause: malfunctions begin to occur when git operations compete for a lock
eyedeekay So now we're automatically cleaning locks on every restart, and automatically restarting when lock competition causes a timeout. That takes about 2-5 minutes because it checks for false positives before killing/restarting and takes a backup, and restarts happen about 1-2 times a day. If you hit it in exactly that window, which I think you did, it probably acted weird
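[Editor's note: a minimal sketch of the kind of watchdog described above, assuming a systemd-managed Gitea with repositories under /var/lib/gitea/repositories and an HTTP health endpoint; the paths, endpoint, command names, and timeouts are illustrative assumptions, not the actual deployment's script.]

#!/usr/bin/env python3
# Hypothetical watchdog sketch, not the actual deployment script.
# Illustrates the flow described above: probe health, wait out possible
# false positives, take a backup, clear stale git lock files, restart.
import glob
import os
import subprocess
import time
import urllib.request

REPO_ROOT = "/var/lib/gitea/repositories"          # assumed repository path
BACKUP_CMD = ["tar", "czf", "/var/backups/gitea-pre-restart.tgz", REPO_ROOT]
HEALTH_URL = "http://127.0.0.1:3000/api/healthz"   # assumed health endpoint
CHECK_INTERVAL = 30    # seconds between probes
CONFIRM_PROBES = 6     # ~3 minutes of failures before acting (false-positive guard)

def healthy() -> bool:
    # Treat any connection error or non-2xx status as unhealthy.
    try:
        with urllib.request.urlopen(HEALTH_URL, timeout=10) as resp:
            return 200 <= resp.status < 300
    except Exception:
        return False

def clear_stale_locks() -> None:
    # Remove leftover git lock files that competing operations block on.
    for lock in glob.glob(os.path.join(REPO_ROOT, "**", "*.lock"), recursive=True):
        try:
            os.remove(lock)
        except OSError:
            pass

def restart_gitea() -> None:
    # Take a backup first, then clean locks and restart the service (systemd assumed).
    subprocess.run(BACKUP_CMD, check=False)
    clear_stale_locks()
    subprocess.run(["systemctl", "restart", "gitea"], check=False)

def main() -> None:
    failures = 0
    while True:
        if healthy():
            failures = 0
        else:
            failures += 1
            if failures >= CONFIRM_PROBES:
                restart_gitea()
                failures = 0
        time.sleep(CHECK_INTERVAL)

if __name__ == "__main__":
    main()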
eyedeekay See if I can find you in the logs...
zzz no need, I leave it to you
eyedeekay Well I'm curious, and if I can find the event maybe it's another clue