Sometimes, when connecting in client mode to a server through an open socket (Sn_SR = SOCK_INIT), the W5500's Sn_SR register doesn't behave as I'd expect. After the CONNECT command is issued, the W5500 clears Sn_CR to zero; I then record the values of Sn_SR and Sn_IR at 5 ms intervals while waiting for the connection to complete. Sn_SR remains SOCK_INIT (0x13) and Sn_IR stays 0. Eventually Sn_SR goes to 0 (SOCK_CLOSED) and Sn_IR to 0x08, indicating a timeout.
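For reference, the polling loop I described can be sketched roughly like this. This is a minimal host-side simulation, not real SPI traffic: the `MockW5500` struct and its helper names are mine, standing in for the Ethernet library's register accessors, and only the status/interrupt constants come from the W5500 datasheet.

```cpp
#include <cstdint>

// Socket status / interrupt values from the W5500 datasheet.
constexpr uint8_t SOCK_INIT        = 0x13;
constexpr uint8_t SOCK_SYNSENT     = 0x15;
constexpr uint8_t SOCK_ESTABLISHED = 0x17;
constexpr uint8_t SOCK_CLOSED      = 0x00;
constexpr uint8_t Sn_IR_TIMEOUT    = 0x08;

// Hypothetical stand-in for the chip: replays canned Sn_SR/Sn_IR
// samples instead of doing SPI reads.
struct MockW5500 {
    const uint8_t* srSamples;  // simulated Sn_SR readings
    const uint8_t* irSamples;  // simulated Sn_IR readings
    int n;                     // number of samples
    int i = 0;                 // current sample index
    uint8_t sr() const { return srSamples[i < n ? i : n - 1]; }
    uint8_t ir() const { return irSamples[i < n ? i : n - 1]; }
    void step() { ++i; }       // advance to the next 5 ms sample
};

// Poll Sn_SR/Sn_IR after CONNECT: success when Sn_SR reaches
// SOCK_ESTABLISHED, failure when Sn_SR drops to SOCK_CLOSED with
// the TIMEOUT bit set in Sn_IR.
bool waitForConnect(MockW5500& w, int maxPolls = 100) {
    for (int k = 0; k < maxPolls; ++k) {
        uint8_t sr = w.sr();
        uint8_t ir = w.ir();
        if (sr == SOCK_ESTABLISHED) return true;
        if (sr == SOCK_CLOSED && (ir & Sn_IR_TIMEOUT)) return false;
        w.step();  // on real hardware this would be delay(5)
    }
    return false;  // gave up without a definitive status
}
```

In the failing case I see, the sample sequence is all SOCK_INIT followed by SOCK_CLOSED with Sn_IR = 0x08, so this loop would return `false` without ever observing SOCK_SYNSENT or SOCK_ESTABLISHED.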
Sometimes every Sn_SR sample is SOCK_SYNSENT (0x15) instead of SOCK_INIT. I'd be tempted to label this a sampling-rate issue if it weren't so absolutely consistent in returning the same Sn_SR value every time it behaves this way.
Neither case occurs frequently, but I'd like to understand what's happening and fix it in a more reasonable way than detecting the problem and retrying.
My processor platform is the Adafruit Metro M4, and my Ethernet shield is the red Wiznet W5500 shield. I’m using the Arduino Ethernet 2.0.0 library (https://github.com/arduino-libraries/Ethernet/releases/tag/2.0.0).