In the last e-mail, I forgot to mention two things:
1. ext_select() is called at the server side;
2. Debugging is done at the server side.
Regards,
Marjan
----- Original Message -----
From: Marjan Bozinovski
To: rsplib-discussion AT sctp.de
Sent: Tuesday, November 05, 2002 2:18 PM
Subject: [RSPlib-Discussion] ext_select() does not react
Hi all,
I have another problem concerning SCTP association establishment. The scenario is very basic:
SCTP client --> SCTP server
I only want to make these endpoints communicate on the SCTP layer, so I will show just the relevant code lines of both programs. On the client side, I have the following code (taken from examplepu.c):
/* Client side */
//****************
ListenSocketFamily   = AF_INET6;
ListenSocketProtocol = IPPROTO_SCTP;
ListenSocketType     = SOCK_STREAM;
// saddr contains the IP address of the server; I am omitting that step here
socketnumber = ext_socket(ListenSocketFamily, ListenSocketType, ListenSocketProtocol);
setNonBlocking(socketnumber);
result = ext_connect(socketnumber, (struct sockaddr*)&saddr,
                     getSocklen((struct sockaddr*)&saddr));
FD_ZERO(&fdset);
FD_SET(socketnumber, &fdset);
timeout.tv_sec  = ConnectTimeout / (card64)1000000;
timeout.tv_usec = ConnectTimeout % (card64)1000000;
result = ext_select(1 + socketnumber, &fdset, &fdset, &fdset, &timeout);
//****************
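(For reference: a non-blocking ext_connect() is usually confirmed by waiting for writability and then checking SO_ERROR. The helper below is only a sketch, not taken from examplepu.c; the function name waitForConnect is mine, the header name is assumed, and it presumes an ext_getsockopt() wrapper analogous to getsockopt() is available.)

/* Hypothetical helper (not from examplepu.c): wait for a non-blocking
 * ext_connect() to finish and report whether it succeeded. */
#include <sys/select.h>
#include <sys/socket.h>
#include "ext_socket.h"   /* rsplib wrapper prototypes; header name assumed */

static int waitForConnect(int sd, unsigned long long timeoutMicroSecs)
{
   fd_set         readfds, writefds, exceptfds;
   struct timeval timeout;
   int            error       = 0;
   socklen_t      errorLength = sizeof(error);

   FD_ZERO(&readfds);
   FD_ZERO(&writefds);
   FD_ZERO(&exceptfds);
   FD_SET(sd, &writefds);    /* a completed connect makes the socket writable */
   FD_SET(sd, &exceptfds);

   timeout.tv_sec  = timeoutMicroSecs / 1000000;
   timeout.tv_usec = timeoutMicroSecs % 1000000;

   if(ext_select(sd + 1, &readfds, &writefds, &exceptfds, &timeout) <= 0) {
      return 0;   /* timeout or select error */
   }
   /* Assumption: an ext_getsockopt() wrapper analogous to getsockopt(). */
   if(ext_getsockopt(sd, SOL_SOCKET, SO_ERROR, (void*)&error, &errorLength) != 0) {
      return 0;
   }
   return (error == 0);   /* non-zero if the association was established */
}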
The server-side code is as follows (taken from examplepe.c):
/* Server side */
//****************
ListenSocketFamily   = AF_INET6;
ListenSocketProtocol = IPPROTO_SCTP;
ListenSocketType     = SOCK_STREAM;
ListenSocket = ext_socket(ListenSocketFamily, ListenSocketType, ListenSocketProtocol);
setNonBlocking(ListenSocket);
memset((void*)&localAddress, 0, sizeof(localAddress));
((struct sockaddr*)&localAddress)->sa_family = ListenSocketFamily;
setPort((struct sockaddr*)&localAddress, ListenPort);
if(bindplus(ListenSocket, (struct sockaddr_storage*)&localAddress, 1) == false) {
   perror("Unable to bind application socket");
   cleanUp();
   exit(1);
}
if(ListenSocketType != SOCK_DGRAM) {
   if(ext_listen(ListenSocket, 5) < 0) {
      perror("Unable to set application socket to listen mode");
      cleanUp();
      exit(1);
   }
}
//****************
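(Not part of the quoted snippet, but for completeness: after ext_listen() the server still has to detect and accept the incoming association, typically by waiting on the listening socket and then calling ext_accept(). The following is only a sketch under the assumption that the ext_* wrappers behave like their BSD counterparts; acceptOneAssociation is a hypothetical name, and the includes are the same as in the sketch above.)

/* Hypothetical helper (not from examplepe.c): wait for and accept one
 * incoming association on the listening socket. */
static int acceptOneAssociation(int listenSocket, long timeoutSecs)
{
   fd_set         readfds, writefds, exceptfds;
   struct timeval timeout;

   FD_ZERO(&readfds);
   FD_ZERO(&writefds);
   FD_ZERO(&exceptfds);
   FD_SET(listenSocket, &readfds);
   timeout.tv_sec  = timeoutSecs;
   timeout.tv_usec = 0;

   /* An incoming association makes the listening socket readable. */
   if((ext_select(listenSocket + 1, &readfds, &writefds, &exceptfds, &timeout) > 0) &&
      FD_ISSET(listenSocket, &readfds)) {
      /* Accept the new association; the peer address is not needed here. */
      return ext_accept(listenSocket, NULL, NULL);
   }
   return -1;   /* timeout or error */
}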
I first run the server and then the client. The server successfully creates a socket. The client successfully sends an INIT. With debugging on, here is the output:
adding parent 0x8470244 (SCTPSocket::ReadUpdateCondition) to 0x8470128 (SCTPSocket::EstablishCondition)
adding parent 0x8470244 (SCTPSocket::ReadUpdateCondition) to 0x847000c (SCTPSocket::GlobalQueueCondition)
adding parent 0xbfbfbe54 (ext_select::GlobalCondition) to 0xbfbfbf70 (ext_select::ReadCondition)
adding parent 0xbfbfbe54 (ext_select::GlobalCondition) to 0xbfbfc08c (ext_select::WriteCondition)
adding parent 0xbfbfbe54 (ext_select::GlobalCondition) to 0xbfbfc1a8 (ext_select::ExceptCondition)
select: adding FD 1021 for read
adding parent 0xbfbfbf70 (ext_select::ReadCondition) to 0x8470244 (SCTPSocket::ReadUpdateCondition)
select: waiting...
select: wake-up
Condition 0 (Type 1021, FD 134871355) not fired
removing parent 0xbfbfbf70 (ext_select::ReadCondition) to 0x8470244 (SCTPSocket::ReadUpdateCondition)
In Ethereal, I can see that the client really sends an INIT chunk to the server, but the server does not react at all.
I guess the problem is on the server side. I don't understand why the server does not see the SCTP packet coming from the client.
Best regards,
Marjan