My friend has written a basic server and client using a plain socket, and then another server and client using a socket channel, both over UDP.
The client connects to the server and sends a packet containing the result of System.nanoTime(); the server echoes the packet back; the client then receives the echo and computes the delta between the timestamp in the packet and the current time (something like PING).
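I don't have the code handy yet, but the socket version is essentially like this minimal sketch (class name, port 9876, and running server and client in one process are my assumptions for illustration):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.ByteBuffer;

public class UdpPing {
    public static void main(String[] args) throws Exception {
        // Echo server (port 9876 is just a placeholder)
        DatagramSocket server = new DatagramSocket(9876);
        Thread echo = new Thread(() -> {
            try {
                byte[] buf = new byte[8];
                DatagramPacket p = new DatagramPacket(buf, buf.length);
                server.receive(p); // wait for the timestamp packet
                server.send(p);    // echo it straight back to the sender
            } catch (Exception e) {
                e.printStackTrace();
            }
        });
        echo.start();

        // Client: send System.nanoTime(), wait for the echo, compute the delta
        DatagramSocket client = new DatagramSocket();
        byte[] out = ByteBuffer.allocate(8).putLong(System.nanoTime()).array();
        client.send(new DatagramPacket(out, out.length,
                InetAddress.getLoopbackAddress(), 9876));

        byte[] in = new byte[8];
        DatagramPacket reply = new DatagramPacket(in, in.length);
        client.receive(reply); // blocks until the echo arrives
        long deltaNs = System.nanoTime()
                - ByteBuffer.wrap(reply.getData()).getLong();
        System.out.println("RTT: " + deltaNs + " ns");

        echo.join();
        server.close();
        client.close();
    }
}
```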
The results are very strange. Everything runs on the same machine (a Linux system).
The real ping reports:
PING 127.0.0.1 (127.0.0.1) 56(84) bytes of data.
64 bytes from 127.0.0.1: icmp_req=1 ttl=64 time=0.018 ms
The socket version says it took 0.5 ms (500,000 ns).
The socket-channel version says it took 2 ms (2,000,000 ns).
How is this possible? Shouldn't the socket channel be faster? Also, 2 ms is really too much lag!
UDP doesn't use Nagle's algorithm, so what could cause such a big delay?
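For comparison, the channel version is roughly like this sketch (again my assumptions: class name, port 9877, both sides in blocking mode, single process):

```java
import java.net.InetSocketAddress;
import java.net.SocketAddress;
import java.nio.ByteBuffer;
import java.nio.channels.DatagramChannel;

public class ChannelPing {
    public static void main(String[] args) throws Exception {
        // Echo server on a DatagramChannel (blocking mode by default)
        DatagramChannel server = DatagramChannel.open()
                .bind(new InetSocketAddress("127.0.0.1", 9877));
        Thread echo = new Thread(() -> {
            try {
                ByteBuffer buf = ByteBuffer.allocate(8);
                SocketAddress from = server.receive(buf); // wait for the packet
                buf.flip();
                server.send(buf, from);                   // echo it back
            } catch (Exception e) {
                e.printStackTrace();
            }
        });
        echo.start();

        // Client: connected channel, write timestamp, block on the echo
        DatagramChannel client = DatagramChannel.open();
        client.connect(new InetSocketAddress("127.0.0.1", 9877));
        ByteBuffer out = ByteBuffer.allocate(8);
        out.putLong(0, System.nanoTime()); // absolute put, position stays 0
        client.write(out);

        ByteBuffer in = ByteBuffer.allocate(8);
        client.read(in); // blocks until the echo arrives
        long deltaNs = System.nanoTime() - in.getLong(0);
        System.out.println("RTT: " + deltaNs + " ns");

        echo.join();
        server.close();
        client.close();
    }
}
```

This is the variant that reports ~2 ms for me.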
Thanks for the responses. I'll add the actual code tomorrow if I meet my friend, but writing it from scratch shouldn't take more than 30 minutes.