I've been trying to fix this annoying little headache for a while now. I've got a completely tile-based, networked game (entities/players move one tile at a time) which runs pretty much fine in Eclipse; however, there are two problems that can occur.
1. While running in Eclipse, a client can sometimes see a delay between hitting a key and his player moving, if the client has been running for a while.
2. When running from an exported JAR file, any client running for a while will generally notice the delay.
As far as I can tell this is not related to the speed at which the server is sending/receiving data to/from the client. Essentially the process is like this:
1. The client KeyListener picks up input and sends it to the server using a PrintWriter.
2. A server input thread picks this input up and looks up the world the player is in, using the key stored in the player class to find the world in a hash map (the server supports multiple worlds (maps) running simultaneously).
3. A move method in the world class moves the player in the "world" depending on the key pressed.
As far as I can tell this can take between 20,000 and 200,000 nanoseconds, averaging about 50,000 (0.00005 seconds).
4. The serverOutput thread for that client (which is continuously sending data at about 50 fps) then sends a serialized object, containing a string representation of a small chunk of the map around the player, to the client's input thread (which receives the data at the same rate as the server output) and stores it.
5. The client thread handling graphics continuously renders that stored information to a JPanel at roughly 43 fps.
As far as I can tell there shouldn't be any delay here; when the delay occurs it does not seem to be affected by multiple clients joining or disconnecting.
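To make steps 2–3 above concrete, here's a minimal sketch of the lookup-and-move logic as I've described it — the class, field, and method names here are simplified stand-ins for illustration, not the actual source:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of server steps 2-3: look up the player's world in a map of
// running worlds, then delegate the one-tile move to that world.
// All names here are hypothetical simplifications of the real classes.
public class MoveSketch {
    static class Player {
        String worldKey; // key stored in the player class (step 2)
        int x, y;        // tile coordinates
        Player(String worldKey, int x, int y) {
            this.worldKey = worldKey;
            this.x = x;
            this.y = y;
        }
    }

    static class World {
        // Step 3: a move method shifts the player one tile per key press.
        void move(Player p, char key) {
            switch (key) {
                case 'w': p.y -= 1; break;
                case 's': p.y += 1; break;
                case 'a': p.x -= 1; break;
                case 'd': p.x += 1; break;
            }
        }
    }

    // Step 2: multiple worlds (maps) run simultaneously, held in a hash map.
    static final Map<String, World> worlds = new HashMap<>();

    // Called by the server input thread for each key it receives.
    static void handleInput(Player p, char key) {
        World w = worlds.get(p.worldKey);
        if (w != null) w.move(p, key);
    }

    public static void main(String[] args) {
        worlds.put("overworld", new World());
        Player p = new Player("overworld", 5, 5);
        handleInput(p, 'd'); // one key press -> one tile right
        System.out.println(p.x + "," + p.y); // prints 6,5
    }
}
```

The move itself is just a map lookup plus an integer update, which matches the sub-millisecond timings above, so I don't think the delay is in this part.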
I apologise that this is not a lot to go on; I'm probably doing something really stupid and blindingly obvious, but I can't seem to find it.
If you're willing to take a look at the source code, it can be found here (the two JARs in the top directory are the most recently compiled client/server; the server needs to be terminated in Task Manager afterwards):
Source: (Removed due to fix)
(Hit 1 to get rid of the intro pic)
Video showing bug with terrible voice commentary ^^: http://www.youtube.com/watch?v=0qChX1uzQ6Y&feature=youtube_gdata
I've only been programming for around 5 months now, so forgive the rough design; I mostly made up how it was going to work as I went along. The world loading is awkward because I'm going to implement loading from object streams later, with data already in the entities; I haven't got round to implementing spritesheets yet either ^^.
Any help of any sort, or advice on the architecture, is appreciated.
I don't expect anyone to look through my code for me, but a helpful pointer on how to solve this would be awesome; I've been banging my head against a wall for a while now.