Carpathia has a thought-provoking infographic on Mobile Content Usage and Expectations. It reveals how long people are willing to wait for data to load in a website or mobile app…
The infographic goes on to explain how the wait time is related to latency and the proximity of the server to the user. While this is partly true, it doesn't match my experience from having worked on many apps that access a server.
First of all, you don't get 5 seconds of latency just because the server is on the other side of the world. The latency of the Internet itself tends to be of the order of hundreds of milliseconds. If the server is close, you can see latencies in the low hundreds of milliseconds or less; a server on the other side of the world might give you mid to high hundreds of milliseconds. In practice, it doesn't matter hugely where the server is located. What matters more is the latency of the mobile network and the speed of the server in processing multiple requests. Large companies often split servers across geographic locations for resilience, legislative data protection issues and reasons of scaling, and rarely primarily for speed of access.
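A quick back-of-envelope calculation shows why the number of sequential round trips tends to dominate over server distance. The figures below are illustrative assumptions, not measurements:

```python
# Illustrative (assumed) latency figures, in milliseconds.
MOBILE_RTT_MS = 200        # assumed mobile-network round-trip overhead
NEARBY_SERVER_MS = 50      # assumed internet latency to a nearby server
FAR_SERVER_MS = 300        # assumed latency to a server across the world

def total_wait(server_ms, requests):
    """Total wait for `requests` sequential round trips."""
    return (MOBILE_RTT_MS + server_ms) * requests

# One request: server location changes the wait by a fraction of a second.
print(total_wait(NEARBY_SERVER_MS, 1))  # 250 ms
print(total_wait(FAR_SERVER_MS, 1))     # 500 ms

# Five sequential requests: the request count, not the distance, pushes
# the nearby-server case past a full second.
print(total_wait(NEARBY_SERVER_MS, 5))  # 1250 ms
print(total_wait(FAR_SERVER_MS, 5))     # 2500 ms
```

Even with a nearby server, five chained requests cost more time than a single request to the far side of the world.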
Also, not all slow apps are slow because of network latency. Accessing storage can be slow too. However, where apps do access the server, aggregating server requests can reduce the wait time, save power and reduce the need to scale the server.
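The idea behind aggregating requests can be sketched roughly as follows. Instead of issuing one network call per item, the client collects its pending queries and sends them in a single batched request, paying one round trip instead of N. The "/batch" endpoint name and the payload shape here are assumptions for illustration, not a real API:

```python
import json

def fetch_individually(queries, send):
    """One round trip per query: N network waits."""
    return [send("/item", json.dumps(q)) for q in queries]

def fetch_batched(queries, send):
    """One round trip for all queries: a single network wait."""
    response = send("/batch", json.dumps(queries))
    return json.loads(response)

# `send` stands in for the app's HTTP layer; this stub just shows the
# call shapes so the sketch runs without a network.
def fake_send(path, body):
    if path == "/batch":
        return json.dumps([{"result": q["id"]} for q in json.loads(body)])
    return json.dumps({"result": json.loads(body)["id"]})

queries = [{"id": 1}, {"id": 2}, {"id": 3}]
print(fetch_batched(queries, fake_send))
```

The batched version also lets the radio power down sooner, which is where the power saving in the text comes from: one burst of traffic instead of several spaced-out ones.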