Rethinking the Mobile Last Mile for Mobile Apps.

Accelerating the mobile last mile.

Maximizing Available Throughput - Over standard Web connections, TCP needs multiple round trips to discover the available bandwidth. Because mobile networks are mostly private (AT&T, Verizon, etc.) and the sending TCP stack starts with no idea of the bandwidth available on wildly differing networks, every new TCP connection pays a steep penalty before it reaches full throughput. The end result is that mobile users constantly complain about the network even while available bandwidth goes unused.
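To make that ramp-up cost concrete, here is a minimal Kotlin sketch of classic TCP slow start, where the congestion window doubles each round trip until it covers the link's bandwidth-delay product. The 50 Mbps link, 60 ms RTT, 1460-byte segments, and 10-segment initial window are illustrative assumptions, not measurements.

```kotlin
// Illustrative sketch: how many round trips classic slow start spends just
// discovering bandwidth before a new connection can use the link it is on.
// All numeric defaults below are assumptions chosen for the example.
fun roundTripsToFillPipe(
    linkMbps: Double = 50.0,
    rttMs: Double = 60.0,
    segmentBytes: Int = 1460,
    initialWindowSegments: Int = 10
): Int {
    val bdpBytes = linkMbps * 1_000_000 / 8 * (rttMs / 1000)  // bandwidth-delay product
    var cwnd = initialWindowSegments.toDouble()
    var rtts = 0
    while (cwnd * segmentBytes < bdpBytes) {                  // cwnd doubles each RTT in slow start
        cwnd *= 2
        rtts++
    }
    return rtts
}

fun main() {
    val rtts = roundTripsToFillPipe()
    // With a 60 ms RTT, each of these round trips is pure ramp-up time before full throughput.
    println("Round trips spent discovering bandwidth: $rtts (~${rtts * 60} ms of ramp-up)")
}
```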

Managing Mobile Congestion - TCP relies on packet loss as its signal of network congestion, but over mobile networks packet loss does not necessarily mean the network is congested. The result is a steep drop-off in throughput even for tiny amounts of packet loss, which is almost guaranteed to be present on mobile networks.
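The sensitivity to loss can be illustrated with the well-known Mathis et al. approximation for loss-based TCP throughput, roughly (MSS / RTT) x C / sqrt(p). The segment size, RTT, and loss rates in this Kotlin sketch are assumptions chosen only to show the shape of the curve.

```kotlin
import kotlin.math.sqrt

// Illustrative sketch: simplified Mathis et al. model of loss-based TCP throughput.
// It shows how steeply throughput falls as the loss rate p rises, even when that
// loss is radio noise rather than congestion. MSS and RTT values are assumptions.
fun lossBasedThroughputMbps(lossRate: Double, mssBytes: Int = 1460, rttMs: Double = 60.0): Double {
    val c = 1.22                                                  // constant from the Mathis model
    val bytesPerSec = (mssBytes / (rttMs / 1000)) * c / sqrt(lossRate)
    return bytesPerSec * 8 / 1_000_000
}

fun main() {
    for (p in listOf(0.0001, 0.001, 0.01)) {                      // 0.01%, 0.1%, 1% packet loss
        println("loss=${p * 100}%  ->  ~${"%.1f".format(lossBasedThroughputMbps(p))} Mbps")
    }
}
```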

Dead Zones - Dead zones are a common feature of mobile networks, but TCP was not designed to handle them well. When the user enters a dead zone, TCP has to sit through its full retransmission timeouts before it can recover.
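A rough Kotlin sketch of why recovery is slow: during a total outage, TCP keeps doubling its retransmission timeout (RTO), so the stall grows quickly even after coverage returns. The 1-second initial RTO and six retries are assumptions for illustration.

```kotlin
// Illustrative sketch: exponential RTO backoff during a dead zone. Every lost
// retransmission doubles the wait before the next attempt, so the connection
// can stay idle long after the device is back in coverage.
fun secondsStalledInDeadZone(initialRtoSec: Double = 1.0, retries: Int = 6): Double {
    var rto = initialRtoSec
    var waited = 0.0
    repeat(retries) {
        waited += rto     // wait out the full timeout before retransmitting
        rto *= 2          // back off: 1s, 2s, 4s, 8s, ...
    }
    return waited
}

fun main() {
    // 1 + 2 + 4 + 8 + 16 + 32 = 63 seconds of waiting across six retransmission attempts
    println("Worst-case stall: ${secondsStalledInDeadZone()} s")
}
```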

Stateless Protocol - HTTP was designed as a stateless protocol for serving static pages. As the web evolved to deliver full-featured applications, web applications had to resort to hacks such as cookies and header parameters to maintain a sense of a “user session”. There is no reason for native mobile apps, which are inherently stateful, to keep relying on the same hacks.
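For illustration, the kind of header plumbing this refers to looks roughly like the following Kotlin sketch using the OkHttp client: every request re-attaches a token so the stateless server can reconstruct who is calling. The header name and token source are hypothetical.

```kotlin
import okhttp3.Interceptor
import okhttp3.OkHttpClient

// Illustrative sketch (OkHttp): header-based "session" plumbing of the kind the
// paragraph above calls a hack. The header name and token are hypothetical.
fun sessionAwareClient(sessionToken: String): OkHttpClient =
    OkHttpClient.Builder()
        .addInterceptor(Interceptor { chain ->
            val withSession = chain.request().newBuilder()
                .header("X-Session-Token", sessionToken)  // re-sent on every single request
                .build()
            chain.proceed(withSession)
        })
        .build()
```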

Session Disruption (network level) - Intermittent mobile connections are a fact of life that mobile apps have to deal with. Unlike the wired Internet, many variables affect a data transfer, such as changing device location, signal strength, and interference from other networks (e.g., nearby WiFi networks). When a session is disrupted, the app has to repeat actions, slowing its response to user requests.
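Below is a minimal Kotlin sketch of the retry logic such disruptions force on apps. The attempt count and delays are arbitrary assumptions; the key point is that each retry repeats work the user has already waited for.

```kotlin
import java.io.IOException

// Illustrative sketch: a generic retry-with-backoff wrapper of the kind apps end up
// writing to survive mid-session network drops. Attempt count and delays are assumptions.
fun <T> retryOnDisruption(attempts: Int = 3, initialDelayMs: Long = 500, action: () -> T): T {
    var delayMs = initialDelayMs
    repeat(attempts - 1) {
        try {
            return action()            // e.g. re-issue the interrupted HTTP request
        } catch (e: IOException) {     // connection reset, timeout, radio handoff...
            Thread.sleep(delayMs)      // back off, then repeat the work from scratch
            delayMs *= 2
        }
    }
    return action()                    // final attempt; let any failure propagate
}
```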
Mobile Network Penalty (network level) - Mobile apps carry an added layer of latency because their traffic must cross cellular or WiFi networks before it reaches the public Internet. When that latency is added up across all the round trips between an app and its content server, and combined with mobile network conditions that can't match wired connections, the total time makes an app bog down and become unresponsive.
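A back-of-the-envelope Kotlin example of how the penalty compounds; the request count and RTT figures are assumptions chosen only to show the order of magnitude.

```kotlin
// Illustrative arithmetic: round trips multiply the mobile RTT penalty.
fun main() {
    val requestsPerScreen = 20     // assumed number of requests to render one screen
    val cellularRttMs = 120        // assumed cellular round-trip time
    val wiredRttMs = 30            // assumed wired round-trip time
    println("Cellular: ${requestsPerScreen * cellularRttMs} ms spent on round trips alone")  // 2400 ms
    println("Wired:    ${requestsPerScreen * wiredRttMs} ms for the same screen")            // 600 ms
}
```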
HTTP Sub-optimized for Mobile Networks (application level) - As a carryover from wired networks, HTTP configuration parameters such as read timeout, connect timeout, write timeout, and concurrency are left at their defaults. Mobile networks, however, are far more variable, shifting with network conditions, location, and signal strength. Without dynamically adjusting these parameters, mobile apps sometimes wait too long to act on a transaction and at other times give up too quickly, and both create a poor user experience.
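One way to express this, sketched in Kotlin with the OkHttp client, is to derive the timeouts from an observed RTT sample instead of shipping a single wired-network default. The multipliers and clamping bounds below are assumptions, not recommended values.

```kotlin
import java.util.concurrent.TimeUnit
import okhttp3.OkHttpClient

// Illustrative sketch (OkHttp): pick connect/read/write timeouts from a measured RTT
// rather than fixed wired-network defaults. A real client would keep re-sampling as
// conditions (location, signal strength, network type) change.
fun clientForObservedRtt(observedRttMs: Long): OkHttpClient {
    val connectMs = (observedRttMs * 4).coerceIn(2_000L, 15_000L)     // fail fast, but not too fast
    val readWriteMs = (observedRttMs * 10).coerceIn(5_000L, 30_000L)  // tolerate a slow but working link
    return OkHttpClient.Builder()
        .connectTimeout(connectMs, TimeUnit.MILLISECONDS)
        .readTimeout(readWriteMs, TimeUnit.MILLISECONDS)
        .writeTimeout(readWriteMs, TimeUnit.MILLISECONDS)
        .build()
}
```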
Content Caching (device level) - In many cases content is cached far from the device, adding latency and making the app less responsive.
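As an illustration of caching at the device itself, here is a minimal Kotlin sketch using OkHttp's on-disk response cache. The cache directory and 20 MB size are assumptions, and it only helps for responses the server marks as cacheable.

```kotlin
import java.io.File
import okhttp3.Cache
import okhttp3.OkHttpClient

// Illustrative sketch (OkHttp): keep a response cache on the device so repeat requests
// for cacheable content do not have to cross the radio network at all.
fun cachingClient(appCacheDir: File): OkHttpClient =
    OkHttpClient.Builder()
        .cache(Cache(File(appCacheDir, "http_cache"), 20L * 1024 * 1024))  // 20 MB on-device cache
        .build()
```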