Imagine a web page that includes a significant number of images. Currently, the browser requests the page, and as it parses the received data it makes an additional request for each of the associated images.
Suppose instead that the images and the page were "packaged" at the Server? The data doesn't even have to be compressed! It just needs to be Formally Grouped Together, so that the original data-fetch retrieves all the data at once. The browser would of course know how to tell the pieces of the group apart, and could present a fully-constructed page to the user without making any other requests to the Server. Overall, somewhat less Internet Bandwidth would be needed, since the overhead of all those extra requests and responses disappears.
As a possible example of such a data-grouping, consider the "array". Element Zero of the array might be the main web page, while Elements 1 through 14 might be the associated images. When the page is requested, the Server sends the entire array. There are enough already-existing standard array types "out there" in the computer-software world that something like this should be easy to do, both for the Server and for the browser.
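The array idea above can be sketched in a few lines. This is only an illustration under assumed details: the `pack`/`unpack` names and the choice of a JSON array (with base64 for the image bytes, since JSON cannot carry raw binary) are mine, not part of the original idea, which deliberately leaves the grouping format open.

```python
import base64
import json

def pack(page_html: str, images: dict[str, bytes]) -> str:
    """Group a page and its images into one serialized array.

    Element Zero is the main web page; subsequent elements are the
    associated images, each tagged with the filename the page uses.
    """
    group = [{"type": "html", "data": page_html}]
    for name, raw in images.items():
        group.append({
            "type": "image",
            "name": name,
            # JSON can't hold raw bytes, so base64-encode them
            "data": base64.b64encode(raw).decode("ascii"),
        })
    return json.dumps(group)

def unpack(payload: str) -> tuple[str, dict[str, bytes]]:
    """Reverse of pack(): split the array back into page + images."""
    group = json.loads(payload)
    page = group[0]["data"]
    images = {
        item["name"]: base64.b64decode(item["data"])
        for item in group[1:]
    }
    return page, images
```

A Server would run `pack` once per page; the browser would call `unpack` on the single response and render everything with no further requests.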
Note: there is no requirement here that all web-pages be Served this way. The browser only needs to recognize whether the returned data is Grouped or not, and be able either to use Grouped data, or to continue as at present, making a bunch of additional requests for portions of the overall web page when it isn't received as part of a Group.
-- Vernon, May 17 2014

This is partially done already using CSS sprite sheets - a single request for all images. There are a number of other ways too: you can embed images into the CSS or HTML directly using data URLs, but the benefit falls off with larger images, as base64 is obviously less efficient than binary data.
Minimising requests is an old problem, and most practical ways of solving it have already been implemented; it was the motivation behind frames, ajax, etc.
-- mitxela, May 17 2014

I believe Wisent already does exactly this.
-- MaxwellBuchanan, May 17 2014
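The data-URL embedding mentioned in the annotations can be sketched as follows; the `inline_image` helper is a hypothetical name for illustration, but the `data:` URL format it emits is the real scheme browsers accept.

```python
import base64

def inline_image(raw: bytes, mime: str = "image/png") -> str:
    """Return an <img> tag whose src is a data URL carrying the
    image bytes directly, so no separate request is needed."""
    b64 = base64.b64encode(raw).decode("ascii")
    return f'<img src="data:{mime};base64,{b64}">'

# Note: base64 expands the payload by roughly a third, which is
# why the benefit falls off for larger images.
tag = inline_image(b"\x89PNG\r\n")
```

The resulting tag is pasted straight into the HTML, trading one HTTP request for a ~33% larger inline payload.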