diff http extension
Make webservers transmit only the differences since the last time you viewed a website.
This could especially increase browsing speed in threaded discussions (like Slashdot and many other blog-like sites), but also on bulletin boards, news websites and collaboration platforms (like Wikipedia).
The way it is now, every time you browse a website that is generated per request, the browser has to download the whole page again, even when nothing or very little has changed, simply because it could have.
So here is my suggestion: when the webserver initially sends out a page, it puts a timestamp in there. If the browser notices that the same request is being sent out a second time, it adds the diff-http identifier and the delivered timestamp to the request. The webserver holds a cache of recent versions and their differences to the most recent one. If the version matching that older timestamp is still in the cache, the server sends out only a patch to the original HTML file instead of the whole page.
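A minimal sketch of what the server side might look like, assuming a hypothetical X-Diff-HTTP request value that echoes the delivered timestamp and unified diffs produced with Python's difflib; the cache layout, names and bounded history are illustrative, not part of the idea as posted:

```python
import difflib
import time

# Hypothetical in-memory cache: page path -> list of (timestamp, html) versions.
PAGE_CACHE = {}
MAX_VERSIONS = 20  # bound the history the server has to keep per page

def remember_version(path, html):
    """Store the freshly rendered page together with its timestamp."""
    versions = PAGE_CACHE.setdefault(path, [])
    timestamp = time.time()
    versions.append((timestamp, html))
    del versions[:-MAX_VERSIONS]          # forget versions that are too old
    return timestamp

def respond(path, new_html, client_timestamp=None):
    """Return either the full page or a patch against the client's cached copy.

    client_timestamp is the value the browser would echo back via the
    hypothetical diff-http identifier; None means an ordinary request.
    """
    old_versions = list(PAGE_CACHE.get(path, []))   # versions known before this request
    new_timestamp = remember_version(path, new_html)
    if client_timestamp is not None:
        for timestamp, old_html in old_versions:
            if timestamp == client_timestamp:
                # The client's version is still cached: send only a unified diff.
                patch = "".join(difflib.unified_diff(
                    old_html.splitlines(keepends=True),
                    new_html.splitlines(keepends=True),
                    fromfile=str(timestamp), tofile=str(new_timestamp)))
                return {"content_type": "text/x-diff",
                        "timestamp": new_timestamp, "body": patch}
    # Version not cached (or no diff requested): fall back to the whole page.
    return {"content_type": "text/html",
            "timestamp": new_timestamp, "body": new_html}
```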
Possible extensions could be:
- The webserver indicates how the different request fields relate to this feature. The ID field, for example, defines the page, whereas the session ID doesn't influence the page much and can be treated as just another timestamp. This feature has to be used with care on public computers.
- Extend the feature to all pages on a website, not just the one displaying the same content. This way, navigational elements don't have to be retransmitted. It's a trade-off between serverside CPU usage and needed bandwidth.
- If there is no such serverside implementation, it still makes for a good feature in the browser itself, if it can handle such diff files: the browser just runs the diff itself while downloading. The diff tells the browser which parts of the document it has to realign and repaint, leading to faster rendering and a less flickery display of half-loaded HTML pages on the same site, as sketched below.
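A rough illustration of that browser-side variant, assuming for simplicity that the comparison runs over lines of HTML source with Python's difflib (a real browser would more likely diff the parsed DOM):

```python
import difflib

def changed_regions(cached_html, fresh_html):
    """Compare the cached copy of a page with a freshly downloaded one and
    return the line ranges that actually differ, so only those parts of the
    document would need to be realigned and repainted."""
    old = cached_html.splitlines()
    new = fresh_html.splitlines()
    matcher = difflib.SequenceMatcher(None, old, new)
    return [(tag, j1, j2)                      # changed span in the new document
            for tag, i1, i2, j1, j2 in matcher.get_opcodes()
            if tag != "equal"]

# Only the second line differs, so only that region is reported.
print(changed_regions("<p>a</p>\n<p>old</p>", "<p>a</p>\n<p>new</p>"))
# -> [('replace', 1, 2)]
```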
http already has timestamps; assuming things are set up correctly, a page doesn't have to be downloaded again if nothing has changed.
Transferring diffs/updates of web pages sounds like a new idea though (unless you count client-side scripts that talk to proprietary servers/extensions thereof to do the same).
As I understand it, when a page is unchanged from the previous visit, the server will send a response 304 and the browser will pull the page from cache (assuming it's in cache). This idea would require the server to maintain a full history of all changes to each page so that it could 'patch' any request. If a page has changed 400 times since it was written, and 400 different browsers requested the page, they may well each require 400 different patches. The benefit to the browser seems to be small compared to the additional requirements on the server.
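For contrast with the diff idea, the existing revalidation mechanism described above looks roughly like this from the client side (a minimal sketch using Python's standard library; browsers do this internally):

```python
import urllib.request
import urllib.error

def fetch_if_changed(url, last_modified):
    """Plain HTTP revalidation: echo the stored Last-Modified value back as
    If-Modified-Since; a 304 response means nothing is re-downloaded and the
    cached copy can be reused."""
    request = urllib.request.Request(url, headers={"If-Modified-Since": last_modified})
    try:
        with urllib.request.urlopen(request) as response:
            return response.read(), response.headers.get("Last-Modified", last_modified)
    except urllib.error.HTTPError as error:
        if error.code == 304:
            return None, last_modified   # unchanged: keep using the cached copy
        raise
```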
(benjamin) You're right, that's one aspect; you could implement this as JavaScript. However, back when I was creating websites, programming in JavaScript was a pain. How little it is applied to the kind of problems my idea is meant for seems to indicate that it still is.
(angel) It's not meant to replace the browser cache. It's meant to speed up posting operations. Between when you started writing your comment and when you confirmed it, there usually aren't too many changes to a page.
Ah, I think I see. Is it analogous to the way ASP or PHP builds a page on the server by referring to a database then transmits the result to the browser? Here, it's moved down a level, so your new layer is requesting the data fields (just the ones changed from the version held in cache) from the server and re-building the page on the browser?
I don't think the "diff" is worth it - it's asking a lot from server and client. Maybe ten years ago, when data speeds were slower.
It is difficult to track source changes through a rendering process. If the flickering is annoying, you're better off rendering into an off-screen bitmap and updating only those pixels that changed.