We need to leverage client-side resources for lists of tasks.
The client needs to:
- be notified of updates to the list
- be able to re-order/filter the list (requesting an update from the server with tasks that the client does not know of/have in cache)
The problem comes on initial load or on large list updates (e.g. changing from "tasks assigned to me" to "tasks regarding x").
The fastest thing to do is get all the tasks back in a list, instead of individual (10+) requests.
But ETags will not help when I request an update to a task in the list, as it was not downloaded individually.
Is there some way of getting the browser to cache items in a list against their individual urls?
- If I navigate away and then go to a task's URL, will my JS objects survive? I suspect not.
- If so, is it possible to have a "task list load" page that inspects the history and goes back to the existing task list? I think not, for security reasons.
I'm thinking I'll just have to take the initial loading hits and individually retrieve tasks, so that later requests are fast (and take the load off the server).
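That strategy (fetching tasks individually so each URL gets its own ETag, making later requests conditional) could be sketched as a small client-side cache. Everything below is hypothetical naming, not an existing API; only the cache bookkeeping is shown, so the request itself (via XMLHttpRequest or similar) is left out:

```javascript
// Minimal sketch of an ETag-aware task cache (hypothetical names).
function createTaskCache() {
  const entries = new Map(); // url -> { etag, task }

  return {
    // Headers for a conditional GET: send If-None-Match when we
    // already hold a copy, so the server can answer 304 Not Modified.
    headersFor(url) {
      const entry = entries.get(url);
      return entry ? { 'If-None-Match': entry.etag } : {};
    },
    // Record a 200 response's body and ETag against the task's URL.
    store(url, etag, task) {
      entries.set(url, { etag, task });
    },
    // On a 304 Not Modified, reuse the cached copy.
    get(url) {
      const entry = entries.get(url);
      return entry ? entry.task : null;
    },
  };
}
```

After the initial loading hit, every re-fetch of an unchanged task costs only a 304 round trip instead of a full response body.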
In HTML5, there is the window.sessionStorage property, which stores content against a specific session within a tab/window (other windows get their own storage, even when running against the same session).
There are also localStorage and database storage options in the spec.
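As a sketch of that idea applied to your problem, tasks from a list response could be stored against their individual URLs. The helper names (saveTask, loadTask, saveTaskList) are made up for illustration; `storage` is anything with getItem/setItem, i.e. window.sessionStorage or localStorage in the browser, or a plain shim elsewhere:

```javascript
// Store one task's JSON against its URL.
function saveTask(storage, url, task) {
  storage.setItem(url, JSON.stringify(task));
}

// Return the cached task for a URL, or null on a miss.
function loadTask(storage, url) {
  const json = storage.getItem(url);
  return json === null ? null : JSON.parse(json);
}

// Store a whole list response under each task's URL, so a later
// visit to an individual task URL can hit the cache first.
function saveTaskList(storage, tasks) {
  for (const task of tasks) {
    saveTask(storage, task.url, task);
  }
}
```

This gets you the "cache list items against their individual URLs" behaviour in your own code, rather than relying on the browser's HTTP cache.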
HTML5 is still a bit far away from being implemented, but you might be able to use Google Gears for your local storage needs.
Also, I have no idea how many simultaneous clients and tasks we're talking about, how big the tasks are, or what strain such a request might put on the database, but I'd think you should be able to run smoothly without any caching; 10+ requests doesn't seem like much. If traffic is not the bottleneck, I would put caching on the server and keep the client "dumb".