For this example the basic premise is the following: we want to display, on our home page, some data that is buried in another page elsewhere on the same site. The fetch will be triggered by mousing over an anchor tag. So imagine we have a file named "network.php" which includes a page element with ID="WiFi", and we want to grab and display the data contained in that element on our home page. When the mouseover event fires, the data is loaded into a hidden container and then displayed.
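A minimal sketch of this technique, assuming jQuery is loaded on the page: jQuery's `.load()` accepts a URL followed by a selector, and inserts only the matching fragment of the fetched page. The element IDs `show-wifi` and `wifi-info` are hypothetical stand-ins for whatever your markup actually uses; only "network.php" and "WiFi" come from the example above.

```javascript
// Build the argument jQuery's .load() expects: "url selector".
// Appending a selector after the URL tells jQuery to insert only
// the matching element from the fetched page, not the whole document.
function fragmentUrl(page, selector) {
  return page + ' ' + selector;
}

// Browser-only wiring (guarded so the helper above stays testable elsewhere).
if (typeof window !== 'undefined' && typeof jQuery !== 'undefined') {
  jQuery(function ($) {
    // #show-wifi is the trigger anchor; #wifi-info is the hidden container.
    $('#show-wifi').on('mouseover', function () {
      $('#wifi-info').load(fragmentUrl('network.php', '#WiFi'), function () {
        // Reveal the container once the fragment has arrived.
        $(this).show();
      });
    });
  });
}
```

Note that `.load()` is subject to the same-origin policy, which is fine here since the data lives on the same site.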
I hope this gives you a clear idea not only of how to do page scraping but also of why it is a useful technique to know. The comments in the code should make things fairly self-explanatory, but if you still have questions, please don't hesitate to leave them in the comments.
I recently needed to keep some client-side settings persistent, so I whipped up a very basic pair of functions to set and read cookies. Cookies are very handy when you aren't working in a single-sign-on environment (where user preferences are easy to store in a database) but still need to keep some user data or settings. In this example there is a div element that can be shown or hidden, and that selection remains persistent between browser sessions.
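Here is one way the set/read pair might be sketched. The names (`buildCookie`, `readCookie`, `boxVisible`, `settings-box`) are illustrative assumptions, not taken from the post; the core parsing and serializing is kept in pure functions, with the `document.cookie` wiring guarded separately.

```javascript
// Serialize a name/value pair with an expiry `days` from now.
function buildCookie(name, value, days) {
  var expires = new Date(Date.now() + days * 24 * 60 * 60 * 1000);
  return name + '=' + encodeURIComponent(value) +
         '; expires=' + expires.toUTCString() + '; path=/';
}

// Parse one cookie out of a "k=v; k2=v2" cookie string;
// returns null if the name is not present.
function readCookie(name, cookieString) {
  var pairs = (cookieString || '').split('; ');
  for (var i = 0; i < pairs.length; i++) {
    var eq = pairs[i].indexOf('=');
    if (eq > 0 && pairs[i].slice(0, eq) === name) {
      return decodeURIComponent(pairs[i].slice(eq + 1));
    }
  }
  return null;
}

// Browser-only wiring: remember whether the toggleable div is visible.
if (typeof document !== 'undefined') {
  var setCookie = function (name, value, days) {
    document.cookie = buildCookie(name, value, days);
  };
  var getCookie = function (name) {
    return readCookie(name, document.cookie);
  };

  // On page load, restore the saved state of the hypothetical #settings-box.
  var box = document.getElementById('settings-box');
  if (box && getCookie('boxVisible') === 'no') {
    box.style.display = 'none';
  }
  // Elsewhere, the toggle handler would call e.g.
  // setCookie('boxVisible', 'no', 30) when hiding the div.
}
```

Keeping the parser and serializer as pure string functions makes them easy to test outside a browser, while the `typeof document` guard confines the DOM-dependent parts to where `document.cookie` actually exists.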