iTunes also allows you to bless locally networked computers with Home Sharing privileges, allowing unrestricted access to their music, videos, podcasts, apps, and playlists, which can be copied between computers directly within iTunes. It's a great feature for families or any multicomputer household, and can even be set up to transfer any new iTunes Store purchases between all of your computers automatically. Of course, content added to your library by means other than the iTunes Store (heaven forbid) is excluded from automatic updates, but can still be transferred manually through Home Sharing.
![](/uploads/1/2/6/3/126334734/453645235.jpg)
SiteSucker is an application that automatically downloads Web sites from the Internet. It does this by copying the site's HTML documents, images, backgrounds, movies, and other files to your local hard drive. Just enter a URL and click a button and SiteSucker can download the entire site.
![SiteSucker screenshot](http://a5.mzstatic.com/us/r30/Purple122/v4/a0/e1/92/a0e192d4-a3ba-5547-8f6b-1f22021bdf92/screen1280x800.jpeg)
SiteSucker 2.8.4
Description
SiteSucker is a Macintosh application that automatically downloads Web sites from the Internet. It does this by asynchronously copying the site's webpages, images, PDFs, style sheets, and other files to your local hard drive, duplicating the site's directory structure. By default, SiteSucker "localizes" the files it downloads, rewriting their links so you can browse the site offline, but it can also download sites without modification.
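SiteSucker's own implementation isn't public, but the "localize" step it describes — rewriting same-site links so a mirrored page points at files on disk rather than on the Web — can be sketched in a few lines. The function name `localize_url` and its exact behavior here are illustrative assumptions, not SiteSucker's actual code:

```python
from urllib.parse import urlparse
import posixpath

def localize_url(page_url: str, link_url: str) -> str:
    """Rewrite a link found on page_url for offline browsing
    (a rough sketch of 'localizing', not SiteSucker's actual logic).

    Links to other hosts are left untouched; links within the same
    site become paths relative to the page's directory, matching the
    mirrored directory structure on disk."""
    page = urlparse(page_url)
    link = urlparse(link_url)
    # External link: leave it pointing at the live Web.
    if link.netloc and link.netloc != page.netloc:
        return link_url
    # Same-site link: compute a relative path from the page's
    # directory to the linked file within the local mirror.
    page_dir = posixpath.dirname(page.path) or "/"
    target = link.path or "/"
    return posixpath.relpath(target, start=page_dir)

# A link to an image in a subdirectory becomes a relative path:
print(localize_url("http://example.com/docs/index.html",
                   "http://example.com/docs/img/logo.png"))  # img/logo.png
# A root-relative stylesheet link climbs out of the page's directory:
print(localize_url("http://example.com/docs/page.html",
                   "/css/style.css"))  # ../css/style.css
```

Downloading "without modification" would simply skip this rewriting step and save each file's bytes as fetched.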
What’s New in Version 2.8.4
Fixed a bug that could prevent a webpage from being analyzed if its specified character set is wrong.
Fixed the way URLs with port numbers are handled.
Fixed some problems localizing redirected URLs.
Reduced the time it takes to pause when analyzing.
Ensured that the download is paused before saving a document when downloading.
Saved supporting files that are already in memory when downloading using web views.
Downloaded the sitemap specified in robots.txt unless the Delete robots.txt preference is on.
Developer: Rick Cranisky