Is it possible to make a script that uses wget to download a bunch of pages, filters out any swear words, and then saves them under their original file names?
Our school has a filter, and a site I want to use is blocked, so I figured I could use my server to relay requests to it. I have something like that working now: it uses wget -c "$query_string" to fetch a page. But how would I go about filtering words out of the result?
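One common approach is to pipe the downloaded file through sed, replacing each word on a blocklist with asterisks. Here is a minimal sketch; the word list, file name, and sample content are placeholders, and the actual wget fetch is shown commented out so the filtering step can be tried locally first. Note that sed's case-insensitive `I` flag and in-place `-i` as used here are GNU sed behavior (on BSD/macOS you would need `-i ''`).

```shell
#!/bin/sh
# In the real script, you would fetch the page first, keeping its
# original file name via basename, e.g.:
#   file=$(basename "$query_string")
#   wget -c -O "$file" "$query_string"
# Here we simulate the downloaded page with a local file.
file=page.html
printf 'This badword1 page has badword2 in it.\n' > "$file"

# Replace each blocklisted word with asterisks, case-insensitively,
# editing the file in place so it keeps its original name.
for word in badword1 badword2; do
    sed -i "s/$word/****/gI" "$file"
done

cat "$file"
```

For a longer blocklist you could keep the words in a separate file and loop over it with `while read word`, or generate one combined sed script instead of invoking sed once per word.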