The web is a mish-mash of text and images. Text is easy for search engines and web crawlers to search and index because the words themselves reveal the content. Images (and other media files) don't readily reveal their content; one can only guess at it from the words nearby on the webpage or from the image's filename.
This makes it hard to screen sites for adult content: there's no way to know whether a site hosts images worth avoiding if the designer hasn't used obviously adult wording. One way to address this might be to impose regulations requiring adult images to be tagged with parental ratings.
For example, a GIF of pornstar Jenna Jameson in action might use the Keyword property of the image file to include "Jenna Jameson", "sex", and "Rated X". Browsers could then read this keyword info from the file before displaying the image and screen out the appropriate ones according to parental guidelines, whether the aversion is to sex, violence, terrorism, or whatever.
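As a rough illustration of the idea, here is a minimal sketch in Python. It is not the proposal itself, just one way a keyword field could be written and checked: I'm using PNG text chunks via the Pillow library as a stand-in for whatever metadata field a real regulation would standardize, and the file names, the "Keywords" field name, and the block list are all assumptions.

```python
# Sketch: store rating keywords in a PNG text chunk named "Keywords" (assumed
# convention), then have the "browser" read them back and decide whether to show
# the image. Pillow's PNG text-chunk support is used purely as an illustration.
from PIL import Image
from PIL.PngImagePlugin import PngInfo

BLOCKED = {"rated r", "rated x"}  # hypothetical parental-guideline setting

def tag_image(src, dst, keywords):
    """Write a comma-separated keyword list into the image's metadata."""
    img = Image.open(src)
    meta = PngInfo()
    meta.add_text("Keywords", ", ".join(keywords))
    img.save(dst, pnginfo=meta)

def read_keywords(path):
    """Read the keyword list back out of the image's metadata."""
    info = Image.open(path).info            # PNG text chunks show up here
    raw = info.get("Keywords", "")
    return {k.strip().lower() for k in raw.split(",") if k.strip()}

def should_display(path):
    """Browser-side check: hide the image if any blocked rating keyword is present."""
    return not (BLOCKED & read_keywords(path))

# Hypothetical usage:
#   tag_image("photo.png", "photo_tagged.png", ["Jenna Jameson", "sex", "Rated X"])
#   should_display("photo_tagged.png")  -> False under the default block list
```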
To keep the regulation from being too cumbersome, the only required keyword on sensitive image files would be "Rated _", where the blank is one of the common movie ratings. The other keywords would be helpful but optional. Furthermore, sites whose images would not be rated R or higher would not have to tag them at all. That way, only sites with very sensitive content need bother with the task; the rest of the web can carry on as usual.
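In code terms, that rule might look something like the sketch below. Again, this is only illustrative: the rating scale, the "Rated _" string format, and the default threshold are my assumptions about how the regulation could be phrased, and untagged images simply pass through untouched.

```python
# Sketch of the "Rated _" rule: untagged images pass by default, and only
# images rated R or above (an assumed threshold) get screened out.
RATING_ORDER = ["g", "pg", "pg-13", "r", "nc-17", "x"]

def find_rating(keywords):
    """Return the rating from a 'Rated _' keyword, or None if the image is untagged."""
    for kw in keywords:
        if kw.lower().startswith("rated "):
            return kw.lower().removeprefix("rated ").strip()
    return None

def allowed(keywords, threshold="r"):
    """Allow untagged images; block anything rated at or above the threshold."""
    rating = find_rating(keywords)
    if rating is None or rating not in RATING_ORDER:
        return True  # untagged or unrecognized: treated as ordinary content
    return RATING_ORDER.index(rating) < RATING_ORDER.index(threshold)

# allowed(["holiday", "beach"])                       -> True  (untagged)
# allowed(["Jenna Jameson", "sex", "Rated X"])        -> False (at or above R)
```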
The key to making this work is, obviously, imposing and enforcing well-defined regulations across the internet. I hesitate to suggest something like this since most people want less bureaucracy, not more, but I thought it worth a ponder.