The news that Flickr has introduced content filtering for photos on their site is frankly a bit bewildering. I'm guessing that the aim is to resolve the previous unsatisfactory situation whereby some user accounts were marked 'NIPSA' (not in public site areas, i.e. not shown in searches or other public parts of the site). However, I can't see how introducing the three content levels of 'safe', 'moderate' and 'restricted' can possibly help - they just seem like meaningless labels. They may as well have called the restricted level 'dangerous', to really confuse things.
Additionally, you can now set your own preference for searching. The default, however, is 'SafeSearch', whereby all 'moderate' and 'restricted' photos are filtered out of searches. If you don't realise that this change has been made, you might never discover that photos are being hidden from you.
The big problem for me with all this is that it's hugely unclear which photos are meant to be 'moderate' and which 'restricted'. Is it nude shots which are causing all this apparent offence? Photos of people drinking and smoking? Pics depicting violence? It's all so vague and weird.
I predict that very few people will bother to classify their own photos as anything other than 'safe'. If anything, it'll just become a tool used by Flickr staff to hide photos they find offensive or provocative.
The only exception might be if it becomes possible to search for ONLY 'moderate' or 'restricted' photos. In which case, I predict that there will be great interest in these photos, and people will start to deliberately place their photos in these categories in order to attract that interest.