Illegal to share: In Norway you cannot freely publish photos of other people, even when you are the photographer, because personal photos count as personal information under the Personal Information Act. The rule also applies in closed social groups: a group with several members is enough that you cannot share photos freely. In the case of nude photos, even displaying them on your own screen can be punishable.
Can my child be prosecuted for sending indecent images of themselves?
People who maliciously share sexually explicit pictures of former partners will face prosecution under new laws. Revenge porn — the distribution of a private sexual image of someone without their consent and with the intention of causing them distress — will be made a specific offence in the Criminal Justice and Courts Bill, which is currently going through Parliament. The fact that there are individuals who cruelly distribute intimate pictures of their former partners without their consent is almost beyond belief.
The short answer is yes. Under the Protection of Children Act and the Criminal Justice Act, taking, possessing, or sharing sexualised images of any person under the age of 18 is a criminal offence. This is true even though the age of consent for sex is 16, and unfortunately the fact that the young person either consented to the pictures being taken, or took the pictures themselves, is not a defence. In the UK it is an offence not only to take an indecent image of a minor but also to possess or distribute one. The Courts more often than not deal with these cases severely: those found guilty are made subject to notification requirements, in some cases to Sexual Harm Prevention Orders as well, and are at risk of a custodial sentence.
Apple unveiled plans to scan U.S. iPhones for images of child sexual abuse. The tool, called "neuralMatch," is designed to detect known images of child sexual abuse and will scan images before they are uploaded to iCloud. If it finds a match, the image will be reviewed by a human. If child pornography is confirmed, the user's account will be disabled and the National Center for Missing and Exploited Children notified.