Swedish government proposes to criminalise men who pay adult cammers and OnlyFans performers
14th May 2025
See article from xbiz.com
See petition from eswalliance.org
Repressive Swedish law currently criminalizes purchasing or procuring in-person sexual services but does not criminalize the sex workers who provide them. This approach is commonly referred to as the Nordic model. Under the new proposal, anyone who pays someone to perform a sexual act online, without actual physical contact, would be subject to the same criminal liability as those who hire in-person sex workers. The proposal also includes liability for procuring such services, which could lead to enforcement against fan and webcam platforms, or conceivably even against creators who collaborate.

Swedish creators have expressed concern that, under the proposed law, OnlyFans' terms of service could preclude Swedish creators from using the site -- and that the law could even criminalize their personal lives, since living with a partner or receiving financial support could now be considered pimping.

The ESWA, a sex worker-led network representing more than 100 organizations in 30 countries across Europe and Central Asia, told XBIZ that it has launched a public petition urging the Swedish government to reject the proposal. The group writes:

This proposal represents a regressive and dangerous step that threatens the human rights, privacy, safety and livelihoods of sex workers and digital creators in Sweden and beyond. Websites and platforms hosting or facilitating consensual digital sex work could be prosecuted for 'digital pimping', creating a chilling effect that may lead to mass deplatforming of sex workers.

The ESWA is urging anyone who supports Swedish sex workers to add their name to the open letter.

See petition from eswalliance.org
UK Internet censor Ofcom selects its first victims for porn censorship, scoreland.com and undress.cc
11th May 2025
See press release from ofcom.org.uk
Ofcom has opened investigations into two pornographic services - Itai Tech Ltd and Score Internet Group LLC - under our age assurance enforcement programme. Under the Online Safety Act, online services must ensure children cannot access pornographic content on their sites.

In January, we wrote to online services that display or publish their own pornographic content to explain that the requirements for them to have highly effective age checks in place to protect children had come into force. We requested details of services' plans for complying, along with an implementation timeline and a named point of contact. Encouragingly, many services confirmed that they are implementing, or have plans to implement, age assurance on around 1,300 sites. A small number of services chose to block UK users from accessing their sites rather than putting age checks in place.

Certain services failed to respond to our request and have not taken any steps to implement highly effective age assurance to protect children from pornography. We are today opening investigations into Itai Tech Ltd - a service which runs the nudification site Undress.cc - and Score Internet Group LLC, which runs the site Scoreland.com. Both sites appear to have no highly effective age assurance in place and are potentially in breach of the Online Safety Act and their duties to protect children from pornography.

Next steps

We will provide an update on both investigations on our website in due course, along with details of any further investigations launched under this enforcement programme.
Campaigner has a good whinge about sex work in her role as a United Nations Special Rapporteur
9th February 2025
See article from catholicvote.org
In a recent interview, Reem Alsalem, the United Nations (UN) Special Rapporteur on violence against women and girls, claimed that both prostitution and pornography represent serious human rights violations, not legitimate employment opportunities. The anti-sex work campaigner was discussing her 2024 Prostitution and violence against women and girls report. She said:

In my report, I demonstrated that prostitution is a system of exploitation and violence against women and girls. It is very gendered; it predominantly affects females, and it is perpetrated by males.

Alsalem is an independent campaigner, not a UN staff member. Her role entails reporting on the successes and failures of governments, businesses, militaries, and other entities in addressing violence against women and girls.

Alsalem notably refused to use the expression sex work in her report. The report said:

The term wrongly depicts prostitution as an activity as worthy and dignified as any other work. It fails to take into account the serious human rights violations that characterize the prostitution system and 'gaslights' victims and their experiences.

According to Alsalem, pornography operates in exactly the same way as prostitution and can be considered filmed prostitution. She said:

It has the same perpetrators of violence, the same exploitation, the same consequences in terms of all forms of violence inflicted on women and girls, in terms of being exploited by pimps, in terms of also having immense harmful impact on all society, including, I would say, men and boys, [and] younger girls, and harmful to gender equality overall in society.

Alsalem claimed that the vast majority of pornography is grotesque, degrading, and violent, lacking safeguards such as age verification and measures to prevent trafficking. Despite this, it is deliberately marketed to young women and girls as a lucrative and glamorous pursuit. She said:

Normalizing consuming pornography has become an issue that is an epidemic as well, of global proportions.
Ofcom initiates bounteous times for hackers, scammers, phishers and identity thieves
17th January 2025
See press release from ofcom.org.uk
Children will be prevented from encountering online pornography and protected from other types of harmful content under Ofcom's new industry guidance, which sets out how we expect sites and apps to introduce highly effective age assurance. Today's decisions are the next step in Ofcom implementing the Online Safety Act and creating a safer life online for people in the UK, particularly children. It follows tough industry standards, announced last month, to tackle illegal content online, and comes ahead of broader protection of children measures which will launch in the Spring.

Robust age checks are a cornerstone of the Online Safety Act. It requires services which allow pornography or certain other types of harmful content to introduce 'age assurance' to ensure that children are not normally able to encounter it.[1] Age assurance methods -- which include age verification, age estimation or a combination of both -- must be 'highly effective' at correctly determining whether a particular user is a child.

We have today published industry guidance on how we expect age assurance to be implemented in practice for it to be considered highly effective. Our approach is designed to be flexible, tech-neutral and future-proof. It also allows space for innovation in age assurance, which represents an important part of a wider safety tech sector where the UK is a global leader.[2] We expect the approach to be applied consistently across all parts of the online safety regime over time.

While providing strong protections to children, our approach also takes care to ensure that privacy rights are protected and that adults can still access legal pornography. As platforms take action to introduce age assurance over the next six months, adults will start to notice changes in how they access certain online services. Our evidence suggests that the vast majority of adults (80%) are broadly supportive of age assurance measures to prevent children from encountering online pornography.[3]

What are online services required to do, and by when?

The Online Safety Act divides online services into different categories with distinct routes to implement age checks. However, the action we expect all of them to take starts from today:
- Requirement to carry out a children's access assessment. All user-to-user and search services -- defined as 'Part 3' services[4] -- in scope of the Act must carry out a children's access assessment to establish if their service -- or part of their service -- is likely to be accessed by children. From today, these services have three months to complete their children's access assessments, in line with our guidance, with a final deadline of 16 April. Unless they are already using highly effective age assurance and can evidence this, we anticipate that most of these services will need to conclude that they are likely to be accessed by children within the meaning of the Act. Services that fall into this category must comply with the children's risk assessment duties and the children's safety duties.[5]
- Measures to protect children on social media and other user-to-user services. We will publish our Protection of Children Codes and children's risk assessment
guidance in April 2025. This means that services that are likely to be accessed by children will need to conduct a children's risk assessment by July 2025 -- that is, within three months. Following this, they will need to implement measures to
protect children on their services, in line with our Protection of Children Codes to address the risks of harm identified. These measures may include introducing age checks to determine which of their users are under-18 and protect them from harmful
content.
- Services that allow pornography must introduce processes to check the age of users. All services which allow pornography must have highly effective age assurance processes in place by July 2025 at the latest to protect children from encountering it. The Act imposes different deadlines on different types of providers. Services that publish their own pornographic content (defined as 'Part 5' services[6]), including certain Generative AI tools, must begin taking steps immediately to introduce robust age checks, in line with our published guidance. Services that allow user-generated pornographic content -- which fall under 'Part 3' services -- must have fully implemented age checks by July.
What does highly effective age assurance mean?

Our approach to highly effective age assurance and how we expect it to be implemented in practice applies consistently across three pieces of industry guidance, published today.[5] Our final position, in summary:
- confirms that any age-checking methods deployed by services must be technically accurate, robust, reliable and fair in order to be considered highly effective;
- sets out a non-exhaustive list of methods that we consider are capable of being
highly effective. They include: open banking, photo ID matching, facial age estimation, mobile network operator age checks, credit card checks, digital identity services and email-based age estimation;
- confirms that methods including
self-declaration of age and online payments which don't require a person to be 18 are not highly effective;
- stipulates that pornographic content must not be visible to users before, or during, the process of completing an age check. Nor should
services host or permit content that directs or encourages users to attempt to circumvent an age assurance process; and
- sets expectations that sites and apps consider the interests of all users when implementing age assurance -- affording strong
protection to children, while taking care that privacy rights are respected and adults can still access legal pornography.
We consider this approach will secure the best outcomes for the protection of children online in the early years of the Act being in force. While we have decided not to introduce numerical thresholds for highly effective age assurance at this stage (e.g. 99% accuracy), we acknowledge that numerical thresholds may complement our four criteria in the future, pending further developments in testing methodologies, industry standards, and independent research.

Opening a new enforcement programme
We expect all services to take a proactive approach to compliance and meet their respective implementation deadlines. Today Ofcom is opening an age assurance enforcement programme, focusing our attention first on Part 5 services that display or publish their own pornographic content. We will contact a range of adult services -- large and small -- to advise them of their new obligations. We will not hesitate to take action and launch investigations against services that do not engage or ultimately comply.

For too long, many online services which allow porn and other harmful material have ignored the fact that children are accessing their services. Either they don't ask or, when they do, the checks are minimal and easy to avoid. That means companies have effectively been treating all users as if they're adults, leaving children potentially exposed to porn and other types of harmful content. Today, this starts to change.

As age checks start to roll out in the coming months, adults will start to notice a difference in how they access certain online services. Services which host their own pornography must start to introduce age checks immediately, while other user-to-user services -- including social media -- which allow pornography and certain other types of content harmful to children will have to follow suit by July at the latest. We'll be monitoring the response from industry closely. Those companies that fail to meet these new requirements can expect to face enforcement action from Ofcom.

Notes
- Research shows that children are being exposed to online pornography from an early age. Of those who have seen online pornography, the average age at which they first encounter it is 13 -- although more than a quarter come across it by age 11 (27%), and one in ten as young as 9 (10%). Source: 'A lot of it is actually just abuse' -- Young people and pornography, Children's Commissioner for England.
- Research from the UK Government indicates that UK firms account for an estimated one in four (23%) of the global safety tech workforce. 28% of safety tech companies are based in the UK, according to recent research by Paladin Capital and PUBLIC.
- Source: Yonder Consulting - Adult Users' Attitudes to Age Verification on Adult Sites
- 'Part 3'
services include those that host user-generated content, such as social media, tube sites, cam sites, and fan platforms.
- Services that conclude they are not likely to be accessed by children -- including where this is because they are using
highly effective age assurance -- must record the outcome of their assessment and must repeat the children's access assessment at least annually.
- 'Part 5' services are those that publish their own pornographic content, such as studios or pay
sites, where operators control the material available.