Adult World News

2025: Jan-March


 

The Bertin Report: The Challenge of Strangling Online Pornography...

A government-commissioned report attempts to ban much or most of the online porn available in the UK


Link Here 2nd March 2025
Full story: UK Government Pornography Review...A review of censorship law
A little over a year ago Rishi Sunak's government commissioned a review to consider the censorship of online pornography in the UK. The review goes way beyond simply requiring age verification to keep out the under 18s as implemented via the Online 'Safety' Act. It considers all aspects of the censorship of online porn available in the UK.

The review is authored by Gabby Bertin, a Conservative peer nominated to the House of Lords by David Cameron. Her relevant background is in campaigning in the sphere of domestic abuse.

Inevitably the majority of 'evidence' invited for the review was from anti-porn campaigners and those who would benefit from the setting up of an internet censorship process for adult websites. Much of the language of the report directly uses feminist tropes, such as twisting the word 'violent' to cover non-violent content that offends feminist axioms.

The author of course introduces her report in the classic "I'm not a prude ...BUT..." style. She writes:

I want to be clear that I do not approach this subject from a prudish or disapproving position. I am a liberal Conservative and a proponent of free speech. I believe that people should be able to do whatever they want if it doesn't harm anyone -- and that includes safely consuming adult content that has been made by consenting adults. ...BUT... we need to strike the right balance between protecting those principles and protecting society, particularly the most vulnerable, from potential risks. And the time has now come to take a stand on this and call out what is really happening and the damage it is doing.

Of course she then goes on to outline a few Trojan horses that would inevitably lead to the banning or blocking of most adult websites available in the UK.

Here is a summary of the chapters of the report most relevant to censorship.

 

Recommendation 1: 'legal but harmful' content to be banned from publication

Bertin gets a bit tied up in contradictory language in this section. Some material is cut by the BBFC because it is illegal in the UK, eg as defined as Extreme Pornography (eg bestiality and real injury). Other material is cut by the BBFC under its own guidelines. Initially Bertin defines such content as 'legal but harmful', and then goes on to call for the publication of such material to be made a criminal offence.

The BBFC details 'legal but harmful' content as follows:

  • material (including dialogue) likely to encourage an interest in sexually abusive activity, which may include adults role-playing as non-adults

  • the portrayal of sexual activity which involves real or apparent lack of consent and any form of physical restraint which prevents participants from indicating a withdrawal of consent

  • the infliction of pain or acts which are likely to cause serious physical harm, whether real or (in a sexual context) simulated. Some allowance may be made for non-abusive, consensual activity

  • penetration by any object likely to cause physical harm

  • sexual threats, humiliation, or abuse which do not form part of a clearly consenting role-playing game

And Bertin adds her own additions to the list:

  • content that shows racism or could encourage racist attitudes

  • content where a performer or creator has withdrawn their consent to being in a film

  • stolen content that has been shared without the performer’s knowledge or consent

Bertin goes on to suggest two options for implementing the above censorship:

  1. Let Ofcom define a 'Safe Pornography Code of Practice' that defines the content to be banned and also the process to be implemented to censor such content.
  2. Implement the ban via a new criminal publication offence, with detailed censorship rules written by the Crown Prosecution Service.

Bertin also suggests two extra censorship ideas under this section:

  • keyword matching on website searches would mean that terms including girl, young, rape, drunk etc. would result in a warning message (see the sketch after this list).
  • paid-for porn services would require daily, weekly, or monthly spend limits on content, much like gambling services.
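
For illustration only, here is a minimal sketch of the kind of search-term screening being suggested. The flagged terms are the report's own examples, but the warning text and the matching logic are my assumptions rather than anything specified in the report.

```python
# A minimal, illustrative sketch of keyword screening on porn site searches.
# The flagged terms come from the report's examples; the warning wording and
# the matching rules are assumptions, not a specification.

FLAGGED_TERMS = {"girl", "young", "rape", "drunk"}

WARNING_MESSAGE = (
    "Warning: this search may return content that depicts non-consensual "
    "or otherwise prohibited acts."
)

def screen_search_query(query: str) -> str | None:
    """Return a warning message if the query contains a flagged term, else None."""
    words = {word.strip(".,!?").lower() for word in query.split()}
    if words & FLAGGED_TERMS:
        return WARNING_MESSAGE
    return None

# Example: the first query triggers the warning, the second does not.
print(screen_search_query("drunk party girl"))
print(screen_search_query("vintage glamour"))
```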

 

Recommendations 2 & 25: Content to be made illegal to possess, distribute, and publish (by adding to the definition of already banned extreme porn)

  1. So-called choking content, where there is external pressure on the neck, is rife on platforms that host pornography and is a very popular category of content. People acting it out in their sex lives may face devastating consequences. Evidence shows that even a small amount of pressure to the neck can harm the brain, and there is no safe way to strangle a person.

  2. Pornographic content that depicts incest should be made illegal. While it is currently a criminal offence to have penetrative sexual activity with a family member (both blood-related and adopted), it is not illegal to act out depictions of incest in pornography. It should be noted that this would not include pornography that depicts sex between step-relations -- this is not illegal activity in the real world; however, this content is rife on mainstream platforms.

 

Recommendation 3: The non-consensual taking and making of intimate images - whether real or deepfake - should be made an offence.

Whilst it is uncontroversial to ban real non-consensual images, it seems a little unjust to bundle this up with deepfakes. It should be noted that large amounts of celebrity deepfake material are already commonplace on adult websites, and AI tools are already available offline that can be used to create one's own porn without the keyword restrictions imposed by the gatekeepers of online services such as Gemini and ChatGPT.

 

Recommendations 9 & 10. A separate body should conduct content audits

This body will ensure platforms hosting pornographic content are tackling illegal and prohibited content effectively.

To this end, government could appoint a body, such as the BBFC, who have experience in moderating content, to audit content from platforms hosting pornography to ensure they are tackling prohibited and illegal content. This body could expedite reports to Ofcom where there is evidence of lack of compliance (Ofcom could then deploy enforcement measures if there was non-compliance). Government would need to look to additional funding for training and/or additional resource for this body.

Companies that pass the audit could receive an accreditation of good practice. This would signal to the public, government, and ancillary services that this company is well-regulated, acting as an incentive to platforms themselves to raise their own standards.

This would also serve a public awareness angle. I have found that there is currently little public awareness about what good pornographic content looks like, or what a good platform is. This accreditation would be a way for the public to clearly know if the service they are viewing pornography on meets the accreditation or not, allowing for more informed choices.

Of course the accreditation could also serve to warn discerning porn users that the website only serves highly censored porn and may be best avoided.

 

Recommendation 11. Restricted porn content to be made harder to find, so that it is only available to users who intentionally seek it out.

  • Incest pornography between step-relations

  • Teen 18+ category pornography

This type of content should not be served up on a homepage to a first-time user. Industry should collaborate on a watch-list of types of content in this space, restricting and down-ranking this content so that it is not available on homepages to first-time users. Government could decide to regulate this content or conduct further research on harms if there is later proof of harm or inaction in this space.

 

Recommendation 12. Increased, effective, and quick business disruption measures across the ecosystem of pornography.

These disruption measures -- including measures aimed at the ancillary services that support the platforms -- should be in place to ensure swift removal of illegal and legal but harmful pornographic content. A clear and enforceable sanctions framework, under the Online Safety Act, should also be established.

 

Recommendation 14. The Advertising Standards Authority (ASA) should review its approach to advertising on online pornography sites.

As detailed in Chapter 1, there is limited regulatory oversight of advertisements appearing on pornography sites. I recommend that the ASA critically reviews its approach to regulating the content of advertising on online pornography sites in the UK. This could lead to further oversight to ensure platforms are fully abiding by the code and not featuring advertisements that promote any content that would be captured under the prohibited list in Recommendation 1.

Should a platform not abide by the code, or push harmful content to viewers, strict enforcement measures should be more consistently applied.

 

Recommendation 21. Performer consent and age verification

Companies that host pornographic content should have consistent safety protocols, processes and safeguards in place to ensure that all performers/creators are consenting adults, are of age (18+), and have not been exploited or coerced into creating content.

 

Recommendation 22. Performers to be able to get their videos taken down regardless of contractual obligations

There should be clear and standardised processes across the sector to enable performers and creators to withdraw consent and to have content they appear in removed from sites. Even if a performer or creator has provided consent for the initial recording and sharing of pornographic content, they should have every right to withdraw consent at a later point (whatever the reason may be) and have that content removed.

Withdrawing consent, and therefore the content in which one appears, is not a decision that performers take lightly. There are costs associated with the creation of content, and its removal may mean that content no longer generates income, which could result in cutting off a segment of income to a performer, their co-star(s), producer or director. In some cases, if a user has purchased the content and may retain it offline, or subscribed to receive the performer’s work, this makes it more difficult for a performer’s content to be deleted completely. I acknowledge that there are obstacles to overcome due to contract law and cost recovery of film production. Nonetheless, I believe consent must supersede any other consideration. If a performer or ex-performer wants a video removed, I believe their request should be granted.

 

Recommendation 23. Stolen content

Platforms that host pornographic content should have robust protocols and processes to prevent and respond to stolen content. This should include easy reporting and removal of content stolen from performers.

 

Recommendation 29. Nudification or nudify apps should be banned.

The government should strongly consider banning apps that have been developed for users to nudify themselves or others. Alternatively, government could explore banning these apps at a device level so that users in the UK are unable to download them on their smartphone, laptops, and other devices.

 

I don't suppose that there are many porn websites that would survive an audit against the above rules. But of course Bertin never thinks to ask about the practical consequences of such an industry-wide ban as suggested above.

I guess that there is enough porn already in circulation that could, as a last resort, be passed around on memory sticks to keep everyone satisfied for the next 100 years. If not, then I am sure there will be a few websites around the world that will keep going. And if all else fails then perhaps the county lines gangs can branch out into selling contraband porn.

I think porn is now simply too commonplace and too widely accepted by society to effectively ban.

 

 

Porn epidemic...

Campaigner has a good whinge about sex work in her role as a United Nations Special Rapporteur


Link Here 9th February 2025
In a recent interview, Reem Alsalem, the United Nations (UN) Special Rapporteur on violence against women and girls, claimed that both prostitution and pornography represent serious human rights violations, not legitimate employment opportunities.

The anti-sex work campaigner was discussing her 2024 Prostitution and violence against women and girls report. She said:

In my report, I demonstrated that prostitution is a system of exploitation and violence against women and girls. It is very gendered; it predominantly affects females, and it is perpetrated by males.

Alsalem is an independent campaigner, not a UN staff member. Her role entails reporting on the successes and failures of governments, businesses, militaries, and other entities in addressing violence against women and girls.

Alsalem notably refused to use the expression sex work in her report. The report said:

The term wrongly depicts prostitution as an activity as worthy and dignified as any other work. It fails to take into account the serious human rights violations that characterize the prostitution system and 'gaslights' victims and their experiences.

According to Alsalem, pornography operates in exactly the same way as prostitution and should be considered filmed prostitution. She said:

It has the same perpetrators of violence, the same exploitation, the same consequences in terms of all forms of violence inflicted on women and girls, in terms of being exploited by pimps, in terms of also having immense harmful impact on all society, including, I would say, men and boys, [and] younger girls, and harmful to gender equality overall in society.

Alsalem claimed that the vast majority of pornography is grotesque, degrading, and violent, lacking safeguards such as age verification and measures to prevent trafficking. Despite this, it is deliberately marketed to young women and girls as a lucrative and glamorous pursuit. She said:

Normalizing consuming pornography has become an issue that is an epidemic as well, on global proportions.

 

 

Age/ID verification to be required on porn sites by July 2025...

Ofcom initiates bounteous times for hackers, scammers, phishers and identity thieves


Link Here 17th January 2025
Ofcom writes:

Children will be prevented from encountering online pornography and protected from other types of harmful content under Ofcom's new industry guidance which sets out how we expect sites and apps to introduce highly effective age assurance.

Today's decisions are the next step in Ofcom implementing the Online Safety Act and creating a safer life online for people in the UK, particularly children. It follows tough industry standards, announced last month, to tackle illegal content online, and comes ahead of broader protection of children measures which will launch in the Spring.

Robust age checks are a cornerstone of the Online Safety Act. It requires services which allow pornography or certain other types of harmful content to introduce 'age assurance' to ensure that children are not normally able to encounter it.[1] Age assurance methods -- which include age verification, age estimation or a combination of both -- must be 'highly effective' at correctly determining whether a particular user is a child.

We have today published industry guidance on how we expect age assurance to be implemented in practice for it to be considered highly effective. Our approach is designed to be flexible, tech-neutral and future-proof. It also allows space for innovation in age assurance, which represents an important part of a wider safety tech sector where the UK is a global leader[2]. We expect the approach to be applied consistently across all parts of the online safety regime over time.

While providing strong protections to children, our approach also takes care to ensure that privacy rights are protected and that adults can still access legal pornography. As platforms take action to introduce age assurance over the next six months, adults will start to notice changes in how they access certain online services. Our evidence suggests that the vast majority of adults (80%) are broadly supportive of age assurance measures to prevent children from encountering online pornography.[3]

What are online services required to do, and by when?

The Online Safety Act divides online services into different categories with distinct routes to implement age checks. However, the action we expect all of them to take starts from today:

  • Requirement to carry out a children's access assessment. All user-to-user and search services -- defined as 'Part 3' services[4] -- in scope of the Act must carry out a children's access assessment to establish if their service -- or part of their service -- is likely to be accessed by children. From today, these services have three months to complete their children's access assessments, in line with our guidance, with a final deadline of 16 April. Unless they are already using highly effective age assurance and can evidence this, we anticipate that most of these services will need to conclude that they are likely to be accessed by children within the meaning of the Act. Services that fall into this category must comply with the children's risk assessment duties and the children's safety duties.[5]
  • Measures to protect children on social media and other user-to-user services. We will publish our Protection of Children Codes and children's risk assessment guidance in April 2025. This means that services that are likely to be accessed by children will need to conduct a children's risk assessment by July 2025 -- that is, within three months. Following this, they will need to implement measures to protect children on their services, in line with our Protection of Children Codes to address the risks of harm identified. These measures may include introducing age checks to determine which of their users are under-18 and protect them from harmful content.
  • Services that allow pornography must introduce processes to check the age of users: all services which allow pornography must have highly effective age assurance processes in place by July 2025 at the latest to protect children from encountering it. The Act imposes different deadlines on different types of providers. Services that publish their own pornographic content (defined as 'Part 5' services[6]), including certain Generative AI tools, must begin taking steps immediately to introduce robust age checks, in line with our published guidance. Services that allow user-generated pornographic content -- which fall under 'Part 3' services -- must have fully implemented age checks by July.
What does highly effective age assurance mean?

Our approach to highly effective age assurance and how we expect it to be implemented in practice applies consistently across three pieces of industry guidance, published today[5]. Our final position, in summary:

  • confirms that any age-checking methods deployed by services must be technically accurate, robust, reliable and fair in order to be considered highly effective;
  • sets out a non-exhaustive list of methods that we consider are capable of being highly effective. They include: open banking, photo ID matching, facial age estimation, mobile network operator age checks, credit card checks, digital identity services and email-based age estimation;
  • confirms that methods including self-declaration of age and online payments which don't require a person to be 18 are not highly effective;
  • stipulates that pornographic content must not be visible to users before, or during, the process of completing an age check. Nor should services host or permit content that directs or encourages users to attempt to circumvent an age assurance process; and
  • sets expectations that sites and apps consider the interests of all users when implementing age assurance -- affording strong protection to children, while taking care that privacy rights are respected and adults can still access legal pornography.

We consider this approach will secure the best outcomes for the protection of children online in the early years of the Act being in force. While we have decided not to introduce numerical thresholds for highly effective age assurance at this stage (e.g. 99% accuracy), we acknowledge that numerical thresholds may complement our four criteria in the future, pending further developments in testing methodologies, industry standards, and independent research.

Opening a new enforcement programme

We expect all services to take a proactive approach to compliance and meet their respective implementation deadlines. Today Ofcom is opening an age assurance enforcement programme, focusing our attention first on Part 5 services that display or publish their own pornographic content.

We will contact a range of adult services -- large and small -- to advise them of their new obligations. We will not hesitate to take action and launch investigations against services that do not engage or ultimately comply.

For too long, many online services which allow porn and other harmful material have ignored the fact that children are accessing their services. Either they don't ask or, when they do, the checks are minimal and easy to avoid. That means companies have effectively been treating all users as if they're adults, leaving children potentially exposed to porn and other types of harmful content. Today, this starts to change.

As age checks start to roll out in the coming months, adults will start to notice a difference in how they access certain online services. Services which host their own pornography must start to introduce age checks immediately, while other user-to-user services -- including social media - which allow pornography and certain other types of content harmful to children will have to follow suit by July at the latest.

We'll be monitoring the response from industry closely. Those companies that fail to meet these new requirements can expect to face enforcement action from Ofcom.

Notes
  • Research shows that children are being exposed to online pornography from an early age. Of those who have seen online pornography, the average age at which they first encounter it is 13 -- although more than a quarter come across it by age 11 (27%), and one in ten as young as 9 (10%). Source: 'A lot of it is actually just abuse' -- Young people and pornography, Children's Commissioner for England
  • Research from the UK Government indicates that UK firms account for an estimated one-in-four (23%) of the global safety tech workforce. 28% of safety tech companies are based in the UK according to recent research by Paladin Capital and PUBLIC.
  • Source: Yonder Consulting - Adult Users' Attitudes to Age Verification on Adult Sites
  • 'Part 3' services include those that host user-generated content, such as social media, tube sites, cam sites, and fan platforms.
  • Services that conclude they are not likely to be accessed by children -- including where this is because they are using highly effective age assurance -- must record the outcome of their assessment and must repeat the children's access assessment at least annually.
  • 'Part 5' services are those that publish their own pornographic content, such as studios or pay sites, where operators control the material available.

 

 

The age of censorship...

Kansas takes legal action against 13 adult websites that have not complied with the state's recent age/ID verification law


Link Here 15th January 2025
Full story: Age Verification in USA...Requiring age verification for porn and social media

The Kansas state attorney general, Kris Kobach, has taken legal action against 13 porn websites that have not implemented the required age/ID verification for their users. A press release explains:

Kansas Attorney General Kris Kobach today announced his office has filed a lawsuit against SARJ LLC, the operator of 13 adult websites. The Kansas Attorney General's Office filed the suit in Shawnee County District Court.

Since July 1, 2024, Kansas law has required that adult websites verify the age of their users. SARJ LLC's websites distribute erotic films, photography, and live streaming platforms without verifying the age of users. The lawsuit marks the first such suit under the 2024 law. Kobach said:

Protecting our children against the harmful effects of pornography is a high priority for all Kansans. This law is making a difference. When the Kansas Legislature passes a law, I will enforce it faithfully to the letter of the law. That is what the people of Kansas elected me to do.
Under Kansas law, SARJ LLC's practices are subject to civil penalties of up to $10,000 per violation per day.

The 13 websites listed are:

metartnetwork.com;
metart.com;
metartx.com;
sexart.com;
vivthomas.com;
thelifeerotic.com;
eroticbeauty.com;
lovehairy.com;
domai.com;
goddessnudes.com;
rylskyart.com;
stunning18.com; and
straplez.com

 

 

So how will age verification for porn pan out?...

Age verification is imminently coming to France and the first site to implement it loses 95% of viewers in the process


Link Here 10th January 2025
Full story: Age Verification in France...Macron gives websites 6 months to introduce age verification
Porn websites in France must verify users' ages or face being blocked within days under new rules which come into force after a years-long battle between operators and state censors.

Among the new requirements is to offer at least one double-blind option for users to prove their age without revealing their identity to the porn site. However the verifying company will surely be able to maintain a log of a user's porn website history.

Sites already offering verification using a credit card have a grace period until 11 April 2025 to put their double-blind checks in place. These entail the user uploading an identity document to the verifying company, which then confirms to the porn provider that the user is old enough to visit the site, without revealing the user's identity, at least to the porn site.
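
By way of illustration, here is a rough sketch of that double-blind handshake, under the assumption that the verifier hands the user an opaque one-time token to present to the porn site. The class names and token scheme are hypothetical, not how any particular French provider actually works.

```python
# Illustrative sketch of a double-blind age check. The verifier sees the ID
# document but never tells the site who the user is; the site only ever sees
# an opaque token. Because the site asks the verifier to confirm the token,
# the verifier could still log which sites a user visits -- the caveat raised
# above.

import secrets

class AgeVerifier:
    """Stands in for the third-party verification company."""

    def __init__(self) -> None:
        self._valid_tokens: set[str] = set()

    def check_identity_document(self, birth_year: int, current_year: int = 2025) -> str | None:
        """Inspect the ID document (reduced here to a birth year). If the user
        is 18 or over, return a one-time token; the document itself is never
        forwarded to the porn site."""
        if current_year - birth_year >= 18:
            token = secrets.token_urlsafe(16)
            self._valid_tokens.add(token)
            return token
        return None

    def confirm(self, token: str) -> bool:
        """Called by the porn site: confirms the token is genuine and unused,
        without learning who the user is."""
        if token in self._valid_tokens:
            self._valid_tokens.remove(token)  # one-time use
            return True
        return False

class PornSite:
    """Stands in for the porn provider: it only ever sees the token."""

    def __init__(self, verifier: AgeVerifier) -> None:
        self.verifier = verifier

    def grant_access(self, token: str) -> bool:
        return self.verifier.confirm(token)

# Usage: the user verifies with the verifier, then presents the token to the site.
verifier = AgeVerifier()
site = PornSite(verifier)
token = verifier.check_identity_document(birth_year=1990)
print(site.grant_access(token))  # True: age confirmed, identity not disclosed to the site
```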

This niche is being targeted by small firms selling verification services to the big platforms, with several start-ups already in the market. However age verification seems to be a natural monopoly, where users will only want to verify once, with one company, for all their porn viewing. No doubt something like 'verified by Google' will become the norm, and the US giants will take over.

Internet censor Arcom's regulation has been a real boost to the still-emerging sector, said Jacky Lamraoui, head of French startup IDxLab. The firm's Anonymage service is already being used by around 20 sites, all of them adult platforms. Among them is French porn site Tukif.porn, which turned to IDxLab and other verification providers after a court ordered it blocked in October.

Tukif manager Jerome, who declined to give his last name, commented that Tukif was the only free French porn site currently verifying users' ages. He complained that age verification was costing his site one or two (euro) cents per visitor. He added that age verification was also turning some users away from centralised porn sites to less regulated social media platforms such as X or Reddit, which do not have to verify ages. He said:

Since November, less than five percent of users arriving at the verification system come out verified on the other side. It's killed traffic to our site.

For the coming months, competitors from within the European Union enjoy an advantage, as the age check rules only apply to French and non-EU adult services. Arcom is still putting in place procedures for notifying other governments that sites based in their countries are not fulfilling French law before blocking them altogether. EU adult sites will however be expected to comply with the French law in the future.

Aylo, the parent company of major porn sites Pornhub and Brazzers with an office in Cyprus, told AFP in December:

The French rules would likely prove ineffective and dangerous for users' security and privacy. Beyond diverting users to other platforms, France's rules could add to the growing demand worldwide for virtual private network (VPN) services.

A VPN creates a tunnel between the source and the destination of internet traffic. Using one can prevent intermediaries such as internet service providers (ISPs) from seeing the content of internet traffic, as well as allowing users to change their IP address, browsing as if they are in another location or country. Worldwide, around 28% of internet users aged 16-64 were using a VPN in 2023, according to specialist analytics site DataReportal.

 

 

Florida sets itself up as a VPN hub...

Reports of an upsurge in VPN usage in response to a new internet censorship law mandating age verification for porn


Link Here 6th January 2025
A VPN company has reported a massive rise in VPN demand on 1st January 2025, when a new Florida censorship law requiring age/ID verification for access to porn came into force. VPN-pushing vpnMentor documented a rather incredible 1150% spike in Floridians wanting to use a VPN to hide their location.

The major porn website Pornhub decided to self-ban access from any IP address based in Florida. So even those viewers willing to stupidly hand over ID data to a porn site would be blocked, leaving a VPN as the main way of continuing to access Pornhub.

A vpnMentor spokesperson explained to the tech news site The Register:

To measure the impact of VPN demand the research team compiles data from a variety of sources. The team uses internal tools to assess changes in terms of search volume, web traffic, and clicks related to VPN services in general. We work with different metrics which we analyze, and we evaluate the searches or impressions that transform into downloads.

In March, Florida Governor Ron DeSantis signed the Online Protection for Minors act, aka House Bill 3, into law. The legislation requires websites to verify visitors' ages, and those hosting a substantial portion of material harmful to minors, such as Pornhub, to block access to anyone under 18 in an effort to prevent kids and teens from peeping at any pornographic videos. HB3 allows fines of up to $50,000 for websites that don't comply with the regulations.

And so in response, Pornhub's parent company Aylo decided to yank the site from Florida users as it had already done in other states with similar laws, including Kentucky, Indiana, Idaho, Kansas, Nebraska, Texas, North Carolina, Montana, Mississippi, Virginia, Arkansas, and Utah. Pornhub explained:

Unfortunately, the way many jurisdictions worldwide, including Florida, have chosen to implement age verification is ineffective, haphazard, and dangerous.

Any regulations that require hundreds of thousands of adult sites to collect significant amounts of highly sensitive personal information is putting user safety in jeopardy. Moreover, as experience has demonstrated, unless properly enforced, users will simply access non-compliant sites or find other methods of evading these laws.

