Could Children’s Use of Social Media be Banned in the UK?

Some argue that the primary goal of social media is no longer genuine connection, but the maximisation of user engagement for commercial gain. Platforms generate vast revenues by delivering highly targeted, personalised advertising, incentivising designs that keep users scrolling for longer. With the rise of AI, this content stream has become even more relentless, often amplified by manipulative or overly flattering language that encourages continuous interaction. 

Unsurprisingly, many parents are concerned about their children’s use of social media. Endless scrolling and exposure to videos featuring mindless pranks or viral challenges can have negative effects on both mental and physical health. Increasingly, attention is turning to the platforms themselves: critics suggest that their design may not only encourage excessive use, but also contribute to addiction, anxiety and other forms of harm. 

The US Court Case  

On 25th March 2026, a jury in Los Angeles delivered a damning verdict on two of the world’s most popular social media platforms. It ruled that Instagram and YouTube were deliberately designed to be addictive and that, consequently, their parent companies had been negligent in failing to safeguard their child users. Meta and Google, owners of Instagram and YouTube, must now pay $6m (£4.5m) in damages to “Kaley”, the young woman who was the plaintiff (claimant) in the case. Her lawyers argued that the design of Instagram and YouTube caused her to become addicted to the platforms. This addiction damaged her mental health during childhood, leaving her with body dysmorphia, depression and suicidal thoughts. 

The judgement has sent shockwaves through tech companies worldwide, not just in Silicon Valley. One tech company insider, who asked not to be identified, told the BBC, “we’re having a moment”. Even the Royal Family chimed in. In a statement, the Duke and Duchess of Sussex said: “This verdict is a reckoning. For too long, families have paid the price for platforms built with total disregard for the children they reach.”   

Both companies vigorously defended the claim and intend to appeal the judgement. Meta maintains that a single platform cannot be solely responsible for a user’s mental health crisis. Google, meanwhile, argues that YouTube is not a social network. 

English Law 

Could such a claim succeed in this country? The tort of negligence offers the best hope for claimants who allege harm from social media use, provided the elements of the tort (duty of care, breach, causation and damage) can be satisfied. There is growing recognition in UK law that online platforms may owe a duty of care to users, particularly where those users are children, and the harms of overuse of social media are well documented. However, causation is likely to be the most difficult hurdle for claimants in the UK. To succeed, a claimant must prove that a platform’s design caused, or materially contributed to, the harm they suffered through their use of social media. This is difficult because psychological harm rarely has a single identifiable cause. Social media companies are likely to argue that their platforms are only one of many factors that can affect an individual’s mental health, alongside family environment, school experiences, pre-existing vulnerabilities and offline relationships, to name a few. 

Could social media platforms be treated as “defective products” under the Consumer Protection Act 1987 (CPA), which carries strict liability for harm? Under the CPA, products are traditionally understood as tangible goods, not the likes of YouTube and Instagram. It is arguable, though, that social media platforms are not mere intermediaries but “manufacturers” of digital environments, making them liable for defects in their algorithms or addictive design. The Law Commission is currently reviewing the CPA to determine whether it is fit for the digital age, with a focus on artificial intelligence, software and online platforms. The review, which began in September 2025, may lead to expanded liability for online platforms and software providers. 

It is worth noting that the US case was decided by a jury. In the UK, civil cases, particularly those involving negligence, are decided by judges. Juries may be influenced by emotional arguments, whereas judges are trained to apply the law strictly and are less susceptible to being swayed by emotion at the expense of legal principle. 

Despite the issues around causation, a legal action in negligence is probably the best option for aggrieved social media users in the UK, although the lack of Legal Aid and the UK courts’ restrictive approach to class actions mean a test case would require significant upfront funding. Perhaps insurers, emboldened by the US judgement, will now be more willing to cover the costs of such a test case. 

Regulating Social Media 

Unlike the US, the UK has moved toward statutory regulation rather than litigation as the primary means of controlling social media harms. 

Since the passage of the Online Safety Act 2023 (OSA), social media companies and search engines have had a duty to ensure their services are not used for illegal activity or to promote illegal content, with particular protections for children. The communications regulator, Ofcom, has been tasked with implementing the OSA and can fine infringing companies up to £18 million or 10% of their global revenue (whichever is greater). Last month, it published guidance on how platforms must protect children. Furthermore, since platforms process users’ personal data, they must comply with the UK GDPR. The Data (Use and Access) Act 2025, which mainly came into force in February, explicitly requires those who provide an online service that is likely to be used by children to take children’s needs into account when deciding how to use their personal data. 
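The “greater of” fine cap described above can be illustrated with a trivial calculation. This is a hypothetical sketch for clarity only, not a statement of how Ofcom actually calculates penalties:

```python
def max_osa_fine(global_revenue_gbp: float) -> float:
    """Illustrative only: the OSA cap is the greater of
    £18 million or 10% of global revenue."""
    return max(18_000_000, 0.10 * global_revenue_gbp)

# A platform with £1bn global revenue faces a cap of £100m,
# while a smaller platform with £50m revenue is still capped at £18m.
print(max_osa_fine(1_000_000_000))  # 100000000.0
print(max_osa_fine(50_000_000))     # 18000000
```

The point is simply that for large platforms the revenue-based limb dwarfs the fixed £18m figure.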

Even before the US judgement, many countries had been considering whether to regulate social media further and/or ban children from using it altogether. Australia has already imposed a ban, and others, like France and Denmark, have introduced or are planning tighter rules. 

The UK government is currently carrying out a consultation to consider whether additional measures are required to keep children safe in the online world. The consultation covers: 

  • setting a minimum age for children to access social media; 
  • restricting risky functionalities and design features that encourage excessive use, such as infinite scrolling and autoplay; 
  • whether the digital age of consent should be raised; 
  • whether the guidance on the use of mobile phones in schools should be put on a statutory footing; and 
  • better support for parents, including clearer guidance and simpler parental controls. 

The consultation ends on 26th May, and the government will respond before the end of July. Alongside the consultation, the government is running a pilot scheme in which 300 teenagers will have their social media apps disabled entirely, blocked overnight or capped at one hour’s use, with some seeing no changes at all, so that their experiences can be compared. Children and parents involved in the pilot will be interviewed before and after to assess its impact. 

Meanwhile, on 27th March 2026, the government published national guidance that urges parents to strictly limit screen exposure in early years over health and development risks. The new recommendations advise that there should be no screen exposure for children under two except for shared activities. For those aged two to five, usage should be capped at one hour per day, with additional guidance to avoid screens at mealtimes and before bed. 

Parliament is also debating children’s use of social media platforms but remains divided on what action to take. In March, during a debate on the Children’s Wellbeing and Schools Bill, the House of Lords supported a proposal to ban under-16s in the UK from social media platforms. It is the second time peers have defeated the government over the proposal, and there is now a standoff between the Commons and the Lords. Whatever happens, the verdict in the California court has signalled rising public expectations of more aggressive regulation of social media platforms. 

Listen to the Guardians of Data Podcast for the latest news and views on data protection, cyber security, AI and freedom of information.   

This and other developments relating to children’s data will be covered in our forthcoming workshop, Working with Children’s Data.

New Podcast: Filming the Public for Social Media

Act Now is pleased to bring you episode 6 of the Guardians of Data podcast.  

Think about the last time you walked down a busy street, sat in a pub, or queued for a train. Now imagine that moment, completely ordinary to you, being filmed by a stranger, uploaded to TikTok or YouTube and watched by millions. 
Maybe it’s monetised; maybe it’s mocked. One thing is for sure though, it never disappears. 

Filming people in public has now become second nature for some. But what happens when those images are shared, edited and turned into social media content? Can you stop someone filming you in public? What rights do you have when the footage is published? 

In this episode, we are joined by Naomi Mathews, a lawyer who specialises in Data Protection, Freedom of Information and Surveillance Law. Naomi helps us explore what the law actually says about filming people in public; where it falls short and how that affects real people who find themselves turned into content without consent. We’ll also ask the harder questions about ethics, power and whether the UK needs a new law to better protect the public. 

Download and listen here, or on your preferred podcast app. Available on Apple Podcasts, Spotify, and all major podcast platforms. 

Previous episodes of the Guardians of Data podcast have featured Jon Baines, reflecting on his career as a Data Protection Specialist and the hot issues in information governance, Lynn Wyeth, discussing the recent controversy around Grok AI, Maurice Frankel, looking back at 20 years of the Freedom of Information Act, Olu Odeniyi, analysing recent cyber breaches and the lessons to learn, and Raz Edwards, talking about how to succeed as an IG leader.

Data Protection Complaints Procedure Deadline Approaching

A new section 164A has been inserted into the Data Protection Act 2018 (DPA) by the Data (Use and Access) Act 2025 (DUA Act). 

From 19th June 2026, Data Controllers will be required to have a complaints procedure to handle data protection complaints. They must also: 

  • acknowledge receipt of complaints within 30 days of receiving them; 
  • without undue delay, take appropriate steps to respond to complaints, including making appropriate enquiries, and keep Data Subjects informed; and 
  • without undue delay, tell Data Subjects the outcome of their complaints. 
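The fixed deadline in the list above is the 30-day acknowledgment window. As a minimal sketch, assuming "30 days" means 30 calendar days from receipt, the latest acknowledgment date can be computed as follows (illustrative only, not legal advice):

```python
from datetime import date, timedelta

def acknowledgment_deadline(received: date) -> date:
    # Section 164A DPA 2018: acknowledge receipt of a data protection
    # complaint within 30 days of receiving it (calendar days assumed here).
    return received + timedelta(days=30)

# A complaint received on the commencement date itself:
print(acknowledgment_deadline(date(2026, 6, 19)))  # 2026-07-19
```

Note there is no equivalent fixed deadline for the substantive response; as discussed below, the Act only requires that it be given without undue delay.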

Under the DPA, individuals are entitled to raise complaints where they believe there has been a breach of the UK GDPR e.g. not responding to a subject access request. This extends to any alleged non-compliance involving an individual’s personal data. The key requirement is that the issue must relate to the individual bringing the complaint. In other words, there needs to be a direct connection between the person and the alleged infringement. For example, if a complaint concerns deficiencies in a privacy notice, the individual will need to demonstrate how those shortcomings affect their own personal data, rather than simply pointing to general non-compliance. 

There is no prescribed format for handling complaints, and organisations have discretion in designing their processes. The essential requirement is that individuals must have a clear way to submit a complaint, and that complaints are acknowledged and responded to. Data Controllers may wish to build on existing complaint-handling frameworks that are already in place and functioning effectively, for example, an FOI complaints procedure. 

Notably, the legislation does not impose strict deadlines for issuing a final response. As long as responses are provided within a reasonable timeframe and individuals are kept informed of progress, there is no obligation to conclude an investigation within a fixed period. The ICO recently published its guidance explaining the new requirements. Data protection expert, and guest on the first Guardians of Data podcast, Jon Baines writes on his personal blog that in declining to suggest how long controllers should normally take to respond to data subject complaints, the ICO has missed an opportunity to provide regulatory clarity.  

If you are looking to implement the changes made by the DUA Act to the UK data protection regime, consider our very popular half-day workshop. 

The newly updated UK GDPR Handbook (2nd edition) includes all amendments introduced by the DUA Act, with colour-coded changes for easy navigation and links to relevant recitals, ICO guidance, and caselaw that help make sense of the reforms in context. We have included relevant provisions of the amended DPA 2018 to support a deeper understanding of how the laws interact.

The Right to Erasure and Unfounded Malicious Allegations

The Victims and Prisoners Act 2024 (Commencement No. 10) and Data (Use and Access) Act 2025 (Commencement No. 8) Regulations 2026 bring into force an important change to Article 17 of the UK GDPR (the right to erasure). 

In 2023, Stella Creasy MP was subjected to a social services investigation after a man complained to Leicestershire Police that the MP’s children should be taken into care due to her “extreme views”. The Labour MP told Today on BBC Radio 4 that the complaint was made because the man disagreed with her campaign against misogyny. 

Waltham Forest Council launched an investigation, as it was legally required to do, following a referral from Leicestershire Police. But despite Ms Creasy being cleared, the council said it was legally prevented from removing the man’s complaint from its records. 

The MP then tabled an amendment to the Victims and Prisoners Bill which was going through Parliament. This was enacted as section 31 of the Victims and Prisoners Act 2024. Section 31 inserts a new Article 17(1)(g) into the UK GDPR. It extends the grounds upon which a data subject has a right to erasure to cases of unfounded malicious allegations where: 
 
“the personal data have been processed as a result of an allegation about the data subject- 

(i) which was made by a person who is a malicious person in relation to the data subject (whether they became such a person before or after the allegation was made),

(ii) which has been investigated by the controller, and 

(iii) in relation to which the controller has decided that no further action is to be taken” 
 
New Article 17(4) of the UK GDPR defines a “malicious person” as one who has been convicted of a specified offence or who is subject to a stalking protection order. 

At the same time, the 2026 order also commenced paragraph 32 of Schedule 11 to the Data (Use and Access) Act 2025, which extends the same provisions to Scotland and Northern Ireland. 

Listen to the Guardians of Data Podcast for the latest news and views on data protection, cyber security, AI and freedom of information.   

This and other data protection developments will be discussed in detail on our forthcoming GDPR Update workshop. 

New Podcast: How to Succeed as an IG Leader 

Act Now is pleased to bring you episode 5 of the Guardians of Data podcast.  

In information governance, there is no substitute for learning from those who have walked the path before us. Experienced IG leaders bring a wealth of knowledge from years at the frontline of data protection and information rights – navigating challenges, overcoming obstacles and shaping best practice along the way.
By sharing their stories, lessons learned and practical advice, they help both new starters and seasoned professionals grow in confidence, strengthen their practice and prepare for the challenges of tomorrow. 

In this episode we are joined by Raz Edwards, Head of Data Security and Protection at Wolverhampton NHS Trust. Raz has over 17 years of experience as a Data Protection Officer, including more than a decade in the NHS. She is also Chair of the National Strategic Information Governance Network and serves as a member of the Upper Tribunal and First-Tier Tribunal in the Information Rights Jurisdiction. 

In our conversation, Raz shares her journey into Information Governance, the challenges she’s faced and overcome as an IG leader, her advice for both new starters and seasoned professionals and her perspective on the future of the profession.
She also reflects on what she’s learned through her tribunal role and what it takes to succeed as an IG leader. 

 Download and listen here, or on your preferred podcast app. Available on Apple Podcasts, Spotify, and all major podcast platforms. 

Previous episodes of the Guardians of Data podcast have featured Jon Baines, reflecting on his career as a Data Protection Specialist and the hot issues in information governance, Lynn Wyeth, discussing the recent controversy around Grok AI, Maurice Frankel, looking back at 20 years of the Freedom of Information Act, and Olu Odeniyi, analysing recent cyber breaches and discussing the lessons to learn.

ICO Focus on Children’s Data Processing 

In February we wrote about the Information Commissioner’s Office (ICO) issuing fines under the UK GDPR to two social media companies. Reddit was fined £14.47 million and MediaLab (owner of Imgur) was fined £247,590 for failing to implement age‑assurance measures and for processing children’s personal data in a way that potentially exposed them to harmful content. 

Safeguarding children’s privacy is a key enforcement priority for the ICO. Its investigation into TikTok (opened in March 2025) is still ongoing. The ICO is considering how the platform uses the personal data of 13 to 17-year-olds in the UK to make recommendations and deliver suggested content to their feeds. This is in light of growing concerns that social media and video sharing platforms use data generated by children’s online activity in their recommender systems, which could lead to children being served inappropriate or harmful content. The ICO is also investigating 17 other platforms, including Discord, Pinterest and X, and has been in discussions with Meta and Snapchat over how they use children’s location data in their user map features. 

Safeguarding children’s privacy is also a duty of the ICO under the Online Safety Act, alongside Ofcom. Last week the ICO published an open letter to social media and video‑sharing platforms operating in the UK, calling on them to strengthen age assurance measures so young children cannot access services that are not designed for them. The letter sets out the ICO’s expectations of platforms with a minimum age: rather than relying on children to self-declare their ages (a check that is easily bypassed), platforms should use the viable age assurance technology now readily available to enforce their own minimum ages and prevent underage children from accessing their services. The ICO has also written directly to platforms, starting with TikTok, Snapchat, Facebook, Instagram, YouTube and X, asking them to demonstrate how their age assurance measures meet the ICO’s expectations. 

The Data (Use and Access) Act 2025, most of which came into force earlier this month, explicitly requires those who provide an online service that is likely to be used by children to take children’s needs into account when deciding how to use their personal data. 

Listen to the Guardians of Data Podcast for the latest news and views on data protection, cyber security, AI and freedom of information.  

This and other developments relating to children’s data will be covered in our forthcoming workshop, Working with Children’s Data.

Join Our Team – Become an Act Now Trainer! 


Are you an information governance expert with a passion for sharing your knowledge? Do you have experience delivering engaging training on GDPR, FOI, AI or Cyber Security? Do you want to help build a privacy-conscious world? If so, we want to hear from you! 

Why Act Now Training? 

Act Now Training is Europe’s leading provider of information governance training. For the past 24 years, we have been working with government organisations, multinational corporations, financial institutions, and corporate law firms. Our team of expert trainers delivers high-quality, practical courses that make complex topics easy to understand. 

With a comprehensive programme ranging from short webinars and one-day workshops to advanced practitioner certificate courses, we equip professionals with the knowledge and skills they need to navigate the evolving landscape of data protection and privacy. 

Our Mission 

At Act Now, we believe in building a privacy-conscious world. Our goal is to promote trust and respect for privacy, ensuring organisations embed data protection into their operations by default. We break down complex legal concepts, making education accessible and empowering professionals to become leaders in their field. By fostering a culture of responsible data usage, we help build public trust and drive positive change. 

Why Join Us? 

Due to increasing demand, we are expanding our team of expert trainers. With new courses launched in 2026, we are looking for talented trainers to deliver engaging, practical and jargon-free training. 

We offer opportunities for both full-time trainers and those looking to complement their existing roles. We are looking for passionate professionals who bring energy and innovation to their training sessions. 

What We Are Looking For: 

  • Experience of delivering GDPR, FOI, AI or Cyber Security training. 
  • A passion for teaching and the ability to simplify complex concepts. 
  • A commitment to delivering interactive, engaging training (no “death by PowerPoint”!). 
  • Availability for two to ten training days per month. 

We have opportunities to deliver a variety of courses, including our flagship GDPR Practitioner Certificate and our AI Governance Practitioner Certificate, as well as customised in-house training. 

Apply Now 

If you are ready to take the next step in your career and join a team dedicated to shaping the future of information governance, we’d love to hear from you! 

Email your CV to info@actnow.org.uk, detailing your training and consultancy experience in GDPR, FOI, AI or Cyber Security. The closing date for applications is 5th May 2026. A full privacy policy is available on our website. 

New Podcast: Lessons from Cyber Breaches

Act Now is pleased to bring you episode 4 of the Guardians of Data podcast. This is a show where we explore the world of information law and information governance; from privacy and AI to cybersecurity and freedom of information.  

The topic of this episode is cyber security. Every week we read about organisations being hacked, held to ransom or having their data stolen. The BBC recently discovered, through an FOI request, that around 10 million people had their data stolen when Transport for London (TfL) was hacked in 2024, making it one of the biggest hacks in British history. The so-called Scattered Spider crime group breached TfL’s internal computer systems, disrupting its online services and causing £39m of damage. 

The outbreak of war in the Middle East has also significantly increased the risk of cyber-attack. The National Cyber Security Centre (NCSC) recently warned that organisations should prepare for the risk of collateral damage from Iran-linked hacktivists. It said those with a presence in the region should consider boosting the monitoring of their IT systems and follow the centre’s guidelines for dealing with a heightened threat of cyber-attacks. 

In this podcast we talk about cyber security through the lens of the recent cyberattacks on major UK retailers. In just the past few months, household names like Jaguar Land Rover, Gucci, Marks & Spencer and Co-op have suffered significant disruption from ransomware attacks and other cyber incidents. These caused empty shelves, disrupted online orders and shook customer trust. 

To help us unpack what happened and what lessons we can all take away, we are joined by Olu Odeniyi, a Cyber Security expert and trusted advisor with more than 30 years’ experience in the field. In our conversation, we also explore how businesses can build resilience and trust in the face of growing threats, the future of cybersecurity and practical tips for all of us to stay ahead of the hackers. 

Download and listen here, or on your preferred podcast app. 
Available on Apple Podcasts, Spotify, and all major podcast platforms. 

Previous episodes of the Guardians of Data podcast have featured Jon Baines, talking about his career as a Data Protection specialist and the hot issues in information governance, Lynn Wyeth, discussing the recent controversy around Grok AI, and Maurice Frankel, talking about 20 years of the Freedom of Information Act.

Police Scotland Fined for Mishandling Alleged Victim’s Mobile Phone Data 

The Information Commissioner’s Office (ICO) has fined Police Scotland £66,000 and issued a reprimand for serious failures in the handling of sensitive personal data. 

Detective Constable Lianne Gilbert, who has now waived her right to anonymity, made domestic abuse allegations, including serious sexual assault, against another officer in 2020. However, when a misconduct inquiry took place two years later, it emerged that data extracted from Ms Gilbert’s phone had been given to the accused officer, his lawyer and his Scottish Police Federation (SPF) representative. The extracted data ran to 40,000 pages and included 80,000 images, medical records and the contact details of Ms Gilbert’s friends and family. Some of the images were of an intimate nature. 

Ms Gilbert has given her account to BBC Scotland News. She said: 

“It’s been absolutely horrific and very, very traumatic.” 

“At the time it happened I had a five-month-old baby. It’s really impacted my motherhood journey. At times I still feel quite numb.” 

It is important to note that the officer in question has not been charged with any offences against Ms Gilbert and the case remains live. 

UK GDPR Breaches 

The ICO investigation concluded that:  

a) Police Scotland failed to implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk associated with the processing of personal data by the PSD for the purposes of compiling misconduct packs for disclosure as part of its investigations (Article 32(1) UK GDPR); 

b) These deficiencies put the personal data processed by the PSD at risk of unauthorised disclosure, in breach of the requirement to ensure appropriate security of personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage (Article 5(1)(f) UK GDPR); 

c) Police Scotland failed, at the time of the determination of the means of processing and at the time of the processing itself, to implement appropriate technical and organisational measures designed to implement data protection principles, such as data minimisation, in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of the UK GDPR and protect the rights of data subjects (Article 25(1)-(2) UK GDPR); 

d) Police Scotland failed to ensure that the personal data processed by the PSD when compiling misconduct packs for disclosure was adequate, relevant and limited to what was necessary in relation to the purposes for which it was processing that data (Article 5(1)(c) UK GDPR); and 

e) Police Scotland failed to inform the Commissioner of the personal data breach within 72 hours of becoming aware of the same (Article 33(1) UK GDPR). 

In assessing the fine, the ICO considered the seriousness of the incident, the sensitivity of the data involved and the impact on the affected person. It initially concluded that a £132,000 fine would be effective, proportionate and dissuasive. However, applying its controversial public sector approach to enforcement, it decided to reduce the amount by 50%. 

The Monetary Penalty Notice states that Police Scotland paid a sum of money (amount redacted) as compensation to Ms Gilbert, perhaps in anticipation of a civil claim. Article 82 UK GDPR gives a data subject a right to compensation for material or non-material damage arising from any breach of the UK GDPR, and section 168 of the DPA 2018 confirms that “non-material damage” includes distress. There may be more claims to come; no doubt the data extracted (and shared) from Ms Gilbert’s phone included personal data relating to third parties. 

Part 3 DPA Reprimand 

The related reprimand was issued under Part 3 of the Data Protection Act 2018 (law enforcement processing). Police Scotland is a competent authority under Part 3 and was, according to the ICO, processing Ms Gilbert’s data for law enforcement purposes when it extracted the data. The ICO found that Police Scotland had infringed sections 35 and 37 of the DPA by failing to ensure that: 

a) The bulk download of personal data on the mobile phone of the Data Subject was lawful and fair (section 35 DPA); and 

b) The personal data processed from the mobile phone download was adequate, relevant and not excessive in relation to the purposes for which it was processed (section 37 DPA). 

The ICO initially considered that a fine would be appropriate for these DPA breaches, and considered notifying Police Scotland of its intention to impose a fine of £78,750. However, once again, due to the revised approach to public sector enforcement it decided a reprimand was more appropriate. 

Listen to the Guardians of Data Podcast for the latest news and views on data protection, cyber security, AI and freedom of information.   

This and other data protection developments will be discussed in detail on our forthcoming GDPR Update workshop and our Law Enforcement Data Processing workshop.

Transparency and FOI: 20 Years On

Act Now is pleased to bring you episode 3 of the Guardians of Data podcast. This is a show where we explore the world of information law and information governance – from privacy and AI to cybersecurity and freedom of information.  

In the past few weeks, we have had a stark reminder of why transparency in public life is a democratic necessity. The US Government’s release of millions of documents linked to the Jeffrey Epstein investigation has triggered, amongst other things, the arrest of the King’s brother, the sacking and subsequent arrest of a former Government minister, political jeopardy for the Prime Minister and questions about the future of the British monarchy.  

In Episode 3, our guest is Maurice Frankel OBE, Director of the Campaign for Freedom of Information. We discuss the remarkable story behind the UK’s Freedom of Information Act. From his early work with the campaigner Des Wilson in the 1980s, to the later attacks launched to weaken FOI’s impact, Maurice shares insights on:

• Life before the Act and how public authorities’ culture has evolved

• The key battles to see the law passed and fully implemented

• Lessons from major disclosures, inquiries and data releases

• FOI shortcomings, from excessive public interest extensions to the need for proactive publication

• Emerging threats to transparency

Hear what still inspires one of the UK’s foremost transparency advocates and why FOI remains a vital tool for public accountability.

Listen via this link, or on your preferred podcast app. Available on Apple Podcasts, Spotify, and all major podcast platforms. 

Previous episodes of the Guardians of Data podcast have featured Jon Baines, talking about his career as a Data Protection specialist and the hot issues in information governance, and Lynn Wyeth discussing the recent controversy around Grok AI.