How an AI-written book shows why the tech 'frightens' creatives
For Christmas I received an intriguing present from a friend - my very own "best-selling" book.
"Tech-Splaining for Dummies" (great title) bears my name and my photo on its cover, and it has glowing reviews.
Yet it was entirely written by AI, with a few simple prompts about me supplied by my friend Janet.
It's a fascinating read, and hilarious in parts. But it also meanders rather a lot, and sits somewhere between a self-help book and a stream of anecdotes.
It mimics my chatty style of writing, but it's also a bit repetitive and very verbose. It may have gone beyond Janet's prompts in collating information about me.
Several sentences begin "as a leading technology journalist..." - cringe - which might have been scraped from an online bio.
There's also a strange, repeated hallucination in the form of my cat (I have no pets). And there's a metaphor on almost every page - some more random than others.
There are many companies online offering AI book-writing services. My book was from .
When I spoke to the chief executive Adir Mashiach, based in Israel, he told me he had sold around 150,000 personalised books, mainly in the US, since pivoting from compiling AI-generated travel guides in June 2024.
A paperback copy of your own 240-page best-seller costs £26. The company uses its own AI tools to generate them, based on an open source large language model.
I'm not asking you to buy my book. Actually you can't - only Janet, who created it, can order any further copies.
There is currently nothing to stop anyone producing one in any person's name, including celebrities - although Mr Mashiach says there are guardrails around violent content. Each book contains a printed disclaimer stating that it is fictional, created by AI, and designed "exclusively to bring humour and delight".
Legally, the copyright belongs to the firm, but Mr Mashiach stresses that the product is intended as a "personalised gag gift", and the books are not sold on further.
He hopes to expand his range, generating different genres such as sci-fi, and perhaps offering an autobiography service. It's designed to be a light-hearted form of consumer AI - selling AI-generated goods to human customers.
It's also a bit frightening if, like me, you write for a living. Not least because it probably took less than a minute to generate, and it does, certainly in parts, sound a lot like me.
Musicians, authors, artists and actors worldwide have expressed alarm about their work being used to train generative AI tools that then churn out similar content based on it.
"We must be clear, when we are speaking about information here, we actually imply human developers' life works," states Ed Newton Rex, creator of Fairly Trained, which projects for AI companies to respect developers' rights.
"This is books, this is articles, this is images. It's works of art. It's records ... The entire point of AI training is to find out how to do something and then do more like that."
In 2023 a song featuring AI-generated voices of Canadian singers Drake and The Weeknd went viral on social media before being pulled from streaming platforms because it was not their work and they had not consented to it. That didn't stop the track's creator trying to nominate it for a Grammy award. And even though the artists were fake, it was still hugely popular.
"I do not believe using generative AI for imaginative purposes should be banned, but I do think that generative AI for these purposes that is trained on individuals's work without authorization must be prohibited," Mr Newton Rex includes. "AI can be really effective but let's develop it ethically and fairly."
In the UK some organisations - including the BBC - have chosen to block AI developers from trawling their online content for training purposes. Others have decided to collaborate - the Financial Times has partnered with ChatGPT developer OpenAI, for example.
The UK government is considering an overhaul of the law that would allow AI developers to use creators' content on the internet to help develop their models, unless the rights holders opt out.
Ed Newton Rex describes this as "insanity".
He points out that AI can make advances in areas like defence, healthcare and logistics without trawling the work of authors, journalists and artists.
"All of these things work without going and changing copyright law and destroying the livelihoods of the country's creatives," he argues.
Baroness Kidron, a crossbench peer in the House of Lords, is also strongly against removing copyright law for AI.
"Creative industries are wealth creators, 2.4 million jobs and a great deal of joy," says the Baroness, who is also an adviser to the Institute for Ethics in AI at Oxford University.
"The government is undermining one of its best performing industries on the vague promise of growth."
A government spokesperson said: "No move will be made until we are absolutely confident we have a practical plan that delivers each of our objectives: increased control for rights holders to help them license their content, access to high-quality material to train leading AI models in the UK, and more transparency for rights holders from AI developers."
Under the UK government's new AI plan, a national data library containing public data from a wide range of sources will also be made available to AI researchers.
In the US, the future of federal rules to regulate AI is now up in the air following President Trump's return to the presidency.
In 2023 Biden signed an executive order that aimed to boost the safety of AI by, among other things, requiring firms in the sector to share details of the workings of their systems with the US government before they are released.
But this has now been repealed by Trump. It remains to be seen what Trump will do instead, but he is said to want the AI sector to face less regulation.
This comes as a number of lawsuits against AI firms, and particularly against OpenAI, continue in the US. They have been brought by everyone from the New York Times to authors, music labels, and even a comedian.
They claim that the AI firms broke the law when they took their content from the internet without their consent, and used it to train their systems.
The AI companies argue that their actions fall under "fair use" and are therefore exempt. There are a number of factors which can constitute fair use - it's not a straightforward definition. But the AI sector is under increasing scrutiny over how it gathers training data and whether it should be paying for it.
If all this wasn't enough to ponder, Chinese AI firm DeepSeek has shaken the sector over the past week. It became the most downloaded free app on Apple's US App Store.
DeepSeek claims that it developed its technology for a fraction of the cost of the likes of OpenAI. Its success has raised security concerns in the US, and threatens America's current dominance of the sector.
As for me and a career as an author, I think that at the moment, if I really want a "bestseller" I'll still have to write it myself. If anything, Tech-Splaining for Dummies highlights the current weakness of generative AI tools for bigger projects. It is full of inaccuracies and hallucinations, and it can be quite hard to read in parts because it's so long-winded.
But given how quickly the tech is evolving, I'm not sure how long I can remain confident that my considerably slower human writing and editing skills are any better.