For Christmas I received a fascinating gift from a friend - my very own "best-selling" book.
"Tech-Splaining for Dummies" (terrific title) bears my name and my image on its cover, and it has glowing reviews.
Yet it was entirely written by AI, with a few simple prompts about me provided by my friend Janet.
It's an interesting read, and very funny in parts. But it also meanders quite a lot, and sits somewhere between a self-help book and a stream of anecdotes.
It mimics my chatty style of writing, but it's also a bit repetitive, and very verbose. It may have gone beyond Janet's prompts in gathering information about me.
Several sentences begin "as a leading technology reporter ..." - cringe - which might have been scraped from an online bio.
There's also a mysterious, recurring hallucination in the form of my cat (I have no pets). And there's a metaphor on almost every page - some more random than others.
There are lots of companies online offering AI book-writing services. My book was from BookByAnyone.
When I contacted the company's president, Adir Mashiach, who is based in Israel, he told me he had sold around 150,000 personalised books, mainly in the US, since pivoting from compiling AI-generated travel guides in June 2024.
A paperback copy of your own 240-page best-seller costs £26. The firm uses its own AI tools to generate them, based on an open source large language model.
I'm not asking you to buy my book. Actually you can't - only Janet, who created it, can order any further copies.
There is currently no barrier to anyone creating one in anybody's name, including celebrities - although Mr Mashiach says there are guardrails around violent content. Each book contains a printed disclaimer stating that it is fictional, created by AI, and designed "entirely to bring humour and happiness".
Legally, the copyright belongs to the firm, but Mr Mashiach stresses that the product is intended as a "personalised gag gift", and the books do not get sold further.
He hopes to broaden his range, generating different genres such as sci-fi, and perhaps offering an autobiography service. It's designed to be a light-hearted form of consumer AI - selling AI-generated goods to human customers.
It's also a bit scary if, like me, you write for a living. Not least because it probably took less than a minute to generate, and it does, certainly in some parts, sound just like me.
Musicians, authors, artists and actors worldwide have expressed alarm about their work being used to train generative AI tools that then churn out similar content based upon it.
"We should be clear, when we are talking about information here, we actually imply human developers' life works," says Ed Newton Rex, founder of Fairly Trained, qoocle.com which projects for AI companies to respect creators' rights.
"This is books, this is articles, this is photos. It's works of art. It's records ... The entire point of AI training is to discover how to do something and after that do more like that."
In 2023 a song featuring AI-generated voices of Canadian singers Drake and The Weeknd went viral on social media before being pulled from streaming platforms because it was not their work and they had not consented to it. It didn't stop the track's creator trying to nominate it for a Grammy award. And although the artists were fake, it was still wildly popular.
"I do not think using generative AI for creative purposes need to be banned, but I do think that generative AI for these purposes that is trained on people's work without permission must be banned," Mr Newton Rex adds. "AI can be really effective however let's build it morally and relatively."
In the UK some organisations - including the BBC - have chosen to block AI developers from trawling their online content for training purposes. Others have decided to collaborate - the Financial Times has partnered with ChatGPT creator OpenAI for example.
The UK government is considering an overhaul of the law that would allow AI developers to use creators' content on the internet to help develop their models, unless the rights holders opt out.
Ed Newton Rex describes this as "insanity".
He points out that AI can make advances in areas like defence, healthcare and logistics without trawling the work of authors, journalists and artists.
"All of these things work without going and changing copyright law and destroying the incomes of the nation's creatives," he argues.
Baroness Kidron, a crossbench peer in the House of Lords, is also strongly against removing copyright law for AI.
"Creative markets are wealth developers, 2.4 million tasks and a great deal of joy," says the Baroness, who is likewise a consultant to the Institute for Ethics in AI at Oxford University.
"The federal government is undermining among its finest performing industries on the unclear promise of development."
A government spokesperson said: "No move will be made until we are absolutely confident we have a practical plan that delivers each of our objectives: increased control for rights holders to help them license their content, access to high-quality material to train leading AI models in the UK, and more transparency for rights holders from AI developers."
Under the UK government's new AI plan, a national data library containing public data from a wide range of sources will also be made available to AI researchers.
In the US the future of federal rules to govern AI is now up in the air following President Trump's return to the presidency.
In 2023 President Biden signed an executive order that aimed to boost the safety of AI with, among other things, firms in the sector required to share details of the workings of their systems with the US government before they are released.
But this has now been rescinded by Trump. It remains to be seen what Trump will do instead, but he is said to want the AI sector to face less regulation.
This comes as a number of lawsuits against AI firms, and particularly against OpenAI, continue in the US. They have been brought by everyone from the New York Times to authors, music labels, and even a comedian.
They claim that the AI firms broke the law when they took their content from the internet without their consent, and used it to train their systems.
The AI companies argue that their actions fall under "fair use" and are therefore exempt. There are a number of factors which can constitute fair use - it's not a straightforward definition. But the AI sector is under increasing scrutiny over how it gathers training data and whether it should be paying for it.
If this wasn't all enough to ponder, Chinese AI firm DeepSeek has shaken the sector over the past week. It became the most downloaded free app on Apple's US App Store.
DeepSeek claims that it developed its technology for a fraction of the cost of the likes of OpenAI. Its success has raised security concerns in the US, and threatens America's current dominance of the sector.
As for me and my career as an author, I think that at the moment, if I really want a "bestseller" I'll still have to write it myself. If anything, Tech-Splaining for Dummies highlights the current weakness in generative AI tools for larger projects. It is full of errors and hallucinations, and it can be quite difficult to read in parts because it's so verbose.
But given how rapidly the tech is evolving, I'm not sure how long I can remain confident that my considerably slower human writing and editing skills are any better.