For Christmas I got an interesting gift from a good friend - my very own "bestselling" book.
"Tech-Splaining for Dummies" (terrific title) bears my name and my image on its cover, and it has glowing evaluations.
Yet it was entirely written by AI, with a few simple triggers about me supplied by my pal Janet.
It's an interesting read, and very funny in parts. But it also meanders quite a lot, and lands somewhere between a self-help book and a stream of anecdotes.
It mimics my chatty writing style, but it's also rather repetitive and very verbose. It may have gone beyond Janet's prompts in collating data about me.
Several sentences begin "as a leading technology journalist ..." - cringe - which could have been scraped from an online bio.
There's also a mysterious, repeated hallucination in the form of my cat (I have no pets). And there's a metaphor on almost every page - some more random than others.
There are lots of businesses online offering AI book-writing services. My book was from BookByAnyone.
When I contacted the chief executive Adir Mashiach, based in Israel, he told me he had sold around 150,000 personalised books, mainly in the US, since pivoting from compiling AI-generated travel guides in June 2024.
A paperback copy of your own 240-page bestseller costs £26. The company uses its own AI tools to generate them, based on an open source large language model.
I'm not asking you to buy my book. In fact you can't - only Janet, who created it, can order any further copies.
There is currently no barrier to anyone creating one in anybody's name, including celebrities - although Mr Mashiach says there are guardrails around violent content. Each book contains a printed disclaimer stating that it is fictional, created by AI, and designed "solely to bring humour and joy".
Legally, the copyright belongs to the firm, but Mr Mashiach stresses that the product is intended as a "personalised gag gift", and the books are not sold on.
He hopes to broaden his range, creating different genres such as sci-fi, and perhaps offering an autobiography service. It's designed to be a light-hearted form of consumer AI - selling AI-generated goods to human customers.
It's also a bit frightening if, like me, you write for a living. Not least because it probably took less than a minute to generate, and it does, certainly in parts, sound a lot like me.
Musicians, authors, artists and actors worldwide have expressed alarm about their work being used to train generative AI tools that then churn out similar material based upon it.
"We must be clear, when we are talking about data here, we really imply human developers' life works," says Ed Newton Rex, creator of Fairly Trained, which projects for AI firms to respect developers' rights.
"This is books, this is articles, this is images. It's masterpieces. It's records ... The entire point of AI training is to discover how to do something and after that do more like that."
In 2023 a song featuring AI-generated voices of Canadian singers Drake and The Weeknd went viral on social media before being pulled from streaming platforms because it was not their work and they had not consented to it. It didn't stop the track's creator trying to nominate it for a Grammy award. And although the artists were fake, it was still hugely popular.
"I don't think the use of generative AI for creative purposes should be banned, but I do think that generative AI for these purposes that is trained on people's work without consent should be banned," Mr Newton Rex adds. "AI can be very powerful but let's build it ethically and fairly."
In the UK some organisations - including the BBC - have chosen to block AI developers from trawling their online content for training purposes. Others have decided to collaborate - the Financial Times has partnered with ChatGPT creator OpenAI, for example.
The UK government is considering an overhaul of the law that would allow AI developers to use creators' content on the web to help develop their models, unless the rights holders opt out.
Ed Newton Rex describes this as "madness".
He points out that AI can make advances in areas like defence, healthcare and logistics without trawling the work of authors, journalists and artists.
"All of these things work without going and changing copyright law and ruining the livelihoods of the country's creatives," he argues.
Baroness Kidron, a crossbench peer in the House of Lords, is also strongly against removing copyright law for AI.
"Creative industries are wealth creators, 2.4 million jobs and a lot of joy," says the Baroness, who is also an advisor to the Institute for Ethics in AI at Oxford University.
"The government is undermining one of its best performing industries on the vague promise of growth."
A government spokesperson said: "No move will be made until we are absolutely confident we have a practical plan that delivers each of our objectives: increased control for rights holders to help them license their content, access to high-quality material to train leading AI models in the UK, and more transparency for rights holders from AI developers."
Under the UK government's new AI plan, a national data library containing public data from a wide range of sources will also be made available to AI researchers.
In the US the future of federal rules to regulate AI is now up in the air following President Trump's return to the White House.
In 2023 Biden signed an executive order that aimed to boost the safety of AI, with, among other things, firms in the sector required to share details of the workings of their systems with the US government before they are released.
But this has now been repealed by Trump. It remains to be seen what he will do instead, but he is said to want the AI sector to face less regulation.
This comes as a number of lawsuits against AI firms, and particularly against OpenAI, continue in the US. They have been brought by everyone from the New York Times to authors, music labels, and even a comedian.
They claim that the AI companies broke the law when they took their content from the web without their consent, and used it to train their systems.
The AI firms argue that their actions fall under "fair use" and are therefore exempt. There are a number of factors which can constitute fair use - it's not a straightforward definition. But the AI sector is under increasing scrutiny over how it gathers training data and whether it should be paying for it.
If all this wasn't enough to ponder, Chinese AI firm DeepSeek has shaken the sector over the past week. It became the most downloaded free app on Apple's US App Store.
DeepSeek claims that it developed its technology for a fraction of the cost of the likes of OpenAI. Its success has raised security concerns in the US, and threatens America's current dominance of the sector.
As for me and a career as an author, I think that at the moment, if I really want a "bestseller" I'll still have to write it myself. If anything, Tech-Splaining for Dummies highlights the current weakness of generative AI tools for larger projects. It is full of inaccuracies and hallucinations, and it can be quite difficult to read in parts because it's so long-winded.
But given how rapidly the tech is evolving, I'm not sure how long I can remain confident that my considerably slower human writing and editing skills are any better.