Sayida Masanja, a businessman, has sued Vodacom Tanzania for Sh10 billion (approximately $4 million), accusing the telco of sending his personal data to OpenAI's ChatGPT.
The adoption of AI chatbots has raised privacy concerns.
Masanja wants Vodacom to pay Sh10 billion in damages for the alleged loss of his privacy. The plaintiff claims ChatGPT accessed his mobile network data as well as his user data. According to The Citizen, the case's first hearing is set for September 13.
He says he first discovered that his data was on ChatGPT between February and March this year. According to him, the exposed details included incoming and outgoing calls, IMEI numbers, SIM information, and location data. Masanja made physical copies of the exposed data to serve as evidence in his case.
Masanja was alarmed that his personal information sat on a platform anyone could access. Paul Kaunda, the plaintiff's lawyer, argued that Vodacom acted negligently and deliberately released 65 of Masanja's records, leaving him exposed to cybercrime and other criminal targeting.
Kaunda said the leaked location data made it easy to trace where his client had been for the previous 30 days. "The time-stamped data on the plaintiff's movements can show his family, political, professional, religious, and sexual ties. By putting them all together, you can make a reasonable guess about the plaintiff's life," he said.
Vodacom has denied the allegations. The telco also wants the case thrown out, arguing that Masanja breached Section 13 of the Civil Procedure Code, which requires every suit to be filed in the lowest court competent to try it.
In practice, that means a local judge or district court should hear the case before any higher court. Masanja instead took the dispute to the Shinyanga High Court, which outranks those lower courts, and Vodacom wants the suit dismissed on that procedural ground.
The phone company said it would not pay that sum. It defended its data privacy standards, saying, "The defendant operates with the utmost care and a high standard of professionalism in handling the personal data of the customer against third-party invasion."
Vodacom denied sharing Masanja's data with any party, including ChatGPT, saying it has no relationship with the AI bot and has committed no privacy violation.
Kaunda rejects Vodacom's claim that it has no ties to OpenAI. He maintains that the carrier supplies user data to OpenAI to help improve ChatGPT's large language models.
Data privacy in the AI age
ChatGPT has generated plenty of buzz about how conversational AI can make life better. Since its launch in 2022, the bot has grown into a global force capable of helping in many different ways. But its rise has been dogged by problems, among them a growing wave of data privacy complaints. Its parent company, OpenAI, was sued two weeks ago for allegedly training the bot on troves of data taken without consent.
Technology is everywhere in modern life, and so is user data. Whether it is a social media platform or an online game, people must now hand over personal information just to sign up. Local data protection laws require tech companies to handle customer information responsibly, yet some brands have been accused of doing the opposite.
Earlier this year, over concerns that ChatGPT did not comply with data privacy laws, Italy became the first Western country to block its citizens' access to the service. Last month, Google joined Apple, Samsung, and a few other tech companies in warning employees to be careful about what they share with AI chatbots.
Chatbots learn from the conversations they have with users, so an employee who pastes a prompt containing sensitive information could trigger a data leak. If that sounds unlikely, consider that ChatGPT suffered a data breach in May that exposed active users' chat histories and payment information. Imagine what hackers could do with that much information.
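For illustration only, here is a minimal sketch, in Python, of the kind of client-side redaction such warnings point to: scrubbing likely identifiers from text before it is ever pasted into a chatbot. The patterns and placeholder labels below are assumptions made for this example, not anything Vodacom, Google, or OpenAI is known to use.

```python
import re

# Hypothetical patterns for the identifier types mentioned in this story
# (IMEI numbers, phone/SIM numbers). Real PII scrubbing needs far more than regexes.
PATTERNS = {
    "IMEI": re.compile(r"\b\d{15}\b"),              # IMEIs are 15 digits
    "PHONE": re.compile(r"\+?\d[\d\s-]{8,13}\d"),   # loose international phone format
}

def redact(prompt: str) -> str:
    """Replace likely identifiers with placeholders before the text leaves the company."""
    for label, pattern in PATTERNS.items():
        prompt = pattern.sub(f"[{label} REDACTED]", prompt)
    return prompt

if __name__ == "__main__":
    raw = "Customer +255 712 345 678 reported a fault on device IMEI 490154203237518."
    print(redact(raw))
    # -> "Customer [PHONE REDACTED] reported a fault on device [IMEI REDACTED]."
```

Even a simple filter like this illustrates the point the corporate warnings make: whatever reaches the chatbot can end up in someone else's hands, so the safest data is the data that never gets sent.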