After being scammed for over $500,000, one man paid one such company $6,500 to recover his stolen savings. A year later, he’s received nothing — and he’s not alone. Like countless other cryptocurrency scam victims, this southern California man’s story began the same way: an unsolicited text message from someone who said they had the wrong number in late 2021.
Weeks of texts later, the man, who Forbes agreed to identify by one of his initials, “M,” realized that he had been conned and had lost over $500,000 – 10 years’ worth of savings. Immediately, M went to his local police department, which he says declined to take a report, telling him to go to federal authorities instead. After a few more weeks went by, M grew frustrated that his case wasn’t advancing quickly enough with conventional law enforcement.
So he turned to CipherBlade, a company that claims to have recovered “millions of dollars of stolen cryptocurrency.” M signed a contract with the company, agreeing to pay $6,500 for “up to” 10 hours of work. If CipherBlade helped recover any of his money, it would also take 12.5% of the recovered sum. But now, more than a year has passed, and M hasn’t gotten a dollar back.
“Practically all such private services require upfront payments regardless of the outcome. The reason is obvious: if they were to receive funds only when the victim actually recovers losses, most of them would be out of business very soon.” – Binance
Pig butchering, as it’s known, is a new type of online con perpetrated by overseas scammers who “fatten” up victims – making them believe they have made boatloads of money in cryptocurrency, often using manipulated apps and websites – before absconding with all their money. Experts say billions of dollars are lost to this type of pernicious scheme each year.
The hard truth is that recovering money lost to crypto scams is extremely rare, even when law enforcement does take up a case. But in recent years, a nascent industry has cropped up, offering services that promise to do just that. These companies convince consumers to spend more money in order to recover their already-lost sums, with scant evidence that they regularly work as advertised.
Multiple U.S. financial and law enforcement agencies, including the Federal Trade Commission and the Commodity Futures Trading Commission, generally tell scam victims to treat these services with a healthy dose of skepticism. That’s because even if one of these companies is involved, law enforcement still has to do its own independent investigation — for which victims aren’t charged. Plus, no private company has the authority to compel the freezing, much less the seizure, of crypto assets held at an exchange.
That sad-sounding email may not be on the level. (Getty)
Financial trickery is an ongoing problem, and it’s amazing how successful it often is. Rick Kahler, president of Kahler Financial Group in Rapid City, S.D., has an eagle eye for such frauds. Here are his tips to avoid being gulled.
Larry Light: Ours is a cynical age. You’d think people would be wiser to attempted cons.
Rick Kahler: Financial scams and con artists have probably been around at least as long as people have been using money. In the past when I read stories about scams, I often wondered how people could be so gullible. I assumed that the victims of fraud were the vulnerable elderly or less educated, had suffered a recent loss of a loved one, or were isolated. This is not necessarily the case.
While some data suggests that one in five people over age 65 has been targeted by email scammers, being scammed can happen to anyone. Nobody is immune to fraud, and sometimes people simply fall for scams due to the psychological techniques employed by fraudsters. Often, their strategies are meant to take advantage of our desire to give.
Light: What are some of the most widely used scams?
Kahler: Shipping and mailing is a big one. If you’re sending gifts, be suspicious if you get an email or text message that appears to be from UPS, FedEx or the U.S. post office. One scam involves sending messages that you need to pay a fee for a missed delivery, which is an attempt to lure you to a fake website that asks for your credit or debit card information.
Hang onto receipts that include tracking numbers in case you need to find a misdelivered package. It’s also a good idea to let recipients know the tracking number and the expected date of delivery to help them guard against packages being stolen by porch pirates.
Mail checks or gift cards at the post office or a drop box rather than putting them in your home mailbox. A friend of mine had thieves steal a check out of her mailbox, alter the payee and amount, and try to cash it at her bank. Fortunately, an alert teller refused the transaction and also called the police with information about the thieves’ car, and they were caught. Stolen checks can also be used for identity theft or sold on the dark web.
Light: Yikes. What else?
Kahler: Then there’s the family-member-in-need scam. A standard phone scam targeting grandparents is the call from someone claiming to be “your oldest grandson,” who is in trouble and needs money urgently. A related method sends random text messages pretending to be a family member who has lost their phone. If a parent responds to a text saying “this is my new number,” the scammer then asks for money because of some problem related to losing their phone.
In cases like this, you want to be like one of my clients who got the “grandma, I’m in trouble” phone call. Her oldest grandson has a distinctive voice, so she knew immediately that the call was phony. Instead of responding with the grandson’s name, which is what the scammer hopes for, she just said, “Oh?” and waited. “I won,” she later said proudly, “The scammer hung up on me!”
Light: That’s good to hear. Emotions are great weapons for these crooks.
Kahler: Charitable giving scams depend on that. Scammers may pose as well-known legitimate charities or use fake names similar to those of real organizations. They might also take advantage of our desire to help by inventing stories of children with acute illnesses or using current events like the war in Ukraine or natural disasters.
Be alert for charities with unfamiliar names, websites that don’t look quite right and texts or emails from unverified sources. Instead of responding to an unsolicited message asking for donations, do your own search for a legitimate charity’s website and initiate your donation there.
Where scammers are concerned, suspicion is a good idea.
Binance, the world’s largest cryptocurrency exchange, has processed transactions totaling at least $2.35 billion stemming from hacks, investment frauds and illegal drug sales, according to a Reuters investigation published Monday. The data provided by Amsterdam-based analysis firm Crystal Blockchain showed that from 2017 to 2022, buyers and sellers on the world’s largest darknet drugs market, a Russian-language site called Hydra, used the exchange to make and receive payments worth $780 million.
Additionally, the German police said investigators began seeing criminals in Europe turn to Binance in 2020 to launder some of the proceeds from investment fraud schemes that caused victims, many of them pensioners, to lose a total of 750 million euros ($800 million).
The flow of illicit funds through the exchange represents a very small portion of Binance’s overall trading volume (over $9.5 trillion in 2021 according to The Block) but is still significant as regulators and policymakers, including U.S. Treasury Secretary Janet Yellen and European Central Bank President Christine Lagarde, raise concerns over the illegal use of cryptocurrencies. The FTC last week reported that more than $1 billion had been lost to crypto fraud and scams between January 2021 and March 2022.
Reuters has also revealed for the first time how North Korea’s hacking group Lazarus, which allegedly helps fund Pyongyang’s nuclear weapons program, used Binance to launder some $5.4 million of cryptocurrency stolen in September 2020 from Slovakian crypto exchange Eterbase. In January, Reuters reported that Binance had kept weak money-laundering checks on its users until mid-2021 despite concerns raised by senior company officials.
Binance’s chief communications officer Patrick Hillmann told Reuters in an email that Binance did not consider the news outlet’s calculation to be accurate. Hillmann reportedly said that the exchange uses transaction monitoring and risk assessments to “ensure that any illegal funds are tracked, frozen, recovered and/or returned to their rightful owner” and is working closely with law enforcement to disrupt criminal networks using cryptocurrencies, including in Russia.
In a statement to Forbes, Binance called the report a “woefully misinformed op-ed that uses outdated information from 2019 and unverified personal attestations.” “The fact is that Binance has some of the strictest AML policies in the fintech industry and plays a significant leadership role in helping law enforcement deal with cyber and financial crime. Since the article ran, we have received an outpouring of support from partners in law enforcement across the globe,” a Binance spokesperson said.
Editor’s note: the story and headline were updated to reflect Binance’s response.
Binance, the world’s largest cryptocurrency exchange by volume, has disputed claims that it has acted as a vehicle for the laundering of at least $2.35 billion in illicit funds.
A Reuters report claimed that Binance has become a “hub for hackers, fraudsters and drug traffickers” with strong links to Russia-based dark web market Hydra.
Matthew Price, Binance’s senior director of investigations, who was the lead investigator on Hydra when he worked at IRS Criminal Investigation, told CoinDesk: “What I think is very skewed in this report is that every exchange has exposure to dark net markets.”
Tigran Gambaryan, the exchange’s global head of intelligence who also worked at the IRS’ cyber crimes unit, added: “It’s something that completely disregards facts to get an agenda across.”
“The biggest part of this story is completely ignored. You can’t control deposits, you can only control what you can do afterwards,” Gambaryan added.
Price and Gambaryan said that Binance has a stringent process in place that handles exposure to fraud, dark net markets and scams using blockchain analytics software provided by Chainalysis and Elliptic.
“There is a system in place. We do have risk scoring for everything you can think of. We have everything tagged internally based on our tools, then we are able to do post-transaction monitoring with Chainalysis,” Gambaryan noted.
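The quote above describes address tagging, risk scoring and post-transaction monitoring only in general terms. A rough, hypothetical sketch of how tag-based risk scoring might work follows; the tags, weights, addresses and function names below are invented for illustration and are not Binance, Chainalysis or Elliptic APIs.

```python
# Hypothetical sketch of address-tag-based risk scoring.
# All tags, weights, and addresses are invented for illustration;
# real analytics tools use far richer data and heuristics.

RISK_WEIGHTS = {
    "darknet_market": 1.0,  # e.g. Hydra-linked addresses
    "scam": 0.9,
    "mixer": 0.7,
    "exchange": 0.1,        # known, KYC'd exchange wallets
    "unknown": 0.3,         # untagged counterparties
}

# Internal tag database (hypothetical entries)
ADDRESS_TAGS = {
    "addr_hydra_1": "darknet_market",
    "addr_kyc_exchange": "exchange",
}

def score_transaction(counterparty: str, threshold: float = 0.5) -> dict:
    """Look up the counterparty's tag and flag the deposit for review
    if its risk weight meets the threshold (a post-transaction check,
    since deposits themselves cannot be blocked in advance)."""
    tag = ADDRESS_TAGS.get(counterparty, "unknown")
    score = RISK_WEIGHTS[tag]
    return {"tag": tag, "score": score, "flag_for_review": score >= threshold}

print(score_transaction("addr_hydra_1"))       # darknet exposure: flagged
print(score_transaction("addr_kyc_exchange"))  # low risk: not flagged
```

This mirrors Gambaryan’s point: the deposit is accepted regardless, and the scoring determines what the exchange does afterwards.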
However, no evidence has been reported that links back to the game itself. Rather, warnings from head teachers and police have led to misinformation about the content of the game and its potential impact on children.
Most of the panic surrounds related content created on TikTok and YouTube that features the game characters in unsettling scenarios. One of these videos included a song, Free Hugs, with lyrics “Cause I could just hug you here. Forever, forever. Till you breathe your last breath.”
If you are a parent or guardian concerned about this, it’s important to understand the game before you delete it from children’s devices. Rather than a knee-jerk reaction, it’s a chance to talk to your child about the content and then make an informed decision about it with them.
Poppy Playtime Age Rating
The game itself is a scary experience designed to thrill and unsettle. It has been rated as suitable for 13-year-olds by ESRB and for 12-year-olds by PEGI, with descriptors of Violence and Blood from ESRB and Moderate Violence and Horror from PEGI.
The VSC Rating Board extends the PEGI rating by stating: “this game features a sense of threat and dread throughout as the player’s character explores an abandoned factory. In one intense sequence, the player’s character is pursued by a monster, including through a series of dark air vents. In another sequence, a heavy box is dropped onto a fantasy character, causing it to fall from a height. Blood appears on some pipes that the character strikes as it falls.”
This applies to the game itself rather than any fan-created content. There are also unofficial fan-made versions of the game on Roblox (Poppy Playtime Morphs) which do not fall under the remit of ESRB or PEGI as they are user-generated content.
Taking care to understand the actual source of potentially upsetting content is important for parents. Not only does it ensure that the settings on our children’s social media and video accounts are appropriately configured, but it also ensures we don’t overreact to what is a popular game.
The real danger is that stories about Poppy Playtime and the Huggy Wuggy character spiral out of control like the Momo Challenge. We’ve already seen reports eager to connect the scary Huggy Wuggy character to children jumping out of windows or breath-holding playground games.
This leads to a muddled response to actual concerns children have. Banning a child from a game they are enjoying because of a related video makes it much less likely for them to talk to parents if something genuinely upsetting happens online.
The real danger with this panicked response is that it separates parents from the gaming world of their child. Much better is to use rating advice and to play the game ourselves. We can then be present in the gaming world of our children and provide informed guidance.
Poppy Playtime Creator
I spoke to Zach Belanger, President and CEO of Enchanted Mob who made the Poppy Playtime game. I asked who the game was aimed at. “Poppy Playtime was not created with the intention to target any specific audience. Bear in mind that this was the first game our studio ever created, and our main priority was to create something that we would enjoy playing ourselves.
“Beyond that, we have a passion for any content we create to be enjoyable by audiences of all ages. To us, it isn’t accurate to say that we created Poppy Playtime to be consumed by kids or adults, but rather our goal was simply to inspire and entertain anyone who decided to play the game.”
With this in mind, I wondered if the warnings from schools had come as a surprise? “The vast majority of the controversy we are seeing regarding warnings from schools about the Huggy Wuggy character are completely untrue and/or grossly exaggerated.
One of the things we’ve read online is that Huggy Wuggy whispers creepy things into one’s ear while playing, but anyone who has actually played Poppy Playtime would know that Huggy Wuggy does not even have a voice in Chapter 1, so it’s impossible for him to have whispered anything.”
“As far as we are aware, all of these warnings from schools are originating from fan-made content based off of our game, but if you want my personal opinion, I do not think that any of these videos should be cause for concern, and we appreciate all the hard work and dedication our fans have put toward creating content inspired by Poppy Playtime.”
Huggy Wuggy Song Creator
The creator of one of the more popular pieces of fan content is Igor Gordiyenko, known as TryHardNinja on YouTube. He created the controversial Huggy Wuggy song, which has around 5 million views.
I asked what inspired the song and the reasoning behind its lyrics. “I wrote the song inspired by the story and lore of Huggy Wuggy from the game Poppy Playtime. In the game the player investigates a toy factory in which all the employees disappeared and some of the toys that used to be developed there have become sinister killer monsters. Huggy Wuggy is one of the antagonist monsters in the game.
The jingle in the game’s soundtrack has the lyric, ‘He’ll squeeze you ‘til you pop’. I thought it would be creative to take the original jingle, which mentions hugging forever, and make it into a more obviously sinister version to be truer to his new sinister persona following the events of the game.”
I asked what he made of the response to the song and the warnings that were appearing in headlines. “As a father, I completely understand the concern. I didn’t intentionally make the song to scare young kids. It’s a song based on a monster from the indie horror game Poppy Playtime rated for teens and up. My video is targeted to the same audience.”
“The themes and visuals of my song and video are true to the character’s lore, actions and depiction in the game. I am not trying to make an innocent character seem scarier than they are. Much like Chucky from Child’s Play, Huggy Wuggy is and always was a horror character. My song is for fans of the source material which is not for young kids.”
I asked him what he had done to ensure that younger children didn’t have access to the video. “As a YouTube creator I have done everything in my power to make sure the video is not served to kids younger than 13. Since the moment of upload the video has been marked “Not made for kids.”
“Since reports of the song being served on YouTube Kids started about a month ago I have been doing my own periodical sweeps of that platform and I have never found that video or song. I understand how my video being recommended to young kids would be concerning and inappropriate, but all evidence points to the previous reports that it’s on YouTube Kids being false.”
What advice would he have for parents if they were worried about children finding the song and being upset by it? “As a parent, if even after making sure I’ve done everything I could to filter out this content and it still gets through, I would sit with my child and talk to them about what they saw, their feelings and reassure them that Huggy Wuggy is a made up character that can’t hurt them.”
Keeping Children Safe
Rather than warning children about specific dangers such as Momo or Huggy Wuggy, parents and professionals can better help children by teaching them good practices online.
Fostering an atmosphere of openness and transparency about online activity ensures that children can thrive. If you do notice them switching screens on their devices when approached, or new numbers or email addresses on their devices, it’s worth checking in with them.
Keep video games and YouTube watching in shared family spaces. In video games, you can also set up restrictions on friends and on access to user-generated content that may include Poppy Playtime-themed add-ons. Also, ensure Restricted Mode is on for your child’s YouTube account so this content is not available to them.
Controversial facial recognition firm Clearview AI has been ordered to destroy all images and facial templates belonging to individuals living in Australia by the country’s national privacy regulator.
Clearview, which claims to have scraped 10 billion images of people from social media sites in order to identify them in other photos, sells its technology to law enforcement agencies. It was trialled by the Australian Federal Police (AFP) between October 2019 and March 2020.
Now, following an investigation, Australia’s privacy regulator, the Office of the Australian Information Commissioner (OAIC), has found that the company breached citizens’ privacy. “The covert collection of this kind of sensitive information is unreasonably intrusive and unfair,” said OAIC privacy commissioner Angelene Falk in a press statement. “It carries significant risk of harm to individuals, including vulnerable groups such as children and victims of crime, whose images can be searched on Clearview AI’s database.”
Said Falk: “When Australians use social media or professional networking sites, they don’t expect their facial images to be collected without their consent by a commercial entity to create biometric templates for completely unrelated identification purposes. The indiscriminate scraping of people’s facial images, only a fraction of whom would ever be connected with law enforcement investigations, may adversely impact the personal freedoms of all Australians who perceive themselves to be under surveillance.”
The investigation into Clearview’s practices by the OAIC was carried out in conjunction with the UK’s Information Commissioner’s Office (ICO). However, the ICO has yet to make a decision about the legality of Clearview’s work in the UK. The agency says it is “considering its next steps and any formal regulatory action that may be appropriate under the UK data protection laws.”
As reported by The Guardian, Clearview itself intends to appeal the decision. “Clearview AI operates legitimately according to the laws of its places of business,” Mark Love, a lawyer for the firm BAL Lawyers representing Clearview, told the publication. “Not only has the commissioner’s decision missed the mark on the manner of Clearview AI’s operation, the commissioner lacks jurisdiction.”
Clearview argues that the images it collected were publicly available, so no breach of privacy occurred, and that they were published in the US, so Australian law does not apply.
Around the world, though, there is growing discontent with the spread of facial recognition systems, which threaten to eliminate anonymity in public spaces. Yesterday, Facebook parent company Meta announced it was shutting down the social platform’s facial recognition feature and deleting the facial templates it created for the system. The company cited “growing concerns about the use of this technology as a whole.” Meta also recently paid a $650 million settlement after the tech was found to have breached privacy laws in Illinois in the US.