“Hey everyone, don’t accept anything from me. It’s not me!” How many times has a message like this come across your social media feed? Such warnings are often the fallout from “sock puppets”: false online identities and user accounts created for deceptive purposes.
A researcher at Southern Illinois University Carbondale is working on technologies to stop such requests, which can lead to profile hacking, identity theft and other havoc for social media users.
“The massive growth of social media usage has incentivized perpetrators to connect to users through identity deception, which is often successful due to the lack of adequate verification of declared information,” Talukder said.
Talukder aims to build a digital framework rooted in cognitive psychology, user-centric research and machine learning methods to defend against such accounts and requests in online social networks. The work will begin in April and last at least two years.
In 2019, identity theft cost victims almost $17 billion. Talukder’s work could not only help prevent identity theft but also curb fake accounts that push propaganda during ongoing conflicts, such as the war in Ukraine, or that serve a hostile government’s attempts to influence American politics.
Social media users prime targets for identity fraud
Social media sites generate revenue with targeted advertising, using personal information to hone and deliver such messaging. The more info you provide, the better targeted the ads, but that also means crooks have more opportunities than ever to steal identities or perpetrate fraud.
Attackers often use connection relations on social media to access private and sensitive user data, post false or abusive information, or scam victims and influence their perceptions, Talukder said. People who actively use social media are 30% more likely to be affected by identity fraud.
How sock puppets work
Sock puppets are automated or semi-automated accounts that mimic human profiles. Fake profiles or their operators send requests to “follow” or “friend” social media users, who often accept them. If a user has friends in common with the fake account, for instance, the chance of accepting the request is 80%.
Other times, a fake profile is created to essentially duplicate a user’s online presence. Such attacks, called identity clone attacks, are devised to collect personal information and direct online fraud.
But fake accounts can still be detected by examining characteristics such as profile history, posting frequency, posting pattern, profile age, befriending pattern and like/follow patterns. Ultimately, the researchers will develop an app platform aimed at collecting detailed, baseline “truth behavior data” from social network users based on those characteristics.
“The underlying assumption is that the behavior of a true profile would be different than that of a fake one. In order to devise mechanisms for identifying and eliminating fake accounts on a near real-time basis, we need to collect the truth behavior data that will help us to differentiate between fake and real ones,” Talukder said.
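The behavioral signals named above could feed a scoring rule like the toy sketch below. All feature names and thresholds here are hypothetical illustrations, not values from Talukder's study:

```python
# Illustrative sketch only: a toy rule-based scorer built on the kinds of
# behavioral features the article names (profile age, posting frequency,
# befriending pattern). Thresholds and weights are invented for illustration.

def suspicion_score(profile):
    """Return a score in [0, 1]; higher means more sock-puppet-like."""
    score = 0.0
    if profile["age_days"] < 30:                 # very new profile
        score += 0.4
    if profile["posts_per_day"] > 50:            # automated posting cadence
        score += 0.3
    if profile["friend_requests_per_day"] > 20:  # aggressive befriending
        score += 0.3
    return score

def is_suspicious(profile, threshold=0.5):
    """Flag a profile whose combined behavioral score crosses a cutoff."""
    return suspicion_score(profile) >= threshold
```

In practice the researchers would replace hand-set weights like these with a model trained on the collected “truth behavior data,” which is precisely why that baseline dataset matters.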
Suspicious connection attempts
Talukder will use his findings to create a pending connection decision classifier for Facebook, which will route suspicious connection attempts to a spam folder. The researchers will combine sock puppet education and motivation in developing this new interface, which will reduce clutter and cognitive load for users by displaying each pending friend request on a single screen, with a large, centrally placed profile photo.
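The routing step might look like the following sketch. The `score_request` function is a hypothetical stand-in for the trained classifier, and the risk signals and threshold are assumptions, not details from the study:

```python
# Hypothetical sketch of routing pending friend requests to an inbox or a
# spam folder based on a classifier score. score_request is an illustrative
# stand-in, not the study's actual classifier.

def score_request(request):
    """Toy stand-in for a trained classifier: fraction of risk signals present."""
    signals = [
        request.get("mutual_friends", 0) == 0,    # no friends in common
        request.get("profile_age_days", 0) < 30,  # very new requesting profile
        not request.get("has_profile_photo", True),
    ]
    return sum(signals) / len(signals)

def route_request(request, threshold=0.5):
    """Send likely sock-puppet requests to a spam folder instead of the inbox."""
    return "spam" if score_request(request) >= threshold else "inbox"
```

Routing to a spam folder rather than auto-deleting keeps the user in the loop, matching the interface's goal of supporting, not replacing, the user's decision.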
When the user taps on the profile photo of a pending friend, a screen that includes the profile summary of the pending friend will appear. Users can navigate their pending friend list using the next and previous buttons on the sides of the profile photo.
“Further, the interface will transform the ‘Confirm’ button into an inhibitive attractor, by displaying it in the same gray color as the ‘Delete’ button. To address the case where the user does not feel comfortable making a decision, we also will include a ‘Skip’ button shown in the same gray color,” Talukder said.
Getting into the minds of both sock puppeteers and their victims will allow researchers to address the dynamic that exists between them, Talukder said.
“The work not only will enhance the understanding of just-in-time motivations and behaviors related to social network risks,” he said, “but it will also help sociologists gain deeper insights from underexplored social and spatial dimensions provided by social networks in order to test relevant theories.”