FaceApp – Reasons to think twice about using FaceApp

What is FaceApp? Features of FaceApp

FaceApp was developed a couple of years ago by Wireless Lab, a company headquartered in St. Petersburg, Russia. In simple terms, it lets you edit photos of yourself so you appear older or younger, or “swap” genders. FaceApp uses photos from your phone to show what an older or younger version of yourself could look like. The app has gone viral in recent days and is currently one of the most downloaded apps for both iOS and Android, as #faceappchallenge posts have taken over Facebook and Instagram.

FaceApp is not new. It first hit the headlines in 2017 with its “ethnicity filters”, which purported to transform faces of one ethnicity into another – a feature that sparked a backlash and was quickly dropped. That same year, two of its filters were removed in response to criticism that they reinforced racism. The president of the company, Yaroslav Goncharov, apologized in April 2017 after one of its filters, called “hot”, lightened users’ skin; according to Wireless Lab, this was an “unfortunate side effect of the underlying neural network”. The “neural network” he was referring to is the set of artificial intelligence algorithms used “to modify a face in any photo while remaining photo realistic”, as Goncharov explained at the time. Not everyone accepted the explanation that it was just a technical glitch, and it was not an isolated problem in the field: Google Photos, in its early days, was criticized when its image-recognition algorithms mixed up objects and people. Another filter, which let users change their apparent race to look Black, Indian, Asian or Caucasian, was removed a day after launch in the face of a wave of criticism from all over the world.

Security Concerns

The app is developed by a small team from Saint Petersburg, who claim to have built the technology in-house using some open-source AI components. Of course, counter to what many alarmist tweets might have you think, being Russian doesn’t automatically make it a spy app. But it’s something to keep in mind given the difference in regulatory checks and balances between countries. The question is still there: is FaceApp doing enough to protect users’ data? Some researchers have already questioned the security risks of an app that has been out for years and then suddenly went viral all over again seemingly overnight. Others have pointed to the fact that the app requires a data connection, suggesting that this might indicate the app is surreptitiously grabbing users’ photos. Computer-security experts warn that it is very difficult for a user to know whether facial recognition is being used in an application, and for what purposes.

David Shipley with Beauceron Security said that while the product may be advertised as ‘free’, it’s your information that’s the real price. He noted that even a picture of your face can do plenty of damage. “It can be used to identify you and unlock things like your smartphone or other things and you want to make sure you protect your identity.”

And, in some corners of Twitter, people have pointed to the app’s Russian origins — FaceApp is owned by a company, Wireless Lab, that’s based in St. Petersburg — as a sign of something nefarious. Issues like this can pose serious problems in the future, and people are rightfully wary of the numerous ways their data could be accessed or exposed by an app developer.

In recent months, FaceApp has likely appeared as a fun feature on your social media feeds, letting you and your friends morph faces to look older, younger, or smiling when you frown. You can even see what you’d look like if you presented as another gender (with this last feature often proving a little problematic in and of itself). And a brand-new feature from the app has landed the company in even more hot water online.

The app just added ethnicity filters, which morph your face into what the app deems Asian, Black, Caucasian, or Indian ethnicity. Yes, this involves lightening or darkening your skin, plus altering your facial features and hair texture to fit ethnic stereotypes.


Privacy Policy of FaceApp

Should you be worried? If you’re only using pictures that you already have online as Facebook profile pics or Instagram selfies, probably not. It’s possible FaceApp could use your images in conjunction with other data it collects to profile you, as many companies on the internet do, but there’s no indication the threat of misuse is any higher here. You should always exercise caution by not uploading sensitive material. The app’s privacy statement is vague, which isn’t a great sign, especially combined with the fact that it does not explicitly tell users it will be uploading their selfies. The terms indicate it also keeps metadata from photos and may identify or track you with cookies. It says it will not share or sell your photos, but data it collects from you could be anonymized and used for advertising or other purposes.

In addition to photos generated via the app, FaceApp’s privacy policy states that it also collects location information and information about users’ browsing history. “These tools collect information sent by your device or our Service, including the web pages you visit, add-ons, and other information that assists us in improving the Service,” the policy says.

And though it states that “we will not rent or sell your information to third parties outside FaceApp,” it explicitly says that it shares information with “third-party advertising partners,” in order to deliver targeted ads.

FaceApp CEO Yaroslav Goncharov has not yet responded to questions about the company’s privacy policy. But this type of privacy policy isn’t necessarily unusual, though it is definitely vague. It’s also yet another example of how tech companies quietly vacuum up information about their users in ways that aren’t immediately clear.

It also doesn’t help that FaceApp doesn’t exactly have the best track record. The app was widely criticized for “racist” selfie filters that lightened users’ skin tones in 2017, soon after it launched. A few months later, the app sparked even more outrage when it unveiled a series of “ethnicity change” filters.

If anything, the latest controversy about the app’s privacy practices is a sign that we might finally be starting to learn from Cambridge Analytica and so many other data privacy nightmares. Yes, the viral app of the moment might be irresistible, but there are reasons to think twice before giving up access to your information.

Data Collected by FaceApp

Some people online have claimed that FaceApp automatically uploads all the photos on your phone as soon as you use it for the first time. However, security experts using network traffic analysers have cast doubt on this claim. It does not appear that the app uploads any photos until you ask it to. The terms of service of FaceApp are not very different from those of other applications.

The application states that it can collect “user content (for example, photos and other materials) that you post through the service”.

“We will not rent or sell your information to third parties outside FaceApp,” says their privacy notice.

But one aspect that analysts have highlighted is that FaceApp indicates it can transfer that information to a different jurisdiction than the country where the user is located.

Facial Recognition in FaceApp

A controversy similar to that of FaceApp revolved around Facebook at the beginning of the year with the so-called #10YearsChallenge, the challenge of uploading a photo from 10 years ago alongside a current one to admire the passage of time.

There are plenty of papers that use tasks like taking features from one face and applying them to another to demonstrate the quality of a generative model – things like putting on or taking off glasses, changing gender, and so on.

For example, if you have a good autoencoder for faces you can try to:

  1. Get latent representations for a photo of a person with glasses and a photo of the same person without glasses,
  2. Subtract them and add the difference to a latent representation of your own face without glasses,
  3. Generate a new face from the result.

Probably this would give you a face of yourself wearing glasses, with varying degrees of success and quality – kind of like the “king – man + woman = queen” arithmetic in Word2vec. Many recent generative models can do something similar for pictures too.
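
To make the steps above concrete, here is a minimal sketch of the latent-arithmetic idea in Python (PyTorch). Everything in it is a placeholder I made up for illustration: the encoder and decoder are tiny untrained networks and the “photos” are random tensors, so it only demonstrates the mechanics of steps 1-3 rather than producing a realistic face; a real system would use an autoencoder trained on a large face dataset.

    # Latent-arithmetic sketch: (with glasses - without glasses) applied to a new face.
    # The encoder/decoder are tiny untrained stand-ins, used only to show the vector math.
    import torch
    import torch.nn as nn

    IMG_DIM = 64 * 64 * 3   # a flattened 64x64 RGB face
    LATENT_DIM = 128

    encoder = nn.Sequential(nn.Linear(IMG_DIM, 512), nn.ReLU(), nn.Linear(512, LATENT_DIM))
    decoder = nn.Sequential(nn.Linear(LATENT_DIM, 512), nn.ReLU(), nn.Linear(512, IMG_DIM))

    # Stand-in images (random tensors here; real inputs would be preprocessed photos).
    person_with_glasses = torch.rand(1, IMG_DIM)
    person_without_glasses = torch.rand(1, IMG_DIM)   # same person, no glasses
    your_face_no_glasses = torch.rand(1, IMG_DIM)

    with torch.no_grad():
        # 1. Latent representations of the reference person with and without glasses.
        z_with = encoder(person_with_glasses)
        z_without = encoder(person_without_glasses)

        # 2. Their difference approximates a "glasses" direction in latent space;
        #    add it to the latent code of your own glasses-free photo.
        glasses_direction = z_with - z_without
        z_you_with_glasses = encoder(your_face_no_glasses) + glasses_direction

        # 3. Decode the shifted latent code back into image space.
        you_with_glasses = decoder(z_you_with_glasses).view(3, 64, 64)

    print(you_with_glasses.shape)  # torch.Size([3, 64, 64])

With a well-trained face autoencoder, the decoded image would (with luck) look like the same person wearing glasses, which is exactly the Word2vec-style analogy above.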

Something like DiscoGAN or CycleGAN can do even better by learning how to map from the subset of faces with glasses to the subset of faces without glasses, even if you don’t have explicit glasses/no-glasses pairs of otherwise similar photos.
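
As an illustration only, here is a rough sketch of the cycle-consistency objective that CycleGAN-style models rely on, again in PyTorch with tiny untrained placeholder networks and random tensors standing in for the two unpaired domains (faces without glasses and faces with glasses). The discriminators and adversarial losses are deliberately left out to keep the sketch short, so this is not a full CycleGAN implementation.

    # Cycle-consistency sketch for two unpaired domains X (no glasses) and Y (glasses).
    import torch
    import torch.nn as nn

    IMG_DIM = 64 * 64 * 3

    def small_generator():
        return nn.Sequential(nn.Linear(IMG_DIM, 256), nn.ReLU(), nn.Linear(256, IMG_DIM))

    G = small_generator()   # maps X -> Y ("add glasses")
    F = small_generator()   # maps Y -> X ("remove glasses")

    opt = torch.optim.Adam(list(G.parameters()) + list(F.parameters()), lr=2e-4)
    l1 = nn.L1Loss()

    # Unpaired mini-batches from each domain (random stand-ins here).
    x = torch.rand(8, IMG_DIM)   # faces without glasses
    y = torch.rand(8, IMG_DIM)   # faces with glasses (different people)

    for step in range(10):
        opt.zero_grad()
        # Cycle consistency: translating to the other domain and back
        # should recover the original image, even without paired examples.
        cycle_x = F(G(x))        # X -> Y -> X
        cycle_y = G(F(y))        # Y -> X -> Y
        loss = l1(cycle_x, x) + l1(cycle_y, y)
        loss.backward()
        opt.step()

    print(f"final cycle-consistency loss: {loss.item():.4f}")

The same recipe, with real face images and the adversarial terms added back in, is how unpaired glasses/no-glasses (or young/old) translation is usually trained.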

I am pretty sure that a CycleGAN could be trained to do exactly what this app does, and it would probably produce results as good as the ones in the app. It’s just a question of manually tagging a celebrity-faces dataset with age/gender/etc. labels and training the mappings (and the results of the app look a lot like typical results using such a dataset, so I suspect that’s exactly what they did).

But I suspect that you could even do it (probably with lower quality) with a DCGAN-based adversarial autoencoder, using the procedure I described above. I don’t know if it would match the quality of the results from this app, but it would be an initial version.

Final Thoughts

Some experts have said that apps like this could serve as platforms that “train” facial recognition tools, which can then be used both for commercial purposes (selling advertising) and for surveillance (private or governmental). Something similar was alleged around the Brexit vote, when people’s data – profiles built from social media – was used to identify different points of interest (POI) and show them targeted ads.

Other questions

Did I miss anything you’re looking for answers about? Let me know in the comments below or over on Facebook.