Facebook, stung by congressional hearings into alleged Russian meddling in the 2016 US presidential election through its platform, came back with a roar and posted solid third-quarter earnings for 2017. Revenue for the quarter grew by 49%, while profits jumped by 79%.
Facebook remains a huge magnet for users worldwide: the company added 284 million monthly active users over the last four quarters, ending the period with 2.072 billion users, a growth rate of 16% year-over-year.
It is indeed stunning that Facebook can still boast double-digit user base expansion even after crossing 2 billion users worldwide, but a large portion of that growth is coming from outside the developed countries, with the Asia-Pacific and Rest of the World regions doing the bulk of the heavy lifting.
The Real Issue Facing Facebook
But unfortunately for Facebook, much of the attention was diverted away from its solid third-quarter earnings report and toward how Facebook is going to act, now that the whole world knows how a dedicated group can use the platform to influence opinions in politics, among other areas.
Facebook recently announced that it will double the number of employees working on safety and security issues to 20,000 by the end of 2018. Facebook had 23,165 employees at the end of the third quarter, so the plan to add 10,000 more focused solely on safety and security is a massive commitment. Facebook will also use artificial intelligence and other technological tools to help it combat what Mark Zuckerberg calls “bad content and bad actors.”
Facebook made this commitment before U.S. senators, who were also questioning Alphabet and Twitter about Russian interference in last year’s elections. Top Facebook lawyer Colin Stretch explained to the panel that the new hires would be deployed to track extremists and their behavior on the site.
This is what Mark Zuckerberg said during the third-quarter earnings call with analysts, and it shows how serious the issue has already become:
“I’m dead serious about this, and the reason I’m talking about this on our earnings call is that I’ve directed our teams to invest so much in security — on top of the other investments we’re making — that it will significantly impact our profitability going forward, and I wanted our investors to hear that directly from me. I believe this will make our society stronger and in doing so will be good for all of us over the long term. But I want to be clear about what our priority is: protecting our community is more important than maximizing our profits. So security and the integrity of our services will be a major focus.”
Facebook knows the regulatory noose will tighten as lawmakers seethe over the way social media in general has been used to tilt public opinion. It is now a matter of when, not if, and Facebook isn’t going to sit around and lose control. It is already taking several proactive steps to address the issue as best it can.
The only problem is, this is a gargantuan task by any measure. Considering the number of users, posts, images, videos and ads on the platform, even a team of 20,000 is only likely to scratch the surface. AI and other algorithm-based automation tools will help, as will users reporting offensive content, but the sheer magnitude of the task is more than a little daunting.
Unfortunately, Facebook faces two hard choices: sit and wait for the allegations to keep coming, which will eventually lead to regulatory stifling; or do as much as it can in the public eye, so the government sees it not as an adversary to be punished, but as a shared platform whose problems require more than just punitive action against Facebook.