
Social media platforms let information be shared at massive scale by the masses. That information varies widely in nature: anything uploaded to an online platform may be personal, professional, informative, entertaining, and so on.

Facebook is the largest social media platform, where millions of people upload all kinds of information. Although the stated aim of such platforms is to promote goodness and awareness, there are always people and entities that push misleading, abusive, or hateful content, which in turn encourages bad behavior among users.

In an attempt to curb these problems, Facebook has come up with a subtle method to improve how users interact with its platform.

Last June, Twitter tested prompts urging its users to open and read a link before retweeting it. The experiment proved successful: people opened and read more of the content, and in some cases chose not to retweet certain stories at all.

Like Twitter, Facebook has now begun testing a new interface feature that prompts people to read, or at least open, a link before sharing it to their feed. Facebook says it is trying to encourage positive and informed sharing. The prompt reads, “You’re about to share this article without opening it,” followed by, “Sharing articles without reading them may mean missing key facts.” It then gives the user two options: open the article, or share it anyway.
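The gating flow described above can be imagined as a simple conditional check on the client. The following is a purely hypothetical sketch, not Facebook's actual code; the function names, the `prompt_user` helper, and the return values are all illustrative assumptions.

```python
# Hypothetical sketch of the share-gating flow described in the article.
# None of these names come from Facebook's real implementation.

def prompt_user(message: str, options: list) -> str:
    # Placeholder: a real client would render a dialog and wait for a tap.
    # For demonstration, default to the first option ("Open article").
    return options[0]

def gate_share(user_opened_link: bool) -> str:
    """Decide what happens when the user taps 'share' on an article."""
    if user_opened_link:
        return "share"  # user already opened the article: no friction
    # User has not opened the link: show the friction prompt first.
    choice = prompt_user(
        "You're about to share this article without opening it. "
        "Sharing articles without reading them may mean missing key facts.",
        options=["Open article", "Continue sharing"],
    )
    return "open_article" if choice == "Open article" else "share"
```

The point of the design is that sharing is never blocked, only interrupted: the second option always lets the user post as usual.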

“This test in its initial phase will be rolled out to 6 percent of Android users worldwide,” a Facebook spokesperson said.

A user can still click through and share a story, but the aim of the test is to add friction to the sharing process so that users pause and give the content a second thought before passing along something that could be harmful or inflammatory. The prompt is positioned as a sensible way to tackle the misleading information that has dominated the platform for some time.

In a similar move last June, Facebook released a pop-up message warning people before they shared an article older than 90 days, to discourage the spread of out-of-context, misleading information. A few months later, Facebook rolled out another pop-up that showed people the date and source of the Covid-19-related content they were about to share.

Misleading or false information can shape a person's thinking and behavior on a particular topic, whether it concerns Covid-19, elections, or plain rumor-spreading, and a drive to promote such falsehoods or propaganda can change the course of action for many people, leading to undesired results. On a social media platform like Facebook, such information affects not just one individual but also the people around that person and the entire audience consuming it, which is usually very large. That is why social media platforms keep exploring and testing prompts like these to discourage their users from sharing false information.

Social media may not be the right place to get your facts and news; more often than not, it has proven deceptive and inaccurate. It helps, then, to also trust your own sensibility and conscience before believing any information you come across on Facebook, Twitter, or the Internet as a whole.

Such practices are not meant to deliver immediate results; rather, they aim for gradual but lasting change. Methods like these prove fruitful because they help clear the platform of negative influences. When users become aware that their behavior is being observed, whether through alerts or pop-up messages, they tend to be more careful on social platforms, which amounts to a form of self-moderation.

Some even argue that social media platforms dominated by harmful or misleading content should be scrapped altogether and rebuilt from scratch.

