On Thursday, Instagram introduced changes to the policies that govern account removal, part of a broader effort to strengthen its moderation. The social network is adding a new notification that warns users who violate its rules that their account is at risk of being disabled or deleted. The warning will also show a history of the posts, comments, and stories that the company has had to remove from the user's account, along with the reason each piece of content was taken down. Instagram will then caution users that if they post something that breaches its guidelines again, the company may delete the account. The move reflects the more aggressive steps the photo-sharing platform is taking to keep offensive material off the network.
According to the company, it will start informing users when their account is at risk of deletion. The notification system will also let users appeal removals directly through the app, so they will no longer have to go through the Help Center to file a complaint. Instagram noted that, initially, users can appeal content removed under policies covering hate speech, terrorist content, bullying and harassment, and nudity and pornography, with more categories planned for the future.
Instagram says that if it removes content by mistake, it will restore the post and clear the violation from the account's record. The company describes these as preventive measures that will help keep the platform a safe and welcoming place. Along with the new alert, Instagram is giving its moderation team more latitude to ban bad actors. Previously, the social network blocked users whose posts exceeded a certain percentage of violating material; from now on, it will ban people who knowingly and repeatedly break its policies within a specific time frame.