
A Mechanism to Inhibit Unsolicited Texts from OSN User Walls


Abstract
Online Social Networks (OSNs) are today one of the most popular interactive media for sharing, communicating, and distributing a significant amount of information about people's lives. In OSNs, information filtering can also be used for a different, more sensitive purpose. This is owing to the fact that OSNs allow users to post or comment on other posts in particular public/private areas, generally called walls. Information filtering can therefore be used to give users the ability to automatically control the messages written on their own walls by filtering out unwanted messages. Today's OSNs provide very little support to prevent unwanted messages on user walls. For instance, Facebook allows users to state who is permitted to insert messages on their walls (i.e., friends, defined groups of friends, or friends of friends). However, no content-based preferences are supported, and therefore it is not possible to prevent undesired messages, such as political or offensive ones, regardless of the user who posts them. In this paper, we propose and experimentally evaluate an automated system, called Filtered Wall (FW), able to filter unwanted messages from OSN user walls.
Keywords: Information filtering, online social networks, short text classification, policy-based personalization
INTRODUCTION
Online Social Networks (OSNs) are today one of the most popular interactive media for sharing, communicating, and distributing a significant amount of information about people's lives. Daily and continuous communications imply the exchange of several types of content, including free text, image, audio, and video data. According to Facebook statistics, the average user creates 90 pieces of content every month, while more than 30 billion pieces of content (web links, news stories, notes, blog posts, photo albums, etc.) are shared each month. The huge and dynamic nature of this information creates the premise for employing web content mining strategies aimed at automatically discovering useful information dormant within the data. Such strategies are instrumental in providing active support for the complex and sophisticated tasks involved in OSN management, such as access control or information filtering. Information filtering has been studied extensively for textual documents and, more recently, for web content. However, the aim of most of these proposals is mainly to provide users with a classification mechanism so that they are not overwhelmed by useless data. In OSNs, information filtering can also be exploited for a different, more sensitive purpose. This is due to the fact that OSNs allow users to post or comment on other posts in particular public/private areas, generally called walls.
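As a rough illustration of the kind of content-based control discussed above, the sketch below shows how a wall could combine a relationship-based rule (who may post) with a content-based rule (which message categories are blocked). It is not the Filtered Wall implementation evaluated in this work: the keyword lexicon, the category names, and the WallFilter class are hypothetical placeholders for the short text classification and policy machinery described later.

```python
# Illustrative sketch only: a toy keyword lexicon stands in for a trained
# short-text classifier, and WallFilter stands in for the per-user policy layer.

BLOCKED_BY_DEFAULT = {"offensive", "political"}

# Placeholder keyword lists per category (a real system would use a classifier).
CATEGORY_KEYWORDS = {
    "offensive": {"idiot", "stupid", "hate"},
    "political": {"election", "party", "vote"},
}


def classify(message: str) -> set[str]:
    """Return the set of categories the message appears to belong to."""
    words = set(message.lower().split())
    return {cat for cat, keywords in CATEGORY_KEYWORDS.items() if words & keywords}


class WallFilter:
    """Applies a per-user filtering policy to incoming wall messages."""

    def __init__(self, blocked_categories=BLOCKED_BY_DEFAULT, allowed_posters=None):
        self.blocked_categories = set(blocked_categories)
        self.allowed_posters = allowed_posters  # None means anyone may post

    def accept(self, poster: str, message: str) -> bool:
        # Relationship-based rule, similar to what OSNs already offer.
        if self.allowed_posters is not None and poster not in self.allowed_posters:
            return False
        # Content-based rule: reject if any predicted category is blocked.
        return not (classify(message) & self.blocked_categories)


if __name__ == "__main__":
    wall = WallFilter(blocked_categories={"political"})
    print(wall.accept("alice", "Great photo from the trip!"))      # True
    print(wall.accept("bob", "Remember to vote in the election"))  # False
```

The point of the sketch is the separation of concerns: existing OSN controls cover only the first check (allowed posters), whereas the proposed approach adds the second, content-based check driven by short text classification and user-defined filtering preferences.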
